DeepSight Challenge 2026
18 hours to push the boundaries of Computer Vision
& Robotics — innovate, build, and compete for glory.
What is DeepSight?
AI Meets Real-World Computer Vision
DeepSight Challenge 2026 is RoboGenesis Club's flagship technical hackathon focusing on Artificial Intelligence and Computer Vision. Designed to provide a platform for students to engage in practical, real-world problem solving, it encourages innovation, analytical thinking, and collaborative development under time constraints.
Problem statements and datasets will be revealed on the day of the event. Teams must match or exceed a provided benchmark performance metric, or propose a strong alternative approach. Evaluation will be conducted by a panel of faculty members and industry experts through a structured, multi-stage process.
Computer Vision Focus
Tackle real-world CV challenges with state-of-the-art tools and datasets.
Expert Evaluation Panel
Judged by faculty members and industry experts through multi-stage evaluation.
Benchmark-Driven
Meet or exceed provided benchmarks, or justify a compelling alternative approach.
Why DeepSight?
Core Objectives
Building the next generation of AI & Robotics innovators through hands-on experience.
Promote Innovation
Foster innovation in Artificial Intelligence and Robotics through practical application and creative problem solving.
Hands-On Exposure
Provide hands-on experience with real-world datasets, benchmark metrics, and production-level Computer Vision challenges.
Teamwork & Rapid Prototyping
Develop teamwork, analytical thinking, and rapid prototyping skills under real-time constraints and pressure.
Judging Criteria
Evaluation Rubric
A transparent, structured scoring system evaluated by faculty and industry experts.
Model Performance
Accuracy/F1/IoU vs benchmark, generalization capability, and inference efficiency.
Innovation & Originality
Novel approach, creativity, improvement over baseline, and use of advanced techniques.
Technical Implementation
Code quality, modularity, framework usage, and reproducibility of results.
Problem Understanding
Problem clarity, approach justification, and data pipeline design methodology.
Presentation & Communication
Clarity of explanation, result visualization, and Q&A handling proficiency.
Progress & Consistency
Checkpoint progress, teamwork dynamics, and effective task distribution.
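For teams new to the metrics named in the rubric, here is a minimal sketch of how Accuracy-style scores such as IoU and F1 are computed for binary predictions and compared against a benchmark. The threshold of 0.5 is purely illustrative — the actual benchmark metric will be announced at kickoff.

```python
import numpy as np

def iou(pred: np.ndarray, target: np.ndarray) -> float:
    """Intersection-over-Union for binary masks/labels."""
    pred, target = pred.astype(bool), target.astype(bool)
    union = np.logical_or(pred, target).sum()
    if union == 0:
        return 1.0  # both empty: treat as a perfect match
    return np.logical_and(pred, target).sum() / union

def f1(pred: np.ndarray, target: np.ndarray) -> float:
    """F1 score (Dice coefficient) for binary labels."""
    pred, target = pred.astype(bool), target.astype(bool)
    tp = np.logical_and(pred, target).sum()  # true positives
    denom = pred.sum() + target.sum()
    if denom == 0:
        return 1.0
    return 2 * tp / denom

# Illustrative check against a hypothetical benchmark of IoU >= 0.5
pred   = np.array([1, 1, 0, 1, 0])
target = np.array([1, 0, 0, 1, 1])
print(iou(pred, target) >= 0.5)  # intersection=2, union=4 -> IoU=0.5 -> True
```

The same pattern extends to per-class IoU for segmentation or macro-F1 for multi-class problems; standard implementations are also available in libraries such as scikit-learn and torchmetrics.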
Performance Bands
- Exceeds benchmark significantly, highly innovative, production-level implementation
- Meets or slightly exceeds benchmark, solid implementation with minor gaps
- Meets basic requirements, limited innovation or incomplete optimization
- Does not meet benchmark or lacks proper implementation
Prizes & Rewards
What You Can Win
Compete for cash prizes, certificates, and recognition from industry experts.
+ Certificate & Recognition
Every Participant Receives
Refreshments, a participation certificate, and an event ID card.
Registration
Registration Policy
Team-based participation. Bring your own laptops and your best ideas.
Bennett University Teams
- No registration fee
- Refreshments included
- Participation certificate
- Event ID card
External Teams
With Food & Snacks
- ₹1,000 per team
- Food & snacks provided
- Refreshments included
- Participation certificate
- Event ID card
External Teams
Without Food
- ₹750 per team
- Refreshments included
- Participation certificate
- Event ID card
- Bring your own meals
Event Schedule
18 Hours of Pure Focus
Hackathon Kickoff 🚀
Problem statements and datasets revealed. Teams begin working. The 18-hour clock starts ticking.
Evaluation Checkpoint 1
First progress check by the evaluation panel. Assessment of problem understanding, methodology, and initial approach.
Evaluation Checkpoint 2
Second progress evaluation. Monitoring of implementation strategy and model development. Midnight snacks & fuel provided.
Evaluation Checkpoint 3
Third progress evaluation. Deep dive into model performance and optimization approach.
Evaluation Checkpoint 4
Final progress check. Last chance to refine solutions and prepare presentations.
Code Freeze & Final Judging 🏆
All coding stops. Final evaluation based on completed solution, performance metrics, and team presentation. Winners announced!
Important
Rules & Guidelines
Benchmark Requirement
Teams must meet or exceed the provided benchmark, or justify an alternative approach convincingly to the panel.
Bring Your Own Laptop
All participants must bring their own laptops and necessary peripherals for the hackathon.
Document Pre-Trained Models
Any use of pre-trained models must be properly documented and disclosed to the evaluation panel.
Zero Tolerance for Plagiarism
Plagiarism or use of external solutions without proper attribution will lead to immediate disqualification.
FAQ
Got Questions?
Who can participate?
The hackathon is open to all students with a passion for AI and Computer Vision. Both Bennett University students and external teams are welcome. Participation is team-based.
What is the registration fee?
Bennett University teams participate for free. External teams can register at ₹1,000 per team (with food) or ₹750 per team (without food). All participants receive refreshments, certificates, and event ID cards.
When will problem statements be revealed?
Problem statements and datasets will be revealed at the hackathon kickoff (3:00 PM, April 24). A benchmark performance metric will also be provided at that time.
How will teams be evaluated?
Evaluation is conducted by faculty and industry experts through a multi-stage process: checkpoints at four-hour intervals assess progress, problem understanding, methodology, and implementation strategy, followed by a final evaluation of your completed solution, performance metrics, and presentation quality.
What should I bring?
Bring your laptop, charger, and your skills in AI & Computer Vision. Refreshments will be provided to all participants. External teams with the food package will also receive meals and snacks throughout the event.
Can we use pre-trained models?
Yes, provided they are properly documented and disclosed to the evaluation panel. Plagiarism or use of external solutions without attribution will lead to disqualification.
Faculty Mentors
Guided By Experts
Dr. Prateek Yadav
Mentor
Dr. Navneet Pratap Singh
Mentor
Dr. Rajeev Tiwari
Dean
Ready to Compete?
18 hours. One challenge. Prove your skills in Computer Vision & AI.
Register your team for DeepSight Challenge 2026 — April 24–25.