The Challenge

HYROX wants a computer vision system that automates squat depth validation while keeping human oversight for competition integrity. With 425,000 participants projected for the 2024/25 season across 80+ global events, and individual events regularly attracting 2,000-4,000 participants, the system must operate at massive scale. It needs to process thousands of repetitions per hour across 60-80 simultaneous wall ball stations typical at major events. Rather than replacing judges entirely, the technology will augment human decision-making by handling routine validation while flagging edge cases for manual review. This hybrid approach balances automation efficiency with the nuanced judgment that complex situations require.

Our solution must work with existing infrastructure to protect HYROX's technology investments. The current Digital Wall Ball Target system already handles ball impact detection and rep counting through integrated sensors and displays. Our computer vision layer will add squat validation capabilities without disrupting these proven systems. This integration requires developing compatible APIs, maintaining synchronized data flows, and ensuring failover modes that preserve basic functionality if vision components encounter issues.

Global deployment requires flexibility to adapt to different venue types, regulations, and operational constraints. The system must function equally well in convention centers, outdoor stadiums, and temporary event spaces. It needs to comply with privacy regulations from GDPR in Europe to data localization requirements in Asia. The technology must accommodate events ranging from 1,000 participants in emerging markets to 10,000+ athletes at world championships, scaling smoothly without architectural changes.

Detailed Specifications

Performance Requirements

The system must complete processing within 200ms from camera capture to display update - this requirement cannot be compromised. It encompasses the entire pipeline from image capture through pose estimation, validation logic, and communication to the target display system. Athletes completing wall balls at competition pace perform roughly one repetition every 2-3 seconds, making 200ms latency imperceptible to human observation. This requirement ensures the system provides immediate feedback without disrupting athlete rhythm or creating judge confusion about rep status.
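During development, the end-to-end budget can be enforced with a simple timing wrapper around the pipeline stages. The stage names and stub functions below are illustrative placeholders, not HYROX's actual pipeline:

```python
import time

BUDGET_MS = 200  # hard end-to-end budget from camera to display

def run_pipeline(frame, stages):
    """Run each stage in order, recording cumulative latency and
    whether the whole pipeline fit inside the budget."""
    timings = {}
    start = time.perf_counter()
    result = frame
    for name, fn in stages:
        result = fn(result)
        timings[name] = (time.perf_counter() - start) * 1000  # ms so far
    total_ms = (time.perf_counter() - start) * 1000
    return result, timings, total_ms <= BUDGET_MS

# Stand-in stages; real implementations would be capture, pose
# estimation, depth validation, and display messaging.
stages = [
    ("capture", lambda f: f),
    ("pose_estimation", lambda f: f),
    ("validation", lambda f: f),
    ("display_update", lambda f: f),
]

result, timings, within_budget = run_pipeline({"pixels": None}, stages)
```

Logging the cumulative per-stage timings makes it easy to see which stage to optimize when the total creeps toward the budget.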

Accuracy targets require 95% agreement with expert judges on clear repetitions and 85% accuracy on borderline cases. These metrics will be validated through extensive testing against annotated competition footage where multiple qualified judges have reached consensus on correct calls. The system must achieve these accuracy levels across diverse body types, movement patterns, and environmental conditions. False positive rates (counting invalid reps) must remain below 2% to maintain competition integrity, while false negatives (rejecting valid reps) should stay under 5% to prevent athlete frustration.
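These rates can be computed directly from system decisions paired with judge-consensus labels. A minimal sketch; note the denominators for the two error rates (invalid reps for false positives, valid reps for false negatives) are an assumption, since the text does not pin them down:

```python
def judging_metrics(system_calls, judge_calls):
    """Compare system rep decisions against judge consensus.
    Each entry is True (valid rep) or False (invalid/no rep)."""
    pairs = list(zip(system_calls, judge_calls))
    n = len(pairs)
    agree = sum(s == j for s, j in pairs)
    # False positive: system counts a rep the judges rejected.
    fp = sum(s and not j for s, j in pairs)
    # False negative: system rejects a rep the judges counted.
    fn = sum(j and not s for s, j in pairs)
    invalid = sum(not j for _, j in pairs)
    valid = n - invalid
    return {
        "agreement": agree / n,
        "false_positive_rate": fp / invalid if invalid else 0.0,
        "false_negative_rate": fn / valid if valid else 0.0,
    }
```

Run against the annotated competition footage, the targets translate to `agreement >= 0.95`, `false_positive_rate < 0.02`, and `false_negative_rate < 0.05`.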

The system must stay up 99.9% of the time during competitions - less than 1 minute of downtime per 16-hour event day. This reliability standard accounts for both complete system failures and degraded performance that impacts judging quality. Redundant components, automatic failover mechanisms, and graceful degradation strategies will ensure continued operation even when individual elements fail. The system must also recover automatically from temporary disruptions like power fluctuations or network interruptions without manual intervention.
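The downtime allowance follows from simple arithmetic over the event day:

```python
# 99.9% uptime over a 16-hour competition day:
event_minutes = 16 * 60                       # 960 minutes of competition
allowed_downtime = event_minutes * (1 - 0.999)
# allowed_downtime is 0.96 minutes, i.e. under one minute per event day
```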

Functional Capabilities

Multi-athlete tracking handles relay and doubles competitions where multiple athletes share station space. With Doubles divisions accounting for 49.7% of all participants and growing rapidly, the system must track up to 4 athletes simultaneously occupying a single wall ball area. It must identify which athlete is actively competing based on movement patterns and position relative to the wall. Non-competing athletes might be resting nearby, preparing for their turn, or transitioning through the space. The algorithm must maintain accurate tracking even when athletes cross paths, temporarily occlude each other, or move in synchronized patterns during warmups.
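One simple way to pick the active athlete from per-athlete tracks is a heuristic: among athletes close to the wall who show squat-like motion, take the nearest. The `Track` fields and thresholds below are illustrative assumptions, not calibrated values from the actual system:

```python
from dataclasses import dataclass

@dataclass
class Track:
    athlete_id: int
    wall_distance_m: float   # estimated distance from the target wall
    motion_score: float      # recent vertical movement magnitude

def active_athlete(tracks, max_wall_distance=1.5, min_motion=0.2):
    """Return the track most likely to be the competing athlete,
    or None when no track qualifies (e.g. everyone is resting)."""
    candidates = [
        t for t in tracks
        if t.wall_distance_m <= max_wall_distance
        and t.motion_score >= min_motion
    ]
    if not candidates:
        return None
    # Tie-break on proximity to the wall.
    return min(candidates, key=lambda t: t.wall_distance_m)
```

A production tracker would also need identity persistence through occlusions (e.g. appearance embeddings or Kalman-filtered motion models), which this sketch omits.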

Wristband recognition will automatically assign divisions without manual input from judges. Athletes wear colored wristbands indicating their competition division: Open Men (blue), Pro Men (black), Open Women (pink), Pro Women (red), and Adaptive (green). The computer vision system will detect these high-contrast markers to apply appropriate judging standards, as Pro divisions require heavier weights and higher rep counts. Manual override capabilities through the judge interface ensure correct classification when automatic detection fails due to wristband obstruction or unusual lighting conditions.
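A first-pass classifier could operate on the mean HSV color of the detected wristband region. The hue ranges below (OpenCV-style 0-179 hue scale) are rough illustrative guesses and would need per-venue lighting calibration; low-confidence cases return None and fall through to manual review:

```python
def classify_wristband(h, s, v):
    """Map a mean HSV sample of the wristband region to a division.
    Ranges are illustrative assumptions, not calibrated values."""
    if v < 50:
        return "Pro Men"       # black band: low brightness, hue irrelevant
    if s < 40:
        return None            # too washed out to trust; flag for judge
    if h < 10 or h > 170:
        return "Pro Women"     # red
    if 140 <= h <= 170:
        return "Open Women"    # pink (magenta-leaning hue)
    if 100 <= h <= 130:
        return "Open Men"      # blue
    if 40 <= h <= 85:
        return "Adaptive"      # green
    return None                # ambiguous color; flag for judge
```

Returning None rather than guessing keeps the manual-override path in the loop, matching the judge-interface fallback described above.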

Starting position validation prevents cheating by ensuring athletes begin each rep standing fully upright. The system must verify that each repetition begins with hips and knees fully extended while holding the ball at chest level. This requirement stops athletes from bouncing out of the bottom position or catching the ball while already descending into the next squat. The detection algorithm needs to distinguish between valid starting positions and transitional movements that occur between reps, applying appropriate tolerance for natural movement variation while enforcing the standard consistently.
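Full extension can be expressed as joint angles computed from pose keypoints: the athlete counts as upright when the hip and knee angles are close to 180 degrees. The 15-degree tolerance below is an assumed value for illustration, not a HYROX specification:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at point b, formed by keypoints a-b-c (x, y)."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos = dot / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

def is_standing(shoulder, hip, knee, ankle, tolerance_deg=15):
    """Fully upright when hip and knee angles are within tolerance
    of 180 degrees (straight line through the joint)."""
    hip_angle = joint_angle(shoulder, hip, knee)
    knee_angle = joint_angle(hip, knee, ankle)
    return (hip_angle >= 180 - tolerance_deg
            and knee_angle >= 180 - tolerance_deg)
```

The same angle primitive can serve depth validation at the bottom of the squat, with a different threshold on the hip-knee geometry.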

Operational Requirements

The system must work offline because many venues have poor or no internet connectivity. All core processing must occur locally using edge computing resources positioned on-site. The system cannot depend on cloud services for real-time operation, though optional cloud connectivity might enable features like remote monitoring or post-event analytics. This requirement extends to software licensing, which must support offline validation without phoning home to authentication servers during events.

Two people must set up the entire system in 8 hours - this matches typical venue access windows for traveling events. The installation process must accommodate venues where setup occurs overnight before competition day, with limited time windows and concurrent activity from other vendors. Hardware components need efficient packaging within standard shipping dimensions, clear labeling for rapid identification, and foolproof connections that prevent installation errors. The setup procedure should include automated calibration routines that minimize manual configuration requirements.

Equipment must meet IP64 weatherproofing standards to survive outdoor events and venue accidents. Cameras and processing units must withstand dust, moisture, and temperature variations typical of competition environments. Cables need protection against foot traffic and equipment movement. The system should continue operating despite spilled drinks, chalk dust clouds, and condensation from athlete exertion. This ruggedization extends beyond environmental protection to include impact resistance for equipment positioned within potential contact zones.

Timeline Constraints

We must submit our proposal by August 15, 2025, with a detailed technical approach, hardware specifications, software design, and timeline. The proposal must demonstrate deep understanding of HYROX's operational challenges while presenting realistic solutions backed by relevant experience. Cost structures need to reflect both initial development and ongoing support requirements. Team qualifications should emphasize computer vision expertise, sports technology experience, and global deployment capabilities.

Alpha testing begins October 2025 at Chicago Lab - our first chance to prove the core system works. This phase will validate fundamental pose estimation accuracy, latency performance, and basic integration with target hardware. Testing will occur in controlled conditions with HYROX staff athletes performing standardized movement patterns. The alpha system must achieve basic squat detection, depth measurement, and rep counting capabilities. Success criteria include meeting latency requirements and achieving >90% accuracy on clear repetitions under optimal conditions.

Beta testing starts December 2025 in real gyms with actual athletes in real-world conditions. Selected HYROX-affiliated gyms will host the system during training sessions, exposing it to diverse body types, movement patterns, and environmental variations. This phase tests system robustness against visual noise, varying lighting, and multiple simultaneous users. Beta deployment will validate the user interface, manual override functions, and data export capabilities. Feedback from coaches and athletes will guide refinements before public deployment.

February 2026 brings public beta testing in event warm-up zones where athletes can try the system without affecting official scores. Athletes can experience automated judging during pre-event preparation, providing valuable user feedback while identifying edge cases. This phase validates setup procedures, judge training requirements, and technical support protocols. The warm-up zone deployment also generates marketing buzz and athlete buy-in before official implementation.

April 2026 launches full-scale event testing where the system runs alongside human judges at real competitions. The computer vision system will run in shadow mode, making judgments that are recorded but don't affect official scoring. This allows direct comparison between automated and human decisions across thousands of real competition repetitions. Success metrics include agreement rates with human judges, system reliability under full load, and operational efficiency improvements.

Global rollout happens July 2026 across all HYROX events worldwide. The system must scale to support 80+ competitions annually across 60+ countries. This requires manufacturing capacity for hundreds of camera units, established installation teams in major markets, and 24/7 technical support capabilities. The rollout strategy will likely phase deployment by region, starting with major markets before expanding to smaller events.

Success Metrics

Judging consistency must hit 95% or higher across all stations, events, and locations worldwide. This metric measures whether identical movements receive identical judgments regardless of external factors. Consistency will be validated through standardized test videos evaluated across different system instances. Regular calibration checks ensure long-term stability as components age or software updates deploy. Athletes should trust that their performance will be evaluated identically whether competing in Chicago, Berlin, or Singapore.

We must reduce judge requirements by 50% or more to prove the operational efficiency gains justify this investment. Current events require 100-700 total volunteer judges, with 40-80 specifically for wall ball stations. The automated system should enable one judge to oversee 4-6 stations, dramatically reducing the staffing burden that currently limits HYROX's expansion. This reduction translates directly to cost savings, simplified logistics, and expanded event possibilities in markets with limited judge availability. The metric accounts for both direct station coverage and supervisory roles required for system oversight.
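A back-of-envelope check using the station counts from the text suggests the target is comfortably achievable, assuming one judge per station today and one judge per five stations (the middle of the 4-6 range) with automation:

```python
import math

# Figures from the text: major events run 60-80 wall ball stations,
# currently staffed one judge per station.
stations = 80
current_judges = stations                       # one per station today
automated_judges = math.ceil(stations / 5)      # ~1 judge per 5 stations
reduction = 1 - automated_judges / current_judges
# reduction is 0.8: an 80% cut in wall ball judges, well above 50%
```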

Athletes must rate fairness and accuracy 4.5 out of 5 or higher to confirm the technology improves rather than complicates their experience. Post-event surveys will assess athlete confidence in judging consistency, satisfaction with real-time feedback, and overall system transparency. These subjective metrics complement objective accuracy measurements by capturing the human element of competition integrity. High satisfaction scores drive continued participation and positive word-of-mouth marketing.

System uptime must reach 99.9% during competition hours to prove reliability event organizers can trust. This metric encompasses all system components from cameras through processing units to display integration. Downtime calculations exclude scheduled maintenance windows but include any degradation affecting judging quality. Achievement requires redundant architectures, proactive monitoring, and rapid response protocols for issue resolution.

Future Directions

Future expansion to other exercises will use the same computer vision platform for broader competition coverage. Once proven for wall balls, similar technology could validate rowing form, monitor sled push depth, or ensure proper burpee execution. Each new exercise would require specific pose models and validation logic but could reuse the camera infrastructure and processing architecture. This expansion potential multiplies the return on HYROX's technology investment.

Advanced analytics and coaching tools will turn raw movement data into useful insights for athlete development. The system could identify technique inefficiencies, predict fatigue onset, and recommend training focus areas. Real-time coaching cues during competition could help athletes optimize their performance while maintaining form standards. These value-added services create new revenue opportunities through premium athlete subscriptions or coaching partnerships.

Broadcast enhancements will improve spectator experience through data-driven graphics and storytelling. Real-time statistics like average squat depth, rep cadence, and power output estimates create compelling visual narratives. Automated highlight detection could identify exceptional performances or dramatic moments for instant replay. These broadcast improvements help HYROX attract media partners and expand audience reach beyond event participants.

Integration with wearables and training platforms will extend the ecosystem beyond competition days. Athletes could use HYROX-certified apps to validate training sessions, ensuring practice matches competition standards. Wearable sensors might provide additional biometric data that combines with video analysis for comprehensive performance assessment. This ecosystem approach strengthens athlete engagement while creating recurring revenue streams between events.