In May 2017, the UCLA Bruins met the Stanford Cardinal for the third time in four years in the title game of the NCAA women’s water polo national championship tournament. And for the third time in four years, UCLA lost. This time, Stanford’s Maggie Steffens stole the ball, and in the final seconds of regulation, she muscled a shot into the cage for the 8-7 win. Her goal marked the third consecutive year the champion was decided by a goal scored within the final fifteen seconds of play. Throughout the game, Stanford went 3-for-7 on power plays, while UCLA went only 2-for-6. By the end of the game, that small difference in conversion rates made a huge difference in the outcome.

What if, in one of their missed power plays, UCLA had shot the ball from a different position? What if they had taken a skip shot instead? What if they had run a different play? Currently, answering these questions requires many hours of painstaking, manual scrubbing through game footage. Even then, the player locations would be qualitative. Only with player and ball tracking data can we begin to quantify what makes the best shot.

Over the last eight years, companies like STATS have fueled an “analytics revolution” by providing player and ball tracking data in professional sports. They have enabled analyses beyond traditionally recorded statistics, equipping teams with competitive advantages. However, their systems largely rely on sophisticated, multi-camera setups.

Although water polo is the oldest team sport in the Olympics, it remains a niche sport. Without a professional league, the US national team travels to pools around the world for exhibition matches, where permanent multi-camera installations are impractical. We therefore introduce a more affordable, portable solution for automatically tracking water polo players and the ball in video recordings taken with a single camera, e.g., a smartphone.

Automatically detecting and tracking water polo players presents unique challenges. Players are always at least partially occluded. The water surface is dynamic, precluding the standard practice of background subtraction. Furthermore, half the players wear light-colored caps that are easily confused with water splashes.

We overcome these challenges by adapting Fast R-CNN object detectors, coupled with a novel hierarchical track-by-detection algorithm. We achieve promising per-frame detection performance for players (0.73 mean average precision) and the ball (0.62 AP), and automatically produce long-range player tracks (Figure 1). Using singular value decomposition, we project single-view tracks into world coordinates, where we can automatically quantify how a water polo team can optimize its scoring opportunities. Among our findings: scoring likelihood drops from 60% to 40% when shooting from the 3m line rather than the 2m line; surprisingly, scoring likelihood is halved (from 30% to 15%) when moving in from beyond 5.5m to the 4.5-5.5m range; skip shots are 14% more likely to result in a goal than power shots; and, as shown in Figure 2, the distribution of made versus missed shots varies greatly with shot placement. Our results are based on data from 26 NCAA women’s games from the 2018 season.
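A minimal sketch of the world-coordinate projection step is shown below: a planar homography from image pixels to pool coordinates is estimated with the standard SVD-based direct linear transform (DLT) and then applied to tracked positions. This is an illustration under stated assumptions rather than our released pipeline; the landmark correspondences, pool dimensions, and function names are hypothetical placeholders.

import numpy as np

def estimate_homography(img_pts, world_pts):
    # Estimate the 3x3 homography H mapping image points to world (pool) points.
    # img_pts, world_pts: (N, 2) arrays of corresponding points, N >= 4.
    A = []
    for (x, y), (X, Y) in zip(img_pts, world_pts):
        A.append([-x, -y, -1, 0, 0, 0, x * X, y * X, X])
        A.append([0, 0, 0, -x, -y, -1, x * Y, y * Y, Y])
    # The homography is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A))
    return vt[-1].reshape(3, 3)

def project(H, img_pts):
    # Apply H to homogeneous image points and dehomogenize to world coordinates.
    pts = np.hstack([img_pts, np.ones((len(img_pts), 1))])
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# Hypothetical correspondences: pixel locations of goal posts and lane markings
# paired with their known positions (in meters) on a 25m-long pool.
img_pts = np.array([[412, 220], [868, 224], [310, 540], [985, 548]], float)
world_pts = np.array([[11.5, 0.0], [13.5, 0.0], [10.0, 5.0], [15.0, 5.0]], float)

H = estimate_homography(img_pts, world_pts)
print(project(H, np.array([[640.0, 400.0]])))  # e.g., a tracked player's pool position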

Our code is available at https://github.com/sswpro/CalWaterPolo.