Popular basketball and football matches are captured with dozens of cameras to engage the spectators at home with the most appealing view. However, the stadium reporters decide on the available views, so the TV viewer can only switch between a handful of them. How convenient would it be to choose an arbitrary viewing angle while watching a live broadcast? Or to pause and seamlessly transition to a new perspective through a bullet-time effect? When viewed through VR or AR devices, you would further be able to immerse yourself in the game and experience being a player on the court, as demonstrated in the video above. We have previous experience in detecting and tracking players and have recently advanced novel viewpoint synthesis. In this project, we want to combine and extend these methods to produce higher-resolution and more realistic output than currently possible, to be applicable to live broadcasts, and to work particularly well under the challenging conditions of team sports, where multiple similar-looking players interact.
Rematas et al., "Soccer On Your Tabletop"
Horesh et al., "Tracking Multiple People under Global Appearance Constraints" (video and description)
Rhodin et al., "Unsupervised Geometry-Aware Representation Learning for 3D Human Pose Estimation" (project page)
The candidate should have programming experience, ideally in Python. Previous experience with machine learning and computer vision is a plus.
30% Theory, 30% Implementation, 40% Research and Experiments