<div dir="ltr"><b>Dynamic Scene Reconstruction for Virtual Reality Video</b><br><div><b><br></b></div><div><div><b>Project Description</b></div><div>The goal of this project is to reconstruct the geometry and appearance of dynamic real-world environments to enable more immersive virtual reality video experiences.</div><div><br></div><div>State-of-the-art VR video approaches (e.g. Anderson et al., 2016) produce stereoscopic 360° video, which comprises separate 360° videos for the left and right eye (like 3D movies, but in 360°). The videos can, for example, be viewed on YouTube using a VR headset such as Google Cardboard or Daydream. Unfortunately, such videos only allow viewers to look in different directions; they do not respond to any head motion such as moving left/right, forwards/backwards or up/down. Truly immersive VR video, on the other hand, requires freedom of motion in six degrees of freedom (‘6-DoF’), so that viewers see the correct views of an environment regardless of where they are (3 DoF) and where they are looking (+3 DoF).</div><div><br></div><div>This project aims to develop novel dynamic scene reconstruction techniques capable of producing temporally coherent, dense, textured, time-varying 3D geometry of dynamic real-world environments, captured with one or more standard or 360° video cameras. In particular, the goal is to convincingly reconstruct the visual dynamics of the real world, such as people and moving animals or plants. The reconstructed dynamic geometry will then provide the foundation for a novel video-based rendering approach that synthesises visually plausible novel views with six degrees of freedom, matching the specific head position and orientation of a viewer in VR. 
This experience will provide correct motion parallax and depth perception to the viewer (as in Luo et al., 2018) to ensure unparalleled realism and immersion.</div><div><br></div><div>Candidates should normally have a good first degree (equivalent to a First Class or 2:1 Honours) or a Master’s degree in computer science, visual computing or a related discipline. A strong mathematical background and strong previous programming experience, preferably in C++ and/or Python, are required. Candidates must have a strong interest in visual computing, and previous experience in computer vision, computer graphics and image processing is highly desirable.</div><div><br></div><div>Informal enquiries should be directed to Dr Christian Richardt, <a href="mailto:c.richardt@bath.ac.uk">c.richardt@bath.ac.uk</a></div><div><br></div><div>Formal applications should be made via the University of Bath’s online application form for a PhD in Computer Science: </div><div><a href="https://samis.bath.ac.uk/urd/sits.urd/run/siw_ipp_lgn.login?process=siw_ipp_app&code1=RDUCM-FP01&code2=0012">https://samis.bath.ac.uk/urd/sits.urd/run/siw_ipp_lgn.login?process=siw_ipp_app&code1=RDUCM-FP01&code2=0012</a></div><div><br></div><div>More information about applying for a PhD at Bath may be found here: </div><div><a href="http://www.bath.ac.uk/guides/how-to-apply-for-doctoral-study/">http://www.bath.ac.uk/guides/how-to-apply-for-doctoral-study/</a></div><div><br></div><div>Anticipated start date: 1 October 2018</div><div><br></div><div><b>Funding Notes</b></div><div>UK and EU students applying for this project may be considered for a University Research Studentship, which will cover Home/EU tuition fees, a training support fee of £1,000 per annum and a tax-free maintenance allowance at the RCUK Doctoral Stipend rate (£14,777 in 2018-19) for a period of 3.5 years.</div><div><br></div><div>Note: ONLY UK and EU applicants are eligible for this studentship; unfortunately, applicants who are classed as 
Overseas for fee-paying purposes are NOT eligible for funding.</div><div><br></div><div><b>References</b></div><div>R. Anderson, D. Gallup, J. T. Barron, J. Kontkanen, N. Snavely, C. Hernandez, S. Agarwal and S. M. Seitz, “Jump: Virtual Reality Video”. <i>ACM Transactions on Graphics (Proceedings of SIGGRAPH Asia 2016)</i>.</div><div><br></div><div>B. Luo, F. Xu, C. Richardt and J.-H. Yong, “Parallax360: Stereoscopic 360° Scene Representation for Head-Motion Parallax”. <i>IEEE Transactions on Visualization and Computer Graphics (IEEE VR 2018)</i>.</div></div><div><br></div><div>
<div><a href="https://richardt.name/" target="_blank">Dr Christian Richardt</a></div><div><a href="http://cs.bath.ac.uk/~nc537/" target="_blank">Dr Neill Campbell</a></div><div><i>Department of Computer Science, University of Bath</i></div></div>