Giorgos Othonos is the first technician from the Faculty of Law, Arts and Social Sciences, at the University of Greenwich, to be awarded a research and development grant. This project places the university at the forefront of cutting-edge Extended Reality (XR) research, exploring the integration of virtual game worlds with physical environments, characters, and objects.
About the XR Labs Fund
The XR Labs Fund supports new technology development and prototyping across UK universities, with a particular focus on virtual production technologies and convergent media. Its mission is to broaden access to higher education virtual production infrastructure and foster collaboration between universities and industry partners.
This University of Greenwich-led project, XR-Sync (Extended Reality Stage Synchronization), will develop robust workflows to blend virtual sets with physical stages in real time.
The project will:
- Synchronise virtual reality headset views with the virtual production stage’s real-world tracking data.
- Leverage AHRC-funded facilities via the Shared Hub for Immersive Future Technologies (SHIFT).
- Deliver adaptable XR workflow solutions across a range of accessible systems, using dynamic tracking systems, Light Emitting Diode (LED) volume technology, and virtual reality headsets.
- Bring together academic and industry expertise by collaborating with Target 3D.
By synchronising the headset view with the stage’s real-world tracking data, users can step onto the stage and see the three-dimensional (3D) environment through their headset at accurate scale and alignment. This immediate, interactive visual feedback will enable creative teams to build, arrange, and refine set pieces and actor positions in real time, reducing errors and guesswork when combining physical props with virtual sets.

This work will enhance on-set efficiency and enable real-time, on-stage exploration and decision-making for directors and crews working in virtual production filmmaking.
At the centre of the project lies the game engine: a 3D interactive environment enabling advanced graphics rendering, collision detection, audio, animation, artificial intelligence, and more. This technology underpins the integration of virtual and physical content, pushing the boundaries of what virtual production can achieve.
Professor Andrew Knight-Hill, Centre Lead for the Centre for Sound and Image, said:

“The University of Greenwich is committed to Extended Reality research and has an established track record in supporting practice-based research with immersive technologies, evidenced through our Arts and Humanities Research Council World Class Labs investment in the £1.4m Shared Hub for Immersive Future Technologies (SHIFT).”
Recognition and impact
This award highlights the University of Greenwich’s growing reputation as a hub for innovative XR research and immersive technologies, strengthening its role in shaping the future of virtual production.
This project is supported by XR Network+, using funding from the Engineering and Physical Sciences Research Council (EPSRC) and UK Research and Innovation (UKRI). Led by XR Stories at the University of York, XR Network+ is conducted in collaboration with Cardiff University, the University of Edinburgh, Ulster University, and the University of the Arts London.

Photo credit: Sakina Beladioui (Enterprise and Events Assistant, the Generator)