
Research & Development

Posted by James Shephard

BBC Research & Development's Render Engine Broadcasting and Public Service Augmented Reality teams have collaborated to develop a crime scene investigation game. Our prototype experience uses remote rendering to display content on both a TV and an augmented reality-capable mobile device. Through continued iteration and user testing, we hope it will help answer some key questions about audiences' acceptance of different quality trade-offs and feed into future work on delivering object-based media universally and at scale.

Different methods of delivering media to end-users come with different trade-offs. In interactive experiences like computer games or augmented reality, generating the rendered image on a computer in the user's home produces images with low latency: the rendered scene reacts almost immediately to the user's movement or button presses. Moving the computation outside the home means we could use more powerful computers to produce higher-quality images. The trade-off is that some delay is introduced between the user controlling the experience and the resulting image being displayed to them. By having an experience that we can test with users, we hope to gauge how much factors such as latency and rendered image quality affect the perceived quality of the experience.
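To make that trade-off concrete, the rough latency budget below compares the two approaches. These are purely illustrative round numbers, not measurements from our prototype, and the stages are a simplified view of a real pipeline.

# Purely illustrative motion-to-photon latency budgets (milliseconds) for
# local versus remote rendering. All figures are hypothetical round numbers,
# not measurements from this project.

local_ms = {
    "input sampling": 5,    # reading controller / touch input
    "render": 8,            # one frame on a modest local GPU
    "display scan-out": 8,  # getting the frame onto the panel
}

remote_ms = {
    "input sampling": 5,
    "uplink to data centre": 15,   # send input / device pose to the server
    "render": 4,                   # a faster data-centre GPU
    "video encode": 5,             # compress the rendered frame
    "downlink to device": 15,      # stream the frame back to the home
    "video decode": 5,             # hardware decode on the phone or TV
    "display scan-out": 8,
}

print(f"local rendering : ~{sum(local_ms.values())} ms")
print(f"remote rendering: ~{sum(remote_ms.values())} ms")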

If the future of media delivery, including interactive, highly personalised or rendered media, involves distributing rendering to computers outside users' homes, there are clear challenges around scale. For the BBC to offer this kind of experience to millions of people simultaneously, a naive implementation might require millions of powerful computers. Economies of scale will need to be found to deliver experiences like this realistically, so we're exploring the systems architectures and scaling opportunities that we might apply to future delivery systems.

A screen and keyboard connected to a very small device, with a video stream playing on the monitor.

A demonstration of remote streaming of a game from a server to a low-end device, part of the Render Engine Broadcasting team’s previous research into delivering interactive experiences at scale.

The collaboration between the two teams was driven partly by factors beyond a pure research agenda. We wanted researchers to work with different technologies and share skills, and to foster cross-team collaboration at a time when working from home restricted our informal interactions. This led to a set of use-cases driven by both teams' research goals. The Public Service Augmented Reality team's goals focus on AR and its place in public service and as a public good, while the Render Engine Broadcasting team is researching the practical future of delivering object-based media in a public service setting.

Aside from bringing their own research questions, both teams came open-minded as to the project's outcomes and deliverables. To help guide the design process, we ran a design sprint with sessions that allowed all team members to contribute ideas and grow a combined understanding of the kinds of use-cases that would be valuable. It soon became clear that a dual-screen experience, including AR, with high-fidelity visuals was a fitting choice.

To offer universal access to our experience, we needed to support lower-end devices that can't render high-fidelity 3D scenes themselves. For this reason, we chose to offload rendering to a more capable, remote machine in a data centre. Over the last three years, this remote rendering approach has also been a key part of the REB team's research. In the final experience, both the phone and the TV display rendered content that is streamed from remote servers.
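As a rough illustration of what remote rendering involves (not a description of our actual implementation, which is built on Unity's Render Streaming as described below), the sketch below shows the basic loop: the device sends input and pose to a server, which renders a frame, encodes it as video and streams it back. The render_frame and encode_video functions are hypothetical placeholders, and the transport here is a plain WebSocket rather than a production-grade, low-latency video pipeline.

# Minimal sketch of a remote-rendering loop over a WebSocket.
# Requires a recent version of the 'websockets' package (pip install websockets).

import asyncio
import websockets


def render_frame(user_input) -> bytes:
    # Placeholder: a real server would drive the game engine from this input.
    if isinstance(user_input, str):
        user_input = user_input.encode()
    return b"frame-for:" + user_input


def encode_video(frame: bytes) -> bytes:
    # Placeholder: a real server would hardware-encode the frame (e.g. H.264).
    return frame


async def serve_frames(websocket):
    """Data-centre side: receive input/pose, render, encode, stream back."""
    async for message in websocket:
        frame = render_frame(message)
        await websocket.send(encode_video(frame))


async def main():
    async with websockets.serve(serve_frames, "0.0.0.0", 8765):
        await asyncio.Future()  # run until cancelled


if __name__ == "__main__":
    asyncio.run(main())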

Because AR superimposes a rendered image over a real-world camera feed, it is very sensitive to latency in the rendering process: if the rendered image is presented too late, it won't match up with the image from the camera. This makes the rendered objects appear to lag or float in the world, which severely affects the feeling of immersion. By tackling this challenge head-on, we knew we'd be pushing the limits of what remote rendering could do, highlighting some key challenges that would be fertile ground for future work within the REB team.
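One general technique for hiding some of this latency, included here purely as an illustration rather than a description of our prototype, is to predict where the device will be by the time the remotely rendered frame is displayed, and render for that predicted pose. A minimal constant-velocity sketch:

import numpy as np


def predict_pose(position, velocity, yaw_deg, yaw_rate_deg, latency_s):
    """Extrapolate a device pose forward by the expected end-to-end latency.

    Constant-velocity prediction: the remote renderer draws the scene for
    where the device is expected to be when the frame is finally displayed,
    which reduces (but does not remove) the apparent lag of AR content.
    """
    predicted_position = position + velocity * latency_s
    predicted_yaw = yaw_deg + yaw_rate_deg * latency_s
    return predicted_position, predicted_yaw


# Example: a phone turning at 30 degrees per second with 80 ms of total latency
pos, yaw = predict_pose(
    position=np.array([0.0, 1.5, 0.0]),  # metres
    velocity=np.array([0.2, 0.0, 0.1]),  # metres per second
    yaw_deg=45.0,
    yaw_rate_deg=30.0,
    latency_s=0.08,
)
print(pos, yaw)  # the pose the server should render for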

A photograph of the Unity application, rendering a likeness of a human's head and shoulders.

Working with Unity’s Digital Human

We chose Unity, the industry-standard game engine, as the development platform. This allowed us to quickly prototype high-fidelity graphics and draw on the existing skills in the PSAR team. We also used the Render Streaming functionality supplied by an experimental Unity package to handle the communication between remote and local devices.

Once we had settled on the technical requirements and the broad shape of the experience, the creative team began crafting a narrative that would bring it to life. In the design sprint, we had decided on a theme inspired by the popular BBC One drama 'Line of Duty', which was in its final season at the time. The team worked together on the narrative, game design and video content to provide context and guide the user. We developed the game's flow through detailed storyboards, created 3D models of the evidence in Blender, and imported these into Unity.

Storyboard illustration of the game's narrative

Storyboarding the game narrative

The result is a crime scene investigation game in which players use their AR-capable mobile phone to view evidence in their living room as if it were a crime scene and they were the investigating officer. A live feed of the key suspect is streamed from the interrogation room to help the officer, and the suspect responds emotionally when evidence is picked up. Submitting the right evidence ensures the suspect is convicted; getting it wrong means the suspect gets off. In reality, the evidence and the suspect character are high-fidelity 3D models rendered on a data centre machine and streamed to the user's devices.

To answer some of our research questions, we carried out a round of user testing over Zoom with nine participants aged between 16 and 24. Each participant took part in a semi-structured interview split between 30 minutes using the experience and 30 minutes of follow-up questions. Each session broadly followed the same structure, but the three sessions with hearing-impaired participants focused more on the accessibility of the experience.

We processed our findings from the interviews into themes, and we'll share more when we finish our analysis. Overall though, we feel the experience and skills accumulated by our teams are as valuable as the user testing results.


