Mapping urban noise through neural radiance fields on AR and virtual production
Mag Gingrich, Oliver (ORCID: https://orcid.org/0000-0002-1656-0032), Watkins, Julie (ORCID: https://orcid.org/0000-0001-8872-7041), Flynn, Ryan and Spencer, George (2024) Mapping urban noise through neural radiance fields on AR and virtual production. In: London Conference in Critical Thought 2024, 28th-29th June 2024, University of Greenwich, London.
PDF (Conference Program): 49963 MAG GINGRICH_Mapping_Urban_Noise_Through_Neural_Radiance_Fields_On_AR_And_Virtual_Production_(CONFERENCE PROGRAM)_2024.pdf (1MB)
Abstract
The lived experience of urban noise occupies a liminal realm between conscious perception and disregard. This paper examines an approach to mapping this realm by visualising noise data from a site in the Royal Borough of Greenwich inside a 3D representation of the same location. The 3D scene is created using Neural Radiance Fields (NeRFs), an AI-based technique for creating a visual representation of a 3D scene from a collection of 2D images. NeRFs (Gao et al. 2022) have gained popularity as a method of 3D asset creation for Virtual Production filmmaking and streaming (Govaere 2023). The 2D images can be gathered with existing photography techniques and equipment, making the process accessible to creatives without the budget for specialist 3D scanning. The 3D scene and a visualisation of noise data are brought together in an immersive real-time environment at the Virtual Production facilities at the University of Greenwich. The practicality of NeRFs as a tool for synthesising scenes for augmented reality will be explored in small focus groups with researchers and practitioners: through the development of an augmented-reality virtual production prototype, concepts such as media hybridity, parallax and complex data visualisation will be explored. Researchers, public stakeholders and professionals in the area of noise will provide feedback on the value of data visualisations as a tool to map urban noise within paired immersive environments through the combination of AR and Virtual Production. This presentation will discuss research questions such as 'What role can immersive tech play in mapping complex data?' and 'Can embodied experience of complex data visualisation yield new insights into data analysis?' We will present a first prototype of a noise visualisation demonstrator and discuss insights from our focus groups, encouraging a discussion on mapping within the context of emerging technology.
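The abstract describes NeRFs only at a high level: a learned field that maps 3D positions to density and colour, rendered by integrating along camera rays. As a rough illustration of that idea, the sketch below implements the standard volume-rendering quadrature from the original NeRF work (Mildenhall et al. 2020). It is a hypothetical toy, not the authors' pipeline: `toy_field` is an analytic stand-in for the trained MLP, and a real NeRF would also condition on view direction and use positional encoding.

```python
import numpy as np

def toy_field(points):
    """Stand-in for a trained NeRF MLP: maps 3D points to (density, RGB).
    This is a hypothetical analytic field, used here only so the
    rendering step below has something to sample."""
    r = np.linalg.norm(points, axis=-1)
    sigma = np.clip(5.0 * (1.0 - r), 0.0, None)   # soft unit sphere of density
    rgb = 0.5 * (np.tanh(points) + 1.0)           # position-dependent colour in [0, 1]
    return sigma, rgb

def render_ray(origin, direction, near=0.0, far=4.0, n_samples=64):
    """Composite colour along one ray with NeRF-style alpha compositing."""
    t = np.linspace(near, far, n_samples)
    points = origin + t[:, None] * direction          # sample positions, (n_samples, 3)
    sigma, rgb = toy_field(points)
    delta = np.diff(t, append=t[-1] + (t[1] - t[0]))  # segment lengths
    alpha = 1.0 - np.exp(-sigma * delta)              # per-segment opacity
    # Transmittance: probability the ray reaches each sample unoccluded.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    weights = trans * alpha
    return (weights[:, None] * rgb).sum(axis=0)       # final RGB for this pixel

colour = render_ray(np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0]))
print(colour)
```

Repeating this per pixel yields an image from any novel viewpoint, which is what makes NeRF reconstructions usable as 3D assets in virtual production environments like the one described above.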
| Item Type | Conference or Conference Paper (Lecture) |
| --- | --- |
| Uncontrolled Keywords | participatory art, noise, VR, XR |
| Subjects | N Fine Arts > N Visual arts (General); N Fine Arts > NC Drawing, Design, Illustration; Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
| Faculty / School / Research Centre / Research Group | Faculty of Liberal Arts & Sciences; Faculty of Liberal Arts & Sciences > School of Design (DES) |
| Last Modified | 07 Mar 2025 14:58 |
| URI | http://gala.gre.ac.uk/id/eprint/49963 |