Visual Interactive and Sound Technology in Archaeology
One Day Symposium
Sir George Buckley Lecture Theatre, CSLG01, University of Huddersfield, Huddersfield, West Yorkshire, UK, HD1 3DH.
Tuesday 16th June 2015, 10am – 5pm
This symposium brings together researchers working in digital modelling and reconstruction, app development, acoustic modelling, interactive design, audio-visual applications, and multimedia, exploring their relationships to archaeology, heritage science, the cultural industries and museums.
European Music Archaeology Project and British Audio-Visual Research Network
Free tickets at http://tinyurl.com/ovcc7db
The increasing impact of digital survey across Historic England & English Heritage
Paul Bryan, Geospatial Imaging Manager, Historic England; Joe Savage, Interpretation Officer, English Heritage
This presentation will highlight the increasing impact of digital survey technologies and methodologies across the work of Historic England and English Heritage. Alongside a summary of the technologies employed, their variety of application will be illustrated through a number of case studies, including the use of laser-scan data at Stonehenge, Harmondsworth Barn and West Kennet Long Barrow, and current work at Tintagel Castle using drone-acquired imagery and Structure-from-Motion (SfM) photogrammetry. As will be shown, such digital survey work has multiple archaeological, conservation and presentational applications, including interpretation and analysis, interactive app development, and exhibition displays.
Professional Commercial Archaeological Digital Visualisation
Marcus Abbott, York Archaeological Trust
Marcus Abbott presents an insider’s view of the world of professional commercial archaeological modelling. He discusses and illustrates recent work by York Archaeological Trust.
Acoustic and Interactive Modelling in the European Music Archaeology Project
Rupert Till, Casto Vocal, University of Huddersfield
This presentation discusses the Soundgate, a 180-degree projection screen using 4 widescreen HD projectors to create an immersive visitor experience for the EMAP travelling exhibition. Footage will be filmed live at archaeological sites with digital cameras, as well as further live film using anamorphic and fisheye lenses and a 4K Red camera, to generate a form of digital cinerama. Digital modelling will also be used to generate cinematic reconstructions in the same format. Acoustic modelling, sound archaeology, archaeoacoustics and music archaeology will be used to create a soundtrack focused on reconstructions of ancient musical instruments being played in their original historic acoustic contexts. A digital film with soundtrack, along with interactive online, PC, tablet and smartphone versions, will all be created.
Visualisation and Auralization: Exploring Digital Lived Experience in Late Medieval Buildings
Catriona Cooper, Archaeological Computing Research Group, University of Southampton
Digital techniques in archaeological computing can offer new routes to approaching human experience. Catriona Cooper presents two case studies that demonstrate alternative and complementary techniques to explore the notion and implementation of a digital “lived experience” of late medieval buildings. A study based at Bodiam Castle uses digital visualisation to explore the lived experience of the private apartments. A second case study presents an assessment of a series of auralizations of Ightham Mote, comparing recorded and modelled acoustics with reference to both human responses and numerical parameters, and concludes by combining the two approaches.
Intersensoriality, Indeterminacy, Experimental Sound Design and Archaeological Interpretation
Claire Marshall, Plateau Imprints
This presentation focuses on the VR reconstruction of Ribchester Roman Fort in Lancashire, amongst the first full 4D and multi-sensorial interpretations of a site in the UK. It foregrounds the intersensoriality and indeterminacy of ‘material culture’ in general, and of public heritage interpretation and experience in particular.
Use of Oculus Rift for VR-Auralisation
Alex Southern, Royal Society Industry Research Fellow, University of York
This is part of an ongoing project that makes use of Oculus Rift virtual reality technology to deliver immersive, interactive, 3D audio-visual experiences of a theatrical performance venue. Software has been developed to integrate the Oculus Rift (for 3D visualisation) and Max/MSP (for 3D headphone auralisation) to view recorded performances, and users are able to interactively select their preferred seating location. The user is also able to freely move their head and look around the venue, adding to the sense of immersion and resulting in a multi-modal experience.
Lost and Found Sound in the Vale of Pickering: Exploring the sonic properties of an Early Holocene landscape through sound art
Ben Elliott, Department of Archaeology, Jon Hughes, Department of Music, University of York
The Sonic Horizons of the Mesolithic project sought to apply new developments in contemporary landscape-based sound art composition to the palaeoenvironmental and archaeological data available for the landscape around Star Carr. Resulting in a series of ambisonic sound installation events across Yorkshire during the summer of 2013, the project explicitly explored the sonic environment within a context that defies traditional archaeoacoustic approaches through its lack of archaeologically definable internal spaces. A landscape approach was key here, and this paper will reflect on the implications that this may hold for future considerations of sound in the deep past.