Research Assistant: Music Production and VR Visual Assets

London College of Music

Salary:   £29,099 to £42,073 per annum
Release Date:   Friday 26 July 2019
Closing Date:   Sunday 11 August 2019
Interview Date:   Thursday 29 August 2019
Reference:   LCM0104

London College of Music is part of the University of West London, a leading modern university specialising in the education and development of exceptional creative, business and service professionals. London College of Music is the largest specialist music and performing arts institution in the UK, delivering undergraduate and postgraduate degrees from BA and BMus through to PhD and DMus. Based at the Ealing campus, the College boasts exceptional facilities for performance and studio technology, and has a strong and vibrant REF-focussed research community with a number of active externally- and internally-funded research projects, outputs and collaborations.

UWL is part of a consortium funded by the Innovate UK ‘Audience of the Future’ call. The project is entitled Haptic Authoring Pipeline for the Production of Immersive Experiences (HAPPIE). This project aims to develop a generic piece of middleware that will facilitate the future integration of haptic-feedback hardware and software applications.
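As a purely illustrative sketch (the names and types below are hypothetical, invented for this advert rather than taken from the HAPPIE codebase), such middleware might expose a device-independent interface so that applications and haptic hardware can each be developed against a common abstraction:

    #include <string>

    // One haptic effect, expressed independently of any particular device.
    struct HapticEvent {
        float intensity;   // normalised strength, 0..1
        float durationMs;  // length of the effect in milliseconds
    };

    // Hardware drivers would implement this; applications see only the interface.
    class IHapticDevice {
    public:
        virtual ~IHapticDevice() = default;
        virtual std::string name() const = 0;
        virtual void play(const HapticEvent& event) = 0;
    };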

The UWL contribution to HAPPIE is one of a number of demonstrators: music production. This project defines the future of interacting with sound in mixed reality. In practice, any sound (think drumbeat or vocal melody) might be represented by a physical animated model superimposed on the user's field of view (which might be a recording studio and/or its software emulation). The user will be able to reach out and actually touch the shape and, through physics-based modelling, feel a natural weight and resistance, then sculpt the animation with their hands by stretching, twisting and so on. Not only will the shape change appearance with lifelike behaviour, but it will be intelligently mapped to sets of audio parameters using semantic relationships. Thus the user's actions will instantly and intuitively modify the sound, doing away with the need to iteratively adjust myriad controls, while the mixed-reality presentation still leaves them all visible and adjustable as required. Such manipulation might be applied to a sound in isolation or while it plays alongside many others; the control could also cover an entire music mix in stereo or even 3-D audio, and the system could equally be used in live performance.
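To make the shape-to-sound mapping concrete, the following minimal C++ sketch shows the idea (every name, descriptor and parameter range here is hypothetical and chosen purely for illustration, not drawn from the project): a single gesture dimension, such as how far the model has been stretched, is resolved through a semantic descriptor like "brightness" to a concrete audio-engine control.

    #include <iostream>
    #include <map>
    #include <string>

    // Deformation sampled from the user's hands each frame.
    struct ShapeDeformation {
        float stretch;  // 0..1
        float twist;    // 0..1
    };

    // A semantic descriptor resolved to one concrete audio-engine control.
    struct AudioParameter {
        std::string name;
        float minValue;
        float maxValue;
    };

    // Map a normalised gesture value linearly into a parameter's range.
    float mapRange(float t, float lo, float hi) { return lo + t * (hi - lo); }

    int main() {
        // Hypothetical semantic mapping: stretch -> "brightness", twist -> "warmth".
        std::map<std::string, AudioParameter> semanticMap = {
            {"brightness", {"filter_cutoff_hz", 200.0f, 12000.0f}},
            {"warmth",     {"low_shelf_gain_db", -6.0f, 6.0f}},
        };

        ShapeDeformation gesture{0.7f, 0.3f};  // e.g. a strong stretch, slight twist

        // One gesture updates several low-level controls at once: the
        // "instant and intuitive modification" described above.
        const AudioParameter& b = semanticMap["brightness"];
        const AudioParameter& w = semanticMap["warmth"];
        std::cout << b.name << " = " << mapRange(gesture.stretch, b.minValue, b.maxValue) << "\n"
                  << w.name << " = " << mapRange(gesture.twist, w.minValue, w.maxValue) << "\n";
    }

In a real system the semantic layer would be curated or learned rather than hard-coded, and a single descriptor would typically fan out to several parameters at once.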

The successful applicant will be invited to contribute to IP protection and academic article authorship, and funding is provided for a number of overseas conference visits.

The successful candidate will have good experience of 3-D modelling/animation and of developing virtual-reality environments (preferably with Unreal Engine). Experience of advanced (desktop) music production, and an understanding of how semantic-audio approaches apply to it, would be desirable. Experience of experimental design, of conducting participant-based studies and of analysing the resulting data would also be an advantage. Good C++ skills would help the candidate engage with the haptics at a more developmental level, and experience of working with 3-D audio would also be helpful.

To support your application, please attach your CV and cover letter.

Interviews will take place the week commencing 26 August 2019.

 

