MICA Lab


Join the Lab!

The Music Interaction and Computational Arts (MICA) Lab is an inclusive research group that develops and evaluates novel technologies for creative expression. The lab is located in The Bridge: a cutting-edge creative technology facility funded by the Arts and Humanities Research Council (AHRC) and the West of England Combined Authority (WECA). We are currently recruiting a Research Fellow in Digital Music Interaction and have five funded PhD studentships, with further posts available in 2025. We are always on the lookout for new projects, collaborations and opportunities. Please contact tom.mitchell@uwe.ac.uk.

An image with the text 'a new outside-in approach' and an illustration of a hand interacting with an unusual-looking interface via a wearable sensor, with arrows indicating motion

Research Fellow in Digital Music Interaction

With the UKRI-funded Outside Interactions Project

Apply Here!

HTML Version

Large Print (.doc, .pdf)

Image showing an arrangement of acoustic and digital musical instruments

5 Funded PhD Studentships

Recruiting for September 2024

Submit an Expression of Interest

UKRI Future Leaders Fellowship: Outside Interactions

Video Audio Description:

The Outside Interactions project is supported by a UKRI Future Leaders Fellowship and will explore a novel approach to human-computer interaction along with new methods for co-designing digital musical instruments (DMIs) with non-technical and physically impaired musicians. The project takes a disruptive and inclusive approach that will reshape the fundamental practice of DMI design, inviting broader participation in the development of new musical instruments and future visions of musicianship.

Motion capture and gestural interaction technologies have shifted the locus of interaction away from physical objects and onto the body, realising human movement as the interface for performing music. Yet the development of musical objects and gestural music interaction are often considered and pursued separately. When it focuses on musical objects, instrument development is often framed as a technical challenge that assumes normative players and overlooks the diversity of musicians, audiences, and their environments. Gestural music interaction has shown great potential for music performance and accessibility; however, it neglects the significant role of tactile feedback in music training and the development of virtuosic performance.

The Outside Interactions project will take a holistic approach, coupling gestural interaction and physical prototyping. Departing from the conventional practice of embedding sensing technologies within an instrument, the technology is relocated onto the body, sensing player interactions using wearable devices on the wrist and hand. This radical switch in both technical and design approach opens innovative new research directions for low-cost, rapidly produced musical instruments, which can be designed to have any shape, scale or structure. Crucially, it provides new ways for people to participate in the rapid co-design of novel, customised musical instruments that are tailored to their unique artistic identity and access requirements.
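
As an illustrative aside, the following minimal Python sketch shows what this body-worn sensing arrangement might look like in code. It is not the project's actual software: it assumes a hypothetical wrist-worn IMU streaming unit-quaternion orientation as Open Sound Control (OSC) messages to the address /imu/quaternion on UDP port 9000 (both made up for this example), and uses the python-osc library to map the wrist's roll angle onto a pitch value.

    import math

    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def on_quaternion(address, w, x, y, z):
        # Wrist roll (rotation about the forearm axis) from the unit quaternion.
        roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
        # Map roll in [-pi, pi] onto a MIDI-style pitch range centred on 66.
        pitch = 66.0 + 18.0 * (roll / math.pi)
        print(f"{address}: roll={math.degrees(roll):6.1f} deg -> pitch={pitch:5.1f}")

    dispatcher = Dispatcher()
    dispatcher.map("/imu/quaternion", on_quaternion)  # hypothetical OSC address

    server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
    server.serve_forever()  # press Ctrl+C to stop

Because the sensing lives on the body rather than inside the instrument, the same handler could drive sound from interactions with any physical object, whatever its shape, scale or structure.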

People


PhD Students


Alumni


Supporters and Partners

UWE Bristol
x-io Technologies
UKRI
MiMU
Watershed
Billy and Andy's Music School
Drake Music
Pervasive Media Studio
AHRC
WECA

Publications

Child, L., Mitchell, T. and Ford, N. (2023) A systematic review of reverberation and accessibility for B/blind users in virtual environments, In: Proceedings of the AES 2023 International Conference on Spatial and Immersive Audio
Simmons, J., Bown, A., Bremner, P., McIntosh, V. and Mitchell, T. J. (2023) Hear here: Sonification as a design strategy for robot teleoperation using virtual reality, In: Proceedings of VAM-HRI: Virtual, Augmented, and Mixed-Reality for Human-Robot Interactions at HRI
Aynsley, H., Mitchell, T. J. and Meckin, D. (2023) Participatory conceptual design of accessible digital musical instruments using generative AI, In: Proceedings of New Interfaces for Musical Expression
Bremner, P., Mitchell, T. J. and McIntosh, V. (2022) The impact of data sonification in virtual reality robot teleoperation, In: Frontiers in Virtual Reality, 3, Article 904820
Renney, N., Renney, H., Mitchell, T. J. and Gaster, B. R. (2022) Studying how digital luthiers choose their tools, In: Proceedings of the CHI Conference on Human Factors in Computing Systems
Renney, H., Willemsen, S., Gaster, B. R. and Mitchell, T. J. (2022) HyperModels - A framework for GPU accelerated physical modelling sound synthesis, In: Proceedings of New Interfaces for Musical Expression
Renney, H., Gaster, B. and Mitchell, T. J. (2022) Survival of the synthesis—GPU accelerating evolutionary sound matching, In: Concurrency and Computation: Practice and Experience, 34(10), Article e6824
Bolarinwa, J., Eimontaite, I., Mitchell, T., Dogramadzi, S. and Caleb-Solly, P. (2021) Assessing the role of gaze tracking in optimizing humans-in-the-loop telerobotic operation using multimodal feedback, In: Frontiers in Robotics and AI, 8, Article 578596
Brown, D., Nash, C. and Mitchell, T. J. (2020) Was that me?: Exploring the effects of error in gestural digital musical instruments, In: Proceedings of the 15th International Conference on Audio Mostly
Mitchell, T. J., Jones, A. J., O’Connor, M. B., Wonnacott, M. D., Glowacki, D. R. and Hyde, J. (2020) Towards molecular musical instruments: Interactive sonifications of 17-alanine, graphene and carbon nanotubes, In: Proceedings of the 15th International Conference on Audio Mostly
Renney, H., Gaster, B. R. and Mitchell, T. J. (2020) There and back again: The practicality of GPU accelerated digital audio, In: Proceedings of the 20th New Interfaces for Musical Expression
Hunt, S. J., Mitchell, T. J. and Nash, C. P. (2020) Composing computer generated music, an observational study using IGME: The Interactive Generative Music Environment, In: Proceedings of NIME 2020
Hunt, S. J., Mitchell, T. J. and Nash, C. P. (2019) Automating algorithmic representations of musical structure using IGME: The Interactive Generative Music Environment, In: Innovation In Music 2019
Bolarinwa, J., Eimontaite, I., Dogramadzi, S., Mitchell, T. and Caleb-Solly, P. (2019) The use of different feedback modalities and verbal collaboration in tele-robotic assistance, In: Proceedings of the IEEE International Symposium on Robotic and Sensors Environments (ROSE)
Mitchell, T. J., Thom, J., Pountney, M. and Hyde, J. (2019) The alchemy of chaos: A sound art sonification of a year of Tourette’s episodes, In: Proceedings of the International Conference on Auditory Display
O'Connor, M. B., Bennie, S. J., Deeks, H. M., Jamieson-Binnie, A., Jones, A. J., Shannon, R. J., Walters, R., Mitchell, T. J., Mulholland, A. J. and Glowacki, D. R. (2019) Interactive molecular dynamics in virtual reality from quantum chemistry to drug binding: An open-source multi-person framework, In: Journal of Chemical Physics, 150(22), ISSN: 1089-7690
Renney, H., Gaster, B. R. and Mitchell, T. (2019) OpenCL vs: Accelerated finite-difference digital synthesis, In: Proceedings of the International Workshop on OpenCL, DOI: 10.1145/3318170.3318172
Brown, D., Nash, C. and Mitchell, T. (2018) Simple mappings, expressive movement: A qualitative investigation into the end-user mapping design of experienced mid-air musicians, In: Digital Creativity, 29(2-3), ISSN: 1462-6268, DOI: 10.1080/14626268.2018.1510841
Gaster, B. R., Renney, N. and Mitchell, T. (2018) Outside the block syndicate: Translating Faust's algebra of blocks to the arrows framework, In: International Faust Conference
Brown, D., Nash, C. and Mitchell, T. (2018) Understanding user-defined mapping design in mid-air musical performance, In: International Conference on Movement Computing, DOI: 10.1145/3212721.3212810
Arbon, R. E., Jones, A. J., Bratholm, L. A., Mitchell, T. and Glowacki, D. R. (2018) Sonifying stochastic walks on biomolecular energy landscapes, In: International Conference on Auditory Display
Hunt, S., Mitchell, T. and Nash, C. (2018) A cognitive dimensions approach for the design of an interactive generative score editor, In: International Conference on Technologies for Music Notation and Representation
Renney, N., Gaster, B. and Mitchell, T. (2018) Return to temperament (in digital systems), In: Audio Mostly
Hunt, S., Mitchell, T. and Nash, C. (2017) Thoughts on interactive generative music composition, In: Computer Simulation of Musical Creativity
Brown, D., Nash, C. and Mitchell, T. (2017) A user experience review of music interaction evaluations, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1176286
Hunt, S., Mitchell, T. and Nash, C. (2017) How can music visualisation techniques reveal different perspectives on musical structure?, In: International Conference on Technologies for Music Notation and Representation
van den Berg, C., Heap, I., Stark, A. and Mitchell, T. (2017) Expressive gestural personality, In: Push Turn Move: Interface Design in Electronic Music, ISBN: 978-87-9999995-0-7
Brown, D., Nash, C. and Mitchell, T. (2016) GestureChords: Transparency in gesturally controlled digital musical instruments through iconicity and conceptual metaphor, In: International Conference on Sound and Music Computing, DOI: 10.5281/zenodo.851193
Brown, D., Renney, N., Stark, A., Nash, C. and Mitchell, T. (2016) Leimu: Gloveless music interaction using a wrist mounted leap motion, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1176000
Mitchell, T., Bennett, P., Tew, P., Davies, E. and Madgwick, S. (2016) Tangible interfaces for interactive evolutionary computation, In: ACM Conference on Human Factors in Computing Systems, Extended Abstracts, DOI: 10.1145/2851581.2892405
Davies, E., Tew, P., Glowacki, D., Smith, J. and Mitchell, T. (2016) Evolving atomic aesthetics and dynamics, In: International Conference on Evolutionary and Biologically Inspired Music, Sound, Art and Design (Nominated for best paper)
Mitchell, T., Hyde, J., Tew, P. and Glowacki, D. (2016) danceroom Spectroscopy: At the frontiers of physics, performance, interactive art and technology, In: Leonardo, 49(2), DOI: 10.1162/LEON_a_00924 (Cover article)
Madgwick, S. O. H., Mitchell, T. J., Barreto, C. and Freed, A. (2015) Simple synchronisation for open sound control, In: International Computer Music Conference (Nominated for best paper)
Hyde, J. I., Mitchell, T. J., Tew, P. and Glowacki, D. R. (2014) Molecular music: Repurposing a mixed quantum-classical atomic dynamics model as an audiovisual instrument, In: Generative Art Conference
Rutter, E. K., Nash, C. and Mitchell, T. J. (2014) Turnector: Tangible control widgets for capacitive touchscreen devices, In: International Computer Music Conference (ICMC) and the 11th Sound and Music Computing Conference
Mitchell, T. J., Madgwick, S., Rankine, S., Hilton, G., Freed, A. and Nix, A. (2014) Making the most of Wi-Fi: Optimisations for robust wireless live music performance, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1178875
Place, A., Lacey, L. and Mitchell, T. (2014) AlphaSphere: From prototype to product, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1178903
Serafin, S., Trento, S., Grani, F., Perner-Wilson, H., Madgwick, S. and Mitchell, T. J. (2014) Controlling physically based virtual musical instruments using the gloves, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1178937
Glowacki, D. R., O'Connor, M., Calabro, G., Price, J., Tew, P., Mitchell, T., Hyde, J., Tew, D., Coughtrie, D. J. and McIntosh-Smith, S. (2014) A GPU-accelerated immersive audiovisual framework for interaction with molecular dynamics using consumer depth sensors, In: Faraday Discussions, 169, ISSN: 1359-6640, DOI: 10.1039/C4FD00008K
Glowacki, D., Tew, P., Hyde, J., Kriefman, L., Mitchell, T., Price, J. and McIntosh-Smith, S. (2013) Using human energy fields to sculpt real-time molecular dynamics, In: Molecular Aesthetics, MIT Press, ISBN: 9780262018784
Madgwick, S. and Mitchell, T. J. (2013) x-OSC: A versatile wireless I/O device for creative/music applications, In: International Conference on Sound and Music Computing, DOI: 10.5281/zenodo.850439
Place, A., Lacey, L. and Mitchell, T. (2013) AlphaSphere, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1178642
Glowacki, D., Tew, P., Mitchell, T. J., Price, J. and McIntosh-Smith, S. (2012) danceroom Spectroscopy: Interactive quantum molecular dynamics accelerated on GPU architectures using OpenCL, In: UK Many-Core Developer Conference
Mitchell, T. J. (2012) Automated evolutionary synthesis matching: Advanced evolutionary algorithms for difficult sound matching problems, In: Soft Computing, 16(12), DOI: 10.1007/s00500-012-0873-x
Mitchell, T. J., Madgwick, S. and Heap, I. (2012) Musical interaction with hand posture and orientation: A toolbox of gestural control mechanisms, In: International Conference on New Interfaces for Musical Expression, DOI: 10.5281/zenodo.1178111
Mitchell, T. J. and Heap, I. (2011) SoundGrasp: A gestural interface for the performance of live music, In: International Conference on New Interfaces for Musical Expression
Mitchell, T. J. (2010) An exploration of evolutionary computation applied to frequency modulation audio synthesis parameter optimisation, PhD thesis, University of the West of England
Mitchell, T. J. and Creasey, D. P. (2007) Evolutionary sound matching: A test methodology and comparative study, In: Proceedings of the International Conference on Machine Learning and Applications, DOI: 10.1109/ICMLA.2007.34
Mitchell, T. J. and Pipe, A. G. (2006) A comparison of evolution-strategy based methods for frequency modulated musical tone timbre matching, In: Proceedings of the International Conference on Adaptive Computing in Design and Manufacture, Institute for People-centred Computation (IP-CC), ISBN: 978-0955288500
Mitchell, T. J. and Sullivan, J. C. W. (2005) Frequency modulation tone matching using a fuzzy clustering evolution strategy, In: Proceedings of the Audio Engineering Society 118th Convention, AES
Mitchell, T. J. and Pipe, A. G. (2005) Convergence synthesis of dynamic frequency modulation tones using an evolution strategy, In: Applications of Evolutionary Computing, EvoMUSART 2005, Lecture Notes in Computer Science, vol 3449, DOI: 10.1007/978-3-540-32003-6_54
Contact

MICA Lab, UWE Bristol
Frenchay Campus, Coldharbour Lane
Stoke Gifford, Bristol BS16 1QY, UK

tom.mitchell@uwe.ac.uk