We’ve now integrated mobile technologies into our lives to the extent that it’s easy to forget what watershed events the introductions of the iPhone and, later, wearable devices were. That the content we consume through these devices has put us in a bubble, tempting us to doom-scroll without end while too much of our activity is tracked, is one side of the story. On the other hand, these devices have also put a host of instrumentation at our fingertips, which, if we actively seek it out, we can use to unprecedented creative advantage.
VR and, perhaps more appropriately, AR (augmented reality) are two examples: the motion sensors in our phones allow us to step almost seamlessly into another world. But phones and wearables contain a large array of sensors, and can tap into an even larger range of contextual information, without us, as users, necessarily having to request it explicitly.
The iPhone was released in June 2007. In late 2008, one of last.fm’s co-founders started RJDJ, which developed a musical genre it called reactive music, a close cousin to generative and interactive music. Reactive music is affected by events occurring in the real life of the listener, monitored through the sensors in the device the software is installed on: a digital music player, a phone, or, more recently, perhaps a watch.
RJDJ folded in 2013, though a leftover app, Hear, is still downloadable from the Apple App Store. It requires wired headphones, a testament to the app’s age, Apple having led the pack in removing the headphone jack from its phones years ago.
Creating the embodied experience
Around ten years ago, approaching a similar context from the perspective of artistic performance, the Finn Ove Holmqvist wanted to find ways to make live electronic music performances more intuitive and expressive, while allowing the artists themselves to enjoy the event too.
Much earlier, in the early 1990s, he had imagined his own biomechanical movements generating sound, perhaps even music, and audiences influencing the outcome of a live performance. Then, in a literal series of dreams, he solved the challenges around this kind of physical computing, convincing himself it could be done, and making it his life’s mission.
To design a groundbreaking paradigm and help build the future of music, Holmqvist realised he needed to understand music’s origins. What are the origins of musicality?
The linguist Steven Pinker posits that music is an evolutionary byproduct, mostly of language, serving no biological function; he characterises it as a kind of auditory cheesecake: sugary and fatty, thus satisfying, but without purpose. An alternative theory, evolutionary musicology, posited by Darwin, puts music in a more central position in the development of our species. Holmqvist, considering our species a musical one, rejects Pinker and goes further than Darwin: music is a social tool that connects individuals within society.
Even so, today we mostly use music for mood regulation and to isolate ourselves.
Holmqvist wanted to expand this limiting application of music, and started to develop a musical system based on biometrics. He expanded it to include movement, to influence the processed signals, and to make performances relatable to an audience. Support for a range of wearables was added, augmented with measurements that provide context, like location and weather data.
Providing a platform for social musical communication remained the ultimate goal.
Holmqvist: “Biomechanical patterns are similar to musical patterns and this is not a coincidence. Evolutionary theory states that the vertebrate brain evolved to govern movement, to convert sensory patterns to motion patterns. I needed to convert motion patterns to musical patterns, which the brain would then convert back to movement in a closed biofeedback loop.”
Through trial and error, Holmqvist and his partners found that working with feedback loops made the experience embodied, a kind of ‘being in the moment’.
“Structural coupling with real-world data makes the experience personal and meaningful, introducing a physicality that provides for exploration and improvisation.” For Holmqvist, this was poetry.
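As an illustration of such a loop, a toy motion-to-music mapping might look like the sketch below. The function, ranges, and smoothing factor are invented for illustration and are not Holonic’s actual algorithm: accelerometer readings are reduced to a movement-energy signal, smoothed, and mapped to tempo and pitch, whose changes the listener then hears and answers with further movement.

```python
import math

def motion_to_music(accel_xyz, prev_energy, smoothing=0.9):
    """Map one accelerometer sample (in g) to musical parameters.

    A hypothetical sketch of a motion-to-music mapping: movement
    energy is smoothed into a slowly varying control signal, which
    then drives tempo and pitch.
    """
    x, y, z = accel_xyz
    # Movement energy: deviation of the acceleration magnitude from 1 g (rest).
    energy = abs(math.sqrt(x * x + y * y + z * z) - 1.0)
    # Exponential smoothing keeps the control signal musical rather than jittery.
    smoothed = smoothing * prev_energy + (1.0 - smoothing) * energy
    # More movement -> faster tempo (60-180 BPM) and a higher MIDI note (48-84).
    tempo_bpm = 60.0 + min(smoothed, 1.0) * 120.0
    midi_note = 48 + round(min(smoothed, 1.0) * 36)
    return smoothed, tempo_bpm, midi_note

# At rest (phone lying flat, ~1 g on the z-axis) the energy stays near zero,
# so the mapping settles at its calmest values.
state = 0.0
state, tempo, note = motion_to_music((0.0, 0.0, 1.0), state)
```

Feeding the returned state back into the next call is what closes the loop: sustained stillness decays toward the slow, low end, sustained movement pushes toward the fast, high end.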
Creating reactive music
Holmqvist is CEO of Holonic Systems, which pursues a future of music that is “participatory, contextual and embodied”. Two mobile apps they manage are Holon (free) and Holon.ist (paid), both available on iOS.
Holon is something of a simplified showcase of Holon.ist; both create reactive music based on a range of environmental inputs, with Holon.ist in particular able to handle a large range of them. Holon is an easy-to-use standalone application that generates audio, music, based on your movement, environmental factors, and input from third-party devices. Holon.ist, which allows for broad configuration of inputs, requires an external piece of software to actually generate musical output. For the latter, Holmqvist recommends miRack (free on desktop, paid on iOS), which requires ‘patches’ to map Holon.ist’s data to sound. If you’re not familiar with mixing studios, this makes for a daunting, somewhat confusing, and complex process to comprehend and master, in contrast with the ease of use of Holon.
But, it’s Holon.ist that allows for vast experimentation, so, particularly for walking artists, this is the platform to experiment with.
To use miRack, make sure that either the mobile or the desktop version is on the same network as the device or devices running Holmqvist’s apps. Connecting the tools is automated and, mostly, ‘just works’.
A wealth of ‘patches’ (which connect inputs to outputs) is available at patchstorage.com, specifically on Holon’s user page there.
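Under the hood, patches in modular environments like miRack (a mobile port of VCV Rack) conventionally represent pitch as a control voltage at 1 volt per octave, so a sensor stream mapped onto a voltage range controls frequency exponentially. A minimal sketch, assuming a normalized sensor value and an illustrative four-octave range (the scaling below is not Holon.ist’s actual mapping):

```python
def volts_to_hz(volts, base_hz=261.63):
    """Convert a control voltage to frequency using the 1 V/octave
    convention of modular environments such as miRack/VCV Rack,
    with 0 V tuned to middle C (~261.63 Hz)."""
    return base_hz * 2.0 ** volts

def sensor_to_volts(value, lo=0.0, hi=1.0, octaves=4.0):
    """Map a sensor reading in [lo, hi] onto a 0..octaves volt range.
    The range and scaling here are illustrative."""
    norm = (value - lo) / (hi - lo)
    norm = max(0.0, min(1.0, norm))  # clamp out-of-range readings
    return norm * octaves

# A half-scale sensor value lands two octaves above middle C.
hz = volts_to_hz(sensor_to_volts(0.5))
```

The exponential curve is why small sensor changes stay musically meaningful: each added volt doubles the frequency, matching how we perceive pitch in octaves.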
What sets Holon and Holon.ist apart from the aforementioned RJDJ is that RJDJ accepted environmental audio as an input, which, as Holmqvist points out, results in a kind of alchemical approach to the creation of music (for more on this, see Luciano Chessa’s book on Luigi Russolo). The sensory replacement of Holon is more synthetic: a conversion, or translation, of the dynamic environment.
A major aspect of Holmqvist’s goals is to facilitate artistic performance. Multiple instances of his apps, multiple performers, can work in tandem, all processed in a single location, miRack, resulting in a cooperatively generated auditive experience. See the videos in this article for what this can look like.
Embodiment as research
Holon.ist has been used as a vehicle for research in a number of contexts. At a university level, Holon.ist’s mappings are used as an investigative tool to create ubiquitous music from smart city data. Imagine each urban environment creating its own unique soundscape as a direct consequence of the collective data it outputs.
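As a hypothetical sketch of what such a data-to-soundscape mapping could look like (the input streams, ranges, and parameter choices below are invented for illustration, not taken from any actual smart-city deployment):

```python
def city_to_soundscape(traffic_density, noise_db, air_quality_index):
    """A hypothetical mapping from smart-city data streams to
    synthesis parameters; inputs and ranges are illustrative."""
    clamp = lambda v: max(0.0, min(1.0, v))
    # Denser traffic -> busier rhythm (30-240 rhythmic events per minute).
    events_per_min = 30 + clamp(traffic_density) * 210
    # Louder ambient noise (40-90 dB) -> darker timbre (8000 Hz down to 200 Hz).
    cutoff_hz = 8000 - clamp((noise_db - 40) / 50) * 7800
    # Worse air quality (AQI 0-300) -> more dissonance (0-50 cents of detune).
    detune_cents = clamp(air_quality_index / 300) * 50
    return events_per_min, cutoff_hz, detune_cents

# A moderately busy, moderately noisy district with mild pollution:
params = city_to_soundscape(traffic_density=0.5, noise_db=65, air_quality_index=60)
```

Each district fed through such a mapping would sound different because its data is different, which is the intuition behind the “unique soundscape” idea.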
When creating your own sounds, you might find it tempting to record your creations for posterity, or for sharing on SoundCloud. But, for Holmqvist, this is somewhat antithetical to his work: “Holon is about ephemeral, transient experiences that are driven by passive operation, it isn’t holding a phone in your hand and watching a screen.”
Using Holon puts you in the moment; watching a recording of a ‘performance’ does not.
That said, if you do want to make recordings, use iOS’s built-in screen recorder, and convert the video to audio-only afterwards. Also, particularly when recording raw data, an option within Holon.ist, you will have access to a much richer environment for content creation after the fact.
Holmqvist’s position makes sense: using Holonic’s products, you are not a consumer, you are the creator. Or, perhaps more fittingly, you are the performer. What you create, the music, is a result of your behaviour, and vice versa. And when using supported wearables like AirPods Pro or Bose AR equipment, situational awareness is retained and the music becomes a part of your environment.
Instead of you dancing to the beat, the beat moves to your dance.
Reactive music deepens the relation between you and your surroundings, nudging you to listen more intently to changes. It facilitates enactivism, where cognition arises through a dynamic interaction between an acting organism and its environment, and gives rise to stronger affordances: that which is made possible by a given object or environment.
An ecosystem on the rise
For now, the market Holonic operates in is small. This means there are opportunities, though perhaps limited commercial scope, and a risk of fierce competition for a small pie.
In gaming, Zombies, Run!, which uses adaptive audio as a feature, has been a success. In personal exercise, Weav Run is making waves, but is available in only a limited number of app stores.
Holmqvist recently started a user group on Facebook, to bring users together and allow them to share their work and experiences.
Collaborative creation has the potential to democratise how music evolves, by removing barriers, while also improving accessibility. But, Holmqvist is ambivalent about what these processes tend to output: “I often don’t hear results that contribute anything new, musically speaking. Why do people make music in the first place? To stand on a stage and express their emotions? We only react emotionally to music half of the time. The other half of the time, something else happens. We’re interested in this other half, and I think it’s eventually where the music of the future reconnects with its evolutionary origins.”