Advances in medical care and technologies have dramatically increased the number of individuals who are minimally responsive to their world, and who have little to no ability to interact with others.
For example, when the mechanical ventilator was invented in the 1940s, it enabled the unprecedented survival of individuals whose traumas and pathologies would previously have been fatal.
As this life-saving technology became integrated into routine medical practice, new kinds of persons and new modes of life were inadvertently created: people whose physiological functions were maintained by machines but who showed no signs of responsiveness.
Other advances in medical technologies have led to the increased survival rate of premature infants, a generation of children with severe disabilities who are surviving into adulthood, and a generation of baby boomers surviving into old age. Many of these individuals have conditions (such as severe autism or cerebral palsy) or develop pathologies (such as dementia) that limit their ability to interact with others.
Not only are these individuals trapped in inner worlds that others cannot access; their lack of interaction leads others to stop perceiving them as persons, and eventually to deny them their place as full citizens of society.
At the Biosignal Interaction and Personhood Technology (BIAPT) Lab, we are working to develop technologies that augment and maintain the personhood of such non-responsive individuals.
One technology being developed in this lab is ‘biomusic’ – a system that translates physiological changes associated with emotion into musical output. Stress or strong emotion often produces idiosyncratic changes in a person’s physiological signals. For example, an individual experiencing fear might manifest this state physiologically with dilated pupils, cold and sweaty hands, shallow breathing and an increased heart rate.
Biomusic tracks changes specific to the individual that are associated with emotion, and outputs these changes through a process known as sonification – the use of sound to perceptualize data. This allows others to hear and ‘tune in’ to the physiological changes of another person, which gives them unique access to that individual’s emotional and mental state.
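The core idea of sonification can be illustrated with a toy sketch. The snippet below is my own minimal illustration, not the BIAPT lab’s actual pipeline: it assumes a single physiological signal (heart rate in beats per minute) and maps it linearly onto MIDI pitch numbers, so that rising arousal is heard as rising pitch. The function name, parameter ranges and note range are all illustrative choices.

```python
# Minimal sonification sketch (illustrative only): map a stream of
# heart-rate samples onto MIDI note numbers so that rising arousal
# is heard as rising pitch.

def sonify_heart_rate(samples_bpm, lo_bpm=50, hi_bpm=120,
                      lo_note=48, hi_note=84):
    """Linearly rescale heart-rate samples (beats per minute) into
    MIDI note numbers in [lo_note, hi_note], clamping outliers."""
    notes = []
    for bpm in samples_bpm:
        frac = (bpm - lo_bpm) / (hi_bpm - lo_bpm)
        frac = min(1.0, max(0.0, frac))   # clamp to [0, 1]
        notes.append(round(lo_note + frac * (hi_note - lo_note)))
    return notes

# A calm-to-stressed trace: the melody climbs as heart rate climbs.
print(sonify_heart_rate([60, 75, 90, 110]))
```

A real system would of course track several signals at once (skin conductance, respiration, temperature) and map them to richer musical parameters such as tempo, timbre and volume, but the principle is the same: perceptualizing data as sound.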
Biomusic technology was first tested with children with profound and multiple disabilities who resided in a long-term care unit at a pediatric rehabilitation hospital. Caregivers were asked to interact with the children in their usual manner while the children’s biomusic played in the background.
‘OK, so this is a person’
After four sessions with biomusic, caregivers not only noticed changes in the music associated with their interactions, they reported changing the way they interacted and their perceptions of the children as a result of the technology. A professional caregiver at the hospital said that caregivers “often don’t look at the person, and just do what we have to do”. Biomusic made her “step back and think ‘OK, so this is a person’”.
Currently, the BIAPT lab is actively working on improving both the physiological signal-processing capabilities and the sonification parameters of this technology. Our goal is to integrate machine learning algorithms into the software to detect user-specific emotions and to use these results to modulate the biomusical output.
However, training these algorithms requires the user to self-report their emotional state, which, by definition, non-communicative persons cannot do. To circumvent this problem, we are classifying the various patterns of emotion-related physiological responses in the general population.
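As a rough sketch of what classifying population-level emotion responses might look like, consider the toy stand-in below. It is my own simplified illustration, not the lab’s algorithm: it learns one centroid of physiological features per self-reported emotion from volunteer data, then labels a new sample by its nearest centroid. The feature choices (heart rate, skin conductance) and all values are hypothetical.

```python
# Toy population-level emotion classifier (illustrative stand-in):
# learn per-emotion centroids from self-reporting volunteers, then
# label new physiological samples by nearest centroid.

def train_centroids(samples):
    """samples: list of (feature_vector, emotion_label) pairs.
    Returns a dict mapping each label to its mean feature vector."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(features, centroids):
    """Return the label whose centroid is closest (squared Euclidean)."""
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda label: dist2(centroids[label]))

# Hypothetical features: (heart rate in bpm, skin conductance in uS).
training = [
    ((62, 2.1), "calm"), ((58, 1.8), "calm"),
    ((105, 8.5), "fear"), ((112, 9.0), "fear"),
]
centroids = train_centroids(training)
print(classify((100, 7.5), centroids))
```

Once such population-level patterns are established, a new non-communicative user’s signals can be matched against them without requiring that user to self-report, which is the workaround the paragraph above describes.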
The Montreal Science Center is host to Mes Emotions Sont à Fleur de Peau, a five-year permanent exhibit which presents visitors with emotionally laden videos while their physiological signals are recorded. The visitors then re-watch the video with their biomusic overlaying the soundtrack, and their physiological reactions (along with their self-reported emotions) are sent to the BIAPT lab.
The BIAPT lab is also partnering with the ultimate users of its technology to develop, design and tailor the physiological sonifications. Through a series of participatory workshops, individuals with autism, dementia and severe and profound disabilities – along with their families, teachers and other stakeholders – partner with auditory design experts to customize how biomusic sounds.
The ultimate users of the technology want to apply biomusic in a wide range of situations, from self-monitoring to entertainment, and they differ in their musical preferences and listening environments.
Thus, the types of sounds that each user both requires and enjoys listening to are extremely different. This participatory design process ensures that the technology that is ultimately delivered to the users and their caregivers is relevant, pleasant to listen to, and fulfils the specific constraints of each individual’s unique situation.
Biomusic is part of a second wave of medical technologies that aim to augment and sustain the quality of life of individuals, as opposed to saving and prolonging their quantity of life.
The research at the BIAPT lab aims to develop assistive technologies that enable us to tune in to the inner worlds of those with minimal communicative abilities. In doing so, we hope to be able to preserve and maintain the personhood of some of the most vulnerable members of our society.