Ullo – Livre d’Or (Golden book)

This application was conceived for a public exhibition, to replace the traditional paper guest book. Here, people sign their presence with a heart sensor and place their heart-driven signature wherever they like in the surrounding world via augmented reality, leaving their mark through their biosignals.

ARVatar – Cosmos AR

With ARVatar, you can step through an augmented reality portal that plunges you into a magical open world, where you can find flowers linked to your heartbeat for heart synchronization.

Multiplayer ambient version:

With the multiplayer version, you can join a peer and move your heart wherever you like in your environment. Relaxation and breathing are the watchwords of this application.

Through such demonstrations, we aim to bridge our various prototypes, exploring multiple modalities.

Flow – Control your heart

Flow is an Apple Watch application for relaxation, where you can do breathing exercises to control your heart. The app provides two breathing exercises: one where your heart controls the breathing cycles, and one where you pick a fixed breathing cycle duration. You get complete feedback on your use of the app, with dashboards for each day, week, month, or year; charts show your average amplitude, your average number of breathing cycles, and the time you spent in the app.
You can also find in the app an AMA section, with answers to the questions you may have about how it all works.

The application is available on the Apple Watch App Store.

CosmosVR – Virtual Reality Biofeedback Game

Cosmos VR is a shared experience where multiple users can complete quests together in a virtual reality world. They can observe biofeedback of their own heart-rate activity as well as that of others, represented in various ways. All quests are based on biofeedback: users have to regulate their heart-rate activity to accomplish them.

How does Cosmos VR work?

This project was made with Unity for the Oculus Quest, and it requires a smartwatch to get each user's heart rate. In the video presentation, the aim of the quest is to create an eclipse with the planets. Each player controls a planet with their heart rate: the higher the heart rate, the higher the planet. Moreover, the further apart the players' heart rates are, the further away the planets are from the sun. So, to accomplish this quest, each player's heart rate has to be around 70 BPM (beats per minute). To make this possible, players have to learn how to regulate their heart rate, for example through their breathing. This is why this virtual reality world contains ‘breathing crystals’ with changing luminosity: players can follow the crystals with their breathing to regulate their heart rate.
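
To make the mapping concrete, here is a minimal sketch of the eclipse quest's logic. The game itself is built in Unity; this TypeScript version is only illustrative, with a hypothetical getHeartRates() feed and arbitrary scaling constants:

```typescript
// Hypothetical feed of heart rates (BPM), one value per player,
// as streamed from the players' smartwatches.
declare function getHeartRates(): number[];

const TARGET_BPM = 70;

function updatePlanets(heartRates: number[]) {
  // Altitude: the higher a player's heart rate, the higher their planet.
  const altitudes = heartRates.map((bpm) => (bpm - TARGET_BPM) * 0.1);

  // Distance from the sun: grows with the spread between heart rates.
  const spread = Math.max(...heartRates) - Math.min(...heartRates);
  const distanceFromSun = 1 + spread * 0.05;

  // The eclipse happens when everyone hovers near the target together.
  const eclipse = altitudes.every((a) => Math.abs(a) < 0.2) && spread < 5;
  return { altitudes, distanceFromSun, eclipse };
}

setInterval(() => updatePlanets(getHeartRates()), 1000); // update every second
```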

One of the objectives of Cosmos VR is to teach users that we can quite simply regulate our heart rate, and how to regulate it, for example with our breathing.

Another objective is to use virtual reality to represent physiological data, with an ambient biofeedback for the user.

Why did this project appear?

This project is an extension of Cosmos: we wanted to use virtual reality so that users are fully immersed and can concentrate completely on taking control of their heart rate. In addition, players have a goal, quests to accomplish, so they really want to regulate themselves, which makes it easier.

Echo In Space – Multi Player Biofeedback Game

Echo in Space is a shared experience where two players control Echo, traveling through space, with their breathing. The objective is to catch as many hearts as possible and to avoid meteorites! The first player controls the position of Echo (inhale and Echo goes up, exhale and Echo goes down). Moreover, if there is a second player, they control the speed of Echo (the faster their breathing, the faster Echo moves).

This game was developed by a team of 5 (3 developers and 2 designers) in 3 days during a game jam. The programming language used is JavaScript, with PixiJS, an open-source library. Each player's breathing is captured by a sensor (a breathing belt) that sends data over Bluetooth.
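
As a rough idea of what this looks like in code, here is a minimal sketch assuming the PixiJS v7 API; getBreathLevel() and getBreathRate() are hypothetical readings from the Bluetooth breathing belts:

```typescript
import { Application, Container, Sprite } from "pixi.js";

// Hypothetical readings from the Bluetooth breathing belts:
declare function getBreathLevel(player: number): number; // 0 = exhaled, 1 = inhaled
declare function getBreathRate(player: number): number;  // breaths per minute

const app = new Application({ width: 800, height: 600 });
document.body.appendChild(app.view as HTMLCanvasElement);

const world = new Container(); // hearts and meteorites live here
app.stage.addChild(world);

const echo = Sprite.from("echo.png");
app.stage.addChild(echo); // Echo stays put; the world scrolls behind it

app.ticker.add((delta) => {
  // Player 1: breathing level drives Echo's vertical position.
  echo.y = (1 - getBreathLevel(1)) * app.screen.height;

  // Player 2 (optional): breathing rate drives the scrolling speed.
  const speed = 2 + getBreathRate(2) / 10;
  world.x -= speed * delta; // obstacles and hearts scroll past Echo
});
```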

Finally, the aim of this game is to make you aware of your breathing: that you can completely control it, and that it affects many things, such as your heart rate, your focus, and so on. Moreover, the players get real-time biofeedback of their breathing, and they can try to modulate it in a playful way in conjunction with a peer.

MusicEEG – Multiple User Brain Music Player

Have you ever dreamed about controlling music with your brain? MusicEEG is an online application that allows you to listen to music and add filters that can be controlled by the electrical activity of your brain. You can even experience the app with someone else, choosing which brain and which EEG frequency band controls which sound filter. A combined chart of the real-time measurements is displayed in the middle so you can monitor the signals.

This application was created for the UNESCO Sound Week 2021, and it was presented on June 7, 2021.

You can test this project here (only with browsers supporting Web BLE, such as Google Chrome and derivatives); you need at least one Muse headset.
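
For a flavor of what the plumbing can look like, here is a minimal sketch wiring one EEG band to one audio filter. It assumes the open-source muse-js library for the Web BLE connection and a hypothetical bandPower() estimator; the mapping constants are arbitrary:

```typescript
import { MuseClient } from "muse-js";

// Hypothetical band-power estimator (e.g. via an FFT over a sample window):
declare function bandPower(samples: number[], lowHz: number, highHz: number): number;

async function main() {
  // Route an <audio> element through a low-pass filter we can modulate.
  const audioCtx = new AudioContext();
  const filter = audioCtx.createBiquadFilter();
  filter.type = "lowpass";
  const source = audioCtx.createMediaElementSource(document.querySelector("audio")!);
  source.connect(filter).connect(audioCtx.destination);

  // Connect to the Muse headset over Web BLE and stream EEG readings.
  const muse = new MuseClient();
  await muse.connect();
  await muse.start();
  muse.eegReadings.subscribe((reading) => {
    // Map normalized alpha power (8-12 Hz) to a cutoff of 200 Hz - 5 kHz.
    const alpha = Math.min(bandPower(reading.samples, 8, 12), 1);
    filter.frequency.value = 200 + alpha * 4800;
  });
}
main();
```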

Since this demo, we have developed our own library to generate sounds and soundscapes; it is soon to be released.

Harmonia – Connect your hearts

Harmonia is an iOS application with an Apple Watch companion. The aim is to connect yourself with your friends or colleagues and achieve heart synchronization. Like a social network, you can add new friends, create communities, and edit your profile. You can also see dashboards about your app usage, your friends, or the time spent on the app. The app works with a system of sessions (with a friend or with a whole community), and during a session you can follow a breathing guide on your Apple Watch.

The app provides features like notifications and dark mode to personalize your user experience.

It is currently in closed beta; don’t hesitate to contact us if you want to try it out 🙂

Dinner of Lights: a projection system

The Dinner of Lights is a projection system that uses animated scenes surrounding the plate to improve patients' multisensory dining experience and to initiate adaptation to a healthy diet at their own pace. Offering a variety of sound and light stimuli anchors one in a sensory bubble that provides soothing, pleasure, and nourishment.

This system avoids stigmatizing and labeling the participant's appetite status and gently encourages him or her to eat. The visual themes can easily be adapted for children, adults, or seniors. It creates social interaction through animations that link people together through movement. It also reinforces the multisensory intensity of the meal (tactile, visual, auditory, olfactory, and gustatory senses) and allows a poetic moment of relaxation while eating.

Tabletop projection is already implemented in stimulation and care products, notably for the elderly with the interactive tables created by Tovertafel. In the field of food, the use of these technologies is mainly directed at the luxury market. Our wish is to make healthy eating more accessible and to encourage it.

Our main lines of research with this system concern undernutrition of the elderly, hospital meals, and eating disorders.

“The projection that was playing resonated with me, inspired great emotions, and it made it easier for me to eat that chocolate dessert.”
An eating disorder patient on July 21, 2021

In order to respect the confidentiality of the ongoing studies with medical partners, here is another example of our projection system, applied this time to an audience of children to immerse them in an imaginary world during the afternoon snack.

The idea was to create a device dedicated to children from 5 to 11 years old: a projection surrounding the meal, where dragons, each recognizable by a child, would grow as the meal progressed. The graphics had to be simple, attractive, and easily distinguishable during the projection. Each dragon had a particular colour so that it could easily be identified by the child as his or her own.

The dragons would smile, stick their tongues out, breathe fire to warm the child's plate, fly, and play with the other dragons around the children's plates. This type of animation allows real collaboration between children and encourages communication and trust by having the avatars representing them mingle and play together. This encourages empathy, identification, and conversation.

Coral

This project started as a(nother) collaboration with the Potioc research team and then-postdoc Joan Sol Roo. Through this project, we wanted to address some of the pitfalls related to tangible representations of physiological states.

At this point we had been working on the topic for approximately 8 years, creating, experimenting with, and teaching about devices using physiological signals, while exchanging with various researchers, designers, artists, private companies, enthusiasts, and lay users. We had started to have a pretty good idea of the various friction points. Among the main issues we encountered: building devices is hard, even more so when they should be used outside of the lab or given to novices. We started to work more on electronics because of that, for example relying more on embedded displays instead of spatial augmented reality, but we wanted to go one step further and explore a modular design that people could freely manipulate and customize.

We first started to wonder what basic “atoms” were necessary to recreate our past projects. Not so many, it appeared. Most projects boil down to a power source, a sensor, some processing, and an output, no more. Various outputs can be aggregated to give multi-modal feedback. Communication can be added, to send data to or receive data from another location, as with Breeze. Data can be recorded or replayed. Some special forms of processing can occur to fuse multiple sensors (e.g. extract an index of cardiac coherence) or to measure the synchrony between several persons, as with Cosmos. And that is it: we have our set of atoms, or bricks, that people can assemble in various ways to redo existing devices or create new ones. Going tangible always comes with a trade-off in terms of flexibility or freedom compared to digital or virtual (e.g. it is harder and more costly to duplicate an item), but it also brings invaluable features, with people more likely to manipulate, explore, and tinker with physical objects (there is more to that debate; for another place).
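
As a thought experiment, this decomposition can be sketched in a few lines of code. The names are hypothetical and the real bricks exchange analog signals rather than software values; this only illustrates the idea of single-purpose atoms chained into a device:

```typescript
type Signal = number; // one analog value, normalized to 0..1
type Atom = (input: Signal) => Signal; // each brick performs a single task

// Hypothetical hardware bindings (illustrative names only):
declare function readSensor(): Signal;           // e.g. a breathing belt
declare function setLedBrightness(v: Signal): void; // e.g. a light output

// A processing atom: exponential smoothing, kept as a closure for state.
function makeSmoother(alpha: number): Atom {
  let state = 0;
  return (x) => (state = alpha * x + (1 - alpha) * state);
}

// Assemble a device by chaining atoms: sensor -> processing -> output.
const atoms: Atom[] = [
  () => readSensor(),
  makeSmoother(0.1),
  (x) => { setLedBrightness(x); return x; },
];

// Run the chain at a fixed rate, like a signal flowing through the stack.
setInterval(() => atoms.reduce((signal, atom) => atom(signal), 0), 50);
```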

We are of course not the first to create a modular toolkit; many projects and products provide approaches to exploring electronics or computer science, and the archetypal example (also a direct inspiration) comes from Lego bricks themselves. However, we push this form factor into the realm of physiological computing. More importantly, we use the properties of such a modular design to address other issues pertaining to biofeedback applications: how to ensure that the resulting applications empower users and do not enslave them?

Throughout the project, we aimed at making it possible to interact with the artifacts under the same expectations of honest communication that hold between people, based on understanding, trust, and agency.

  • Understanding: Mutual comprehension implies a model of your interlocutor’s behavior and goals, and a communication protocol understandable by both parties. Objects should be explicable, a property that is facilitated when each one performs atomic and specific tasks.
  • Trust: To ensure trust and prevent artifacts from appearing as a threat, their behavior must be consistent and verifiable, and they should perform only explicitly accepted tasks for given objectives. Users should be able to doubt the inner workings of a device, overriding the code or the hardware if they wish to do so.
  • Agency: As the objective is to act towards desirable experiences, control and consent are important (and cannot happen without understanding and trust). Users should be capable of disabling undesired functionalities and of customizing how and to whom the information is presented. Objects should be simple and inexpensive enough that users can easily extend their behavior.

Coral (or “blobs”, “totems”, “physio-bricks”, “physio-stacks”… names were many) was created to implement those requirements. In particular, bricks were made:

  • Atomic: each brick should perform a single task.
  • Explicit: each task should be explicitly listed.
  • Explicable: the inner behavior of an element should be understandable.
  • Specific: each element's capabilities should be restricted to the desired behavior, making it unable to perform unexpected actions.
  • Doubtable: behaviors can be checked or overridden.
  • Extensible: new behaviors should be easily supported, providing forward compatibility.
  • Simple: as a means to achieve the previous items, simplicity should be prioritized.
  • Inexpensive: to support the creation of diverse, specific elements, each of them should be rather low cost.

For example, in order to keep the communication between atoms Simple and Explicable, we opted for analog communication. Because no additional metadata is shared outside a given atom unless explicitly stated, the design is Extensible, making it possible to create new atoms and functionality, similar to the approach used for modular audio synthesis. A side effect of analog communication is its sensitivity to noise: we accepted this trade-off, as we consider the gain in transparency worth it. It can be argued that this approach leads to a naturally degrading signal (i.e. a “biodegradable biofeedback”), ensuring that the data has a limited life span, and thus limiting the risk that it could leak outside its initial scope and application.

Going beyond, in order to explicitly inform users, we established labels to notify them of the types of “dangerous” actions the system is capable of performing. A standard iconography was chosen to represent basic functions (e.g. a floppy disk for storage, gears for processing, waves for wireless communication, …). We consider that, similar to the food labeling being enforced in some countries, users should be aware of the risks involved for their data when they engage with a device, and thus be able to make an informed decision. On par with our objectives, everything is meant to be open-source, from the code to the schematics to the instructions; we just have to populate the placeholder at https://ullolabs.github.io/physio-stacks.docs/.

Over the last two years [as of 2021] we have already performed several tests with Coral, on a small scale, demoing and presenting the bricks to researchers, designers, and children (more details in the Mobile HCI ’20 paper below). The project is also moving fast in terms of engineering, with the third iteration of the bricks now easier to use and to craft (3D printing, a soldering iron, basic electronic components). While the first designs were based on the idea of “stacking” bricks, the latest ones explore the 2D plane, more like building tangrams.

This tangible, modular approach enables the construction of physiological interfaces that can be used as a prototyping toolkit by designers and researchers, or as didactic tools by educators and pupils. We are actively working with our collaborators toward a version of the bricks that could be used in class, to combine the teaching of STEM-related disciplines with benevolent applications that favor interaction and bring well-being.

We are also very keen to interface the bricks with existing devices. Since the communication between bricks is analog, it is directly possible to interact with standard controllers such as the Microsoft Xbox Adaptive Controller (XAC), to play existing games with our physiological activity, or to interact with analog audio synthesis.

Our work was presented at the Mobile HCI ’20 conference; see the video below for a quick summary of the research:

Contributors

In the spirit of honest and transparent communication, here is the list of past and current contributors to the project (in a very rough order of appearance):

Joan Sol Roo: Discussion, Concept, Fabrication, Applications, Writing (v1)
Jérémy Frey: Discussion, Concept, Applications, Writing, Funding (v1, v2, v3)
Renaud Gervais: Early concept (v1)
Thibault Lainé: Discussion, Electronics and Fabrication considerations (v1)
Pierre-Antoine Cinquin: Discussion, Human and Social considerations (v1)
Martin Hachet: Project coordination, Funding (v1)
Alexis Gay: Scenarios (v1)
Rémy Ramadour: Electronics, Fabrication, Applications, Funding (v2, v3)
Thibault Roung: Electronics, Fabrication (v2, v3)
Brigitte Laurent: Applications, Scenarios (v2, v3)
Didier Roy: Scenarios (v2)
Emmanuel Page: Scenarios (v2)
Cassandra Dumas: Electronics, Fabrication (v3)
Laura Lalieve: Electronics, Fabrication (v3)
Sacha Benrabia: Electronics, Fabrication (v3)

Associated Publications

Joan Sol Roo, Renaud Gervais, Thibault Lainé, Pierre-Antoine Cinquin, Martin Hachet, Jérémy Frey. Physio-Stacks: Supporting Communication with Ourselves and Others via Tangible, Modular Physiological Devices. MobileHCI ’20: 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services, Oct 2020, Oldenburg / Virtual, Germany. pp. 1-12. ⟨10.1145/3379503.3403562⟩ ⟨hal-02958470⟩. PDF

Breeze

Breeze is a research project conducted in collaboration with the Magic Lab at Ben Gurion University (Israel). Breeze is a necklace pendant that captures breathing from one user while conveying their breathing patterns to a paired pendant worn by another user.

The seed of this project was planted during the development of Echo, when we started to envision scenarios of biofeedback applications involving multiple users. One form factor we considered as a follow-up to the Echo avatar was a wearable, which we could more easily bring with us and use in everyday life. In Breeze's first prototypes the feedback was only conveyed through lights, but over the course of the project we added two other modalities, vibrations and sounds. The rationale is to let users choose the feedback depending on the context of use, notably the social context. For example, a breathing biofeedback through sounds can be shared with people around, while vibrations can be perceived only by the person wearing the pendant. Early on, we also integrated sensing in the pendant, measuring breathing thanks to an inertial measurement unit. The final pendant is meant to increase connectedness by creating a new communication channel between relatives. It can also serve as a non-intrusive sensor that captures new breathing features correlated with emotions.
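
As an illustration of how breathing can be derived from an inertial measurement unit, here is a sketch with a hypothetical sample callback and arbitrary filter constants; the pendant's actual pipeline may differ:

```typescript
// Hypothetical stream of vertical acceleration samples (timestamped in ms):
declare function onAccelSample(cb: (az: number, tMs: number) => void): void;

let fast = 0;           // short-window average: follows chest motion
let slow = 0;           // long-window average: tracks gravity and drift
let previous = 0;       // previous band-passed value
let lastCycleStart = 0; // timestamp of the last detected breath onset

onAccelSample((az, tMs) => {
  fast = 0.1 * az + 0.9 * fast;
  slow = 0.01 * az + 0.99 * slow;
  const breath = fast - slow; // band-passed breathing estimate

  // A rising zero crossing marks the start of a new breath cycle.
  if (breath > 0 && previous <= 0) {
    if (lastCycleStart > 0) {
      const bpm = 60000 / (tMs - lastCycleStart);
      console.log(`~${bpm.toFixed(1)} breaths per minute`);
    }
    lastCycleStart = tMs;
  }
  previous = breath;
});
```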

The presentation of the paper at CHI ’18 was recorded in the video below. (If you have the patience to watch until the questions, to answer one that caught me off guard and has haunted me ever since: the fact that in the lexicon an increase in amplitude is not associated with a perceived increase in arousal can be explained by the fact that we tested each breathing feature separately, whereas in a typical fight-or-flight response breathing rate and amplitude most often increase at the same time.)

Something we did not describe in the published paper: Breeze contains an additional mode, the “compass”. What happens if someone removes the pendant from their neck? Because of the change in absolute orientation, we can detect such an event and signal to the wearer of the paired pendant that there is a good reason why the breathing biofeedback stopped. This partner can then hold their own pendant horizontally, and it now acts as a compass: the feedback changes not with the breathing rate of the partner, but with their location. The light changes when Breeze points toward the paired pendant, and both pendants vibrate when users are face to face… even thousands of kilometers apart. Your loved one becomes your North, a nice touch for those living apart. More than just a gimmick, this is actually another way to mediate communication through the pendant, another layer of interactivity so that users can choose what information they want to share. A mode that probably should become its own project.
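
The geometry behind such a compass mode is straightforward. Here is a sketch assuming each pendant knows its own GPS coordinates and magnetometer heading; all names and the 10-degree tolerance are illustrative:

```typescript
const toRad = (d: number) => (d * Math.PI) / 180;
const toDeg = (r: number) => (r * 180) / Math.PI;

// Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees.
function bearing(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const [p1, p2, dl] = [toRad(lat1), toRad(lat2), toRad(lon2 - lon1)];
  const y = Math.sin(dl) * Math.cos(p2);
  const x = Math.cos(p1) * Math.sin(p2) - Math.sin(p1) * Math.cos(p2) * Math.cos(dl);
  return (toDeg(Math.atan2(y, x)) + 360) % 360;
}

// Vibrate both pendants when this one points at its partner, within tolerance.
function pointsAtPartner(
  headingDeg: number,            // from the pendant's magnetometer
  me: [number, number],          // this pendant's latitude, longitude
  partner: [number, number],     // paired pendant's latitude, longitude
): boolean {
  const target = bearing(me[0], me[1], partner[0], partner[1]);
  const error = Math.abs(((headingDeg - target + 540) % 360) - 180);
  return error < 10; // within 10 degrees counts as "facing"
}
```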

In a lab study, we investigated to what extent users could understand and interpret various breathing patterns when they were conveyed through the pendant. We showed how people associate valence, arousal, and dominance with specific characteristics of breathing. We found, for instance, that shallow breaths are associated with low dominance and slow breathing with low arousal. We showed, for the first time, how features such as inhalations are associated with high arousal, and unveiled a “lexicon” of breathing features. This latter result is still overlooked in the HCI context in which the publication took place, but we believe that breathing patterns hold a huge potential to account for inner states. While most research in physiological computing only extracts a breathing rate, there is much more in terms of features (amplitude, but also pauses between breaths, the difference between inspiration time and exhalation time, etc.). Breathing is actually hard to measure properly; the features we unveiled could not be inferred from heart rate variability, for example, as they require a dedicated sensor. Surprisingly, we also found during the study that participants intentionally modified their own breathing to match the biofeedback, as a technique for understanding the underlying emotion. This is an encouraging result, as it paves the way for utilizing Breeze for communication.
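
To make that feature space concrete, here is a sketch of the kind of per-cycle features one could compute once breaths are segmented; the segmentation and names are simplifications for illustration, not the study's pipeline:

```typescript
// One segmented breath cycle (hypothetical upstream segmentation).
interface BreathCycle {
  inhaleMs: number;  // time from trough to peak
  exhaleMs: number;  // time from peak to the next trough
  pauseMs: number;   // flat span below a movement threshold
  amplitude: number; // peak minus trough
}

// Aggregate features over a window of cycles: rate is only one of them.
function breathingFeatures(cycles: BreathCycle[]) {
  const mean = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;
  const cycleMs = (c: BreathCycle) => c.inhaleMs + c.exhaleMs + c.pauseMs;
  return {
    rateBpm: 60000 / mean(cycles.map(cycleMs)),
    amplitude: mean(cycles.map((c) => c.amplitude)),
    inhaleExhaleRatio: mean(cycles.map((c) => c.inhaleMs / c.exhaleMs)),
    pauseFraction: mean(cycles.map((c) => c.pauseMs / cycleMs(c))),
  };
}
```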

The next phase of the project will be two-fold. On the one hand, we hypothesize that wearable technology can be used to monitor a person's emotional state over time, in support of the diagnosis of mental health disorders. With the new features that can be extracted from breathing with a non-intrusive wearable, we plan to conduct longitudinal studies comparing physiology with a mental health index. On the other hand, we are also very much interested in studying how Breeze, as an “inter-biofeedback”, could alter the relationship between two persons. We would like to investigate how Breeze could increase empathy, by giving pendants to pairs of users for several months. These studies come with their own technical challenges, and their supervision requires significant involvement. We are on the lookout for partners or calls that could help push in these directions.

Associated publications

Jérémy Frey, Jessica Cauchard. Remote Biofeedback Sharing, Opportunities and Challenges. WellComp – UbiComp/ISWC ’18 Adjunct, Oct 2018, Singapore, Singapore. ⟨10.1145/3267305.3267701⟩ ⟨hal-01861830⟩. PDF

Jérémy Frey, May Grabli, Ronit Slyper, Jessica Cauchard. Breeze: Sharing Biofeedback Through Wearable Technologies. CHI ’18 – SIGCHI Conference on Human Factors in Computing Systems, Apr 2018, Montreal, Canada. ⟨10.1145/3173574.3174219⟩ ⟨hal-01708620⟩. PDF