BIOFEEDBACK by ANT

Mikaela Pisani
6 min read · Oct 26, 2022

A unique experience: during the BIOFEEDBACK exhibition, you can generate audiovisual artworks from your brain waves using AI, and mint an NFT to recall the experience.

Generated images with GANs

ANT is an interdisciplinary group that, through the integration of Art, Neuroscience, and Technology, seeks to create both critical and playful devices that reflect on contemporary themes.

ANT works to spread and encourage interest in Art, Neuroscience, and Technology within the framework of research, education, and praxis.

At ANT, we promote Citizen Neuroscience, Citizen Art, and Citizen Technology.

ANT Logo

Can humans and algorithms complement each other to create art? What does the map of brain activity look like when we create, interpret, or simply observe an artwork?

Meet the team!

The ANT team

Why interdisciplinary?

The concept of “Swarm Intelligence” [2] was proposed by Gerardo Beni and Jing Wang in 1989 in the context of their work on cellular robotic systems, and it is used in the study and development of Artificial Intelligence (AI).

It is based on the collective behavior of decentralized, self-organized, natural, or artificial systems. It represents the constitution of a system made up of simple agents that interact with each other and the environment, resulting in the emergence of intelligent global behavior.

This type of behavior is observed in social insects such as ants, where it forms the basis of their societies. Embedded in this concept we find a metaphor, an aesthetic, and a poetry strongly linked to the proposed project (ANT).

This is not only because Artificial Intelligence is present in the work, but also because of the way we have chosen to carry it forward through collaboration and contribution from each discipline, in the pursuit of a common goal that has these distinctive features.

Ongoing exhibition BIOFEEDBACK — JAM

BIOFEEDBACK — JAM provides an experience that generates a unique, sequential interaction between brain waves and audiovisuals.

BIOFEEDBACK Logo

The objective of this experience is to reflect on two contemporary issues:

  • Our relationship with Artificial Intelligence and its algorithms.
  • The production of (individual) subjectivity in dialogue with other subjectivities.

This is achieved through the processing of brain wave data.

This artistic device integrates Visual and Sound Arts with Neurosciences and Artificial Intelligence. From their interaction, a new entity arises — Art & Neurosciences & Technology, whose emergent properties complement each of the disciplines as well as impact society.

For these interactions, we use a “closed-loop neuroscience” approach [1]:

In this approach, a spatiotemporal pattern of audiovisual stimulation is shown while brain bioelectric activity (EEG) is recorded. The recorded EEG is the input for a machine learning algorithm based on Generative Adversarial Networks, which generates the next stimulation. In this way, a stimulation-recording cycle takes place.

From this, a stimulation-recording-audiovisual feedback loop emerges among participants, who generate a narrative in dialogue. This is a dynamic artistic product in itself, resulting from the exchange (the JAM) of several people's brain recordings.
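To make the cycle concrete, here is a minimal Python sketch of one pass through the loop, with stand-in functions for acquisition and display. The real installation uses Mind Monitor, OSC, Pure Data, and a TensorFlow Lite GAN, so every name below is a placeholder rather than our production code:

```python
import numpy as np

# Minimal sketch of the stimulation-recording cycle. All components are
# stand-ins for the installation's real acquisition, sound, and image pipeline.

def record_eeg_window(seconds=10, fs=256, channels=4):
    """Stand-in for EEG acquisition: returns a (channels, samples) array."""
    return np.random.randn(channels, int(fs * seconds))  # replace with real EEG

def eeg_to_latent(eeg, latent_dim=128):
    """Collapse an EEG window into a fixed-size latent vector for the GAN."""
    log_power = np.log(np.abs(np.fft.rfft(eeg, axis=-1)) ** 2 + 1e-9)
    return np.resize(log_power.ravel(), latent_dim).astype(np.float32)

def present(image):
    """Stand-in for the display step (monitor / projector)."""
    print("showing frame with mean value", float(np.mean(image)))

def closed_loop(generator, cycles=10):
    """stimulation -> recording -> generation -> next stimulation ..."""
    for _ in range(cycles):
        eeg = record_eeg_window()          # record while the stimulus is shown
        z = eeg_to_latent(eeg)             # EEG becomes the generator's input
        present(generator(z[np.newaxis]))  # GAN output is the next stimulus
```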

The exhibition took place from March through July at the EAC museum in Uruguay and participated in two consecutive editions of Noche Iberoamericana de los Investigadores, an event organized by the OEI.

This project is supported by DAOfy, Rootstrap, Fibras, IIBCE, Impulso Al Arte Digital and The Grass Foundation.

What is DAOfy?

DAOFY Logo

“The world is evolving more and more from a physical state to a digital one. That is why digital assets are becoming more and more valuable to people. At Daofy, we understand that, and that is why our mission is to help entrepreneurs create successful projects from digital assets.

How do we do it?
We help entrepreneurs by advising them on onboarding and making the right recommendations for their project in the web3 ecosystem.
We also help them develop all the technology around their project by creating a website or a decentralized application. Then, we help them in the sales period, generating good marketing strategies to make the project successful.

At Daofy, we like to help the web3 community grow, but we know that for it to thrive, we must select projects that give real value to the world. That is why when an entrepreneur wants to bring their project to Daofy, they must first apply via our website, providing us with all the possible information about it so that together the Daofy team will decide whether to incubate it or not.

It is essential to understand that Daofy is an incubator that incubates from human resources, helping to promote the project through developers, designers, marketers, project managers, and everything needed to create a successful project. At this time, Daofy does not incubate projects by giving them capital.

Daofy is the creative place where ideas meet. We intend to grow the web3 ecosystem as professionally and responsibly as possible. We are using technologies such as NFTs to make humanity grow and prosper.”

Photos by Maria Jose Vespa @photo_june26

How does it work?

The installation takes place on a comfortable couch, using three monitors and one tablet. When you sit on the couch, you put on a headband, and after 10 seconds sound and images begin to be generated. During this process, several things happen:

  1. EEG waves are captured through the Mind Monitor app connected to the tablet, which displays each frequency band in a different color. These band values are streamed over the OSC protocol (see the listener sketch after this list).
  2. A piece of software transforms the frequencies into sound waves, and the transformations being applied are shown on one monitor.
  3. EEG signals are also transformed into a vector, which is the input for a GAN (Generative Adversarial Network) [3] to generate images to be displayed on another monitor.
  4. The third monitor shows visuals that change in real time according to the frequency values.
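As a concrete illustration of step 1, the sketch below receives the streamed band powers with the python-osc library. The OSC paths follow Mind Monitor's naming convention (for example, /muse/elements/alpha_absolute), and the port number is just an assumption that has to match the target configured in the app:

```python
from pythonosc import dispatcher, osc_server

# Latest absolute band powers received from Mind Monitor (one value per band).
bands = {"delta": 0.0, "theta": 0.0, "alpha": 0.0, "beta": 0.0, "gamma": 0.0}

def make_handler(name):
    def handler(address, *values):
        # Mind Monitor may send one averaged value or one value per electrode.
        bands[name] = sum(values) / len(values)
    return handler

disp = dispatcher.Dispatcher()
for band in bands:
    # Paths follow Mind Monitor's OSC naming, e.g. /muse/elements/alpha_absolute
    disp.map(f"/muse/elements/{band}_absolute", make_handler(band))

# The port must match the OSC target configured in the Mind Monitor app
# (5000 is an assumption, not the installation's actual setting).
server = osc_server.ThreadingOSCUDPServer(("0.0.0.0", 5000), disp)
server.serve_forever()
```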

During the experience, an identifier is generated for each person and shown on the screen. With this identifier, the person can claim an NFT to recall the experience. At the end of the experience, a QR code is printed on a label, giving access to a webpage where the NFT can be minted.
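A minimal sketch of that last step might look like the following, using Python's uuid module and the qrcode package; the minting URL here is a hypothetical placeholder, not the installation's real page:

```python
import uuid
import qrcode  # pip install "qrcode[pil]"

# Hypothetical minting endpoint; the installation prints its own page's URL.
session_id = uuid.uuid4().hex[:8]            # identifier shown on the screen
mint_url = f"https://example.org/mint?session={session_id}"

label = qrcode.make(mint_url)                # QR code linking to the mint page
label.save(f"label_{session_id}.png")        # image sent to the label printer
print("Session identifier:", session_id)
```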

On the tablet you will be able to see the brain waves in the different frequency bands:

Electroencephalography (EEG) allows direct and real-time recording of the electrical activity generated by cells located in the cerebral cortex in a 6 cm² region around each recording electrode (composed of conductive material that is located on the surface of the scalp). EEG data is decomposed into frequency bands, such as delta (1–4 Hz), theta (4–8 Hz), alpha (7.5–13 Hz), beta (13–30 Hz), and gamma (30–44 Hz), expressed in absolute values. The absolute band power for a given frequency range refers to the logarithm of the sum of the Power Spectral Density of the EEG data over the frequency range.
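As a rough reference, this is how the absolute band powers described above could be computed with NumPy and SciPy; the 256 Hz sampling rate and the 2-second Welch windows are assumptions rather than the installation's exact parameters:

```python
import numpy as np
from scipy.signal import welch

# Frequency bands as listed above (Hz).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (7.5, 13),
         "beta": (13, 30), "gamma": (30, 44)}

def absolute_band_powers(eeg, fs=256.0):
    """Log of the summed power spectral density within each band.

    eeg: 1-D array of samples from a single electrode.
    fs:  sampling rate in Hz (256 Hz assumed, not taken from the article).
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))  # 2-second Welch windows
    powers = {}
    for name, (lo, hi) in BANDS.items():
        in_band = (freqs >= lo) & (freqs <= hi)
        powers[name] = float(np.log(psd[in_band].sum() + 1e-12))
    return powers
```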

What tools did we use?

  • Mind Monitor app & Muse S headband.
  • Raspberry Pi, Amazon S3, Node.js, Heroku, AWS EC2.
  • Pure Data, Hydra, Python, TensorFlow Lite, NumPy, & Node.js.

References

1. Zrenner C, Belardinelli P, Müller-Dahlhaus F and Ziemann U (2016) Closed-Loop Neuroscience and Non-Invasive Brain Stimulation: A Tale of Two Loops. Front. Cell. Neurosci. 10:92. doi: 10.3389/fncel.2016.00092

2. Beni, G., Wang, J. Swarm Intelligence in Cellular Robotic Systems, Proceed. NATO Advanced Workshop on Robots and Biological Systems, Tuscany, Italy, June 26–30 (1989)

3. Goodfellow, Ian, et al. “Generative adversarial nets.” Advances in neural information processing systems 27 (2014).

Website: https://ant.net.uy/
Instagram: art.neuroscience.technology


Mikaela Pisani

Head of Data Science at Rootstrap. Co-Managing Director at Girls In Tech Uruguay. Member of ANT (Art & Neuroscience & Tech), an interdisciplinary group.