Tasting the future: from virtual reality to food that levitates
What will tomorrow’s restaurants be like? How might new technologies change the way we relate to food? Adrian Cheok has a lot to say. Interested in the fusion of the real and virtual worlds, his work opens new possibilities as he develops an “augmented reality” in which 3D food printers, applications and gadgets share aromas and tastes over the internet. People get to feel one another’s presence no matter the distance between them. His recent work aims to make food levitate and to support research into food consumption behaviour. As one of the main speakers at Brainy Tongue, he shared some of his daring ideas.
How do you work with the concept of virtual reality?
We are born into the real world and live in it. We also know the virtual world, as we see it in computer games and 3D images. We are interested in the fusion of both worlds. We also use the concept of “augmented reality”, in which we introduce the real world into the virtual world in real time. For example, with real-time 3D capture, you can capture yourself and see your virtual avatar in a virtual scene. Building on this, we look for ways to involve all five senses. To give an example, we may see an almost perfect 3D image of a virtual flower, but we cannot touch, smell or taste it. We are working towards a future “mixed reality” in which the five human senses merge with the virtual world.
The relationship between man and machine has barely begun to show its scope. How far will it be possible to go?
I think a fusion of these two worlds, the virtual and the physical, is already taking place, but what we are seeing is our brain, our intellect, being augmented, so in the future we will see augmented human brains and augmented human intelligence. For example, we rely heavily on the internet. We no longer need to memorise so many things; we simply search on Google. But we can see that very soon the connection will be much more direct. Today we are able to connect the neurons of mice with optical fibre; this is something already happening in present-day science. This means that in the near future we will be able to connect the electronic, digital world directly to our neurons and achieve an even stronger fusion of the human brain with the digital world. We will have the knowledge of the internet connected directly to our brains. Many people already have implants, such as pacemakers, artificial limbs, even hip replacements, and soon this will be much more common, with parts of our bodies replaced or enhanced by robotic implants. In this way, we may reach a point where, if a large part of our body has been replaced by mechanical, robotic or electronic implants and our brain is connected to the internet, we might ask ourselves: what is the difference between a human and a robot?
Is it really possible to merge the real world and the virtual world? How can we involve the five senses in order to break the “glass barrier” set by the Internet?
Smell and taste are the only senses connected to the limbic system of the brain, which is responsible for emotions and memory. Therefore, they can subconsciously affect your mood or trigger a memory. Today the internet is “behind the glass” of the screen of our laptop or our telephone. It is like looking through a window: you can see things, but you cannot smell or touch them. What we aim to do is expand this experience beyond the glass and reach the next stage of the internet, one that involves all five senses.
You have developed a series of “gadgets” such as RingU or the Kissenger. What exactly are they?
The idea is to touch each other via the internet. Some years ago, we created “hugging pyjamas”, thinking of parents who spend a lot of time away from their children. Imagine you are away from home, perhaps on a business trip. With this system, you can call your child or speak to them through Skype and hug them thanks to pyjamas that reproduce touch via the internet. However, we wanted something more portable. In our daily life we wear watches, necklaces, earrings, rings, things we put on our bodies that are so comfortable we forget we have them on. So we decided to develop a haptic, tactile ring, the RingU. When you stroke it, it connects to the internet via your cellphone and your loved ones receive the caress wherever they are. The Kissenger (“kiss messenger”) is a small robot which, when you kiss it, measures the pressure of your lips and sends the kiss signal to your loved ones via the internet.
Regarding taste and smell, what applications have you developed?
We are at a very early stage in the process. What we need is the digitalisation of smell and taste signals, because it is not possible to send chemical substances via cable or radio waves. This is why we have developed an electric taste device. Basically, it is an apparatus with electrodes that you place on your tongue. When you experience a real flavour, something bitter, for example, a chemical ionisation takes place on your tongue that converts it into an electric signal, which then stimulates the taste neurons in your brain. With this apparatus, we stimulate the taste receptors and neurons directly, so that we experience a bitter taste, for example, without the need for a chemical substance. We have also developed Scentee, a device that you attach to your cellphone and that releases a smell from the phone itself when you send someone a message. For example, if it is your mother’s birthday, you can send her an aroma of flowers or chocolate. We have also worked with Andoni Luis Aduriz and Mugaritz to reproduce and send the aroma of one of their dishes, the smell of sesame seeds being ground in a mortar with saffron, to give an idea of what it would be like to be in the restaurant. The next stage will be to achieve non-chemical stimulation of smell. The problem is that the olfactory bulb is behind the nasal cavity, where it is very difficult and uncomfortable to position an electrode, so we are trying to do this using magnetic fields.
How do you imagine the restaurant experience will change in the future?
Some time ago, chef Andoni Luis Aduriz said something to me that struck me as visionary: in the 21st century it is not a question of cutting carrots or making soup, but of creating an experience, and the digital is part of that experience. So we may also have a mixed reality when it comes to food. Visionary chefs want to create the most exciting emotional experience possible. We will continue to have analogue food, but we will enhance it with the digital. For example, right now it is difficult to make a food change from sweet to savoury in a second, but we will be able to do that once we can modify food digitally. When we manage to digitalise flavours and aromas and emit them through these interfaces, a whole new field will open for gastronomic creation. By way of example, when music was digitalised, when the first CD appeared, people were enthusiastic because they could hear a Beethoven symphony with perfect clarity, as if they were in the concert hall. But that was only the beginning. Now you can make completely different types of music, in a creative sense, through synthesisers and digital instruments. In the same way, we will be able to create types of food that are difficult to obtain without digital tools. We can imagine that, in the future, just as we programme software on a computer, we will be able to programme food, so being a chef will be almost like being a programmer. With these digital devices, chefs will be able to create digital flavours and aromas, but also physical food, thanks to 3D printers. They will be able to design food on the computer and then print it. Thanks to this fusion of the physical and the virtual worlds, chefs will find new forms of expression.
You are also working on a “Magnetic Table Interface” with which food could “levitate”. Is that really possible? Why are you pursuing this?
We are developing a magnetic table that uses the physics of magnetic forces at a distance to levitate objects, by means of an electromagnetic array underneath the table. The array can attract or repel objects and change the apparent weight of eating utensils. Our first application will be to use this platform to change the weight of utensils on a dining table. We hope this interface will support the research on food consumption behaviour being carried out at the Imagineering Institute.
Another of your projects is called “Food Media”, a type of interactive communication between members of the family through food. How does it work?
We wanted something that links the elderly with their family and friends. In Japan, as in many other places, the population is ageing rapidly and the elderly suffer from loneliness and depression. So we decided to look to the past, when whole families, grandparents, parents and children, lived in the same house and cooked and ate together, a time when family bonds were strengthened. We wanted to reproduce this via the internet. What we did, for example, was to develop utensils, forks, knives, spoons, that make it possible to “feel” grandmother’s hand helping you prepare the food and stir whatever is in the pot. We also developed systems through which members of the family can feel as if they are all eating together, not only via video conference but also by transmitting kitchen aromas to the other person. And thanks to a very basic version of a 3D printer, children can design very simple sweets that are printed with sugar and gelatine in grandma’s house.