Give the Robots Electronic Tongues

Humans live their lives trapped in a glass cage of perception. You can only see a limited range of visible light, you can only taste a limited range of tastes, you can only hear a limited range of sounds. Them’s the evolutionary breaks.

But machines can kind of leapfrog over the limitations of natural selection. By creating advanced robots, humans have invented a new kind of being, one that can theoretically sense a far greater range of stimuli. Which is presenting roboticists with some fascinating challenges, not only in creating artificial senses of touch and taste, but in figuring out what robots should ignore in a human world.

Take sound. I’m not talking about speech—that’s easy enough for machines to recognize at this point—but the galaxy of other sounds a robot would encounter. This is the domain of a company called Audio Analytic, which has developed a system for devices like smart speakers to detect non-speech noises, like the crash of broken glass (could be good for security bots) or the whining of sirens (could be good for self-driving cars).

Identifying those sounds in the world is a tough problem, because it works fundamentally differently than speech recognition. “There's no language model driving the patterns of sound you're looking for,” says Audio Analytic CEO Chris Mitchell. “So 20 or 30 years of research that went into language modeling doesn't apply to sounds.” Convenient markers like the natural order of words or patterns of spoken sounds don’t work here, so Audio Analytic had to develop a system that breaks down sounds into building blocks, what they’re calling ideophones. This is essentially the quantification of onomatopoeia, like in the Adam West Batman series. You know, bang, kapow, etc.

Audio Analytic can then group sounds into major categories: “impulsive” sounds like glass breaking, “tonal” sounds like sirens, and “voiced” sounds like dogs barking. “You generally then describe all audio in terms of that taxonomy,” says Mitchell. “Then you can start getting into, ‘Is it mechanical, is it natural?’ And you start organizing the world in that way.” It’s a system a computer, or maybe one day a humanoid robot, could use to differentiate between sounds the way it does with spoken language.
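Audio Analytic hasn’t published how its classifier actually works, but you can picture a crude first pass as a handful of acoustic features sorting a clip into those three buckets. Here’s a toy sketch in Python; the features are standard signal-processing fare, but the thresholds and the three-way split are invented for illustration, not the company’s method:

```python
import numpy as np

def classify_sound(clip: np.ndarray) -> str:
    """Crudely sort an audio clip into impulsive/tonal/voiced buckets.

    An illustrative toy, not Audio Analytic's method: real systems
    learn these categories from piles of labeled recordings.
    """
    # Impulsive sounds (breaking glass) pack their energy into a
    # sudden burst, so the peak level dwarfs the average level.
    envelope = np.abs(clip)
    crest_factor = envelope.max() / (envelope.mean() + 1e-9)

    # Tonal sounds (sirens) pile energy into a few frequencies,
    # so one spectral spike dominates the rest.
    spectrum = np.abs(np.fft.rfft(clip))
    peakiness = spectrum.max() / (spectrum.mean() + 1e-9)

    if crest_factor > 10:   # invented threshold
        return "impulsive"  # e.g. glass breaking
    if peakiness > 50:      # invented threshold
        return "tonal"      # e.g. a siren
    return "voiced"         # e.g. a dog barking

# A steady 1 kHz test tone should land in the "tonal" bucket.
t = np.linspace(0, 1, 16000, endpoint=False)
print(classify_sound(np.sin(2 * np.pi * 1000 * t)))  # -> tonal
```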

Touch is another complex sense you probably take for granted. The fifth sense isn’t just texture—it’s pressure and temperature, too. So recreating touch for robots is about combining a variety of sensors. (A company called SynTouch is already doing this, by the way.) “Getting all of that information is half the battle,” says roboticist Heather Culbertson, who studies haptics at USC. “Then you have to teach the robot what to do with that information. What does that information mean?”
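None of this is published as a spec, but the “getting all of that information” half of the battle might look as mundane as bundling simultaneous readings together. A hypothetical sketch; the fields, units, and safety rule here are guesses for illustration, not SynTouch’s actual output:

```python
from dataclasses import dataclass

@dataclass
class TouchReading:
    """One instant of combined touch data from a robot fingertip.

    Hypothetical channels for illustration; real multimodal touch
    sensors combine similar signals in their own formats and units.
    """
    pressure_kpa: float   # how hard the fingertip is pressing
    temperature_c: float  # surface temperature at the contact point
    vibration_hz: float   # micro-vibrations, a rough proxy for texture

    def feels_dangerous(self) -> bool:
        # The "what does it mean" half of the battle: even a toy
        # rule has to weigh several channels at once.
        return self.temperature_c > 60 or self.pressure_kpa > 300

hot_pan = TouchReading(pressure_kpa=12.0, temperature_c=85.0, vibration_hz=40.0)
print(hot_pan.feels_dangerous())  # True: let go before you melt
```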

It turns out your body ignores a whole lot of touch stimuli. You don’t typically feel the clothes rubbing against your body all day, and if you’re sitting comfortably, you don’t feel the pressure of sitting. Your body acclimates to avoid sensory overload.

“Robots would require a lot of computational power in order to do all of this processing, if we're talking about full-body sensing,” says Culbertson. “We would have to teach robots not only how to process the data, but how to ignore stuff that is no longer important.”
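That “ignore stuff that is no longer important” idea maps onto a classic signal-processing trick: track a slow-moving baseline for each sensor and only react when a reading drifts away from it. A minimal sketch of that kind of habituation (my illustration, not Culbertson’s code):

```python
class HabituationFilter:
    """Suppress steady stimuli, pass sudden changes.

    Mimics the way you tune out your clothes: a slowly adapting
    baseline absorbs constant input, and only deviations get through.
    """
    def __init__(self, adapt_rate: float = 0.05, threshold: float = 5.0):
        self.baseline = None
        self.adapt_rate = adapt_rate  # how quickly we get used to things
        self.threshold = threshold    # how big a change must be to notice

    def update(self, reading: float) -> bool:
        if self.baseline is None:
            self.baseline = reading
            return True  # everything is novel at first
        novel = abs(reading - self.baseline) > self.threshold
        # Drift the baseline toward the current reading (habituation).
        self.baseline += self.adapt_rate * (reading - self.baseline)
        return novel

skin = HabituationFilter()
pressures = [50.0] * 100 + [80.0]  # sitting comfortably, then a poke
print([skin.update(p) for p in pressures][-3:])  # [False, False, True]
```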

Giving robots a sense of touch will be important not just for our safety (you don’t want a surgery bot crushing your skull, for example) but for the robots themselves. “If you start to have robots in the home and you can have them around a stove, then you're going to want to have temperature sensors so you don't start melting your robot,” says Culbertson.

So, speaking of cooking. Researchers have also been developing electronic tongues, which would be an amazing name for a psychedelic band but is in fact a device that allows machines to taste. It works much like our own tongues do. We have dozens of different sensors, which respond to molecules in our food. Our brain gets input from all these and combines the data to produce the taste of, say, orange juice.

Similarly, an electronic tongue combines the output from a range of sensors. “For sour things, you could have a sensor for acids to check pH,” says chemist Emilia Witkowska Nery of the Polish Academy of Sciences. “If you want salt, you'd probably have sodium, potassium, and chloride.”

“They're taking the interdependence between signals from different sensors,” she adds, “just as our tongue.”
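That interdependence is the whole trick: no single electrode tastes anything on its own, so the readout comes from fusing channels. A hedged sketch of that fusion step, with the sensor names, scales, and weights all invented for illustration:

```python
# Hypothetical electrode readings, each normalized to a 0..1 scale.
# Real electronic tongues fuse similar channels, but these names,
# scales, and weights are invented for illustration.
reading = {"ph_acid": 0.7, "sodium": 0.2, "potassium": 0.1,
           "chloride": 0.15, "sugar": 0.6}

def taste_profile(r: dict) -> dict:
    """Fuse individual sensor channels into coarse taste estimates."""
    return {
        "sour": r["ph_acid"],
        # Saltiness depends on several ions at once: the
        # interdependence between sensors, just as our tongue.
        "salty": 0.5 * r["sodium"] + 0.3 * r["chloride"] + 0.2 * r["potassium"],
        "sweet": r["sugar"],
    }

print(taste_profile(reading))  # sour and sweet: plausibly orange juice
```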

Except this tongue can go where ours dare not, and taste things ours cannot. You might use an electronic tongue to taste foods for harmful additives or other adulterations, for instance. “We can easily enable them to discriminate different food products,” says Witkowska Nery, “to use them instead of expert panels, or in online shopping to check what is fresh, but the problem is how this information will be delivered to us. There is no way to give us a sensation of taste without sticking electrodes in the brain, and it is surely not a viable option for mass use.”

So how should electronic tongues sufficiently describe a taste? How might a home robot chef with an electronic tongue (not that it should necessarily come out of its mouth—think more of a probe) explain to you how its chili tastes?

This brings us back to the problem of data. Electronically tasting something is computationally expensive. Combine taste with all these other senses, and you start looking at a lot of data, especially since they’d work alongside already-common machine vision, which is itself computationally expensive. So the sensing, feeling, considerate robots of tomorrow will need a system that triages environmental stimuli, or their heads will explode.
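One plausible shape for that triage is a priority queue: score each incoming stimulus for salience, handle the most urgent few per cycle, and let the rest fall off the end. A sketch with invented scores:

```python
import heapq

# Invented salience scores; a real robot would have to learn these.
SALIENCE = {"glass_breaking": 9, "siren": 8, "hot_surface": 7,
            "dog_barking": 3, "clothes_rubbing": 0}

def triage(stimuli, budget=2):
    """Attend to only the `budget` most salient stimuli this cycle."""
    ranked = [(-SALIENCE.get(s, 1), s) for s in stimuli]
    heapq.heapify(ranked)  # min-heap on negated scores = max-heap
    return [heapq.heappop(ranked)[1] for _ in range(min(budget, len(ranked)))]

print(triage(["clothes_rubbing", "siren", "dog_barking", "glass_breaking"]))
# -> ['glass_breaking', 'siren']; the clothes never make the cut
```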

Complicating matters: Robots will have to combine different senses for certain tasks, just like humans do. So you know how when you’re drilling in a screw, and it makes that terrible squeaking-grinding noise when it gets close to finishing? A screw-drilling robot could pick up not only on that noise, but on the changing vibrations as it’s finishing the task.
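In code, that kind of cross-checking can be as simple as demanding both cues at once, since either one alone can false-alarm. A toy sketch with invented thresholds:

```python
def screw_is_seated(pitch_hz: float, vibration_g: float) -> bool:
    """Fuse hearing and touch to decide when to stop driving a screw.

    Toy thresholds: the squeal climbs in pitch and the tool's
    vibration spikes as the screw head seats. Either cue alone can
    mislead; requiring both is the multimodal payoff.
    """
    squealing = pitch_hz > 4000    # invented threshold
    straining = vibration_g > 2.5  # invented threshold
    return squealing and straining

print(screw_is_seated(pitch_hz=5200, vibration_g=3.1))  # True: stop driving
print(screw_is_seated(pitch_hz=5200, vibration_g=0.4))  # False: just squeaky
```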

This is all a long ways off, yes. But to navigate a complex human world without killing anyone or themselves, robots will need a wide range of senses that researchers are just beginning to develop. And inevitably, their senses will evolve far beyond the capabilities of our own. You know, for cooking chili, not waging war.
