AI technology creates ears for autonomous vehicles

Embedded siren detection

Bosch Research is developing smart acoustic sensors to improve road safety for autonomous vehicles


Acoustic sensors enable siren detection

Your ears can pick up a lot of important information behind the wheel of a car – especially about things you can’t see. Changes in the rumble of the engine or the hum of the tires on the road could indicate a mechanical fault. The shriek of children playing around the corner, a bicycle bell or the distant siren of an emergency vehicle can prompt you to slow down and react accordingly. But what if you aren’t actually in control of the car? As levels of driving automation increase, a car’s ‘senses’ are becoming just as important as those of the humans inside it. Its ‘eyes’, in the form of cameras, radar and lidar sensors, are crucial, of course. But autonomous vehicles also need ears. Bosch recognizes this and has been working on an AI solution to meet legal and general safety needs – starting with siren detection.

Detect and recognize with acoustic sensors

In many countries, it is a legal requirement to give way to emergency vehicles. As Thomas Buck, an electrical engineer at Bosch Research, explains, this means a self-driving car must be able to recognize the sound of a siren and respond in the right way. “People often say, ‘Won’t autonomous vehicles just be able to talk to each other?’, but it will take a long time before everything on the road has that capability. And other road users still need to hear sirens.” The ears come in the form of microphones. Installing these on a car might sound simple – after all, most cars already have them for hands-free phone calls and voice-controlled media. But cockpits are now extremely good at blocking out external noise: for a microphone to accurately detect sounds from the outside world, the acoustic sensors need to sit outside the car.

Thomas Buck and his colleagues are working on acoustic sensors to equip autonomous cars with “ears”.

Listen to a part of the podcast episode about this topic.


A lot harder than it sounds: attaching sensors to a car

Sensor sound siren detection
Important for Level 3 and Level 4/5 automated driving: with the help of an intelligent sensor, obscured emergency vehicles (e.g. at an urban intersection) can be detected – even before the camera can see them. Here the sensor is mounted on a test vehicle and tested under real conditions.
Sound siren detection
Verification and analysis of the function in the vehicle: the sensor’s raw signals are read in and the AI evaluation is checked. The sensor can also recognize the direction of the sound. The necessary algorithms run on the sensor itself, and the system communicates the information via an automotive interface.
Sensor data analysis
In the laboratory, individual sensors are examined and re-characterized after testing on the road, since influencing factors such as rain, stone chips and vibrations can change their function. The recorded data can also be analyzed afterwards on a computer, allowing the algorithms and the intelligence of the system to be continuously improved.
Sensor characterization in an anechoic box in the lab in Renningen.
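The article mentions direction recognition without going into detail, but a classic way to estimate a sound’s direction with two microphones is the time difference of arrival (TDOA): cross-correlate the two channels, find the lag with the strongest match, and convert that delay into an angle. Here is a minimal sketch in Python – function and parameter names are illustrative assumptions, not Bosch’s implementation, and it assumes two synchronized channels and free-field conditions:

```python
import numpy as np

def estimate_direction(sig_left, sig_right, fs, mic_distance, c=343.0):
    """Estimate the bearing of a sound source (in degrees) from the
    time difference of arrival between two microphones.

    Positive angle = source toward the right microphone's side.
    """
    # Cross-correlate the channels; the peak marks the relative delay.
    corr = np.correlate(sig_left, sig_right, mode="full")
    lag = np.argmax(corr) - (len(sig_right) - 1)  # delay in samples
    tau = lag / fs                                # delay in seconds
    # Clamp to the physically possible range before taking arcsin
    # (noise can push the ratio slightly past +/-1).
    ratio = np.clip(tau * c / mic_distance, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))
```

With bumper microphones roughly 0.2 m apart, the maximum possible delay is about 0.6 ms, so the sample rate directly limits how finely the angle can be resolved.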

Combining durability and high performance of acoustic sensors

Mounting sensors on the outside of a car is where things become a little more complex, as Thomas explains. “The body of a car is exposed to some very harsh environmental challenges: rain, wind, stones, oil, cleaning chemicals, even high-pressure jet washers. Microphones are sensitive objects, so we need to make sure they are protected.” For the past five years, the Bosch team in Stuttgart, southwest Germany, has been carrying out extensive tests to find the right solution: intelligent acoustic sensors with a microphone that can withstand everything the road can throw at it, mile after mile. The mic itself is a classic MEMS (micro-electro-mechanical system) microphone, just like the ones found in cell phones.

To protect it, the team has developed a membrane that lets sound waves through but keeps things like dirt and water out – a bit like an eardrum. This is a delicate balance to achieve: too thin, and the microphone can be damaged; too thick, and the acoustic performance suffers. The engineers put a variety of materials through dozens of tests, exposing them to oil, gasoline and water jets – and even pelting the sensor with tiny stones, says Thomas. “Thanks to the protective membrane, our sensor can now handle all these events and still work incredibly accurately.” Further tests for sources of interference such as wind and vibrations revealed the best locations for the sensors: on the front and rear bumpers as well as the roof.


“The AI learns to recognize the acoustic signature of each sound, so it can distinguish sirens from other acoustic signals.”

Dr. Andreas Merz, physicist at Bosch Research

Making siren detection smart using AI technology

Being able to detect sounds is only one part of the smart sensor, though. “After all, you want your vehicle to react to a siren, not a nearby car alarm,” says physicist Andreas Merz. He and his team have been training artificial intelligence algorithms, which are embedded on a tiny microprocessor inside the sensor housing, to distinguish between a siren and a range of other noises picked up by the microphone. “That means feeding the algorithms more than 200 GB of audio data for training. We’ve collected siren sounds from different emergency vehicles in a variety of countries, as well as background noises like alarms, car horns and so on,” Andreas says. “By doing this, the AI learns to recognize the acoustic signature of each sound, so it can filter out what isn’t required.” Once a siren is identified, the information is relayed to the car’s onboard computer to determine the appropriate action, such as slowing down or changing lanes. In non-autonomous or semi-autonomous cars, information about the siren could also be provided to the driver.
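Bosch does not disclose its model, but the idea of learning an “acoustic signature” can be illustrated with a toy classifier: turn each audio clip into a vector of spectral band energies and train a tiny logistic-regression model to separate synthetic siren-like sweeps from background noise. A minimal NumPy sketch – all names, frequencies and parameters here are illustrative assumptions, nothing like the production system:

```python
import numpy as np

def band_energies(clip, fs=16000, n_fft=1024, hop=512, n_bands=16):
    """One feature vector per clip: log of the average spectral
    magnitude in n_bands linear frequency bands (a crude 'signature')."""
    win = np.hanning(n_fft)
    frames = [np.abs(np.fft.rfft(clip[i:i + n_fft] * win))
              for i in range(0, len(clip) - n_fft + 1, hop)]
    bands = [[b.mean() for b in np.array_split(f, n_bands)] for f in frames]
    return np.log1p(np.array(bands)).mean(axis=0)

def make_siren(fs=16000, dur=1.0, f_lo=600.0, f_hi=1200.0):
    """Synthetic up-down frequency sweep as a stand-in for a siren."""
    t = np.arange(int(fs * dur)) / fs
    f = f_lo + (f_hi - f_lo) * (1.0 - np.abs(2.0 * t / dur - 1.0))
    return np.sin(2 * np.pi * np.cumsum(f) / fs)

def train_logreg(X, y, lr=0.5, steps=800):
    """Plain gradient-descent logistic regression."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def predict_siren(clip, w, b, mu, sd):
    """True if the clip is classified as a siren."""
    x = (band_energies(clip) - mu) / sd
    return bool(1.0 / (1.0 + np.exp(-(x @ w + b))) > 0.5)

# Tiny training set: sirens with varied sweep ranges vs. white noise.
rng = np.random.default_rng(0)
sirens = [make_siren(f_lo=lo, f_hi=lo * 2) for lo in (500, 600, 700, 800)]
noises = [rng.standard_normal(16000) for _ in range(4)]
X = np.array([band_energies(c) for c in sirens + noises])
y = np.array([1.0] * 4 + [0.0] * 4)
mu, sd = X.mean(axis=0), X.std(axis=0) + 1e-9   # standardize features
w, b = train_logreg((X - mu) / sd, y)
```

The real system faces a far harder version of this problem – many siren types, Doppler shifts, echoes and confounders like car alarms – which is why it needs hundreds of gigabytes of labeled recordings rather than eight synthetic clips.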


44,000 minutes

of audio data (more than 200 GB) have been fed to the siren detection’s AI algorithm.
That is more than 30 days – day and night – of siren and non-siren sounds. But more important than the amount of data is its quality:
two minutes of sound from critical measurements, with an emergency vehicle behind a wall or a building,
improve the algorithm proportionally more than 30 minutes of silence or birds tweeting.
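This quality-over-quantity point corresponds to a standard training technique: weighting examples in the loss function, so that a rare, hard recording (a siren muffled behind a building) contributes more to the gradient than many easy clips of silence. A minimal sketch with illustrative numbers – the weights and probabilities below are invented for the example, not taken from Bosch’s training setup:

```python
import numpy as np

def weighted_log_loss(p, y, w):
    """Binary cross-entropy with per-clip weights: hard, informative
    recordings can be made to count more than easy ones."""
    eps = 1e-9  # guard against log(0)
    losses = -(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    return np.average(losses, weights=w)

# Two true-siren clips: the model is fairly sure about an easy one
# (p = 0.9) but unsure about the hard "behind a wall" one (p = 0.4).
p = np.array([0.9, 0.4])
y = np.array([1.0, 1.0])
uniform = weighted_log_loss(p, y, w=np.ones(2))
upweighted = weighted_log_loss(p, y, w=np.array([1.0, 15.0]))
```

Up-weighting the hard clip raises its share of the total loss, so training pushes the model harder on exactly the cases where detection matters most.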

How it works: Recognition of emergency vehicles

Harnessing the power of AIoT

The potential for this AI technology is significant and could stretch far beyond siren detection. According to Andreas Merz, the embedded AI system will be able to learn additional functions and download upgrades – making the sensor a true AIoT (Artificial Intelligence within the Internet of Things) device. “It could be voice commands from outside the vehicle like, ‘Open the trunk’, for example. Or the sensor could detect subtle changes in road conditions or the mechanics of the car,” he says. “It could even detect damage impact. The closer we come to seeing autonomous vehicles on the road, the more we’ll hear of these acoustic sensors.”

From the podcast “From Know-how to Wow”:

Siren detection, or: Why cars need ears
