
Natural Interaction: This is how BMW's operating concept works


Will buttons, touchscreens, and rotary controllers soon be obsolete? BMW is convinced they will. At the Mobile World Congress in Barcelona, the Munich-based company is currently presenting a new operating concept for its future cars. It's called BMW Natural Interaction and combines voice and gesture control with gaze recognition. The goal: communication between occupants and the car that is as reliable and natural as that between people.

Optimized sensors, infrared light, HD camera

The first functions of the new concept are due to arrive in the iNext, which launches in 2021. You could already drive the car at the Mobile World Congress in Barcelona, if only in a simulation using VR glasses. In the demo, the electric crossover arrives punctually to pick you up at the airport and then shows that it can do far more than get people from A to B. For example, it can open its windows on request. That doesn't happen by pressing a button, but by pointing a finger in the direction of the side window, combined with the command: “Please open!” The other functions of the iNext work the same way: navigation commands, switching between driving yourself and being driven, the cabin temperature, everything. And when things get too complex and the driver doesn't know what a function does, they simply ask the car and have it explained.

That works very reliably, at least in the simulation. To make it possible, future BMWs must be able to analyze gestures in context. For hand and finger movements, they register not only the type of gesture but also its direction, across an extended area covering the entire driver's workplace. Besides optimized sensors, the gesture camera now uses an infrared light signal that records hand and finger movements in three dimensions and determines their direction precisely. In addition, an HD camera in the instrument cluster detects the orientation of the head and the direction of gaze.
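To get a feel for what “combining gesture direction and gaze” could mean in practice, here is a minimal sketch. Everything in it is an assumption for illustration: the target coordinates, the weighting, and all function names are hypothetical, not BMW's actual implementation. The idea is simply that each in-cabin target is scored by how closely both the 3D pointing ray and the gaze ray aim at it.

```python
# Hypothetical sketch: fusing a pointing ray and a gaze ray to pick an
# in-cabin target. Geometry and names are illustrative assumptions only.
import math

# Known in-cabin target positions (x, y, z) in metres, driver roughly at origin.
TARGETS = {
    "driver_window": (-0.6, 0.3, 0.2),
    "passenger_window": (0.9, 0.3, 0.2),
    "sunroof": (0.1, 0.4, 0.9),
}

def angle_to(ray_origin, ray_dir, point):
    """Angle in radians between the ray direction and the origin-to-point vector."""
    v = tuple(p - o for p, o in zip(point, ray_origin))
    dot = sum(a * b for a, b in zip(ray_dir, v))
    norm = math.sqrt(sum(a * a for a in ray_dir)) * math.sqrt(sum(a * a for a in v))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def resolve_target(gesture_ray, gaze_ray, gesture_weight=0.7):
    """Pick the target with the smallest weighted angular error across both rays."""
    def score(pos):
        return (gesture_weight * angle_to(*gesture_ray, pos)
                + (1 - gesture_weight) * angle_to(*gaze_ray, pos))
    return min(TARGETS, key=lambda name: score(TARGETS[name]))

# Driver points roughly to the left and also glances left: driver's window wins.
gesture = ((0.0, 0.0, 0.0), (-0.9, 0.4, 0.3))  # (origin, direction)
gaze = ((0.0, 0.2, 0.0), (-0.8, 0.1, 0.2))
print(resolve_target(gesture, gaze))  # → driver_window
```

A spoken command like “Please open!” would then be applied to whichever target this resolution step selects, which is one plausible reading of how the modalities interlock.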

Communication “in a completely natural way”

Behind the technology is a learning algorithm that is continuously developed further. It combines and interprets the incoming information, and the vehicle reacts accordingly. “In the future, the customer will no longer have to think about which operating strategy to use, but can interact freely at any time, and the vehicle will understand him,” explains Christoph Grote, Senior Vice President BMW Group Electronics.

And what does that do for the driver? According to BMW, he can now decide even better how to interact with the vehicle based on his preferences, habits, or the situation at hand. Is he in the middle of a conversation? Then he commands the car with looks and/or gestures. Does he need his eyes on the road right now? Then he uses speech and gestures, and so on. The virtual simulation at the trade fair shows that it could work, though it wasn't extensive enough to try everything out.

It also demonstrates that this type of human-car communication is not meant to remain limited to the interior, and what advantages that brings. The driver can, for example, point outwards and ask: “What kind of building is that?” The vehicle answers such questions, as well as ones about opening times, restaurant names, or parking fees. Comprehensive environmental data stored in the car allows it to take on the role of a well-informed passenger, according to BMW.
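One plausible way to answer “What kind of building is that?” is to match the pointing bearing against points of interest stored in the car's environmental data. The sketch below assumes a flat list of POI offsets around the car; the data, the tolerance, and all names are invented for illustration and are not BMW's actual system.

```python
# Illustrative sketch: resolving an outward pointing gesture to a stored
# point of interest by comparing compass bearings. All data is hypothetical.
import math

# Assumed environment data: POI name -> (east, north) offset from the car in metres.
POIS = {
    "Pinakothek der Moderne": (120.0, 300.0),
    "Cafe Luitpold": (-80.0, 150.0),
    "Parkhaus am Dom": (200.0, -50.0),
}

def bearing_error(direction_deg, offset):
    """Absolute angular difference between the pointing bearing and a POI's bearing."""
    east, north = offset
    poi_bearing = math.degrees(math.atan2(east, north)) % 360
    diff = abs(direction_deg % 360 - poi_bearing)
    return min(diff, 360 - diff)

def pointed_poi(direction_deg, max_error_deg=15.0):
    """Return the POI closest to the pointing bearing, or None if nothing is close."""
    name = min(POIS, key=lambda n: bearing_error(direction_deg, POIS[n]))
    if bearing_error(direction_deg, POIS[name]) <= max_error_deg:
        return name
    return None

print(pointed_poi(20.0))   # → Pinakothek der Moderne
print(pointed_poi(200.0))  # → None (nothing within tolerance in that direction)
```

A real system would presumably work with full map geometry and building footprints rather than point offsets, but the bearing comparison captures the core idea of turning a gesture direction into a lookup key.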

Cinema trailer on the windshield

The car always asks directly whether you want to buy tickets for the museum in question or reserve a table at the restaurant you've just passed. That can of course be practical, and the first time it's a cool feeling to buy cinema tickets directly from the iNext and watch the matching trailer projected onto the windshield to get in the mood for the film while the BMW drives autonomously. But the truth is also that the function becomes rather grating after a short while. If you're asked every time whether you want to visit an exhibition or a restaurant, you inevitably wonder when you're actually supposed to work, spend time with family and friends, or sleep.

But a solution to this problem is already on the horizon. Natural Interaction should also be able to take the emotions of the vehicle's occupants into account. If you bark at it with sentences like “I'm not hungry!” or “I'm busy!”, the iNext may become a little less solicitous, which can be nice. At the same time, it's noticeable that instead of a natural conversation, what develops is communication between a commander and an order-taker. Where you would ask another person, you command the car, and you quickly fall into a rather harsh tone. You can also skip a simple “thank you”.

A lot of catching up to do in terms of “driving dynamics”

But the car takes it with equanimity and doesn't hold it against you, which once again shows the advantage of dealing with a machine. An interpersonal dialogue in that tone would certainly tip in an uncomfortable direction. The simulation also reveals a drawback: in the virtual world, the iNext drove about as angular and unnatural as in a Nintendo racing game from the '90s. In real life, the driving dynamics will hopefully be much closer to the brand's essence. We're optimistic: after all, BMW employs not only experts in programming, digitalization, and artificial intelligence, but also a couple of really good chassis engineers.
