Do eyes on self-driving cars reduce accidents? Signals from eye movement could help pedestrians anticipate vehicle intentions – ScienceDaily

A new study at the University of Tokyo shows that robotic eyes on self-driving vehicles can improve pedestrian safety. Participants played out scenarios in virtual reality (VR) and had to decide whether or not to cross the road in front of a moving vehicle. When that vehicle was fitted with robotic eyes that either looked at the pedestrian (registering their presence) or away (not registering them), participants were able to make safer and more efficient choices.

It looks like self-driving vehicles are just around the corner. Whether they’re going to deliver parcels, plow fields, or take kids to school, a lot of research is underway to turn a futuristic idea into a reality.

While the preoccupation for many is the practical task of creating vehicles that can navigate the world autonomously, researchers at the University of Tokyo have turned their attention to a more “human” aspect of self-driving technology. “There is not enough investigation into the interaction between self-driving cars and the people around them, such as pedestrians. So, we need more investigation and effort into such interaction to bring safety and assurance to society regarding self-driving cars,” said Professor Takeo Igarashi of the Graduate School of Information Science and Technology.

One of the main differences with self-driving vehicles is that drivers may become little more than passengers, so they may not be paying full attention to the road, or there may be no one at the wheel at all. This makes it difficult for pedestrians to judge whether a vehicle has registered their presence, as there may be no eye contact or other cues from the people inside.

So how can pedestrians be notified when a self-driving vehicle has noticed them and intends to stop? Like a character from the Pixar movie Cars, a self-driving golf cart was fitted with large, remote-controlled robotic eyes. The researchers called it the “gazing car.” They wanted to test whether putting moving eyes on the cart would affect people’s riskier behavior, in this case, whether people would still cross the road in front of a moving vehicle when in a hurry.

The team developed four scenarios, two with the cart’s eyes and two without. The vehicle either noticed the pedestrian and intended to stop, or had not noticed them and would keep driving. When the cart had eyes, the eyes either looked toward the pedestrian (signaling it would stop) or looked away (signaling it would not stop).

Since it would obviously be risky to ask volunteers to choose whether or not to walk in front of a moving vehicle in real life (even though there was a hidden driver in the experiment), the team recorded the scenarios with 360-degree video cameras and put 18 participants (nine women and nine men, aged 18-49, all Japanese) through the experience in virtual reality. They tried the scenarios multiple times in random order and each time were given three seconds to decide whether or not they would cross the road in front of the cart. The researchers recorded their choices and measured the error rates of their decisions, that is, how often they chose to stop when they could have crossed, and how often they crossed when they should have waited.
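As a rough illustration (not taken from the paper), the two kinds of decision error described above could be tallied from trial records like this; the data structure and field names are hypothetical, assuming each trial notes whether the cart intended to stop and whether the participant chose to cross:

```python
# Hypothetical sketch of the two decision-error rates described above.
# Field names and trial structure are illustrative, not from the study.
from dataclasses import dataclass

@dataclass
class Trial:
    cart_will_stop: bool       # cart noticed the pedestrian and intends to stop
    participant_crossed: bool  # participant chose to cross within 3 seconds

def error_rates(trials):
    # "Unsafe" error: crossing when the cart would not stop.
    # "Inefficient" error: waiting when the cart was going to stop.
    unsafe = sum(t.participant_crossed and not t.cart_will_stop for t in trials)
    inefficient = sum((not t.participant_crossed) and t.cart_will_stop for t in trials)
    n = len(trials)
    return unsafe / n, inefficient / n

# Example: one unsafe error and one correct decision out of two trials.
print(error_rates([Trial(False, True), Trial(True, True)]))  # (0.5, 0.0)
```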

“The results suggested a clear difference between the sexes, which was very surprising and unexpected,” said Project Lecturer Chia-Ming Chang, a member of the research team. “While other factors such as age and background may also have influenced the participants’ reactions, we believe this is an important point, as it shows that different road users may have different behaviors and needs, which require different ways of communication in our future self-driving world.”

“In this study, the male participants made many dangerous road-crossing decisions (i.e., choosing to cross when the car was not going to stop), but these errors were reduced by the cart’s eye gaze. However, there was not much difference in the safe situations for them (i.e., choosing to cross when the car was going to stop),” Chang explained. “On the other hand, the female participants made more inefficient decisions (i.e., choosing not to cross when the car was intending to stop), and these errors were reduced by the cart’s eye gaze. However, there was not much difference in the unsafe situations for them.” In the end, the experiment showed that the eyes led to smoother or safer crossings for everyone.

But how did the participants feel about the eyes? Some thought they were cute, while others found them creepy or scary. For many of the male participants, when the eyes were looking away, they reported feeling that the situation was more dangerous. As for the female participants, several said they felt safer when the eyes looked at them. “We focused on eye movement but did not pay much attention to the visual design of the eyes in this particular study. We built the simplest version to reduce the cost of design and construction, due to budget constraints,” Igarashi explained. “In the future, it would be better to have a professional product designer find the best design, but it would probably still be difficult to please everyone. Personally, I like it. It’s kind of cute.”

The team acknowledges that the study is limited by the small number of participants acting out only one scenario. It is also possible that people would make different choices in virtual reality than in real life. However, “The transition from manual driving to automatic driving is a big change. If eyes can actually contribute to safety and reduce traffic accidents, we should seriously consider adding them. In the future, we would like to develop automatic control of the robotic eyes connected to the self-driving AI (instead of manual control), which could accommodate different situations,” said Igarashi. “I hope this research encourages other groups to try similar ideas, anything that facilitates better interaction between self-driving cars and pedestrians, which ultimately saves people’s lives.”

Paper title:

Chia-Ming Chang, Koki Toda, Xinyu Gui, Stella H Siu, and Takeo Igarashi. 2022. Can Eyes on a Car Reduce Traffic Accidents? In 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI ’22), September 17-20, 2022, Seoul, Republic of Korea. ACM, New York, NY, USA, 16 pages. https://doi.org/10.1145/3543174.3546841

Funding:

This work was supported by JST CREST Grant Number JPMJCR17A1, Japan.

The self-driving golf cart was supplied by Tier IV, Inc.

