Vehicle manufacturers supposedly design autonomous cars to operate safely in the bustle of busy streets. But new research from King’s College London shows this safety doesn’t apply to everyone.
The findings show that the pedestrian-detection systems driverless cars rely on are worse at spotting children and pedestrians with darker skin tones.
Researchers ran more than 8,000 images through eight AI-powered pedestrian-detection systems of the kind used in autonomous vehicles. They found the systems were 20 percent more likely to correctly detect adults than children.
The same systems were also 7.5 percent more likely to correctly detect lighter-skinned pedestrians than pedestrians with darker skin tones. The bias worsened in low-contrast, low-brightness conditions, which suggests the problem would be amplified when driving at night.
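At its core, this kind of evaluation compares detection rates across demographic groups. The sketch below shows what such a fairness audit could look like in Python; the detector interface, group labels, and usage example are hypothetical illustrations, not the researchers' actual code or the specific systems they tested.

```python
# Hypothetical sketch of a per-group fairness audit for a pedestrian detector.
# The study itself tested eight open-source detection systems on 8,000+ images;
# the detector callable and group labels here are assumptions for illustration.
from collections import defaultdict

def detection_rates(detector, samples):
    """Compute the fraction of images with a correct detection, per group.

    samples: iterable of (image, group) pairs, where group is a demographic
    label such as "adult", "child", "lighter", or "darker".
    detector: a callable returning True if it correctly finds the pedestrian
    (in practice, predicted boxes would be matched to ground truth, e.g. IoU >= 0.5).
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for image, group in samples:
        totals[group] += 1
        if detector(image):
            hits[group] += 1
    return {group: hits[group] / totals[group] for group in totals}

# Hypothetical usage: the gap between groups is the bias being measured.
# rates = detection_rates(my_detector, labeled_images)
# age_gap = rates["adult"] - rates["child"]      # study reports roughly 20 percent
# skin_gap = rates["lighter"] - rates["darker"]  # study reports roughly 7.5 percent
```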
The research traces much of the problem to the training data: the pedestrian image sets used to train these AI systems disproportionately feature people with lighter skin.
“Car manufacturers don’t release the details of the software they use for pedestrian detection, but as they are usually built upon the same open-source systems we used in our research, we can be quite sure that they are running into the same issues of bias,” Dr. Jie Zhang, one of the study’s researchers, stated.
Dr. Zhang calls on manufacturers and governments to work together to create regulations that will ensure safety.
“As AI becomes more and more integrated into our daily lives, from the types of cars we ride to the way we interact with law enforcement, this issue of fairness will only grow in importance.”
Of course, autonomous vehicles aren’t the only area where AI has caused problems. For example, Snapchat’s AI chatbot, My AI, used slurs and made controversial statements. In another incident, Bing Chat told a user to “Heil Hitler.”
Image credit: Shutterstock
Source: King’s College London