Self-driving systems are worse at recognising children and dark-skinned people, study says
Researchers at King's College London have concluded that pedestrian detection systems used in self-driving cars are prone to bias, especially against children and dark-skinned people.
Here's What We Know
The team looked at eight artificial intelligence pedestrian detection systems used in autonomous driving research. They ran more than 8,000 images through the software and found that the AI detected adult pedestrians with almost 20 per cent higher accuracy than children.
In addition, the systems in question were 7.5 per cent worse at recognising dark-skinned people, especially in low light, making the technology less safe at night.
The study did not test the proprietary software already deployed by robotaxi companies such as Waymo and Cruise. However, the results add to growing concerns about the technology's safety.
According to the researchers, the main reason for AI's problems recognising children and dark-skinned people is the bias in the data used to train the algorithms. Typically, adults and light-skinned people are more likely to be represented in datasets.
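The disparities the study reports are group-level detection rates. A minimal sketch of how such per-group rates and gaps might be computed from labelled detector output (the group names and outcome numbers below are hypothetical, not the study's data):

```python
# Illustrative sketch only, not the study's actual methodology.
# Each list holds per-pedestrian outcomes: 1 = detected, 0 = missed.
# The groups and values here are hypothetical examples.

def detection_rates(results):
    """Fraction of pedestrians the detector found, per group."""
    return {group: sum(outcomes) / len(outcomes)
            for group, outcomes in results.items()}

results = {
    "adult": [1, 1, 1, 0, 1, 1, 1, 1],  # 7 of 8 detected
    "child": [1, 0, 1, 0, 1, 1, 0, 1],  # 5 of 8 detected
}

rates = detection_rates(results)
gap = rates["adult"] - rates["child"]
print(f"adult: {rates['adult']:.3f}, child: {rates['child']:.3f}, "
      f"gap: {gap:.3f}")
# → adult: 0.875, child: 0.625, gap: 0.250
```

Averaged over thousands of images, a gap like this is what the researchers mean when they say the systems identified adults almost 20 per cent better than children.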
A Waymo spokesperson said the study does not represent all the tools used in the company's cars. In addition to camera images, the AI receives data from sensors like lidar and radar, which improve the accuracy of object detection.
Waymo also trains the system to recognise human behaviour and makes sure the datasets are representative.
Flashback
Algorithms often reflect biases present in datasets and in the minds of the people who create them. One common example is facial recognition software, which has consistently shown lower accuracy on the faces of women, dark-skinned people and Asians.
Facial recognition has already led to the wrongful arrest of dark-skinned people on more than one occasion. However, such concerns have not dampened enthusiasm for these AI technologies.
Go Deeper:
- An innocent woman who was eight months pregnant was arrested in the US because of a facial recognition error
- Detroit police will change rules on the use of facial recognition after mistakenly arresting a pregnant woman
Source: Gizmodo