
Research Finds Self-Driving Cars Struggle To Detect Dark-Skinned Pedestrians

When someone announces some exciting new tech, it's natural to have a hard time waiting until you can get your hands on it. That's especially true if it promises to handle a job that a lot of us really don't enjoy doing.

That makes it pretty easy to understand why anyone would be frustrated by the fact that we don't all have self-driving cars yet. With every slight mistake likely to get somebody screaming or honking at us, driving can be a stressful experience.

However, a recent study serves as a helpful reminder that things could get even more stressful if they're released before they're fully ready.

Based on how their report begins, the research team from the Georgia Institute of Technology is quite aware of the impact self-driving cars could end up having on our world.


Their study cites relatively minor effects, like lower shipping costs, alongside a possible cultural shift in which fewer people rely on owning their own cars.

However, they also pointed out that the models which have been tested "have shown an inability to entirely mitigate risks of pedestrian fatalities."


No matter how delicately you put a statement like that, it's bound to make people nervous about crossing the street while a self-driving car is around.

But for them, the question was whether there was any type of person that these self-driving cars were more likely to fail to detect.

To answer that, they showed state-of-the-art object detection models a simulated crowd and logged where each crowd member's skin tone fell on the Fitzpatrick scale.


What they found was that the detection systems were consistently worse at detecting people whose skin tones registered at four or higher on this scale.

On average, they were 5% worse at detecting these pedestrians than those with lighter skin tones.
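A comparison like this boils down to grouping detection results by Fitzpatrick type and measuring the gap between groups. Here is a minimal sketch of that bookkeeping, using made-up detection logs rather than the study's actual data (the paper measured average precision of trained detectors, not raw hit rates):

```python
# Hypothetical detection log: (fitzpatrick_type, was_detected) pairs.
# The numbers below are illustrative only, not from the study.
detections = [
    (1, True), (2, True), (3, True), (2, True), (1, False),
    (4, True), (5, False), (6, True), (4, False), (5, True),
]

def detection_rate(entries):
    """Fraction of pedestrians in this group that the model detected."""
    return sum(hit for _, hit in entries) / len(entries)

# The study split the scale at type 4: I-III lighter, IV-VI darker.
lighter = [d for d in detections if d[0] <= 3]
darker = [d for d in detections if d[0] >= 4]

gap = detection_rate(lighter) - detection_rate(darker)
print(f"lighter: {detection_rate(lighter):.2f}, "
      f"darker: {detection_rate(darker):.2f}, gap: {gap:.2f}")
```

On this toy data the gap comes out to 20 percentage points; the study reported a roughly 5% disparity on real benchmarks.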

That difference may not sound like much, but it persisted regardless of the time of day at which the systems were operating.


Lighting issues are often a factor when automated systems fail to detect people, so the sensors' poorer performance with darker-skinned people became all the stranger once lighting was ruled out as the cause.

The research team also tested whether the sensors were being confused by "small pedestrians" and ones who were partially behind cover.


Worryingly, they reported that "small" pedestrians are known to be difficult to detect.

And yet, removing those from the equation didn't seem to affect the gap between the systems' ability to detect light-skinned and dark-skinned people.

Instead, they found that the problem had to do with how the object-detection systems prioritized the people they encountered.


These systems were trained to recognize humans using models of lighter-skinned people, but further issues arose when the team discovered that the systems didn't adjust what they "understood" about people when they encountered somebody with darker skin.

In other words, light-skinned people were, however unintentionally, set as the systems' default, and the systems never learned to stop regarding them that way.

Fortunately, researchers were able to address this problem by adjusting how people are "weighted" in these systems.
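One common way to "weight" examples like this is inverse-frequency reweighting: under-represented groups get a larger weight so that errors on them count more during training. The following sketch illustrates the idea on an invented, imbalanced group breakdown; the actual weighting scheme in the paper may differ:

```python
# Sketch of inverse-frequency reweighting. The group labels and the
# 8:2 imbalance below are illustrative assumptions, not from the paper.
from collections import Counter

samples = ["light"] * 8 + ["dark"] * 2  # imbalanced training set
counts = Counter(samples)
total = len(samples)

# Weight each group by the inverse of its frequency, normalized so
# the average per-sample weight works out to 1.
weights = {g: total / (len(counts) * n) for g, n in counts.items()}

per_sample_weight = [weights[g] for g in samples]
print(weights)  # the rare group ends up with the larger weight
```

In a real detector, each example's weight would multiply its term in the training loss, pushing the model to pay proportionally more attention to the group it was missing.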


By doing so, they not only addressed the detectors' failure to notice dark-skinned people, but actually improved the systems' ability to detect light-skinned people as well.

However, there were also some limitations with this study.


As Vox reported, the study has yet to be peer-reviewed, the process that would test whether its findings hold up to scrutiny.

Another limitation was that the systems and datasets used weren't the same as those deployed by actual self-driving car manufacturers; the team could only gain access to publicly available ones.

Still, the work could prove valuable if both the team's observations and their fix hold true for production systems.

h/t: Predictive Inequity In Object Detection