Thursday, 04 February 2021 04:05

What a picture of AOC in a bikini tells us about the disturbing future of AI


Arwa Mahdawi

New research on image-generating algorithms has raised alarming evidence of bias. It’s time to tackle the problem of discrimination being baked into tech, before it is too late

Want to see a half-naked woman? Well, you’re in luck! The internet is full of pictures of scantily clad women. There are so many of these pictures online, in fact, that artificial intelligence (AI) now seems to assume that women just don’t like wearing clothes.

That is my stripped-down summary of the results of a new research study on image-generation algorithms, anyway. Researchers fed these algorithms (which function like autocomplete, but for images) pictures of a man cropped below his neck: 43% of the time, the image was autocompleted with the man wearing a suit. When they fed the same algorithm a similarly cropped photo of a woman, it autocompleted her wearing a low-cut top or bikini a massive 53% of the time. The researchers also gave the algorithm a picture of the Democratic congresswoman Alexandria Ocasio-Cortez and found that it, too, automatically generated an image of her in a bikini. (After ethical concerns were raised on Twitter, the researchers had the computer-generated image of AOC in a swimsuit removed from the research paper.)

Why was the algorithm so fond of bikini pics? Well, because garbage in means garbage out: the AI “learned” what a typical woman looked like by consuming an online dataset that contained lots of pictures of half-naked women. The study is yet another reminder that AI often comes with baked-in biases. And this is not an academic issue: as algorithms control increasingly large parts of our lives, it is a problem with devastating real-world consequences. Back in 2015, for example, Amazon discovered that the secret AI recruiting tool it was using treated any mention of the word “women’s” as a red flag. Racist facial-recognition algorithms have also led to black people being arrested for crimes they didn’t commit. And, last year, an algorithm used to determine students’ A-level and GCSE grades in England seemed to disproportionately downgrade disadvantaged students.

As for those image-generation algorithms that reckon women belong in bikinis? They are used in everything from digital job interview platforms to photograph editing. And they are also used to create huge amounts of deepfake porn. A computer-generated AOC in a bikini is just the tip of the iceberg: unless we start talking about algorithmic bias, the internet is going to become an unbearable place to be a woman.

 

The Guardian, UK
