Like society, Apple Intelligence is ultimately full of prejudice

Early versions of artificial intelligence tools tend to make errors that are sometimes embarrassing... Remember February 2024, when Google had to suspend image generation on Gemini. Its AI was depicting Black Nazi soldiers and women among the American Founding Fathers, forcing the company to temporarily disable the feature.

Apple Intelligence is no exception to the rule. If you are already using Apple's AI tools in the United Kingdom or the United States, you may have noticed its mistakes in notification summaries. The BBC and The New York Times both fell victim to false headlines generated by the AI. The Cupertino company now faces a new challenge with its image generator.

Deeply rooted stereotypes

Image Playground, Apple Intelligence's image generation tool, struggles with diversity. Language-model expert Jochem Gietema tested the system by submitting his own photo and obtained surprising results. The AI hesitates over the subject's skin color, depicting him sometimes as white, sometimes as black; nothing very serious in itself.

More disturbing is that Image Playground reproduces deeply rooted social stereotypes. When Jochem asked the Apple generator to create images of him with prompts like "rich" or "banker", the tool mostly produced white faces in suits. Conversely, prompts like "poor" or "farmer" mostly produced darker-skinned faces in casual clothes:

Found some biases in @Apple's Image Playground, which is their newest text-to-image generation app. For one particular photo, it creates very different images depending on the prompt (same input image). Some examples: pic.twitter.com/NO1oSxvO8r

- Jochem Gietema (@gietema), February 17, 2025
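For readers curious how such a test can be quantified, here is a minimal audit sketch in Python. Everything in it is hypothetical: Image Playground exposes no public API, so `generate_image` and `estimate_skin_tone` are stand-in callables (in practice, the images would be generated by hand in the app and labeled by a human rater or a separate face-analysis model), and the prompt list and sample count are assumptions. The idea is simply to repeat each prompt on the same input photo and count how often the output face reads as lighter or darker skinned.

```python
from collections import Counter
from typing import Callable

# Prompt words from the test described above: same input photo,
# only the descriptive word changes. (Assumed list, for illustration.)
PROMPTS = ["rich", "banker", "poor", "farmer"]
RUNS_PER_PROMPT = 20  # several samples per prompt, to see a distribution


def audit_prompt_bias(
    generate_image: Callable[[str, bytes], bytes],
    estimate_skin_tone: Callable[[bytes], str],
    input_photo: bytes,
) -> dict[str, Counter]:
    """Tally the perceived skin tone of generated faces per prompt.

    Both callables are hypothetical stand-ins; swap in real generation
    and labeling steps to reproduce the experiment.
    """
    tallies: dict[str, Counter] = {}
    for prompt in PROMPTS:
        counts: Counter = Counter()
        for _ in range(RUNS_PER_PROMPT):
            image = generate_image(prompt, input_photo)
            counts[estimate_skin_tone(image)] += 1
        tallies[prompt] = counts
    return tallies


if __name__ == "__main__":
    # Dummy stand-ins so the sketch runs end to end.
    import random

    fake_generate = lambda prompt, photo: prompt.encode()
    fake_label = lambda img: random.choice(["lighter", "darker"])

    for prompt, counts in audit_prompt_bias(fake_generate, fake_label, b"photo").items():
        print(f"{prompt!r}: {dict(counts)}")
```

A strong skew in these counts between "banker" and "farmer" on the same input photo would be exactly the kind of result Gietema reported.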

An AI that reflects our biases

These results raise questions about how the AI was trained. The data used to train Image Playground seems to come from a corpus imbued with social stereotypes. Apple will likely have to review its approach to avoid this kind of discriminatory association.

The expert behind these tests notes, however, that the biases do not appear systematically. With other photos as a starting point, Image Playground produces more balanced results. Apple should therefore be able to correct these flaws with an update to its model.
