How fashion brand UNLABELED’s wearable technology is preventing AI from detecting humans

The integration of AI into today’s society is developing so rapidly that it is quickly becoming the norm. Assumed to change things rapidly and dramatically for the better, AI is used to fight racial inequality, diagnose medical conditions like dementia, predict suicide attempts, tackle homelessness in the UK and even offer you legal advice (that’s right, an AI lawyer). However, while new technologies have their undeniably progressive qualities, AI also reminds us of the terrifying prospect of a George Orwell, 1984-level of control, as well as its fairly obvious flaws.

Such examples include its potential use in the United States to spy on inmates, Facebook’s AI failure that labelled Black men as ‘primates’, the use of voice profiling, its persistent inability to distinguish between people of different skin colours and its worryingly uncontrollable future, to name a few. This small list of incidents is enough to make us ask the question: should we fear AI? Well, a burgeoning new fashion brand seems to think so.

UNLABELED (stylized in capital letters) is a new and exciting artist collective and textile brand shaping fashion’s relationship with AI. Founded in Japan, in collaboration with Dentsu Lab Tokyo, by creators Makoto Amano, Hanako Hirata, Ryosuke Nakajima and Yuka Sai, the brand developed what is described as “camouflage against the machines”. Its specially designed clothes are constructed with specific patterns that prevent any AI used in the real world from recognizing you. Don’t worry, we’ll explain how it all works, but first, let’s see why the team decided to create the brand.

The creators’ project details, found on the Computational Creativity Lab website, disclose the reasoning behind the creation. “Surveillance capitalism is here,” they say. “Surveillance cameras are now installed outside homes as well as in public places to constantly monitor our activities. Personal devices record all personal internet activity as data without our knowledge.” In the project documentation video found on the same page, they detail further how this data acquired about us can be used: “The system turns our daily behavior into data and misuses it for the purposes of efficiency and profit motive.”

For the creators, “the physical body is no exception” when it comes to the use of our data through AI. “With the development of biometric data and image recognition technology to identify individuals, information in real space is instantly converted into data,” they write. “So our privacy is threatened all the time. In [this] situation, what does the physical body or the choice of clothes mean?” From this question, UNLABELED’s fashion camouflage was born to escape the exploitation of information.

A video example of the clothes in action, on the Computational Creativity Lab website, shows a comparison between two individuals, one wearing the garment and the other in normal clothing. “[When] wearing [UNLABELED’s] particular clothes, the AI will hardly recognize the wearer as ‘human’, [while] people wearing normal clothes are easily detected,” the video says. And it seems to be working. The camera can no longer recognize the person wearing UNLABELED. The brand name is quite appropriate, but how does it work?

If you guessed AI, you would be right. UNLABELED fights AI with AI. The brand’s team has developed a series of patterns, like the one featured above, that confuse surveillance AI. The patterns were created by another AI model used by the inventive creators: “In order to fool the AI, we adversarially trained another AI model to generate specific patterns that cause the AI to misrecognize, then we created a camouflage garment using the pattern,” they recount in the project documentation video.


This technique is described by UNLABELED as an “Adversarial Patch”, or “Adversarial Examples”. The method involves adding specific patterns (barely visible noise to the naked eye) to images or videos with the aim of inducing false recognition in the AI. If successful, the resulting patterns cause the AI to misrecognize shapes and objects. UNLABELED notes that this technique is widely used in research to find and close gaps in surveillance systems, but the brand has turned that same gap to the benefit of its products, to “protect our privacy.”
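UNLABELED hasn’t published its training code, so to make the idea concrete, here is a minimal sketch of how an adversarial patch is typically optimized against a person detector. Everything in it is an illustrative assumption rather than the brand’s actual pipeline: the torchvision detector, the patch size, the patch placement and the loss are stand-ins, and it assumes gradients flow through the detector’s confidence scores.

```python
# Minimal adversarial-patch sketch (illustrative assumptions, not UNLABELED's actual code).
import torch
import torchvision

# Off-the-shelf object detector; in COCO labelling, class 1 is "person".
detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()  # inference mode: returns boxes, labels and confidence scores

# Stand-in batch: in practice you would load photos of people wearing the garment mock-up.
images = [torch.rand(3, 256, 256) for _ in range(8)]

# The "patch" is just a small trainable image tile; optimizing its pixels is what
# "adversarially training another AI model to generate patterns" boils down to here.
patch = torch.rand(3, 64, 64, requires_grad=True)
optimizer = torch.optim.Adam([patch], lr=0.01)

def apply_patch(image, patch, top=100, left=100):
    """Paste the patch onto a fixed region of the image (e.g. the torso)."""
    patched = image.clone()
    patched[:, top:top + 64, left:left + 64] = patch.clamp(0, 1)
    return patched

for image in images:
    optimizer.zero_grad()
    outputs = detector([apply_patch(image, patch)])[0]
    person_scores = outputs["scores"][outputs["labels"] == 1]
    if person_scores.numel() == 0:
        continue  # no person detected on this image, nothing to suppress
    loss = person_scores.sum()  # push the "person" confidence toward zero
    loss.backward()
    optimizer.step()

# The optimized `patch` tensor is what would eventually be exported as a print pattern.
```

In published research on adversarial patches, the same loop usually also randomizes the patch’s position, scale and lighting during training so the pattern keeps fooling the detector when photographed on a moving body; that robustness step is omitted above for brevity.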

“Once the adversarial pattern is created, we drop it onto the 2D garment model. Then the pattern [is] printed on a plain polyester blend fabric by transfer printing. After printing, we follow the general garment production procedure,” says UNLABELED. The brand has even developed a skateboard featuring the same AI-evading patterns. The products are available for purchase on the brand’s website.

While such fashion tech isn’t widely available, it does indicate continued pushback against heavy surveillance. I mean, we all hate it when those ads pop up minutes after we’ve just mentioned the name of a product. I know you’re listening to me, Apple. However, there are two sides to every coin, even when it comes to AI. While this technology may help us avoid surveillance, it may also help anybody avoid surveillance, if you know what I mean. From my own perspective though, it’s another sign of a generation rebelling against the norm, and I love it.
