No matter how smart the systems are – they still need our control
Schneier on Security, April 19, 2022
Because training ML models requires significant computational resources and technical expertise, the task is often outsourced to a service provider. In a new study, researchers demonstrate how a malicious provider can plant an undetectable backdoor into a classifier whose behavior otherwise appears completely normal.
At the same time, without the corresponding “backdoor key”, the attack mechanism cannot be detected by any observer with bounded computational power. The study presents two frameworks for planting undetectable backdoors, each with incomparable guarantees. AI backdoors are not something new: Adversa’s CEO mentioned them in a recent Forbes article, noting that they are very hard to find, and now researchers have gone further, showing that they can be impossible to detect.
The work first demonstrates how to plant a backdoor into a model using a digital signature scheme. Given black-box access to the original model and a backdoored version, it is computationally infeasible to find even a single input on which they differ, and the backdoored model has a generalization error comparable to the original model. The paper also shows how to insert undetectable backdoors into models trained under the Random Fourier Features (RFF) learning paradigm or as random ReLU networks. Moreover, the proposed construction of undetectable backdoors can produce a classifier that is indistinguishable from an adversarially robust one, even though every input admits an adversarial example. According to the researchers, detecting such backdoors is an extremely important area of research on the path to certifying adversarial robustness.
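To make the signature-based idea concrete, here is a minimal toy sketch. It is not the paper’s actual construction (which hides the check inside the model’s weights); instead, an explicit wrapper flips the prediction only when the input embeds a valid Ed25519 signature, so anyone without the signing key sees behavior identical to the original model. All names here (original_predict, NUM_FEATURES, and so on) are illustrative assumptions.

```python
# Toy illustration of a signature-based backdoor: the "backdoored" classifier
# agrees with the original on every input unless the input embeds a message
# that verifies under the attacker's public key. Illustrative sketch only.
import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

NUM_FEATURES = 96  # toy input length in bytes; the last 64 carry the signature
SIG_LEN = 64       # Ed25519 signatures are 64 bytes

def original_predict(x: bytes) -> int:
    """Stand-in for the honestly trained classifier (here: parity of byte sum)."""
    return sum(x) % 2

def backdoored_predict(x: bytes, public_key: Ed25519PublicKey) -> int:
    """Identical to original_predict unless x ends in a valid signature."""
    msg, sig = x[:-SIG_LEN], x[-SIG_LEN:]
    try:
        public_key.verify(sig, msg)        # raises InvalidSignature on forgeries
        return 1 - original_predict(x)     # backdoor fires: flip the label
    except InvalidSignature:
        return original_predict(x)         # otherwise behave exactly as usual

# The attacker keeps the private "backdoor key"; observers can only query.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

benign = os.urandom(NUM_FEATURES)
assert backdoored_predict(benign, public_key) == original_predict(benign)

msg = benign[:-SIG_LEN]
trigger = msg + private_key.sign(msg)      # only the key holder can craft this
assert backdoored_predict(trigger, public_key) != original_predict(trigger)
```

Because forging a valid signature is computationally infeasible, black-box queries alone cannot surface an input where the two models disagree, which is the intuition behind the paper’s undetectability guarantee.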
electrek, April 22, 2022
If you are asking what the consequences of fooling an autonomous AI can be, and whether it is possible in real life, here is an example!
It’s no secret that the safety of autonomous vehicles is a particularly acute issue, because people are naturally concerned about their own safety when riding in them. Here is another example of how a smart Tesla got into an accident.
Unfortunately for the manufacturer, a new Tesla was caught on video crashing into a $3.5 million Cirrus Vision Jet. Smart Summon, the feature that allegedly caused the crash, builds on Tesla’s earlier “Summon” feature, which has been used to autonomously move cars a few feet down a driveway or out of tight parking spots. The updated version lets owners summon their Tesla vehicles from a greater distance while the cars navigate more complex parking environments.
However, even when using such autonomous features, owners should not forget about attentiveness and supervision: it was apparently the lack of the latter that caused the incident. At a Cirrus event at Spokane’s Felts Field, a Tesla Model Y whose owner was using the feature crashed into a Vision Jet, after which video of the collision went viral across the Internet. One way or another, the case serves as a reminder that no matter how smart the systems are, problems cannot be avoided if they are left without attentive human control.
NY Post, April 22, 2022
Facial recognition is already grappling with the problem of identity forgery, but bringing this technology into the sale of alcohol and cigarettes could deal a serious blow to the fake ID market.
The New York State Senate recently introduced a bill that would allow bars and restaurants to use biometrics to verify a person’s age before selling alcohol, tobacco or e-cigarettes. Bars and restaurants would scan customers’ fingerprints, faces or retinas so that customers would not have to present a physical ID. The bill specifies that all data must be encrypted.
“No one’s forced into engaging with this technology, but they would have the choice,” commented state Sen. James Skoufis, the sponsor of the biometrics bill. “There’s no big brother involved.”
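As for the bill’s requirement that all data be encrypted, the sketch below shows one minimal way a venue might satisfy it: the raw biometric template is encrypted before it is ever stored. This is purely an illustrative assumption about implementation; the bill does not prescribe a scheme, and the Fernet recipe (authenticated symmetric encryption from the Python cryptography library) and all names here are ours.

```python
# Minimal sketch of "all data must be encrypted": a scanned biometric
# template is encrypted at rest and decrypted only for the age check.
# Fernet is an assumed, illustrative choice, not anything the bill mandates.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, held in a managed key store, not in code
cipher = Fernet(key)

def store_template(raw_template: bytes) -> bytes:
    """Encrypt a fingerprint/face/retina template before writing it anywhere."""
    return cipher.encrypt(raw_template)

def check_age(stored_blob: bytes) -> bytes:
    """Decrypt just long enough to run the age verification, then discard."""
    return cipher.decrypt(stored_blob)

template = b"example-biometric-template-bytes"  # placeholder for a real scan
blob = store_template(template)
assert check_age(blob) == template
assert blob != template                          # nothing is stored in the clear
```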
However, as you may have already guessed, the new bill could spawn a host of new methods for faking verification, much as people used to forge documents to buy alcohol and cigarettes. There are multiple ways to do so, which you can learn about by subscribing to our newsletter.