Keep an eye on facial recognition attacks – your data can be exploited by malefactors
Protocol, April 17, 2022
The expression on our face can say a lot – sometimes even more than we ourselves imagine. However, according to college instructor Angela Dancey, facial expressions are not always easy to read, especially on the faces of the youngest students. It becomes even more difficult when classes are held remotely and students attend virtually, with their faces captured only by a webcam.
Intel and Classroom Technologies, however, think the problem can be addressed. The two companies have developed virtual-classroom software called Class, which runs on top of Zoom and can integrate Intel's AI-based technology. With the new tool, teachers would be able to gauge what emotions students are experiencing – whether they understand the new material, or whether they are getting distracted or bored.
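To make the idea concrete, here is a minimal sketch of how frame-level emotion classification from webcam video could work in principle. Everything in it is an assumption for illustration: the model is a randomly initialized stand-in rather than Intel's actual (non-public) technology, and the label set and function names are hypothetical.

```python
# Hypothetical sketch of frame-level emotion classification from a webcam feed.
# The CNN below is a randomly initialized stand-in, NOT Intel's actual model;
# the label set and all names are illustrative assumptions only.
import torch
import torch.nn as nn

EMOTIONS = ["engaged", "confused", "bored", "distracted"]  # assumed label set

classifier = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, len(EMOTIONS)),
)
classifier.eval()

def classify_frame(face_crop: torch.Tensor) -> str:
    """Return the most likely emotion label for one cropped face image."""
    with torch.no_grad():
        logits = classifier(face_crop.unsqueeze(0))  # add batch dimension
        probs = torch.softmax(logits, dim=1)
    return EMOTIONS[int(probs.argmax())]

# Stand-in for a 64x64 RGB face crop taken from a single webcam frame.
frame = torch.rand(3, 64, 64)
print(classify_frame(frame))
```

A real pipeline of this kind would also need face detection, per-student tracking across frames, and a model actually trained on labeled expression data; the point of the sketch is only that each webcam frame ends up mapped to a small set of emotion labels.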
“We can give the teacher additional insights to allow them to better communicate,” commented Michael Chasen, co-founder and CEO of Classroom Technologies.
Of course, technologies like these require reliable protection and responsible use: any personal data collected by an AI system can be turned against people if the system is compromised by attackers.
Politico, March 29, 2022
Ten years ago, Chermaine Leysner received a letter from the Dutch tax authorities demanding she repay the childcare benefits she had received since 2008. At the time she was a student with three children under the age of six, whom she supported on a fairly modest income.
At first she assumed it was some kind of mistake that would soon be resolved, but it was not – the ordeal dragged on for almost nine years, leaving her with depression and severe stress.
She was one of the victims of what the Dutch call the “toeslagenaffaire” – the childcare benefits scandal. In 2019 it emerged that the Dutch tax authorities had used a self-learning algorithm to flag suspected fraud in childcare benefit claims.
In practice, the authorities penalized families on mere suspicion – whenever the system flagged them as potential fraudsters. Because of the system's errors and miscalculations, tens of thousands of families suffered enormous financial losses. The Dutch tax authorities were ultimately hit with a €3.7 million fine issued by the country's privacy regulator, and the statement accompanying the fine cited a number of violations of the EU's data protection rules, the General Data Protection Regulation. The case demonstrates once again how much damage the misuse of artificial intelligence can cause.
Unilad, April 17, 2022
Adversarial clothing is no longer anything fundamentally new – we have already seen T-shirts, bandanas, and other items of clothing that can hide their wearer from facial recognition cameras.
“The adversarial T-shirt works on the neural networks used for object detection,” explained Xue Lin, an assistant professor of electrical and computer engineering at Northeastern and co-author of a new paper.
Researchers from Northeastern University have now proposed a new take on this rather stylish adversarial attire. One of the innovations is that, while designing the new clothing line, the researchers were able to determine which areas of the body are most effective for placing the specially crafted noise that deceives the AI – the general mechanics are illustrated in the sketch below.
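The following is a simplified sketch of the idea behind such attacks: a patch of "noise" fixed at one body region is optimized so that a detector's confidence drops. It is not the Northeastern team's actual method – the detector here is a randomly initialized stand-in rather than a real person detector, and a physical attack would additionally model printing, fabric deformation, and lighting, all of which are omitted.

```python
# Simplified adversarial-patch optimization loop (illustrative only).
# The "detector" is a stand-in CNN with random weights; a real attack targets
# an actual object detector and adds physical-world transformations.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in "person detector": maps an image to a single detection score.
detector = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, stride=2, padding=1),
    nn.ReLU(),
    nn.Conv2d(8, 16, kernel_size=3, stride=2, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 1),
)
for p in detector.parameters():
    p.requires_grad_(False)  # only the patch is optimized, not the model

image = torch.rand(1, 3, 128, 128)                      # stand-in photo of a person
patch = torch.rand(1, 3, 40, 40, requires_grad=True)    # printable patch region
optimizer = torch.optim.Adam([patch], lr=0.05)

# Region of the image where the patch is "worn" (e.g. the torso).
y0, x0 = 50, 44

for step in range(200):
    patched = image.clone()
    patched[:, :, y0:y0 + 40, x0:x0 + 40] = patch.clamp(0, 1)
    score = detector(patched)        # higher = more confident "person detected"
    loss = score.mean()              # minimize the detection confidence
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

with torch.no_grad():
    patched = image.clone()
    patched[:, :, y0:y0 + 40, x0:x0 + 40] = patch.clamp(0, 1)
    print("final detection score:", detector(patched).item())
```

Choosing where on the body the patch goes (the `y0, x0` offsets above) is exactly the kind of design decision the Northeastern researchers studied: some regions give the noise far more influence over the detector than others.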