KU Leuven researchers make themselves invisible to AI cameras

A cardboard sign with a colourful print. That’s all researchers from the Faculty of Engineering Technology at KU Leuven needed to fool a smart camera. To be clear: Wiebe Van Ranst, Simen Thys, and Toon Goedemé from the EAVISE research group don’t have evil intentions. Quite the opposite. Their research aims to expose the weaknesses of intelligent detection systems. 

"Smart detection systems rely on pattern recognition," says Professor Goedemé, head of EAVISE (Embedded and Artificially Intelligent Vision Engineering) at De Nayer Campus. "They consist of a camera and software to interpret the images automatically. If you train these systems with images of different people for a while, they learn to recognise people and to distinguish them from objects. Even though we differ in height, hair colour, or face, the algorithm identifies us as human beings."

"This makes smart detection systems very suitable for security purposes. They automatically give a signal as soon as the cameras detect an intruder, even when that person tries to hide. In the past, you needed security guards for that, and they’d be staring at screens for hours on end. A tedious job that is becoming a thing of the past."

Millions of parameters

However, smart detection systems are not infallible. Sometimes they have difficulty detecting certain patterns, and researchers around the world are trying to expose this Achilles heel. Small changes can suffice: fake glasses made of cardboard, for instance, are enough to confuse a facial recognition system.

Professor Goedemé and his team have taken things one step further. Master's student Simen Thys and postdoc Wiebe Van Ranst have managed to mislead YOLO, one of the most popular algorithms for detecting objects and people.

The researchers held up a 40-by-40 cm cardboard sign with a colourful print in front of their body. That was enough to fool YOLO: carrying the sign makes you invisible to the system.

Highly remarkable, according to Professor Goedemé: "In previous tests, people wore a T-shirt with the image of a bird. The algorithm didn't recognise a person, but it detected a bird. Our pattern, which was itself designed using artificial intelligence, makes people invisible. If you carry the sign, the system does not detect you: not as a human being, nor as an object. Remarkable, and we do not know exactly why this particular pattern can trick YOLO. After all, neural network-based algorithms use millions of parameters. For researchers, this remains a black box."
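The idea of "designing a pattern using artificial intelligence" can be sketched as an optimization loop: start from a random patch and repeatedly nudge its pixels in the direction that lowers the detector's confidence that a person is present. The toy below is purely illustrative and is not the researchers' method: the "detector" is a fixed linear model standing in for a deep network like YOLO, and all names (`person_score`, `patch`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a detector's "person" score: a fixed linear model.
# A real detector is a deep network with millions of parameters; this
# only illustrates the gradient-descent loop used to craft the patch.
w = rng.normal(size=(16, 16))  # toy "detector" weights

def person_score(patch):
    """Higher = more confident the toy detector 'sees' a person."""
    return float(np.sum(w * patch))

patch = rng.uniform(0.0, 1.0, size=(16, 16))  # start from random noise
lr = 0.05
for _ in range(200):
    grad = w  # d(score)/d(patch) for this linear toy model
    patch = np.clip(patch - lr * grad, 0.0, 1.0)  # descend, keep pixels valid

# The optimized patch now yields a far lower "person" score than noise did.
```

For a real network the gradient would come from automatic differentiation rather than a closed form, but the loop (score, gradient, pixel update, clip to valid image values) has the same shape.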

Arms race

The researchers are enthusiastic, but they’re quick to warn of other security flaws. 
"Where to go from here? That's easy: we found a weakness, and now it needs to be fixed. In this case, you could teach the YOLO algorithm that people holding a sign with this particular pattern are also human beings. That's not hard to do. However, it's safe to assume that YOLO has other weaknesses as well. Will we ever be able to fix all security flaws? I don't think so. I've already mentioned it: such an algorithm is a black box. This is the start of an arms race."
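The fix Professor Goedemé describes amounts to data augmentation: add patched copies of training images, still labelled as containing a person, so the detector learns the patch does not make anyone disappear. The sketch below is a minimal illustration of that idea, not the team's pipeline; the dataset, labels, and `apply_patch` helper are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def apply_patch(image, patch, y=4, x=4):
    """Paste the adversarial patch onto an image (position is arbitrary)."""
    out = image.copy()
    out[y:y + patch.shape[0], x:x + patch.shape[1]] = patch
    return out

# Hypothetical training set: images labelled 1 if they contain a person.
images = [rng.uniform(size=(32, 32)) for _ in range(8)]
labels = [1] * 8
patch = rng.uniform(size=(8, 8))  # the patch found by the attack

# Adversarial retraining data: patched copies keep the "person" label,
# so a detector retrained on this set learns to ignore the patch.
aug_images = images + [apply_patch(im, patch) for im in images]
aug_labels = labels + labels

print(len(aug_images), sum(aug_labels))  # → 16 16
```

Retraining on such an augmented set closes this particular hole, but, as the quote notes, it says nothing about patches the attacker finds next: hence the arms race.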

Technology websites have already extensively covered the news. The video that the researchers have posted online already has more than 100,000 views. "We didn’t expect this kind of hype. But we see why the application appeals to the imagination. The idea that you can make yourself invisible to security cameras using nothing more than a colourful sign is intriguing."