AI-Intrusion-DEEP is the video analytics plugin for intrusion detection. It was born from the thirty years of experience of A.I. Tech, a company specialized in the design and development of advanced algorithms based on artificial intelligence and computer vision techniques. “AI-Intrusion-DEEP is anything but a simple deep learning-based detector.”
These are the words of Alessia Saggese, Sales and Marketing Director of A.I. Tech, who continues: “AI-Intrusion-DEEP is based on advanced detection and tracking algorithms that exploit spatio-temporal information, and it identifies the specific class of the detected object using deep neural networks designed by our team. The system is extremely robust against fraudulent and, in a certain sense, ‘technologically advanced’ intruders who attempt adversarial machine learning attacks with the aim of confusing the neural networks. The scientific literature teaches us that a person dressed in certain patterns could confuse a detection system based exclusively on deep learning. This does not happen in AI-Intrusion-DEEP, thanks to the combination of spatio-temporal information with information based on the model of the specific object of interest.” AI-Intrusion-DEEP is available on board the camera, integrated in AI-APPLIANCE (the embedded solution from A.I. Tech), and as a server-based solution.
This makes it possible to manage different types of sensors:
– detect the crossing of a line (tripwire);
– detect the crossing of multiple virtual lines (multiple tripwire);
– detect objects remaining in a forbidden area (sterile zone detection).
For each of these events, it is also possible to manage entrance areas, i.e. generate an alarm only if an object enters a specific area and then activates one of the sensors mentioned above. Finally, it is possible to filter by the specific type of object that triggered the alarm, chosen from person, vehicle, animal, or unknown object.
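To make the alarm logic concrete, the following is a minimal, hypothetical sketch of how a sensor configuration combining an entrance area with an object-class filter could work. The class names, fields, and values here are illustrative assumptions for this article, not the actual A.I. Tech API or configuration format.

```python
# Hypothetical sketch of the alarm-filtering logic described above.
# All names and fields are assumptions, not the real AI-Intrusion-DEEP API.
from dataclasses import dataclass, field
from typing import Optional, Set

# Object classes mentioned in the article
ALLOWED_CLASSES = {"person", "vehicle", "animal", "unknown"}


@dataclass
class SensorConfig:
    # One of: "tripwire", "multiple_tripwire", "sterile_zone" (illustrative)
    sensor_type: str
    # If set, alarm only fires for objects that entered via this area
    entrance_area: Optional[str] = None
    # Only these object classes trigger an alarm
    class_filter: Set[str] = field(default_factory=lambda: set(ALLOWED_CLASSES))


def should_alarm(cfg: SensorConfig, detected_class: str,
                 entered_via: Optional[str] = None) -> bool:
    """Return True if a detection should raise an alarm under this config."""
    # Filter by the specific type of object that triggered the sensor
    if detected_class not in cfg.class_filter:
        return False
    # If an entrance area is configured, require the object to have entered it
    if cfg.entrance_area is not None and entered_via != cfg.entrance_area:
        return False
    return True
```

For example, a sterile zone sensor configured with an entrance area "gate_A" and a class filter of `{"person"}` would alarm on a person entering via gate_A, but stay silent for a vehicle, or for a person arriving from a different area.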
AI-Intrusion-DEEP will be available in the price list starting from December 21st, 2020.