【AICC Original Article】Smart cars equipped with ‘quantum shield’: encrypted cabins block all hacker snooping
As smart‑car cabins gradually evolve into a "third living space" after the home and office, in‑vehicle cameras and sensors—while providing convenience—constantly collect highly sensitive data such as drivers’ facial images, voice commands, and driving trajectories. How to prevent hacker intrusions or misuse by automakers has become a core problem the industry urgently needs to solve.
Guandun Technology, an incubated company from Hefei University of Technology in the Grand Union of Innovation, is offering a disruptive solution using quantum encryption: make in‑car data "usable but not visible," so that when hackers encounter the encrypted data stream, they can see nothing at all.

In Guandun’s showroom, a "smart‑cabin privacy protection system" caught reporters’ attention. The display shows two side‑by‑side images: the left is the raw camera feed with clear facial images; the right shows the real‑time processed feed where facial areas are precisely pixelated and completely unrecognizable. That layer of masking is a direct manifestation of quantum encryption.
On their experimental "quantum cabin testbed," the contrast is even starker. When subjected to simulated cracking attacks using quantum algorithms, the quantum‑encrypted original feed remains completely locked and unreadable, whereas the feed protected by a classical encryption algorithm is broken almost instantly and fully exposed.
Cao Zi'ang, a senior R&D engineer at Guandun Technology, explained the company’s technical philosophy in an interview. He said that in current intelligent connected vehicles, in‑car data is mainly divided into the cabin domain and the intelligent driving domain. The cabin domain includes large amounts of personal sensitive information such as infrared facial structure, 3D optical data, voice, and dashcam video.
"What we deliver to automakers is the design and the plan, and the ultimate goal is 'usable but not visible,'" Dr. Cao explained. This means automakers can use encrypted data to train large models that improve autonomous driving or human‑machine interaction, but they can never directly read users' raw private data.

Addressing the question of "how AI can be trained on encrypted data," Dr. Cao offered a vivid analogy: "We aim to achieve a 'black‑box' state—encrypted data is input, computations are performed, and a usable training output is produced. Throughout the process, AI can interpret it, but humans cannot."
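The article does not disclose how Guandun actually achieves this "black‑box" computation; in the security literature the idea is usually realized with techniques such as homomorphic encryption or secure multiparty computation. As a minimal illustration of the principle only, and not of the company's scheme, the toy Python sketch below uses additive secret sharing: each raw value is split into random shares, the computation (here a sum) is carried out on the shares, and only a party holding every share can reconstruct the result. No individual share reveals anything about the underlying data.

```python
import secrets

MOD = 2**61 - 1  # large prime modulus for share arithmetic

def split(value, n=3):
    """Split an integer into n additive shares that sum to value mod MOD.
    Any n-1 shares on their own are uniformly random and reveal nothing."""
    shares = [secrets.randbelow(MOD) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

def add_shared(shares_a, shares_b):
    """Each compute party adds its own pair of shares locally;
    no party ever sees a raw value."""
    return [(a + b) % MOD for a, b in zip(shares_a, shares_b)]

def reconstruct(shares):
    """Only whoever holds all shares (the authorized key holder)
    can recover the actual result."""
    return sum(shares) % MOD

# Two sensitive readings, never revealed to the compute parties
x_shares = split(1200)
y_shares = split(345)
total = reconstruct(add_shared(x_shares, y_shares))
print(total)  # 1545
```

The point of the sketch is the shape of the workflow Dr. Cao describes: encrypted (here, shared) data goes in, a useful aggregate comes out, and at no intermediate step is the plaintext visible to a human operator.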
He further emphasized that adopting quantum encryption is prudent foresight: "If quantum attacks become commercialized, existing classical encryption methods could be instantly compromised, so we must plan ahead."
To demonstrate the technology more intuitively, Dr. Cao ran a live demo of the facial de‑identification system. When a test subject moved a finger to cover their face, the pixelated area on the right‑hand screen tracked the movement almost in real time, with lag too small to notice.
Dr. Cao explained, "This is the intuitive embodiment of 'zero visibility': those without the quantum key can only view the masked feed; only authorized parties holding a legitimate key may be able to restore the data." He revealed that the cabin privacy protection solution is currently in the scenario‑application stage and is expected to be deployed in production vehicles within the next two to three years. The team is also researching how to use quantum keys to protect the entire lifecycle from data collection and processing to cloud upload, building an intrinsic security framework for AI.
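The "masked for everyone, restorable only with the key" behavior Dr. Cao describes can be sketched with generic symmetric masking: XOR the bytes of the face region with a keystream derived from a secret key, so the region looks like noise to anyone without the key, while a key holder applies the same operation to restore it exactly. This is a stdlib Python illustration of the concept, not Guandun's system; in their design the key material would presumably come from a quantum key channel rather than from `secrets.token_bytes`.

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes from the key (SHA-256 in counter mode)."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def mask_region(pixels: bytes, key: bytes) -> bytes:
    """XOR the face-region bytes with the keystream.
    Without the key the output is indistinguishable from noise;
    applying the same function again with the key restores the original."""
    ks = keystream(key, len(pixels))
    return bytes(p ^ k for p, k in zip(pixels, ks))

key = secrets.token_bytes(32)
face_region = bytes(range(16))          # stand-in for raw face pixels
masked = mask_region(face_region, key)  # what unauthorized viewers see
restored = mask_region(masked, key)     # what a key holder recovers
assert restored == face_region
assert masked != face_region
```

Because masking and unmasking are the same XOR, restoration is exact and lossless, which matches the claim that authorized parties can fully recover the data while everyone else sees only the masked feed.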
Source: anhuinews.com
Editor: 郑晨
