[Serge Vaudenay] and [Martin Vuagnoux] released a video yesterday documenting a privacy-breaking flaw in the Apple/Google COVID-tracing framework, and they’re calling the attack “Little Thumb” after a French children’s story in which a child drops pebbles to be able to retrace his steps. But unlike Hänsel and Gretel with the breadcrumbs, the goal of a privacy-preserving framework is to prevent periodic waypoints from allowing you to follow anyone’s phone around. (Video embedded below.)
The Apple/Google framework is, in theory, quite sound. For instance, the system broadcasts hashed, rolling IDs that prevent tracing an individual phone for more than fifteen minutes. And since Bluetooth LE has a unique numeric address for each phone, like a MAC address in other networks, they even thought of changing the Bluetooth address in lock-step to foil would-be trackers. And there’s no difference between theory and practice, in theory.
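To get a feel for what that lock-step rotation is supposed to look like, here’s a toy Python sketch. It is not the framework’s actual code: the class and method names are made up, and the IDs are just random bytes rather than being derived from keys as in the real spec. Only the sizes (16-byte rolling proximity IDs, 6-byte Bluetooth LE addresses) and the roughly fifteen-minute cadence reflect reality.

```python
import secrets

ROTATION_PERIOD_S = 15 * 60  # rolling proximity IDs change roughly every 15 minutes


class ToyBeacon:
    """Toy model of the intended lock-step rotation (illustrative, not the real API)."""

    def __init__(self):
        self.rolling_id = secrets.token_bytes(16)  # 16-byte rolling proximity ID
        self.bd_addr = secrets.token_bytes(6)      # 6-byte randomized BLE address
        self.next_rotation_s = ROTATION_PERIOD_S

    def advertise(self, now_s):
        # The privacy guarantee depends on both identifiers changing at the same
        # instant: if one straddles the other's rotation, a listener can use the
        # unchanged identifier to link the old and new values of the other.
        if now_s >= self.next_rotation_s:
            self.rolling_id = secrets.token_bytes(16)
            self.bd_addr = secrets.token_bytes(6)
            self.next_rotation_s += ROTATION_PERIOD_S
        return self.bd_addr, self.rolling_id
```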
In practice, [Serge] and [Martin] found that a slight difference in timing between changing the Bluetooth BD_ADDR
and changing the COVID-tracing framework’s rolling proximity IDs can create what they are calling “pebbles”: an overlap where the rolling ID has updated but the Bluetooth ID hasn’t yet. Logging these allows one to associate rolling IDs over time. A large network of Bluetooth listeners could then trace people’s movements and possibly attach identities to chains of rolling IDs, breaking one of the framework’s privacy guarantees.
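As a rough illustration of how such “pebbles” could be exploited (this is not the researchers’ code; the data format and function names are hypothetical), a passive listener only has to group its logged advertisements by Bluetooth address and merge any rolling IDs that were seen under the same one:

```python
from collections import defaultdict


def chain_rolling_ids(observations):
    """Link rolling proximity IDs observed under the same Bluetooth address.

    observations: iterable of (timestamp, bd_addr, rolling_id) tuples logged by
    a passive BLE listener. A "pebble" is any bd_addr seen with more than one
    rolling_id: the address should have rotated along with the ID, so the
    overlap ties the old and new rolling IDs to the same phone.
    """
    ids_per_addr = defaultdict(set)
    for _ts, bd_addr, rolling_id in observations:
        ids_per_addr[bd_addr].add(rolling_id)

    # Union-find over rolling IDs: every set of IDs sharing an address gets
    # merged, so a long run of 15-minute IDs collapses into one traceable chain.
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    for ids in ids_per_addr.values():
        ids = list(ids)
        for other in ids[1:]:
            union(ids[0], other)

    chains = defaultdict(set)
    for rid in parent:
        chains[find(rid)].add(rid)
    return [chain for chain in chains.values() if len(chain) > 1]
```

If the rotation really were lock-step, no address would ever carry two different rolling IDs and the function would return nothing; each pebble a misbehaving phone drops extends one of these chains by another fifteen-minute hop.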
This timing issue affects only some phones, roughly half of those they tested. And of course, it only creates a privacy problem within Bluetooth LE range. But for a system that’s otherwise so well thought out in principle, it’s a flaw that needs fixing.
Why didn’t the researchers submit a patch? They can’t. The Apple/Google code is mostly closed-source, in contrast to the open-source nature of most of the apps that run on top of it. This remains troubling precisely because the difference between the solid theory and the real practice lies in exactly those uninspectable lines of code, leaving every app built on them vulnerable with no recourse other than “trust us”. We encourage Apple and Google to make the entirety of their COVID framework code open. Bugs would then get found and fixed faster.