A proof-of-concept attack that uses realistic fake turn-by-turn navigation directions for in-car GPS systems has managed to fool drivers into following them a full 95 percent of the time in testing.
Fresh experiments from a team of researchers at Virginia Tech, the University of Electronic Science and Technology of China and Microsoft showed that carefully crafting spoofed GPS inputs (using cheap, readily available hardware) with an eye to the actual physical environment can lead human operators astray most of the time.
The new attack, which the researchers dubbed “GangWang,” spoofs a route that mimics the shape of the route displayed on the map, so the turn-by-turn directions remain consistent with the roads the victim is actually driving and nothing on screen looks out of place.
To test the idea, the team designed an algorithm based on 600 taxi routes in Manhattan and Boston. The code searches in real time for attack routes that match the targeted victim’s current location and remain consistent with the physical road network. On average, it identified 1,547 potential attack routes per target trip for a would-be attacker to choose from.
“If the attacker aims to endanger the victim, the algorithm can successfully craft a special attack route that contains wrong-ways for 99.8 percent of the trips,” the researchers wrote.
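To illustrate the idea of searching a road network for plausible decoy routes, here is a minimal sketch, not the researchers’ code: it enumerates candidate attack routes on a toy road graph whose turn sequence matches the victim’s real route, so the spoofed instructions stay consistent with drivable roads. The graph layout, node names, and the simple turn-matching heuristic are illustrative assumptions, not details from the paper.

```python
# Sketch only: find routes whose turn sequence matches the victim's route,
# so a spoofed route "looks right" on the map. Toy graph and headings are assumptions.

# Toy road network: node -> {neighbor: compass heading (degrees) when driving that edge}
ROAD_GRAPH = {
    "A": {"B": 0, "D": 90},
    "B": {"A": 180, "C": 90, "E": 0},
    "C": {"B": 270, "F": 0},
    "D": {"A": 270, "E": 0},
    "E": {"B": 180, "D": 180, "F": 90},
    "F": {"C": 180, "E": 270},
}

def turn_sequence(path):
    """Relative turns (degrees, -180..180) along a node path."""
    headings = [ROAD_GRAPH[a][b] for a, b in zip(path, path[1:])]
    return [((h2 - h1 + 180) % 360) - 180 for h1, h2 in zip(headings, headings[1:])]

def candidate_attack_routes(victim_route):
    """Depth-first search for same-length routes whose turn sequence matches the
    victim's, so the route shape looks plausible while the endpoints differ."""
    target_turns = turn_sequence(victim_route)
    n = len(victim_route)
    matches = []

    def dfs(path):
        if len(path) == n:
            if path != victim_route and turn_sequence(path) == target_turns:
                matches.append(path)
            return
        for nxt in ROAD_GRAPH[path[-1]]:
            if nxt not in path:          # simple paths only
                dfs(path + [nxt])

    for start in ROAD_GRAPH:
        dfs([start])
    return matches

if __name__ == "__main__":
    real_route = ["A", "B", "C"]         # the route the victim thinks they are driving
    for route in candidate_attack_routes(real_route):
        print("candidate attack route:", route)
```

The real algorithm additionally has to account for GPS drift, road geometry and timing, but the core search problem is the same: find alternate paths through the road graph that reproduce the displayed route’s shape.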
https://threatpost.com/gangwang-gps-navigation-attack-leads-unsuspecting-drivers-astray/134172/