
A bicyclist rides by a Google self-driving car at the Google headquarters. Getty/ Justin Sullivan
Fairbanks International Airport in Alaska revealed on Wednesday that several drivers had ended up on its runway after following Apple Maps directions that led them there.
Apple Maps has been criticized since its release with the iPhone 5 in September of last year, though its bad results have never been quite so spectacular or dangerous. Fairbanks blames the human error at play here, as it should, but the incident should make us question whether the rush to the driverless car will be as free of disaster as we like to think.
Google and several automotive companies have invested in building self-driving cars. California Governor Jerry Brown signed a bill allowing driverless cars on the state's roads. At the signing, Brown and Google co-founder Sergey Brin said that self-driving cars will be much safer than human drivers because they eliminate human error; what provisions they have made for software error remains vague. General Motors has promised a nearly driverless car by 2020; Nissan, a completely driverless one by the same year.
A Cisco study found that 57% of consumers surveyed worldwide trust driverless cars. (Brazil had the highest share, with 95% in favor of automation.) For the other 43%, perhaps the way to win them over is not to focus on the cars themselves but to build other automated apps that work flawlessly. A reliable Google Now, the app that creates schedules, updates social media, and warns users of upcoming problems on their commute, might do more to inspire trust in automated technology than the next bleeding-edge model from Nissan.