As far as scandals go, the revelation that Volkswagen had engineered onboard software in 11 million diesel cars to cheat on emissions tests may be one of the most costly ever. Credit Suisse now estimates it could cost the company $87 billion — more than the 2010 BP oil spill in the Gulf of Mexico.
We all enjoy the benefits of having computers in cars — including better braking during bad weather — but this scandal should cause us to rethink security and transparency issues related to automobile software, as 250 million cars are expected to be connected to the Internet by the year 2020.
“[T]he Volkswagen emissions scandal, though not precisely an Internet of Things issue, has exposed yet another issue with ‘smart’ physical goods: the possibility of manufacturers embedding software in their products designed to skirt regulations,” Klint Finley writes in Wired.
Because of the Digital Millennium Copyright Act, safety researchers have no access to the code being used to run onboard computers.
“Congress passed it in 1998 in part to protect DVDs from being pirated,” NPR’s Brian Naylor reports. “But courts have also interpreted the law to keep people from accessing the computer code in cars, homes, even tractors.”
Though it’s not mandated by law, this lack of transparency is consistent with much of the developing Internet of Things, which Finley describes as “profoundly closed.”
“You can’t wipe the factory-loaded software and load alternative software instead,” he writes. “In many cases you can’t even connect them to other devices unless the manufacturers of each product have worked out a deal with each other.”
The purpose of these restrictions is similar to why schools used to deactivate disk drives. They didn’t want kids loading viruses into a network.
In this scenario, the owners are the kids, and their cars are the school network that the teachers are sure will get screwed up. But this model relies on implicit trust in the manufacturer, and the VW scandal reminds us that such trust should not be handed out so easily.
“We can’t know for sure that researchers would have found the Volkswagen defeat device earlier if the software had been more open, but it surely wouldn’t have hurt,” Finley writes.
The auto industry has fought exemptions to the law that would allow researchers and owners access to car software. The Environmental Protection Agency has joined that fight, arguing that such access could lead to tampering by owners.
Is this the way we want the IoT to develop?
“Simply providing an application programming interface—API for short—that developers and hobbyists can use to build links between one product and another would go a long way towards making the Internet of Things more robust,” Finley argues.
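To make the idea concrete, here is a minimal sketch of the kind of open device API Finley is describing. Everything in it is hypothetical — the `SmartThermostat` and `WindowFan` classes and their method names are illustrative, not any real product’s interface — but it shows how a documented API lets two products from different manufacturers interoperate without a bilateral deal.

```python
# Hypothetical sketch: a "smart" thermostat that exposes a small,
# documented API instead of a closed protocol. None of these names
# correspond to a real product; they illustrate the openness argument.

class SmartThermostat:
    def __init__(self):
        self._temperature_c = 20.0
        self._listeners = []

    def read_temperature(self) -> float:
        """Public read access: any device may query the current reading."""
        return self._temperature_c

    def register_listener(self, callback) -> None:
        """Let third-party devices subscribe to temperature changes."""
        self._listeners.append(callback)

    def set_temperature(self, value_c: float) -> None:
        """Update the reading and notify every subscriber."""
        self._temperature_c = value_c
        for notify in self._listeners:
            notify(value_c)


class WindowFan:
    """A fan from a different (hypothetical) manufacturer: it needs no
    special deal with the thermostat maker, only the documented API."""

    def __init__(self):
        self.running = False

    def on_temperature(self, value_c: float) -> None:
        # Turn on whenever the room climbs past 24 °C.
        self.running = value_c > 24.0


thermostat = SmartThermostat()
fan = WindowFan()
thermostat.register_listener(fan.on_temperature)

thermostat.set_temperature(26.0)
print(fan.running)  # prints True: the fan reacted via the open API
```

Nothing here requires the thermostat’s firmware to be open source — only that its interface be documented and callable, which is the narrower form of openness Finley is arguing for.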
This is the sort of openness that led to the development of the personal computer and has played a significant role in the growth of Android as the world’s largest mobile platform.
However, Apple has maintained a more secure mobile platform with its “walled garden” approach, which requires all applications available for iPhones and iPads to pass its review before reaching the App Store. In this model, we rely on Apple to play the role of regulator, and it has a tremendous incentive to maintain security given the promises it makes to consumers. Until recently, almost no malware had ever shown up in its App Store.
If we’re going to trust the Internet of Things with our homes, our cars and our health, we need to evolve past a child/school model for security. At the very least, researchers need to be able to verify that our “things” are doing what they say they are.