READ ORIGINAL ARTICLE AT AVAST BLOG
by Garry Kasparov
I spoke on politics and human rights at an important forum in New York last May, and my fellow speakers included many current and former politicians and academics there to talk about everything from North Korea to press freedom to cybersecurity. Former US Congressman Mike Rogers was one of them, and he gave a polished presentation about many of the risks we are facing today in the digital sphere, both for personal and national security. As the former Chair of the House Intelligence Committee, he was faced with these urgent concerns on a daily basis. (Unfortunately, the Committee has now become a political battleground, a very dangerous situation because security shouldn’t be a partisan issue.)
It was good to hear Rogers talk about many of the themes I’ve brought up here, including the dangers presented by nation-state actors who consider aggressive hacking to be a national priority. Rogers listed Russia, China, Iran, and North Korea as the most common sources of these threats, with the goals of collecting intelligence, stealing research and money, and, in a few cases, doing physical damage by sabotaging systems like power grids.
Rogers also talked about consumer privacy and security, and he brought an amusing prop to help make his point. He asked the audience about the “most successful spy in history” and then brought out a Barbie doll. The “Hello Barbie” version was released in 2015 and could talk with kids using wifi and speech recognition. There were two huge problems. First, the toy collected a lot of personal information in these conversations. (“Tell me about your family!”) Second, security flaws were discovered that would allow hackers to access the user’s data, effectively turning Barbie into a household super-spy.
Last year, another internet-connected doll, called Cayla, was banned in Germany over privacy concerns, which are of course heightened when it comes to protecting children. Just a few days ago, a similar situation led to major US retailers halting the sale of the CloudPets toy, after the Mozilla Foundation protested its spying capabilities and data breaches. These cases got attention, but the root problem is a lack of standards to inform consumers—and accountability when things go wrong. As a Mozilla representative said, “Some smart toys are better at security than others, but we felt like CloudPets was an egregious example. Our goal was to reach out to retailers to make sure they knew exactly what kind of product they were selling.”
One report cites online reviews of the Hello Barbie that perfectly illustrate what Rogers and I both warn about: people care more about features, and, in Rogers’ words, about “saving two seconds,” than about privacy. “Many parents, however, still have decided to buy the doll. ‘I read all the hacking stuff but I’m sorry if big brother was going to spy he’s already doing it through your smart phone.’” Every time a consumer says something like this, a hacker gets his wings…
Ironically, we seem to be far less concerned about our privacy as adults, despite having far more valuable information to protect. Nor do we seem to have any more self-control than children begging for the latest cool toy. We download apps and updates, click through the security information without pausing, and are shocked when the news reveals how much information these apps have about us, and how much of it they sell to other companies and political organizations.
We even have a less fun version of household spies, devices that are far more popular and potentially invasive than any toy. When Rogers asked his rhetorical question about the “most successful spy ever,” many of us in the audience whispered “Alexa!”, the ubiquitous virtual assistant that runs on Amazon’s Echo devices as well as phones and just about any computing device. Add Apple’s Siri, Google Home, and Microsoft’s Cortana, and we’re all virtually surrounded by these virtual helpers. There have already been some amusing, and some not-so-amusing, incidents of abuse with these devices.
The fierce competition between these US tech giants ensures two circumstances that are at odds. One, they will work very hard on security because any breach will lose customers to their competition. Two, the rush to introduce new devices and features ahead of the competition inevitably leads to security holes. It’s a delicate balance, a balance that is tipped depending on how consumers and government react to security and privacy violations. If the companies see that they aren’t punished for the security flaws, they have little incentive to prevent them. That’s the free market at work, for better and for worse.
Common sense tells us that the more common these devices become, the more security problems there will be. But as so often happens, common sense isn’t entirely correct here. There will likely be more incidents in sheer quantity at first, but, paradoxically, the percentage of problems will likely go down as the technology matures and becomes standardized. Regulations will be clearer, consumers and companies will learn their obligations and risks, and the chain of responsibility won’t be so obscure.
This tendency reminds me of what traffic pattern researchers call the Dutch cycling effect, or, more generically, safety in numbers. Whenever cities contemplate adding more bike lanes, critics warn that there will be more accidents. And while there might be an initial rise in the raw numbers, studies show that the rate of accidents per cyclist goes down, and then the numbers themselves start to go down, too. When drivers and pedestrians get used to cyclists, and to the laws and practices that accommodate them, it becomes safer for everyone. Drivers in cities with relatively few cyclists, like New York City, generally aren’t expecting bikes, don’t look for them, and don’t know who has the right of way or the responsibility for interacting with them. But in Amsterdam or Copenhagen, where there are as many or more bikes on the road than cars, everyone knows what is expected, and the conventions are in place and respected. (Except by foreign tourists, who are often startled by the highways of bicycles.)
This creates a virtuous cycle in which increased safety leads to even more people cycling. Or, to end this extended metaphor, increased security leads to more people adopting digital devices once they see that the devices are secure and they understand the digital “rules of the road” for keeping themselves and their data safe.
We aren’t there yet, but if you can judge by the recent wave of new terms of service agreements issued by every site and service, we are making strides. These are in response to the European Union’s General Data Protection Regulation, or GDPR, which went into effect on May 25, 2018. There will surely be changes and improvements, as with any new laws that are so broad and global, but it’s a good beginning at creating accountability on the corporate side for giving users more awareness and control over their own data. My next column will focus on the practical upsides and downsides of the GDPR and other legislation. Most importantly, regulation doesn’t free consumers from the responsibility of staying informed and active regarding their privacy and security. Information isn’t worth much if you don’t act on it.
Regulations may help protect us all in the long run, but it’s equally essential to develop good personal cybersecurity habits, just as an experienced cyclist or driver develops good habits. A truck running a red light is illegal, but you should still look both ways when you cross the street. Otherwise, while the law might hold the driver accountable for the accident, that’s small comfort when you’re as flat as a blini on the internet superhighway.