
Cool, Not Creepy: Avoiding IoT Backlash by Respecting and Educating Your Customers

Posted July 1, 2015

Cutter IT Journal, Vol. 28, No. 7

"'We'll see,' said Calvin. 'The robot may prove too valuable to dismantle.'"1 These words of the legendary Dr. Calvin -- the fictional chief robot psychologist in Isaac Asimov's revolutionary Robot Dreams short stories -- underlie our contemporary trepidations. We are at the cusp of an era in which connectivity is no longer limited to computing devices but expansively includes more and more everyday networked objects: an era already known as the Internet of Things (IoT). If we do not trip over our fears, the upcoming decade will usher in a new world of highly useful and valuable objects. Greater technologies will also lead to our dependence upon those technologies. In plainer terms, the point of creating cool stuff is to make people want and need the cool stuff.

Ultimately, however, Dr. Calvin felled the dreaming robot with a gunshot, proving that no value is too great to dismantle if it triggers our visceral fears. Heed my words: if you want your cool stuff to stick around, then do not ignore the possibility that it will elicit a fearful response. Do not stick your head in the sand when it comes to privacy protections. Deal with them from the get-go and spare yourself publicity like this CNN headline: "Your Samsung TV is eavesdropping on your private conversations."2 Either don't invent a robot that can dream, or market the hell out of it so that everyone knows that the robot dreams, is OK with the dreaming, and actually wants to buy a dreaming robot.

Technology backlash is as old as technological innovation. It is inevitable that people will grouse about new technologies and adopt them with varying degrees of acceptance. Yet, with one caveat, the cool stuff will take hold and prevail on the basis of its functionality and actual worth to people. The caveat is that this will happen only if these products do not give people some absurd reason to do a double-take and say, "What? You didn't tell me this amazing product" -- and here, take your pick -- "uses triangulation to share my location with perverts," "shares my aimless meandering around department store aisles with marketers," "leaves my television camera running," or "records my child babbling away to a beloved toy."

Nowadays, it is the vogue to be reactionary, to make grand statements such as deleting one's Facebook profile or starting a Google bus protest. This article concerns itself with a particular type of reaction to the Internet of Things -- that of consumers who avoid buying cool products because of privacy-related fears. My advice for overcoming this resistance is twofold: avoid infamy and bridge the gap.

AVOID INFAMY

The first guiding principle is to avoid infamy. Privacy invasions are a hot topic and are often presented in an inflammatory manner by the media. Where one used to accrue cachet for being "in the know" about the latest and coolest tech products, there is now a perverse prestige in refusing to buy into the latest and coolest because one is in the know about privacy. Folks nowadays regale dinner companions with all the creepy ways in which corporate America uses and misuses consumer data. Everyone has a favorite scare story -- be it Facebook Sponsored Stories, toys that listen to your kids, or Samsung televisions with an open camera feed into your family room. My favorite is the hacked baby monitor that started speaking, in the white hat hacker's voice, into a baby's room. Trust me, it makes for good dinner conversation.

There are many ways in which companies can reduce the risk of ignominy by being proactive. On the production end, it is invaluable to invest in vetting products for privacy hiccups or security gaps before they hit the market. Here, take note that privacy and security are not the same creature. Privacy refers to consumer-facing practices -- what you do, don't do, will do, and promise to do or not do under certain circumstances -- that affect people's personal data and their identifiability by third parties. Security refers to the protection of corporate assets (including data on customers) and systems; this includes a risk management program and technology policies in addition to actual security measures. On the consumer end, it is vital to keep an open mind while keeping an ear to the ground to really understand consumers' reactions to connected products. Companies that lack sensitivity to their customers' anxieties and needs will be culled from the rest -- and often in a very public manner.

Acknowledge Privacy Fears

First off, IoT executives and managers need to heighten and even reset their awareness of the myriad ways in which privacy affronts can bring down a business. At this point, there is plenty of rhetoric about consumer privacy. Yet there are still indications that executives remain smug on the issue. A recent Deloitte study depicts the disconnect between corporate complacency and consumer expectations. Whereas executives feel that their corporations are behaving better with regard to privacy and data security, consumers report precisely the opposite experience:

  • 77% of executives surveyed believe their companies' data privacy policies are clear and well-understood by consumers, while roughly 73% of consumers say they'd like to see more easily understandable data privacy policies from companies.

  • 47% of executives believe consumers regard the risks of sharing their personal information as worth the personalized promotions, advertising, or coupons they receive from [consumer product] companies in return, yet only 25% of surveyed consumers agree.

  • 47% of executives think consumers view the risks of sharing personal information as worth the product recommendations they receive from [consumer product] companies, yet only 18% of consumers say the same.3

Name a Privacy Point Person

Second, there needs to be a leader on the privacy front, someone who is persistent in the face of apathy. Privacy is often perceived as a mere compliance roadblock. Security is typically regarded as a necessity, but employing security tools without regard to the human angle is ineffective. Human error causes a significant percentage of data breaches, misguided product development often leads to privacy snafus, and there is plenty of room for mistakes in this unregulated sector. The security defaults are insufficient when they themselves were set by technologists of a pre-IoT era. Many companies might give lip service to privacy, or even have an authentic regard for privacy and security. However, scattered interest will not get the job done. Every movement needs a leader, someone to corral all the stakeholders and push the needed conversations to the fore. The privacy point person must be an advocate, a proponent, and an administrator who has the authority to raise these issues and push privacy conversations through to completion. He or she needs the backing of the executive level and buy-in from the security staff. This person may be a chief privacy officer, a chief information security officer, or any other leader willing to champion the cause.

At Single Stop, the data-intensive, tech-driven nonprofit where I serve as privacy counsel, I work hard to get all of our departments to reach out to me and engage in conversation about the privacy of our clients and the security of our systems and processes. It starts with an outstretched hand and an open invitation to talk. For example, I developed a privacy impact assessment (PIA) report, but rather than put the onus on staff to do the paperwork on subject matter that is new to them, I rolled it out by asking around about new projects and offering to fill out the PIA for folks. It's like saying, "I came up with a new compliance hurdle, but I'll take the first jump for you." I have found that in this situation, program and product developers are more willing to be up front about their product's implications for personally identifiable data, privacy, and confidentiality. It is critical to identify these issues beforehand, because then your product developers can bake in the privacy protection rather than spreading it on like frosting after the fact. Frosting is attractive, but it is also easily smeared off. Conversations can lead to actual product changes and implementations that have a significant impact on preventing privacy fiascoes or backlash.
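To make the idea concrete, here is a minimal sketch of what a lightweight PIA intake record might look like in code. It is only an illustration: the PrivacyImpactAssessment structure, its fields, and the flagged thresholds are my own assumptions for this article, not Single Stop's actual form or any compliance standard.

    # Minimal sketch of a privacy impact assessment (PIA) intake record.
    # All field names and thresholds are illustrative assumptions.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class PrivacyImpactAssessment:
        project_name: str
        data_elements: List[str]          # e.g., ["name", "email", "location history"]
        purpose: str                      # why the data is collected at all
        shared_with_third_parties: bool
        retention_period_days: int
        mitigations: List[str] = field(default_factory=list)

        def open_issues(self) -> List[str]:
            """Flag the concerns a privacy point person would raise in conversation."""
            issues = []
            if self.shared_with_third_parties and not self.mitigations:
                issues.append("Third-party sharing with no documented mitigations.")
            if self.retention_period_days > 365:
                issues.append("Data retained longer than a year; justify or shorten.")
            if not self.purpose:
                issues.append("No stated purpose for collecting this data.")
            return issues

    # Filling out the PIA on behalf of a product team, per the outstretched-hand approach:
    pia = PrivacyImpactAssessment(
        project_name="Connected toy voice feature",
        data_elements=["child voice recordings", "device ID"],
        purpose="speech-to-text for interactive play",
        shared_with_third_parties=True,
        retention_period_days=730,
    )
    for issue in pia.open_issues():
        print(issue)

The point of even a toy structure like this is that the questions get asked before launch, while the answers can still change the product.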

Take Action

If you can surmount the hurdles of getting folks from the top of the organization to the bottom to talk about and participate in privacy conversations, the next step is to operationalize this work. A privacy program consists of regular risk assessment exercises and a structure for spotting issues and processing concerns and complaints. A privacy program does not require many full-time employees (or even one). An attorney or compliance manager can recruit stakeholders from throughout your organization to participate in a privacy awareness and protection committee. Staff can be assigned additional responsibilities to keep an eye on privacy issues and support the point person. Internal promotion of education and security reminders goes a long way. I prefer to leverage internal staff rather than relying upon an external consultant to muster up this kind of participation.

BRIDGE THE GAP

As economists understand, expectations drive the economy. Consumers rationally expect that certain products (such as toasters) will not undertake certain actions (such as recording video of their kitchen activities). That was an extreme example -- I hope -- but it is vital to think ahead to the expectations that your product will elicit. Consumers vary widely in their awareness of the issues, technical ability to adjust privacy settings, and interest in being proactive about privacy. Expectations are based upon a person's experiences and knowledge, so it behooves us to consider the full range of expectations down to the lowest common denominator.

The Internet of Things presents consumers and businesses with a candy store of technological offerings to ease their lives and operations. However, these sweets are virtually unregulated and are built on a foundation of esoteric surveillance and data-collection practices. The offerings are outpacing the education and information required to aid consumers in making knowledgeable decisions about them. The reach of technology is deeper and more potentially invasive than ever, and these cool products should be offered concomitantly with education and real-world partnerships rather than shrouded as mysterious tech within a slick polyurethane case.

Mass consumption of technologies that are intimately involved in our lives will certainly lead to a backlash if consumers feel betrayed after spending their dollars inviting these products and companies into their homes. A customer who indulges her sweet tooth does not expect that the candy will cause her skin to break out in tattoos, just as customers of Aaron's Inc., the electronics rent-to-own retailer, did not expect their rental items to come in "Detective Mode" that would monitor their keystrokes and covertly activate their webcams.4 That is a negative kind of example. But what if your candy is better than just a confection? What if it also has healing and nutritional properties? In addition to easing the chores of driving and navigating, self-driving cars may save a significant number of lives. Technology is no longer just a tool, but a way of life. People need to understand that they are eating a new kind of candy -- one with deep and unexpected implications in their lives. It may come with sketchy ingredients like real-time monitoring, but also awesome ingredients like decreased accident casualty rates, lower energy bills, and insurance discounts.

I hesitate to use the buzzword "transparency" at this point, although that is what we are discussing. Transparency, as a concept, has failed. Companies all have a privacy policy and terms of service on their websites and apps. Policymakers make a big deal about consumer disclosures such as privacy labels, short-form privacy policies, and terms of use agreements. But who reads them? And who understands them? Projects such as Terms of Service; Didn't Read5 are commendable, but who will use them? It's still the people in the know. The concept, as it currently operates, is esoteric lingo to the typical consumer. If you stopped ordinary pedestrians on the street and queried them about transparency in consumer tech products, how many people do you think would register understanding? The word "transparency" is, ironically, rather opaque.

Teach Your Customers Well

True consumer education is more than disclosure statements. It needs to extend to teaching consumers how to engage with technology with mastery and consciousness of surreptitious practices. When it comes to privacy, there is a widening gap between those with tech know-how and passive tech users. I see a parallel in current US national debates on economic parity and the haves and have-nots. The emotional current in America is relevant here because there is always an emotional element to widespread public backlash, be it against economic inequality or tech creepiness. A big plus in this case is that it should be easier to narrow the tech-savviness gap than to solve poverty and racial inequality. Technology education has made strides in accessibility and quality, but it needs to keep pace with the next surge in product offerings.

There are two kinds of tech education that can make a difference here: one is the broad education of the general public, and the other is specific guidance for consumers of IoT products. To stave off a backlash against IoT technology, we must bring the public into the fold by helping everyone become a "techie" to some small degree. Industry has the resources to offer free tech education to mass audiences: think Lynda.com meets Khan Academy. We are rapidly approaching a world in which everyone needs at least a remedial education in the language of technology and privacy. For example, I believe that every online consumer should know how to inspect a website's source code and response headers to identify which cookies are being placed. Every consumer should know how to fiddle around with the settings in their IoT products. Every IoT product should come with an accessible privacy menu like the iPhone privacy settings, with as much detail as is appropriate for the product and its data uses.
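As a rough illustration of this kind of inspection, the sketch below lists the cookies a site sets on a first, unauthenticated visit. It assumes Python and the third-party requests library purely for illustration (a browser's developer tools accomplish the same thing), and the URL is a placeholder.

    # Sketch: list the cookies a website sets on a first, unauthenticated visit.
    # Assumes the third-party "requests" library; the URL is a placeholder.
    import requests

    response = requests.get("https://example.com", timeout=10)

    print(f"Cookies set by {response.url}:")
    for cookie in response.cookies:
        print(f"  {cookie.name}: domain={cookie.domain}, "
              f"secure={cookie.secure}, expires={cookie.expires}")

    # Cookies may also be set on intermediate redirects; inspect those hops too.
    for hop in response.history:
        if "Set-Cookie" in hop.headers:
            print(f"Redirect {hop.url} also set: {hop.headers['Set-Cookie']}")

A consumer who can run or even just read something like this is no longer a purely passive tech user; that is the modest level of fluency I am arguing for.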

Moreover, broad-appeal education programs must not be confined to the virtual world but should materialize as programming and partnerships in the corporeal world. Local libraries, town rec centers, and cafes are hubs through which corporations and trade associations can partner with real organizations to push out education, training, and messaging that demystify technology and demonstrate both its potential usefulness and its invasiveness in our private lives. Industry can engage with nonprofits such as the Electronic Privacy Information Center (EPIC) and the Future of Privacy Forum (FPF) to create campaigns to educate the public about how to optimally harness the benefits of cutting-edge technology. There is no reason for industry to have an adversarial relationship with such organizations when, in the end, transparency will lead to consumer empowerment and comfort. Privacy should not be a political issue. Cooperation between industry and consumer advocates can depoliticize this charged topic, because ultimately there is an alignment of goals here: creating cool stuff that people want to pay for.

In my utopia, all kids would be tinkerers using tools available at their playgrounds, all grandparents would learn how to Google Chat at their senior citizens center, all public libraries would host a Mini Maker Faire, and ... all high schools would offer a class in sci-fi literature. When I attended junior high school, they still taught auto shop, wood shop, and home economics. Those classes may have fallen by the wayside, but something should sprout up in their place. What about engineering classes and programming classes? What about combining both with Lego robotics toys?

NICE GUYS FINISH FIRST

Like cream rising to the top, companies that publicly embrace consumer education and empowerment will reap benefits in the marketplace. There is a role for industry self-regulation and, possibly, government regulation that could level the playing field for industry and ease the way for consumers. However, until that happens, we have before us a prime opportunity for IoT producers with integrity to distinguish themselves from the horde of data-collecting, location-snooping, vulnerability-ridden products.

ENDNOTES

1 Asimov, Isaac. Robot Dreams. Berkley Books, 1986.

2 Goldman, David. "Your Samsung TV Is Eavesdropping on Your Private Conversations." CNNMoney, 10 February 2015.

3 "Executives Underestimate Importance of Security, Privacy to Consumers." Deloitte CFO Journal, 15 June 2015.

4 Kerr, Dara. "Aaron's Computer Rental Chain Settles FTC Spying Charges." CNET, 22 October 2013.

5 Terms of Service; Didn't Read is an online project that seeks to pare down major websites' lengthy terms of service agreements into a meaningful and visually understandable ratings-based label.

About The Author
Annie Bai
Annie C. Bai, CIPP/US, CIPM, is a graduate of NYU School of Law. Ms. Bai is Privacy Counsel at Single Stop, a national anti-poverty nonprofit, and speaks on modern privacy law for New Directions for Attorneys at Pace Law School. She consults with for-profits and not-for-profits on privacy and data security, including privacy audits, in a variety of industry sectors. Ms. Bai is a member of the Technology Operations Committee for Per Scholas, an IT workforce…