The homeowners association (HOA) of my townhouse community is debating the cost of replacing shrubberies separating our shared patios with patio privacy walls. I don’t use my patio as often as my neighbors, so I've been passively intrigued by the passionate rhetoric some homeowners have used at HOA meetings to argue in favor of the privacy walls. Although I appreciate the desire for personal privacy (and wish some of my sunbathing neighbors would value their privacy a little bit more), as I connected to my WiFi I couldn’t help but wonder why my neighbors aren’t as concerned about their Internet firewalls as they are about their patio walls. Most of my neighbors’ wireless networks are either unsecured or have a default password. (DSL networks through our local landline telephone provider use your home phone number as the default password.)
This is just one example of how in the age of big data, where nearly every aspect of our personal and professional lives is captured as data, we rarely seem concerned about data privacy. Not only do we use unsecured WiFi connections, but we also freely give away countless bytes of our personal data in exchange for free Internet and mobile services. And even when we do pay for services, how often do we consider how our data will be used? Where does the responsibility for data privacy lie?
Seeing through transparency
One of the most lauded principles of data privacy is transparency – but Omri Ben-Shahar dispelled its mythology. As he explained, the theory says “if firms are required to tell people what information they collect, and do so in a simple and conspicuous manner, people would be able to avoid doing business with those that inflict abusive privacy practices.” But there's a problem with the transparency solution – studies have shown no evidence that it works. It's tempting to conclude that this is because privacy policies are often verbose and vague. Yet “even when the notices about the ways firms collect, use and share information are delivered in the simplest and most concise manner, people still don’t read the notices and don’t change their behavior. If simple notices are not read or used by people, the hopes for informed choice crumble. Users are not going to opt out of Google’s personalized ads or personalize Facebook’s privacy settings. These consumers might comparison-shop among services based on various quality and service measures, but not on the basis of privacy features.”
Saving private data
So it would seem transparency alone isn’t the answer, since the majority of consumers don’t prioritize data privacy. And even those who do must rely on the organizations saving their data to save its privacy as well. Whether externally transparent or not, organizations need to protect data privacy by having clearly defined data privacy policies detailing exactly what data is to be saved and how that data may be used. Doing this provides a well-defined way to identify which data is sensitive and therefore subject to protection. Organizations also need to reduce the risk of unauthorized access to sensitive data by understanding how data privacy affects different organizational roles, and by masking sensitive data to provide differential privacy – enabling essential analytics without needlessly exposing sensitive information.
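To make the last two ideas concrete, here is a minimal sketch (in Python, with hypothetical helper names – nothing here is a specific product's API) of the two techniques mentioned above: masking a sensitive field so analysts never see the raw value, and answering a count query with Laplace noise, the classic mechanism for differential privacy. The `epsilon` parameter is the standard privacy budget: smaller values mean more noise and stronger privacy.

```python
import math
import random

def mask_email(email):
    """Mask the local part of an email address, keeping only its first character."""
    local, _, domain = email.partition("@")
    return local[0] + "*" * (len(local) - 1) + "@" + domain

def private_count(values, predicate, epsilon=1.0):
    """Return a count perturbed with Laplace noise (sensitivity 1) for epsilon-DP.

    The true count changes by at most 1 when one person's record is added or
    removed, so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for v in values if predicate(v))
    # Sample Laplace(0, 1/epsilon) via the inverse CDF of a uniform draw.
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    noise = -(1.0 / epsilon) * sign * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Analysts work with masked identifiers and noisy aggregates, never raw records.
customers = ["alice@example.com", "bob@example.com", "carol@other.org"]
masked = [mask_email(c) for c in customers]          # e.g. "a****@example.com"
n_example = private_count(customers, lambda c: c.endswith("example.com"))
```

The design point is that both protections happen before data reaches the analyst: the masked column still supports joins and frequency analysis by domain, and the noisy count is accurate enough for aggregate reporting while no single customer's presence can be confidently inferred from it.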
What say you?
Can we save private data? Or in the age of big data, is this a war we have already lost? Please share your perspective and experience regarding data privacy by posting a comment below.

Download a paper about SAS Data Management and the General Data Protection Regulation.