Just Because We Can Use Big Data Analytics, Should We?

By ISBuzz Team, Writer, Information Security Buzz | Sep 10, 2014 05:02 pm PST

As a data scientist, I find the most important and interesting aspects of my profession include identifying causal relationships, performing “what if” analyses on different scenarios, and, above all, seeking to answer questions.

I recently read a news article on how large US healthcare providers are using data collected from consumers, such as food and lifestyle purchases, to assess whether someone is more or less likely to get sick. After reading it, I think we need to bring those same critical thinking skills I use in my job to bear on the very serious privacy concerns surrounding the use of people’s personal information.


Under the guise of trying to improve people’s health, the article raises so many “nanny state” red flags that it’s hard to know where to begin. For example, in discussing how risk scores are applied to patients, a chief clinical officer of analytics and outcomes for a healthcare provider explains that his company plans to pass patient scores to doctors and nurses, who can then reach out to the highest-risk patients and suggest treatment before they fall ill. Exactly what does “reach out” involve? He is also quoted as saying, “What we are looking to find are people before they end up in trouble.” What if that person doesn’t want some bureaucrat to find them? What if they want to be left alone, in control of their own health?

As if in response to those questions, the officer goes on, “We are looking to apply this for something good.”

That really says it all, doesn’t it? What may seem to be “something good” is in reality a Pandora’s box of unintended consequences, including, but not limited to, flagrant violations of people’s constitutional right to privacy.
It is one thing to aggregate data and perform analytics in order to make assumptions about certain demographic groups. From a pure data science perspective, big data analytics can certainly provide interesting information to support or refute a diagnosis, or to predict the success or failure rates of a particular treatment with regard to external stimuli.
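To make that distinction concrete, here is a minimal, hypothetical sketch in Python (the records, cohorts, and field names are all invented for illustration) of the kind of aggregate, cohort-level analysis described above, in which only group-level statistics are produced and no individual is singled out:

```python
from collections import defaultdict

# Hypothetical, anonymized cohort records: (age_band, treatment, succeeded)
records = [
    ("40-49", "statin", True),
    ("40-49", "statin", False),
    ("50-59", "statin", True),
    ("50-59", "statin", True),
    ("50-59", "placebo", False),
]

# Aggregate success rates per (age_band, treatment) cohort.
# Only group-level statistics are produced; no output line is
# traceable to a specific person.
totals = defaultdict(lambda: [0, 0])  # cohort -> [successes, trials]
for age_band, treatment, succeeded in records:
    counts = totals[(age_band, treatment)]
    counts[0] += int(succeeded)
    counts[1] += 1

for (age_band, treatment), (successes, trials) in sorted(totals.items()):
    print(f"{age_band} / {treatment}: {successes}/{trials} "
          f"({successes / trials:.0%} success rate)")
```

The output here is a property of a cohort, not a dossier on a person; the ethical trouble begins when the same pipeline is pointed at a named individual.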

However, that’s a far cry from using specific, detailed behavioral information about an individual and their purchases to formulate a medical “pre-treatment.” One woman with Type 1 diabetes mentioned in the article has received phone calls from her insurance company to discuss her daily habits. Do you want to have that conversation with some unknown person on the other end of the line at your insurance provider? This is outrageous and clearly falls into the “none of their business” category.

Today, credit card companies and retailers can sell your private information to data brokers. Realistically, most of us know this happens on a daily basis. However, the example above seems to cross an ethical line.

How in the world would we even vet this data? Correlating information across all these domains presents a clear and present threat to privacy, with minimal, if any, value added for the individual. It also creates opportunities for both government and individuals to misinterpret people’s data. For example, how can someone evaluate another person’s smoking or drinking habits based on their purchasing behavior alone? The ample room for subjective analysis is itself a significant threat to the consistency of these assessments.
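To illustrate how easily such inferences go wrong, consider a small hypothetical sketch (the rule, threshold, and purchase data are all invented): a naive purchase-based “drinking risk” flag assigns identical results to a habitual drinker and to someone buying the same bottles as holiday gifts, because the purchase record alone carries no context.

```python
# Hypothetical, naive rule: flag anyone whose monthly alcohol
# purchases meet a threshold as a "drinking risk".
ALCOHOL_KEYWORDS = {"wine", "beer", "whiskey"}

def naive_drinking_flag(purchases, threshold=4):
    """Count alcohol line items in a month of purchases."""
    count = sum(1 for item in purchases if item in ALCOHOL_KEYWORDS)
    return count >= threshold

# Two very different people, identical purchase records:
habitual_drinker = ["wine", "wine", "beer", "whiskey", "bread"]
holiday_gift_buyer = ["wine", "wine", "beer", "whiskey", "bread"]  # gifts for four hosts

print(naive_drinking_flag(habitual_drinker))    # True
print(naive_drinking_flag(holiday_gift_buyer))  # True; same flag, wrong conclusion
```

Any downstream risk score built on such features inherits that ambiguity, no matter how sophisticated the model layered on top.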

While most people recognize that we will never again have the degree of privacy we enjoyed even a few years ago, they probably don’t understand the extent to which information is gathered about them in today’s world. From cell phone call histories to camera snapshots to credit card records, there really is no such thing as privacy anymore.

As we have seen with the recent Supreme Court decision on warrants and cell phones, the digital age means we need to rethink privacy and how we protect our personal data. An article in the MIT Technology Review suggests that a code of ethics is needed to govern big data, outlining some thought-provoking tenets that should be adopted. Implementing such a framework would be difficult; big data means big business and big money, after all. As a result, we have to ask ourselves two important questions: What is our privacy worth to us? And have we already crossed the point of no return?

By Dan Nieten, CTO, Red Lambda

About Red Lambda

Red Lambda is a pioneering technology company that has developed a next-generation IT security and analytics solution for Big Data environments. In an industry yearning for innovation, Red Lambda and its flagship solution MetaGrid offer organizations around the world a new way to combat exponentially multiplying network security threats. Challenging the status quo, Red Lambda has torn through uncharted territory, creating in MetaGrid what one Fortune 500 CTO refers to as “nothing short of revolutionary technology…game-changing software.”
