Personality Harvesting and One's Right to Privacy

Like many others, I found Cambridge Analytica’s data-harvesting practices unethical on multiple levels. Though the data was gathered “ethically,” through public surveys with consenting participants, using that data as a tool to sway elections is not. Life experiences, among other things, shouldn’t be sold by the companies you buy from so they can be used against you.

The case study details how companies like Cambridge Analytica used public data gathered on 190 million voters to help political campaigns tailor their messaging, in hopes of swaying voters’ opinions or even encouraging them to vote. This appears to violate people’s right to the truth, since most users never consented to any personality survey.

My issue with Cambridge Analytica’s personality harvesting is that most people in the country have no idea their information is being used to manipulate them. Political elections often feel deeply personal, and the issues increasingly reflect a person’s morals, not just their political affiliation. I don’t believe my data should be cross-referenced with that of someone who did consent to a questionnaire simply because of my Facebook use. Using a social media site does not mean I have consented to having my personality analyzed.

When it comes to data, I believe transparency matters most, even if disclosure leads some people to decline a service or survey. Companies should be as transparent as possible about the data they collect and what they use it for. That means full disclosure of what data is collected and how it is used, for any user who has so much as interacted with a website. I think this is one of the few ways to avoid corruption. Though most data-analytics companies are driven by aggregating the information of many to analyze and predict the actions of a few, that doesn’t mean there aren’t, or won’t be, companies looking to exploit the data at our fingertips.

Cambridge Analytica’s disclosure was woefully insufficient. Most survey takers would not classify political campaigns as research partners. Though the privacy policy stated that a person’s responses might be used on a grand scale, I know I would not participate in a survey that shapes how political campaigns advertise to me. This use of information hardly qualifies as prioritizing the “good” of the many, as political parties aren’t exactly contributing any “good” with this data. Instead, it is being used to sway outcomes and reinforce political agendas.

Educated people should be able to decide for themselves. That requires full disclosure about their information in plain language, not a complicated Terms of Service or Privacy Policy document. In this case, I fully understand the benefit to the campaigns, but I’m not sure there is any benefit to society overall. Yes, we may see better-targeted political ads, but that is not a benefit to voters. Instead, many people who never consented to having their psychographic information used to adjust marketing are having their privacy rights violated. The affected parties have a right to privacy in their personal lives, and in this case their likes and dislikes, openness, and neuroticism should remain private.

In addition, while it is possible, there is no data proving that personality-based political ads have any effect on election outcomes. That hardly makes the case for political personality harvesting, as I would only agree to this kind of data aggregation and analysis if it served the common “good.” I can understand why this data is useful, but it should not be used to manipulate us into voting for certain people.

The ACM Code of Ethics and Data Science
