With every advance in technology comes a trade-off of some kind. Where personally identifiable information is concerned, the trade-off typically involves exchanging privacy and confidentiality for a non-monetary benefit. In the early days of social media, conventional wisdom held that the product was the service. Over the last decade, however, we have learned that the users of such platforms are the product, and the perceived benefits are merely carrots on sticks that keep those users engaged in the cycle. We willfully pour details of ourselves into various social media outlets despite documented bad behavior by giants like Facebook, and we remain largely complacent about having our personal data packaged and leveraged against us by various business interests.
However, in the conversations I have had around personal genomic testing (PGT), I have noticed that many people are quick to cite data privacy and risk as a key reason not to participate. Consider the contrast. On one hand, we have evidence that Facebook has used our data in dubious ways, yet we keep pouring ourselves into it (McNamee, 2019). On the other hand, the benefits of PGT are outweighed, in many minds, by the fear that genomic data might be misused.
My purpose is not to minimize the potential hazards of PGT. Consider the following risks: (a) hacking; (b) profit-seeking or misuse by the company or its partners; (c) limited protection from a narrow scope of laws; (d) requests from state and federal authorities; and (e) changes in privacy policy or company use following mergers, acquisitions, bankruptcies, et cetera (Rosenbaum, 2018). Weighed against the potential benefits of PGT, these are serious caveats. But read that list outside this context, and it applies equally well to the data we generate and hand over to social media outlets on a daily basis.
As yet, privacy protections around social media data exist only within the context of each company's own policies; there is no substantial federal regulation in the US on the matter, only the GDPR in the EU (St. Vincent, 2018). Where health information is concerned, US federal regulation is slightly more mature. The Health Insurance Portability and Accountability Act (HIPAA) requires confidentiality for all individually identifiable health information, and in 2013 the HIPAA Omnibus Rule extended those protections to genetic information, implementing provisions of the Genetic Information Nondiscrimination Act (GINA). While the rules prohibit the use of genetic information for underwriting purposes, there is no restriction on the sharing or use of genetic information that has been de-identified (National Human Genome Research Institute, 2015). And de-identification is not foolproof; in some cases the data can be re-identified (Rosenbaum, 2018).
The incongruence is puzzling. In the case of social media, users willfully provide a wealth of data points on a regular basis to companies that repackage and monetize that data for dubious purposes, in the absence of meaningful US legislation to protect it. In the case of PGT, where at least HIPAA and GINA have a rudimentary level of codified protection, users’ hesitance appears to be much more pronounced.
McNamee, R. (2019). Zucked: Waking up to the Facebook catastrophe. New York: Penguin.
National Human Genome Research Institute. (2015). Privacy in genomics. Retrieved from https://www.genome.gov/about-genomics/policy-issues/Privacy
Rosenbaum, E. (2018). Five biggest risks of sharing your DNA with consumer genetic-testing companies. Retrieved from https://www.cnbc.com/2018/06/16/5-biggest-risks-of-sharing-dna-with-consumer-genetic-testing-companies.html
St. Vincent, S. (2018). US should create laws to protect social media users’ data. Retrieved from https://www.hrw.org/news/2018/04/05/us-should-create-laws-protect-social-media-users-data