Neuroethics

Neurodata Protection: Shortcomings and Considerations for Responsible Data Sharing

Rohit Paradkar


Abstract

With the increasing usage of neuroinformatics in brain research, there has been an unprecedented growth in brain data usage and sharing. These are accompanied by many ethical and societal issues that must be considered and addressed in order to expand on the effectiveness of neuroinformatic research. This article specifically discusses the problem of data-sharing frameworks and how they infringe upon the privacy of patients and test subjects. The aim of this article is to suggest future considerations and open a dialogue on measures that can be utilized to address these major privacy concerns.

 

Intro to Big Data and Neuroinformatics

Just as the invention of the wheel reshaped the world in 3500 BC, one tool has revolutionized the 21st century: data. The use of big data has skyrocketed in recent years, and it now touches almost every discipline, including banking, healthcare, stock trading, and most importantly, research. With the rapid advancement of neuroinformatics, a discussion about the ethics of neurological data privacy and sharing has become increasingly significant. This article breaks down the inherent ethical concerns of data sharing, usage, and privacy involved in neuroinformatics and suggests future considerations to address these issues.

Neuroinformatics is a field of study that combines neuroscience and information technology by building computational models from neurological data. It serves three primary purposes: creating tools to manage and store neuroscience data; developing software to analyze data; and building elaborate models to improve our understanding of brain function [1]. The data used in neuroinformatics is called brain data and comes mainly from brain scans such as fMRI (functional magnetic resonance imaging) and EEG (electroencephalogram) [2]. Neuroinformaticians collect raw brain data, interpret it, and then employ it in computational experimentation.

Neuroinformatics research is a key factor in allowing modern-day scientists to make discoveries that further our understanding of the brain; approximately 100,000 research papers are published every year that make discoveries through analysis of brain data [3]. It will remain a driving force in sustaining neuroscience research in the information age [1].

 

Current Data Sharing Practices and Frameworks

The sharing of large volumes of experimental data is essential to the efforts of neuroscientists around the globe. However, this comes with alarming ethical concerns, ones that must be addressed in order to tap into the full potential of the discipline. In order to understand the problem, it is necessary to examine current brain-data privacy measures.

In most research projects, data is first anonymized to exclude any personal and health information and then shared through interoperable databases. This strategy is employed by studies small and large, including the Human Brain Project (HBP) and the BRAIN Initiative, two international efforts that aim to build a collaborative neuroinformatic framework [4]. While anonymization sounds like an effective method to eliminate privacy concerns, two key problems remain: violation of informed consent and the re-identification of subjects.
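To make the first step concrete, a typical de-identification pass strips a record of its direct identifiers (names, contact details, record numbers) before it is released for sharing. The sketch below is purely illustrative; the field names are hypothetical and do not come from any particular framework mentioned above.

```python
# Illustrative de-identification step, as commonly performed before data
# sharing. The field names below are hypothetical examples of the direct
# identifiers a real pipeline would remove.

DIRECT_IDENTIFIERS = {"name", "address", "phone", "email", "ssn", "mrn"}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

subject = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "age": 34,
    "scan_type": "fMRI",
    "diagnosis": "none",
}

shared = deidentify(subject)
print(shared)  # {'age': 34, 'scan_type': 'fMRI', 'diagnosis': 'none'}
```

Note that the remaining fields (age, scan type, diagnosis) are exactly the kind of indirect attributes that the re-identification problem discussed below exploits.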

 

Problems with Informed Consent and Re-identification

Informed consent is the process by which a research participant contractually agrees to the usage of their data in a research study with a complete understanding of its use and possible consequences [5]. In most cases, researchers are required to obtain consent to utilize data for each individual project. However, with frameworks that allow researchers to share data publicly, the informed consent of participants can be breached: their personal data may be unknowingly used for research projects other than those agreed upon. The ethical implications are serious, especially given the inherent sensitivity of brain data [6].

Brain patterns obtained via neuroimaging are even more unique than DNA or fingerprints and are extremely personal to an individual. Brain data can not only detect diseases and mental disorders, but can even reveal an individual's intentions, behaviors, and personal thoughts [7]. Even after anonymization, brain data is more sensitive than other health information because it is the only data that can directly reveal a part of the forum internum (the inner world of mental individuality and persona) of a test subject [8]. It is therefore necessary to put forth stronger privacy laws to protect the rights of research participants and to preserve the integrity of the neuroinformatics field.

Another issue that arises within data sharing is the ease of re-identifying brain data. Currently, US researchers are required to follow the Health Insurance Portability and Accountability Act (HIPAA) guidelines to protect "individually identifiable health information" [9]. Data anonymization is compliant with these guidelines [10]; however, with the rapidly advancing capabilities of artificial intelligence and machine learning, anonymization alone is not enough to effectively protect the privacy of test participants. A study conducted by researchers from Imperial College London and Université Catholique de Louvain found that, using just 15 demographic attributes, the anonymized data of 99.98% of Americans could be correctly re-identified. Furthermore, even incomplete and heavily anonymized datasets were found to be easily identifiable [11]. The case of Cambridge Analytica's political weaponization of Facebook data exposed the ramifications of inadequate data privacy laws [12]. It is imperative that robust alternatives to data anonymization are developed.
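The mechanics of such a re-identification can be sketched as a simple linkage attack: an attacker joins an "anonymized" release against a public record on shared quasi-identifiers such as zip code, birth year, and sex. All names and values below are invented for illustration; real attacks, like the study cited above, use many more attributes and probabilistic matching.

```python
# Illustrative linkage attack: re-identifying an "anonymized" release by
# joining it to public records on quasi-identifiers. All data is invented.

anonymized_release = [
    {"zip": "02458", "birth_year": 1986, "sex": "F", "diagnosis": "epilepsy"},
    {"zip": "02459", "birth_year": 1990, "sex": "M", "diagnosis": "none"},
]

public_records = [
    {"name": "Alice Smith", "zip": "02458", "birth_year": 1986, "sex": "F"},
    {"name": "Bob Jones", "zip": "02459", "birth_year": 1990, "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "sex")

def reidentify(release, public):
    """Match each anonymized row to any public record sharing all quasi-identifiers."""
    matches = []
    for row in release:
        key = tuple(row[q] for q in QUASI_IDENTIFIERS)
        for person in public:
            if tuple(person[q] for q in QUASI_IDENTIFIERS) == key:
                matches.append((person["name"], row["diagnosis"]))
    return matches

print(reidentify(anonymized_release, public_records))
# [('Alice Smith', 'epilepsy'), ('Bob Jones', 'none')]
```

The attack succeeds even though no direct identifier was ever shared, which is precisely why the cited study's authors argue that removing identifiers is insufficient for datasets with rich attribute sets.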

 

Conclusion and Future Suggestions

As brain data becomes more abundant, we must engage in an open dialogue about this critical issue before it is too late and find solutions to safeguard the rights of research participants. Presently, big data governance is primarily geared towards business endeavors and does not fully consider scientific research applications. Chile is the only country taking initial steps towards legislation that specifically regulates brain data privacy [13]. To ensure that informed consent standards are not violated and anonymized data remains truly anonymous, governments should follow Chile's lead and enact stronger provisions for brain data privacy. Stricter regulations and policies need to be put in place so that research participants are aware of the possibility that their data may be shared, and greater strides need to be taken to ensure that data can be made permanently unidentifiable.

As we strengthen our “21st century wheel”, it is vital that new regulations maintain subject privacy without impeding the scientific progress made possible by collaboration. As revolutionary as the wheel was, its partnership with the invention of wings allowed us to transcend to an entirely new frontier. Like the wheel, the field of neuroinformatics has the potential to widen the boundaries of neurological research, but it is crucial that we pair it with the wings of ethical practice so that it can truly soar.


References


  1. Bjaalie, Jan. (01/08/2008). Understanding the Brain through Neuroinformatics. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2570069/.  Retrieved: 07/09/2020.

  2. Bhagchandani, Ashish., Madhuri Chopade, and Dulari Bhatt. (14/03/2018). Various Big Data Techniques to Process and Analyze Neuroscience Data. https://www.researchgate.net/publication/328529465_Various_Big_Data_Techniques_to_Process_and_Analyze_Neuroscience_Data. Retrieved: 07/09/2020.

  3. Christen, Markus et al. (04/08/2016). On the Compatibility of Big Data Driven Research and Informed Consent: The Example of the Human Brain Project. The Ethics of Biomedical Big Data. https://link.springer.com/chapter/10.1007/978-3-319-33525-4_9. Retrieved: 07/09/2020.

  4. Choudhury, Superna et al. (16/05/2014). Big data, open science and the brain: Lessons learned from genomics. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4032989/. Retrieved: 07/09/2020.

  5. Shah, Parth et al. (01/06/2020). Informed Consent. https://www.ncbi.nlm.nih.gov/books/NBK430827/. Retrieved: 07/09/2020.

  6. Hallinan, Dara et al. (2014).  Neurodata and Neuroprivacy: Data Protection Outdated?. https://www.researchgate.net/publication/265048889_Neurodata_and_Neuroprivacy_Data_Protection_Outdated. Retrieved: 07/09/2020.

  7. Latini, Sara. (2018). To the edge of data protection. How brain information can push the boundaries of sensitivity. https://www.academia.edu/37661478/To_the_edge_of_data_protection._How_brain_information_can_push_the_boundaries_of_sensitivity.pdf. Retrieved: 07/09/2020.

  8. Mandell, Andrew et al. (2005). Are Your Thoughts Your Own?: “Neuroprivacy” and the Legal Implications of Brain Imaging. http://www.nycbar.org/pdf/report/Neuroprivacy-revisions.pdf. Retrieved: 07/09/2020.

  9. (26/07/2013). Summary of the HIPAA Privacy Rule. https://www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/index.html. Retrieved: 07/09/2020.

  10. (06/11/2015). Methods for De-identification of PHI. https://www.hhs.gov/hipaa/for-professionals/privacy/special-topics/de-identification/index.html. Retrieved: 07/09/2020.

  11. Rocher, Luc, Julien Hendrickx, and Yves Alexandre De Montjoye. (23/07/2019). Estimating the success of re-identifications in incomplete datasets using generative models. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6650473/. Retrieved: 07/09/2020.

  12. Confessore, Nicholas. (04/04/2018). Cambridge Analytica and Facebook: The Scandal and the Fallout So Far. https://www.nytimes.com/2018/04/04/us/politics/cambridge-analytica-scandal-fallout.html. Retrieved: 07/09/2020.

  13. (11/10/2019).  Computers Accessing Human Brain: Columbia Neuroscientist Calls for Regulation. https://globalcenters.columbia.edu/news/computers-accessing-human-brain-columbia-neuroscientist-calls-regulation. Retrieved: 07/09/2020.

  14. Stopczynski, Arkadiusz et al. (2014). Privacy for Personal Neuroinformatics. https://www.researchgate.net/publication/260754882_Privacy_for_Personal_Neuroinformatics. Retrieved: 07/09/2020.

  15. Karsten, Jack. (30/07/2018). How should the US legislate data privacy?. https://www.brookings.edu/blog/techtank/2018/07/30/how-should-the-us-legislate-data-privacy/. Retrieved: 07/09/2020.

  16. (23/07/2019). Anonymizing personal data 'not enough to protect privacy,' shows new study. https://www.sciencedaily.com/releases/2019/07/190723110523.htm.  Retrieved: 07/09/2020.

  17. Bushwick, Sophie. (23/07/2019). "Anonymous" Data Won't Protect Your Identity. https://www.scientificamerican.com/article/anonymous-data-wont-protect-your-identity/. Retrieved: 07/09/2020.

  18. Sanchez, David, Sergio Martinez, and Josep Domingo-Ferrer. How to Avoid Reidentification with Proper Anonymization. https://arxiv.org/pdf/1808.01113. Retrieved: 07/09/2020.

  19. Yuste, Rafael et al. Responsible use of data in neuroinformatics research. https://responsible-ai.org/brain-data-protection/. Retrieved: 30/08/2020.

Rohit Paradkar


Hi! I'm Rohit, a rising sophomore from Newton, Massachusetts. I love biology and computer programming and hope to go into the medical field in the future. In my free time, I play tennis, ski, and watch football.