Neuralink in the Media: Technological Limitations and Bioethical Concerns

Over the past decade, using media sources to present scientific advances and research has increased in popularity. A growing number of companies are eschewing typical presentation methods for more informal approaches, whether employees posting updates on Twitter or the company making YouTube videos, to name only a couple of examples. The freedom the internet gives people in relaying information is a net benefit but does diminish the vetting standards of the information the public receives. With healthcare products, easy access to potential customers can be tricky in two ways: (1) it allows the company to garner support for its idea or research project, financially or otherwise, without being held accountable, and (2) it causes people, in their excitement, to overlook the serious bioethical issues involved. In particular, these issues have been a concern in the research and development of Brain-Computer Interfaces (BCIs), which "…acquire brain signals, analyze them, and translate them into commands that are relayed to output devices that carry out desired actions". [1] While many companies have started to research and develop these technologies, none are more prevalent in the media than Elon Musk's Neuralink. [2]

In what follows, an examination of Neuralink's website and product demonstration will show a deep tension between the company's goals and reality. The discussion will touch on the concept of neuroscience theater and the ambiguity of answers from the Neuralink team in presentations. Next, shifting away from technological limitations, there will be a discussion of the bioethical implications of Neuralink. At present, Neuralink has generated excitement by showcasing, in many cases, technological chimeras, forgoing discussion of essential bioethical considerations. Consequently, the following sections will address the bioethical concerns of BCIs encroaching on patient privacy and autonomy.

Marketing and Presentation of Medical Technology

Neuralink's website is what one would expect from an emerging biotech company: flashy and exciting, yet leaving much to be desired in light of the promises the company makes. As an example, take the Science section of the website, under the header "Understanding the Brain." [3] Although compelling, it stops at "We Can Record Electrical Signals To The Brain," even though Neuralink's promises [4] require writing to the brain, an underdeveloped science at best. However, even reading signals from neural activity is more complicated than the website asserts. Work on bridging the gap between reading electrical signals and turning those signals into something usable has made strides in recent years but is nowhere close to reading real-time data from the complex activity of the brain. As it stands, reading is only possible after training models on all possible questions and answers. [5] Model training consists of participants repeating questions and answers ten times, then running that data through models that perform predictive analysis on the neuronal data to find the likely answer [6] (which reveals more about the shape of neuronal data than it explains about the underlying phenomena). While groundbreaking, current technology is far from being able to handle complex interactions with minimal computational power. Given that, reducing the reading of neural activity to "Recording from many neurons allows us to decode the information represented by those cells" [7] is misguided and reaffirms Neuralink's tendency to exaggerate what it can offer potential users in the foreseeable future. Neuralink does recognize the challenges it faces, engaging in relevant discussion of them on its website. [8] Yet there is minimal effort to cite pertinent studies or give realistic timelines for how it plans to overcome those barriers.
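
To make concrete what this kind of decoding actually involves, the sketch below trains an off-the-shelf classifier on synthetic "neural" feature vectors for a small, fixed set of answers and then predicts which known answer a new recording most resembles. The channel count, repetition count, and data are illustrative assumptions, not the pipeline of Moses et al. or anything Neuralink has published; the point is only that such systems choose among pre-trained answers rather than reading arbitrary thoughts.

```python
# Minimal sketch of closed-set decoding: a classifier is trained on repeated
# recordings of a small, fixed set of answers and asked to predict which
# answer a new recording corresponds to. All shapes and data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_answers = 4        # the decoder only knows these four possible answers
n_repetitions = 10   # participants repeat each answer ten times
n_channels = 128     # hypothetical number of recording electrodes

# Fake feature vectors: one per (answer, repetition) pair, with a small
# answer-specific offset buried in noise.
X = np.vstack([
    rng.normal(loc=answer * 0.5, scale=1.0, size=(n_repetitions, n_channels))
    for answer in range(n_answers)
])
y = np.repeat(np.arange(n_answers), n_repetitions)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

# The model performs predictive analysis over a closed answer set; it says
# nothing about how the brain produces speech, only which known answer a
# recording most resembles.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy over {n_answers} known answers:",
      clf.score(X_test, y_test))
```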

Consequently, Neuralink is participating in neuroscience theater. For our purposes, neuroscience theater means a company generating enthusiasm and excitement while ignoring its onus to report on the limitations or liabilities of its products. In this case, the theater refers to a fictional representation of the state of neuroscience research. While the term does not yet appear in the bioethics literature, there are ethical elements to the promotion of healthcare products. Biotech companies should be able to access the public, as long as "…the information is truthful, honest, non-discriminatory, and not misleading" [9], but the claims "…presented must not alter the reality and should not give false hopes to patients" [10]. As the unvetted promotion of health products grows, I believe terms like neuroscience theater will receive more treatment in the bioethics literature.

Whether on the website or in company presentations, Musk's claims are far from reality. [11] Antonio Regalado, in "Elon Musk's Neuralink is neuroscience theater" for MIT Technology Review, posits that the biotech company's claims remain far-fetched and unrealistic. One roadblock is reading real-time brain activity; another, as Regalado mentions, is dealing with the highly corrosive environment inside the brain. Though completed research shows encouraging results, "…challenges remain, such as an inadequate understanding of the brain and its mechanisms, the chronic separation of adhesive electrodes, a low SNR, and inflammation associated with invasive electrodes". [12] Regalado also discusses the "inadequate understanding of the brain and its mechanisms". [13] In response to Musk saying the chief challenge is adding electrodes to the brain, Regalado takes a more pragmatic approach, countering that "scientific knowledge about what electrochemical imbalance creates, say, depression in the first place" [14] is the major impediment. Regalado is right to assert that Neuralink is a lot further behind than it thinks. Nevertheless, the company continually neglects to mention the complexities and considerations related to its product.

The Neuralink Q&A streamed via YouTube on August 28, 2020 again tends toward excitement generation and recruitment. Elon Musk does, in all fairness, say that the sole goal of the presentation is to recruit talent for Neuralink. [15] However, considering the lack of engagement with known hurdles and the inability to offer definitive timelines, the demonstration seemed designed more to "…build the kind of fan base that has cheered on Musk's other ventures and has helped propel the gravity-defying stock price of electric-car maker Tesla". [16] Questions throughout the presentation about playing video games and summoning Teslas with the link reaffirmed the company's ulterior motives. [17] One concrete achievement mentioned in the presentation was the FDA's breakthrough device designation for Neuralink. Still, that does not tell future patients much. While breakthrough device designation is a complex process requiring significant work by Neuralink's team, it only involves reporting qualitative aspects of the technology. [18] In other words, Neuralink need not produce any results related to the safety or efficacy of the device. Nevertheless, Musk promises to exceed the safety standards of the FDA, drawing on the fact that Tesla only ships the safest cars to customers. [19] This is a rather unnerving analogy considering the recent safety vulnerabilities exposed in the newest Tesla models. [20][21]

Despite the safety standards that Musk wants to meet, anything wireless or connected to the web has vulnerabilities. In Tesla's case, the software was updated so that the vulnerability no longer existed. However, the stakes are higher when the device, ostensibly, has access to a patient's thoughts, memories, and, via the Neuralink, their phone. All of these critiques fall under the umbrella of privacy, a consideration left out of Neuralink's website and streamed presentation.

What Is Missing: Privacy and Autonomy

A single definition of privacy is difficult to come by, given its dependence on a variety of factors and personal preferences. [22] Most generally, though, in bioethics privacy "…pertains to the collection, storage, and use of personal information and addresses the question of who has access to personal information and under what conditions". [23] Neuralink's goals, as portrayed in the media, do not respect the intricacies of patient privacy. For the sake of clarity, the discussion will first focus on the storage and security of patient data, [24] and later on the question of which third parties have access to users' data via Neuralink, and under what conditions. Storage of data is an ever-growing issue for companies and customers alike. Regardless of how well data is protected, unauthorized access remains a concern. Between 2017 and 2019 there were 4,395 known breaches, which resulted in the exposure of over 800 million records. [25] Despite not being the most common target, healthcare systems suffered various attacks during this period. In 2017, for example, ransomware took over hospital filesystems, threatening to release patient data unless a fee was paid. [26] In 2018, Meditab's fax system was accessed through a vulnerability in a fax machine, exposing over 6 million records of unencrypted data. [27] Many healthcare facilities' IT infrastructures are particularly vulnerable due to "A combination of aging IT infrastructure and weak IT security practices". [28] Neuralink falls into neither of these categories. Yet, regardless of how good an organization's cybersecurity division is or how cutting-edge its technology, no information system is completely impervious to unauthorized access.

So as not to put patients in harm's way, Neuralink needs to offer an essentially unbreachable product. If not, patients' most valuable data (e.g., personal thoughts, mental health, memories) risks exploitation. Any breach of private health information can have damaging effects: "Breaches of privacy and confidentiality not only may affect a person's dignity but can cause harm. When personally identifiable health information, for example, is disclosed to an employer, insurer, or family member, it can result in stigma, embarrassment, and discrimination". [29] Beyond this, privacy is also important in interpersonal relationships. Any information patients want kept private from friends and family, for example, should remain so, allowing individuals to be the authority in deciding how they present themselves (see autonomy). [30]

Neuralink has also said very little in the media about exactly what it intends to do with the decoded neural data. Data, as mentioned, is valuable. For example, 23andMe, the DNA-testing giant, signed a deal worth over 300 million dollars with GlaxoSmithKline in mid-2018 that gave the drug maker access to user data. [31] Biotech companies selling anonymized data to drug companies is a common practice, used to decide which drugs to make and which populations to target in advertising. [32] However, dispersing user data also increases the likelihood of leaks, making it more susceptible to unauthorized access and ultimately undermining patient privacy.

In the standard physician-patient relationship, little private health information is granted to third parties or outside sources without informed consent. A physician sharing data outside that relationship would be a breach of confidentiality and would degrade trust within it. [33] However, companies can now require consumers' informed consent to use services and then turn around and legally sell the collected data. There is a concerning dynamic here: customers pay for a product that requires informed consent, services are rendered, and then the company sells loosely anonymized data. Under the guise of offering genetic insight, biotech companies obtain users' informed consent and profit from it. Neuralink could do the same with private information by capturing users' real-time reactions to products and then turning around and selling that data to companies, all under the cover of informed consent. Neuralink being able to sell patients' health information could have profound implications for the sanctity of the individual, shifting patients from ends to means: they act as marketing tools while their privacy is subverted.

Unquestionably, patients shifting from ends to means is troubling because it undermines a fundamental doctrine in healthcare ethics: respect for persons. [34] Although respect for persons, which holds that people must be treated as ends and not means, encompasses a much larger group of guidelines, [35] the focus here will be on patient autonomy, or "…self-rule that is free from both controlling interference by others and from limitations, such as inadequate understanding, that prevent meaningful choice". [36] Privacy informs the discussion of the threat of diminished autonomy via Neuralink. Despite wanting to exceed the FDA's safety standards with Neuralink, Musk's other ventures have been unable to offer an impenetrable product (see Montalbano, Lambert). Hackers have taken control of Teslas via Bluetooth and unveiled a host of other vulnerabilities. The ramifications of access to the Neuralink are more serious and open the door to diminished patient autonomy. Self-rule without interference from outsiders is essential to any medical treatment, but Neuralink, as it stands, cannot promise non-interference. Unauthorized access to the Neuralink would give attackers the ability to write to patients' brains, preventing meaningful choice and self-rule. Patients, in this case, would be experiencing "…compulsion and weakness of will…" [37], a noted indication of diminished autonomy. [38] Although Neuralink does promise to follow all current rules and regulations in development, that is not enough considering the capabilities of the product.

There is also a concern for autonomy if Neuralink decides to partner with third parties. Neuralink would have access to patients' real-time interactions with products and situations. Yet it is crucial to understand that "If a person's choices, decisions, beliefs, desires, etc. are due to such external influences as unreflected socialization, manipulation, coercion, etc., they are not autonomous". [39] Using people as marketing tools, not as ends, has already been touched on. But there are worries that Neuralink could also exert external pressure on people to interact with partner companies. For instance, if Neuralink partners with a bank that allows users to transfer or send money [40] via the link, users could be coerced into using that bank through promotional offers, free trials, or pervasive advertisements that target vulnerable demographics. This has implications for patients' informed consent, too. If Neuralink is willing to partner with any company in the future, then patients would not be able to fully understand the scope of the procedure, which would be a breach of informed consent. [41] Additionally, partnerships raise the question of what third-party software Neuralink will require users to engage with. If, for example, someone who has lost mobility in their limbs wanted the Neuralink to regain control but did not want third parties to have access to their data, would they be able to opt out? If not, Neuralink would be coercing vulnerable populations into sacrificing their privacy and autonomy to use its product. While Neuralink does need to address the prospect of diminished patient autonomy in its own right, it may be able to start this discussion by strengthening the privacy of the device.

In light of the persistent threats to personal information via unauthorized access and legal data acquisition, but also the public's tendency to be careless with its private health information, [42] the control view of privacy, on which privacy consists in individuals controlling who accesses their information, may need replacement. Insurers, healthcare providers, WebMD, Amazon, Facebook, and Google [43] all acquire data from web traffic, online health surveys, and purchase history, to name only a handful of a much larger group. The abundance of companies that collect data through legal routes alone makes the idea of complete protection of private data almost farcical.

Working from the non-control view, Neuralink could protect itself from privacy critiques through data obfuscation. Neuralink would be transmitting massive amounts of personal data over non-private or loosely anonymized communication signals. If transmitted data were fully obfuscated, the Neuralink would not be able to function, a noted downside of current anonymization methods. [44] However, newer methods can distort private information linked directly to patients while retaining the functionality-dependent data [45] needed for the Neuralink to work properly. Distorted data would be unintelligible not only to unauthorized parties but also to any third parties. Unidentifiable data protects the user in two ways: by preserving privacy over communication signals (Neuralink to phone, Neuralink to servers) and by eliminating the possibility of being exploited through authorized access; both would be a massive stride toward following the ethical and legal guidelines of respect for persons. The latter, however, may be a drawback for Neuralink, which would not be able to use personal data for analysis or further research, or to profit from selling user metadata. One way to address this would be to have a class of people who consent to be Neuralink research subjects, separate from standard users. The research class would have their Neuralink data gathered. Their information would be more susceptible to being leaked and accessed but would also help further health initiatives and research. Participants must be compensated for the program and receive transparent briefings on the risks to their data. This arrangement has bioethical implications of its own but would contribute valuable information to public research efforts that would ultimately benefit the patient and society at large.
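
As a rough illustration of the obfuscation idea, the sketch below pseudonymizes and coarsens identity-linked fields in a hypothetical telemetry record while passing the functionality-dependent field (the decoded command) through untouched. The record layout, field names, and noise scales are assumptions made for illustration; this is not Neuralink's data model or the PhyCloak technique itself.

```python
# Toy sketch of "distort identity-linked data, keep functional data".
# Field names and values are hypothetical.
import hashlib
import os
import random

def obfuscate_record(record: dict, secret_salt: bytes) -> dict:
    """Return a copy of a telemetry record with identity-linked fields
    pseudonymized or coarsened, while the decoded command (the data the
    device needs to function) is passed through untouched."""
    obfuscated = dict(record)

    # Replace the stable patient identifier with a salted hash so records
    # can still be grouped per device, but not linked back to a person
    # without the salt.
    obfuscated["patient_id"] = hashlib.sha256(
        secret_salt + record["patient_id"].encode()
    ).hexdigest()[:16]

    # Coarsen the timestamp (seconds -> hour buckets) and jitter the
    # location so behavioral patterns are harder to re-identify.
    obfuscated["timestamp"] = (record["timestamp"] // 3600) * 3600
    obfuscated["location"] = (
        round(record["location"][0] + random.uniform(-0.05, 0.05), 2),
        round(record["location"][1] + random.uniform(-0.05, 0.05), 2),
    )

    # Functionality-dependent data stays intact: without it the link
    # could not relay the user's intended action.
    obfuscated["decoded_command"] = record["decoded_command"]
    return obfuscated

if __name__ == "__main__":
    salt = os.urandom(16)
    raw = {
        "patient_id": "patient-0042",
        "timestamp": 1608222415,
        "location": (40.4406, -79.9959),
        "decoded_command": "cursor_move_left",
    }
    print(obfuscate_record(raw, salt))
```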

Regardless of the differing views of privacy and autonomy, both are salient concepts that permeate many other facets of life. [46] However, Neuralink has done little in the media to show potential patients that it respects privacy and autonomy. The company could develop better and more efficient obfuscation methods that would effectively protect patients' privacy and some aspects of their autonomy, but it has only said it plans to exceed current standards, with nothing about further research. Considering the security vulnerabilities of related ventures, this is cause for concern. Later in Neuralink's development, the lack of focus on patient privacy could hurt public receptiveness. The revolutionary technology will need public backing, which is generally given when privacy is protected. [47] Likewise, Neuralink needs to be transparent about how and under what conditions third parties will have access to patient health information and how common partnerships will be. To say nothing here would make it hard to find suitable research candidates [48] and would also open the door to diminished autonomy. Patients losing the ability to self-rule through the use of Neuralink, whether by serving as marketing tools, through unauthorized access, or through coercion, is a real prospect with the potential to cause harm if the product is not developed within a modern bioethical framework.

Neuralink should be concerned about its portrayal in the media, even at this early stage, and focus on building public trust through responsible handling of bioethical issues. Even if the website is meant to summarize compelling information and possibilities, and even if the presentation's sole purpose is to recruit talent, prospective patients and research subjects are watching, and they care about their private information and autonomy. Further, timelines and goals must be realistic. Neuralink has the potential to change lives for the better. Yet the neuroscience theater also has the potential to cause great harm.

Notes

1 Shih, Jerry J et al. "Brain-computer interfaces in medicine." Mayo Clinic proceedings vol. 87,3 (2012): 268-79. doi:10.1016/j.mayocp.2011.12.008

2 "Neuralink: Home." Accessed December 4, 2020. https://neuralink.com/.

3 Ibid., 'science'

4 Ibid., 'applications'

5 Moses, David A., Matthew K. Leonard, Joseph G. Makin, and Edward F. Chang. "Real-Time Decoding of Question-and-Answer Speech Dialogue Using Human Cortical Activity."Nature Communications 10, no. 1 (2019):3 https://doi.org/10.1038/s41467-019-10994-4

6 Ibid., 4

7 "Neuralink: Science." Accessed December 4, 2020. https://neuralink.com/science/

8 Ibid., 'approach'

9 Solomon, M., Radu, G., Hostiuc, M., Margan, M. M., Bulescu, I. A., & Purcarea, V. L. (2016). Ethical issues in advertising and promotion of medical units. Romanian journal of ophthalmology, 60(4), 216–218.

10 Ibid.

11 Ibid.

12 Kim, G. H., Kim, K., Lee, E., An, T., Choi, W., Lim, G., & Shin, J. H. (2018). Recent Progress on Microelectrodes in Neural Interfaces. Materials (Basel, Switzerland), 11(10), 1995: 15. https://doi.org/10.3390/ma11101995

13 Regalado, Antonio. "Elon Musk's Neuralink Is Neuroscience Theater." MIT Technology Review. MIT, August 30, 2020. https://www.technologyreview.com/2020/08/30/1007786/elon-musks-neuralink-demo-update-neuroscience-theater/.

14 Ibid.

15 Musk, Elon. "Neuralink Progress Update, Summer 2020." YouTube, 1:20-1:24. Posted by Neuralink, August 28, 2020. https://www.youtube.com/watch?v=DVvmgjBL74w.

16 Regalado, Antonio. "Elon Musk's Neuralink Is Neuroscience Theater." MIT Technology Review. MIT, August 30, 2020. https://www.technologyreview.com/2020/08/30/1007786/elon-musks-neuralink-demo-update-neuroscience-theater/.

17 Musk, Elon. "Neuralink Progress Update, Summer 2020." YouTube, 29:30. Posted by Neuralink, August 28, 2020. https://www.youtube.com/watch?v=DVvmgjBL74w.

18 Center for Devices and Radiological Health. "Breakthrough Devices Program." U.S. Food and Drug Administration. FDA. Accessed December 4, 2020. https://www.fda.gov/medical-devices/how-study-and-market-your-device/breakthrough-devices-program.

19 Musk, Elon. "Neuralink Progress Update, Summer 2020." YouTube, 21:54-22:15. Posted by Neuralink, August 28, 2020. https://www.youtube.com/watch?v=DVvmgjBL74w.

20 Montalbano, Elizabeth. "Tesla Hacked and Stolen Again Using Key Fob." Threatpost. Accessed December 4, 2020. https://threatpost.com/tesla-hacked-stolen-key-fob/161530/.

21 Lambert, Fred. 2020. "The Big Tesla Hack: A Hacker Gained Control over the Entire Fleet, but Fortunately He's a Good Guy." Electrek. August 27, 2020. https://electrek.co/2020/08/27/tesla-hack-control-over-entire-fleet/.

22 Nass, SJ, Laura Levit, and Lawrence Gostin. "Beyond the HIPAA Privacy Rule," 2009. 16. https://doi.org/10.17226/12458.

23 Ibid., 17

24 Coin, Allen, Megan Mulder, and Veljko Dubljević. 2020. "Ethical Aspects of BCI Technology: What Is the State of the Art?" Philosophies 5 (4): 31. https://doi.org/10.3390/philosophies5040031. Overview of important factors for consumers and patients

25 Clement, J. "U.S. Data Breaches and Exposed Records 2020." Statista, October 1, 2020. https://www.statista.com/statistics/273550/data-breaches-recorded-in-the-united-states-by-number-of-breaches-and-records-exposed/.

26 "The NHS Ransomware Attack & Data Privacy in the Era of Digital Health – Part One." The Medical Futurist, January 15, 2018. https://medicalfuturist.com/the-nhs-ransomware-attack-data-privacy-in-digital-health-part-one/.

27 Whittaker, Zack. "A Huge Trove of Medical Records and Prescriptions Found Exposed." TechCrunch. TechCrunch, March 17, 2019. https://techcrunch.com/2019/03/17/medical-health-data-leak/.

28 "The NHS Ransomware Attack & Data Privacy in the Era of Digital Health – Part One." The Medical Futurist, January 15, 2018.

29 Nass, SJ, Laura Levit, and Lawrence Gostin. "Beyond the HIPAA Privacy Rule," 2009. 77. https://doi.org/10.17226/12458.

30 Ibid., 78

31 Brodwin, Erin. "DNA-Testing Company 23andMe Has Signed a $300 Million Deal with a Drug Giant. Here's How to Delete Your Data If That Freaks You out." Business Insider. Business Insider, July 25, 2018. https://www.businessinsider.com/dna-testing-delete-your-data-23andme-ancestry-2018-7.

32 Ibid.

33 Nass, SJ, Laura Levit, and Lawrence Gostin. "Beyond the HIPAA Privacy Rule," 2009. 77-79. https://doi.org/10.17226/12458.

34 McCormick, Thomas. "Principles of Bioethics." UW Department of Bioethics & Humanities, 2013. https://depts.washington.edu/bhdept/ethics-medicine/bioethics-topics/articles/principles-bioethics

35 "Universal Declaration on Bioethics and Human Rights: UNESCO." 2019. Unesco.org. 2019. http://portal.unesco.org/en/ev.php

36 Varelius J. (2006). The value of autonomy in medical ethics. Medicine, health care, and philosophy, 9(3), 377–388. https://doi.org/10.1007/s11019-006-9000-z

37 Ibid., Notion of Autonomy

38 Ibid.

39 Ibid.

40 This relies on effectively discerning action thoughts from passive thoughts. Neuralink does want interaction between brain and application via the link, but that requires discriminating between a thought like "Transfer 100 dollars to savings" (action) and "I want to transfer 100 dollars to savings every month" (passive). Both could be interpreted as actionable, which could have negative ramifications. More work is needed here.

41 Shah P, Thornton I, Turrin D, et al. Informed Consent. [Updated 2020 Aug 22]. In: StatPearls [Internet]. Treasure Island (FL): StatPearls Publishing; 2020 Jan-. Available from: https://www.ncbi.nlm.nih.gov/books/NBK430827/

42 Kasperbauer TJ, "Protecting health privacy even when privacy is lost" Journal of Medical Ethics 2020: 768. http://orcid.org/0000-0003-0216-7632

43 Ibid., 769

44 Ibid., 768

45 Qiao, Y., O. Zhang, W. Zhou, K. Srinivasan and A. Arora. "PhyCloak: Obfuscating Sensing from Communication Signals." USENIX Annual Technical Conference (2016).

46 Nass, SJ, Laura Levit, and Lawrence Gostin. "Beyond the HIPAA Privacy Rule," 2009. 81-85. https://doi.org/10.17226/12458.

47 Ibid.

48 Ibid.

Written on December 17, 2020