The digital collection and use of body data has become a rapidly growing market, one expected to exceed $500 billion by 2030, according to a paper by Mozilla researcher Júlia Keserű. The associated risks have grown just as dramatically: Keserű reports that in the United States alone, health-related cyberattacks have increased by nearly 4,000 percent since 2010, and health data now fetches more on the darknet than credit card data. This development is the focus of Keserű’s current study, “From Skin to Screen: Bodily Integrity in the Digital Age.”
Keserű’s research shows how the massive collection of “body-related data” – from fingerprints to fitness trackers to digital health records – creates significant risks: data leaks, surveillance, discrimination, and exploitation by AI systems. Particularly problematic is the role of data brokers, who trade sensitive health and biometric data without users’ consent.
As a solution, she proposes the “Databody Integrity Framework” – a holistic approach that aims to protect digital information about the body and mind with the same human rights standards as physical existence. After all, people have an interest in protecting their most sensitive data. The Framework includes concrete recommendations for action at various levels:
- Redefining sensitive data in data protection laws
- Expanding health privacy laws to cover all health information
- User-friendly consent mechanisms
- Favoring platforms with strong privacy protections
In the interview, we talk to Júlia Keserű about the details of her research, the urgency of better protecting the “digital body,” and the goals of her framework.
Heise Online: What was the inspiration for your research work?
Júlia Keserű is a Senior Fellow at Mozilla and has been studying the role of bodily integrity in regulating the tech industry for more than a decade.
(Image: Keserű)
Júlia Keserű: I have been working at the intersection of technology, human rights, and social justice for 15 years. I saw how difficult it is to develop narratives about data protection that resonate with people who are not already concerned about these issues. For most people, data protection remains an abstraction; many don’t understand what it means for them or their loved ones.
The goal of my project is to create a compelling, data-driven narrative about why privacy matters. Initially, I wrote essays about what I called “the unwanted touch of the digital age,” trying to explain how intrusive technologies were being integrated into the mainstream without question, and what their long-term impact on our lives might be. Since the pandemic, I have also witnessed the rise of artificial intelligence and other emerging technologies, which inspired me to research and gather evidence on the growing collection of body-related data.
How do you think public awareness and understanding of these issues can be improved?
As I wrote these essays, I realized that these stories move people. I tried to explain that the harms associated with body data collection can affect anyone, including your daughter or someone you love. This is no longer a problem limited to vulnerable communities; it affects everyone. Parents are becoming increasingly concerned about the role of technology in their children’s lives, especially now that smartphones have become an everyday item. This message has therefore reached many people who might not have engaged with the topic before.
Ten years ago these concerns seemed like science fiction. However, in recent years I have noticed a change: the problems we face are no longer imaginary. There is clear evidence that body-related data collection is actually causing harm in various fields, forcing more people to think about the importance of data security.
What do you think about the European Health Data Space?
I find this deeply troubling and confusing, and it will probably take some time for us to get a clear picture of its far-reaching implications. The European Health Data Space (EHDS) aims to build on the General Data Protection Regulation (GDPR), but it seems to me that it contradicts the spirit of data protection regulation by mandating sweeping data collection rather than curtailing the information these systems collect.
The technology industry is growing aggressively and investing heavily in innovation, and politicians and policymakers are excited by the potential to provide better services to their constituents. However, the discussions often ignore shortcomings that go beyond privacy concerns. Users and patients are wondering why they should trust these systems, who will have access to their health data besides the doctors treating them, and what level of cybersecurity support the institutions deploying these technologies will receive. Although the research aspect is commendable, it is very unclear under what standards and consent mechanisms the system will be implemented. Transparency is needed to ensure that research is conducted ethically and with fully informed participation.
Innovation is great, but who gets to decide what price we pay? AI software is also developing rapidly, yet data security is often weak, and many people do not care about it; they say they have nothing to hide. But if your child has a pre-existing medical condition, that can raise insurance costs, and if this information is leaked, it can affect their entire life. The data will be there, and it will probably be easy to identify the person behind it.
We need to slow down a bit and regulate the industry. Otherwise, it will definitely cause great harm to our fundamental rights. But people do not understand the potential danger. When you’re trying to convince lobbyists and politicians who feel safe in their own world, it’s really difficult to get the message across. Mass data collection makes life even worse for people who are already struggling.
You propose a framework. Can you tell us more about it?
The framework is based on existing fundamental rights instruments and directives, such as the Charter of Fundamental Rights of the European Union, the International Covenant on Civil and Political Rights, and the Declaration of Helsinki. My aim was specifically to examine how we can build on these conventions to implement the right to bodily integrity in online spaces. A key element of the framework is the concept of “databody integrity,” which I developed specifically to describe the inviolability of an individual’s online personality and their right to control the data that reflects their unique physical and psychological characteristics.
It is important to recognize that we have not yet adapted human subjects research guidelines to the online environment. While sharing data is essential to scientific progress, rigorous consent processes are essential for any research project. This applies, for example, when mobile health companies share information about our fitness routines or mental health with research institutions. Our data is no longer just a collection of data points but an extension of our physical selves, and any loss of control over this data represents a loss of control over our actual lives. With this framework and its concrete recommendations for policymakers, civil society organizations, technology leaders, and individual users, I wanted to make these connections clear and show why bodily integrity must be taken more seriously in the digital context.
(Mac)