Alert: 23andMe bankruptcy and personal data

On 23 March 2025, popular genetic testing company 23andMe announced that it had filed for bankruptcy protection and that CEO and co-founder Anne Wojcicki had resigned with immediate effect. The company will be selling itself under court supervision. Its press release said that the company will continue to operate throughout the sale process and that there are no changes to the way it stores, manages, or protects customer data.

On 24 March 2025, the UK’s Information Commissioner’s Office (ICO) released a statement regarding the 23andMe data breach first reported in October 2023, stating that it had issued 23andMe with a notice of intent to fine the company £4.59 million and a preliminary enforcement notice. The statement acknowledged the recent bankruptcy filing and noted that the UK GDPR (the UK General Data Protection Regulation) continued to apply to 23andMe.

What happened in the data breach?

The 2023 data breach was the result of a cyberattack which affected an estimated 7 million users. Wired reported that data on over 1 million users was offered for sale on the hacker forum BreachForums, with much of it targeting people of Ashkenazi Jewish heritage.

The timing of the ICO’s statement is particularly interesting: following 23andMe’s bankruptcy protection announcement, there has been a significant spike in public interest in the personal data that consumer genetics companies (e.g. MyHeritage or Ancestry.com) might hold.

How does 23andMe work?

23andMe usually works by first sending the consumer a kit by post, which the consumer uses to collect a saliva sample and post back to the company. The company then extracts DNA from the saliva sample and analyses it. Customers can see information not just about their genealogy, but also about genetic predispositions to health-related concerns.

What is DNA data under GDPR?

An individual’s DNA analysis would be considered genetic data, which is one of the special categories of personal data under GDPR. Other examples of genetic data are chromosomal analysis and RNA data. There are also concerns that individuals can be very casual in giving away their biometric data, for example by using facial recognition or fingerprint recognition to log into apps and devices, or by uploading facial landmarks for AI image generation. As a result, biometric data concerns tend to be given a higher priority for enforcement by regulators.

Biometric data is a separate category from genetic data under GDPR but is also special category data. Common examples of biometric data are retina or iris analysis, fingerprint data, facial imaging data, and voice recognition data. Biometric data can even include behavioural analysis which can be linked back to an individual, for example via handwriting analysis, keystroke analysis, or eye tracking.

In a similar vein, health or medical data, which includes data like NHS numbers, medical history, or test results, is also considered special category data.

Special category data merits more protection under GDPR because use of this data is more likely to affect a person’s fundamental rights and freedoms, such as the right to bodily integrity and freedom from discrimination.

What will happen to the data?

There is currently much speculation about what might happen to 23andMe’s vast trove of genetic data and who will own it, as 23andMe has proposed a 14 May 2025 auction of its assets.

This is not the first time that there has been a data sale in circumstances like this. In 2000, online toy retailer Toysmart planned to sell its customer data to help pay off its debts, and the US Federal Trade Commission (FTC) intervened to block the sale. Whether a less interventionist FTC (which currently has its own issues, with two FTC Commissioners commencing proceedings against the Trump Administration over their purported removal) intervenes here remains to be seen.

There is a possibility, however, that other regulators may step in to block or limit a sale. Already, Attorneys General in California, Connecticut, New York, New Hampshire, Virginia, Minnesota, Maine, Oregon, Florida, the District of Columbia and Massachusetts have said they are monitoring the situation. It is likely that GDPR regulators will take an interest too.

For users who are worried about where this data will go, it might be a good idea to try to erase as much personal data as possible from 23andMe. In fact, two days prior to the bankruptcy announcement, the California Attorney General issued a consumer alert reminding Californians of their right to direct the deletion of their genetic data under the Genetic Information Privacy Act and the California Consumer Privacy Act.

Users in the UK and Europe have the right to have their personal data deleted “without undue delay” under GDPR. This is known as the right to erasure, or the right to be forgotten.

23andMe’s UK privacy page shows that it has an automated account deletion process: users can delete their account via the Account Settings page. There have been reports of technical issues with this process, although some users have said these issues have now been resolved. Once the request is confirmed, the personal data should no longer be used in research projects and any stored genetic samples should be discarded.

Under GDPR, users also have the right of access, i.e. the right to see what personal data an organisation holds about them. The right is not absolute, however, and can be subject to various carve-outs and exemptions. The main purpose of the right of access is to help individuals understand how and why an organisation is using their data, and to check that the organisation is using it lawfully.

This right is usually exercised by a Subject Access Request (SAR), although 23andMe allows users to access and download their personal data via their 23andMe account. Extra-cautious ex-users may want to make a SAR after they have deleted their account, just to check that all of their personal data has been deleted. In that case, without an account, the ex-user could make the request by emailing the organisation directly, specifying why they are making it.

What protects an individual’s genetic and biometric data?

The main legislation protecting an individual’s personal data is the GDPR, which applies across the EU. In addition, the UK has retained the GDPR in domestic law as the UK GDPR, which is complemented by the Data Protection Act 2018 (DPA 2018).

Data and cybersecurity are inextricably linked. Much of the EU’s new and upcoming technology and cybersecurity regulation also addresses the security of individuals’ personal data, particularly biometric data.

Under the EU AI Act, most biometric identification or biometric categorisation AI systems (e.g. using biometrics to analyse a person’s ethnicity) are categorised as either high risk or unacceptable risk. The outright prohibition on unacceptable-risk AI systems started applying from 2 February 2025. AI medical devices, or AI systems used in medical devices, are also categorised as high-risk. Recently, Nvidia announced new partnerships to use its AI technologies in genomic research, clinical trials, patient monitoring, and healthcare administration. Healthcare is becoming a prominent sector for AI, and it will involve significant amounts of special category data, as discussed above. For more information on the EU AI Act, please see: The EU Artificial Intelligence Act FAQs.

Popular wearable fitness devices may collect and process both biometric and health data. The EU Cyber Resilience Act addresses connected devices (commonly known as the Internet of Things), which include not just wearable tech, but also products such as smart fridges, app-connected security cameras, and smart home software. Through this Act, the European Commission wanted to address the vulnerabilities and security of these products and to help consumers better understand them. For more information on the EU Cyber Resilience Act, see: EU Cyber Resilience Act FAQs.

Consumers are increasingly interested in health and wellness (in December 2024 the FT predicted that the wellness industry will reach $7 trillion in 2025), with terms like “self-care” and “bio-hacking” increasingly trending. It is therefore worth giving additional consideration to the data about our health and bodies that we give away to various apps, devices, and services, and keeping in mind the right to be forgotten.

What should organisations using biometric or health data do?

Any organisation processing biometric or health data will need to develop a plan. It will want to consider:

  1. Doing a Data Protection Impact Assessment (DPIA) and/or an AI Impact Assessment. For many applications processing this sort of data this will be mandatory, but even where it is not, it is often a good idea. In some cases, the organisation will need to share the DPIA in advance with a regulator, so it should build enough time for this into the project plan or launch plan.
  2. Doing due diligence on any partners or providers. The 23andMe experience shows that companies in this space are vulnerable to failure, and start-ups in particular might not have the financial wherewithal to survive even one breach. A formal due diligence process, possibly including insurance against the company’s default, is likely to reduce risk.
  3. Looking at their transparency obligations. Transparency is a key feature of both GDPR and the EU AI Act, and most of the fines to date under GDPR have featured a lack of transparency. It is important to understand the technology the organisation will use, and to explain it in clear and simple terms to the people whose data it will be processing.
  4. Looking at the impact of data subject rights. Dealing with subject access requests and the right to be forgotten can be technically challenging if systems are not designed with these rights in mind (see the illustrative sketch after this list). Organisations need to have a proper process in place for opt-outs too.
  5. Looking at data security. This case shows that regulators are concerned about the security of this type of data. Civil actions have also followed, and the level of damages may be higher than for other breaches – whilst it is relatively easy to change a bank account, it is impossible to change one’s DNA! Even if an organisation buys a cyber insurance policy to cover the risk, a good insurer is likely to ask questions about the security measures the organisation is deploying.
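
To illustrate point 4, the sketch below shows one way a system could be designed so that a right-to-erasure request is propagated to every store holding a data subject’s records, with an audit trail that can later evidence compliance. This is a minimal, hypothetical example: the class and store names are invented for illustration and are not based on 23andMe’s (or any other company’s) actual systems.

```python
# Minimal, hypothetical sketch of a right-to-erasure workflow.
# All names (ErasureRequest, DataStore, ErasureService) are invented for
# illustration; this is not 23andMe's or any real provider's implementation.

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ErasureRequest:
    """A deletion request received from a data subject."""
    user_id: str
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class DataStore:
    """Toy stand-in for any system holding personal data keyed by user ID
    (profile database, research dataset, marketing list, and so on)."""

    def __init__(self, name: str):
        self.name = name
        self.records: dict[str, dict] = {}

    def delete_user(self, user_id: str) -> bool:
        """Remove the user's records; return True if anything was deleted."""
        return self.records.pop(user_id, None) is not None


class ErasureService:
    """Fans an erasure request out to every registered store and keeps an
    audit trail so the organisation can evidence compliance later."""

    def __init__(self, stores: list[DataStore]):
        self.stores = stores
        self.audit_log: list[dict] = []

    def handle(self, request: ErasureRequest) -> dict:
        # Delete from every registered store, recording what was actually removed.
        outcome = {store.name: store.delete_user(request.user_id) for store in self.stores}
        entry = {
            "user_id": request.user_id,
            "received_at": request.received_at.isoformat(),
            "stores": outcome,
        }
        self.audit_log.append(entry)
        return entry


if __name__ == "__main__":
    profiles = DataStore("profiles")
    research = DataStore("research_dataset")
    profiles.records["u123"] = {"name": "Example User"}
    research.records["u123"] = {"genotype_summary": "..."}

    service = ErasureService([profiles, research])
    print(service.handle(ErasureRequest(user_id="u123")))
    # -> {'user_id': 'u123', 'received_at': '...', 'stores': {'profiles': True, 'research_dataset': True}}
```

The design point worth noting is the single registry of stores: if every system that holds personal data has to be registered with the erasure service, deletion requests (and, by extension, subject access requests) are far less likely to miss a stray copy of the data.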

Further information:

The 23andMe press release can be found at: 23andMe Initiates Voluntary Chapter 11 Process to Maximize Stakeholder Value Through Court-Supervised Sale Process.

Wired’s report on the 2023 breach can be found at: 23andMe User Data Stolen in Targeted Attack on Ashkenazi Jews.

See reporting of the auction date at: Bloomberg Law: DNA Data of 15 Million People Up for Sale After 23andMe Demise.

The full alert from the California Attorney General: State of California Department of Justice: Attorney General Bonta Urgently Issues Consumer Alert for 23andMe Customers.

The 23andMe UK privacy page: 23andMe: Requesting 23andMe Account Closure.

Nvidia’s news regarding its partnerships: NVIDIA Newsroom: NVIDIA Partners With Industry Leaders to Advance Genomics, Drug Discovery and Healthcare.

Financial Times article on the wellness industry: FT: How wellness grew to become a multi-trillion-dollar market.

Jonathan Armstrong

Partner

Jonathan is an experienced lawyer based in London with a concentration on compliance & technology.  He is also a Professor at Fordham Law School teaching a new post-graduate course on international compliance.

Jonathan’s professional practice includes advising multinational companies on risk and compliance across Europe.  Jonathan gives legal and compliance advice to household name corporations on:

  • Prevention (e.g. putting in place policies and procedures);
  • Training (including state of the art video learning); and
  • Cure (such as internal investigations and dealing with regulatory authorities).

Jonathan has handled legal matters in more than 60 countries covering a wide range of compliance issues.  He made one of the first GDPR data breach reports on behalf of a lawyer who had compromised sensitive personal data and he has been particularly active in advising clients on their response to GDPR.  He has conducted a wide range of investigations of various shapes and sizes (some as a result of whistleblowers), worked on data breaches (including major ransomware attacks), a request to appear before a UK Parliamentary enquiry, UK Bribery Act 2010, slavery, ESG & supply chain issues, helped businesses move sales online or enter new markets and managed ethics & compliance code implementation.  Clients include Fortune 250 organisations & household names in manufacturing, technology, healthcare, luxury goods, automotive, construction & financial services.  Jonathan is also regarded as an acknowledged expert in AI and he currently serves on the New York State Bar Association’s AI Task Force looking at the impact of AI on law and regulation.  Jonathan also sits on the Law Society AI Group.

Jonathan is a co-author of LexisNexis’ definitive work on technology law, “Managing Risk: Technology & Communications”.  He is a frequent broadcaster for the BBC and appeared on BBC News 24 as the studio guest on the Walport Review.  He is also a regular contributor to the Everything Compliance & Life with GDPR podcasts.  In addition to being a lawyer, Jonathan is a Fellow of The Chartered Institute of Marketing.  He has spoken at conferences in the US, Japan, Canada, China, Brazil, Singapore, Vietnam, Mexico, the Middle East & across Europe.

Jonathan qualified as a lawyer in the UK in 1991 and has focused on technology and risk and governance matters for more than 25 years.  He is regarded as a leading expert in compliance matters.  Jonathan has been selected as one of the Thomson Reuters stand-out lawyers for 2024 – an honour bestowed on him every year since the survey began.  In April 2017 Thomson Reuters listed Jonathan as the 6th most influential figure in risk, compliance and fintech in the UK.  In 2016 Jonathan was ranked as the 14th most influential figure in data security worldwide by Onalytica.  In 2019 Jonathan was the recipient of a Security Serious Unsung Heroes Award for his work in Information Security.  Jonathan is listed as a Super Lawyer and has been listed in Legal Experts from 2002 to date. 

Jonathan is the former trustee of a children’s music charity and the longstanding Co-Chair of the New York State Bar Association’s Rapid Response Taskforce which has led the response to world events in a number of countries including Afghanistan, France, Pakistan, Poland & Ukraine.

Some of Jonathan’s recent projects (including projects he worked on prior to joining Punter Southall) are:

  • Helping a global healthcare organisation with its data strategy.  The work included data breach simulations and assessments for its global response team.
  • Helping a leading tech hardware, software and services business on its data protection strategy.
  • Leading an AI risk awareness session with one of the world’s largest tech businesses.
  • Looking at AI and connected vehicle related risk with a major vehicle manufacturer.
  • Helping a leading global fashion brand with compliance issues for their European operations.
  • Helping a global energy company on their compliance issues in Europe including dealing with a number of data security issues.
  • Working with one of the world’s largest chemical companies on their data protection program. The work involved managing a global program of audit, risk reduction and training to improve global privacy, data protection and data security compliance.
  • Advising a French multinational on the launch of a new technology offering in 37 countries and coordinating the local advice in each.
  • Advising a well-known retailer on product safety and reputation issues.
  • Advising an international energy company in implementing whistleblower helplines across Europe.
  • Advising a number of Fortune 100 corporations on strategies and programs to comply with the UK Bribery Act 2010.
  • Advising a financial services business on its cyber security strategy.  This included preparing a data breach plan and assistance in connection with a data breach response simulation.
  • Advising a U.S.-based engineering company on its entry into the United Kingdom, including compliance issues across the enterprise. Areas covered in our representation include structure, health and safety, employment, immigration and contract templates.
  • Assisting an industry body on submissions to the European Commission (the executive function of the EU) and UK government on next-generation technology laws. Jonathan’s submissions included detailed analysis of existing law and proposals on data privacy, cookies, behavioural advertising, information security, cloud computing, e-commerce, distance selling and social media.
  • Helping a leading pharmaceutical company formulate its social media strategy.
  • Served as counsel to a UK listed retailer and fashion group, in its acquisition of one of the world’s leading lingerie retailers.
  • Advising a leading U.S. retailer on its proposed entry into Europe, including advice on likely issues in eight countries.
  • Working with a leading UK retailer on its proposed expansion into the United States, including advice on online selling, advertising strategy and marketing.
  • Dealing with data export issues with respect to ediscovery in ongoing court and arbitration proceedings.
  • Advising a dual-listed entity on an FCPA investigation in Europe.
  • Acting for a U.S.-listed pharmaceutical company in connection with a fraud investigation of its Europe subsidiaries.
  • Acting for a well-known sporting-goods manufacturer on setting up its mobile commerce offerings in Europe.
  • Comprehensive data protection/privacy projects for a number of significant U.S. corporations, including advice on Safe Harbor, Privacy Shield and the Data Privacy Framework (DPF).
  • Risk analysis for an innovative software application.
  • Assisting a major U.S. corporation on its response to one of the first reported data breaches.
  • Work on the launch of an innovative new online game for an established board game manufacturer in more than 15 countries.
  • Advice on the setting up of Peoplesoft and other online HR programs in Europe, including data protection and Works Council issues.
  • Advising a leading fashion retailer in its blogging strategy.
  • Advising one of the world’s largest media companies on its data-retention strategy.
  • Advising a multinational software company on the marketing, development and positioning of its products in Europe.

Vivien Yanni Gan

Associate Solicitor

Vivien was admitted as a Solicitor of England and Wales in December 2022. Prior to commencing her training contract, she worked in regulatory policy at an investment management firm. She has experience assisting on the implementation of regulatory change projects and advising on risk and regulatory compliance issues in the financial services industry. Vivien also speaks Mandarin.

