Alert: The UK’s Crime and Policing Act 2026

On 29 April 2026, the Crime and Policing Bill 2025 received Royal Assent, becoming the Crime and Policing Act 2026. The Act bundles together a mash-up of different legislative changes – in this alert we’ll focus on its impact on companies using technology.

What is the Crime and Policing Act 2026 about?

Some of the key topics the Act seeks to address include, but are not limited to:

  1. Expanding the ability to prosecute organisations and senior managers;
  2. Strengthening tools for law enforcement to tackle economic crime;
  3. Tackling violence against women and girls;
  4. Protecting children; and
  5. Addressing knife crime.

How does the Crime and Policing Act 2026 affect internet usage and digital activities?

Some of the main changes stemming from this Act exist to tackle violence against women and girls, and to protect children and vulnerable adults.

This isn’t a new area of interest for UK legislation – the Online Safety Act 2023 (OSA 2023) also has extensive regulatory provisions designed to address some of the same harms. The first set of enforceable duties under OSA 2023 came into force on 17 March 2025.

The Crime and Policing Act 2026 makes additional changes including:

  1. Criminalising some types of pornography including content depicting illegal sexual conduct between family members and adults roleplaying as someone under 18;
  2. Criminalising the making, adapting, supplying, or offering to supply of what are commonly called ‘nudification’ tools – this is partly in response to the public outcry over functionality on the X/Grok platform;
  3. Allowing for the criminal prosecution of moderators and administrators of websites that host child sexual abuse material (known as CSAM);
  4. Extending existing criminal law to include material providing instructions on how to use AI to generate CSAM;
  5. Creating a new offence of adapting, possessing, supplying or offering a CSAM image generator, punishable by up to 5 years in prison. This could be an adaptation of a common platform e.g. a commonly used LLM;
  6. Creating new offences around screenshotting intimate images without consent;
  7. Allowing courts to make deletion orders for non-consensual intimate images;
  8. Putting more duties on online platforms to ensure non-consensual intimate images are taken down within 48 hours; and
  9. Imposing potential criminal liability for senior managers of online platforms who host CSAM, don’t act to take down content quickly or host illegal content about knives and offensive weapons including crossbows.

Who is in scope of the Crime and Policing Act?

The Act doesn’t just apply to social media companies. Prosecutors will look at what a company does rather than what it calls itself. Any business that hosts content or allows peer-to-peer (P2P) communication should look at its responsibilities. This could include organisations that allow customers or shareholders to chat with each other online. It might also include organisations that allow comments on videos or other content. AI chatbots are also in scope.

The new Act also gives the Secretary of State the ability to expand the scope of the Act to providers beyond those currently within the OSA 2023 regime.

Personal Liability

The UK, in common with a number of other jurisdictions, has been expanding the personal liability of directors and senior managers. We have seen this with changes to fraud legislation (see our alert: UK Home Office Guidance: Failure to Prevent Fraud).

We have also seen regulators focus on this trend, for example in the action against Carlos Abarca after the IT failures at TSB Bank plc. The new Act extends the liability of individuals for a wide range of offences – not only those introduced by the Act.

Who can be personally liable under the Crime and Policing Act 2026?

Personal liability in the Crime and Policing Act 2026 is not framed by a fixed list of named organisations. It is generally triggered by what the organisation does, not by sector labels.

That said, organisations that operate one or more of the following are likely to be especially at risk:

  • Social media services (user generated content or UGC platforms)
  • Video and livestreaming platforms
  • Forums / message boards and community platforms
  • Marketplaces and classifieds platforms
  • Search / content discovery services
  • Messaging / community servers where content is hosted and moderated
  • AI service providers.

Is the Crime and Policing Act likely to be enforced?

The experience since the passing of OSA 2023 suggests that this is an area of interest for regulators and that the new provisions are likely to be enforced. The UK communications regulator Ofcom recently announced a number of investigations, including into:

  1. Telegram and teen chat sites
  2. File-sharing service Yolobit which has now taken steps to make itself unavailable to people in the UK
  3. File-sharing service Pixeldrain
  4. Nippybox
  5. First Time Videos LLC
  6. Web Prime Inc
  7. Kick Online Entertainment S.A.
  8. 8AVS Group Ltd.

Given public concern and the statements made by politicians, particularly around harm to children, violence against women and girls, knife crime and antisemitic hate, enforcement and regulatory action seem very likely.

What steps can organisations take?

Organisations might want to review their operations in light of the new Act. The steps they might want to consider include:

  1. Making sure that you have an effective takedown process in place for any concerns about content you host. Takedown requests may arrive with relatively junior employees or in inboxes that are not continuously monitored, yet concerns will need to be escalated and actioned quickly: you’ll need to act within 48 hours. Make sure that your people know of the need to act quickly.
  2. Look at the extension of personal liability. This might mean discussions with your insurers about D&O insurance.
  3. Look at your due diligence processes. If you’re acquiring a company or its assets, you’ll need to check that you are not buying extra risk.
  4. Put in place technical and organisational measures to allow you to monitor what you host. This could include moderation systems and AI detection tools in addition to rapid response teams to act when you see concerns.
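To make the 48-hour takedown point in step 1 concrete, here is a minimal sketch of how a platform’s tooling might track report deadlines. The function and variable names are hypothetical, and the 48-hour window reflects the takedown duty for non-consensual intimate images described above; your actual obligations and timescales should be confirmed with legal advice.

```python
from datetime import datetime, timedelta, timezone

# Assumed statutory window for taking down reported content (see step 1 above).
TAKEDOWN_WINDOW = timedelta(hours=48)

def takedown_deadline(reported_at: datetime) -> datetime:
    """Latest time by which reported content should be removed."""
    return reported_at + TAKEDOWN_WINDOW

def is_overdue(reported_at: datetime, now: datetime) -> bool:
    """True once the 48-hour window has elapsed without removal."""
    return now > takedown_deadline(reported_at)

# Illustration: a report logged 50 hours ago has breached the window;
# one logged 47 hours ago has not (yet).
report_time = datetime(2026, 5, 1, 9, 0, tzinfo=timezone.utc)
print(is_overdue(report_time, report_time + timedelta(hours=50)))  # True
print(is_overdue(report_time, report_time + timedelta(hours=47)))  # False
```

A real system would feed this kind of check into alerting and escalation, so that unactioned reports are surfaced to a monitored queue well before the deadline rather than at it.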

For more information:

Our lawyers can advise on all areas of governance and compliance impacting your organisation.

For bespoke legal advice on all regulatory or compliance matters, speak with one of our lawyers. We are ready to assist you with your regulatory or compliance needs and work with you on achieving your objectives.

To learn more, visit Governance & Compliance Services or Contact Us to arrange a consultation with a compliance lawyer.
