Imagine a place where the usual rules are put on pause, just for a while, so you can see what happens when you try something new. That's what a regulatory sandbox is. The UK government has just announced its own version for AI, called AI Growth Labs: a supervised space for testing new AI products without all the usual red tape. For now, the focus is on healthcare, professional services, transport, and robotics.

So what's actually changing? For the first time, companies will be able to pilot their AI products in the real world with specific rules temporarily relaxed, but only under time-limited licences and close supervision. Expert overseers will make sure nothing goes off the rails, and safeguards are built in: if a pilot goes wrong or crosses a line, it is halted immediately and fines can follow.

But not everything is up for grabs. Rules on safety, fundamental rights, workers' protections, and intellectual property stay firmly in place. The government is also asking the public how the scheme should be run: managed centrally by government, or handed to the existing regulators?

Take housing, for example. Right now, getting a new development approved can mean wading through 4,000 pages of paperwork and waiting around eighteen months. The hope is that AI-assisted processing could cut that time sharply and get more homes built. There is also a new fund to help the medicines regulator, the MHRA, trial AI tools for drug discovery and clinical trials, with humans still making the final calls.

Why does this matter? Because it's a chance to see whether we can make smarter rules, not just more rules, and other countries are watching to see if it works. Right now only about one in five UK companies uses AI, yet experts estimate that wider adoption could boost the country's productivity by as much as £140 billion a year.

If you're building AI in one of these sectors, now is your chance to have a say in how the rules are made. The UK is taking a different path from the EU, whose AI Act regulates systems up front according to their risk level; here, the emphasis is on supervised real-world testing. The USA, Japan, and Singapore are experimenting with sandboxes of their own, but the UK got there first, launching a fintech regulatory sandbox back in 2016.

For policymakers, this is a way to learn which rules can bend and which are set in stone. Earlier sandboxes have already shown the model can work: one helped a company improve age checks, another helped develop online mental health services. The big question now is who should be in charge, central government or the regulators, and it's a debate that matters well beyond the UK.

If you work in compliance, don't worry: this isn't a free-for-all. Pilots need licences, expert supervisors keep watch, and real penalties apply if things go wrong. Knowing which rules can be flexed, and which can't, will help you decide whether joining the sandbox makes sense, and the list of exclusions shows exactly where the lines are drawn.

Read the UK government announcement