Blog post
Written: February 27, 2026
Author: George Windsor

Innovating Under Constraint


How regulation is quietly shaping the North East’s AI future

The North East is increasingly recognised as a centre of AI and data-driven innovation, with companies building tools for digital manufacturing, healthcare, predictive analytics and beyond. But while the region’s momentum is real, it is also fragile - because the conditions under which local firms are innovating are profoundly shaped by national regulatory systems that do not always fit regional realities. Regulation matters not only because it sets limits, but because it shapes the speed, confidence and direction of innovation itself.

This is especially true in the North East, where a large share of AI activity is driven by SMEs. These firms are inventive and ambitious, but they operate with fewer people, less legal capacity and more distance from regulators than their counterparts in London, Cambridge or the South East. When national frameworks assume that every organisation can interpret complex rules with ease, the cost of misunderstanding is disproportionately borne by smaller firms in regions with thinner institutional support.

What AI actually means for North East SMEs

Although most people associate AI with tools like ChatGPT, AI isn’t just one thing. It’s more like a puzzle made of many pieces that help computers perform tasks we usually rely on people for. These pieces can include machine learning (tools that learn from data), natural language processing (systems that understand or generate language), computer vision and deep learning (software that recognises images or patterns), generative AI (tools that create text, code or visuals) and much more.

Most SMEs don’t build these technologies from scratch. Instead, they combine and adapt existing AI tools to solve real problems in areas like logistics, manufacturing, healthcare, and customer service.

Understanding AI as a modular puzzle clarifies the practical reality in the North East, where innovation is less about inventing new algorithms and more about putting the right pieces together to make products smarter, faster and more reliable.

When equal rules meet unequal capacity

The UK’s data protection regime - UK GDPR, the Data Protection Act and associated standards - applies uniformly to all organisations handling personal data. That universality is a strength: it creates trust, protects people and prevents misuse. But uniformity becomes a strain when the systems for interpretation and support are not themselves universal.

A large financial services firm can dedicate a team to data governance, risk documentation and training. A three-person start-up in Sunderland must meet the same obligations without the same infrastructure. Even firms with strong technical skills can find themselves struggling not with data science, but with understanding what constitutes a valid data protection impact assessment (DPIA), how to document processing consistently, or when a transparency obligation applies to a model under development.

The tension is not about unwillingness to comply. It is about bandwidth. It is about navigating a regulatory world designed around organisations with very different capacities and contexts. And it is about doing so from a region where access to specialised guidance is limited and where the informal networks that help interpret grey areas are thinner.

AI governance as an exercise in interpretation

Unlike data protection law, the UK has no single AI statute. Instead, AI governance emerges from a mixture of sector regulators, equality law, consumer protections and advisory principles such as fairness, transparency and accountability. This gives flexibility - but flexibility transfers responsibility for interpretation downward.

For SMEs, this often means making high-stakes decisions about fairness assessments, transparency statements and model oversight without access to the expertise these decisions normally rely on. A recruitment company adopting an off-the-shelf screening tool may not have the capacity to test for bias or examine how the model performs across demographic groups. Yet the responsibility to ensure fairness still sits with them.

The challenge is not only technical. It is conceptual. It involves interpreting broad principles in specific, resource-constrained settings. In regions where regulators and advisory bodies are physically distant, and where specialist lawyers or ethicists are not readily accessible, interpretation becomes guesswork, and guesswork creates risk.

A region disadvantaged by distance and by institutional thinness

Regulation in the UK is highly centralised. Policy pilots, consultations, sandboxes and industry/regulator networks tend to cluster around London. Companies in the North East rarely have opportunities to test ideas with regulators early, to receive informal steers, or to participate in the conversations that shape how rules evolve in practice. They receive the obligations, but not the interpretative ecosystem that surrounds them elsewhere.

At the same time, the region’s institutional landscape remains thin. Across local authorities, NHS trusts, universities and public-sector organisations, governance expectations vary considerably. SMEs often face different documentation requirements from each partner they work with, even for similar types of data or models. These inconsistencies multiply administrative burden and slow down collaboration. What should be a regional strength - a tightly connected public-sector environment - becomes a source of friction.

This combination of centralised regulation and decentralised, inconsistent local governance leaves SMEs exposed. It is not simply a compliance problem; it is a structural one. It affects how quickly firms can bring innovations to market, how confidently they can experiment and how readily they can work with public-sector partners that could benefit from their technologies.

Building foundations for a more confident, regionally grounded AI economy

The North East’s innovation potential is not in question. Its challenge lies in the infrastructure around that innovation: the interpretative, advisory and compliance support that allows SMEs to take responsible risks at speed. Strengthening this infrastructure does not mean weakening regulation. It means making regulation work as intended - protecting people while enabling progress - in a regional context that currently lacks the mechanisms to translate national rules into local practice.

A regional AI and data sandbox could play a vital role here. By giving North East SMEs the opportunity to test products alongside regulators and academic experts, it would turn uncertainty into guided exploration. Ambiguities around fairness, lawful processing or automated decision-making could be resolved during development, not after deployment. Such a mechanism would support not only compliance, but safer, faster innovation.

A permanent, regionally anchored compliance hub would complement the sandbox by helping firms embed good governance into everyday operations. Shared templates, routine training, accessible advisory services and a common language across public organisations would remove much of the duplication and inconsistency that currently slows collaboration.

These interventions are not about creating new bureaucracy. They are about reducing the hidden bureaucracy that arises when firms have to interpret complex rules alone. They are about grounding innovation in a regionally appropriate support system that acknowledges how SMEs operate.

What next? The North East AI Growth Zone

So where does this leave the North East? In a surprisingly strong position - if the region moves quickly to turn regulatory friction into a competitive edge. The newly announced AI Growth Zone is an opportunity to do exactly that. It gives the region political momentum, investment focus and national visibility. What it needs now is the practical infrastructure that lets SMEs make use of it.

That means building the things that don’t yet exist but would make an immediate difference: shared templates, a place to sense-check tricky compliance questions, and somewhere to test early-stage ideas before they hit real users. None of this is abstract. These are straightforward interventions that could sit inside existing institutions and start delivering value fast.

A lightweight regional compliance support service - something as simple as a rotating clinic hosted by local universities or innovation organisations - would help SMEs get unstuck on the issues that currently slow them down: fairness assessments, DPIAs, vendor checks, documentation. A small amount of hands-on guidance can prevent a huge amount of wheel-spinning.

Alongside this, the region could pilot a practical, narrowly scoped AI sandbox focused on one or two high-impact sectors - health, manufacturing or local government services. This doesn’t require a new building or a large programme. It requires a structured space for SMEs to test ideas with expert eyes on them, and a way for regulators to feed back quickly. Done well, it becomes a signature feature of the Growth Zone: a place where you can experiment safely and get to deployment sooner.

Finally, this opens up a clear direction for future research: understanding how these targeted interventions affect SME productivity, investment readiness and the pace of public-sector adoption. Rather than studying AI governance in the abstract, the North East can become a live case study of how a region builds the practical plumbing for responsible innovation.

If the Growth Zone helps the North East become the easiest place in the UK for SMEs to build compliant, trustworthy AI - because the region has invested in the everyday infrastructure that makes doing the right thing simple - then the regulatory challenges outlined earlier become an advantage, not a drag. And that is a future worth building toward.

If you would like to speak to us about anything in the article please contact: georgia.donta@sunderlandsoftwarecity.com

Written by: Georgia Donta