
Implementing AI at work — your legal obligations

Authors: Laura Keane, Eleanore Beard, Sara Ludlam


Data Protection Day has always been about raising awareness of privacy rights and compliance obligations. However, the arrival of AI has introduced a level of complexity that most organisations are still grappling with, from hyper-realistic phishing to deepfake fraud and data collection at an unprecedented scale. Regulation is struggling to keep pace and the risks are evolving faster than traditional compliance frameworks can handle.

That’s why our data protection team held a webinar to provide attendees with a clear understanding of the current regulatory landscape, insights into specific AI risks (and how to address them), compliance strategies that work in practice and how to align cybersecurity with data protection.

This really is a must-watch for any organisation using AI — particularly generative AI and large language models (LLMs). You can view a replay of the webinar to get our in-depth guidance and we’ve pulled out some key takeaways below.


Implementing AI: the risks for businesses & sensible safeguards

Excitement around the efficiencies that AI can offer often means that the data protection consequences of implementing such technologies are overlooked. There are also the issues of confidentiality and intellectual property (IP) protection to consider. 

Those who purchase AI systems for a business often fail to consult their legal and risk compliance teams before doing so. This can result in you signing up to legal agreements that give away your ‘crown jewels’ without realising. It may also increase your risk of breaching legal obligations under the UK GDPR, potentially exposing your business to significant fines.

If you wouldn’t invite your competitors into your business and allow them to access your confidential data, you shouldn’t implement an AI system without appropriate safeguards in place. In risk terms, the two are equivalent.

Legal problems will arise when you give an AI system unrestricted access to crawl your data. Sensible and practical safeguards therefore include restricting the system’s access to important and confidential information, and checking the terms on which the AI is supplied to find out who else will be able to access the material that’s shared with it.

Ultimately, as AI systems become more advanced and handle increasing volumes of personal data, the risk of breaches will rise. New questions will also need to be navigated around fairness, transparency and explainability — so building in legal compliance from the outset (and re-evaluating your position regularly) is critical.

 

AI & your legal obligations

Under the UK GDPR principles, if the information you share with an AI system includes personal data, you must fulfil your legal obligations in respect of that data.

These include ensuring that you have carried out a:

 

Data protection impact assessment (DPIA)

Ideally, your business will have carried out a data protection impact assessment (DPIA) before implementing AI software that’ll process personal data for which you’re responsible.

Since you’re responsible for that personal data, breaches of the UK GDPR may result in you being fined (alongside the associated reputational damage), not the AI software owner.

To carry out an effective DPIA, you’ll need an up-to-date record of the processing activities that the business carries out on personal data using the AI system. This is referred to by data protection geeks (like ourselves) as a ROPA.

 

Talk to us

If you need guidance or help with carrying out a DPIA or ROPA — or if you’d simply like to chat about whether your business is at risk and how to practice effective data protection — give us a call on 0333 004 4488, send us an email at hello@brabners.com or message us.

Laura Keane

Laura is a Solicitor in our commercial and intellectual property team.

Sara Ludlam

Sara is a Partner and Chartered Trade Mark Attorney in our commercial and intellectual property (IP) team.

Eleanore Beard

Eleanore is a Legal Director and Data Protection Practitioner in our commercial team.
