8th February, 2023

Robot v Lawyer

Artificial intelligence (AI) has been the fearsome villain in many a sci-fi film. It is often portrayed as the biggest threat to mankind, able to reform itself after every attack. But who would have thought that the new AI frontier could be the old dusty corridors of the Employment Tribunal?

There has been at least one incident, of which I am aware, of a claimant’s witness statement being written by AI in the ET. Not the most shocking figure, but it led me to consider whether AI could be part of the future.

My initial view was that it was a worrying thought: how can a computer replace the careful analysis and evidence-proofing of those in the legal profession? Surely it would lead to an odd mismatch of themes that do not connect, such as the recent attempts to use AI to draft film scripts. I soon discovered that it was not that straightforward.

There are various online AI chatbots which will write a statement on behalf of the user after asking a series of questions. One of them, Talk to Spot (https://app.talktospot.com), is marketed as a way of documenting events in the workplace so that they can be sent to the employer. The user can do this anonymously, and the completed report is then emailed to a mailbox of their choice. There is no reason, it seems to me, why a litigant in person could not start to use these tools in the ET.

What are some of the pitfalls or benefits of using technology in this way?

Firstly, if the witness has used a directed conversation with a chatbot, would they be intimately familiar with the contents of, and the implications within, the statement? If not, it stands to reason that effective cross-examination could make the witness appear unreliable.

Witness statements in Tribunal often need to refer to specific things such as: what was the effect of the unwanted conduct? Would the AI chatbot ask directed questions based on its understanding of the law? Who has provided the AI with the legal bases which need to be covered and how do we know they are correct? Are they up to date? Any flaws in the system are likely to infect the end product.

In relation to a witness describing unwanted conduct, for example, would the chatbot suggest terms to include - such as harassment? If a witness is not in receipt of legal advice, they may provide a factual account to the chatbot, for it to then suggest legal labels. Is this chatbot legal advice? If so, who would be responsible for any errors which result in poor outcomes for the individual?

We often see diary entries in the Tribunal. These can be handwritten pages allegedly written at the time of an event, or shortly thereafter. They can be produced in an attempt to corroborate an account, or simply to show that an action or omission had a particular impact at a certain point in time. The chatbot could be used as a form of diary entry. There seems to be no reason why a Tribunal could not view a chatbot report in the same way as a diary.

Any interviewer brings their own viewpoints to the interview. Even if we try not to, what we hear is filtered through the lens of our life experiences. AI is unlikely to have the same preconceptions, so it stands to reason that any statement produced may be purer than one taken by a person.

Discrimination and whistleblowing detriments can have a damaging effect on a person’s confidence. People often feel humiliated by their treatment, particularly in the context of sexual harassment. Having to face someone to give their account may enhance those feelings of shame which can discourage people from coming forward. Even when they do, their account is potentially going to be impacted by their embarrassment. AI is faceless. Some witnesses may therefore feel more comfortable giving their account to a computer screen.

Finally - and most importantly for lawyers - research has shown that evidence obtained by AI was more likely to be accurate than that obtained by a person interviewing the witness (Protecting victim and witness statement: examining the effectiveness of a chatbot that uses artificial intelligence and a cognitive interview, Rashid Minhas, Camilla Elphick, Julia Shaw, AI & Society, Issue 1/2022).

It will be interesting to see how the Tribunal will grapple with this constantly evolving area.

Amy Smith
9 St John Street Chambers


