Monday, August 11, 2025
By NEIL HARTNELL
Tribune Business Editor
nhartnell@tribunemedia.net
The Supreme Court has reported an attorney to the Bahamas Bar Council’s ethics committee after she submitted three “fake cases” generated by artificial intelligence (AI) to back her arguments.
Justice Denise Lewis-Johnson, in an August 1, 2025, verdict, said placing the “fictitious cases” before her “may very well amount to contempt of court”, and branded the predicament facing Darell Taylor “a warning” to the entire Bahamian legal profession over “how very dangerous” AI can be if not properly used.
Ms Taylor, who is a former prosecutor with the Department of Public Prosecutions, according to Tribune Business research, admitted in an April 28, 2025, e-mail to Justice Lewis-Johnson that ChatGPT had been used to locate the three fake cases and rulings, and that the links to them could no longer be found.
She asserted that there was “no restriction” on Bahamian attorneys using AI or ChatGPT, and doubled down by going “back to the poison well” the very same day to argue that one of the three disputed cases - Kelly versus Rolle - was heard by the Court of Appeal on February 17, 2016.
However, Justice Lewis-Johnson said it was impossible for one of the three Court of Appeal judges said to have heard that matter to have done so because he had retired almost three months previously. And the case number cited by Ms Taylor was assigned to another matter, Valentino Yustare versus the Crown.
Rejecting Ms Taylor’s argument that the three “fake cases” were not vital to her case, the judge said she had “wholly failed” in her duty as an attorney to ensure that all material placed before the Supreme Court is authentic and truthful. She added that the consequences of failing to verify the legitimacy of the AI-generated cases threatened to bring the Bahamian judicial system’s integrity into “disrepute”.
Justice Lewis-Johnson said the matter “highlights the increasing use of AI” by the Bahamian legal profession, and how its misuse, and the failure to check and verify the data it produces, can cause problems for the administration of justice. So-called ‘hallucinations’, produced when an AI tool fabricates information sources, have already undermined court cases in the US and Trinidad & Tobago.
What is arguably the first AI ‘hallucination’ in the Bahamian legal system occurred after Ms Taylor, acting for her client, Robert Phelps Herman, who is ensnared in a legal battle with his late wife’s children over the probate of her estate, brought an application seeking to have the children’s claim dismissed over “procedural irregularities”.
Justice Lewis-Johnson, in her verdict, said it was discovered that three of the five cases upon which Ms Taylor relied to support the strike-out bid were non-existent. “Plainly stated, they were ‘fake cases’,” the judge wrote.
Gail Lockhart-Charles KC, representing Mr Herman’s opponents, found that the three cases in question either did not exist or were “fabricated”. She added that the first, Ladmat Ltd versus Backo, which was said to have been delivered in 2002 by the UK-based Privy Council, was fictitious and its “citation” led to an unrelated decision on shareholder disputes and fraudulent misrepresentation.
Describing it as “a fabricated authority” that is “entirely untrue”, Mrs Lockhart-Charles said the second case raised by Ms Taylor - Petrie versus Dowling - originated from Australia and related to the recovery of damages for nervous shock.
“However, the defendant’s counsel falsely asserted that this case supports the proposition that participation in a proceeding does not cure a void act or confer jurisdiction where none exists - a blatant misrepresentation,” Mrs Lockhart-Charles wrote. She added that there was no record of the third case, Kelly versus Rolle, in the Bahamian court system, and the citation led to an unrelated decision.
Justice Lewis-Johnson said the Supreme Court conducted its own investigation to confirm Mrs Lockhart-Charles’ findings. Ms Taylor, after being asked to provide copies of the three cases, admitted in an April 28, 2025, e-mail that they were located by ChatGPT and “that she could no longer find the cases as the links were not available”.
She said her assistant had conducted the research and located the three cases, and admitted they had “yet to be verified” beyond the use of AI, but asserted: “They were only a part of my speaking points.” Ms Taylor, in responding to Supreme Court questions, said there were “no restrictions” on AI use by the legal profession and “unapologetically” said she would defend herself before the Bar ethics committee.
“The court accepts that there is no legal or ethical restriction on the use of ChatGPT or any other artificial intelligence tool by attorneys,” Justice Lewis-Johnson wrote. “However, counsel is duty-bound to verify that the case law cited and relied upon to the court is accurate, real and that they exist.
“Counsel is duty-bound to verify the authenticity of that case, that citations are correct, that quotes used are factual and truthful, that the court is in no way misled. In this case, counsel [Ms Taylor] wholly failed on all accounts. The advancement of fictitious cases to the court may very well amount to contempt of court.
“While the court appreciates that workloads may impact time allotted for research, and thus modern research engines such as ChatGPT can assist counsel, they must be used so as not to mislead the court by the advancement of AI hallucinations,” Justice Lewis-Johnson added.
“As helpful a legal research tool it may be, this case shows how very dangerous it could be if not properly used. I am hopeful that this case will serve as a warning and guide to others, and such instances will be reduced. Attorneys remain bound by their oath and are governed by their code of ethics.”
AI ‘hallucinations’ occur when a tool such as ChatGPT, which relies on data patterns to generate its conclusions and predictions, produces plausible-sounding but false information - a risk heightened when the data it analyses is incomplete, biased or otherwise flawed. Similar errors have already occurred in the US and Trinidad & Tobago legal systems.
However, Justice Lewis-Johnson said the proactive response of the guilty attorneys in both those cases, who pledged to address the deficiencies, was not matched by Ms Taylor, who “continued to assert that she was free to use it and there was no legal or ethical restriction on its use”. She added that, had Mrs Lockhart-Charles not asked to respond, she could have relied on the fake cases in making her ruling.
“The implications are severe and serious when the court cannot accept counsel’s assertion to be truthful and cases to be real,” Justice Lewis-Johnson wrote. “The risk of harm to the integrity of the judicial process is real and could bring the system into disrepute.”
Describing Ms Taylor’s position that she had done nothing wrong as “unfortunate”, and noting she only apologised in her written submissions on “the non-existent cases”, Justice Lewis-Johnson said her efforts to distinguish between oral and written submissions were “without merit”, as was the argument that they were not central to her case.
“Counsel’s first duty is to the court, never to mislead it and always act with honesty and integrity,” Justice Lewis-Johnson asserted. “This case highlights the increasing use of AI in the practice of law. As with any research tool and modern technology, the individual using it has an obligation to do so responsibly, to be diligent and verify the accuracy of the information being relied on.
“Regrettably, this was not done here and the court is now forced to have this matter referred to the ethics committee of the Bahamas Bar Association for hearing and determination based on the Code of Ethics.”