Doctored audio evidence used to damn father in custody battle
Doctored audio files and footage – known as ‘deepfakes’ – are being submitted to UK courts as evidence, a family lawyer has warned.
‘Deepfake’ audio was used in a custody battle to try to portray a father as threatening, a family lawyer has revealed, as he warned that doctored evidence is being submitted to courts.
Byron James, a family lawyer and partner at the international law firm Expatriate Law, said that voice-forging software was used to create a fake recording of his client threatening another party to a dispute over their children.
In what is believed to be the first reported case of its kind in UK courts, Mr James said: “It is now possible, with sufficient content, to create an audio or video file of anyone saying anything.”
Deepfakes use machine learning and artificial intelligence to create highly sophisticated and plausible fake footage. The technology is available to anyone, with step-by-step guides on the internet.
Speaking to The Telegraph, Mr James said that while PDF and paper documents are relatively “easy” to manipulate, he warned fellow legal professionals that more sophisticated digital material is also being tampered with and submitted to courts as evidence.
The English court system requires parties to submit their evidence prior to hearings. However, the family courts in particular – which are notoriously secretive – are under considerable time pressure to get through hearings as quickly as possible.
As a result, Mr James, who is based in Dubai, said that “the courts take evidence such as audio recordings, visual footage and written documents at face value”.
“A lot of judges are in their 50s and 60s and are not particularly tech-savvy. Unless you’re aware of the possibility of something being fake, it’s difficult to know.”
Mr James’s client, the father in a custody battle, was accused of “threatening” the children’s mother over the phone.
“He was adamant that he hadn’t said it, and couldn’t explain how they had a recording… it was disclosed before the hearing and introduced as evidence and he was shocked.
“So we started looking into an explanation and luckily we were able to get the original file, got it exported, looked at metadata and saw it had been manipulated. The judge was really shocked. It would have never occurred to him to look into that.”
“Up to that point they thought they had a slam dunk case,” he added. “If this parent is so malevolent that they faked evidence, should they be able to have custody of the children?”
Mr James said this was the first case of ‘deepfaking’ he had encountered in around 30 years of legal practice.
The case was heard last year in the family courts, where hearings are always held in private and are not reported.
“This is always a difficult position to be in as a lawyer, where you put corroborating contrary evidence to your client and ask them if they would like to comment. My client remained, however, adamant that it was not him despite him agreeing it sounded precisely like him, using words he might otherwise use, with his intonations and accent unmistakably him. Was my client simply lying?”
“In the end, perhaps for one of the first times in the family court, we managed to prove that the audio file had not been a faithful recording of a conversation between the parties but rather a manufactured deepfake.”
“The whole legal system needs to catch up. It’s not good at technology, and there are really easy ways to manipulate the system,” he added.
Mr James recounts the case in an article to be published in the March edition of the International Family Law Journal.