As a juror, the most clear-cut evidence you can expect to see in the courtroom is a video. What better evidence could there be than seeing the defendant commit the crime before your very eyes? However, video evidence is not always what it seems to be, and emerging technology has made the problem worse.
We have seen historically that video evidence is not always as clear as we believe it to be, and intelligent minds can see very different things in the same video. In Scott v. Harris, not only did the jurors perceive a dashcam video differently, but the Supreme Court Justices did as well. Watch it for yourself and decide whether you think the car being chased by the police is driving recklessly. Researchers found that the differences in what viewers saw were tied to their implicit biases. Seeing was not believing, even in this unmanipulated video. When video evidence can be altered, juries can be misled even further.
For decades, videos have been manipulated to adjust the timing of footage, delete or splice frames, and re-dub sound, among other alterations. These simple manipulations can drastically shift a jury's perception of a video. We saw this in the Rodney King case, in which video evidence was slowed down to undermine the allegations of police brutality. As video manipulation grows more convincing, juries could be completely misled by an altered video.
With the help of artificial intelligence, videos can now be altered to manipulate human bodies, faces, and voices with eerie accuracy. These altered videos are known as deepfakes. While deepfakes can be used for entertainment purposes (take, for example, Donald Trump joining the cast of Breaking Bad, Barack Obama warning about the dangers of deepfakes, or Nick Offerman performing Full House as a one-man show), they can be used for more sinister purposes as well.
The BBC TV show The Capture brings attention to this new reality and its implications for the criminal justice system: the protagonist is accused of a crime he did not commit, yet is confronted with CCTV “video proof” that he did. Showing that footage to the average jury, without knowledge of this new technology, would almost certainly lead to a conviction, because the jurors can see the crime happening before their very eyes.
While the alterations in ‘cheap fake’ videos were relatively easy to spot, deepfakes are much more difficult to detect, both for the human eye and for computers, as deepfake detection algorithms are not yet widely available.
With this fear in mind, what can be done to ensure that jurors are not swayed by altered videos? Until detection of these alterations is widely available, juries will need to be instructed to take video evidence with a grain of salt and not to rely on it alone.