Proving the photo is fake is one thing. Proving that it isn't is another matter entirely

That truth is the first casualty of war is an old aphorism. One recent illustration is the proliferation of photos and videos of things that never happened in wars like those currently being fought in Ukraine and Syria. Some are outright fakes. Others are manipulated versions of honestly recorded material. Last year, for instance, a faked video surfaced of Volodymyr Zelensky, Ukraine's president, apparently ordering Ukrainian soldiers to surrender.

The proliferation of such fakes has, however, led to a second, subtler way of lying with images: using the ubiquity of fakery to cast doubt on inconvenient photos that are actually real. For example, shortly after Russia invaded Ukraine last year, the Associated Press released a video of doctors failing to revive a young girl wounded in the shelling of Mariupol. The footage soon appeared on Russian television labeled “fake”. Since it is hard to prove a negative (that is, that material was not falsified), genuine evidence can be challenged in this way, perhaps even in court, and criminal charges based on it may not stick.

Means of establishing the authenticity of digital images would therefore be valuable. And one is available now. Glass-to-glass alert systems create special software “ecosystems” within which photos and videos can be captured, stored and transmitted in a way that alerts viewers to any alteration, whenever and wherever it is made on the image's journey from the camera's lens to the viewer's screen.

A plate of hash

One such system has been developed by eyeWitness to Atrocities, a charity based in London. The charity's eponymous app does two things. First, when a phone with the app installed takes a photo or records a video, the app logs the time and place of the event as reported by electronic witnesses that are hard to spoof, such as GPS satellites and nearby mobile-phone towers and Wi-Fi networks. This is called controlled metadata capture, and it is more trustworthy than collecting such metadata from the phone itself, because a phone's own time and location settings can be altered.
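
A rough sketch of what such a controlled-metadata record might contain is below. The field names are invented for illustration; eyeWitness has not published its actual format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ControlledMetadata:
    """Capture context reported by external witnesses (GPS satellites,
    cell towers, Wi-Fi networks), not by the phone's own user-editable
    clock and location settings. All field names are hypothetical."""
    gps_time_utc: str                                        # timestamp from GPS satellites
    gps_lat: float                                           # latitude fix from GPS
    gps_lon: float                                           # longitude fix from GPS
    cell_tower_ids: List[str] = field(default_factory=list)  # nearby mobile-phone towers
    wifi_bssids: List[str] = field(default_factory=list)     # nearby Wi-Fi access points
```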

Second, the app reads the entire digital sequence of the image (the ones and zeros that represent it) and uses a standard mathematical formula to calculate an alphanumeric value, known as a hash, that is unique to that image. Having done all this, it puts the metadata and the hash into a file called a proof bundle, which is separate from the image, and sends an encrypted copy of the image and its proof bundle to a special server.
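
In outline, the hashing step works like this minimal Python sketch. eyeWitness does not say which formula it uses, so the choice of SHA-256 here, like the metadata field name, is an assumption.

```python
import hashlib
import json

def image_hash(image_bytes: bytes) -> str:
    # Read the image's entire digital sequence and reduce it to a
    # fixed-length alphanumeric value. Changing even a single pixel
    # changes the bytes, and therefore the hash.
    return hashlib.sha256(image_bytes).hexdigest()

def make_proof_bundle(image_bytes: bytes, metadata: dict) -> str:
    # The proof bundle is kept separate from the image: it holds only
    # the captured metadata and the hash, ready to be encrypted and
    # sent to the server alongside the image itself.
    return json.dumps({"metadata": metadata, "hash": image_hash(image_bytes)})

sample = b"\xff\xd8...raw image bytes..."  # placeholder for a real photo's bytes
print(make_proof_bundle(sample, {"gps_time_utc": "2022-03-09T14:02:11Z"}))
```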

Wendy Betts, director of eyeWitness to Atrocities, describes this server as a digital evidence locker. When an image's authenticity needs to be verified, all that is required is to rescan its digital sequence, recalculate its hash, and ask the repository whether it holds an identical one. If even a single pixel of the image has been altered, the recalculated hash will not match the original; if it does match, the image has not been retouched. As an added service, around 80 lawyers, each working for the charity free of charge for a few hours a week, review the incoming images. They package those that appear to record abuses into files that are then sent to law-enforcement agencies, including Europol (the European Union's law-enforcement agency), the International Criminal Court and the Office of the Prosecutor General of Ukraine.
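
In code, the check Ms Betts describes is little more than recomputing a digest and testing set membership. A minimal sketch follows; the real locker's interface is not public, and SHA-256 is again an assumption.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# A stand-in for the "digital evidence locker": a set of hashes of
# images received directly from the app, before anyone could edit them.
evidence_locker = {sha256_hex(b"original image bytes")}

def verify(candidate: bytes) -> bool:
    # Recompute the candidate's hash and ask the locker for a match.
    # If even a single pixel was altered, the hashes will not match.
    return sha256_hex(candidate) in evidence_locker

assert verify(b"original image bytes")       # untouched image passes
assert not verify(b"original image bytes!")  # any change at all fails
```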

Andriy Kostin, Ukraine's prosecutor general, is a fan of the eyeWitness system, and not just because it provides the guarantee of authenticity that courts require. He also likes the fact that it helps overcome a second hurdle in his work: witnesses' fear of being found out.

Making connections

In the areas of Ukraine occupied by Russia, that threat is serious. If soldiers manning a checkpoint were, for example, to find video evidence for a war-crimes case on someone's phone, the consequences for that person could be grave. To make this less likely, the app's icon does not reveal its purpose. Moreover, if the icon is tapped by a probing officer and an incorrect passcode is entered, the phone's normal photo gallery opens instead. Maryna Slobodianiuk, an investigator at Truth Hounds, a human-rights group in Kyiv, says of the evidence of attacks she has collected with eyeWitness: “Even if I am captured … no one will reach it.”
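
The decoy behaviour amounts to a branch on the entered passcode, roughly as in the sketch below. This is purely illustrative pseudologic, not the app's actual code.

```python
def handle_unlock(entered_code: str, real_code: str) -> str:
    # A wrong code does not refuse entry; it quietly opens the phone's
    # ordinary gallery, so a probing officer sees nothing unusual.
    # Only the correct code reveals the hidden evidence.
    if entered_code == real_code:
        return "open_hidden_evidence_vault"
    return "open_normal_photo_gallery"
```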

The first version of eyeWitness, which is free to use, was released in 2015, so most of the bugs have been ironed out. Demand in Ukraine has risen sharply over the past year. Ms Betts says that of the 40,000 submissions received in 2022 which her team considers relevant to investigations, more than 27,000 were sent from Ukraine.

Police officers and journalists are particularly keen users. So are analysts at the Ukrainian Health Care Center, a Kyiv-based think-tank that uses the app to collect evidence of attacks on medical facilities. Nor is eyeWitness the only provider of glass-to-glass services. The Guardian Project, of Valhalla, New York, has released a smartphone app called ProofMode. Like eyeWitness, ProofMode combines controlled metadata capture with an image hash to form a proof bundle. Instead of operating the receiving server itself, however, ProofMode uses repositories run by third-party companies, such as Google, which register the hashes in the manner of a notary public. Viewers of a photo taken with ProofMode can upload it to the Guardian Project's website, which recalculates its hash and checks the repositories for a match. If none can be found, the image is deemed to have been altered.

The Guardian Project will soon add a new feature, Synchrony. This will link the location and time of a photo to OpenStreetMap, an online map of the world, and to a detailed geographical record of the world's weather in recent years (which record, exactly, has yet to be decided). That will make it easier to check for discrepancies between the place and time at which someone claims a photo was taken and the local landscape and weather conditions of that day. The idea, says Nathan Freitas, the Guardian Project's founder, is “to synchronize the images with the real world as it was”. He hopes to plug into other databases as well, including those that record when and where street protests took place.
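
The cross-check Synchrony aims at could look something like the sketch below. The weather-lookup function here is a made-up stand-in, since the Guardian Project has not yet chosen its data source.

```python
from typing import Callable, Dict, List

def consistent_with_records(
    claimed: Dict,
    weather_on: Callable[[float, float, str], str],
) -> List[str]:
    # Compare a photo's claimed place, time and visible conditions
    # against an independent record, collecting any discrepancies.
    problems = []
    recorded = weather_on(claimed["lat"], claimed["lon"], claimed["date"])
    if recorded != claimed["visible_weather"]:
        problems.append(f"weather mismatch: records say {recorded!r}")
    return problems

# Hypothetical archive lookup standing in for a real weather database.
def fake_weather_archive(lat: float, lon: float, date: str) -> str:
    return "heavy snow"

issues = consistent_with_records(
    {"lat": 50.45, "lon": 30.52, "date": "2022-03-01", "visible_weather": "clear"},
    fake_weather_archive,
)
print(issues)  # ["weather mismatch: records say 'heavy snow'"]
```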

A third operator, Truepic, based in La Jolla, California, takes a more commercial approach. Charities pay nothing to use its software, but the companies that employ it to keep an eye on supply chains, construction progress, loan compliance, and the whereabouts and condition of expensive kit have to pay.

Truepic offers two services. One scans smartphones for malware designed to facilitate metadata falsification. The other spots so-called rebroadcast attacks, in which a forged image is photographed in order to create a new image that carries no trace of tampering in its code. Mounir Ibrahim, once a member of America's diplomatic corps (he served, among other places, in Damascus, a hotbed of photographic fraud) and now head of public affairs at Truepic, is cagey about how this is done. But the trick, he notes, is to look for clues that all the pixels in an image have registered a uniformly flat surface.
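
Mr Ibrahim will not say exactly how Truepic does this, but one plausible illustration of the idea, assuming the phone can supply a depth map (as many dual-camera phones can), is to test whether the whole scene lies at a uniform distance from the lens. This is a hedged sketch of the general principle, not Truepic's method.

```python
import numpy as np

def looks_like_flat_surface(depth_map: np.ndarray, tolerance: float = 0.02) -> bool:
    # A real three-dimensional scene has pixels at many different
    # depths. A photograph of a screen or a print is a single flat
    # plane, so its depth values barely vary. Flag images whose
    # relative depth spread falls below a threshold.
    spread = (depth_map.max() - depth_map.min()) / depth_map.mean()
    return spread < tolerance

scene = np.random.uniform(1.0, 8.0, (480, 640))  # varied depths: a real scene
screen = np.full((480, 640), 0.4)                # uniform depth: a rebroadcast
print(looks_like_flat_surface(scene))   # False
print(looks_like_flat_surface(screen))  # True
```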

In 2021 Truepic teamed up with Adobe, Arm, the BBC, Intel and Microsoft to form the Coalition for Content Provenance and Authenticity (C2PA). This is an attempt to create a set of image-authentication standards for hardware and software developers. The goal is to eliminate the need to bother with special apps. Instead, the coalition wants metadata capture, hashing and uploads to repositories to happen behind the scenes, without licensing fees.

If C2PA standards were widely adopted, even web browsers would be able to check the internet's hash repositories and attach a warning to images that do not match. Eventually, hashes could be distributed automatically on blockchain ledgers. The Starling Lab, based at Stanford University, is testing such a system.
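
A minimal sketch of the idea behind such a ledger, not Starling's actual system: each entry folds the previous entry's digest into its own, so past records cannot be quietly rewritten.

```python
import hashlib

class HashLedger:
    # An append-only chain: each entry's digest covers the previous
    # digest, so altering any past record breaks every later link.
    def __init__(self) -> None:
        self.entries = []
        self.head = "0" * 64

    def append(self, image_hash: str) -> None:
        self.head = hashlib.sha256((self.head + image_hash).encode()).hexdigest()
        self.entries.append((image_hash, self.head))

    def contains(self, image_hash: str) -> bool:
        return any(h == image_hash for h, _ in self.entries)

ledger = HashLedger()
ledger.append(hashlib.sha256(b"a photo").hexdigest())
print(ledger.contains(hashlib.sha256(b"a photo").hexdigest()))  # True
```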

Obstacles remain, however. Jonathan Dotan, the founding director of the Starling Lab, points to one in particular: the technology could enable authoritarian regimes to identify the devices, and thus the people, that took damning photos. Researchers, he says, must therefore first find a way to prevent such tracking. Transparency is all very well, but even the good guys recognize that you can sometimes have too much of a good thing.
