As AI-generated content becomes more widespread and easier to produce, it will become increasingly difficult to sort the authentic from the fake, or, more specifically, from the deepfake. This situation has also created an opening for companies that specialize in identifying AI-generated content.
That includes OpenOrigins, a start-up founded in 2021 that combats fake media by using blockchain technology to verify the authenticity of photos, videos, and other digital content. On Thursday, the company announced it has raised $4.5 million in a seed investment led by Galaxy Interactive and is expanding globally.
The London-based company offers newsrooms the ability to verify the authenticity of digital media both in real time and retrospectively. Its point-of-capture software can be installed on a camera or iPhone to establish that a photo or video was in fact taken by a human. It can also be used to comb through extensive archives and validate the originality of existing content. All of this information is then recorded on the public, immutable Hyperledger blockchain, an open-source, Ethereum-based framework provided by the Linux Foundation.
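For readers curious what a point-of-capture provenance record might look like in practice, the sketch below is a generic illustration, not OpenOrigins' actual software: the function names, metadata fields, and the suggestion of anchoring the hash on a blockchain are all assumptions made for the example.

```python
import hashlib
import json
from datetime import datetime, timezone


def hash_capture(path: str) -> str:
    """Compute a SHA-256 fingerprint of a photo or video file at capture time."""
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            sha.update(chunk)
    return sha.hexdigest()


def build_provenance_record(path: str, device_id: str) -> dict:
    """Bundle the content hash with capture metadata.

    In a point-of-capture system, a record like this would typically be signed
    by the capturing device and then anchored on a blockchain, so the hash can
    later prove the file is unaltered and existed at a given time. The fields
    here are illustrative, not OpenOrigins' schema.
    """
    return {
        "content_sha256": hash_capture(path),
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "device_id": device_id,  # hypothetical identifier for illustration
    }


if __name__ == "__main__":
    # Assumes a local file named photo.jpg exists.
    record = build_provenance_record("photo.jpg", "camera-001")
    print(json.dumps(record, indent=2))
```

Because only the fingerprint and metadata need to be stored on-chain, the media file itself never leaves the newsroom, which is what makes this approach workable for both live capture and large archives.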
OpenOrigins CEO and co-founder Manny Ahmed developed one of the first deepfake detectors as a PhD student at Cambridge University. He soon realized, however, that detectors would be used to train more convincing deepfake generators. “And that’s exactly what happened,” he said.
Instead, he pivoted to developing infrastructure that proves content is human-made. “What we try to do is not prove whether something is fake, but rather prove what is real,” Ahmed told Fortune. “We’re not doing deepfake detection. We are using a lot of different data points to prove that a particular photo or video is real.”
The company works with numerous British news outlets, including Sky News, The Sunday Times, The Sun, and ITN, one of the U.K.’s leading media production companies. With the new funding, it is now expanding globally, starting with India and the United States. The company is also launching a marketplace to help these newsrooms license their verified data for AI training.
“We’re trying to secure the form of the internet that allows for evidentiary value,” OpenOrigins co-founder Ari Abelson told Fortune. “It’s something that’s really important in news, which is why we’re focusing there, but this is broadly applicable to other industries.”
Abelson says that the technology could be applied to safeguarding insurance companies from fraud as well as verifying the authenticity of dating app profiles and Zoom calls.
Other companies also use blockchain technology to certify the originality of digital content; the Verify protocol, for example, was developed by Fox Corp. in collaboration with Polygon Labs. But Ahmed says his company solves a scalability problem that its competitors don’t.
“We’re the only ones who are trying to do authentication not just of live content gathered in real time, but also of historical content. As far as I’m aware, there isn’t a unified solution for both,” he said.
The founders said they will use the additional funding to expand their commercial team and hire within the United States.
“The reason why we’re so focused on securing archives at this point is because there is a limited time window where that is feasible,” Ahmed said. “Once we reach a point of sophistication where AI videos are functionally indistinguishable from non-AI ones, it’s going to be really, really hard for us to retroactively trust those archives.”