Elliot Grainger

Future Trends 2021 - Part 3 Manipulation

Manipulation; from Unsplash by AgniB

This is the third of several reports we are sharing to highlight why we do what we do, and what we see as the current challenges facing those working to address issues within the communications ecosystem.

From Manipulated Content to Synthetic Media to Augmented Reality

Complementary to the reality of an “online space for everything and everyone” is the falling cost of entry to the communications space. Whether measured in human capital or direct investment, content creation such as video production has become far easier to deliver: there is an app for everything, bringing once complex, expensive technical skills to anyone with a smartphone. Terrorist and extremist organisations, as well as those perpetrating disinformation, have therefore been able to multiply the number of people they can call on to produce communications, as they no longer depend on the tech-savvy to deliver their content.

However, the misuse of images and content is not new. Using photos out of context, or from older events to imply a current reality, has been a simple starting point for manipulated content for decades, if not the best part of a century (manipulating truths to create news was an accusation levelled at newspapers as early as the 1920s and 1930s), and it continues to take place today. The greater concern arises when the original content is not simply misused in context but manipulated to imply something else altogether, particularly through emerging technologies that leave no easy way to determine how the manipulation was done. Terrorists and extremists initially manipulated content to get around takedown measures: hashes on video content, for example, were overcome through small audio or visual changes to the original, creating a cat-and-mouse game of posting and removing tinkered content.
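The hash-evasion dynamic described above can be illustrated with a toy sketch (this is an illustrative example, not any platform's actual matching system; the frame data and functions are invented for the demonstration). An exact cryptographic hash changes completely when even a tiny edit is made, so a takedown list of known hashes stops matching; perceptual hashes, which platforms moved towards, tolerate small changes:

```python
import hashlib

def exact_hash(data: bytes) -> str:
    """Cryptographic hash: any change to the input yields a completely new digest."""
    return hashlib.sha256(data).hexdigest()

def average_hash(pixels: list[int]) -> int:
    """Toy perceptual hash: one bit per pixel, set if brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

frame = [10, 200, 30, 180, 25, 190, 15, 210]   # stand-in for a video frame
tweaked = [p + 2 for p in frame]               # slight brightness shift

# Exact hashing: the tiny tweak breaks the match entirely,
# so hash-list takedowns are trivially evaded.
print(exact_hash(bytes(frame)) == exact_hash(bytes(tweaked)))   # False

# Perceptual hashing: the relative brightness pattern is unchanged,
# so a distance threshold still flags the content as a near-duplicate.
print(hamming(average_hash(frame), average_hash(tweaked)))      # 0
```

Real systems use far more robust perceptual fingerprints (over many frames and audio), but the asymmetry is the same: evasion needs only a minor edit, while detection must be invariant to it.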

This has developed in recent years to the point where manipulated content is now entirely synthetic. This is not original content that has been altered, but content generated from scratch by machine learning trained on past material. It can be used for comic effect, such as imposing your own comments and facial expressions onto famous works of art. However, the same technology can create new “realities” for people of importance, fame or influence, and can be exploited to further undermine trust in government and exacerbate divisions within society. Most prominent among the concerns is its continuing use to drive information disorder within the European communications ecosystem. It is not thought that this is yet taking place to any large extent, but the degree to which it can alter people's perception of reality is felt to be one of the greatest threats to the communications ecosystem in the years ahead.

The challenge is further exacerbated by the fact that such alterations are difficult for humans to detect: AI manipulation creates synthetic media that viewers perceive as real. It also creates challenges for censors and platforms, who need to distinguish the real from the fake. Technological approaches capable of identifying and flagging such content are coming online, but they are currently slow and behind the curve. Beyond that lies the task of socialising such technological protection so that people trust it. There are real risks to companies and individuals, but fundamentally to democracy, that need to be understood. How can you ensure that policing of such content removes the fake and not the real? Whose responsibility is it to police such content? The new EU AI Regulation will have to address all of these concerns while supporting EU companies’ competitiveness in the global market.

It is worth noting that, in our interviews on this challenge, there was confidence that the mainstream platforms will develop ways to build in protections against exploitative synthetic media, as has happened with deepfakes. However, the response has to be quicker than the threat. In the case of manipulated media, the industry is confident it is aware of and prepared for the impact on security and social stability; governments and civil society remain unconvinced.
