UK general election misinformation: What publishers can do about it

Comment

June 10, 2024

The same technological advances that are vitalising content creation in 2024 are also threatening it.

By Paul Doyle


Within days of the UK election campaign starting, videos ‘investigating’ Rishi Sunak’s ‘real heritage’ began circulating on a number of platforms, following patterns similar to those experienced by Barack Obama during his US presidential campaign.

In today’s digital society, the demarcation between truth and falsehood is increasingly obscured. Misinformation, the unintentional spread of false information, has become the genericised term for the whole phenomenon. It is, in fact, only one pillar in a triad of information disorder.

Peer beyond the viral, baseless claims and you may spot ‘disinformation’: the deliberate creation and dissemination of falsehoods, a sinister tactic used to sway public opinion or tarnish reputations.

Then there’s malinformation, the strategic use of true information to inflict harm, such as leaking genuine but private information to discredit an individual or entity.


It’s crucial to get to grips with these nuances in order to identify and avoid information disorder: know the source of the information, understand the context and terms of reference, and don’t readily accept anything at face value.


Information disorder boosted by tech revolution

Information disorder is nothing new, but technological advancements, particularly in social media and AI, have revolutionised the way information is created, shared and consumed.


On the other side of this technological revolution, algorithms designed to engage users often inadvertently prioritise sensational or divisive content, regardless of its veracity.

The ability of anyone to publish, algorithmic bias and the human tendency to engage with content that resonates with pre-existing beliefs have together created fertile ground for the spread of misinformation.

Information is not always intelligent

AI has introduced new complexities to the information landscape. Its ability to create convincing yet false content, from deepfake audio, imagery and even video to fabricated news articles, poses a formidable challenge to discerning truth from fiction.

These technologies, while remarkable when used for good, offer tools for those intent on manipulating public opinion. This underscores the need for a vigilant and informed populace.

Within the first week of the UK election campaign, there were many videos, beyond those of parody or satire, making factually incorrect assertions around ‘National Service’ and an unfounded claim that Starmer was involved in a decision not to prosecute Jimmy Savile.

Information runs deep

A 2023 report by Home Security Heroes exposed the chilling ease of manipulating reality with deepfakes, AI-generated fabrications that exploit our trust in visuals.

Beyond the technical prowess, it’s the psychological manipulation that’s concerning. We inherently trust familiar faces and voices, making us susceptible to deepfakes’ potent mix of misinformation, disinformation, and malinformation.

The consequences are far-reaching, impacting individual reputations, public opinion and even societal stability. High-profile targets like Taylor Swift, whose deepfake nudes were widely circulated, highlight the potential for mass deception. Social media platforms’ quick takedown in this case underscores the urgency of a comprehensive response; arguably, only Swift’s stardom resulted in such rapid, universal action. We need education, regulation and advanced detection to safeguard online discourse.

[Read more: ITN sounds alarm over fake online content featuring Robert Peston, Mary Nightingale and others]

Deepfakes aren’t the only threat. ‘Shallowfakes’, created with traditional editing techniques, exploit similar vulnerabilities. These can be malicious, but often stem from misinformation, like taking quotes out of context or memes masquerading as news headlines.

In recent days, there have been videos combining shallowfake editing with AI-generated voiceover around supposed hustings, in which a Conservative MP’s speech has been layered over separate footage to imply that the audience in the background is uninterested in, and sceptical of, the speaker. The two scenes are entirely different but have been merged into one misleading piece of video.

Information creators need to be responsible

In previous roles in current affairs and journalism I have worked with great people who methodically analyse the many examples of misinformation, and I’ve seen first-hand the relative ease with which falsehoods can spread. It brings to life the old proverb that ‘a lie will go round the world while truth is pulling its boots on’.

So how can we mitigate the impact of misinformation? Education plays a pivotal role. Enhancing media literacy is not just about enabling people to distinguish between true and false information but about cultivating a critical mindset that questions and analyses the source, context and purpose of the information they are consuming.

Journalists and content creators also play a crucial role in this ecosystem. They should adhere to rigorous verification processes that go beyond a fact-check, delving into the context and framing of information to ensure it is accurate and unbiased, and promoting transparency and accountability along the way.

Technology firms, too, bear a significant responsibility. Implementing more transparent algorithms, enhancing fact-checking mechanisms, and fostering collaborations with fact-checkers and academia can contribute to a more informed and discerning public.

It’s vital to support creators and publishers with the skills they desperately need to be able to accurately interpret, inspect and investigate the information at hand in order to inform their response and, ultimately, their output.

Information sharing is built on trust

In an age where misinformation thrives, the cornerstone of countering this tide is rebuilding public trust. The Edelman Trust Barometer offers insightful revelations: its latest report indicates that trust across institutions varies significantly, shaping public receptivity to information.

[Trust in media: UK drops to last place in Edelman survey of 28 nations]

The European Broadcasting Union (EBU) charts public trust in media across this complex landscape. It reveals that trust in traditional media remains relatively robust, but we are only as worthy as our last headline, citation, video edit or social post.

Engendering trust isn’t an overnight task but a sustained effort. Media institutions have established trust over decades, yet it can be eradicated in seconds. Media brands need to continue to uphold honest, wholesome, ethical values. That is by no means a rallying cry to return to notebook and quill.

Content creation in 2024 is exciting and innovative; it fuels so much aspiration to inspire and bring joy to audiences, and it is made possible by the same technological advances that threaten it.

Paul Doyle has over 20 years of experience in the media industry, specialising in video production and content strategy. At Immediate, Paul oversees the strategic direction of video output, managing content creation, production, engagement, distribution, and monetisation for platform brands. Immediate’s portfolio includes Good Food, Radio Times and BBC Gardeners’ World.

Previously, at TikTok EU, he led content strategy and development, localising content for five European markets. Prior to this, Paul managed technical and production responsibilities at media companies including the BBC, Sky, RTE and ITV, while also launching various formats for broadcast and digital platforms.

In his role as head of programme delivery at First Draft, Paul addressed digital misinformation across three continents through strategic initiatives, promoting digital content integrity for media companies, tech firms, NGOs, and non-profits.

Topics in this article: Artificial Intelligence, Fake news, Immediate Media Company

