WHAT WILL THE FUTURE OF AI-POWERED DISINFORMATION LOOK LIKE?


Thu 30 March 2023:

As sophisticated artificial intelligence systems improve their capacity to produce images, video, and text, researchers are growing more concerned about the technology’s seemingly inevitable role in disinformation and propaganda campaigns.

Developments in generative AI could drastically change our information environment, to the point where neither humans nor even machines can reliably distinguish AI-generated content from content created by humans.

The proliferation of tools powered by generative AI is making disinformation easier to produce, paving the way for a host of new problems with no clear solutions for online content moderators. AI ethicists and others within the industry are calling for stronger regulatory measures.

Additionally, while generative AI has been praised for its capacity to offer highly personalized recommendations, the same online personal data that AI programs are trained on could be exploited by chatbots to persuade large numbers of people to spread hoaxes or foreign propaganda. What steps can be taken to make the technology safer, and how effective will AI be at manipulating us?

In this episode of The Stream, we’ll look at how AI could worsen the online disinformation landscape.

On this episode of The Stream, we speak with:
Henry Ajder, @HenryAjder 

AI expert and broadcaster

Sam Gregory, @SamGregory 
Executive director, Witness

Asra Nadeem, @asranadeem
Chief Operating Officer, Opus AI

SOURCE: AL JAZEERA

