Page 100 - Cyber Defense eMagazine June 2024

Technology’s Role in Spreading Misinformation

            Technology and AI are enabling ever-more sophisticated and personalized platforms to spread
            misinformation, with deepfakes a particular area of concern.

            Manipulated and falsified information is now the most severe short-term risk the world faces, according
            to the World Economic Forum. Its Global Risks Report says misinformation and disinformation could
            radically disrupt electoral processes in several economies, triggering civil unrest and confrontation and
            deepening polarized views in societies where political opinion is entrenched.


            A group of 20 tech companies, including Google, Microsoft, Meta, TikTok, IBM, Adobe, and Amazon,
            announced a commitment in February to adopt “reasonable precautions” to prevent the spread of AI
            misinformation ahead of this year’s elections. AI-generated deepfake content has already been used to
            interfere in the US election: in January, thousands of households received a fake robocall that used AI
            to mimic President Joe Biden, encouraging them not to vote in New Hampshire’s primary election. In
            February, a deepfake news report about a supposed assassination attempt on President Macron of
            France spread quickly online.


            Kent Walker, Google’s president for global affairs, said in an interview that, given the breakneck pace
            of AI development, there was a danger of “micro-targeted” deepfakes being customized to influence
            small but potentially decisive parts of the electorate through some social media platforms.

            Alongside deepfakes, there are concerns about the repurposing of existing imagery for disinformation,
            as well as convincingly crafted personalized emails or text messages. Where people feel a sense of
            grievance or perceived injustice, receiving compelling personalized communication could be the nudge
            they need to vote a certain way, or the motivation to take their frustration onto the streets.

            Public disaffection with governments that have not heeded citizens’ concerns or demonstrably changed
            their lives for the better is driving mistrust and cynicism, which misinformation can exploit, undermining
            the legitimacy of governments and media sources. This mistrust can be stoked by populists for their
            own ends. There is the further danger that genuine evidence can in turn be dismissed as ‘fake’ by
            those acting in bad faith.



            Multinational Companies Show Increasing Demand for Political Violence Insurance

            Political violence can impact businesses in many ways. Businesses need to protect their people and
            property through forward planning: ensuring robust business continuity plans are in place in the event
            of an incident, increasing security, and reducing or relocating inventory likely to be affected by an
            event. Scenario planning and tracking risks in areas key to their operations can raise businesses’
            awareness of where political violence and civil unrest risks may be intensifying. Companies should
            also review whether their insurance policies cover the impact of risks such as strikes, riots, and civil
            commotion.








            Copyright © 2024, Cyber Defense Magazine. All rights reserved worldwide.