
AI News: UBI to Address Job Losses, Microsoft Faces Multibillion-Dollar Fine & Threats to 2024 U.S. Elections


As artificial intelligence (AI) continues to advance at an unprecedented pace, its potential impact on society and politics has become a growing concern among experts and government officials.

Recent developments highlight the need for proactive measures to address the challenges posed by AI, particularly in the context of job displacement and election security.

TLDR

Geoffrey Hinton, a renowned AI expert, advised the UK government to consider implementing Universal Basic Income (UBI) to address potential job losses due to AI.
Microsoft faces a potential multibillion-dollar fine in the EU if it fails to provide information about the risks associated with its Bing AI by May 27, 2024.
The U.S. Department of Homeland Security warns that AI tools pose a significant threat to the 2024 U.S. elections by enabling the spread of misinformation and disruption of election processes.
AI-generated content, such as deepfakes and altered media, can be used to confuse voters, sow discord, and potentially incite violence during the election period.
Experts emphasize the need for public education and preparedness to counter the rapid spread of AI-generated misinformation in the upcoming elections.

Geoffrey Hinton, widely regarded as the “Godfather of AI,” recently advised the UK government to consider implementing a Universal Basic Income (UBI) to mitigate the potential job losses resulting from AI automation.

Hinton, along with other prominent figures in the AI industry, such as OpenAI co-founder Sam Altman, believes that UBI could help offset the economic impact of automation, particularly for workers in jobs that can be easily automated.

Meanwhile, Microsoft faces a potential multibillion-dollar fine in the European Union if it fails to provide information about the risks associated with its Bing AI by May 27, 2024.

The European Commission has requested information under the Digital Services Act, citing concerns over “generative AI risks” such as “hallucinations,” deepfakes, and the automated manipulation of services that could mislead voters.

The threat posed by AI to the integrity of elections is not limited to Europe.

In the United States, the Department of Homeland Security (DHS) has issued a warning about the potential threats AI poses to the 2024 U.S. elections.

According to a DHS analysis, AI tools can be exploited by both domestic and foreign actors to interfere with the election process, sow discord, and disrupt election infrastructure.

The DHS bulletin highlights the ability of AI to generate altered or deepfaked pictures, videos, and audio clips that could be used to confuse or overwhelm voters and election staff.

The timing of the release of such AI-generated content is crucial, as it may take time to counter-message or debunk false information spreading online.

Experts emphasize the need for public education and preparedness to counter the rapid spread of AI-generated misinformation.

Elizabeth Neumann, a former DHS assistant secretary, warns that the 2024 race may be one of the most difficult elections for Americans to navigate in terms of finding the truth, since AI-generated content can make it hard to trust even one's own eyes.

To address these challenges, authorities at every level must be prepared to defend against the dissemination of fake news by AI. John Cohen, a former DHS intelligence chief, stresses the importance of educating and preparing the public, as they are the primary targets of AI-generated content designed to influence behavior.

State and local officials need to have plans in place to quickly counteract and correct inaccurate information using trusted sources of communication.

