On Tuesday, Google, a subsidiary of Alphabet Inc., announced that it will restrict the election-related queries its chatbot Bard and Search Generative Experience can answer. The restrictions are expected to be in place by early 2024, ahead of major elections worldwide, including India's general elections, the US presidential election, and South Africa's elections. Google said it is focused on the ways AI can help voters and campaigns in these contests.
Meta bars political campaigns from using its AI advertising products
In November, Meta, Facebook's parent company, announced that political campaigns and advertisers in regulated industries would be barred from using its new generative AI advertising tools. Advertisers will also be required to disclose when AI or other digital techniques have been used to alter or create political, social, or election-related ads on Facebook and Instagram.
Elon Musk’s social media network X allows political advertising
By contrast, Elon Musk's social media platform X, currently under scrutiny from the European Union, announced in August that it would allow political advertising in the US from candidates and political parties. The platform also plans to expand its safety and elections teams ahead of the US election. The move marks a significant reversal: X had banned all political advertising globally since 2019.
Governments worldwide move to regulate AI
Citing risks such as the spread of misinformation, particularly during elections, governments around the world have moved to regulate AI. New EU rules will require Big Tech companies to label political advertisements on their platforms and to disclose who paid for them, how much was spent, and which elections they target.