UK Labour Party Supports Nuclear-Like Regulation for Artificial Intelligence (AI)

The UK Labour Party has weighed in on ongoing concerns about the largely unregulated nature of AI.

The party’s concerns, shared with The Guardian, show that the United Kingdom’s politicians are questioning the wholly beneficial picture painted by AI developers.

Labour Party Concerned by Unregulated AI Sector Amid Rapid Expansion

The report captures the Labour Party’s position that AI should be regulated and require a government license. Its leaders are calling for treatment comparable to that applied to nuclear power or pharmaceutical companies.

Lucy Powell, the Labour Party’s digital spokesperson, emphasized that this model should be considered for AI developers, who would seek license approval from regulators. She also argued that rather than trying to ban AI, the focus should be on regulating the technology at the development stage.

Italy’s National Data Protection Authority Imposes Temporary Ban on ChatGPT


In March, Italy’s national data protection authority suspended ChatGPT over privacy concerns, alleging that the service violated privacy rights. It accused ChatGPT’s developer, OpenAI, of non-compliance with the General Data Protection Regulation (GDPR).

The Italian Data Protection Authority indicated that ChatGPT had suffered a breach exposing users’ payment details and conversations. The agency also expressed concern that OpenAI offered no legal basis for the mass collection and storage of personal data used to train its algorithms.

The Italian watchdog further challenged OpenAI over how it verified the age of ChatGPT users, alleging that the absence of verification exposed minors to unsuitable content that an age-check protocol would have prevented. After OpenAI implemented new safeguards in April, the regulator lifted the temporary ban.

Regulations Necessary to Moderate AI Creation and Control

Powell added that the primary concern is the lack of regulations governing large language models. Such regulations, she argued, could also shape how other artificial intelligence tools are created, managed, and controlled.

Powell’s position echoes that of United States Senator Lindsey Graham, who in May emphasized the need for an agency that could grant and withdraw licenses from AI developers. Sam Altman, OpenAI’s chief executive officer, supported the idea and suggested creating a national agency responsible for setting standards and practices.

OpenAI Boss in Support of Regulating AI

Altman proposed an agency that would grant licenses to AI efforts above a certain scale of capability, and that could withdraw those licenses to ensure safety standards are followed.

The comparison between nuclear technology and AI is not new. Back in May, the well-known investor Warren Buffett likened artificial intelligence to the atomic bomb, noting that neither technology can be uninvented. In the same month, Geoffrey Hinton, widely regarded as a godfather of AI, stepped down from his position at Google so that he could speak freely about the potential dangers of the technology.

Replicate Regulatory Approach Deployed in Nuclear Power Checks

A letter published last week by the Center for AI Safety stated that mitigating the risk of extinction from artificial intelligence should be a priority on par with pandemics and nuclear war. Sam Altman, Stability AI chief executive officer Emad Mostaque, and Microsoft co-founder Bill Gates were among the letter’s signatories.

AI’s rapid development and adoption have raised concerns about discrimination, bias, and surveillance. According to Powell, these risks can be mitigated by compelling developers to embrace data openness. Given the speed at which the technology is moving, she argued, an interventionist government strategy is needed rather than a laissez-faire one.
