A group of prominent computer scientists and tech industry leaders has come together to call for a six-month pause to contemplate the risks posed by AI systems with “human-competitive intelligence.” The letter highlights the dangers that powerful AI systems might pose to society and humanity as a whole, ranging from disinformation and job automation to future catastrophic risks.
The petition, organized by the Future of Life Institute, has attracted signatories including Yoshua Bengio, Stuart Russell, Gary Marcus, Andrew Yang, Rachel Bronson, Apple co-founder Steve Wozniak, and Elon Musk. The signatories call on all AI labs to immediately pause the training of AI systems more powerful than GPT-4 for at least six months. If the labs do not comply, governments are urged to step in and institute a moratorium.
While the United Kingdom has outlined its approach to regulating high-risk AI tools and lawmakers in the European Union have been negotiating the passage of sweeping AI regulations, this call for a pause has drawn skepticism. Some experts find the letter’s vagueness and lack of concrete regulatory proposals worrying. Others have gone as far as to call it hypocritical for Elon Musk to sign, given Tesla’s fights against accountability.
Despite the skeptics, the Future of Life Institute believes the pause is necessary to give policymakers and stakeholders time to reflect on the potential consequences of advancements in AI. The pause would provide an opportunity to weigh the risks and address concerns about autonomy, transparency, accountability, and ethics in AI development.
The petition is a significant development in the AI landscape, with prominent leaders and experts calling for a pause to consider the risks posed by powerful AI systems. Governments, policymakers, and stakeholders should take these concerns seriously and ensure that AI systems are developed with ethical safeguards and appropriate regulatory measures in place.