
Vitalik Buterin Claims Artificial Intelligence Might Overtake Humans as ‘Top Species’

Vitalik Buterin, Ethereum’s co-founder, cautions that ‘even Mars might be unsafe’ if superintelligent artificial intelligence turns against humankind.

Vitalik Buterin, Ethereum’s co-founder, says that super-advanced artificial intelligence (AI), if left unchecked, has a ‘significant possibility’ of overtaking humans and becoming the planet’s next ‘apex species.’ However, he said the outcome will ultimately depend on how humans intervene in AI development.

Vitalik Buterin Outlines the Benefits of AI

In a November 27 post, Vitalik, considered by some a thought leader in the cryptocurrency space, asserted that artificial intelligence ‘primarily varies’ from other recent inventions such as airplanes, the printing press, social media, the wheel, and guns. This is because of AI’s capability to create a new kind of ‘mind’ that could turn against the interests of humans.

Vitalik also said that artificial intelligence is a new kind of mind that is rapidly gaining intelligence and has a serious chance of surpassing humans’ mental faculties to become the planet’s new apex species.


He asserted that, unlike nuclear war, climate change, or a pandemic, superintelligent artificial intelligence could eliminate humanity and leave no survivors, particularly if it comes to view people as a threat to its own survival.


Vitalik added that artificial intelligence could go wrong in the worst possible way: it could cause human extinction, and even Mars might not be safe. He cited an August 2022 survey of more than 4,270 machine learning researchers, who estimated a 5-10% chance of artificial intelligence wiping out humanity. Nevertheless, while emphasizing that assertions of this nature are ‘extreme,’ he noted there are ways humans could prevail.

Brain Interfaces and Techno-Optimism

Vitalik recommended incorporating brain-computer interfaces (BCIs) to give humans more control over powerful forms of AI-based cognition and computation. A BCI is a communication pathway between the brain’s electrical activity and an external device, such as a computer or a robotic limb.

According to Vitalik, this could cut the two-way communication loop between humans and machines from seconds to milliseconds. He added that it would help humans maintain some degree of ‘relevant agency’ over the world.

Vitalik argued that this path would be ‘harmless,’ since people would remain involved in every decision the AI makes. He said it could reduce the temptation to offload high-level planning responsibility to the AI, lowering the chance of it doing something misaligned with humanity’s values.

Vitalik also stressed the need for ‘active human intention’ to steer artificial intelligence along a path that benefits humanity, because profit maximization does not always lead humans to the most desirable course.


Buterin Warns Against Unchecked Pursuit of Superintelligent AI

He concluded that humans are the universe’s ‘brightest star,’ having developed technology to expand human potential for many years, with hopefully many more to come. Specifically, he said that if, years from now, the Earth or any part of the universe still bears the beauty of Earthly life, human artifacts such as geoengineering and space travel will have made it possible.

Buterin warned that unchecked superintelligent AI could eradicate humanity and that even Mars might not be safe. He urged developers to embrace guardrails along the way rather than prioritize the interests of proprietary corporations.

Editorial credit: Alexey Smyshlyaev / Shutterstock.com 





Stephen Causby

Stephen Causby is an experienced crypto journalist who writes for Tokenhell. He is passionate about covering crypto news, blockchain, DeFi, and NFTs.
