Meta’s AI chief says OpenAI’s and Google DeepMind’s AI optimism is overblown



Summary

Premature regulation of AI could entrench the dominance of big tech companies and stifle competition, warns Meta’s chief AI scientist Yann LeCun.

LeCun believes that regulating AI research and development could be counterproductive and lead to “regulatory capture” under the guise of AI safety.

He attributes calls for AI regulation to a “superiority complex” among leading technology companies, which claim that only they can be trusted to develop AI safely.

LeCun called this attitude “incredibly arrogant” and advocated a more open approach to AI development. Meta relies on open-source models like LLaMA, which encourage competition and allow a wider variety of people to develop and use AI systems, LeCun said.

Critics of Meta’s strategy, on the other hand, worry that putting powerful generative AI models in the hands of potentially malicious actors could increase the risks of disinformation, cyber warfare, and bioterrorism.

The renowned AI researcher made the comments to the Financial Times ahead of the AI Safety Summit at Bletchley Park, organized by the British government in November.

Don’t fear the Terminator

LeCun called the idea that today’s AI could lead to the annihilation of humanity “preposterous.” People, he said, have been conditioned by science fiction and the “Terminator” scenario to believe that intelligent machines will take over the moment they become smarter than humans.

But intelligence and the drive for dominance are not synonymous, said LeCun, who believes humans will remain the apex species even in an age of super-intelligent AI. According to LeCun, today’s AI models are not as powerful as some researchers make them out to be: they lack an understanding of the world, the ability to plan, and true reasoning.

LeCun accuses OpenAI and Google DeepMind in particular of being “consistently over-optimistic.” Human-level AI, he says, is far more complex than today’s systems and will require several “conceptual breakthroughs.”
