AI chatbots are evolving faster than computer chips

Researchers at Cornell University examined algorithmic advances in language models. © Canva

The technological progress of Large Language Models in recent years is remarkable. A team of researchers at Cornell University in New York has found that large language models are evolving rapidly and can increase their performance even faster than computer chips. However, the amount of energy required for their operation should not be underestimated.

More than 200 language model evaluations

The researchers found that the computing power required to reach a given benchmark halves on average every eight months, a faster pace than the improvement of computer chips. In the course of the study, more than 200 language model evaluations from 2012 to 2023 were analyzed. The finding points to an efficiency gain that outpaces many other areas of data processing and even exceeds Moore's Law, which states that the number of transistors on a chip doubles at regular intervals; depending on the source, a period of 12, 18 or 24 months is cited. The transistor count serves as an indicator of a chip's computing power.
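The comparison above is simple exponential arithmetic. The following sketch illustrates it with the figures mentioned in the article (an eight-month halving time for required compute, and a 24-month doubling time for chip capacity as one common reading of Moore's Law); the numbers are illustrative, not taken from the study's data.

```python
# Illustrative arithmetic: compute needed to reach a fixed benchmark
# under an 8-month halving time, versus chip capacity under a
# 24-month Moore's Law doubling time.

def compute_needed(months: float, halving_months: float = 8.0) -> float:
    """Fraction of the original compute needed after `months`."""
    return 0.5 ** (months / halving_months)

def chip_capacity(months: float, doubling_months: float = 24.0) -> float:
    """Relative chip capacity (transistor count) after `months`."""
    return 2.0 ** (months / doubling_months)

# After two years, algorithms need only 1/8 of the original compute,
# while chips have merely doubled in capacity.
print(compute_needed(24))  # 0.125
print(chip_capacity(24))   # 2.0
```

On these assumptions, algorithmic progress improves efficiency roughly three times as fast as hardware alone.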

Increased performance through better algorithms

The high performance of AI models is primarily due to advances in the algorithms on which they are based. By scaling up the size of LLMs, performance can be increased even further, the study states; scaling is cited as the key driver of performance improvements. However, this also requires more computing power, which in turn depends on the availability of AI chips, which are currently in short supply. The contribution of further algorithmic development could not be quantified in detail because the code of many LLMs is not publicly accessible. “Overall, our work provides a quantitative assessment of the rapid progress in language modeling,” the researchers write in their study.
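The scaling behavior described above is usually modeled as a power law: loss falls smoothly as model size grows, but each further gain costs disproportionately more compute. The sketch below uses illustrative constants, loosely inspired by published scaling-law fits, not values from the study.

```python
# Minimal power-law scaling sketch: loss decreases as a power of model
# size. The constants a, alpha and irreducible are illustrative
# assumptions, not fitted values from the study.

def loss(n_params: float, a: float = 1.0e3, alpha: float = 0.076,
         irreducible: float = 1.69) -> float:
    """Illustrative scaling-law loss for a model with n_params parameters."""
    return irreducible + a * n_params ** (-alpha)

# Bigger models reach lower loss, but with diminishing returns.
for n in (1e6, 1e9, 1e12):
    print(f"{n:.0e} params -> loss {loss(n):.2f}")
```

The diminishing returns are why scaling quickly runs into the chip-availability bottleneck the article mentions.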

More efficient AI models can drive up energy use

Dr. Sasha Luccioni, climate and artificial intelligence researcher at Hugging Face, notes in relation to the study that more efficient AI models can lead to them being used more, reports t3n. Overall, AI can thus result in higher energy consumption, says Luccioni. In a recent TED talk, she also noted: “The cloud that the AI models live on is actually made of metal and plastic and runs on huge amounts of energy. And every time you query an AI model, it costs the planet.”

Advertisement
Advertisement

Specials from our Partners

Top Posts from our Network

Powered by This price ticker contains affiliate links to Bitpanda.

Deep Dives

© Wiener Börse

IPO Spotlight

powered by Wiener Börse

Europe's Top Unicorn Investments 2023

The full list of companies that reached a valuation of € 1B+ this year
© Behnam Norouzi on Unsplash

Crypto Investment Tracker 2022

The biggest deals in the industry, ranked by Trending Topics
ThisisEngineering RAEng on Unsplash

Technology explained

Powered by PwC
© addendum

Inside the Blockchain

Die revolutionäre Technologie von Experten erklärt

Trending Topics Tech Talk

Der Podcast mit smarten Köpfen für smarte Köpfe
© Shannon Rowies on Unsplash

We ❤️ Founders

Die spannendsten Persönlichkeiten der Startup-Szene
Tokio bei Nacht und Regen. © Unsplash

🤖Big in Japan🤖

Startups - Robots - Entrepreneurs - Tech - Trends

Continue Reading