‘We are at a turning point’

NOS News

  • Joost Schellevis, editor Tech

Emails, complete papers, sarcastic letters on obscure topics: whatever you can think of, people from all over the world are asking the ChatGPT program to generate it. The AI tool, which composes texts using artificial intelligence, writes them out before your eyes.

The site is generating even more enthusiasm than earlier versions of the text-prediction algorithm, such as GPT-3, which was already impressive. “Language models have improved a lot over the past five years,” explains AI expert Stef van Grieken.

Traditionally, computers do what you tell them to do, but nothing more. Ask a computer to calculate how a rocket can reach the moon, as the first supercomputers did, and it will, but without any creative input of its own. Artificial intelligence, by contrast, mimics human creative intelligence, so that computers can reason for themselves about the best way to carry out a task. This is already useful in practice: in the navigation system of your car, in compiling your Netflix recommendations, or in keeping your car on the road while you drive.

“We are now at an inflection point,” says Van Grieken, who previously worked in Google’s AI department and is now co-founder of the company Cradle, which wants to use artificial intelligence to help biologists design new proteins. “We’re at the point with this kind of artificial intelligence where it’s really starting to amount to something.”

Van Grieken is referring to so-called generative models: artificial intelligence that can generate new things itself. In ChatGPT’s case that means text, but it also works with images, as in the Lensa app, which also went viral recently, and with proteins, as at Van Grieken’s own company.

For him, the impact of AI is not even comparable to that of the launch of the smartphone, for example. “This is better. Generating something yourself is much harder than making predictions, for example, which AI can also do. This will be the future of many knowledge-intensive professions.”

For example, this is how ChatGPT sarcastically describes the invention of sprinkles:

ChatGPT comes from OpenAI, a research institute in which Elon Musk and Microsoft, among others, have invested money. That money is badly needed: “training” an AI system requires a great deal of server power, and therefore investment.

An AI system works by making predictions based on data. The more data that goes in, the better the predictions become, and at some point even complete texts can be predicted. A well-functioning model therefore requires an enormous amount of data.
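
Greatly simplified, that prediction idea can be illustrated with a few lines of Python. The sketch below is an illustration only, not how ChatGPT itself is built (ChatGPT uses large neural networks trained on vastly more text): it counts, in a small made-up sample text, which word tends to follow which, and uses those counts to “predict” the next word.

    # Illustration only: a tiny next-word predictor based on word counts.
    # Real systems such as ChatGPT use neural networks and far more data,
    # but the basic idea, predicting the next word from patterns in data, is the same.
    from collections import Counter, defaultdict

    # A made-up sample text, standing in for real training data.
    training_text = "the cat sat on the mat and the cat slept on the mat"

    # For every word, count which words follow it and how often.
    follow_counts = defaultdict(Counter)
    words = training_text.split()
    for current_word, next_word in zip(words, words[1:]):
        follow_counts[current_word][next_word] += 1

    def predict_next(word):
        """Return the most frequently seen follower of `word` in the sample text."""
        if word not in follow_counts:
            return None
        return follow_counts[word].most_common(1)[0][0]

    print(predict_next("sat"))  # prints "on": "sat" was only ever followed by "on"
    print(predict_next("on"))   # prints "the": "on" was always followed by "the"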

That not only costs a lot of money, but also energy. “OpenAI has built a machine into which all human knowledge has been poured, and that machine has a very large ecological footprint,” says technology critic Marleen Stikker.

A fish is a mammal

Furthermore, it is not clear what data has been fed into the system. “There is no source citation for the information you get out of it. Many things that ChatGPT writes seem plausible, but that does not make them true.” For example, the site told Stikker that she was a member of a political party. “That’s not me at all! It’s called OpenAI, but you can’t verify how it arrives at such a conclusion.”

Olya Kudina, assistant professor at TU Delft and a specialist in artificial intelligence and the ethics of technology, also warns against too much enthusiasm. “It’s impressive: with a single press of a button you get a whole piece of text within a few seconds. But if you dig a little deeper, ChatGPT’s arguments are not necessarily correct.”

Earlier research by OpenAI itself showed that, depending on the circumstances, the system generates incorrect information in many cases. AI tends to “hallucinate” when asked about subjects it actually knows too little about. “For example, you can convince a system that a fish is a mammal, and the system will go along with it,” says AI expert Van Grieken.

Smart assistant

He therefore does not expect artificial intelligence such as ChatGPT to be able to work completely independently any time soon, as some fear it will, making jobs redundant. “In practice, you as an employee will work with a model, and you will still have to handle the quality control yourself.”

Employees could, for instance, get a smart assistant that helps them in their work, such as with programming or by looking things up at lightning speed. Van Grieken is involved in a non-profit hackathon to be held in Amsterdam in February to come up with such applications. “AI will add jobs, not take them over,” he says.
