GPT-3 (or any AI) won’t Kill Coding

This post was originally published by Alberto Romero at Medium [AI]

I’ve tried to rebut some ideas about GPT-3’s threat to coding. Now, I’ll extend the arguments to AI in general. There are three strong reasons programmers don’t need to fear AI that much:

Other paradigms are better suited for some tasks

When I mentioned prompting as a new programming paradigm (software 3.0), I left the other two paradigms implicit: traditional coding (software 1.0) and neural networks (software 2.0). Andrej Karpathy published a viral post some years ago arguing that neural networks should be framed as a new form of software, one better prepared than traditional coding for certain tasks.

I agree with him to some extent. Neural networks have proved very successful at tasks where traditional coding had always fallen short, vision and language in particular. For some problems, directly writing the behavior we want from a program (software 1.0) is the easier route; for others, collecting data as examples of the behavior we want to reproduce (software 2.0) is the go-to solution, as the sketch below illustrates.
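
To make the contrast concrete, here is a minimal sketch of the two paradigms applied to a toy spam filter. The task, data, and code are illustrative (they are not from the original post), and the software 2.0 side assumes scikit-learn is available:

```python
# Software 1.0 vs software 2.0 on a toy spam filter (illustrative only).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Software 1.0: we write the desired behavior directly as explicit rules.
def is_spam_v1(message: str) -> bool:
    banned = ["free money", "click here", "winner"]
    return any(phrase in message.lower() for phrase in banned)

# Software 2.0: we collect labeled examples of the behavior we want
# and let an optimizer find the program (here, classifier weights).
messages = [
    "free money, click here now",
    "you are a winner, claim your prize",
    "meeting moved to 3pm",
    "can you review my pull request?",
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = not spam

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)
model = LogisticRegression().fit(X, labels)

def is_spam_v2(message: str) -> bool:
    return bool(model.predict(vectorizer.transform([message]))[0])

print(is_spam_v1("free money inside"))   # True: matched a hand-written rule
print(is_spam_v2("claim your prize"))    # behavior learned from examples
```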

It’ll be the same with software 3.0. Prompting lets users tackle tasks beyond the capabilities of the previous paradigms, but it won’t be well suited to every situation. Building an operating system, an office suite, a database, or even a program that computes the factorial of a number will still be done with traditional coding, as the sketch below suggests.
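
As a rough sketch of why, here is the factorial example in both worlds. The prompt-based version assumes access to some text-completion API; the commented-out client call is hypothetical, not a real library reference:

```python
# Software 1.0 vs software 3.0 on the factorial task (illustrative only).
import math

# Software 1.0: the exact behavior, written directly. Deterministic,
# cheap, and easy to verify; prompting adds nothing here.
def factorial(n: int) -> int:
    if n < 0:
        raise ValueError("n must be non-negative")
    return math.prod(range(1, n + 1))

# Software 3.0: describe the behavior in natural language and let a
# large language model complete it. The output is plausible text,
# not a guaranteed-correct program.
prompt = """Write a Python function that computes the factorial of a number.

def factorial(n):"""

print(factorial(5))  # 120, every time
# completion = llm_client.complete(prompt)  # hypothetical client call
```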

Other paradigms are less costly

Deep learning costs are often prohibitive. Many companies still rely on non-neural-network machine learning solutions because data collection, cleaning, and labeling alone can cost more than the rest of the project combined.

Even if newer techniques and technologies are faster or more precise, economic cost is always a constraint in the real world. Training GPT-3 cost OpenAI around $12 million. How many companies can afford that? Would you spend a few million dollars to create an AI that writes JSX for you?

Even if the API is free for developers to use, there’s another cost to take into account: the environmental damage to the planet. GPT-3 is so big that training it generated roughly the same carbon footprint as “driving a car to the Moon and back.” Sometimes bigger isn’t better.

Today’s AI has limitations it can’t overcome

Neural networks keep getting smarter every year, but there are tasks that not even the smartest, most powerful network can manage. The uncertainty GPT-3 faces when interpreting written input is unavoidable.

Disembodied AI, which describes almost every AI system to date, can’t access the meaning beyond the words. We can use context to interpret the world around us because we interact with it. We live in the world, and that’s why we understand language: we can link form with meaning, words with the subjective experience they convey.

Neural networks, no matter how powerful, won’t be able to master language the way humans do. As professor Ragnar Fjelland puts it, “as long as computers do not grow up, belong to a culture, and act in the world, they will never acquire human-like intelligence.” And that isn’t happening anytime soon.
