GPT Engineer: Things Are Starting to Get Weird

GPT Engineer is another nail in the coffin of software developers. The tool is crazy powerful: just specify what you want it to build, and then, well, it just builds it. In today's video, I'll show you how it works and share my thoughts on where this technology might lead us.

🎓 Courses:

👍 If you enjoyed this content, give this video a like. If you want to watch more of my upcoming videos, consider subscribing to my channel!

Social channels:

👀 Code reviewers:
- Yoriz
- Ryan Laursen
- Dale Hagglund

🔖 Chapters:
0:00 Intro
0:26 How to use it
6:07 Thoughts
10:00 Outro
#arjancodes #softwaredesign #python

DISCLAIMER - The links in this description might be affiliate links. If you purchase a product or service through one of those links, I may receive a small commission. There is no additional charge to you. Thanks for supporting my channel so I can continue to provide you with free content each week!
Comments

My prediction is that there will be a slump in junior dev hiring at some point, as you said. But as the technology becomes even more ubiquitous, companies will eventually redefine the job specification of a junior dev to reflect the fact that low-level boilerplate code is no longer needed, i.e. coding interviews will involve more architectural and prompt-engineering questions instead of low-level algorithms and data structures.

JChen

It's not that "this will replace programming." It's that this "is what programming will become." Big difference.

harfenueglebarfenuegle

Developing a new application is one thing; dealing with legacy code, sometimes decades old, is another. So when ChatGPT or some other LLM becomes capable of ingesting a Fortran code base with millions of lines of code and rearchitecting it into something like React/Python/C++ with CUDA and zero defects, then we will be in trouble.

gerardorosiles

Looking at the recent onslaught of tools like this (another one I recently saw was pandas-ai), we techies seem to be like a bunch of surfers, all standing on a beach looking out to the ocean and discussing whether the rising horizon is just an excitingly big wave or the mother of all tsunamis heading our way. Some of us are excited about riding it; others soothe themselves by looking for any sign that it isn't as menacing as it seems, pointing out "flaws" they have spotted; some of us nervously brace ourselves for impact; some talk about running for the hills but don't want to look foolish by jumping the gun too quickly. The ones who are responsible for this "rising horizon phenomenon" have all warned us that there is going to be a huge, unprecedented impact, but here we find ourselves (myself included) watching and contemplating.

djl

It seems to me that to use AI effectively, you have to be at least knowledgeable enough about a subject to point it in the right direction, or to ask questions that are pointed, deliberate, and specific. That's going to require a lot of both basic and theoretical knowledge of the subject, as well as the ability to organize your thoughts and, most of all, to be LITERATE. That's something even many college grads seem to lack these days, which is sad. Gone will be the days of just poking at something until it works and hoping for the best.

I think this is going to be a good thing. Humans should be engaging with their work at a higher level and machines should be doing the tedious stuff. If you know what you want, know what's needed, can articulate your thoughts clearly, concisely, and precisely, I really think AI is going to be a good lab partner. But yeah, there's going to be some restructuring and jobs are going to change, many are going to go away at all levels of industry. Change is never easy.

Personal-M.I.S.

My biggest issue with AI isn't the hallucinating or getting stuff wrong; it's the lack of insight into its basis for prediction. It lacks citation features. You could ask it something from a specific book and, even if the answer is correct, if you asked it where the information came from, it can't tell you. Which means it can't tell you the reasoning behind a wrong output either. This is hard for humans too, but we do a good job of logically backtracking through our thought process in order to debug.

jesserigon

My opinion: High quality programming will only get better with AI and will produce more high quality content.

michaelmueller

I had a conversation with my father recently, who was a comp sci major in the late 60s, about this very subject. He doesn't see it as any different from how IT has always developed. When he was younger, he learned to program at the bit level and was very limited in what he could accomplish. Now, no one even considers how a .net function is executed at the bit level. With each progression, some knowledge becomes obsolete and we learn to code at a higher level, which allows us to focus on larger and more complex tasks. I agree with him, and I think the only difference is the pace of development. Technology is now going to be changing at a pace that a four-year college degree program cannot possibly hope to keep up with. I also think that's why you see such a huge surge of coding academies and such, where you can realistically get to a basic functional programming level in a matter of months, not years. This should only be worrying, in my opinion, to people who are not interested in learning new tools and advancing anymore. Maybe time to retire for them.

MMSoapgoblin

I think this will replace developers the same way that standard libraries replaced developers. Which is to say, it could reduce redundant code development. Statistical models like LLMs are only really interesting when they "fail" by predicting novel tokens, which in software means producing code that doesn't work. They can't produce new ideas and they aren't capable of analytical reasoning, so they're never going to be able to create anything new.

Some people say "if you give it a detailed enough prompt". A well specified, highly detailed, exhaustive prompt is often called "source code". You can get MS Paint to produce the Mona Lisa if you give it a detailed enough "prompt".

jasonx

I love the development, but as a professional myself, reading code is harder than writing code. Why does this matter? Unless we start new codebases every month with ChatGPT, what does a ChatGPT codebase look like in a year? I have this weird success rate of 100% on first app iterations, but a relatively low success rate with any technical problems in my real-life codebase issues as tech lead. Also, one of the things that helps me analyze codebases is that you learn the style(s) of the dev(s) that worked on a project, making the codebase easier to read over time. ChatGPT writes in a different coding style all the time unless you address it, but even then it has no personal opinion or sense of self. This means a ChatGPT codebase could, in a year, look like it was built by 100 different developers, with different syntax and conventions, making it hard to read for the uninitiated. Today's legacy might be nothing compared to it.

Rhetorical question: is it even possible to add real-life client features over a long time? And the moment ChatGPT cannot add to a codebase, can a senior jump in and find the errors in 500k ChatGPT-generated lines? If not, is the project then stopped, and a new one started?

xdarrenx

If you’ve ever worked as a software engineer, especially at the senior level and beyond, you’ll know that the real engineering is mostly about design docs, system-level coordination, getting alignment across multiple teams, etc. Frankly, writing actual code becomes the easy part. This is the part of the job I don’t see going away anytime soon. Low-level coding might largely be replaced, but the human part will remain.

mickharthy

I doubt that AI will replace developers anytime soon. In my experience with ChatGPT, I spend more time figuring out what's wrong with a certain piece of code it produces than I would have spent writing the whole thing myself. Take your example with the keys: you had to specify technical details that normal people aren't familiar with, e.g. the possible lengths of WEP keys. And that's even before taking into account that you specified you needed an API, or the sequential object IDs, which are not optimal.
All this stuff is natural for a developer to know and to expect. Other people would be like a Neanderthal in front of a typewriter...

Deadlious

The ones who leverage all this technology will be the ones who are not left behind. This was really cool! I find value in using this to generate a quick skeleton of a project to use as a starting point or reference, and going from there. Building the boilerplate for a FastAPI API, or the boilerplate for DB classes, MVC files, etc., never changes between projects. It gets boilerplatey; this solves all that and takes friction off the plate of the engineer. It leaves the core work and thinking to the human. This technology is amazing, and this is just the beginning. We aren't going anywhere, folks! Do not worry, just like Arjan said. Cheers.

TannerBarcelos

This seems like a perfect opportunity to do rigorous TDD without cutting corners. Rather than having GPT Engineer write some code for you based on requirements and hoping the code works, have it write its tests first based on those requirements. Then have it write code to pass its own tests.
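To make that test-first loop concrete, here is a minimal Python sketch of the workflow the comment describes. The function `slugify` and its requirements are hypothetical, chosen purely for illustration; nothing here comes from the video.

```python
import re

def test_slugify():
    # Step 1: the test is written first, derived from the requirement
    # alone ("turn arbitrary text into a lowercase, dash-separated slug").
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  GPT  Engineer ") == "gpt-engineer"

# Step 2: only then is code written, with the sole goal of passing
# the tests above.
def slugify(text: str) -> str:
    text = text.strip().lower()
    # Collapse every run of non-alphanumeric characters into one dash.
    text = re.sub(r"[^a-z0-9]+", "-", text)
    return text.strip("-")

test_slugify()  # passes silently if the implementation meets the spec
```

Applied to an LLM, the same loop means the generated tests become the machine-checkable version of your prompt: if the model's code fails its own tests, you have an objective signal to iterate on rather than hoping the code works.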

mullingitover

Reminds me of the hype around 3D printing. There were loads of videos explaining how 'soon we won't be buying homewares, appliances, etc.; we would just buy a printer and print them out'. Where has that got to? Yes, an LLM can write SOME code, but anything beyond very entry-level coding is much more about business logic and problem solving than simply writing basic boilerplate code.

richardprocter

So, the entry level job goes from sweeping the floor to looking after the robot that sweeps the floor.

markm

GPT has some uses. But it leads you off into rabbit holes as much as it helps you. It shows potential, but if you think it might replace you as a programmer right now... it's time to go back to school and get that GED you always dreamed of. However, it IS great at giving me the correct regex syntax. So, that's cool.

MarkJoel

You know what GPT Engineer could not do? Teach junior programmers about software design. Thanks a lot, Arjan. We love your channel and your work.

naderbazyari

Good review. I have been wondering a lot about this topic since I am just starting to learn data analysis and, someday, data science. My thoughts are like this: existing development jobs will be completed faster and by fewer people. On the other hand, more applications will emerge (think IoT, blockchain, etc.), more (non-AI-capable) technology will emerge, and more "connections" will be required between different technologies. Software people will be REQUIRED to have a broader view of all technologies, at least to understand how they work together. (I think you mentioned this last point.) I do not know what the final balance will be, but it seems the training or education of software people must become broader: maybe fewer experts in one particular language, and more expertise in understanding interdisciplinary functions?

waynelast

So... we've come full circle on UI. From text to GUI to text again!

alexandrustefanmiron