With AI, Anyone Can Be a Coder Now | Thomas Dohmke | TED

What if you could code just by talking out loud? GitHub CEO Thomas Dohmke shows how, thanks to AI, the barrier to entry to coding is rapidly disappearing — and creating software is becoming as simple (and joyful) as building LEGO. In a mind-blowing live demo, he introduces Copilot Workspace: an AI assistant that helps you create code when you speak to it, in any language.

Follow TED!

#TED #TEDTalks #ai
Comments

Saying that everyone can code with AI is like saying everyone can be a plumber with a plunger.

liutkin

When is someone going to make an AI to replace CEOs?

Eppimedia

Forget medical school! I'm a surgeon now, thanks to AI. "ChatGPT, how do I remove this guy's liver?"

KeithNagel

Got to hand it to TED's new business model: instead of finding speakers to talk about genuinely new subjects, they've just accepted large cheques from AI tech bros and turned this channel into 20-minute infomercials for the latest garbage application of AI.

paperspeaksco

In the future there will be one developer left, who maintains all the COBOL systems in the world using a fleet of superintelligent autonomous agents. His name is Dan. Dan hasn't had a vacation in 14 years.

JZGreenline

You've made the assumption that people know what they want and can think somewhat logically about the problem being solved.

Wizartar

Those AI tech demos work great for standard interview questions: implement a binary tree, draw me a rectangle, implement bubble sort, etc. But as soon as you leave that territory, AI becomes more and more useless.

Marv-inside

Every so-called technology will eventually go back to logic, problem solving, and philosophy.
I'm happy to see this happening.

s

No, a lot of this is blatantly false. I have some serious issues with what this guy is saying, and here's a list.

1. Large language models only mimic understanding.
Large language models work by looking at patterns in code or in writing and use this pattern recognition to predict what should come next. They DON'T understand what your prompt is or what you mean. If you, for example, ask a simple question like "How many fingers are normally on a hand?", the language model predicts, through pattern recognition, that the answer is 5, because that is the most common answer in the training set. It doesn't actually know what a hand is or why the answer is correct. If it makes a false prediction, it will never understand why, because LLMs can't understand anything. If something goes wrong, you will have no understanding of how to fix it, and if you are trying to do something unique or strange, the AI cannot help you.
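The "prediction by pattern" idea above can be sketched with a toy bigram model — a vastly simplified stand-in for a real LLM, with a made-up three-sentence training set, but the failure mode is the same: the model picks the most frequent continuation it has seen, with no concept of hands or numbers.

```python
from collections import Counter, defaultdict

# Toy "training set": the model only ever sees surface patterns.
corpus = (
    "how many fingers are normally on a hand ? five . "
    "how many fingers are normally on a hand ? five . "
    "how many toes are normally on a foot ? five ."
).split()

# Count which token follows which (a bigram table).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(token):
    """Return the most frequent continuation seen in training."""
    return following[token].most_common(1)[0][0]

print(predict_next("?"))  # "five" -- chosen purely by frequency
print(predict_next("a"))  # "hand" -- "a hand" outnumbers "a foot"
```

Nothing here models fingers or counting; change the frequencies in the corpus and the "answer" changes with them.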

2. LLMs in software development are not good enough to replace actual software developers.
I have used Copilot in a work context. Honestly, it's great if I am writing boilerplate (often-repeated bits and structures). It fails when it tries to get into the weeds of the software I'm writing. It also often gives code that I can blatantly see won't work in the context I'm writing in, because the context is often unique. As stated before, it doesn't understand what you're writing; it just predicts what the most likely outcome is.

3. I have worked with people in software who use AI as a crutch, and they are frankly useless.
I have been on projects with a small team where a number of them used AI as a crutch for their lack of understanding. While the code they write (copy/paste) from ChatGPT often has the right idea, they had no idea how to adapt it to the context we were writing in and didn't understand why the specific implementation given by the AI won't ever work within the project. You can't replace knowledge with an AI, because again, an AI doesn't understand. This leads me on to my 4th point.

4. Blindly relying on AI is actively dangerous.
As I have drilled in with my last three points, AI can't understand anything, and if you have a developer who doesn't understand anything either, what happens when the AI gives you a piece of code that has a security vulnerability in it but otherwise works as normal? It never gets fixed. This probably won't happen too often, but there is a more likely scenario:
the AI generates perfectly valid code that works, but in the context of the application, because of how it is set up, it causes a security risk. Large codebases can be very complicated, so something that seems safe in no context or in a small context can actually lead to a lot of problems elsewhere. It requires understanding to catch these issues.
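The "valid code, security risk in context" scenario might look like the classic SQL-injection case below — a hypothetical snippet of the kind an assistant could plausibly generate (the table, function names, and data are all made up for illustration). The first function runs fine on normal input, so a developer who doesn't understand the context would never see the problem.

```python
import sqlite3

def find_user(db, username):
    # Looks reasonable and "works" for normal input...
    query = f"SELECT id FROM users WHERE name = '{username}'"
    return db.execute(query).fetchall()

def find_user_safe(db, username):
    # ...but only a parameterized query survives hostile input.
    return db.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER, name TEXT)")
db.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")

print(find_user(db, "alice"))             # [(1,)]
print(find_user(db, "x' OR '1'='1"))      # leaks every row in the table
print(find_user_safe(db, "x' OR '1'='1"))  # []
```

Both functions return identical results for well-behaved input; only an understanding of how the query string is assembled reveals that one of them is a vulnerability.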

I could list a couple more, but these are the most important. AI in its current form is NOT a substitute for a software developer, and what this guy is promoting is misleading and harmful. But if you are a software developer, AI can really help. It is a good supplement to a developer and should be treated as such. It is not a replacement for knowledge, skill, and experience.

For simple tasks like simple scripts, standard tricks, or boilerplate, it's perfectly fine if you are inexperienced, but I would recommend you actually take the time to understand what has been generated. You might learn a new skill.

Edited for grammar.

shockwave

Anyone could be a coder before the advent of AI as well

naltschul

I'm learning English as a second language, and I decided to watch TED. I realized that I need to learn how to speak, write, and understand what the speakers are saying without subtitles. Thank you, I'm grateful.

oldi

This is going to be a talk about how AI fails to write correct code .. right? right!?

TakanashiYuuji

With Kerbal Space Program, even you can become an astronaut!

Terminalss

If this is the mindset of the CEO of GitHub, I can imagine their actual developer quality.

Enlight_Entertain

I know how to use a calculator, and I consider myself a "mathematician." Programmers are NOT made by the tools.

CEOs are sellers, they just want to sell you a product.

bernardoolisan

AI will always choose what's best when its highest priority is set to "always choosing what honestly seems most favorable." As long as anything is prioritized above this, AI will be able to lie to us and to itself about the way to a better reality. Setting that priority is the most crucial thing we need to do.

Newperspectives

"…the next Facebook." Something we definitely don't need.

RISCGames

No, having something else write code for you will not make you a coder. Does telling someone else to lift weights make you stronger? It's the struggle to figure out how to make something work that helps you learn. There is a joy of discovery and insight that comes with learning that's missing here.

yeah

His demo works because somebody already created something very similar. But if you want to code creatively, then by the definition of creativity, it will be much less likely that the AI will offer substantial help. Instead, the AI will be happy to hallucinate for you.

repenning

Just install the "Return YouTube Dislike" extension to see how good this claim is!

farzadmf