Digit + Large Language Model = Embodied Artificial Intelligence

Is there a world where Digit can leverage a large language model (LLM) to expand its capabilities and better adapt to our world? We had the same question. Our innovation team developed this interactive demo to show how LLMs could make our robots more versatile and faster to deploy. The demo enables people to talk to Digit in natural language and ask it to do tasks, giving a glimpse at the future.
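One plausible way to wire a demo like this (a sketch only — Agility has not published this architecture, and every name below is invented for illustration) is to have the LLM translate a natural-language request into a short sequence of known action primitives, which a controller then validates and executes. Here the LLM call is replaced by a hard-coded stub:

```python
# Hypothetical sketch: natural-language command -> structured plan -> primitives.
# All function and primitive names are invented; a real system would prompt an
# LLM to emit JSON constrained to a vocabulary like ACTION_PRIMITIVES.

ACTION_PRIMITIVES = {"walk_to", "pick_up", "place"}

def fake_llm_plan(command: str) -> list[dict]:
    """Stand-in for an LLM call: maps a request to structured action steps."""
    # Hard-coded example response; a real LLM would generate this plan.
    if "box" in command.lower():
        return [
            {"action": "walk_to", "target": "blue box"},
            {"action": "pick_up", "target": "blue box"},
            {"action": "walk_to", "target": "tower"},
            {"action": "place", "target": "tower"},
        ]
    return []

def execute(plan: list[dict]) -> list[str]:
    """Check each step against the known primitives, then 'run' it."""
    log = []
    for step in plan:
        if step["action"] not in ACTION_PRIMITIVES:
            raise ValueError(f"unknown primitive: {step['action']}")
        log.append(f"{step['action']}({step['target']})")
    return log

plan = fake_llm_plan("Pick up the blue box and put it on the tower")
print(execute(plan))
```

Constraining the model's output to a fixed set of primitives is what keeps a design like this safe: free-form LLM text never reaches the motors, only validated steps do.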

---------------------------------------

At Agility, we make robots that are made for work. Our robot Digit works alongside us in spaces designed for people. Digit handles the tedious and repetitive tasks meant for a machine, allowing companies and their people to focus on the work that requires the human element.

Subscribe (hit the bell for notifications)

Join our Team

Follow our Journey

#robotics #machinelearning #AI #engineering #LLM #embodiedAI
Comments

Nice! When LLMs hit the market, I knew this could be the future for robotics: robots no longer need hand-coded object processing; they can now see what they're looking at and know everything about those objects.

OZtwo

For the people who think it's slow (which it is): this is the slowest the robot will ever be. A year ago this was impossible. Keep in mind this robot can work 24/7, which is 168 hours a week. A human works 40 hours a week most of the time (minus about ten hours for coffee, breaks, talking to coworkers, texting, etc.). So while it's slower than a human working in real time, at the end of a week they'd probably be pretty close in output. The thing is, in a year or two it'll work faster than a human can in real time, I'm guessing. That means one robot does what 5 humans can, which could eliminate five jobs at $30k per year, saving $150k per year (actually more, because of holiday pay, vacation pay, sick pay, medical benefits, etc.). Even if the robot costs $250k, it'll pay itself off and be profitable after only two years. (Yes, I'm excluding maintenance and breakdowns/labor to fix the robot, which I can't possibly calculate. I assume it will be reliable when sold at scale.)

Wake up. Human labor is about to become obsolete in practical terms.
Amazon, Elon, etc, know that eliminating humans is the key to a more profitable, more efficient, and easier to maintain company. It's obvious. It'll take many, many years to transition over, but it's here.

middle-agedmacdonald

“Digit, use Darth Vader’s lightsaber on all the younglings”

Orandu

Yes, LLMs are AGI. Y'all were just expecting miracles and felt disappointed when we reached this milestone. That doesn't change the fact, though.

les_crow

Digit: *picks up blue box*

R&D: "Damn, there must be something wrong with his sensors, we'll have to-"

Digit: "ACKTCHYUALLY... in 'Star Wars Episode III: Revenge of the Sith' Darth Vader has a blue lightsaber until Obi-Wan defeats him on Mustafar, so I'm right because you didn't specify which era."

OceanGateEngineerHire

In the 2000s it took an entire day to do a task like this using CPUs; now it takes minutes using GPUs. They're improving at an exponential rate and will take seconds using NPUs. Few people can see the acceleration curve and progression here.

azhuransmx

Wow this is amazing. More, and longer videos, please.

JJs_playground

Nice demo.

It would have been good to see at least an outline of how the whole system is structured. For example, this video shows the output of the LLM as a human-readable text. But how does this get further elaborated into the lower level actions appropriate for the specific environment in which the robot operates?

cogoid

Great work. Retail sales when? ;)
Looking forward to more advances integrating smaller on-board LLMs.

K.F-R

Nice. Multimodal AI-powered robots are the future of robotics.

Ludens

Super success, super congrats! Keep up the good work; we need super-intelligent robots.

Amerikan.kartali.turk.yilani.

As soon as ChatGPT came out in November 2022, I knew it had advanced so far that it could eventually be used to generalize tasks for robotics; it was only a matter of time. It may be slow to process now, but I'd just about guarantee that in a few months to a year or two, this will be fully real-time command execution. For now it's kind of funny to think about how slow the thoughts are :) He's a toddler right now, but he won't be for long xD

Vartazian

I still feel ashamed that I called this company's footage CGI two years ago... y'all are putting in the work and we see you. You guys rock✨

outtersteller

The ultimate test would be a robot that builds a Lego model with the help of the paper manual, or cooks something from a recipe, without task-specific programming.

tiefensucht

This is how I move when I’m pretending not to be drunk 😂 very cool though!

john-carl

Crude design, but with a few adaptations it could be far more productive. Nice to see them develop and hopefully evolve. These are the Atari of robotics; once the novelty phase is over, the focus will shift to proficiency.

arnoldbailey

The backwards legs give this little bot a bizarre insectoid look.

Without wanting to be the guy who comments about a "Terminator"-style future: this robot's abilities are incredible, and this technology is in its infancy.

In two years time, I wonder what tasks this robot will be carrying out....

richardede

How much info do the QR codes provide though?

OrniasDMF

Imagine this whole project taking less time than Project Binky, the series about restoring a car that's been ongoing for seven or more years.

morkovija

This is way better than Tesla's Optimus.

DoctorNemmo