Cracking the code of body language to build lifelike robots


Comments

Great work, Ben! Super interesting video!

TechByZach

Please continue making these types of videos!

ghastlyqwert

I love this topic! Deep learning is going to change everything.

Soooooooooooonicable

This is a nice, interesting video, guys. Hope to see more like this in the future. Do something on Alphabet's X labs.

smadriel

Ah, Kinect. Sometimes I feel like the only person who wanted one with their Xbox. Glad to see someone get some use out of the device. Hope MS doesn't give up on it.

kenshuei

If we manage to break through on this elaborate code of body language, the first breakthrough won't be cracking the body language of humanity in its entirety, but the body language of Americans first, because the machine learning system is trained mostly by American citizens and its research facility is in the US. Despite having no strong evidence or background in psychology, I'm pretty sure culture and environment play a big part in how we perform body language. So there's still a long way to go after all.

mancerrss

Great video! Keep creating great content.

PratyushPrkash

Wow, someone actually found a use for Kinects

jinmingliang

1:16 The only reason why *Kinect* exists! XD

arnaumolas

5:58 Who was the idiot at Google who, after watching 2001: A Space Odyssey, decided to teach computers to read lips?

Frexican

It should be taken into account that the human brain is very good at spatial reasoning and spatial improvisation. Research games like Foldit have shown that the human brain is better at finding more efficient spatial "sinks" for proteins when they need to be folded into the lowest energy well possible. So in layman's terms: our brains understand the "tone" a gesture is sending. They may even know how that tone changes based on the three-dimensional coordinates that change over time, and on the relative speed and direction, or vectoring, those gestures perform. We humans just can't transliterate the meaning in real time yet, mainly because our wetware doesn't generate a rigid linguistic analog for these gestures and their relative coordinates in space.

Zipo

I consciously choose my body language, so I feel half the things you said were false. I tilt my head because I'd rather not look at you; I scratch because something is bothering me, or perhaps I'm having trouble understanding; and I lean back to make myself comfortable. It doesn't take a genius. Here's some more information to propel the tech:

I put my hand to my chin because it's heavy; I'm probably slouching.
I don't do the hand motions when I speak because that's a little dumb, but they add emotional energy to the words, and the emotional wavelength depends on how high or low the hands are: shrugging usually has hands at the bottom, and face palms are at the head. I'm teetering on insults, but it comes from a hatred of inaccuracy and falsity.

cevxj

Great video, bad title. It should be more like this: here's a video of a few guys at a university talking about tracking body language while exaggerating their own, even though they describe it as genuine and not at all on purpose. Also, nice socks. The right-leg movement after the compliment at 5:04 is genuine.

andremartins

Robots won't have human-like body language without human-like intelligence.

busTedOaS

So for AI, why don't we let it listen to millions of hours of human conversation and generate potential responses, then tell it which responses are more appropriate? Except the audio can't be from movies or shows, because those aren't a good representation of real human interaction.
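The idea in this comment (generate candidate responses, then learn from human judgments of which are appropriate) can be sketched minimally. This is only an illustration of the general approach, not anything from the video; all names, the toy bag-of-words scorer, and the tiny feedback data are hypothetical.

```python
from collections import Counter

def featurize(response: str) -> Counter:
    """Bag-of-words features for a candidate response (toy illustration)."""
    return Counter(response.lower().split())

def train_scorer(labeled):
    """Learn word weights from (response, is_appropriate) human feedback."""
    weights = Counter()
    for response, appropriate in labeled:
        sign = 1 if appropriate else -1
        for word, count in featurize(response).items():
            weights[word] += sign * count
    return weights

def score(weights, response: str) -> int:
    """Higher score = more like the responses humans approved of."""
    return sum(weights[w] * c for w, c in featurize(response).items())

# Hypothetical human feedback on generated responses.
feedback = [
    ("nice to meet you", True),
    ("go away", False),
    ("good to see you", True),
    ("leave me alone", False),
]
weights = train_scorer(feedback)

# Rank new candidate responses by the learned score.
candidates = ["nice to see you", "go leave me"]
best = max(candidates, key=lambda r: score(weights, r))
print(best)  # → "nice to see you"
```

Real systems replace the bag-of-words scorer with a learned model, but the loop is the same: generate candidates, collect preference labels, reuse them to rank future responses.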

KHudso

omg 3:14 it's Charlie Sheen 20 years ago

rubascasalvarez

What's more worrying than a Terminator apocalypse is the replacement of traditionally human-exclusive service jobs with avatar/automata replacements.

seeranos

a human face should never be used on a robot, creepy af

shaozhe

Honestly thought Lok Cheung was in the video at 1:00 min.

joepphoto

damn, that Einstein robot freaks me out

xmrntx