Rapid Prototyping: XR Hand Tracking for Retail

I like to do tech prototypes to keep my skills sharp, so for Spring 2023 I focused on exploring XR headset hand tracking (specifically with the Meta Quest "Interaction SDK"). There are three sets of prototypes: Retail, Home, and Games (this one is Retail).
Here's the one-line goal of this prototype:
"Using natural gestures, let a user interact with photorealistic looking shoes and showcase what these shoes might actually look like in a variety of athletic situations."
I chose photogrammetry to produce the shoe models, using some very clean Nike shoe scans (provided by Apple) and some less-than-clean shoe scans (gratefully provided by my wife). The photogrammetry was done about as inexpensively as you can: a simple backdrop, a $10 cake turntable, several soft lights to minimize shadows, and a macOS command-line tool (via the Object Capture API in Swift). Luckily, shoes are one of those items that lend themselves very well to photogrammetry.
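For anyone curious what that command-line tool roughly looks like: here's a minimal sketch of a RealityKit Object Capture pipeline. The folder paths and the reduced detail level are my assumptions for illustration, not necessarily the settings used here.

```swift
import Foundation
import RealityKit

// Assumed paths: a folder of turntable photos in, a USDZ model out.
let input = URL(fileURLWithPath: "shoe-photos", isDirectory: true)
let output = URL(fileURLWithPath: "shoe.usdz")

let session = try PhotogrammetrySession(input: input)

// Watch the session's async output stream for progress and completion.
Task {
    for try await message in session.outputs {
        switch message {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .processingComplete:
            print("Wrote \(output.path)")
            exit(0)
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
            exit(1)
        default:
            break
        }
    }
}

// Kick off reconstruction; .reduced detail keeps the mesh and textures
// light enough for real-time use in a game engine.
try session.process(requests: [.modelFile(url: output, detail: .reduced)])
RunLoop.main.run()
```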
To visualize how the shoes would look during various real-world activities, I used a human model with motion capture data and set up a big animation controller in Unity. I removed the original shoes from the model and dynamically replace them with whatever shoe the user selects. For animation selection, I set up curved UI surfaces (in VR, these can be easier to read than a simple flat UI).
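The shoe swap can be done by hiding the meshes baked into the mocap model and parenting the scanned shoe prefab to the foot bones, so it follows every animation. Here's a minimal C# sketch of that idea; the field names, prefab setup, and the negative-scale mirror trick (which flips mesh winding, so test it) are all my assumptions, not the actual project code.

```csharp
using UnityEngine;

public class ShoeSwapper : MonoBehaviour
{
    [SerializeField] private GameObject[] originalShoeMeshes; // shoes baked into the mocap model
    [SerializeField] private Transform leftFootBone;
    [SerializeField] private Transform rightFootBone;

    private GameObject leftShoe, rightShoe;

    public void SelectShoe(GameObject shoePrefab)
    {
        // Hide the model's built-in shoes.
        foreach (var mesh in originalShoeMeshes)
            mesh.SetActive(false);

        // Remove any previously selected pair.
        if (leftShoe != null) Destroy(leftShoe);
        if (rightShoe != null) Destroy(rightShoe);

        // Parent the photogrammetry scans to the foot bones so they
        // follow whatever clip the animation controller is playing.
        leftShoe = Instantiate(shoePrefab, leftFootBone, false);
        rightShoe = Instantiate(shoePrefab, rightFootBone, false);
        rightShoe.transform.localScale = new Vector3(-1f, 1f, 1f); // mirror for the right foot
    }
}
```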
Interaction SDK Overview
Apple Object Capture API
Game Engine
Music