Razer Hydra for low-cost 3D displays

The Razer Hydra is an electromagnetic six-degree-of-freedom (6-DOF, position and orientation) tracked input device consisting of a base station and two tracked handles, each featuring several buttons and a joystick.
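As a rough illustration of what that 6-DOF handle state looks like in data terms, here is a minimal sketch; the names are illustrative and this is not the actual Razer Hydra SDK or Vrui device driver interface:

    // Illustrative sketch of 6-DOF handle state; not the actual Hydra SDK or Vrui API.
    struct HydraHandleState
    {
        float position[3];    // 3D position relative to the base station
        float orientation[4]; // orientation as a unit quaternion (x, y, z, w)
        unsigned int buttons; // bit mask of the handle's buttons
        float joystick[2];    // analog joystick deflection, each axis in [-1, 1]
    };

    struct HydraState
    {
        HydraHandleState handles[2]; // the two tracked handles
    };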

While most often used as a mouse or joystick emulator to work with existing games, the device offers very good usability for 3D applications when its native capabilities are used directly. This video shows how the Razer Hydra can be used as the main input device for a low-cost 3D display.

The software shown in the video is the Nanotech Construction Kit, based on the underlying Vrui VR Development toolkit running on Linux.

More information:
Comments

Always fun to see the Hydra in new applications. Nice work, man.

razer

Terrific. Can't wait to see what you'll do with the Oculus Rift - Palmer Luckey seems to be slowly but surely addressing most concerns people had?

JamesCorbett

We have one of those, too, but it's pretty much the same. While the glasses are nice and the stereo quality is good, the problem with passive-stereo 3D TVs is that they halve the image's vertical resolution: they use an interlaced scheme where the odd pixel rows go to the left eye and the even pixel rows go to the right eye.
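For illustration, here is a sketch of how such a row-interleaved frame is composed (a minimal example, not code from the software shown in the video); each eye ends up with only half of the rows:

    // Sketch of row-interleaved stereo composition for a passive 3D TV.
    // Assumes grayscale images stored row-major; illustrative only.
    #include <cstddef>
    #include <vector>

    std::vector<unsigned char> interleaveStereo(const std::vector<unsigned char>& leftEye,
                                                const std::vector<unsigned char>& rightEye,
                                                std::size_t width, std::size_t height)
    {
        std::vector<unsigned char> frame(width * height);
        for (std::size_t row = 0; row < height; ++row)
        {
            // Odd rows carry the left-eye image, even rows the right-eye image,
            // so each eye effectively sees only height/2 rows.
            const std::vector<unsigned char>& src = (row % 2 == 1) ? leftEye : rightEye;
            for (std::size_t col = 0; col < width; ++col)
                frame[row * width + col] = src[row * width + col];
        }
        return frame;
    }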

okreylos

They're basically completely different things. The Wiimote can detect acceleration along any axis, which is a way to detect 3D gestures, and the Wiimote+ can also detect rotations. Together, those two can give you orientation tracking. The Hydra detects absolute position and orientation in 3D space. The Wiimote can only do that after heavy modification, or as a part of a larger tracking system.

okreylos

That's actually a very good comment. The problem is that to use the Hydra to its full potential, one has to punch through the user interface layers of the target application. Something closed-source like, say, Maya will only have interfaces for mice and maybe spaceballs, and emulation just doesn't cut it. Blender might work, because access to the source code means that someone could create a native interface for the Hydra.

okreylos

Interesting and fascinating as always.

Did you order an Oculus Rift developer kit? I'd love to see what you can do with it.

omarcortes

I like how technical and science-oriented everything is, including choosing to build a buckyball with carbon atoms. Yet there was still room to include a lightsaber.

laremere

Dude, you are the best hacker/modder/whatever it's called out there! I love your vids.

imbapwnnb

Oh, and I sure hope that 3D TVs will become more popular, for the simple reason that they are extremely useful to us, and I don't want them to go away.

okreylos

The problem is the speed of light. To get timing-based triangulation at 1 mm accuracy, your clocks need to be synchronized to within 3.33 picoseconds.

However, replace radio with ultrasound, and that's exactly how the high-end InterSense IS-900 tracking system works. You have to measure the 3D positions of the receivers (senders, actually) very precisely using a survey laser range finder, but that's how it's done.
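For reference, the 3.33 picosecond figure is simply the 1 mm accuracy target divided by the speed of light. A quick sketch of that back-of-the-envelope calculation:

    // Back-of-the-envelope check: how long light takes to travel 1 mm.
    #include <cstdio>

    int main()
    {
        const double c = 299792458.0;  // speed of light in m/s
        const double accuracy = 0.001; // desired positional accuracy: 1 mm, in meters
        double t = accuracy / c;       // required clock synchronization in seconds
        std::printf("%.2f picoseconds\n", t * 1.0e12); // about 3.34 ps (3.33 ps with c = 3e8 m/s)
        return 0;
    }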

okreylos

I should mention that this is not just a single application. The Hydra is supported by the Vrui VR infrastructure, which means it can be used exactly like this in all Vrui applications (3D Visualizer, LiDAR Viewer, Crusta, Mesh Viewer, Nanotech Construction Kit, ProtoShop, etc.), and even in the "toy" applications like Quake 3, Doom 3, and Descent.

okreylos

You should try this with an LG Cinema 3D TV, which uses completely flicker-free passive polarized glasses. It's in a totally different league with regard to the 3D image it creates; the image literally pops out of the screen, right in front of you!

awaken

A very interesting question. There's a lot of neuroscience research on "body image plasticity" (forgetting the exact phrase right now), which is exactly this: the idea that the brain can adjust to parts of your body being somewhere other than where they really are. But I think that effect takes longer to take hold than the more conscious process that users of this system go through. Good read: "Phantoms in the Brain" by VS Ramachandran, and anything by Oliver Sacks, really.

okreylos

As always, great video. Please keep up the work :)

jojodi

There is a trade-off between lag and noise. The unfiltered measurements from the Hydra are very jittery, and I am using a variable low-pass filter to remove that jitter. The filter gets more aggressive as the handles move away from the base station (as noise is proportional to that distance).

You can reduce filter strength in the configuration file to get better response time, at the cost of some jitter.

Or, you can bring the base station closer to where you will typically hold the handles.
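For illustration, a distance-dependent low-pass filter can be as simple as exponential smoothing whose weight shrinks with distance from the base station. This is a minimal sketch with made-up constants, not the actual driver code or its configuration values:

    // Sketch of a distance-dependent exponential low-pass filter for one coordinate.
    // Constants are illustrative; not the actual Vrui/Hydra driver or its config values.
    #include <algorithm>

    class DistanceLowpass
    {
    public:
        // raw: new raw measurement; distance: handle's distance from the base station in meters
        double update(double raw, double distance)
        {
            if (!haveValue)
            {
                filtered = raw;
                haveValue = true;
                return filtered;
            }

            // Light filtering close to the base station, heavy filtering far away,
            // because measurement noise grows with that distance.
            double alpha = std::clamp(1.0 - 0.5 * distance, 0.1, 1.0);
            filtered = alpha * raw + (1.0 - alpha) * filtered;
            return filtered;
        }

    private:
        double filtered = 0.0;  // current filtered value
        bool haveValue = false; // has the filter seen a measurement yet?
    };

Raising the minimum smoothing weight in a scheme like this corresponds to reducing the filter strength in the configuration file: less lag, more jitter.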

okreylos

There is a new 3D display just like the Leonar3Do, the zSpace. It has a very similar tracked input device, and head tracking. I'm currently talking to that company to support it natively in our VR infrastructure.

There's no pricing information on their web site yet.

okreylos

The alternative is to do the reverse, port Blender to the Vrui infrastructure, or develop a Blender-like application from scratch. That way Hydra support comes for free, and it would even run on more esoteric or high-end VR environments like CAVEs and such. We already have a simple 3D modeling application, but of course it's still a far cry from full-featured programs like Blender.

okreylos

Leonar3Do has head tracking, which creates "holographic" displays where virtual objects appear solid. That's the same as our next level up in low-cost VR, combining a 3D TV with an optical tracking system. I'm still working on a low-cost head tracker to add the same degree of realism to the type of environment shown here.

Head tracking adds significantly to the usability of a 3D display.

okreylos

... or one could make a better filter, such as one that reacts quickly to large changes in position or orientation and filters small motions more aggressively. That's work left to do.
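For illustration, a minimal sketch of that kind of speed-adaptive smoothing, with made-up constants and not tied to any existing filter in the driver:

    // Sketch of speed-adaptive smoothing: the smoothing weight rises with how fast
    // the measurement is changing. Constants are illustrative.
    #include <algorithm>
    #include <cmath>

    // raw: new measurement, filtered: previous filtered value, dt: sample interval in seconds (> 0)
    double adaptiveLowpass(double raw, double filtered, double dt)
    {
        double speed = std::fabs(raw - filtered) / dt; // how fast the value is moving
        // Slow changes (jitter) get a small weight and are smoothed aggressively;
        // fast, large changes get a weight near 1 and pass through with little lag.
        double alpha = std::clamp(0.05 + 0.1 * speed, 0.05, 1.0);
        return alpha * raw + (1.0 - alpha) * filtered;
    }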

okreylos

Yes, my software has non-linear tracking correction already built in. The problem is that it's impractical for end users to do the required calibration steps. You'll need some external distortion-free absolute 3D measurement device to capture tie points between distorted and undistorted measurements. I would do that if I had to use Razer Hydras in a professional setting, but for the scope of the system I'm demonstrating here it's not applicable.
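For a rough idea of what tie-point-based correction could look like (an illustrative sketch with assumed data structures, not actual calibration or correction code): store pairs of distorted tracker readings and externally measured ground-truth positions, then correct new readings by interpolating the stored offsets, here with simple inverse-distance weighting:

    // Sketch of tie-point-based distortion correction via inverse-distance weighting.
    // Illustrative only; the data structures and weighting scheme are assumptions.
    #include <vector>

    struct Point { double x, y, z; };

    struct TiePoint
    {
        Point distorted; // position as reported by the (distorted) tracker
        Point truth;     // same position measured by a distortion-free external device
    };

    Point correct(const Point& p, const std::vector<TiePoint>& ties)
    {
        // Average the per-tie-point offsets, weighted by inverse squared distance
        // from the query point to each tie point's distorted position.
        double wSum = 0.0, ox = 0.0, oy = 0.0, oz = 0.0;
        for (const TiePoint& t : ties)
        {
            double dx = p.x - t.distorted.x, dy = p.y - t.distorted.y, dz = p.z - t.distorted.z;
            double w = 1.0 / (dx * dx + dy * dy + dz * dz + 1.0e-6); // epsilon avoids division by zero
            wSum += w;
            ox += w * (t.truth.x - t.distorted.x);
            oy += w * (t.truth.y - t.distorted.y);
            oz += w * (t.truth.z - t.distorted.z);
        }
        if (wSum == 0.0)
            return p; // no tie points: return the measurement unchanged
        return Point{p.x + ox / wSum, p.y + oy / wSum, p.z + oz / wSum};
    }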

okreylos