Unity 6 & XR Toolkit 3.0: Features You Can't Ignore as a VR/XR/AR Developer

Discover the latest updates in Unity 6 and the XR Interaction Toolkit 3.0! 🚀 Unity 6, now in preview, brings a host of powerful features to VR developers: the Foveated Rendering API, which boosts performance by rendering your peripheral vision at lower detail; Adaptive Probe Volumes for dynamic lighting; and the game-changing Composition Layers for crisp, clear visuals. Unity 6 is packed with tools designed to save you time and boost your projects.
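As a rough idea of how foveation can be driven from script, here is a minimal sketch; it assumes Unity 6's `XRDisplaySubsystem` foveation properties and an XR provider that supports them, and the `FoveationSetup` class name is just an example:

```csharp
// Minimal sketch: raising the foveated rendering level at runtime.
// Assumes an active XR display subsystem (e.g. OpenXR on a standalone headset).
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class FoveationSetup : MonoBehaviour
{
    void Start()
    {
        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetSubsystems(displays);

        foreach (var display in displays)
        {
            // 0 = off, 1 = strongest foveation.
            display.foveatedRenderingLevel = 1.0f;

            // Allow eye-tracked (gaze-based) foveation where the runtime supports it.
            display.foveatedRenderingFlags =
                XRDisplaySubsystem.FoveatedRenderingFlags.GazeAllowed;
        }
    }
}
```

Because this requires a live XR display, it only has an effect on-device or in an XR-enabled Play Mode session.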

The XR Interaction Toolkit 3.0 introduces significant improvements, like the new input readers built on the Input System, making VR development smoother and more efficient. Learn about unique hand gestures and advanced interaction models like the Near-Far Interactor for seamless VR interactions. Enhance your VR experience with upgraded climb teleportation and improved rendering capabilities.
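For a flavor of the toolkit's interaction model, here is a minimal sketch of reacting to a grab in code. It assumes XRI 3.x, where interactables live in the `UnityEngine.XR.Interaction.Toolkit.Interactables` namespace; `GrabLogger` is a hypothetical example component:

```csharp
// Minimal sketch: logging whenever an XRGrabInteractable is grabbed.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.Interactables;

public class GrabLogger : MonoBehaviour
{
    [SerializeField] XRGrabInteractable m_Interactable;

    void OnEnable()  => m_Interactable.selectEntered.AddListener(OnGrab);
    void OnDisable() => m_Interactable.selectEntered.RemoveListener(OnGrab);

    void OnGrab(SelectEnterEventArgs args)
    {
        // args.interactorObject is whichever interactor (near, far, hand) grabbed us.
        Debug.Log($"Grabbed by {args.interactorObject.transform.name}");
    }
}
```

The `selectEntered`/`selectExited` events carry over from earlier XRI versions, so listeners like this survive the 3.0 upgrade largely unchanged.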

Stay ahead in VR development with these essential updates in Unity 6 and XR Interaction Toolkit 3.0! 🌟


🦐Who's El Shrimpo?🦐
I'm Fist Full of Shrimp! I make the most shrimple tutorials about Unity, Game Development and VR Development! My Unity tutorials are beginner friendly, but also tend to get into more details that can be forgotten. I do my best to make my tutorials as quick and clear as possible, so if you've found them helpful so far, consider liking and subscribing for more!

#unity #gamedev #vr
Comments

Hey, I'm one of the developers of XRI! Really happy to see you talk about our package in this video.

A few thoughts and answers to your questions:

- Why isn't XR Hands a part of XRI? XRI is meant to be an optional package that lets devs build rich interactions for their XR apps, but it's not a core input package the way the OpenXR package is. The idea is that whether or not you use XRI, you should still be able to use hand tracking.

- Input readers: your example showed referencing interaction events, which was supported before. The new readers are meant to replace the action-based controller, where you had to decide on all the input actions you'd ever need, and the interactors would have to read their transform and input values from it. That was problematic because, notably with hands, you'd often have different poses for different interactors, and you rarely needed all input actions for all interactors. The new approach makes it a lot more modular.

- Near-Far Interactor: a huge benefit of the new design is that casting is modular, so it's easy to implement new strategies for getting valid targets without having to override the whole interactor. We also pulled out attach transform manipulation and added a push/pull gesture. It should be easier than ever to implement custom behavior, like the gravity glove mechanics from Half-Life: Alyx, just by making a custom attach controller.
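The input-reader change described in the list above can be sketched roughly like this; it assumes XRI 3.x's `XRInputButtonReader` type, and the `TriggerProbe` component and "Trigger" binding name are illustrative, not part of the package:

```csharp
// Minimal sketch: a behaviour that owns only the one input reader it needs,
// instead of pulling every action from a monolithic action-based controller.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.Inputs.Readers;

public class TriggerProbe : MonoBehaviour
{
    // Bind this to an input action (or a manual value for testing) in the Inspector.
    [SerializeField]
    XRInputButtonReader m_TriggerInput = new XRInputButtonReader("Trigger");

    void OnEnable()  => m_TriggerInput.EnableDirectActionIfModeUsed();
    void OnDisable() => m_TriggerInput.DisableDirectActionIfModeUsed();

    void Update()
    {
        if (m_TriggerInput.ReadWasPerformedThisFrame())
            Debug.Log("Trigger pressed this frame");
    }
}
```

Because each component declares its own readers, two interactors on the same hand can read different poses or buttons without sharing one global action list, which is the modularity the comment above describes.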

For extra info btw, we finally published our GDC talks online.
Here's a link to the playlist, including my talk about XRI 3.0.

I'll also be doing a livestream tomorrow, for anyone who wants to ask questions and watch me dive into XRI 3.0.

pvncher

Adaptive probe volumes look sick, and I realized I'm not using nearly enough light probes. Thanks for the video!

abrahamdrinkin

That foveation pronunciation was something else

Zeldarulah

"Four-vation level". 10/10 pronunciation. Lol 😂

beardordie

I can't wait for your updated tutorial. After it, my project will stop yelling at me.

SamOnTehsea

XR Hands is the OpenXR backend tech for hand tracking that any XR toolkit can use, not just Unity's XRIT. Keeping them separate is good.

hawkwoodcom

I am looking forward to it. The current interaction system, with its inheritance, is a bit too complex for my taste. If you want different behaviour, you need to have multiple objects and turn a lot of stuff off and on to get the desired behaviour. (At least that's how I got it to work...)

I hope it gets better and more structured, so that making adjustments becomes easier.

thorsten

🤣I keep wanting to & trying to use U6, but... it's new, so about 80% of my assets don't want to play nice with things. Ah well, it all looks promising at least.
Oh, and there's no R in foveated: "foe-vee-ate-ed".
Thanks for the update! 👏

kpr

Hey, thanks for the hard work! <3
I'm a beginner in VR game development and trying to delve deep into it. I'm unsure whether to start with Unity 6 or go with a previous version. What do you recommend? As you mentioned in the video, my project's performance might be better with Unity 6 if I can work with it properly. However, I'm concerned about potential difficulties. Any advice?

pooyasabeti

Love your content my dude, hands down one of the best unity XR explainers in the game.

sappalele_dev

Thanks for this video, but can the Quest 3 handle APVs, GPU Resident Drawing, or GPU Occlusion Culling?

tufanaydin

Do you know if the XR Toolkit uses Meta's MultiModal, allowing a game to be created with both Hand Tracking AND controllers?

DigitalAdamTech

What is the colorful space game in the intro?

keirvr

I would love a series of 3 videos of max 10min each where you speedrun the creation of a veeeery simple Immersive Sim using the XRTK 3.

dfadir

I upgraded my existing project to XRI 3 and everything stopped working. Now I'm gonna rebuild a new code foundation based on Unity 6 + XRI 3 for the next installment of Cactus Cowboy.

CactusVRstudios

what game is that? with the spaceship?

lightoflifegames

Can you make a tutorial on how to add physics hands?

lunarvoidvr

"Foveated" is pronounced as foh-vee-ey-ted. Here's a breakdown of the pronunciation:

"Foh" sounds like "foe" (rhyming with "go").
"Vee" sounds like the letter "V".
"Ay" sounds like the letter "A".
"Ted" sounds like "ted, " as in the name.

messapatingy

Foveated (foe-viated), not fore-vidded

spyboy_

1:00 It's pronounced "FOH - VEEH- ATE - ED"

MrMycelium