Apple Just LEAKED their OWN FUTURE SOFTWARE!
Apple has done it again. Not content with dropping the new iPad Pro and Air today, Apple has once again, like the past couple of years, casually thrown out some of the most revolutionary software on any platform and hidden it behind the “accessibility” label. Don’t believe me? Here are a few examples from the past. Back Tap on the iPhone, so you can activate shortcuts without touching the screen or a button. Double Tap on the Apple Watch Series 9? The accessibility version does WAY more, and you can use it on older Apple Watches - my Series 7, for example. So… let’s have a look at the actual innovations Apple announced today, coming later this year.
First up: Eye Tracking for iPhone and iPad. Basically, you’ve heard of the Apple Vision Pro? Well, it’s that, but on your iPhone. And I mean YOUR iPhone, because this isn’t a hardware-locked feature: Apple specifically calls out that it uses your front-facing camera and AI to set up and calibrate in seconds, needs no additional hardware or accessories, and happens entirely on device. Users can then navigate any app with their eyes, using dwell time to activate different elements, including physical buttons, swipes and other gestures. With just their eyes. “Can’t innovate any more,” my ass.
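For a rough idea of how dwell-based activation works, here’s a minimal sketch in Swift. To be clear, this isn’t Apple’s implementation, and the gaze samples come from a hypothetical eye-tracking source; it just shows the core “hold your gaze to tap” logic:

```swift
import Foundation
import CoreGraphics

// Minimal dwell-selection sketch. Gaze points come from some eye-tracking
// source (hypothetical here); if the gaze stays within a small radius for
// long enough, we treat that as a "tap" on whatever sits under it.
struct DwellDetector {
    let dwellDuration: TimeInterval = 1.0   // hold gaze this long to activate
    let radius: CGFloat = 30                // how far the gaze may wander

    private var anchor: CGPoint?
    private var anchorTime: TimeInterval?

    // Feed in each gaze sample; returns the point to "activate" once the
    // dwell completes, or nil while we're still waiting.
    mutating func process(gaze: CGPoint, at time: TimeInterval) -> CGPoint? {
        if let a = anchor, let t = anchorTime,
           hypot(gaze.x - a.x, gaze.y - a.y) <= radius {
            if time - t >= dwellDuration {
                anchor = nil            // reset so we don't re-fire
                return a
            }
        } else {
            anchor = gaze               // gaze moved: restart the dwell
            anchorTime = time
        }
        return nil
    }
}
```

The idea: the moment your gaze wanders outside a small radius, the dwell timer restarts; hold still long enough and the point under your gaze gets “tapped”.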
Next, Music Haptics. This is designed as a way for those with limited hearing to experience music on iPhone, with the Taptic Engine playing taps, textures and refined vibrations that match a song’s audio. It works across millions of songs already on Apple Music, and there’s an open API so other devs can use it in their apps too. Feel your music. Very nice.
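Music Haptics itself is a system feature, and I haven’t seen the new developer API, but the building blocks already exist. Here’s a sketch using Apple’s existing Core Haptics framework to play beat-like taps of varying intensity, which is the general flavour of the thing:

```swift
import CoreHaptics

// Drive the Taptic Engine with timed transient taps, like beats in a song.
// This is just the existing Core Haptics API, not the Music Haptics API.
func playBeatPattern() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    // Four "beats": strong taps on 1 and 3, softer on 2 and 4.
    let events = (0..<4).map { i in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity,
                                       value: i % 2 == 0 ? 1.0 : 0.5),
                CHHapticEventParameter(parameterID: .hapticSharpness,
                                       value: 0.6),
            ],
            relativeTime: Double(i) * 0.5   // one beat every half second
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```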
Vocal Shortcuts means you can run complex tasks simply from custom sounds or speech. Alongside it comes support for atypical speech, which allows people with acquired or progressive conditions that impair their speech to continue using speech recognition. This seems like a great expansion of Personal Voice, which Apple launched this time last year - another example I should have included at the start. But here we are.
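Again, Vocal Shortcuts is a system feature, but the basic shape of “hear a custom phrase, run an action” can be sketched with Apple’s existing Speech framework. This is an illustration, not how Apple does it; a real app would also need to request microphone and speech-recognition permission:

```swift
import Speech
import AVFoundation

// Listens to the mic and fires an action when a trigger phrase appears in
// the live transcription. Illustration only; permission handling omitted.
final class PhraseTrigger {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func listen(for phrase: String, action: @escaping () -> Void) throws {
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)

        // Stream microphone buffers into the recognition request.
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        // Watch the rolling transcription for the trigger phrase.
        task = recognizer?.recognitionTask(with: request) { result, _ in
            guard let text = result?.bestTranscription.formattedString else { return }
            if text.lowercased().contains(phrase.lowercased()) {
                action()   // the "shortcut" runs here
            }
        }
    }
}
```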
Next up, and this is probably less of an accessibility thing, but Apple has invented a cure for car sickness. Like, actually. Studies show that a huge cause of travel sickness is the disconnect between what your inner ear detects (g-forces, acceleration, turns) and the information your eyes are providing, which is why looking out of the window can help. But we can’t expect actual humans to put their phones down, so this option adds animations that reflect the movement detected by your phone’s accelerometer, giving your brain matching visual cues. I believe that Mark Rober (of huge-YouTuber Mark Rober fame) worked on this kind of technology, using precursor headsets to the Vision Pro, for Apple’s now-cancelled Car project. Another genius idea. Probably coming to an Android near you soon.
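Apple calls the feature Vehicle Motion Cues. How they render their animations is their business, but the input side is plain Core Motion. A minimal sketch, assuming you just want an (x, y) offset to drive some on-screen dots:

```swift
import CoreMotion

// Reads the accelerometer and turns acceleration into an on-screen offset,
// so moving visual cues can mirror the forces your inner ear feels.
// A sketch of the underlying idea, not Apple's implementation.
final class MotionCueSource {
    private let motion = CMMotionManager()

    // Calls `onCue` with an (x, y) offset roughly in -1...1, derived from
    // lateral and longitudinal acceleration.
    func start(onCue: @escaping (Double, Double) -> Void) {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 1.0 / 30.0   // 30 Hz is plenty
        motion.startAccelerometerUpdates(to: .main) { data, _ in
            guard let a = data?.acceleration else { return }
            // Clamp so hard braking doesn't push cues off screen.
            onCue(max(-1, min(1, a.x)), max(-1, min(1, a.y)))
        }
    }

    func stop() { motion.stopAccelerometerUpdates() }
}
```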
Staying in the car, CarPlay will get Sound Recognition, so drivers with reduced hearing (or those working on future reduced hearing with deafening music) will get on-screen alerts if the iPhone detects car horns or emergency sirens, for example. There’s also Voice Control for CarPlay, so you can navigate the interface and apps with just your voice, plus colour filters for colourblind users and options for bigger, bolder text.
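CarPlay’s version is built in, but iPhones already ship a general sound classifier through the SoundAnalysis framework, and it can spot this kind of thing. Here’s a sketch of wiring it up; note that the exact “siren” label is my assumption about the built-in classifier’s label set:

```swift
import SoundAnalysis
import AVFoundation

// Runs Apple's built-in sound classifier on live microphone audio and
// reacts to high-confidence siren detections. Sketch only; the "siren"
// identifier is assumed, not verified against the classifier's label list.
final class SirenSpotter: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let format = engine.inputNode.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        try analyzer.add(SNClassifySoundRequest(classifierIdentifier: .version1),
                         withObserver: self)
        self.analyzer = analyzer

        // Feed mic buffers into the analyzer as they arrive.
        engine.inputNode.installTap(onBus: 0, bufferSize: 4096, format: format) {
            buffer, when in
            analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
        }
        engine.prepare()
        try engine.start()
    }

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        if top.identifier == "siren", top.confidence > 0.8 {
            print("Heads up: siren detected")   // show an alert in a real app
        }
    }
}
```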
Vision Pro will get system-wide Live Captions for users with limited hearing, covering FaceTime calls, movable captions in immersive video and more. VoiceOver gets new voices, custom volume control and keyboard shortcut customisation. Magnifier gets a new Reader Mode that simplifies pages down to just the text. Braille users get improvements to Braille Screen Input, support for Japanese braille, and multi-line braille support on the Dot Pad. At this point I should say I don’t understand everything here. I don’t have experience of needing everything in Apple’s accessibility suite, and I don’t know what’s more or less valuable, but I do want to touch on as much as I can!
Hover Typing shows larger text, in the user’s preferred font and colour, when typing in text fields. Personal Voice adds Mandarin Chinese support. Live Speech adds categories and live caption support. AssistiveTouch adds a virtual trackpad on screen, so you can use a cursor on iPhone and iPad. You can set custom finger taps in the Camera app to switch modes, and Voice Control is adding support for custom vocabulary and complex words.
Support iCaveDave
How do I record the podcast? Riverside.fm. Support the show by using my link if you're signing up :)
Join the conversation