Next generation user interfaces: Mihai Dumitrescu at TEDxBucharest

Mihai Dumitrescu is passionate about smart software, especially in the area of information management.
With a background in artificial intelligence and distributed & parallel computing, he uses many unconventional techniques to tackle this domain and is highly motivated to apply out-of-the-box thinking to every new project he works on. He co-founded "rosoftlab" and has spent the last 7 years as its CTO and Lead Software Architect, working on many different projects in the area of information, workflow and content management for customers located mainly in Germany and Switzerland.
Mihai also helped build up the collaboration between "rosoftlab" and Ergoneers, the German market leader in high-end tools and software for studying ergonomics.
Before founding "rosoftlab", he studied Computer Science at the Friedrich-Alexander University in Erlangen, Germany.

In the spirit of ideas worth spreading, TEDx is a program of local, self-organized events that bring people together to share a TED-like experience. At a TEDx event, TEDTalks video and live speakers combine to spark deep discussion and connection in a small group. These local, self-organized events are branded TEDx, where x = independently organized TED event. The TED Conference provides general guidance for the TEDx program, but individual TEDx events are self-organized.* (*Subject to certain rules and regulations)
Comments

My only concern with organic adaptation is teaching users that these "optimizations" occur, and how they work. I think most users would not expect interfaces to adapt to their usage. I could imagine users getting frustrated when elements change size or form, since they are used to seeing the same interface every time. On the other hand, I can also imagine scenarios where the user is pleasantly surprised.

I think we'll really see all of these ideas come to life with the Oculus Rift. As a developer myself, my biggest issue is getting access to these eye-tracking tools, as most of the tools we have today are either really expensive or still in development (e.g. Google Glass). Once the Oculus Rift comes out, I think we'll see many unique forms of eye-tracking data (such as heat maps, eye-movement traces, etc.). The interesting thing will be how pre-existing analytics software integrates with these new technologies.

Great talk.  Thanks for sharing.

SeanGoresht