Glass Brain flythrough - Gazzaley Lab / SCCN / Neuroscape Lab


This is an anatomically realistic 3D brain visualization depicting real-time source-localized activity (power and "effective" connectivity) from EEG (electroencephalographic) signals. Each color represents source power and connectivity in a different frequency band (theta, alpha, beta, gamma), and the golden lines are white matter anatomical fiber tracts. Estimated information transfer between brain regions is visualized as pulses of light flowing along the fiber tracts connecting the regions.

The final visualization is done in Unity and allows the user to fly around and through the brain with a gamepad while seeing real-time live brain activity from someone wearing an EEG cap.

Team:
- Gazzaley Lab / Neuroscape lab, UCSF: Adam Gazzaley, Roger Anguera, Rajat Jain, David Ziegler, John Fesenko, Morgan Hough
- Swartz Center for Computational Neuroscience, UCSD: Tim Mullen & Christian Kothe

- Matt Omernick, Oleg Konings
Comments

Author

Some additional technical details on the modeling and visualization: The activity shown here is estimated on the cortical surface, from 64-channel (de-noised / artifact-corrected) EEG, using a realistic MRI-based "forward" model of electrical conduction through this individual's brain, skull, and scalp tissue, and a statistical inverse solution (taken together, "source localization") -- here, this is using an adaptive Bayesian minimum-L2-norm algorithm with cortical constraints, although other variants we use include Sparse Bayesian Learning or Beamforming. The model is typically updated rapidly in a short sliding window (i.e. "multiple measurement vector estimation") rather than on single time points (i.e. "single measurement vector estimation").
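For readers who want the gist of the minimum-L2-norm step, here is a minimal sketch of the classic Tikhonov-regularized estimator. This is a generic textbook version, not the lab's adaptive Bayesian implementation; the lead-field matrix, the regularization value, and the dimensions are all illustrative assumptions.

```python
import numpy as np

def minimum_norm_estimate(x, L, lam=1e-2):
    """Tikhonov-regularized minimum-L2-norm inverse solution.

    x   : (n_channels,) sensor measurement at one time point
    L   : (n_channels, n_sources) lead-field ("forward") matrix
    lam : regularization strength (hypothetical value)

    Returns the (n_sources,) source estimate L^T (L L^T + lam I)^{-1} x,
    i.e. the smallest-norm source pattern consistent with the data.
    """
    n_channels = L.shape[0]
    gram = L @ L.T + lam * np.eye(n_channels)  # regularized channel Gram matrix
    return L.T @ np.linalg.solve(gram, x)
```

With many more sources than channels the problem is underdetermined, so among the infinitely many source patterns that reproduce the sensor data this picks the one with the smallest L2 norm; larger `lam` shrinks the estimate toward zero, trading fidelity for noise robustness.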

The entire cortical surface is colored based on source power in different frequency bands, obtained from multi-taper spectral estimation. The colors code for brain activity in different frequency bands: theta (red), alpha (blue), beta (green). The colored pulses of light traveling between regions reflect frequency-specific multivariate Granger-causal information transfer between selected anatomical regions of interest obtained via a 221-region subparcellation of the "Lausanne" anatomical atlas. Directional information transfer is statistically inferred based on a group-sparse spatiotemporal model adaptively fit to estimated source current density, and is visualized as flowing -- from causal source to sink -- along the anatomical fiber tracts (obtained via Diffusion Tensor Imaging) that are most likely to structurally connect the two regions of interest.
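As a miniature of the multi-taper band-power step, the sketch below averages periodograms over Slepian (DPSS) tapers from SciPy. The time-bandwidth product, taper count, and band edges are illustrative assumptions, and the real pipeline runs on source-localized, sliding-window data rather than a single raw trace.

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_band_power(sig, fs, band, NW=3.0, n_tapers=5):
    """Average power in a frequency band via Thomson's multitaper method.

    sig      : 1-D signal (one channel / one source time series)
    fs       : sampling rate in Hz
    band     : (low_hz, high_hz) edges of the band of interest
    NW       : time-bandwidth product (assumed value)
    n_tapers : number of Slepian tapers to average (assumed value)
    """
    tapers = dpss(len(sig), NW, n_tapers)              # (n_tapers, N) windows
    # One "eigenspectrum" per taper: periodogram of each tapered copy
    eigenspectra = np.abs(np.fft.rfft(tapers * sig, axis=1)) ** 2
    psd = eigenspectra.mean(axis=0)                    # average across tapers
    freqs = np.fft.rfftfreq(len(sig), 1.0 / fs)
    lo, hi = band
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()
```

Averaging over several orthogonal tapers reduces the variance of the spectral estimate relative to a single windowed periodogram, which matters when the estimate must update smoothly in short real-time windows.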

Optimized CPU and custom CUDA (GPU) routines are used to accelerate pipeline speed to "real-time" performance capabilities.
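To make the Granger-causal idea above concrete in its simplest bivariate form: x "Granger-causes" y if x's past reduces the error of predicting y beyond what y's own past achieves. The actual pipeline uses a far more elaborate group-sparse multivariate estimator; this ordinary-least-squares sketch, including the lag order, is only an illustrative assumption.

```python
import numpy as np

def granger_log_ratio(x, y, p=2):
    """Bivariate Granger-causality score for x -> y with p lags.

    Fits two autoregressive models of y by least squares: a restricted
    model using only y's own past, and a full model that also includes
    x's past. Returns log(var_restricted / var_full); values well above
    zero mean x's past helps predict y.
    """
    N = len(y)
    # Lagged design matrices: column k holds the series delayed by k+1 samples
    y_past = np.column_stack([y[p - k - 1:N - k - 1] for k in range(p)])
    x_past = np.column_stack([x[p - k - 1:N - k - 1] for k in range(p)])
    target = y[p:]
    beta_r, *_ = np.linalg.lstsq(y_past, target, rcond=None)
    res_r = target - y_past @ beta_r
    full = np.hstack([y_past, x_past])
    beta_f, *_ = np.linalg.lstsq(full, target, rcond=None)
    res_f = target - full @ beta_f
    return float(np.log(res_r.var() / res_f.var()))
```

Because adding regressors can only shrink the residual, the score is always slightly positive even with no true coupling; real analyses therefore test it against a null distribution rather than against exactly zero.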

The mathematical approaches used here are extensively published in scientific, engineering, and statistical journals, and enjoy increasing development and utilization by the scientific community for EEG- and MEG-based inference. For those curious about additional technical details, an earlier (and less optimized) pipeline, similar to this one, is described and validated in (Mullen, Kothe, Chi, et al., 2013, IEEE EMBC).

Remember this is a computational *model* (i.e. a statistical approximation) of fairly large-scale ongoing brain activity. As the saying goes, "the map is not the territory," and the brain -- in all its detail -- is exceedingly more complex than can be described by such a model (or any other existing model, for that matter). However, statistical models such as these can nonetheless be powerful and useful representations, with definite clinical and scientific utility.

Adding an artistic touch and making the visualization aesthetically pleasing to the eye hopefully serves to enhance our appreciation for the natural beauty of the active brain.

antillipsi

Watching this after a decent 8-hour work day is pretty refreshing ;)

fav

Neuroscientists create Glass Brain software - pretty incredible!

ahier

Can't wait to see the resolution of this evolve.

richlarow

Everything you will ever feel, ever think, and ever experience, happens here.

VexylObby

This must be the brain of a pioneer returning from service and looking forward to a tasty dinner with his family!

piter

This requires music.  I'm listening to Stravinsky's Rite of Spring and it's working pretty nicely.

oneguycoding

That is honestly one of the "nice"est things I've ever seen


skahler

This is what I think neurofeedback will be in the next year.

RubanauAliaksei

Given the way EEG works, this is more artistic license than an actual representation of neural firing.

michaelbone

Scientific art work -- it has a strange beauty to it.

zarkoff

Well, I found my new favourite screensaver. If only I knew how to do that.

RQAlice

Very interesting; I'd love to see someone elaborate on the meaning and significance of this brain activity.

DrCharlesParker

What frequency does each of the colours represent? A great animation.

chrisstreet

Incredibly nice! It's not just science, it's art.
But PLEASE, make it longer (more angles) and add sound effects (according to neuronal action).

lukewright

Hello, might I use a short fragment of this video in an educational cartoon about learning? With credits to your YT channel and web site, of course.

DavideZaccaria

I have a question: isn't the visually represented activity purely inferred? EEG has notoriously bad spatial resolution. How is the EEG data coupled to the (rather detailed) pathways in the brain that it corresponds with? Isn't it an assumed (at least as far as the spatial resolution goes) model of neural activity, not a real-time one?

I get the feeling that this technology looks cool, but doesn't really add anything new scientifically. The software merely combines existing techniques. Does it open doors to new ways of neuroscientific research?

Enothrae

Watching your own brain activity in real time and in virtual reality could trigger something new and unexpected. No need of extra stimulus, isn't the brain known for stimulating itself?

yafiselim

The human body and brain are among the most perfect creations of the universe, and the remarkable abilities now observed in people are direct proof of it. Some don't believe in this; others, on the contrary, look for ways to develop these abilities in themselves. But the truth is that every person on earth is endowed by nature with these abilities, which they sometimes observe in other people; these abilities are simply not activated in them. To find out the difference between an ordinary person and a person with extraordinary abilities, scientists used to study and compare the structure of their brains, and it turned out they do not differ at all. The difference lies in the consciousness a person possesses; the body is merely the carrier of that consciousness. To draw an analogy with computers: the human body and brain are the computer itself (monitor, system unit, keyboard, mouse), while consciousness is the operating system, Windows, Linux, or some other. Continuing the analogy: by nature your brain is endowed with every ability you can imagine, which makes it a perfect computer. To use all of its abilities, you only need software to activate the remarkable capacities nature has built into you. The software for your consciousness is the knowledge and information that you can apply in pract

ruslanhancock