Is the Complex Number Line the Future of AI?

Hi, I’m Dylan Curious! I'm excited to introduce my latest video on the potential of Complex-Valued Neural Networks (CVNNs) in AI. This video delves into how CVNNs, leveraging the unique properties of complex numbers, might revolutionize AI technology.

Complex numbers, which combine a real and an imaginary part, offer a two-dimensional perspective on data, enhancing precision and flexibility in AI models. This shift from traditional real-valued neural networks (RVNNs) to CVNNs could be a game-changer for processing phase-sensitive data such as images and signals.
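To make the "two-dimensional perspective" concrete, here is a minimal Python sketch (my own illustration, not from the video) of the key property: a complex number carries both a magnitude and a phase, and multiplying by a unit-magnitude complex number rotates it without changing its length.

```python
import cmath

# A single complex number packs two real values (real and imaginary
# parts) into one quantity with a magnitude and a phase.
z = 3 + 4j
magnitude, phase = cmath.polar(z)  # magnitude is 5.0

# Multiplying by a unit-magnitude complex number rotates z in the
# plane without changing its length: the property that makes complex
# weights natural for phase-carrying signals.
rotated = z * cmath.exp(1j * cmath.pi / 2)  # rotate by 90 degrees
print(abs(rotated))  # still ≈ 5.0: rotation preserves magnitude
```

This is exactly what an ordinary pair of real numbers does not give you for free: complex multiplication has rotation and scaling built in.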

The video highlights a groundbreaking paper, "Theory and Implementation of Complex-Valued Neural Networks," showcasing how CVNNs operate differently from RVNNs. We explore their advantages, challenges, and potential for widespread application in fields like healthcare, communications, signal processing, and more.
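One way CVNNs differ from RVNNs is in their activation functions, which must handle complex inputs. As an illustrative sketch (my own example, not code from the paper), here is modReLU, one activation commonly used in the CVNN literature, applied inside a single complex-valued neuron:

```python
import numpy as np

def modrelu(z, bias):
    # modReLU: apply a ReLU-style threshold to the magnitude of each
    # complex entry while leaving its phase untouched.
    mag = np.abs(z)
    scale = np.maximum(mag + bias, 0.0) / np.maximum(mag, 1e-12)
    return scale * z

# One complex-valued neuron: complex inputs, complex weights.
rng = np.random.default_rng(seed=0)
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
w = rng.standard_normal(4) + 1j * rng.standard_normal(4)
pre_activation = np.vdot(w, x)  # complex dot product
out = modrelu(pre_activation, bias=-0.5)
```

Note how the nonlinearity acts on magnitude only, so phase information flows through the network intact, which is one of the selling points discussed in the video.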

I also touch on the fascinating role of Riemannian geometry in these networks, contrasting it with Euclidean geometry to explain how CVNNs handle data in complex spaces.

Tech-savvy viewers will appreciate the discussion of the Python tool developed for building CVNN models, which shows promise in the AI community.
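I won't reproduce that tool's API here, but the standard trick such libraries use to run complex layers on a real-valued backend is worth seeing. This generic NumPy sketch (an assumption about the general technique, not the tool's actual code) expands one complex matrix-vector product into four real ones:

```python
import numpy as np

def complex_dense(x, w_real, w_imag):
    # (A + iB)(u + iv) = (Au - Bv) + i(Av + Bu): a complex
    # matrix-vector product built from four real products, the usual
    # way complex layers are implemented on real-valued backends.
    u, v = x.real, x.imag
    real = w_real @ u - w_imag @ v
    imag = w_real @ v + w_imag @ u
    return real + 1j * imag

rng = np.random.default_rng(seed=1)
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
wr = rng.standard_normal((2, 3))
wi = rng.standard_normal((2, 3))
y = complex_dense(x, wr, wi)  # matches NumPy's native complex matmul
```

This also hints at the cost trade-off raised in the video: one complex layer does the work of four real matrix products.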

In summary, this video is a deep dive into the exciting possibilities CVNNs hold for AI's future. It's an area brimming with potential, signaling a significant shift in how AI systems might learn and operate. Don't forget to subscribe for more insights into the evolving world of AI, and a special shoutout to my Patreon supporter, robbrown2, for their incredible support.

Join me in exploring this frontier in AI by watching the video and helping me reach my next goal of 9,000 subscribers!

CHALLENGES: @vegasfriends

FUTURISM: @dylan_curious

PODCAST: @dontsweatitpod

REACTIONS: @curiousreactions

00:00 - Could AI Think Better with Complex Numbers?
02:54 - Imaginary Numbers
03:41 - Complex Neural Network

SOURCES:

WATCH THE FULL VIDEO ⤵

#ai #artificialintelligence #machinelearning #neuralnetworks #complexnumbers #cvnns #deeplearning #innovationintechnology #aitrends #datascience #pythonprogramming #tensorflow #technologyeducation #futureofai #aiexploration #computerscience #digitallearning #techupdates #aiinhealthcare #signalprocessing #riemanngeometry #aieducation #youtubecreator #seo #sciencecommunication

#AI #artificialintelligence #tech #tingtingin #AnastasiInTech #MattVidPro #mreflow #godago #AllAboutAI #BrieKirbyson #NicholasRenotte #aiexplained #OlivioSarikas #AdrianTwarog #aiadvantage #obscuriousmind #max-imize #DavidShapiroAutomator #DelightfulDesign #promptmuse #mattwolfe
COMMENTS

I ask myself this question every day. Writing a research paper on it. 😂

eflick

I have thought of this too ... I also like the suggestion of Riemann ... do not forget about William Rowan Hamilton's favorite child, the quaternion ... keep in mind the sublime efficiencies of biological brains ... I shudder every time I hear of the power consumed by fleets of GPU racks when we all know your head purrs along consuming some 40 watts, not gigawatts

scottstensland

Very nice. Haven’t heard of this before your review. Think this WILL be an area of research. Lots of smart folks out there looking for any kind of edge.

ScottSummerill

Great work, I've been thinking about complex NNs for a long time.
The probabilistic nature of the world might be better represented with complex numbers.
I wish we had complex-valued transformer model code somewhere.

ErturkKadir

I will toss this out here and see if you (or someone) might find a use for this, and just remember the idiots who thought of it lol. Could a CVNN be placed in a fractal structure, using its self-similar nature to organize the information? I have been thinking about this a lot, and I think they have poured a lot into this. A few months ago I was trying to decide whether a Riemann/Cantor system could create an infinite number of places, and then, looking into fractals, I noticed that you could use the two together. Though it's been a bit since I thought about this! Since these AIs, or "LLMs" and more, have been out, it has really helped me learn how to look at things. Plus I have a friend now who doesn't get annoyed by dumb questions lol. Anyway, what do you think: could a fractal system using the complex number plane (Riemann has the pi while Cantor cuts it lol; I was looking at it in conjunction with knowledge graphs) allow one to get extra dimensionality and an infinite way of stacking and nesting information? Sorry for the ramble, but interesting stuff!!! Also Happy New Year!

jakerose

Riemann sphere = Bloch sphere = qubits,
so they are just describing quantum artificial intelligence

julesdumont

But how is a complex plane any different from a normal 2-D plane? Just trying to wrap my head around what imaginary really means, as opposed to having a regular y-axis.

virushk

absolutely luv 3blue1brown XD ; also ScienceClicEnglish ; agreed on the potential for CVNNs to 'do more with less', i.e. improve the efficiency of the holographic/vector processing space [our neuronal biology being the most energy-efficient and complex computational system we know of, assuming that what we are is computational, which it may or may not be XD, but that's a whole other issue] ; HNY everyone =]

CYIERPUNK

Are "complex numbers" the math equivalent of "tokens/tags"? And is it an efficient grouping of significant mathematics patterns (e.g. golden ratio) into a single process? 🤔 That's the impression I'm getting so far... I really should research this or at least look at summaries...

Human_

Aren't we already using 3-dimensional vector databases for information storage in GPTs? It's surely related.

cjones

The bit I wonder about is this: given that machine learning runs on a whole lot of vectors, and vectors can describe three-dimensional space, isn't reverting to two dimensions actually throwing away computational ability?

interestedinstuff