New Quantum Computing Approach Makes Rapid Progress


Physicists have shattered previous limits of the new technology of "atoms in tweezers". They have collected more than 1000 atoms that could serve as qubits, hinting at a scalable future for the technology. This breakthrough challenges the currently dominant qubit methods and signals a significant step towards commercially viable quantum computing.


#science #sciencenews #technews #technology
Comments

The big advantage with quantum computing news is that it can be in a good news state and bad news state simultaneously.

earlofdoncaster

Atoms in Tweezers was my band back in the nineties.

nortuber

Just glue 10 arrays of 100,000 atoms together with quantum epoxy.

AnnNunnally

Sabine playing with our emotions, taking away our hopes and then giving them back

marcognudi

On the topic of physics and lasers, I was asked last week "what will you do with the laser?". My response was "I plan to tickle electrons with the laser. Enough to get them excited, but just enough to leave them in a state of frustration".

AnthonyDavid

There's probably also another metric for comparing quantum and conventional computing, and that's how quickly the research department can blow a hole in your budget.

rich

Remember that the real problem with QC is getting consistently valid data out of the noise, which scales up faster than the useful data. There are promising projects working on that, but I'm personally doubtful that we'll get more than extremely specialized applications using relatively small numbers of qubits concurrently without noise becoming an overwhelmingly significant hurdle.

AaronSherman

There is a lot of promise in these Rydberg or neutral-atom quantum computers. They are typically used in an analog mode; however, with refined, individual lasers for each atom they can also be used for gate operations. The fact that the qubits can be placed in a 2D geometry allows the layout to be customized for the problem, and slight changes in geometry can allow for the measurement of evolving results. But they are very different from the IBM superconducting qubits or the IonQ ion traps.

AlignedIT

Question: as far as I can see, the only thing these microarray lenses do is replace, e.g., 100 lasers with 1 if you use a 10x10 array.
But how are they going to influence the states of each atom separately? How are they going to combine the state of one atom with another atom to make calculations? How are they going to read out the states of the changed atoms? That's what I'll be thinking about today.
To put it this way: they got to the foot of the mountain faster, but will they get up the mountain?

thebooksthelibrarian

The first vacuum tubes were invented in 1904, the first vacuum tube computation happened in 1939, semiconductors arrived in 1940, photolithography was invented in 1958, and the microprocessor revolution came in the 1970s. That’s 70 years to get to anything like the computers we know today. The first quantum computer was in 1998… we’re 26 years into what could take a long time. Enjoy the steps of progress that we do make.

brnto

5:38 - the crossover I never knew I needed. Got a big laugh out of me

jason

I'm so burned with tech hype that I will wait and see.

MrAlanCristhian

In the 1960s I read a short story (possibly written in the 1950s?) by Isaac Asimov. The basis of memory and computing in Asimov's SF short story was "the nudged electron"... nearly there!

judewarner

Fascinating indeed! Thanks, Sabine! 😃
Stay safe there with your family! 🖖😊

MCsCreations

Does this mean we can combine those atoms into molecules, into stuff, like a Star Trek replicator? I really want a Star Trek replicator. Sabine, can we have a Star Trek replicator?

MrTheoJ

The uncertainty principle of quantum news states that we cannot know the speed of progress and the actual progress at the same time!

Narcissus-qn

Sabine’s opinions on quantum computing seem to be subject to quantum fluctuations. The wave function collapses with each update.

kurtiserikson

I feel like your videos are a scientific emotional rollercoaster. Clever marketing!

tommiest

People working with quantum computers are encouraged to run the following test to determine whether the computer is quantum.
1) Initialize the input qubit. It could be more than a single qubit, for example the inputs to several quantum gates.
2) Measure the qubit's state, or the output of the quantum gates.
3) Repeat steps 1 and 2 a reasonably large number of times, say, 10,000 to
4) Plot the distribution of acquired data.

If the distribution is wide, the computer is probabilistic, quantum, and useless: because of its probabilistic nature, a qubit cannot be initialized in a reproducible way to the same value.
Any existing binary computer would produce an extremely narrow distribution, so there are no issues with supercomputers, desktops, laptops, tablets, cell phones, digital watches, etc.
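
If you want to see what the two distributions look like, here is a minimal Python sketch of the procedure described above. It assumes an ideal simulated qubit prepared in the equal superposition (|0> + |1>)/sqrt(2) and measured in the computational basis; no real hardware or quantum SDK is involved, numpy simply stands in for the measurement statistics.

```python
# Minimal sketch of the proposed test, under the stated assumptions:
# an ideal simulated qubit in (|0> + |1>)/sqrt(2) versus a deterministic classical bit.
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
shots = 10_000

# Steps 1-3: "initialize" and "measure" repeatedly.
# An ideal |+> state gives 0 or 1 with probability 1/2 each shot.
qubit_outcomes = rng.integers(0, 2, size=shots)

# The same procedure on a classical bit always returns the initialized value.
classical_outcomes = np.zeros(shots, dtype=int)

# Step 4: look at the distribution of acquired data.
print("qubit    :", Counter(qubit_outcomes.tolist()))      # spread over 0 and 1
print("classical:", Counter(classical_outcomes.tolist()))  # a single narrow peak
```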

Vatsek

I assume Sabine is referring to classical Monte Carlo simulations that can be quadratically sped up using a quantum computer. If you don’t know, the problem with MC simulation on a large data set with many iid variables is that you need to estimate the mean to within an additive error, at a confidence level of up to 99%. The number of samples n needed to reach that confidence level is O(σ²/ε²), the variance divided by the square of the additive error. So if you want to decrease the additive error by a factor of ten, you need to increase the number of iterations by 10². This is obviously computationally expensive.

Doing this on a quantum computer requires amplitude estimation. I don’t understand how that is done and am going to read about it. But a video from Sabine on the details of how this works would help.
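
As a rough numerical check of the classical side only (amplitude estimation is not simulated here), this small numpy sketch shows the error of a Monte Carlo mean estimate shrinking like σ/√n, which is where the 10² factor comes from; the value of σ and the sample sizes are arbitrary choices for illustration.

```python
# Illustration of classical Monte Carlo scaling: the standard error of the
# sample mean falls like sigma/sqrt(n), so a 10x smaller additive error
# costs ~100x more samples.
import numpy as np

rng = np.random.default_rng(1)
sigma = 2.0        # standard deviation of the iid variables (chosen for illustration)
repeats = 1_000    # how many independent mean estimates to collect per n

for n in (100, 10_000):
    # Each row is one full MC run of n samples; the row mean is one estimate.
    estimates = rng.normal(loc=0.0, scale=sigma, size=(repeats, n)).mean(axis=1)
    print(f"n={n:>6}: empirical std of the mean = {estimates.std():.4f}  "
          f"(theory sigma/sqrt(n) = {sigma / np.sqrt(n):.4f})")
```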

posthocprior