Quantum Computing About to Take a Leap Forward?

SGaynor

Putting on my skeptical scientist hat.....

While cool, don't believe the hype about mass production and changing the world. Seems like they run as superconductors, which means sub-liquid-nitrogen temps (-196 °C / -321 °F). Might be useful for supercomputers, but the average person won't see one.
 

p m

SGaynor said:
Putting on my skeptical scientist hat.....

While cool, don't believe the hype about mass production and changing the world. Seems like they run as superconductors, which means sub-liquid-nitrogen temps (-196 °C / -321 °F). Might be useful for supercomputers, but the average person won't see one.
It is probably only a matter of time before LN2 coolers are produced cheaply and on a mass scale. The low temperature itself is not that big of a deal - what I am curious about is what will be left of the device if the cooler breaks.
 

knewsom

Even if it's something that doesn't make it to the consumer market immediately, or even for a long time, just a small number of these could have a pretty marked effect on our ability to run computer based analysis and models.
 

p m

The article is just another hype piece - like this statement here:
"a 250 qubit computer would contain more data than there are particles in the universe."
It is soooo stupid - you can describe the state of 250 qubits with 2^250 amplitudes, but you can never read more than 250 classical bits back out of them.
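
A quick back-of-the-envelope check in Python, assuming the commonly quoted ~10^80 estimate for the number of particles in the observable universe (estimates vary by a few orders of magnitude):

# Sanity check of the "250 qubits hold more data than there are
# particles in the universe" claim. The state of 250 qubits is
# described by 2**250 complex amplitudes, but you can only ever read
# 250 classical bits back out of it (the Holevo bound).
amplitudes = 2 ** 250
particles = 10 ** 80                 # rough, commonly quoted estimate
print(f"2^250 = {amplitudes:.2e}")   # ~1.81e+75
print(f"10^80 = {particles:.2e}")
print("2^250 > 10^80?", amplitudes > particles)  # False

So even counting raw amplitudes, the quoted number doesn't clear its own bar.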
 

hilltoppersx

According to Moore's Law, 15 years is about right. Quantum computing is as interesting as nanotech. I can't wait for the Singularity. Forget taking some stupid pill - shoot me full of nanobots and let them go to town analyzing my innards and keeping me alive for eternity.
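
For what it's worth, here's what 15 years buys you under the classic doubling - a minimal sketch, assuming the textbook ~2-year doubling period (quoted periods range from 18 to 24 months):

# Rough growth factor under Moore's Law. The 2-year doubling period is
# an assumption; the "law" is an empirical trend, not a guarantee.
years = 15
doubling_period_years = 2.0
factor = 2 ** (years / doubling_period_years)
print(f"~{factor:.0f}x more transistors in {years} years")  # ~181x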
 

Blue

knewsom said:
Even if it's something that doesn't make it to the consumer market immediately, or even for a long time, just a small number of these could have a pretty marked effect on our ability to run computer based analysis and models.

Great, maybe they'll finally realize that the global warming models are shit.
 

Mike_Rupp

Blue said:
Great, maybe they'll finally realize that the global warming models are shit.

The phrase "garbage in, gospel out" comes to mind.

There are so many variables to consider when creating a model like that that, conceptually, it approaches trying to model the world itself. Yet we have a few "scientists" who create a very simplistic model that concludes we have man-made global warming, and nitwits believe it like it's the spoken word of God.
 

p m

knewsom said:
Even if it's something that doesn't make it to the consumer market immediately, or even for a long time, just a small number of these could have a pretty marked effect on our ability to run computer based analysis and models.
Actually, Kris, this is exactly what's wrong with science these days - "instant gratification" from computer simulations.

On the other hand, it never ceases to amaze me how much the "old ones" learned simply from observation; they really had patience (something we as a society seem to have lost), and a good memory for events.
 

Tugela

SGaynor said:
sub-liquid-nitrogen temps (-196 °C / -321 °F). Might be useful for supercomputers, but the average person won't see one.

I think something like this would be useful for my 4.0 GEMS cooling system so I never have to do another head gasket job. If you could also hook it up to an onboard fridge you could have instantly chilled beer, too. The possibilities are endless.
 

knewsom

p m said:
Actually, Kris, this is exactly what's wrong with science these days - "instant gratification" from computer simulations.

On the other hand, it never ceases to amaze me how much the "old ones" learned simply from observation; they really had patience (something we as a society seem to have lost), and a good memory for events.

I completely agree that patience to learn from observation and experimentation, rather than just running a computer model and drawing conclusions, is incredibly important - and I especially agree that patience in our society is at an all-time low, which greatly complicates proper research and development (not to mention funding).

I don't think the development of markedly more sophisticated computing technology (in this instance, a massive generational leap) is going to invalidate the scientific method or the need for real-world experimentation, observation, and testing. I do think it'll likely improve the accuracy of computer simulation at a rapid pace, because we'll be able to use more variables and more complex equations than ever - but our scientists will still have to conceive of those variables, and still test the results in the real world. It could certainly help our experimentation be better directed, not to mention increase the speed at which we make major discoveries.

The major cool factor for me is the notion that we're nearing a functional type of computer that is infinitely more complex in its operation, and functions in an entirely different way than what we've been using. It's not a typical "faster CPU." It's not a home run, it's a touchdown.
 

SGaynor

p m said:
Actually, Kris, this is exactly what's wrong with science these days - "instant gratification" from computer simulations.

On the other hand, it never ceases to amaze me how much the "old ones" learned simply from observation; they really had patience (something we as a society seem to have lost), and a good memory for events.

Spoken like a true experimentalist. :applause: I salute you! :patriot:

Kris' comments about computing are true: SCIENTISTS will get closer to predicting nature and how it works (chemical reactions, tectonic plate movements, weather/climate, etc.). But the natural world is so much more complicated that it is (IMO) impossible to account for every variable that may play a role in the outcome; this is made worse when trying to predict 75+ years out.

ENGINEERING, on the other hand, is much more predictable, as it evaluates man-made systems.

That's why we have computer programs (finite element analysis) that do a pretty good job of predicting how a car chassis will behave, but nobody can really write a program that predicts chemical reactions (both the desired ones and the side reactions).
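
To make the contrast concrete, here's a toy sketch of the finite-element idea (all values made up; a real chassis model has thousands of elements, but it boils down to the same well-posed linear algebra):

import numpy as np

# Toy 1D finite element model: a bar split into two spring-like
# elements, left end fixed, axial force pulling on the free end.
k = 2.0e6                        # element stiffness, N/m (made up)
F = 1000.0                       # applied force, N (made up)

# Assembled stiffness matrix for the two free nodes (fixed node removed).
K = np.array([[2 * k, -k],
              [-k,     k]])
f = np.array([0.0, F])

u = np.linalg.solve(K, f)        # nodal displacements, m
print("displacements [m]:", u)   # [0.0005 0.001]

A man-made structure hands you the matrix; nature doesn't hand you the side reactions.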
 

p m

Funny you should mention that, Scott.

In my two years as an ME grad student in Detroit, I saw that the only things being worked on were crashworthiness (read: computer simulation) and emissions (read: computer simulation). Nobody cared about making shit that actually drove well and looked good enough for people to like it.
But the research I was doing was also a computer simulation - an offshoot of the Fermi-Pasta-Ulam work. And that's where computing performance made (or could have made) the difference between a right and a wrong conclusion about some very general concepts in physics.
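
For the curious, the gist of that problem fits in a few lines. A minimal sketch of the FPU alpha-chain (N and alpha roughly as in the original report; time step and run length are arbitrary choices of mine): energy fed into the lowest mode keeps nearly returning to it instead of thermalizing, and how long you can afford to integrate decides what you conclude.

import numpy as np

# Minimal Fermi-Pasta-Ulam(-Tsingou) alpha-chain: N oscillators with
# fixed walls and a weak quadratic nonlinearity, velocity Verlet.
N, alpha, dt, steps = 32, 0.25, 0.05, 200_000
n = np.arange(1, N + 1)
x = np.sin(np.pi * n / (N + 1))   # all the energy in the lowest mode
v = np.zeros(N)

def accel(x):
    xp = np.concatenate(([0.0], x, [0.0]))   # clamp both ends
    dl = xp[1:-1] - xp[:-2]                  # stretch of left spring
    dr = xp[2:] - xp[1:-1]                   # stretch of right spring
    return (dr - dl) + alpha * (dr ** 2 - dl ** 2)

a = accel(x)
for _ in range(steps):            # velocity Verlet time stepping
    v += 0.5 * dt * a
    x += dt * v
    a = accel(x)
    v += 0.5 * dt * a

# Project back onto the lowest mode to watch the near-recurrence.
mode1 = np.sin(np.pi * n / (N + 1))
print("mode-1 coordinate:", x @ mode1 / (mode1 @ mode1))  # started at 1.0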
 

I HATE PONIES

knewsom said:
I completely agree that patience to learn from observation and experimentation, rather than just running a computer model and drawing conclusions, is incredibly important - and I especially agree that patience in our society is at an all-time low, which greatly complicates proper research and development (not to mention funding).

I don't think the development of markedly more sophisticated computing technology (in this instance, a massive generational leap) is going to invalidate the scientific method or the need for real-world experimentation, observation, and testing. I do think it'll likely improve the accuracy of computer simulation at a rapid pace, because we'll be able to use more variables and more complex equations than ever - but our scientists will still have to conceive of those variables, and still test the results in the real world. It could certainly help our experimentation be better directed, not to mention increase the speed at which we make major discoveries.

The major cool factor for me is the notion that we're nearing a functional type of computer that is infinitely more complex in its operation, and functions in an entirely different way than what we've been using. It's not a typical "faster CPU." It's not a home run, it's a touchdown.

Cliffs? I don't have time to read all that shit.