Goodbye exaflop processing...hello quantum compute
This is why the NSA can now break any public encryption code within a matter of seconds
Let me explain. As an example, take a 3-qubit processor vs. a 3-transistor binary (classical) computer.
First we'll do the binary transistors breaking a code. Transistors exist in on/off states, or mathematically simply 1 or 0. Since we only have three transistors we can turn some on and some off, but we end up with only 8 combinations. We reset things 8 times and we find 8 possibilities:
off off on
on off off
off on off
off off off
on on on
on on off
on off on
off on on
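If you want to see the enumeration above for yourself, here's a quick Python sketch of it (just an illustration; a classical machine has to step through these configurations one at a time):

```python
from itertools import product

# Enumerate every on/off combination of 3 classical transistors.
# Each transistor is independently "off" or "on", giving 2**3 = 8 states.
states = list(product(["off", "on"], repeat=3))
for s in states:
    print(" ".join(s))
print(len(states))  # 8
```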
Now let's look at three qubits doing the exact same scenario to see how many arrangements there are. Qubits exist in one or zero, or alternatively both at the same time. Let's map out the possible combinations a three-qubit processor could make:
qubit qubit qubit
Since the quantum computer is already in all 8 states at once, there is only one configuration. You don't have to reset it into each scenario, since it is already in all 8 of them.
Now add many more qubits and you can break public encryption codes...
Make a processor with 512 qubits and you can attack codes with 2^512 possible keys...
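The arithmetic behind that claim is just repeated doubling: every qubit added doubles the size of the state space. A quick sketch of the numbers (note this is only counting states, which is not the same thing as extracting an answer from them):

```python
# n qubits span 2**n basis states; the space doubles with each added qubit.
def state_count(n_qubits: int) -> int:
    return 2 ** n_qubits

print(state_count(3))             # 8, matching the table above
print(len(str(state_count(512)))) # 2**512 is a 155-digit number
```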
Acrobat- Number of posts : 316
Re: Goodbye exaflop processing...hello quantum compute
Acrobat wrote: This is why the NSA can now break any public encryption code within a matter of seconds
Oh Acro. If the NSA could break all current forms of encryption they wouldn't secure their own Top Secret level stuff using AES256. Let me put it this way: If a government agency invented a handgun that could penetrate any and all existing materials no matter how thick (bullet proof vests, tanks, schools, planetariums, invisible men, etc), they would also invent a new kind of armor to protect themselves from that very same gun in case it falls into enemy hands.
The mere fact that they still use AES128 and AES256 means those two algorithms are reasonably secure from attack even by their own technology.
Quantum computing on any practical scale is still a ways off from being a reality. Both of those videos you referenced speak of the future implications of quantum computing, not the current ones.
As soon as any government agency or public corporation (Google is launching a lab that will contain one eventually) has a quantum computer capable of threatening the security of AES128 or AES256, you will see the NSA reclassify their Top Secret level stuff with a new algorithm.
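For perspective, here's the back-of-the-envelope arithmetic on brute-forcing AES128 (a rough sketch; the 10^18 keys-per-second rate is my own generous assumption, roughly one key per operation on an exaflop-class machine):

```python
# Rough estimate of how long a classical brute-force sweep of the
# AES128 key space would take at an exaflop-class testing rate.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365
keys = 2 ** 128
rate = 10 ** 18  # keys per second -- an assumed, very generous figure
years = keys / (rate * SECONDS_PER_YEAR)
print(f"{years:.2e} years")  # on the order of 10**13 years
```

That's far longer than the age of the universe, which is why the brute-force threat model isn't what worries cryptographers about AES.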
Re: Goodbye exaflop processing...hello quantum compute
"The mere fact that they still use AES128 and AES256 means those two algorithms are reasonably secure from attack even by their own technology."
Perhaps safe from a 256-qubit D-Wave. How do you know they haven't increased their encryption level with the arrival of the new D-Wave? It would make sense.
Acrobat- Number of posts : 316
Re: Goodbye exaflop processing...hello quantum compute
Because they haven't even begun a new series of tests on new encryption algorithms? The last time they revised their encryption algorithms was when DES was being replaced by AES. It was a five year process involving security experts from all over the world, not something done in secret. AES was not invented by the NSA. It was the Rijndael algorithm invented by Joan Daemen and Vincent Rijmen. Twofish was one of the other finalists. Rijndael was selected and renamed to AES. If they were involved in selecting a new algorithm, security experts around the world would be competing once again to have their algorithm chosen.
Aside from that, I think you're a little misinformed about the D-Wave computer and its capabilities. Everything you're saying about quantum computing is based on the theoretical capabilities of a quantum computer. All current research indicates that an "ideal" quantum computer still requires a major breakthrough in physics. The D-Wave may be faster at certain tasks than a regular old computer (even that is debatable, apparently), but at others regular old computers match or surpass its computing speeds. It simply is not the breakthrough you think it is. Read wikipedia:
D-Wave was originally criticized by some scientists in the quantum computing field. However, on May 16, 2013 NASA and Google, together with a consortium of universities, announced a partnership with D-Wave to investigate how D-Wave's computers could be used in the creation of artificial intelligence. Prior to announcing this partnership, NASA, Google, and Universities Space Research Association put a D-Wave computer through a series of benchmark and acceptance tests which it passed.[2] Independent researchers found that D-Wave's computers can solve some problems as much as 3,600 times faster than particular software packages running on digital computers.[2] Other independent researchers found that different software packages running on a single core of a desktop computer can solve those same problems as fast or faster than D-Wave's computers (at least 12,000 times faster for Quadratic Assignment problems, and between 1 and 50 times faster for Quadratic Unrestrained Binary Optimization problems).[32]
In 2007 Umesh Vazirani, a professor at UC Berkeley and one of the founders of quantum complexity theory, made the following criticism:[33]
Their claimed speedup over classical algorithms appears to be based on a misunderstanding of a paper my colleagues van Dam, Mosca and I wrote on "The power of adiabatic quantum computing." That speed up unfortunately does not hold in the setting at hand, and therefore D-Wave's "quantum computer" even if it turns out to be a true quantum computer, and even if it can be scaled to thousands of qubits, would likely not be more powerful than a cell phone.
Wim van Dam, a professor at UC Santa Barbara, summarized the scientific community consensus as of 2008 in the journal Nature Physics:[34]
At the moment it is impossible to say if D-Wave's quantum computer is intrinsically equivalent to a classical computer or not. So until more is known about their error rates, caveat emptor is the least one can say.
An article in the May 12, 2011 edition of Nature gives details which critical academics say proves that the company's chips do have some of the quantum mechanical properties needed for quantum computing.[35][36] Prior to the 2011 Nature paper, D-Wave was criticized for lacking proof that its computer was in fact a quantum computer. Nevertheless, questions remain due to the lack of conclusive experimental proof of quantum entanglement inside D-Wave devices.[37]
MIT professor Scott Aaronson, self-described "Chief D-Wave Skeptic", originally said that D-Wave's demonstrations did not prove anything about the workings of the computer. He said that a useful quantum computer would require a huge breakthrough in physics, which has not been published or shared with the physics community.[38] Aaronson updated his views in May 2011, announcing that he was "retiring as Chief D-Wave Skeptic",[39] and reporting his "skeptical but positive" views based on a visit to D-Wave in February 2012. Aaronson said one of the most important reasons for his new position on D-Wave was the aforementioned article in Nature.[37][40][41] On May 16, 2013 he resumed his skeptic post. He now criticizes D-Wave for blowing results out of proportion in press releases that claim speedups of three orders of magnitude, while a recently published paper by scientists from ETH Zurich, who had access to a 128-qubit D-Wave computer, outperformed it by a factor of 15 using regular digital computers and applying classical metaheuristics (particularly simulated annealing) to the problem that D-Wave's computer is specifically designed to solve.[21]
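For anyone wondering what "classical simulated annealing on a QUBO" even means, here's a toy Python sketch (this is not the ETH Zurich code, just an illustration of the technique on a made-up 3-variable instance):

```python
import random, math

# Tiny QUBO (Quadratic Unconstrained Binary Optimization) instance:
# minimize sum of Q[i,j] * x[i] * x[j] over binary variables x.
Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): 2.0, (0, 1): 2.0, (1, 2): -3.0}

def energy(x):
    return sum(w * x[i] * x[j] for (i, j), w in Q.items())

def anneal(n_bits=3, steps=5000, t0=2.0):
    random.seed(0)
    x = [random.randint(0, 1) for _ in range(n_bits)]
    e = energy(x)
    best, best_e = x[:], e
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9   # linear cooling schedule
        i = random.randrange(n_bits)
        x[i] ^= 1                            # propose: flip one bit
        e_new = energy(x)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new
            if e < best_e:
                best, best_e = x[:], e
        else:
            x[i] ^= 1                        # reject: flip back
    return best, best_e

print(anneal())  # should find the true minimum, [0, 1, 1] with energy -2.0
```

On toy sizes like this, an ordinary CPU runs such heuristics essentially instantly, which is the point of the ETH Zurich comparison.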
Re: Goodbye exaflop processing...hello quantum compute
BTW - If D-Wave works, it's better at code breaking than conventional computers that have exponentially more transistors.
512 transistors in a binary chip would be one of those giant warehouse computers that now exist in a hand-held calculator. Intel's chips have billions of transistors in their newer stuff...their newest having 2.5 billion transistors.
"As of 2012, the highest transistor count in a commercially available CPU is over 2.5 billion transistors, in Intel's 10-core Xeon Westmere-EX. Xilinx currently holds the 'world-record' for an FPGA containing 6.8 billion transistors."
A quantum chip with only 512 qubits crushes those in code-breaking ability.
And I don't think I am overstating anything. No publicly available encryption is safe at the present time, at least from the government.
The D-Wave 512 costs tens of millions of dollars and has to be run at close to absolute zero to function.
Acrobat- Number of posts : 316
Re: Goodbye exaflop processing...hello quantum compute
I'll say it again. Everything you're saying has to do with the theoretical capabilities of quantum computers, not the present real capabilities. You need to do some more reading that doesn't come from D-Wave's marketing department.
Re: Goodbye exaflop processing...hello quantum compute
Let's just forget about all this and talk about something else...no need to argue over it.
Re: Goodbye exaflop processing...hello quantum compute
Meh, I already cracked the encryption on vader's suit and de-helmeted him.
Re: Goodbye exaflop processing...hello quantum compute
D-Wave's Quantum Computer Proven To Be Real Thing (Almost): does not obey classical physics
University of Southern California published a paper that comes that much closer to showing the D-Wave is indeed a quantum computer.
"Our research rules out one type of classical model that has been argued as a proper description of the D-Wave machine. A lot of people thought that when D-Wave came on the market their machine was just doing that, but we ruled that out."
http://www.wired.com/wiredenterprise/2013/06/d-wave-quantum-computer-usc/
Acrobat- Number of posts : 316