Kate (OP) - Senior Member - Joined: Oct 2004 - Posts: 334
Well, it does sound mysterious. Apparently, researchers are still "trying to characterize the chip," whatever that means.

Google says it has developed a kind of quantum computer capable of identifying objects that appear in digital photos and videos. According to the company, the system outperforms the classical algorithms running across its current network of worldwide data centers.

Hartmut Neven, Google's technical lead manager for image recognition, recently unveiled the company's ongoing quantum computing work with a post to its research blog, saying he was due to demonstrate the technology at last week's Neural Information Processing Systems conference in Vancouver.

"Many Google services we offer depend on sophisticated artificial intelligence technologies such as machine learning or pattern recognition," Neven writes. "If one takes a closer look at such capabilities one realizes that they often require the solution of what mathematicians call hard combinatorial optimization problems. It turns out that solving the hardest of such problems requires server farms so large that they can never be built.

"A new type of machine, a so-called quantum computer, can help here."

Hartmut Neven joined Google in 2006, when the web giant acquired his image search startup, Neven Vision. In 2007, at the SC07 supercomputing conference, Neven joined D-Wave in demonstrating the Canadian company's alleged quantum computer, and he now confirms that Google has spent the past three years working in tandem with D-Wave on a quantum system designed to identify images.

"Google has studied how problems such as recognizing an object in an image or learning to make an optimal decision based on example data can be made amenable to solution by quantum algorithms," he says. "These algorithms promise to find higher quality solutions for optimization problems than obtainable with classical solvers."

With a classical computer, the basic storage unit is the bit. Each bit is stored in a classical system - such as a capacitor - and the value of each bit is either a 1 or a 0. A charged capacitor might be a 1, an uncharged capacitor a 0.

But even if you're pooling a worldwide network of servers, classical computing has its limits. There are only so many 1s and 0s you can juggle at any given time. In the early '80s, American physicist/celebrity boffin Richard Feynman proposed the idea of a computer based on quantum mechanics - a system that would use the principles associated with very small particles to break through the glass ceiling of classical computing - and over the last quarter century, scientists have struggled to actually build the thing.

A quantum computer stores information in qubits (yes, quantum bits). These aren't stored in a capacitor or any other classical system. They're stored in a quantum system, such as, say, the magnetic spin of an atomic nucleus. An "up" spin might indicate a 1, a "down" spin a 0.

The trick is that such a system benefits from quantum superposition. The magnetic spin of that nucleus can be both up and down at the same time. Where two bits can only hold one of four possible values (00, 01, 10, and 11), two qubits can hold all four values at once.
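You can see "all four values at once" directly by simulating the state vector - a minimal NumPy sketch (an ordinary classical simulation, of course, not a quantum computer):

    # Classical simulation of two qubits: a Hadamard gate on each
    # produces an equal superposition over all four basis states.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
    ket00 = np.array([1.0, 0.0, 0.0, 0.0])         # both qubits start at 0

    state = np.kron(H, H) @ ket00                  # apply H to each qubit
    print(state)  # [0.5 0.5 0.5 0.5] -> weight on 00, 01, 10 and 11 at once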

The rub is that if a quantum system interacts with the classical world, it decoheres and randomly collapses into just one of its possible states. Two qubits may be able to hold four values at once, but if you actually try to read them, you're left with only one of the four. If you read a qubit, it's no longer a qubit.
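Continuing the sketch above, a read is a sample, not a lookup - the Born rule turns amplitudes into probabilities, and the superposition is gone:

    # Measuring collapses the state: one outcome, drawn with
    # probability |amplitude|**2 (the Born rule).
    probs = np.abs(state) ** 2                     # [0.25 0.25 0.25 0.25]
    outcome = np.random.choice(["00", "01", "10", "11"], p=probs)
    print(outcome)  # a single basis state; the other three are lost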

Countless projects have proposed ways around this rather inconvenient reality, and some have succeeded - on a very small scale. In February 2007, D-Wave demonstrated what it called a 16-qubit quantum computer, tapping a quantum mechanical concept known as the adiabatic theorem. In essence, the theorem says that a quantum system starting in its lowest-energy state will stay in that state as long as changes to its environment are very gradual - rapid changes, by contrast, can kick it into other states.
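In the standard adiabatic-algorithm setup - the one due to Farhi and collaborators, mentioned below - the "very gradual change" is an interpolation between two Hamiltonians:

    H(s) = (1 - s)\, H_B + s\, H_P, \qquad s: 0 \to 1

The machine is prepared in the easy ground state of H_B; if s is swept slowly enough, the adiabatic theorem keeps the system in the instantaneous ground state, so it finishes in the ground state of H_P, which is constructed to encode the answer to the optimization problem. "Slowly enough" scales roughly as 1/g_min^2, where g_min is the smallest energy gap encountered along the sweep - and that gap is where the computational hardness can hide.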

Many have questioned whether D-Wave's machine is truly a quantum computer. And Google's Neven at least acknowledges the controversy. "[The D-Wave] design realizes what is known as the Ising model, which represents the simplest model for an interacting many-body system, and it can be manufactured using proven chip fabrication methods," he says.

"Unfortunately, it is not easy to demonstrate that a multi-qubit system such as the D-Wave chip indeed exhibits the desired quantum behavior and experimental physicists from various institutions are still in the process of characterizing the chip."

Nonetheless, Google sees value in the project, and after three years of work, the search giant cum world power says it has achieved significant performance gains in certain types of image recognition. Its quantum adiabatic algorithms are based on those discovered by MIT physicist Edward Farhi and his collaborators.

"There are still many open questions but in our experiments we observed that this detector performs better than those we had trained using classical solvers running on the computers we have in our data centers today," Neven writes.

"Besides progress in engineering synthetic intelligence we hope that improved mastery of quantum computing will also increase our appreciation for the structure of reality as described by the laws of quantum physics."

http://www.theregister.co.uk/2009/12/15/google_quantum_computing_research/

T - Megastar - Joined: Jun 2005 - Posts: 1,940
Wow! Just WOW!

R - Megastar - Joined: Feb 2007 - Posts: 1,840
Well, yes, it could turn out to be a "Wow!" case. On the other hand, it seems that the educated skepticism regarding the viability of D-Wave's adiabatic technology is unabated.

http://www.technologyreview.com/blog/post.aspx?bid=349&bpid=24543


"Time is what prevents everything from happening at once" - John Wheeler
