Latest Research Validates Quantum Entanglement in D-Wave Systems

Further Evidence of Quantum Nature of D-Wave System

BURNABY, BC -- (Marketwired) -- 05/30/14 -- Research published today presents groundbreaking evidence verifying the presence of entanglement in D-Wave's commercially available quantum computer. The paper, entitled "Entanglement in a quantum annealing processor" and authored by scientists at D-Wave and the University of Southern California, has been published in the peer-reviewed journal Physical Review X (PRX).

The results of the research demonstrate the presence of an essential element of an operating quantum computer: entanglement, the phenomenon in which the quantum states of a collection of particles (or qubits) become linked to one another. The research demonstrates entanglement in two- and eight-qubit subsections of one of D-Wave's 512-qubit processors (a record for a solid-state quantum processor) throughout the critical stages of a quantum annealing algorithm.
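The linkage described above can be illustrated with a few lines of linear algebra on generic two-qubit states (an illustration only, not D-Wave's hardware or method): for an entangled pure state, tracing out one qubit leaves the other in a mixed state, whereas for an unentangled product state the remaining qubit stays pure.

```python
import numpy as np

def reduced_density_matrix(psi):
    """Trace out qubit B from a two-qubit pure state, returning qubit A's 2x2 state."""
    rho = np.outer(psi, psi.conj())       # full 4x4 density matrix
    r = rho.reshape(2, 2, 2, 2)           # indices: (a, b, a', b')
    return np.trace(r, axis1=1, axis2=3)  # summing over b = b' traces out qubit B

bell = np.array([1.0, 0, 0, 1.0]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2): entangled
product = np.array([1.0, 0, 0, 0])              # |00>: an unentangled product state

# Purity Tr(rho_A^2) of the reduced state: 1.0 for a product state,
# 0.5 (maximally mixed) when the pair is maximally entangled.
purity = {}
for name, psi in [("bell", bell), ("product", product)]:
    rho_a = reduced_density_matrix(psi)
    purity[name] = np.trace(rho_a @ rho_a).real
print(purity)
```

The drop in purity of the reduced state is exactly the sense in which an entangled qubit's state cannot be described on its own.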

Dr. Federico Spedalieri of USC's Viterbi Information Sciences Institute, a co-author of the paper, played a crucial role in developing the framework for this research. "There's no way around it. Only quantum systems can be entangled. This test provides the experimental proof that we've been looking for," said Dr. Spedalieri.

"The research published in PRX is a significant milestone for D-Wave and a major step forward for the science of quantum computing. The findings are further proof of the quantum nature of our technology," said Vern Brownell, CEO of D-Wave. "Building and improving the science of our technology in collaboration with the greater scientific community is important to us and we'll continue to conduct research that enables us to better understand the characteristics and power of our quantum processor."

The PRX paper provides four levels of evidence that the eight-qubit unit cell is entangled:

(a) a demonstration of an avoided crossing of two energy levels,
(b) a partial reconstruction of the system's density matrix, with calculations of standard entanglement measures,
(c) calculations of an entanglement witness using measured populations and energy spectra of the system,
(d) measurements of a susceptibility-based entanglement witness, which reports entanglement of the ground state.
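The idea behind an entanglement witness, as in (c) and (d), can be sketched with a generic textbook example rather than the susceptibility-based witness used in the paper: a fidelity-based witness W for a two-qubit Bell state, where a negative expectation value Tr(W·rho) certifies entanglement.

```python
import numpy as np

# Fidelity-based witness: W = I/2 - |phi+><phi+|. A negative expectation
# value Tr(W rho) certifies entanglement for states near the Bell state.
# This is a generic illustration, not the witness measured in the PRX paper.
phi = np.array([1.0, 0, 0, 1.0]) / np.sqrt(2)   # |phi+> = (|00> + |11>)/sqrt(2)
W = np.eye(4) / 2 - np.outer(phi, phi)

rho_bell = np.outer(phi, phi)   # maximally entangled state
rho_mixed = np.eye(4) / 4       # maximally mixed state: separable

w_bell = np.trace(W @ rho_bell).real    # negative: entanglement detected
w_mixed = np.trace(W @ rho_mixed).real  # positive: witness detects nothing
print(w_bell, w_mixed)
```

Because Tr(W·rho) is non-negative for every separable state, a single measured negative value suffices to rule out any classical (separable) description of the system.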

These findings demonstrate entanglement within D-Wave's processors at the most critical stages of the quantum annealing procedure.

D-Wave will perform additional research addressing the extent of spatial entanglement and will also continue to explore the computational advantages of quantum algorithms. D-Wave has published more than 70 peer-reviewed papers to date.

The paper published today is available on the PRX website.

About D-Wave Systems Inc.
Founded in 1999, D-Wave's mission is to integrate new discoveries in physics and computer science into breakthrough approaches to computation. The company's flagship product, the 512-qubit D-Wave Two™ computer, is built around a novel type of superconducting processor that uses quantum mechanics to massively accelerate computation.

In 2013, D-Wave announced the installation of a D-Wave Two system at the new Quantum Artificial Intelligence Lab created jointly by NASA, Google and USRA. This came soon after Lockheed Martin's purchase of an upgrade of their 128-qubit D-Wave One™ system to a 512-qubit D-Wave Two computer. With headquarters near Vancouver, Canada, the D-Wave U.S. offices are located in Palo Alto, California and Vienna, Virginia. D-Wave has a blue-chip investor base including Bezos Expeditions, Business Development Bank of Canada, Draper Fisher Jurvetson, Goldman Sachs, Growthworks, Harris & Harris Group, In-Q-Tel, International Investment and Underwriting, and Kensington Partners Limited. For more information, visit: www.dwavesys.com.
