
Archive for the ‘Quantum Computer’ Category

Global QC Market Projected to Grow to More Than $800 million by 2024 – HPCwire

Posted: October 3, 2020 at 5:59 am

without comments

The Quantum Economic Development Consortium (QED-C) and Hyperion Research are projecting that the global quantum computing (QC) market worth an estimated $320 million in 2020 will grow at an anticipated 27% CAGR between 2020 and 2024, reaching approximately $830 million by 2024.

This estimate is based on surveys of 135 US-based quantum computing researchers, developers and suppliers across the academic, commercial and government sectors. Supplemental data and insights came from a companion effort that surveyed 115 current and potential quantum computing users in North America, Europe and the Asia/Pacific region on their expectations, schedules and budgets for the use of quantum computing in their existing and planned computational workloads.

(Keeping track of the various quantum computing organizations is becoming a challenge in itself. The Quantum Economic Development Consortium (QED-C) is a consortium of stakeholders that aims to enable and grow the U.S. quantum industry. QED-C was established with support from the National Institute of Standards and Technology (NIST) as part of the Federal strategy for advancing quantum information science and as called for by the National Quantum Initiative Act enacted in 2018.)

Additional results from the study:

"Based on our study and related forecast, there is a growing, vibrant, and diverse US-based QC research, development, and commercial ecosystem that shows the promise of maturing into a viable, if not profitable and self-sustaining, industry. That said, it is too early to start picking winners and losers from either a technology or commercial perspective," said Bob Sorensen, quantum analyst for Hyperion Research.

"A key driver for commercial success could be the ability of any vendor to ease the requirements needed to integrate QC technology into a larger HPC and enterprise IT user base while still supporting advanced QC-related research for a more targeted, albeit smaller, class of end-user scientists and engineers. This sector is not for the faint of heart, but this forecast gives some sense of what is at stake here, at least for the next few years," noted Sorensen.

Source: QED-C

QED-C commissioned and collaborated with Hyperion Research to develop this market forecast to help inform decision making for QC technology developers and suppliers, national-level QC-related policy makers, potential QC users in both the advanced computing and enterprise IT marketplace, investors, and commercial QC funding organizations. This is a baseline estimate, and Hyperion Research and QED-C are looking to provide periodic updates of their QC market forecast as events, information, or decision-making requirements dictate. Contact: Celia Merzbacher, QED-C Deputy Director, [emailprotected]

See original here:

Global QC Market Projected to Grow to More Than $800 million by 2024 - HPCwire

Written by admin

October 3rd, 2020 at 5:59 am

Posted in Quantum Computer

Berkeley Lab Technologies Honored With 7 R&D 100 Awards – Lawrence Berkeley National Laboratory

Posted: at 5:59 am

without comments

Innovative technologies from Lawrence Berkeley National Laboratory (Berkeley Lab) to achieve higher energy efficiency in buildings, make lithium batteries safer and higher performing, and secure quantum communications were some of the inventions honored with R&D 100 Awards by R&D World magazine.

For more than 50 years, the annual R&D 100 Awards have recognized the 100 technologies of the past year deemed most innovative and disruptive by an independent panel of judges. The full list of winners, announced by parent company WTWH Media LLC, is available at the R&D World website.

Berkeley Lab's award-winning technologies are described below.

A Tool to Accelerate Electrochemical and Solid-State Innovation

(from left) Adam Weber, Nemanja Danilovic, Douglas Kushner, and John Petrovick (Credit: Berkeley Lab)

Berkeley Lab scientists invented a microelectrode cell to analyze and test electrochemical systems with solid electrolytes. Thanks to significant cost and performance advantages, this tool can accelerate development of critical applications such as energy storage and conversion (fuel cells, batteries, electrolyzers), carbon capture, desalination, and industrial decarbonization.

Solid electrolytes have been displacing liquid electrolytes as the focus of electrochemical innovation because of their performance, safety, and cost advantages. However, the lack of effective methods and equipment for studying solid electrolytes has hindered advancement of the technologies that employ them. This microelectrode cell meets the testing needs, and is already being used by Berkeley Lab scientists.

The development team includes Berkeley Lab researchers Adam Weber, Nemanja Danilovic, Douglas Kushner, and John Petrovick.

Matter-Wave Modulating Secure Quantum Communicator (MMQ-Com)

Information transmitted by MMQ-Com is impervious to security breaches. (Credit: Alexander Stibor/Berkeley Lab)

Quantum communication, cybersecurity, and quantum computing are growing global markets. But the safety of our data is in peril given the rise of quantum computers that can decode classical encryption schemes.

The Matter-Wave Modulating Secure Quantum Communicator (MMQ-Com) technology is a fundamentally new kind of secure quantum information transmitter. It transmits messages by modulating electron matter-waves without changing the pathways of the electrons. This secure communication method is inherently impervious to any interception attempt.

A novel quantum key distribution scheme also ensures that the signal is protected from spying by other quantum devices.

The development team includes Alexander Stibor of Berkeley Lab's Molecular Foundry along with Robin Röpke and Nicole Kerker of the University of Tübingen in Germany.

Solid Lithium Battery Using Hard and Soft Solid Electrolytes

(from left) Marca Doeff, Guoying Chen, and Eongyu Yi (Credit: Berkeley Lab)

The lithium battery market is expected to grow from more than $37 billion in 2019 to more than $94 billion by 2025. However, the liquid electrolytes used in most commercial lithium-ion batteries are flammable and limit the ability to achieve higher energy densities. Safety issues continue to plague the electronics markets, as often-reported lithium battery fires and explosions result in casualties and financial losses.

In Berkeley Lab's solid lithium battery, the organic electrolytic solution is replaced by two solid electrolytes, one soft and one hard, and lithium metal is used in place of the graphite anode. In addition to eliminating battery fires, incorporation of a lithium metal anode with a capacity 10 times higher than graphite (the conventional anode material in lithium-ion batteries) provides much higher energy densities.

The technology was developed by Berkeley Lab scientists Marca Doeff, Guoying Chen, and Eongyu Yi, along with collaborators at Montana State University.

Porous Graphitic Frameworks for Sustainable High-Performance Li-Ion Batteries

High-resolution transmission electron microscopy images of the Berkeley Lab PGF cathode reveal (at left) a highly ordered honeycomb structure within the 2D plane, and (at right) layered columnar arrays stacked perpendicular to the 2D plane. (Credit: Yi Liu/Berkeley Lab)

The Porous Graphitic Frameworks (PGF) technology is a lithium-ion battery cathode that could outperform today's cathodes in sustainability and performance.

In contrast to commercial cathodes, organic PGFs pose fewer risks to the environment because they are metal-free and composed of earth-abundant, lightweight organic elements such as carbon, hydrogen, and nitrogen. The PGF production process is also more energy-efficient and eco-friendly than other cathode technologies because they are prepared in water at mild temperatures, rather than in toxic solvents at high temperatures.

PGF cathodes also display stable charge-discharge cycles with ultrahigh capacity and record-high energy density, both higher than those of any known commercial inorganic or organic cathode.

The development team includes Yi Liu and Xinie Li of Berkeley Lab's Molecular Foundry, as well as Hongxia Wang and Hao Chen of Stanford University.

Building Efficiency Targeting Tool for Energy Retrofits (BETTER)

The buildings sector is the largest source of primary energy consumption (40%) and ranks second after the industrial sector as a global source of direct and indirect carbon dioxide emissions from fuel combustion. According to the World Economic Forum, nearly one-half of all energy consumed by buildings could be avoided with new energy-efficient systems and equipment.

(from left) Carolyn Szum (Lead Researcher), Han Li, Chao Ding, Nan Zhou, Xu Liu (Credit: Berkeley Lab)

The Building Efficiency Targeting Tool for Energy Retrofits (BETTER) allows municipalities, building and portfolio owners and managers, and energy service providers to quickly and easily identify the most effective cost-saving and energy-efficiency measures in their buildings. With an open-source, data-driven analytical engine, BETTER uses readily available building and monthly energy data to quantify energy, cost, and greenhouse gas reduction potential, and to recommend efficiency interventions at the building and portfolio levels to capture that potential.
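The kind of screening BETTER's analytical engine performs on monthly energy data can be sketched in a few lines. The building data and the simple best-month baseload heuristic below are illustrative assumptions, not BETTER's actual model.

```python
# A minimal sketch of the kind of screening a tool like BETTER performs on
# monthly energy data: split annual use into a constant baseload and a
# weather-sensitive remainder that retrofits could target. The building data
# and the best-month heuristic are illustrative assumptions only.
monthly_kwh = [5200, 4900, 4300, 3800, 3900, 4600, 5100, 4400,
               3850, 4150, 4700, 5050]  # hypothetical 12 months of bills

def screen(monthly):
    """Return (annual total, annualized baseload, weather-sensitive remainder)."""
    baseload = min(monthly) * len(monthly)  # best month, annualized
    total = sum(monthly)
    return total, baseload, total - baseload

total, base, weather = screen(monthly_kwh)
print(total, base, round(weather / total, 2))  # 53950 45600 0.15
```

A real engine fits change-point models against outdoor temperature rather than taking the best month, but the decomposition idea is the same: only the weather-sensitive share is addressable by HVAC and envelope retrofits.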

It is estimated that BETTER will help reduce about 165.8 megatons of carbon dioxide equivalent (MtCO2e) globally by 2030. This is equivalent to the CO2 sequestered by growing 2.7 billion tree seedlings for 10 years.
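The tree-seedling equivalence can be sanity-checked with a quick calculation. The per-seedling factor below (roughly 0.06 metric tons of CO2 sequestered over 10 years, a commonly cited EPA equivalence) is an assumption, not a figure from the article.

```python
# Sanity-checking the tree-seedling equivalence quoted above. The factor of
# ~0.06 metric tons CO2 sequestered per seedling grown for 10 years is an
# assumed, commonly cited EPA equivalence, not a figure from the article.
TONS_CO2_PER_SEEDLING_10YR = 0.06

def seedlings_equivalent(megatons_co2e):
    """Convert megatons of CO2e into an equivalent count of tree seedlings."""
    return megatons_co2e * 1_000_000 / TONS_CO2_PER_SEEDLING_10YR

print(round(seedlings_equivalent(165.8) / 1e9, 2))  # 2.76 billion, ~ the 2.7 quoted
```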

The development team includes Berkeley Lab scientists Nan Zhou, Carolyn Szum, Han Li, Chao Ding, Xu Liu, and William Huang, along with collaborators from Johnson Controls and ICF.

AmanziATS: Modeling Environmental Systems Across Scales

Simulated surface and subsurface water from Amanzi-ATS hydrological modeling of the Copper Creek sub-catchment in the East River, Colorado watershed. (Credit: Zexuan Xu/Berkeley Lab, David Moulton/Los Alamos National Laboratory)

Scientists use computer simulations to predict the impact of wildfires on water quality, or to monitor cleanup at nuclear waste remediation sites by portraying fluid flow across Earth compartments. The Amanzi-Advanced Terrestrial Simulator (ATS) enables them to replicate or couple multiple complex and integrated physical processes controlling these flowpaths, making it possible to capture the essential physics of the problem at hand.

"Specific problems require taking an individual approach to simulations," said Sergi Molins, principal investigator at Berkeley Lab, which contributed expertise in geochemical modeling to the software's development. "Physical processes controlling how mountainous watersheds respond to disturbances such as climate- and land-use change, extreme weather, and wildfire are far different than the physical processes at play when an unexpected storm suddenly impacts groundwater contaminant levels in and around a nuclear remediation site. Amanzi-ATS allows scientists to make sense of these interactions in each individual scenario."

The code is open-source and capable of being run on systems ranging from a laptop to a supercomputer. Led by Los Alamos National Laboratory, Amanzi-ATS is jointly developed by researchers from Los Alamos National Laboratory, Oak Ridge National Laboratory, Pacific Northwest National Laboratory, and Berkeley Lab researchers including Sergi Molins, Marcus Day, Carl Steefel, and Zexuan Xu.

Institute for the Design of Advanced Energy Systems (IDAES)

The U.S. Department of Energy's (DOE's) Institute for the Design of Advanced Energy Systems (IDAES) project develops next-generation computational tools for process systems engineering (PSE) of advanced energy systems, enabling their rapid design and optimization.

IDAES Project Team (Credit: Berkeley Lab)

By providing rigorous modeling capabilities, the IDAES Modeling & Optimization Platform helps energy and process companies, technology developers, academic researchers, and DOE to design, develop, scale up, and analyze new and potential PSE technologies and processes to accelerate advances and apply them to address the nation's energy needs. The IDAES platform is also a key component in the National Alliance for Water Innovation, a $100 million, five-year DOE innovation hub led by Berkeley Lab, which will examine the critical technical barriers and research needed to radically lower the cost and energy of desalination.

Led by National Energy Technology Laboratory, IDAES is a collaboration with Sandia National Laboratories, Berkeley Lab, West Virginia University, Carnegie Mellon University, and the University of Notre Dame. The development team at Berkeley Lab includes Deb Agarwal, Oluwamayowa (Mayo) Amusat, Keith Beattie, Ludovico Bianchi, Josh Boverhof, Hamdy Elgammal, Dan Gunter, Julianne Mueller, Jangho Park, Makayla Shepherd, Karen Whitenack, and Perren Yang.

# # #

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 13 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab's facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy's Office of Science.

DOEs Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit

More here:

Berkeley Lab Technologies Honored With 7 R&D 100 Awards - Lawrence Berkeley National Laboratory

Written by admin

October 3rd, 2020 at 5:59 am

Posted in Quantum Computer

oneAPI Academic Center of Excellence Established at the Heidelberg University Computing Center (URZ) – HPCwire

Posted: at 5:59 am

without comments

Sept. 29, 2020 A oneAPI Academic Center of Excellence (CoE) is now established at the Heidelberg University Computing Center (URZ). The new CoE will conduct research supporting the oneAPI industry initiative to create a uniform, open programming model for heterogeneous computer architectures.

A common language for heterogeneous computing

URZ will focus its research and programming efforts on a fundamental high-performance computing (HPC) challenge: modern computers utilize different types of hardware for different calculations. Accelerators, including graphics processing units (GPUs) and field programmable gate arrays (FPGAs), are used in combination with general compute processors (CPUs). Using different types of hardware makes computers very powerful and provides versatility for a wide range of situations and workloads. However, hardware heterogeneity complicates software development for these computers, especially when specialized components from a variety of vendors are used in tandem.

One major reason for this complication is that many accelerated compute architectures require their own programming models. Therefore, software developers need to learn and use a different and sometimes proprietary language for each processing unit in a heterogeneous system, which increases complexity and limits flexibility.

oneAPI's cross-architecture language Data Parallel C++ (DPC++), based on the Khronos Group's SYCL standard for heterogeneous programming in C++, overcomes these challenges with its single, unified, open development model for performant and productive heterogeneous programming and cross-vendor support.

Developing for Heterogeneous Systems: advancing features and capabilities, maximizing interoperability

URZ's work as a oneAPI CoE will add advanced DPC++ capabilities into hipSYCL, which supports systems based on AMD GPUs, NVIDIA GPUs, and CPUs. New DPC++ extensions are part of the SYCL 2020 provisional specification that brings features such as unified shared memory to hipSYCL and the platforms it supports, furthering the promise of oneAPI application support across system architectures and vendors.

URZ HPC technical specialist Aksel Alpay, who created hipSYCL, leads its ongoing development. "The whole project is quite ambitious," says Alpay, venturing a look into the future. "hipSYCL is an academic research project as well as a development project, where the final product will be used in production operations. It is incredibly exciting to bring DPC++ and SYCL 2020 capabilities to additional architectures, such as AMD GPUs."

To expedite the research, URZ researchers and developers will access an international network of experts at Intel and numerous academic and government institutions, a great advantage in advancing hipSYCL's capabilities and furthering the goal of the oneAPI initiative. "For a scientific computing center to have access to this level of expertise and work together on an open standard with partners from around the globe is a wonderful prospect," states Heidelberg University's CIO and URZ director Prof. Dr. Vincent Heuveline, who is a major proponent of the CoE. In addition to being the university's main liaison for the center, he will function as its scientific advisor.

"One of our strategic goals is to make a measurable contribution to the transfer of new technologies from research to industrial application, and of course to continuously expand our expertise and research efforts in the field of supercomputing. The oneAPI CoE will allow us to do both," explains Heuveline.

"oneAPI is a true cross-industry initiative that seeks to simplify development of diverse workloads by streamlining code re-use across a variety of architectures through an open and collaborative approach. URZ's research helps to deliver on the cross-vendor promise of oneAPI by expanding advanced DPC++ application support to other architectures," says Dr. Jeff McVeigh, Intel vice president of Datacenter XPU Products and Solutions.

About oneAPI

oneAPI is an industry initiative to create a single, unified, cross-architecture programming model for CPUs + accelerator architectures. Based on industry standards and its open development approach, the initiative will help streamline software development for high performance computers, increase performance, and provide specifications for efficient and diverse architecture programming.

Learn more

Source: Heidelberg University

See more here:

oneAPI Academic Center of Excellence Established at the Heidelberg University Computing Center (URZ) - HPCwire

Written by admin

October 3rd, 2020 at 5:59 am

Posted in Quantum Computer

Baidu offers quantum computing from the cloud – VentureBeat

Posted: September 26, 2020 at 9:52 am

without comments

Following its developer conference last week, Baidu today detailed Quantum Leaf, a new cloud quantum computing platform designed for programming, simulating, and executing quantum workloads. It's aimed at providing a programming environment for quantum-infrastructure-as-a-service setups, Baidu says, and it complements the Paddle Quantum development toolkit the company released earlier this year.

Experts believe that quantum computing, which at a high level entails the use of quantum-mechanical phenomena like superposition and entanglement to perform computation, could one day accelerate AI workloads. Moreover, AI continues to play a role in cutting-edge quantum computing research.

Baidu says a key component of Quantum Leaf is QCompute, a Python-based open source development kit with a hybrid programming language and a high-performance simulator. Users can leverage prebuilt objects and modules in the quantum programming environment, passing parameters to build and execute quantum circuits on the simulator or cloud simulators and hardware. Essentially, QCompute provides services for creating and analyzing circuits and calling the backend.
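The build-then-execute workflow described above, composing a circuit from prebuilt objects and running it on a simulator backend, can be illustrated with a toy statevector simulator. The class and method names below are illustrative only and are not QCompute's actual API.

```python
# A toy statevector simulator illustrating the general "compose a circuit,
# then execute it on a simulator backend" pattern that kits such as QCompute
# expose. Class and method names here are illustrative, not QCompute's API.
import math

class Circuit:
    def __init__(self, n_qubits):
        self.n = n_qubits
        self.state = [0j] * (2 ** n_qubits)
        self.state[0] = 1 + 0j  # start in the all-zeros basis state |00...0>

    def h(self, q):
        """Apply a Hadamard gate to qubit q, creating a superposition."""
        s = 1 / math.sqrt(2)
        new = list(self.state)
        for i in range(len(self.state)):
            if (i >> q) & 1 == 0:  # pair each |...0...> with its |...1...> partner
                j = i | (1 << q)
                a, b = self.state[i], self.state[j]
                new[i], new[j] = s * (a + b), s * (a - b)
        self.state = new
        return self

    def probabilities(self):
        """Measurement probabilities for each basis state."""
        return [abs(a) ** 2 for a in self.state]

# Build and "run" a 2-qubit circuit: a Hadamard on each qubit yields a
# uniform superposition over all four basis states.
probs = Circuit(2).h(0).h(1).probabilities()
print([round(p, 2) for p in probs])  # [0.25, 0.25, 0.25, 0.25]
```

Real kits follow the same shape but swap the local statevector for a call to a remote simulator or hardware backend.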

Quantum Leaf dovetails with Quanlse, which Baidu also detailed today. The company describes Quanlse as a cloud-based quantum pulse computing service that bridges the gap between software and hardware by providing a service to design and implement pulse sequences as part of quantum tasks. (Pulse sequences are a means of reducing quantum error, which results from decoherence and other quantum noise.) Quanlse works with both superconducting circuits and nuclear magnetic resonance platforms and will extend to new form factors in the future, Baidu says.

The unveiling of Quantum Leaf and Quanlse follows the release of Amazon Braket and Google's TensorFlow Quantum, a machine learning framework that can construct quantum data sets, prototype hybrid quantum and classical machine learning models, support quantum circuit simulators, and train discriminative and generative quantum models. Facebook's PyTorch relies on PennyLane, Xanadu's multi-contributor quantum computing project, a third-party library for quantum machine learning, automatic differentiation, and optimization of hybrid quantum-classical computations. And Microsoft offers several kits and libraries for quantum machine learning applications.

Read more:

Baidu offers quantum computing from the cloud - VentureBeat

Written by admin

September 26th, 2020 at 9:52 am

Posted in Quantum Computer

IBM Partners With HBCUs to Diversify Quantum Computing Workforce – Diverse: Issues in Higher Education

Posted: at 9:52 am

without comments

September 21, 2020

In partnership with historically Black colleges and universities (HBCUs), IBM recently launched a quantum computing research initiative to raise awareness of the field and diversify the workforce.

The IBM-HBCU Quantum Center, a multi-year investment, will fund undergraduate and graduate research, provide access to IBM quantum computers through the Cloud and offer student support.

Quantum computing is considered a fairly young field and quantum computers were not readily available in research labs until 2016. IBM was the first company to put a quantum computer on the Cloud, which allows it to be accessible from anywhere, according to Dr. Abraham Asfaw, global lead of Quantum Education and Open Science at IBM Quantum.

"What that implies is that now anyone around the world can participate," he said. "This is why we have this broad education effort, to really try and make quantum computing open and accessible to everyone. The scale of the industry is very small, but we are stepping in the right direction in terms of trying to get more people into the field."

The 13 HBCUs that will be part of the initiative include Albany State University, Clark Atlanta University, Coppin State University, Hampton University, Howard University, Morehouse College, Morgan State University, North Carolina Agricultural and Technical State University, Southern University, Texas Southern University, University of the Virgin Islands, Virginia Union University and Xavier University of Louisiana.

Each of the schools was chosen based on how much the school focused on science, technology, engineering and mathematics (STEM).

"It's very important at this point to be building community and to be educating everyone so that we have opportunities in the quantum computing field for everyone," said Asfaw, "while at the same time we are bringing in diverse perspectives to see where quantum computing applications could emerge."

Dr. Abraham Asfaw

The center encourages individuals from all STEM disciplines to pursue quantum computing. According to Asfaw, the field of quantum computing is considered highly interdisciplinary.

"Teaching quantum computing, at any place, requires bringing together several departments," he said. "So putting together a quantum curriculum is an exercise in making sure your students are trained in STEM all the way from the beginning to the end, with different pieces from the different sciences instead of just one department altogether."

Diversifying the quantum computing workforce can also be looked at in two ways. One is getting different groups of people into the field and the other is bringing different perspectives into the field from the direction of the other sciences that could benefit from quantum computing, according to Asfaw.

"We are in this discovery phase now, so really having help from all fields is a really powerful thing," he added.

IBM also plans to donate $100 million to provide more HBCUs with resources and technology as part of the Skills Academy Academic Initiative in Global University Programs. This includes providing HBCUs with university guest lectures, curriculum content, digital badges, software and faculty training by the end of 2020, according to IBM.

"Our entire quantum education effort is centered around making all of our resources open and accessible to everyone," said Asfaw. "[Our investment] is really an attempt to integrate HBCUs, which also are places of origin for so many successful scientists today, to give them opportunities to join the quantum computing revolution."

According to IBM, the skills academy is a comprehensive, integrated program designed to create a foundation of diverse and high-demand skill sets that directly correlate to what students will need in the workplace.

The academy will address topics such as artificial intelligence, cybersecurity, blockchain, design thinking and quantum computing.

Those HBCUs involved in the academy include Clark Atlanta University, Fayetteville State University, Grambling State University, Hampton University, Howard University, Johnson C. Smith University, Norfolk State University, North Carolina A&T State University, North Carolina Central University, Southern University System, Stillman College, Virginia State and West Virginia State University.

"While we are teaching quantum computing, while we are building quantum computing at universities, while we are training developers to take on quantum computers, it is important at this point to be as inclusive and accessible as possible," said Asfaw. "That really allows the field to progress."

This summer, IBM also hosted the 2020 Qiskit Global Summer School, which was designed for people to further explore the quantum computing field. The program involved three hours of lectures as well as hands-on learning opportunities. Many HBCU students were part of the program.

"This shows you that's one piece of the bigger picture of trying to get the whole world involved in quantum education," said Asfaw. "That's the first place where HBCUs were involved, and we hope to continue to build on even more initiatives going forward."

Sarah Wood can be reached at

Go here to see the original:

IBM Partners With HBCUs to Diversify Quantum Computing Workforce - Diverse: Issues in Higher Education

Written by admin

September 26th, 2020 at 9:52 am

Posted in Quantum Computer

IBM, Alphabet and well-funded startups in the race for quantum supremacy – IT Brief Australia

Posted: at 9:52 am

without comments

GlobalData, the worldwide data analytics firm, has released new research suggesting that many companies are joining the race for quantum supremacy, that is, to be the first to make significant headway with quantum computing.

Quantum computers are a step closer to reality for solving certain real-life problems that are beyond the capability of conventional computers, the analysts state.

However, the biggest challenge is that these machines must be able to manipulate several dozen quantum bits, or qubits, to achieve impressive computational performance.

As a result, a handful of companies have joined the race to increase the power of qubits and claim quantum supremacy, says GlobalData.

An analysis from GlobalData's Disruptor Intelligence Center reveals various companies in the race to monetise quantum computing as an everyday tool for business.

IBM's latest quantum computer, accessible via cloud, boasts a 65-qubit Hummingbird chip. It is an advanced version of System Q, its first commercial quantum computer launched in 2019 with 20 qubits. IBM plans to launch a 1,000-qubit system by the end of 2023.

Alphabet has built a 54-qubit processor, Sycamore, and demonstrated quantum supremacy by performing a random-number generation task in 200 seconds that it claims would take the most advanced supercomputer 10,000 years to finish.

The company also unveiled its newest 72-qubit quantum computer Bristlecone.

Alibaba's cloud service subsidiary Aliyun and the Chinese Academy of Sciences jointly launched an 11-qubit quantum computing service, which is available to the public on its quantum computing cloud platform.

Alibaba is the second enterprise to offer the service to the public, after IBM.

However, it's not only the tech giants that are noteworthy. GlobalData finds that well-funded startups have also targeted the quantum computing space to develop hardware, algorithms, and security applications.

Some of them are Rigetti, Xanadu, 1Qbit, IonQ, ISARA, Q-CTRL and QxBranch.

Amazon, unlike the tech companies competing to launch quantum computers, is making quantum products of other companies available to users via Braket.

It currently supports quantum computing services from D-Wave, IonQ and Rigetti.

GlobalData principal disruptive tech analyst Kiran Raj says, "Qubits allow the creation of algorithms that complete tasks with reduced computational complexity, something that cannot be achieved with traditional bits.

"Given such advantages, quantum computers can solve some of the intractable problems in cybersecurity, drug research, financial modelling, traffic optimisation and batteries, to name a few."

Raj says, "Albeit a far cry from large-scale mainstream use, quantum computers are gearing up to be a transformative reality. They are highly expensive to build, and it is hard to maintain the delicate state of superposition and entanglement of qubits.

"Despite such challenges, quantum computers will continue to progress into the future, where companies may rent them to solve everyday problems the way they currently rent cloud services.

"It may not come as a surprise if quantum computing one day replaces artificial intelligence as the mainstream technology to help industries tackle problems they never would have attempted to solve before."

Read more from the original source:

IBM, Alphabet and well-funded startups in the race for quantum supremacy - IT Brief Australia

Written by admin

September 26th, 2020 at 9:52 am

Posted in Quantum Computer

IBM plans to build a 1121 qubit system. What does this technology mean? – The Hindu

Posted: at 9:52 am

without comments


Last week, IBM said it will build Quantum Condor, a 1,121-qubit quantum computer, by the end of 2023. The company claims the system can control the behaviour of atoms to run applications and generate world-changing materials to transform industries. IBM says its full-stack quantum computer can be deployed via cloud and programmed from any part of the world.

The technology company is developing a super-fridge, internally codenamed Goldeneye, to house the computer. The 10-foot-tall and 6-foot-wide refrigerator is being designed for a million-qubit system.

What are qubits and quantum computers?

Quantum computers can, for certain problems, process data exponentially faster than personal computers do. They deploy non-intuitive methods, coupled with lots of computing, to solve intractable problems. These machines operate using qubits, analogous to bits in personal computers.

The similarity ends there. The way quantum machines solve a problem is very different from how a traditional machine does.

A classical computer solves a problem by brute force. Given a command, it attempts every possible move, one after another, backtracking at dead ends, until it finds a solution.

Quantum computers deploy superposition to solve problems. This allows them to exist in multiple states, and test all possible ways at once. And qubits, the fundamental units of data in quantum computing, enable these machines to compute this way.

In regular computers, a bit has a value of either 0 or 1, so two bits come in four possible combinations: 00, 01, 10 and 11. Only one combination can exist at a single point in time, which limits processing speed.

But in quantum machines, two qubits can represent the same four combinations, and all four can exist at the same time. This helps these systems run faster.

This means that n qubits can represent 2^n states. So, 2 qubits represent 4 states, 3 qubits 8 states, 4 qubits 16 states, and so on. Now imagine the many states IBM's 1,121-qubit system can represent.

An ordinary 64-bit computer would take a hundred years to cycle through these combinations. And that's exactly why quantum computers are being built: to solve intractable problems and test theories that are practically impossible for classical computers.
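The exponential growth described above can be sketched in a few lines. This is purely illustrative; the qubit counts are the ones mentioned in the article:

```python
# The number of basis states n qubits can represent grows as 2**n.
# A classical n-bit register holds exactly one of these states at a time;
# a quantum register can hold a superposition over all of them.
for n in [2, 3, 4, 64, 1121]:
    print(f"{n} qubits -> {2**n} basis states")
```

For n = 1121 this is a number with more than 300 digits, which gives a sense of why cycling through the combinations classically is hopeless.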

To make such large and difficult calculations happen, the qubits need to be linked together through quantum entanglement. This allows qubits, however far apart, to be connected and manipulated in such a way that no single one can be described without referencing the others.

Why are qubits difficult?

One of the key challenges for processing in qubits is the possibility of losing data during transition. Additionally, assembling qubits, writing and reading information from them is a difficult task.

The fundamental units demand special attention, including near-perfect isolation and a temperature held at one hundredth of a degree above absolute zero. Despite strict monitoring, their highly sensitive nature means they can lose superposition from even the slightest variation. This makes programming very tricky.

Since quantum computers are programmed using a sequence of logic gates of various kinds, programmes need to run quickly before qubits lose coherence. The combination of superposition and entanglement makes this process a whole lot harder.
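As a rough illustration of the gate sequences described above, the following hand-rolled sketch in plain Python (not any vendor's SDK) applies a Hadamard gate and then a CNOT to a two-qubit state, producing the entangled Bell state that combines superposition and entanglement:

```python
import math

# A 2-qubit state is a vector of 4 complex amplitudes indexed 00, 01, 10, 11.
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def hadamard_on_qubit0(s):
    """Apply a Hadamard gate to the first qubit, creating a superposition."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def cnot(s):
    """CNOT with qubit 0 as control, qubit 1 as target: swaps |10> and |11>."""
    return [s[0], s[1], s[3], s[2]]

state = cnot(hadamard_on_qubit0(state))
# The result is the entangled Bell state (|00> + |11>) / sqrt(2):
print([round(a, 3) for a in state])  # -> [0.707, 0.0, 0.0, 0.707]
```

On real hardware, the whole sequence has to finish before decoherence scrambles these amplitudes, which is why short coherence times constrain how deep a circuit can be.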

Other companies building quantum computers

There has been a lot of interest in quantum computing in recent times. In 2016, IBM put the first quantum computer in the cloud. Google launched its Sycamore quantum computer last year, claiming it had achieved quantum supremacy.

This month, IBM released its 65-qubit IBM Quantum Hummingbird processor to IBM Q Network members, and the company is planning to surpass the 100-qubit milestone with its 127-qubit IBM Quantum Eagle processor next year. It is also planning to roll out a 433-qubit IBM Quantum Osprey system in 2022.

D-Wave systems, a Canada-based quantum computing company, launched its cloud service in India and Australia this year. It gives researchers and developers in these two countries real-time access to its quantum computers.

Honeywell recently outlined its quantum system, and other technology companies like Microsoft and Intel are also chasing commercialisation.

The ongoing experiments and analysis speak volumes on how tech companies are viewing quantum computers as the next big breakthrough in computing.

Quantum computers will likely deliver tremendous speed, and will help in solving problems related to optimisation in defence, finance, and other industries.

IBM views the 1000-qubit mark as the point from where the commercialisation of quantum computers can take off.

Read the original here:

IBM plans to build a 1121 qubit system. What does this technology mean? - The Hindu

Written by admin

September 26th, 2020 at 9:52 am

Posted in Quantum Computer

Inaugural OSA Quantum 2.0 Conference Featured Talks on Emerging Technologies – Novus Light Technologies Today

Posted: at 9:52 am


Published on 22 September 2020

The unique role of optics and photonics in driving quantum research and technologies was featured in presentations for the inaugural OSA Quantum 2.0 Conference, held 14-17 September. The all-virtual event, presented concurrently with the 2020 Frontiers in Optics and Laser Science APS/DLS (FiO + LS) Conference, drew almost 2,500 registrants from more than 70 countries.

Live and pre-recorded technical presentations on quantum computing and simulation to quantum sensing were available for registrants across the globe at no cost. The conference engaged scientists, engineers and others addressing grand challenges in building a quantum science and technology infrastructure.

"The meeting succeeded in bringing together scientists from academia, industry and government labs in a very constructive way," said conference co-chair Michael Raymer of the University of Oregon, USA. "The high quality of the talks, along with the facilitation by the presiders and OSA staff, moves us closer to the goal of an open, global ecosystem for advancing quantum information science and technology."

Marissa Giustina, senior research scientist and quantum electronics engineer with Google AI Quantum, described the company's efforts to build a quantum computer in her keynote talk. Google's goal was to build a prototype system that could enter a space where no classical computer can go, at a size of about 50 qubits. To create a viable system, Giustina said, there must be strong collaboration between algorithm and hardware developers.

"Quantum Algorithms for Finite Energies and Temperatures" was the focus of a talk by Ignacio Cirac, director of the Theory Division at the Max Planck Institute of Quantum Optics and Honorary Professor at the Technical University of Munich. He described advances in quantum simulators for addressing problems in the dynamics of physical quantum systems. His recent work focuses on developing algorithms for use on quantum simulators to solve many-body problems.

Solutions to digital security challenges were the topic of a talk by Gregoire Ribordy, co-founder and CEO of ID Quantique, Switzerland. He described quantum security techniques, technology and strengths in his keynote talk, titled "Quantum Technologies for Long-term Data Security." His work centers on the use of quantum-safe cryptography, quantum key distribution and commercially available quantum random number generators in data security.

Mikhail Lukin, co-director of the Harvard Quantum Initiative in Science and Engineering and co-director of the Harvard-MIT Center for Ultracold Atoms, USA, described progress towards quantum repeaters for long-distance quantum communication. He also discussed a new platform for exploring synthetic quantum matter and quantum communication systems based on nanophotonics with atom-like systems.

Conference-wide sponsors for the combined OSA Quantum 2.0 Conference and FiO + LS Conference included Facebook Reality Labs, Toptica Photonics and Oz Optics. Registrants interacted with more than three dozen companies in the virtual exhibit to learn about their latest technologies, from instruments for quantum science and education to LIDAR and remote sensing applications.

Registrants can continue to benefit from conference resources for 60 days. Recordings of the technical sessions, the e-Posters Gallery and the Virtual Exhibit will be available on-demand on the FiO + LS website.


See more here:

Inaugural OSA Quantum 2.0 Conference Featured Talks on Emerging Technologies - Novus Light Technologies Today

Written by admin

September 26th, 2020 at 9:52 am

Posted in Quantum Computer

Could Quantum Computing Progress Be Halted by Background Radiation? – Singularity Hub

Posted: September 1, 2020 at 10:55 am


Doing calculations with a quantum computer is a race against time, thanks to the fragility of the quantum states at their heart. New research suggests we may soon hit a wall in how long we can hold them together, due to interference from natural background radiation.

While quantum computing could one day enable us to carry out calculations beyond even the most powerful supercomputer imaginable, we're still a long way from that point. And a big reason for that is a phenomenon known as decoherence.

The superpowers of quantum computers rely on holding the qubits (quantum bits) that make them up in exotic quantum states like superposition and entanglement. Decoherence is the process by which interference from the environment causes them to gradually lose their quantum behavior, and any information that was encoded in them.

It can be caused by heat, vibrations, magnetic fluctuations, or a host of other environmental factors that are hard to control. Currently we can keep superconducting qubits (the technology favored by the field's leaders like Google and IBM) stable for up to 200 microseconds in the best devices, which is still far too short to do any truly meaningful computations.

But new research from scientists at Massachusetts Institute of Technology (MIT) and Pacific Northwest National Laboratory (PNNL), published last week in Nature, suggests we may struggle to get much further. They found that background radiation from cosmic rays and more prosaic sources like trace elements in concrete walls is enough to put a hard four-millisecond limit on the coherence time of superconducting qubits.

"These decoherence mechanisms are like an onion, and we've been peeling back the layers for the past 20 years, but there's another layer that, left unabated, is going to limit us in a couple years, which is environmental radiation," William Oliver from MIT said in a press release. "This is an exciting result, because it motivates us to think of other ways to design qubits to get around this problem."

Superconducting qubits rely on pairs of electrons flowing through a resistance-free circuit. But radiation can knock these pairs out of alignment, causing them to split apart, which is what eventually results in the qubit decohering.

To determine how significant an impact background levels of radiation could have on qubits, the researchers first tried to work out the relationship between coherence times and radiation levels. They exposed qubits to irradiated copper whose emissions dropped over time in a predictable way, which showed them that coherence times rose as radiation levels fell, up to a maximum of four milliseconds, after which background effects kicked in.

To check if this coherence time was really caused by the natural radiation, they built a giant shield out of lead brick that could block background radiation to see what happened when the qubits were isolated. The experiments clearly showed that blocking the background emissions could boost coherence times further.

At the minute, a host of other problems like material impurities and electronic disturbances cause qubits to decohere before these effects kick in, but given the rate at which the technology has been improving, we may hit this new wall in just a few years.
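One simple way to see why a four-millisecond radiation floor matters is to model independent decoherence channels as rates that add, so 1/T_total = 1/T_radiation + 1/T_other. This additive-rates rule is a common approximation assumed here for illustration, not a result from the paper, and the channel values below are made up:

```python
# Independent noise channels combine as rates: 1/T_total = sum(1/T_i).
# A hard ~4 ms radiation channel therefore caps T_total below 4 ms no
# matter how much the other channels improve.
def total_coherence_ms(*channel_times_ms):
    return 1 / sum(1 / t for t in channel_times_ms)

today = total_coherence_ms(4.0, 0.2)     # other noise (~0.2 ms) dominates today
future = total_coherence_ms(4.0, 100.0)  # much better qubits, still under 4 ms
print(f"today: {today:.3f} ms, with better qubits: {future:.3f} ms")
```

This is why the radiation channel only becomes the binding constraint once the other sources of decoherence have been engineered away.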

"Without mitigation, radiation will limit the coherence time of superconducting qubits to a few milliseconds, which is insufficient for practical quantum computing," Brent VanDevender from PNNL said in a press release.

Potential solutions to the problem include building radiation shielding around quantum computers or locating them underground, where cosmic rays aren't able to penetrate so easily. But if you need a few tons of lead or a large cavern in order to install a quantum computer, that's going to make it considerably harder to roll them out widely.

It's important to remember, though, that this problem has only been observed in superconducting qubits so far. In July, researchers showed they could get a spin-orbit qubit implemented in silicon to last for about 10 milliseconds, while trapped-ion qubits can stay stable for as long as 10 minutes. And MIT's Oliver says there's still plenty of room for building more robust superconducting qubits.

"We can think about designing qubits in a way that makes them rad-hard," he said. "So it's definitely not game-over, it's just the next layer of the onion we need to address."

Image Credit: Shutterstock

View post:

Could Quantum Computing Progress Be Halted by Background Radiation? - Singularity Hub

Written by admin

September 1st, 2020 at 10:55 am

Posted in Quantum Computer

Fermilab to lead $115 million National Quantum Information Science Research Center to build revolutionary quantum computer with Rigetti Computing,…

Posted: at 10:55 am


One of the goals of the Superconducting Quantum Materials and Systems Center is to build a beyond-state-of-the-art quantum computer based on superconducting technologies. The center also will develop new quantum sensors, which could lead to the discovery of the nature of dark matter and other elusive subatomic particles.

The U.S. Department of Energy's Fermilab has been selected to lead one of five national centers to bring about transformational advances in quantum information science as a part of the U.S. National Quantum Initiative, the White House Office of Science and Technology Policy, the National Science Foundation and the U.S. Department of Energy announced today.

The initiative provides the new Superconducting Quantum Materials and Systems Center funding with the goal of building and deploying a beyond-state-of-the-art quantum computer based on superconducting technologies. The center also will develop new quantum sensors, which could lead to the discovery of the nature of dark matter and other elusive subatomic particles. Total planned DOE funding for the center is $115 million over five years, with $15 million in fiscal year 2020 dollars and outyear funding contingent on congressional appropriations. SQMS will also receive an additional $8 million in matching contributions from center partners.

The SQMS Center is part of a $625 million federal program to facilitate and foster quantum innovation in the United States. The 2018 National Quantum Initiative Act called for a long-term, large-scale commitment of U.S. scientific and technological resources to quantum science.

The revolutionary leaps in quantum computing and sensing that SQMS aims for will be enabled by a unique multidisciplinary collaboration that includes 20 partners: national laboratories, academic institutions and industry. The collaboration brings together world-leading expertise in all key aspects: from identifying qubits' quality limitations at the nanometer scale, to fabrication and scale-up capabilities for multiqubit quantum computers, to the exploration of new applications enabled by quantum computers and sensors.

"The breadth of the SQMS physics, materials science, device fabrication and characterization technology, combined with the expertise in large-scale integration capabilities by the SQMS Center, is unprecedented for superconducting quantum science and technology," said SQMS Deputy Director James Sauls of Northwestern University. "As part of the network of National QIS Research Centers, SQMS will contribute to U.S. leadership in quantum science for the years to come."

SQMS researchers are developing long-coherence-time qubits based on Rigetti Computing's state-of-the-art quantum processors. Image: Rigetti Computing

At the heart of SQMS research will be solving one of the most pressing problems in quantum information science: the length of time that a qubit, the basic element of a quantum computer, can maintain information, also called quantum coherence. Understanding and mitigating sources of decoherence that limit the performance of quantum devices is critical to engineering next-generation quantum computers and sensors.

"Unless we address and overcome the issue of quantum system decoherence, we will not be able to build quantum computers that solve new complex and important problems. The same applies to quantum sensors with the range of sensitivity needed to address long-standing questions in many fields of science," said SQMS Center Director Anna Grassellino of Fermilab. "Overcoming this crucial limitation would allow us to have a great impact in the life sciences, biology, medicine, and national security, and enable measurements of incomparable precision and sensitivity in basic science."

The SQMS Center's ambitious goals in computing and sensing are driven by Fermilab's achievement of world-leading coherence times in components called superconducting cavities, which were developed for particle accelerators used in Fermilab's particle physics experiments. Researchers have expanded the use of Fermilab cavities into the quantum regime.

"We have the most coherent (by a factor of more than 200) 3-D superconducting cavities in the world, which will be turned into quantum processors with unprecedented performance by combining them with Rigetti's state-of-the-art planar structures," said Fermilab scientist Alexander Romanenko, SQMS technology thrust leader and Fermilab SRF program manager. "This long coherence would not only enable qubits to be long-lived, but it would also allow them to be all connected to each other, opening qualitatively new opportunities for applications."

The SQMS Center's goals in computing and sensing are driven by Fermilab's achievement of world-leading coherence times in components called superconducting cavities, which were developed for particle accelerators used in Fermilab's particle physics experiments. Photo: Reidar Hahn, Fermilab

To advance the coherence even further, SQMS collaborators will launch a materials-science investigation of unprecedented scale to gain insights into the fundamental limiting mechanisms of cavities and qubits, working to understand the quantum properties of superconductors and other materials used at the nanoscale and in the microwave regime.

"Now is the time to harness the strengths of the DOE laboratories and partners to identify the underlying mechanisms limiting quantum devices in order to push their performance to the next level for quantum computing and sensing applications," said SQMS Chief Engineer Matt Kramer of Ames Laboratory.

Northwestern University, Ames Laboratory, Fermilab, Rigetti Computing, the National Institute of Standards and Technology, the Italian National Institute for Nuclear Physics and several universities are partnering to contribute world-class materials science and superconductivity expertise to target sources of decoherence.

SQMS partner Rigetti Computing will provide crucial state-of-the-art qubit fabrication and full stack quantum computing capabilities required for building the SQMS quantum computer.

"By partnering with world-class experts, our work will translate ground-breaking science into scalable superconducting quantum computing systems and commercialize capabilities that will further the energy, economic and national security interests of the United States," said Rigetti Computing CEO Chad Rigetti.

SQMS will also partner with the NASA Ames Research Center quantum group, led by SQMS Chief Scientist Eleanor Rieffel. Their strengths in quantum algorithms, programming and simulation will be crucial to use the quantum processors developed by the SQMS Center.

"The Italian National Institute for Nuclear Physics has been successfully collaborating with Fermilab for more than 40 years and is excited to be a member of the extraordinary SQMS team," said INFN President Antonio Zoccoli. "With its strong know-how in detector development, cryogenics and environmental measurements, including the Gran Sasso national laboratories, the largest underground laboratory in the world devoted to fundamental physics, INFN looks forward to exciting joint progress in fundamental physics and in quantum science and technology."

"Fermilab is excited to host this National Quantum Information Science Research Center and work with this extraordinary network of collaborators," said Fermilab Director Nigel Lockyer. "This initiative aligns with Fermilab and its mission. It will help us answer important particle physics questions, and, at the same time, we will contribute to advancements in quantum information science with our strengths in particle accelerator technologies, such as superconducting radio-frequency devices and cryogenics."

"We are thankful and honored to have this unique opportunity to be a national center for advancing quantum science and technology," Grassellino said. "We have a focused mission: build something revolutionary. This center brings together the right expertise and motivation to accomplish that mission."

The Superconducting Quantum Materials and Systems Center at Fermilab is supported by the DOE Office of Science.

Fermilab is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit

More here:

Fermilab to lead $115 million National Quantum Information Science Research Center to build revolutionary quantum computer with Rigetti Computing,...

Written by admin

September 1st, 2020 at 10:55 am

Posted in Quantum Computer
