
Archive for the ‘Quantum Computer’ Category

Could Quantum Computing Progress Be Halted by Background Radiation? – Singularity Hub

Posted: September 1, 2020 at 10:55 am


Doing calculations with a quantum computer is a race against time, thanks to the fragility of the quantum states at their heart. And new research suggests we may soon hit a wall in how long we can hold them together thanks to interference from natural background radiation.

While quantum computing could one day enable us to carry out calculations beyond even the most powerful supercomputer imaginable, we're still a long way from that point. And a big reason for that is a phenomenon known as decoherence.

The superpowers of quantum computers rely on holding the qubits (quantum bits) that make them up in exotic quantum states like superposition and entanglement. Decoherence is the process by which interference from the environment causes them to gradually lose their quantum behavior and any information that was encoded in them.

It can be caused by heat, vibrations, magnetic fluctuations, or any host of environmental factors that are hard to control. Currently we can keep superconducting qubits (the technology favored by the field's leaders like Google and IBM) stable for up to 200 microseconds in the best devices, which is still far too short to do any truly meaningful computations.
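As a rough illustration (a simple exponential-decay model of my own, not anything from the research itself), a coherence time acts like a half-life for quantum information: the probability that a qubit still holds its state falls off exponentially with elapsed time, so a 200-microsecond coherence time leaves very little room for long computations.

```python
import math

def survival_probability(t_us: float, t2_us: float = 200.0) -> float:
    """Probability a qubit still holds its state after t_us microseconds,
    under a simple exponential-decay model with coherence time t2_us."""
    return math.exp(-t_us / t2_us)

# After one full coherence time, only ~37% of the state fidelity remains.
print(survival_probability(200.0))  # ~0.368
```

The exact decay law for real devices is more complicated, but the exponential picture is why even a factor-of-a-few improvement in coherence time matters so much.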

But new research from scientists at Massachusetts Institute of Technology (MIT) and Pacific Northwest National Laboratory (PNNL), published last week in Nature, suggests we may struggle to get much further. They found that background radiation from cosmic rays and more prosaic sources like trace elements in concrete walls is enough to put a hard four-millisecond limit on the coherence time of superconducting qubits.

"These decoherence mechanisms are like an onion, and we've been peeling back the layers for the past 20 years, but there's another layer that, left unabated, is going to limit us in a couple of years, which is environmental radiation," William Oliver from MIT said in a press release. "This is an exciting result, because it motivates us to think of other ways to design qubits to get around this problem."

Superconducting qubits rely on pairs of electrons flowing through a resistance-free circuit. But radiation can knock these pairs out of alignment, causing them to split apart, which is what eventually results in the qubit decohering.

To determine how significant of an impact background levels of radiation could have on qubits, the researchers first tried to work out the relationship between coherence times and radiation levels. They exposed qubits to irradiated copper whose emissions dropped over time in a predictable way, which showed them that coherence times rose as radiation levels fell up to a maximum of four milliseconds, after which background effects kicked in.
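Independent decoherence channels act in parallel, so their rates (the reciprocals of the coherence times) add. A back-of-the-envelope sketch (my own illustration, using the roughly four-millisecond radiation-limited figure from the study) shows why a radiation floor caps overall coherence no matter how good a qubit's other properties become:

```python
def effective_coherence_ms(internal_ms: float, radiation_ms: float = 4.0) -> float:
    """Decoherence channels act in parallel, so their *rates* (1/T) add.
    radiation_ms is the ~4 ms ceiling attributed to background radiation."""
    return 1.0 / (1.0 / internal_ms + 1.0 / radiation_ms)

# Even a qubit with 100 ms of internal coherence is dragged near the 4 ms wall.
print(effective_coherence_ms(100.0))  # ~3.85 ms
```

Driving the internal coherence time arbitrarily high only pushes the effective value asymptotically toward the four-millisecond radiation limit.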

To check if this coherence time was really caused by the natural radiation, they built a giant shield out of lead brick that could block background radiation to see what happened when the qubits were isolated. The experiments clearly showed that blocking the background emissions could boost coherence times further.

At the moment, a host of other problems like material impurities and electronic disturbances cause qubits to decohere before these effects kick in, but given the rate at which the technology has been improving, we may hit this new wall in just a few years.

"Without mitigation, radiation will limit the coherence time of superconducting qubits to a few milliseconds, which is insufficient for practical quantum computing," Brent VanDevender from PNNL said in a press release.

Potential solutions to the problem include building radiation shielding around quantum computers or locating them underground, where cosmic rays aren't able to penetrate so easily. But if you need a few tons of lead or a large cavern in order to install a quantum computer, that's going to make it considerably harder to roll them out widely.

It's important to remember, though, that this problem has only been observed in superconducting qubits so far. In July, researchers showed they could get a spin-orbit qubit implemented in silicon to last for about 10 milliseconds, while trapped ion qubits can stay stable for as long as 10 minutes. And MIT's Oliver says there's still plenty of room for building more robust superconducting qubits.

"We can think about designing qubits in a way that makes them rad-hard," he said. "So it's definitely not game-over, it's just the next layer of the onion we need to address."

Image Credit: Shutterstock


Fermilab to lead $115 million National Quantum Information Science Research Center to build revolutionary quantum computer with Rigetti Computing,…

Posted: at 10:55 am


One of the goals of the Superconducting Quantum Materials and Systems Center is to build a beyond-state-of-the-art quantum computer based on superconducting technologies. The center also will develop new quantum sensors, which could lead to the discovery of the nature of dark matter and other elusive subatomic particles.

The U.S. Department of Energy's Fermilab has been selected to lead one of five national centers to bring about transformational advances in quantum information science as a part of the U.S. National Quantum Initiative, announced the White House Office of Science and Technology Policy, the National Science Foundation and the U.S. Department of Energy today.

The initiative provides the new Superconducting Quantum Materials and Systems Center funding with the goal of building and deploying a beyond-state-of-the-art quantum computer based on superconducting technologies. The center also will develop new quantum sensors, which could lead to the discovery of the nature of dark matter and other elusive subatomic particles. Total planned DOE funding for the center is $115 million over five years, with $15 million in fiscal year 2020 dollars and outyear funding contingent on congressional appropriations. SQMS will also receive an additional $8 million in matching contributions from center partners.

The SQMS Center is part of a $625 million federal program to facilitate and foster quantum innovation in the United States. The 2018 National Quantum Initiative Act called for a long-term, large-scale commitment of U.S. scientific and technological resources to quantum science.

The revolutionary leaps in quantum computing and sensing that SQMS aims for will be enabled by a unique multidisciplinary collaboration that includes 20 partners: national laboratories, academic institutions and industry. The collaboration brings together world-leading expertise in all key aspects: from identifying qubits' quality limitations at the nanometer scale, to fabrication and scale-up capabilities for multiqubit quantum computers, to the exploration of new applications enabled by quantum computers and sensors.

"The breadth of the SQMS physics, materials science, device fabrication and characterization technology, combined with the expertise in large-scale integration capabilities of the SQMS Center, is unprecedented for superconducting quantum science and technology," said SQMS Deputy Director James Sauls of Northwestern University. "As part of the network of National QIS Research centers, SQMS will contribute to U.S. leadership in quantum science for the years to come."

SQMS researchers are developing long-coherence-time qubits based on Rigetti Computing's state-of-the-art quantum processors. Image: Rigetti Computing

At the heart of SQMS research will be solving one of the most pressing problems in quantum information science: the length of time that a qubit, the basic element of a quantum computer, can maintain information, also called quantum coherence. Understanding and mitigating sources of decoherence that limit performance of quantum devices is critical to engineering next-generation quantum computers and sensors.

"Unless we address and overcome the issue of quantum system decoherence, we will not be able to build quantum computers that solve new complex and important problems. The same applies to quantum sensors with the range of sensitivity needed to address long-standing questions in many fields of science," said SQMS Center Director Anna Grassellino of Fermilab. "Overcoming this crucial limitation would allow us to have a great impact in the life sciences, biology, medicine, and national security, and enable measurements of incomparable precision and sensitivity in basic science."

The SQMS Center's ambitious goals in computing and sensing are driven by Fermilab's achievement of world-leading coherence times in components called superconducting cavities, which were developed for particle accelerators used in Fermilab's particle physics experiments. Researchers have expanded the use of Fermilab cavities into the quantum regime.

"We have the most coherent 3-D superconducting cavities in the world, by a factor of more than 200, which will be turned into quantum processors with unprecedented performance by combining them with Rigetti's state-of-the-art planar structures," said Fermilab scientist Alexander Romanenko, SQMS technology thrust leader and Fermilab SRF program manager. "This long coherence would not only enable qubits to be long-lived, but it would also allow them to be all connected to each other, opening qualitatively new opportunities for applications."

The SQMS Center's goals in computing and sensing are driven by Fermilab's achievement of world-leading coherence times in components called superconducting cavities, which were developed for particle accelerators used in Fermilab's particle physics experiments. Photo: Reidar Hahn, Fermilab

To advance the coherence even further, SQMS collaborators will launch a materials-science investigation of unprecedented scale to gain insights into the fundamental limiting mechanisms of cavities and qubits, working to understand the quantum properties of superconductors and other materials used at the nanoscale and in the microwave regime.

"Now is the time to harness the strengths of the DOE laboratories and partners to identify the underlying mechanisms limiting quantum devices in order to push their performance to the next level for quantum computing and sensing applications," said SQMS Chief Engineer Matt Kramer of Ames Laboratory.

Northwestern University, Ames Laboratory, Fermilab, Rigetti Computing, the National Institute of Standards and Technology, the Italian National Institute for Nuclear Physics and several universities are partnering to contribute world-class materials science and superconductivity expertise to target sources of decoherence.

SQMS partner Rigetti Computing will provide crucial state-of-the-art qubit fabrication and full stack quantum computing capabilities required for building the SQMS quantum computer.

"By partnering with world-class experts, our work will translate ground-breaking science into scalable superconducting quantum computing systems and commercialize capabilities that will further the energy, economic and national security interests of the United States," said Rigetti Computing CEO Chad Rigetti.

SQMS will also partner with the NASA Ames Research Center quantum group, led by SQMS Chief Scientist Eleanor Rieffel. Their strengths in quantum algorithms, programming and simulation will be crucial to using the quantum processors developed by the SQMS Center.

"The Italian National Institute for Nuclear Physics has been successfully collaborating with Fermilab for more than 40 years and is excited to be a member of the extraordinary SQMS team," said INFN President Antonio Zoccoli. "With its strong know-how in detector development, cryogenics and environmental measurements, including the Gran Sasso national laboratories, the largest underground laboratory in the world devoted to fundamental physics, INFN looks forward to exciting joint progress in fundamental physics and in quantum science and technology."

"Fermilab is excited to host this National Quantum Information Science Research Center and work with this extraordinary network of collaborators," said Fermilab Director Nigel Lockyer. "This initiative aligns with Fermilab and its mission. It will help us answer important particle physics questions, and, at the same time, we will contribute to advancements in quantum information science with our strengths in particle accelerator technologies, such as superconducting radio-frequency devices and cryogenics."

"We are thankful and honored to have this unique opportunity to be a national center for advancing quantum science and technology," Grassellino said. "We have a focused mission: build something revolutionary. This center brings together the right expertise and motivation to accomplish that mission."

The Superconducting Quantum Materials and Systems Center at Fermilab is supported by the DOE Office of Science.

Fermilab is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit


The future of artificial intelligence and quantum computing – Military & Aerospace Electronics

Posted: at 10:55 am


NASHUA, N.H. - Until the 21st Century, artificial intelligence (AI) and quantum computers were largely the stuff of science fiction, although quantum theory and quantum mechanics had been around for about a century. It was a century of great controversy, largely because Albert Einstein rejected quantum theory as originally formulated, leading to his famous statement, "God does not play dice with the universe."

Today, however, the debate over quantum computing is largely about when, not if, these kinds of devices will come into full operation. Meanwhile, other forms of quantum technology, such as sensors, already are finding their way into military and civilian applications.

"Quantum technology will be as transformational in the 21st Century as harnessing electricity was in the 19th," Michael J. Biercuk, founder and CEO of Q-CTRL Pty Ltd in Sydney, Australia, and professor of Quantum Physics & Quantum Technologies at the University of Sydney, told the U.S. Office of Naval Research in a January 2019 presentation.

On that, there is virtually universal agreement. But when and how remains undetermined.

For example, asked how and when quantum computing eventually may be applied to high-performance embedded computing (HPEC), Tatjana Curcic, program manager for Optimization with Noisy Intermediate-Scale Quantum devices (ONISQ) at the U.S. Defense Advanced Research Projects Agency in Arlington, Va., says it's an open question.

"Until just recently, quantum computing stood on its own, but as of a few years ago people are looking more and more into hybrid approaches," Curcic says. "I'm not aware of much work on actually getting quantum computing into HPEC architecture, however. It's definitely not mainstream, probably because it's too early."

As to how quantum computing eventually may influence the development, scale, and use of AI, she adds:

"That's another open question. Quantum machine learning is a very active research area, but is quite new. A lot of people are working on that, but it's not clear at this time what the results will be. The interface between classical data, which AI is primarily involved with, and quantum computing is still a technical challenge."

Quantum information processing

According to DARPA's ONISQ webpage, the program aims to exploit quantum information processing before fully fault-tolerant quantum computers are realized.

This quantum computer based on superconducting qubits is inserted into a dilution refrigerator and cooled to a temperature less than 1 Kelvin. It was built at IBM Research in Zurich.

"This effort will pursue a hybrid concept that combines intermediate-sized quantum devices with classical systems to solve a particularly challenging set of problems known as combinatorial optimization. ONISQ seeks to demonstrate the quantitative advantage of quantum information processing by leapfrogging the performance of classical-only systems in solving optimization challenges," the agency states. ONISQ researchers will be tasked with developing quantum systems that are scalable to hundreds or thousands of qubits with longer coherence times and improved noise control.

Researchers will also be required to efficiently implement a quantum optimization algorithm on noisy intermediate-scale quantum devices, optimizing allocation of quantum and classical resources. Benchmarking will also be part of the program, with researchers making a quantitative comparison of classical and quantum approaches. In addition, the program will identify classes of problems in combinatorial optimization where quantum information processing is likely to have the biggest impact. It will also seek to develop methods for extending quantum advantage on limited size processors to large combinatorial optimization problems via techniques such as problem decomposition.
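The hybrid division of labor described above can be caricatured in a few lines: a classical outer loop proposes circuit parameters, a quantum device estimates the cost of each proposal, and the loop keeps whatever improves. In this sketch `quantum_expectation` is a purely hypothetical classical stand-in for the noisy quantum evaluation (real systems would call quantum hardware here); only the control flow is the point.

```python
import random

def quantum_expectation(params):
    """Hypothetical stand-in for a noisy intermediate-scale quantum device
    estimating the cost of a parameterized circuit. Here it is just a toy
    classical landscape so the example runs anywhere."""
    a, b = params
    return (a - 0.3) ** 2 + (b + 0.7) ** 2

def hybrid_optimize(n_iters=500, step=0.1, seed=1):
    """Classical outer loop (simple random local search) steering the
    'quantum' evaluations -- the split ONISQ-style hybrid schemes exploit."""
    rng = random.Random(seed)
    best = (0.0, 0.0)
    best_cost = quantum_expectation(best)
    for _ in range(n_iters):
        cand = (best[0] + rng.uniform(-step, step),
                best[1] + rng.uniform(-step, step))
        cost = quantum_expectation(cand)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

params, cost = hybrid_optimize()
print(params, cost)  # parameters driven toward the landscape's minimum
```

In a real combinatorial-optimization setting the landscape would come from a problem such as Max-Cut, and the classical optimizer would be far more sophisticated, but the quantum/classical resource split is the same.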

The U.S. government has been the leader in quantum computing research since the founding of the field, but that too is beginning to change.

"In the mid-'90s, NSA [the U.S. National Security Agency at Fort Meade, Md.] decided to begin an open academic effort to see if such a thing could be developed. All that research has been conducted by universities for the most part, with a few outliers, such as IBM," says Q-CTRL's Biercuk. "In the past five years, there has been a shift toward industry-led development, often in cooperation with academic efforts. Microsoft has partnered with universities all over the world and Google bought a university program. Today many of the biggest hardware developments are coming from the commercial sector."

Quantum computing remains deep in the research phase, but there are hardware demonstrations all over the world. In the next five years, the performance of these machines is expected to advance to the point where they will demonstrate a quantum advantage for the first time. For now, however, quantum computing has no advantages over standard computing technology: quantum computers are research demonstrators and do not solve any computing problems at all. Right now, there is no reason to use quantum computers except to be ready when they are truly available.

AI and quantum computing

Nonetheless, the race to develop and deploy AI and quantum computing is global, with the world's leading military powers expecting that the first nation to successfully deploy them, along with other breakthrough technologies like hypersonics, will become as dominant as the U.S. was following the first detonations of atomic bombs. That is especially true for autonomous mobile platforms, such as unmanned aerial vehicles (UAVs), interfacing with those vehicles' onboard HPEC.

Of the two, AI is the closest to deployment, but also the most controversial. A growing number of the world's leading scientists, including the late Stephen Hawking, warn real-world AI could easily duplicate the actions of the fictional Skynet in the Terminator movie series. Launched with total control over the U.S. nuclear arsenal, Skynet became sentient and decided the human race was a dangerous infestation that needed to be destroyed.

"The development of full artificial intelligence could spell the end of the human race. Once humans develop artificial intelligence, it will take off on its own and redesign itself at an ever-increasing rate. Humans, who are limited by slow biological evolution, couldn't compete and would be superseded." Stephen Hawking (2014)

Such dangers have been recognized at least as far back as the publication of Isaac Asimov's short story "Runaround" in 1942, which included his Three Laws of Robotics, designed to control otherwise autonomous robots. In the story, the laws were set down in 2058:

First Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Whether it would be possible to embed and ensure unbreakable compliance with such laws in an AI system is unknown. But limited degrees of AI, known as machine learning, already are in widespread use by the military and advanced stages of the technology, such as deep learning, almost certainly will be deployed by one or more nations as they become available. More than 50 nations already are actively researching battlefield robots.

Military quantum computing

AI-HPEC would give UAVs, next-generation cruise missiles, and even maneuverable ballistic missiles the ability to alter course to new targets at any point after launch, recognize countermeasures, and avoid, misdirect, or even destroy them.

Quantum computing, on the other hand, is seen by some as providing little, if any, advantage over traditional computer technologies; by many as requiring cooling and size, weight and power (SWaP) improvements not possible with current technologies to make it applicable to mobile platforms; and by most as being little more than a research tool for perhaps decades to come.

Perhaps the biggest stumbling block to mobile platform-based quantum computing is cooling: it currently requires a cooling unit, at near absolute zero, the size of a refrigerator to handle a fractional piece of quantum computing.

Military trusted computing experts are considering new generations of quantum computing for creating nearly unbreakable encryption for super-secure defense applications.

"A lot of work has been done and things are being touted as operational, but the most important thing to understand is this isn't some simple physical thing you throw in suddenly and it works. That makes it harder to call it deployable: you're not going to strap a quantum computer to a handheld device. A lot of solutions are still trying to deal with cryogenics and how do you deal with deployment of cryo," says Tammy Carter, senior product manager for GPGPUs and software products at Curtiss-Wright Defense Solutions in Ashburn, Va.

"AI is now a technology in deployment. Machine learning is pretty much in use worldwide," Carter says. "We're in a migration of figuring out how to use it with the systems we have. Quantum computing will require a lot of engineering work, and demand may not be great enough to push the effort. From a cryogenically cooled electronics perspective, I don't think there is any insurmountable problem. It absolutely can be done; it's just a matter of decision making to do it, prioritization to get it done. These are not easily deployed technologies, but certainly can be deployed."

Given its current and expected near-term limitations, research has increased on the development of hybrid systems.

"The longer term reality is a hybrid approach, with the quantum system not going mobile any time soon," says Brian Kirby, physicist in the Army Research Laboratory Computational & Informational Sciences Directorate in Adelphi, Md. "It's a mistake to forecast a timeline, but I'm not sure putting a quantum computer on such systems would be valuable. Having the quantum computer in a fixed location and linked to the mobile platform makes more sense, for now at least. There can be multiple quantum computers throughout the country; while individually they may have trouble solving some problems, networking them would be more secure and able to solve larger problems."

"Broadly, however, a quantum computer can't do anything a practical home computer can't do, but it can potentially solve certain problems more efficiently," Kirby continues. "So you're looking at potential speed-up, but there is no problem a quantum computer can solve that a normal computer can't. Beyond the basics of code-breaking and quantum simulations affecting material design, right now we can't necessarily predict military applications."

Raising concerns

In some ways similar to AI, quantum computing raises nearly as many concerns as it does expectations, especially in the area of security. The latest Thales Data Threat Report says 72 percent of surveyed security experts worldwide believe quantum computing will have a negative impact on data security within the next five years.

At the same time, quantum computing is forecast to offer more robust cryptography and security solutions. For HPEC, that duality is significant: quantum computing can make it more difficult to break the security of mobile platforms, while simultaneously making it easier to do just that.

"Quantum computers that can run Shor's algorithm [leveraging quantum properties to factor very large numbers efficiently] are expected to become available in the next decade. These algorithms can be used to break conventional digital signature schemes (e.g. RSA or ECDSA), which are widely used in embedded systems today. This puts these systems at risk when they are used in safety-relevant long-term applications, such as automotive systems or critical infrastructures. To mitigate this risk, classical digital signature schemes used must be replaced by schemes secure against quantum computing-based attacks," according to the August 2019 proceedings of the 14th International Conference on Availability, Reliability & Security's Post-Quantum Cryptography in Embedded Systems report.
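The reason Shor's algorithm matters is that RSA-style schemes stake their security on factoring being slow. A toy example (with a deliberately tiny modulus; real RSA moduli are thousands of bits, far beyond trial division) shows the classical attack that quantum factoring would make tractable at full scale:

```python
def trial_division_factor(n: int) -> int:
    """Classically factor n by trial division -- feasible only for tiny n.
    Shor's algorithm performs this task in polynomial time on a quantum
    computer, which is what breaks RSA-style schemes."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f  # smallest prime factor
        f += 1
    return n  # n itself is prime

# Toy RSA modulus: recovering a prime factor lets an attacker
# derive the private key from the public one.
n = 3233  # = 53 * 61
p = trial_division_factor(n)
print(p, n // p)  # 53 61
```

For a 2048-bit modulus this loop would run longer than the age of the universe; Shor's algorithm collapses that barrier, which is why post-quantum signature schemes are being standardized now.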

The security question is not quite so clean-cut as armor/anti-armor, but there is a developing bifurcation between defensive and offensive applications. On the defense side, deployed quantum systems are looked at to provide encoded communications. Experts say it seems likely the level of activity in China on quantum communications, which has been a major focus there for years, runs up against the development of quantum computing in the U.S. The two aspects are not clearly one-against-one; rather, the two are moving independently.

Google's quantum supremacy demonstration has led to a rush on finding algorithms robust against quantum attack. On the quantum communications side, the development of attacks on such systems has been underway for years, leading to a whole field of research based on identifying and exploiting quantum attacks.

"Quantum computing could also help develop revolutionary AI systems. Recent efforts have demonstrated a strong and unexpected link between quantum computation and artificial neural networks, potentially portending new approaches to machine learning. Such advances could lead to vastly improved pattern recognition, which in turn would permit far better machine-based target identification. For example, the hidden submarine in our vast oceans may become less-hidden in a world with AI-empowered quantum computers, particularly if they are combined with vast data sets acquired through powerful quantum-enabled sensors," according to Q-CTRL's Biercuk.

"Even the relatively mundane near-term development of new quantum-enhanced clocks may impact security, beyond just making GPS devices more accurate," Biercuk continues. "Quantum-enabled clocks are so sensitive that they can discern minor gravitational anomalies from a distance. They thus could be deployed by military personnel to detect underground, hardened structures, submarines or hidden weapons systems. Given their potential for remote sensing, advanced clocks may become a key embedded technology for tomorrow's warfighter."

Warfighter capabilities

The early applications of quantum computing, while not embedded on mobile platforms, are expected to enhance warfighter capabilities significantly.

Jim Clark, director of quantum hardware at Intel Corp. in Santa Clara, Calif., shows one of the company's quantum processors.

"There is a high likelihood quantum computing will impact ISR [intelligence, surveillance and reconnaissance], solving logistics problems more quickly. But so much of this is in the basic research stage. While we know the types of problems and general application space, optimization problems will be some of the first where we will see advantages from quantum computing," says Sara Gamble, quantum information sciences program manager at ARL.

Biercuk says he agrees: "We're not really sure there is a role for quantum computing in embedded computing just yet. Quantum computers right now are very large systems embedded in mainframes, with access by the cloud. You can envision embedded computing accessing quantum computers via the cloud, but they are not likely to be very small, agile processors you would embed in a SWaP-constrained environment."

"But there are many aspects of quantum technology beyond quantum computing; the combination of quantum sensors could allow much better detection in the field," Biercuk continues. "The biggest potential impact comes in the areas of GPS denial, which has become one of the biggest risk factors identified in every blueprint around the world. Quantum technology plays directly into this to perform dead reckoning navigation in GPS-denied areas."

DARPA's Curcic also says the full power of quantum computing is still decades away, but believes ONISQ has the potential to help speed its development.

"The two main approaches industry is using are superconducting quantum computing and trapped ions. We use both of those, plus cold atoms [Rydberg atoms]. We are very excited about ONISQ and seeing if we can get anything useful over classical computing. Four teams are doing hardware development with those three approaches," she says.

"Because these are noisy systems, it's very difficult to determine if there will be any advantages. The hope is we can address the optimization problem faster than today, which is what we're working on with ONISQ. Optimization problems are everywhere, so even a small improvement would be valuable."

Beyond todays capabilities

As to how quantum computing and AI may impact future warfare, especially through HPEC, she adds: "I have no doubt quantum computing will be revolutionary and we'll be able to do things beyond today's capabilities. The possibilities are pretty much endless, but what they are is not crystal clear at this point. It's very difficult to predict, with great certainty, what quantum computing will be able to do. We'll just have to build and try. That's why today is such an exciting time."

Curtiss-Wright's Carter says he believes quantum computing and AI will be closely linked with HPEC in the future, once current limitations with both are resolved.

"AI itself is based on a lot of math being done in parallel for probability answers, similar to modeling the neurons in the brain: highly interconnected nodes and interdependent math calculations. Imagine a small device trying to recognize handwriting," Carter says. "You run every pixel of that through lots and lots of math, combining and mixing, cutting some, amplifying others, until you get a 98 percent answer at the other end. Quantum computing could help with that, and researchers are looking at how you would do that, using a different level of parallel math."
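The "lots and lots of math" Carter describes is, concretely, layers of weighted sums passed through nonlinearities. A deliberately tiny sketch (hypothetical weights, a three-pixel "image", standard library only) makes that structure explicit:

```python
import math

def dense_layer(inputs, weights, biases):
    """One fully connected layer: parallel weighted sums ('combining and
    mixing'), then a squashing nonlinearity that cuts some signals and
    amplifies others."""
    return [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Toy 3-pixel "image" pushed through two layers to one confidence score.
pixels = [0.2, 0.9, 0.4]
hidden = dense_layer(pixels, [[0.5, -0.3, 0.8], [0.1, 0.4, -0.2]], [0.0, 0.1])
score = dense_layer(hidden, [[1.2, -0.7]], [0.05])[0]
print(score)  # a value in (-1, 1), read as classification confidence
```

Each output is independent of the others within a layer, which is exactly the kind of parallel arithmetic that both GPUs today and, speculatively, quantum processors tomorrow could accelerate.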

"How quantum computing will be applied to HPEC will be the big trick, how to get that deployed. Imagine we're a SIGINT [signals intelligence] platform (land, air or sea); there are a lot of challenges, such as picking the right signal out of the air, which is not particularly easy," Carter continues. "Once you achieve pattern recognition, you want to do code breaking to get that encrypted traffic immediately. Getting that on a deployed platform could be useful; otherwise you bring your data back to a quantum computer in a building, but that means you don't get the results immediately."

The technology research underway today is expected to show progress toward making quantum computing more applicable to military needs, but it is unlikely to produce major results quickly, especially in the area of HPEC.

Trapped ions and superconducting circuits still require a lot of infrastructure to make them work. Some teams are working on that problem, but the systems still remain room-sized. The idea of quantum computing being like an integrated circuit you just put on a circuit board: we're a very long way from that, Biercuk says. The systems are getting smaller, more compact, but there is a very long way to go to deployable, embeddable systems. Position, navigation and timing systems are being reduced and can be easily deployed on aircraft. That's probably where the technology will remain in the next 20 years; but, eventually, with new technology development, quantum computing may be reduced to more mobile sizes.

The next 10 years are about achieving quantum advantage with the systems available now or their iterations. Despite the acceleration we have seen, there are things that are just hard and require a lot of creativity, Biercuk continues. We're shrinking the hardware, but that hardware still may not be relevant to any deployable system. In 20 years, we may have machines that can do the work required, but in that time we may only be able to shrink them to a size that can fit on an aircraft carrier: local code-breaking engines. To miniaturize this technology to put it on, say, a body-carried system, we just don't have any technology basis to claim we will get there even in 20 years. That's open to creativity and discovery.

Even with all of the research underway worldwide, one question remains dominant.

The general challenge is it is not clear what we will use quantum computing for, notes Rad Balu, a computer scientist in ARL's Computational & Informational Sciences Directorate.

The rest is here:

The future of artificial intelligence and quantum computing - Military & Aerospace Electronics

Written by admin

September 1st, 2020 at 10:55 am

Posted in Quantum Computer

Researchers Found Another Impediment for Quantum Computers to Overcome – Dual Dove


Keeping qubits stable will be the key to realizing the potential of quantum computing, and researchers have now discovered a new obstacle to that stability: natural radiation.

Natural or background radiation is produced by various sources, both natural and artificial: cosmic rays produce it, for instance, and so do concrete buildings. It surrounds us all the time, and so it poses something of an issue for future quantum computers.

After numerous experiments that modified the level of natural radiation around qubits, physicists could establish that this background noise does indeed push qubits off balance in a way that hinders them from operating properly.

Our study is the first to show clearly that low-level ionizing radiation in the environment degrades the performance of superconducting qubits, says physicist John Orrell, from the Pacific Northwest National Laboratory (PNNL). These findings suggest that radiation shielding will be necessary to attain long-sought performance in quantum computers of this design.

Natural radiation is by no means the most important or the only menace to qubit stability, a property known as coherence; everything from temperature variations to electromagnetic fields is able to mess with a qubit.

However, scientists say that if we're to attain a future where quantum computers perform most of our advanced computing needs, then this hindrance from natural radiation is going to have to be addressed.

After the team that carried out the study was faced with issues of superconducting qubit decoherence, it decided to examine natural radiation as a possible culprit. The researchers discovered it breaks up a key quantum binding known as a Cooper pair of electrons.

The radiation breaks apart matched pairs of electrons that typically carry electric current without resistance in a superconductor, says physicist Brent VanDevender, from PNNL. The resistance of those unpaired electrons destroys the delicately prepared state of a qubit.

Regular computers can be disrupted by the same issues that affect qubits, but quantum states are far more delicate and sensitive. One of the reasons we don't have true full-scale quantum computers yet is that there's no way to keep qubits stable for more than a few milliseconds at a time.

If we can improve on that, the benefits in computing power could be gigantic: while classical computer bits can only be set as 1 or 0, qubits can be set as 1, 0, or both at the same time, a state known as superposition.
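The "1, 0, or both" idea can be made concrete with a tiny state-vector sketch (illustrative only; the amplitude values are my own, not from the article):

```python
import math

# A qubit's state is a unit vector a|0> + b|1>. "Both at the same time"
# means a measurement yields 0 with probability |a|^2 and 1 with |b|^2.
a = b = 1 / math.sqrt(2)          # equal superposition of 0 and 1
p0, p1 = a * a, b * b

print(p0, p1)                     # ~0.5 each; neither outcome is fixed until measured
assert abs(p0 + p1 - 1) < 1e-12   # a valid state's probabilities always sum to 1
```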

Researchers have managed to get this happening, but only for very short periods, and in extremely controlled settings. The good news, however, is that scientists like those at PNNL are dedicated to the challenge of discovering how to make quantum computers a reality, and with the new finding, we know a bit more about what we've got to overcome.

Practical quantum computing with these devices will not be possible unless we address the radiation issue, says VanDevender. Without mitigation, radiation will limit the coherence time of superconducting qubits to a few milliseconds, which is insufficient for practical quantum computing.

A paper detailing the research has been published in the journal Nature.

Known for her passion for writing, Paula contributes to both Science and Health niches here at Dual Dove.

See original here:

Researchers Found Another Impediment for Quantum Computers to Overcome - Dual Dove


Quantum Cryptography Market Research Analysis Including Growth Factors, Types And Application By Regions From 2024 – Kentucky Journal 24



Quantum cryptography is a new method for secret communications that provides assurance of the security of digital data. It is primarily based on the use of individual particles/waves of light (photons) and their essential quantum properties to develop an unbreakable cryptosystem, primarily because it is impossible to measure the quantum state of any system without disturbing that system.


It is hypothetically possible that other particles could be used, but photons offer all the necessary qualities: their behavior is comparatively well understood, and they are the information carriers in optical fiber cables, the most promising medium for very high-bandwidth communications.

Quantum computing is a growing field of computer technology built on quantum theory, which describes the nature and behavior of energy and matter at the quantum level. The prominence of quantum mechanics in cryptography is growing because it is being used extensively in the encryption of information. Quantum cryptography allows the transmission of the most critical data at the most secure level, which in turn propels the growth of the quantum computing market. Quantum computing has a huge array of applications.

Market Analysis:

According to Infoholic Research, the global quantum cryptography market is expected to reach $1.53 billion by 2023, growing at a CAGR of around 26.13% during the forecast period. The market is experiencing growth due to increasing data security and privacy concerns. In addition, growth in the adoption of cloud storage and computing technologies is driving the market forward. However, low customer awareness of quantum cryptography is hindering market growth. Rising demand for security solutions across different verticals is expected to create lucrative opportunities for the market.

Market Segmentation Analysis:

The report provides a wide-ranging evaluation of the market. It provides in-depth qualitative insights, historical data, and supportable projections and assumptions about the market size. The projections featured in the report have been derived using proven research methodologies and assumptions based on the vendors portfolio, blogs, whitepapers, and vendor presentations. Thus, the research report serves every side of the market and is segmented based on regional markets, type, applications, and end-users.

Countries and Vertical Analysis:

The report contains an in-depth analysis of the vendor profiles, which include financial health, business units, key business priorities, SWOT, strategy, and views; and competitive landscape. The prominent vendors covered in the report include ID Quantique, MagiQ Technologies, Nucrypt, Infineon Technologies, Qutools, QuintessenceLabs, Crypta Labs, PQ Solutions, and Qubitekk, among others. The vendors have been identified based on the portfolio, geographical presence, marketing & distribution channels, revenue generation, and significant investments in R&D.


Competitive Analysis

The report covers and analyzes the global quantum cryptography market. Various strategies, such as joint ventures, partnerships, collaborations, and contracts, have been considered. In addition, as customers are in search of better solutions, there is expected to be a rising number of strategic partnerships for better product development. There is likely to be an increase in the number of mergers, acquisitions, and strategic partnerships during the forecast period.

Companies such as Nucrypt, Crypta Labs, Qutools, and MagiQ Technologies are the key players in the global quantum cryptography market. Nucrypt has developed technologies for emerging applications in metrology and communication. The company has also produced and manufactured electronic and optical pulsers. In addition, Crypta Labs deals in application security for devices. The company deals in Quantum Random Number Generator products and solutions and the Internet of Things (IoT). The major sectors the company is looking at are transport, military and medical.

The report includes complete insight into the industry, and aims to provide an opportunity for emerging and established players to understand the market trends, current scenario, government initiatives, and the latest technologies related to the market. In addition, it helps venture capitalists understand the companies better and make informed decisions.

Regional Analysis

The Americas held the largest chunk of market share in 2017 and is expected to dominate the quantum cryptography market during the forecast period. The region has always been a hub for high investments in research and development (R&D) activities, thus contributing to the development of new technologies. The growing concerns for the security of IT infrastructure and complex data in America have directed the enterprises in this region to adopt quantum cryptography and reliable authentication solutions.



The report provides an in-depth analysis of the global quantum cryptography market, aiming to reduce the time to market for products and services, reduce operational cost, and improve accuracy and operational performance. With the help of quantum cryptography, various organizations can secure their crucial information and increase productivity and efficiency. In addition, the solutions are proven to be reliable and improve scalability. The report discusses the types, applications, and regions related to this market. Further, the report provides details about the major challenges impacting market growth.

Read more here:

Quantum Cryptography Market Research Analysis Including Growth Factors, Types And Application By Regions From 2024 - Kentucky Journal 24


Q-NEXT collaboration awarded National Quantum Initiative funding – University of Wisconsin-Madison


The University of Wisconsin–Madison solidified its standing as a leader in the field of quantum information science when the U.S. Department of Energy (DOE) and the White House announced the Q-NEXT collaboration as a funded Quantum Information Science Research Center through the National Quantum Initiative Act. The five-year, $115 million collaboration was one of five centers announced today.

Q-NEXT, a next-generation quantum science and engineering collaboration led by the DOE's Argonne National Laboratory, brings together nearly 100 world-class researchers from three national laboratories, 10 universities including UW–Madison, and 10 leading U.S. technology companies to develop the science and technology to control and distribute quantum information.

The main goals for Q-NEXT are first to deliver quantum interconnects, to find ways to quantum mechanically connect distant objects, says Mark Eriksson, the John Bardeen Professor of Physics at UW–Madison and a Q-NEXT thrust lead. And next, to establish a national resource to both develop and provide pristine materials for quantum science and technology.

Q-NEXT will focus on three core quantum technologies:

Eriksson is leading the Materials and Integration thrust, one of six Q-NEXT focus areas that features researchers from across the collaboration. This thrust aims to: develop high-coherence materials, including for silicon and superconducting qubits, which is an essential component of preserving entanglement; develop a silicon-based optical quantum memory, which is important in developing a quantum repeater; and improve color-center quantum bits, which are used in both communication and sensing.

One of the key goals in Materials and Integration is to not just improve the materials but also to improve how you integrate those materials together so that in the end, quantum devices maintain coherence and preserve entanglement, Eriksson says. The integration part of the name is really important. You may have a material that on its own is really good at preserving coherence, yet you only make something useful when you integrate materials together.

Six other UW–Madison and Wisconsin Quantum Institute faculty members are Q-NEXT investigators: physics professors Victor Brar, Shimon Kolkowitz, Robert McDermott, and Mark Saffman; electrical and computer engineering professor Mikhail Kats; and chemistry professor Randall Goldsmith. UW–Madison researchers are involved in five of the six research thrusts.

I'm excited about Q-NEXT because of the connections and collaborations it provides to national labs, other universities, and industry partners, Eriksson says. When you're talking about research, it's those connections that often lead to the breakthroughs.

The potential impacts of Q-NEXT research include the creation of a first-ever National Quantum Devices Database that will promote the development and fabrication of next-generation quantum devices, as well as the development of the components and systems that enable quantum communications across distances ranging from microns to kilometers.

This funding helps ensure that the Q-NEXT collaboration will lead the way in future developments in quantum science and engineering, says Steve Ackerman, UW–Madison vice chancellor for research and graduate education. Q-NEXT is the epitome of the Wisconsin Idea as we work together to transfer new quantum technologies to the marketplace and support U.S. economic competitiveness in this growing field.

Read more here:

Q-NEXT collaboration awarded National Quantum Initiative funding - University of Wisconsin-Madison


This Equation Calculates The Chances We Live In A Computer Simulation – Discover Magazine



The Drake equation is one of the more famous reckonings in science. It calculates the likelihood that we are not alone in the universe by estimating the number of other intelligent civilizations in our galaxy that might exist now.

Some of the terms in this equation are well known or becoming better understood, such as the number of stars in our galaxy and the proportion that have planets in the habitable zone. But others are unknown, such as the proportion of planets that develop intelligent life; and some may never be known, such as the proportion of civilizations that destroy themselves before they can be discovered.

Nevertheless, the Drake equation allows scientists to place important bounds on the numbers of intelligent civilizations that might be out there.

However, there is another sense in which humanity could be linked with an alien intelligence: our world may just be a simulation inside a massively powerful supercomputer run by such a species. Indeed, various scientists, philosophers and visionaries have said that the probability of such a scenario could be close to one. In other words, we probably are living in a simulation.

The accuracy of these claims is somewhat controversial. So a better way to determine the probability that we live in a simulation would be much appreciated.

Enter Alexandre Bibeau-Delisle and Gilles Brassard at the University of Montreal in Canada. These researchers have derived a Drake-like equation that calculates the chances that we live in a computer simulation. And the results throw up some counterintuitive ideas that are likely to change the way we think about simulations, how we might determine whether we are in one and whether we could ever escape.

Bibeau-Delisle and Brassard begin with a fundamental estimate of the computing power available to create a simulation. They say, for example, that a kilogram of matter, fully exploited for computation, could perform 10^50 operations per second.

By comparison, the human brain, which is also kilogram-sized, performs up to 10^16 operations per second. It may thus be possible for a single computer the mass of a human brain to simulate the real-time evolution of 1.4 × 10^25 virtual brains, they say.

In our society, a significant number of computers already simulate entire civilizations, in games such as Civilization VI, Hearts of Iron IV, Humankind and so on. So it may be reasonable to assume that in a sufficiently advanced civilization, individuals will be able to run games that simulate societies like ours, populated with sentient, conscious beings.

So an interesting question is this: of all the sentient beings in existence, what fraction are likely to be simulations? To derive the answer, Bibeau-Delisle and Brassard start with the total number of real sentient beings, N_Re; multiply that by the fraction with access to the necessary computing power, f_Civ; multiply this by the fraction of that power that is devoted to simulating consciousness, f_Ded (because these beings are likely to be using their computers for other purposes too); and then multiply this by the number of brains they could simulate, R_Cal.

The resulting equation is this, where f_Sim is the fraction of simulated brains:

f_Sim = (f_Civ × f_Ded × R_Cal) / (f_Civ × f_Ded × R_Cal + 1)

Here R_Cal is the huge number of brains that fully exploited matter should be able to simulate.

The sheer size of this number, ~10^25, pushes Bibeau-Delisle and Brassard towards an inescapable conclusion. It is mathematically inescapable from [the above] equation and the colossal scale of R_Cal that f_Sim ≈ 1 unless f_Civ × f_Ded ≈ 0, they say.
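The conclusion can be checked numerically. A minimal sketch (the function name and the sample fractions are my own; R_Cal ≈ 10^25 comes from the article):

```python
# Fraction of sentient beings that are simulated, following the
# Drake-like estimate described above: f_Sim saturates at 1 as soon as
# f_Civ * f_Ded * R_Cal dwarfs the one real brain per real being.

def f_sim(f_civ, f_ded, r_cal=1e25):
    x = f_civ * f_ded * r_cal
    return x / (x + 1.0)

# Even absurdly small fractions push f_Sim towards 1, because R_Cal is so huge:
print(f_sim(1e-6, 1e-6))   # ~1.0
# ...unless f_Civ * f_Ded is essentially zero:
print(f_sim(0.0, 0.5))     # 0.0
```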

So there are two possible outcomes. Either we live in a simulation or a vanishingly small proportion of advanced computing power is devoted to simulating brains.

It's not hard to imagine why the second option might be true. A society of beings similar to us (but with much greater technological development) could indeed decide it is not very ethical to simulate beings with enough precision to make them conscious while fooling them and keeping them cut off from the real world, say Bibeau-Delisle and Brassard.

Another possibility is that advanced civilizations never get to the stage where their technology is powerful enough to perform these kinds of computations. Perhaps they destroy themselves through war or disease or climate change long before then. There is no way of knowing.

But suppose we are in a simulation. Bibeau-Delisle and Brassard ask whether we might escape while somehow hiding our intentions from our overlords. They assume that the simulating technology will be quantum in nature. If quantum phenomena are as difficult to compute on classical systems as we believe them to be, a simulation containing our world would most probably run on quantum computing power, they say.

This raises the possibility that it may be possible to detect our alien overlords since they cannot measure the quantum nature of our world without revealing their presence. Quantum cryptography uses the same principle; indeed, Brassard is one of the pioneers of this technology.
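That principle, that measuring a quantum state disturbs it and so exposes the measurer, can be sketched with a BB84-style intercept-resend simulation (a toy model; the function and the ~25% error figure follow the standard textbook analysis of the protocol, not the article):

```python
import random

# BB84-style intercept-resend sketch: an eavesdropper who measures
# qubits in random bases corrupts ~25% of the sifted key, which is
# how measurement reveals the measurer's presence.

def send_and_sift(n, eavesdrop, rng):
    errors = matches = 0
    for _ in range(n):
        bit = rng.randint(0, 1)
        basis_a = rng.randint(0, 1)
        state = (bit, basis_a)
        if eavesdrop:
            basis_e = rng.randint(0, 1)
            # Measuring in the wrong basis randomizes the bit
            bit_e = state[0] if basis_e == state[1] else rng.randint(0, 1)
            state = (bit_e, basis_e)  # resent in the eavesdropper's basis
        basis_b = rng.randint(0, 1)
        bit_b = state[0] if basis_b == state[1] else rng.randint(0, 1)
        if basis_a == basis_b:        # keep only matching-basis rounds
            matches += 1
            errors += (bit_b != bit)
    return errors / matches

rng = random.Random(0)
print(send_and_sift(20000, eavesdrop=False, rng=rng))  # 0.0
print(send_and_sift(20000, eavesdrop=True, rng=rng))   # ~0.25
```

Comparing the two error rates is exactly how a quantum channel betrays an observer who cannot avoid measuring it.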

That would seem to make it possible for us to make encrypted plans that are hidden from the overlords, such as secretly transferring ourselves into our own simulations.

However, the overlords have a way to foil this. All they need to do is rewire their simulation to make it look as if we are able to hide information, even though they are aware of it all the time. If the simulators are particularly angry at our attempted escape, they could also send us to a simulated hell, in which case we would at least have the confirmation we were truly living inside a simulation and our paranoia was not unjustified, conclude Bibeau-Delisle and Brassard, with their tongues firmly in their cheeks.

In that sense, we are the ultimate laboratory guinea pigs: forever trapped and forever fooled by the evil genius of our omnipotent masters.

Time for another game of Civilization VI.

Ref: Probability and Consequences of Living Inside a Computer Simulation

Here is the original post:

This Equation Calculates The Chances We Live In A Computer Simulation - Discover Magazine


I confess, I’m scared of the next generation of supercomputers – TechRadar


Earlier this year, a Japanese supercomputer built on Arm-based Fujitsu A64FX processors snatched the crown of world's fastest machine, blowing incumbent leader IBM Summit out of the water.

Fugaku, as the machine is known, achieved 415.5 petaFLOPS by the popular High Performance Linpack (HPL) benchmark, which is almost three times the score of the IBM machine (148.5 petaFLOPS).

It also topped the rankings for Graph 500, HPL-AI and HPCG workloads - a feat never before achieved in the world of high performance computing (HPC).

Modern supercomputers are now edging ever-closer to the landmark figure of one exaFLOPS (equal to 1,000 petaFLOPS), commonly described as the exascale barrier. In fact, Fugaku itself can already achieve one exaFLOPS, but only in lower precision modes.

The consensus among the experts we spoke to is that a single machine will breach the exascale barrier within the next 6 - 24 months, unlocking a wealth of possibilities in the fields of medical research, climate forecasting, cybersecurity and more.

But what is an exaFLOPS? And what will it mean to break the exascale milestone, pursued doggedly for more than a decade?

To understand what it means to achieve exascale computing, it's important to first understand what is meant by FLOPS, which stands for floating point operations per second.

A floating point operation is any mathematical calculation (i.e. addition, subtraction, multiplication or division) that involves a number containing a decimal (e.g. 3.0 - a floating point number), as opposed to a number without a decimal (e.g. 3 - a binary integer). Calculations involving decimals are typically more complex and therefore take longer to solve.

An exascale computer can perform 10^18 (one quintillion, or 1,000,000,000,000,000,000) of these mathematical calculations every second.

For context, to equal the number of calculations an exascale computer can process in a single second, an individual would have to perform one sum every second for 31,688,765,000 years.

The PC I'm using right now, meanwhile, is able to reach 147 billion FLOPS (or 0.00000014723 exaFLOPS), outperforming the fastest supercomputer of 1993, the Intel Paragon (143.4 billion FLOPS).

This both underscores how far computing has come in the last three decades and puts into perspective the extreme performance levels attained by the leading supercomputers today.
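The comparisons above are straightforward unit conversions; a quick sketch (figures taken from the article):

```python
# Unit arithmetic behind the exascale comparisons above.
EXA = 10**18                       # operations per second at one exaFLOPS

# One person doing one calculation per second:
seconds_per_year = 365.25 * 24 * 3600
years_for_one_exa_second = EXA / seconds_per_year
print(f"{years_for_one_exa_second:.2e} years")   # ~3.17e10, i.e. ~31.7 billion years

# A 147-billion-FLOPS desktop PC expressed in exaFLOPS:
print(147e9 / EXA)                               # ~1.47e-07

# Fugaku's 415.5 petaFLOPS expressed in exaFLOPS:
print(415.5e15 / EXA)                            # ~0.4155
```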

The key to building a machine capable of reaching one exaFLOPS is optimization at the processing, storage and software layers.

The hardware must be small and powerful enough to pack together and reach the necessary speeds, the storage capacious and fast enough to serve up the data and the software scalable and programmable enough to make full use of the hardware.

For example, there comes a point at which adding more processors to a supercomputer will no longer affect its speed, because the application is not sufficiently optimized. The only way governments and private businesses will realize a full return on HPC hardware investment is through an equivalent investment in software.

Organizations such as the Exascale Computing Project (ECP) and the ExCALIBUR programme are interested in solving precisely this problem. Those involved claim a renewed focus on algorithm and application development is required in order to harness the full power and scope of exascale.

Achieving the delicate balance between software and hardware, in an energy efficient manner and avoiding an impractically low mean time between failures (MTBF) score (the time that elapses before a system breaks down under strain) is the challenge facing the HPC industry.

15 years ago, as we started the discussion on exascale, we hypothesized that it would need to be done in 20 megawatts (MW); later that was changed to 40 MW. With Fugaku, we see that we are about halfway to a 64-bit exaFLOPS at the 40 MW power envelope, which shows that an exaFLOPS is in reach today, explained Brent Gorda, Senior Director HPC at UK-based chip designer Arm.

We could hit an exaFLOPS now with sufficient funding to build and run a system. [But] the size of the system is likely to be such that MTBF is measured in a single-digit number of days, based on today's technologies and the number of components necessary to reach these levels of performance.

When it comes to building a machine capable of breaching the exascale barrier, there are a number of other factors at play, beyond technological feasibility. An exascale computer can only come into being once an equilibrium has been reached at the intersection of technology, economics and politics.

One could in theory build an exascale system today by packing in enough CPUs, GPUs and DPUs. But what about economic viability? said Gilad Shainer of NVIDIA Mellanox, the firm behind the InfiniBand technology (the fabric that links the various hardware components) found in seven of the ten fastest supercomputers.

Improvements in computing technologies, silicon processing, more efficient use of power and so on all help to increase efficiency and make exascale computing an economic objective as opposed to a sort of sporting achievement.

According to Paul Calleja, who heads up computing research at the University of Cambridge and is working with Dell on the Open Exascale Lab, Fugaku is an excellent example of what is theoretically possible today, but is also impractical by virtually any other metric.

If you look back at Japanese supercomputers, historically there's only ever been one of them made. They have beautifully exquisite architectures, but they're so stupidly expensive and proprietary that no one else could afford one, he told TechRadar Pro.

[Japanese organizations] like these really large technology demonstrators, which are very useful in industry because it shows the direction of travel and pushes advancements, but those kinds of advancements are very expensive and not sustainable, scalable or replicable.

So, in this sense, there are two separate exascale landmarks: the theoretical barrier, which will likely be met first by a machine of Fugaku's ilk (a technological demonstrator), and the practical barrier, which will see exascale computing deployed en masse.

Geopolitical factors will also play a role in how quickly the exascale barrier is breached. Researchers and engineers might focus exclusively on the technological feat, but the institutions and governments funding HPC research are likely motivated by different considerations.

Exascale computing is not just about reaching theoretical targets, it is about creating the ability to tackle problems that have been previously intractable, said Andy Grant, Vice President HPC & Big Data at IT services firm Atos, influential in the fields of HPC and quantum computing.

Those that are developing exascale technologies are not doing it merely to have the fastest supercomputer in the world, but to maintain international competitiveness, security and defence.

In Japan, their new machine is roughly 2.8x more powerful than the now-second place system. In broad terms, that will enable Japanese researchers to address problems that are 2.8x more complex. In the context of international competitiveness, that creates a significant advantage.

In years gone by, rival nations fought it out in the trenches or competed to see who could place the first human on the moon. But computing may well become the frontier at which the next arms race takes place; supremacy in the field of HPC might prove just as politically important as military strength.

Once exascale computers become an established resource - available for businesses, scientists and academics to draw upon - a wealth of possibilities will be unlocked across a wide variety of sectors.

HPC could prove revelatory in the fields of clinical medicine and genomics, for example, which require vast amounts of compute power to conduct molecular modelling, simulate interactions between compounds and sequence genomes.

In fact, IBM Summit and a host of other modern supercomputers are being used to identify chemical compounds that could contribute to the fight against coronavirus. The Covid-19 High Performance Computing Consortium assembled 16 supercomputers, accounting for an aggregate of 330 petaFLOPS - but imagine how much more quickly research could be conducted using a fleet of machines capable of reaching 1,000 petaFLOPS on their own.

Artificial intelligence, meanwhile, is another cross-disciplinary domain that will be transformed with the arrival of exascale computing. The ability to analyze ever-larger datasets will improve the ability of AI models to make accurate forecasts (contingent on the quality of data fed into the system) that could be applied to virtually any industry, from cybersecurity to e-commerce, manufacturing, logistics, banking, education and many more.

As explained by Rashid Mansoor, CTO at UK supercomputing startup Hadean, the value of supercomputing lies in the ability to make an accurate projection (of any variety).

The primary purpose of a supercomputer is to compute some real-world phenomenon to provide a prediction. The prediction could be the way proteins interact, the way a disease spreads through the population, how air moves over an aerofoil or electromagnetic fields interact with a spacecraft during re-entry, he told TechRadar Pro.

Raw performance such as the HPL benchmark simply indicates that we can model bigger and more complex systems to a greater degree of accuracy. One thing that the history of computing has shown us is that the demand for computing power is insatiable.

Other commonly cited areas that will benefit significantly from the arrival of exascale include brain mapping, weather and climate forecasting, product design and astronomy, but it's also likely that brand new use cases will emerge as well.

The desired workloads and the technology to perform them form a virtuous circle. The faster and more performant the computers, the more complex problems we can solve and the faster the discovery of new problems, explained Shainer.

What we can be sure of is that we will see continuous, ever-growing demands for more performance capabilities in order to solve the unsolvable. Once this is solved, we will find the new unsolvable.

By all accounts, the exascale barrier will likely fall within the next two years, but the HPC industry will then turn its attention to the next objective, because the work is never done.

Some might point to quantum computers, which approach problem solving in an entirely different way to classical machines (exploiting quantum effects such as superposition and entanglement to speed up certain computations), allowing for far greater scale on some problems. However, there are also problems to which quantum computing cannot usefully be applied.

Mid-term (10 year) prospects for quantum computing are starting to shape up, as are other technologies. These will be more specialized: a quantum computer will very likely show up first as an application accelerator, for problems that relate to logistics. They won't completely replace the need for current architectures for IT/data processing, explained Gorda.

As Mansoor puts it, on certain problems even a small quantum computer can be exponentially faster than all of the classical computing power on earth combined. Yet on other problems, a quantum computer could be slower than a pocket calculator.
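Mansoor's contrast can be made concrete. The exponential speedups refer to problems such as integer factoring (Shor's algorithm); even Grover's more modest quadratic speedup for unstructured search shows the gap, while for ordinary sequential arithmetic a quantum computer offers no advantage at all. A rough query-count sketch (illustrative only, not drawn from the article):

```python
import math

# Classical unstructured search must check on the order of N items in the
# worst case; Grover's algorithm needs roughly (pi/4) * sqrt(N) oracle queries.
def classical_queries(n_items: int) -> int:
    return n_items

def grover_queries(n_items: int) -> int:
    return math.ceil(math.pi / 4 * math.sqrt(n_items))

N = 2**40  # a roughly trillion-item haystack
print(classical_queries(N))  # 1,099,511,627,776 checks
print(grover_queries(N))     # 823,550 queries - a million-fold reduction
```

The asymmetry cuts both ways: for a task with no exploitable structure, the overhead of preparing and measuring quantum states can easily leave the quantum machine behind the pocket calculator.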

The next logical landmark for traditional computing, then, would be one zettaFLOPS, equal to 1,000 exaFLOPS or 1,000,000 petaFLOPS.
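These unit relationships are simple powers of 1,000, and they put the earlier consortium figure in perspective. A quick sanity check (the 330-petaFLOPS figure comes from the article above):

```python
# FLOPS unit ladder - each named step is a factor of 1,000.
PETA = 10**15
EXA = 10**18
ZETTA = 10**21

assert ZETTA == 1_000 * EXA          # 1,000 exaFLOPS
assert ZETTA == 1_000_000 * PETA     # 1,000,000 petaFLOPS

# The Covid-19 HPC Consortium's combined 330 petaFLOPS next to a single
# exascale machine and a hypothetical zettascale one.
consortium = 330 * PETA
print(round(EXA / consortium, 1))    # 3.0 - one exascale machine is ~3x the consortium
print(round(ZETTA / consortium))     # 3030 - a zettascale machine would be ~3,000x
```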

Chinese researchers predicted in 2018 that the first zettascale system will come online in 2035, paving the way for new computing paradigms. The paper itself reads like science fiction, at least for the layman:

To realize these metrics, micro-architectures will evolve to consist of more diverse and heterogeneous components. Many forms of specialized accelerators are likely to co-exist to boost HPC in a joint effort. Enabled by new interconnect materials such as photonic crystal, fully optical interconnecting systems may come into use.

Assuming one exaFLOPS is reached by 2022, 14 years will have elapsed between the creation of the first petascale and first exascale systems. The first terascale machine, meanwhile, was constructed in 1996, 12 years before the petascale barrier was breached.

If this pattern were to continue, the Chinese researchers' estimate would look relatively sensible, but there are firm question marks over the validity of zettascale projections.
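The pattern in question is easy to state: terascale in 1996, petascale in 2008, exascale assumed for 2022 - a 1,000x step every 12 to 14 years. Extrapolating the average gap one more step lands close to the researchers' prediction (a naive projection, nothing more):

```python
# Years in which each 1,000x milestone was reached (exascale assumed, per the article).
milestones = {"terascale": 1996, "petascale": 2008, "exascale": 2022}

years = list(milestones.values())
gaps = [later - earlier for earlier, later in zip(years, years[1:])]
print(gaps)  # [12, 14]

avg_gap = sum(gaps) / len(gaps)  # 13.0 years per 1,000x step
print(years[-1] + avg_gap)       # 2035.0 - in line with the 2018 Chinese estimate
```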

While experts are confident in their predicted exascale timelines, none would venture a guess at when zettascale might arrive without prefacing their estimate with a long list of caveats.

Is that an interesting subject? Because to be honest with you, it's so not obtainable. To imagine how we could go 1,000x beyond [one exaFLOPS] is not a conversation anyone could have, unless they're just making it up, said Calleja, when asked about the concept of zettascale.

Others were more willing to theorize, but equally reluctant to guess at a specific timeline. According to Grant, the way zettascale machines process information will be unlike any supercomputer in existence today.

[Zettascale systems] will be data-centric, meaning components will move to the data rather than the other way around, as data volumes are likely to be so large that moving data will be too expensive. Regardless, predicting what they might look like is all guesswork for now, he said.

It is also possible that the decentralized model might be the fastest route to achieving zettascale, with millions of less powerful devices working in unison to form a collective supercomputer more powerful than any single machine (as put into practice by the SETI Institute).

As noted by Saurabh Vij, CEO of distributed supercomputing firm Q Blocks, decentralized systems address a number of problems facing the HPC industry today, namely building and maintenance costs. They are also accessible to a much wider range of users, and therefore democratize access to supercomputing resources in a way that is not otherwise possible.

There are benefits to a centralized architecture, but the cost and maintenance barrier overshadows them. [Centralized systems] also alienate a large base of customer groups that could benefit, he said.

We think a better way is to connect distributed nodes together in a reliable and secure manner. It wouldn't be too aggressive to say that, 5 years from now, your smartphone could be part of a giant distributed supercomputer, making money for you while you sleep by solving computational problems for industry, he added.

However, incentivizing network nodes to remain active for a long period is challenging and a high rate of turnover can lead to reliability issues. Network latency and capacity problems would also need to be addressed before distributed supercomputing can rise to prominence.

Ultimately, the difficulty in making firm predictions about zettascale lies in the massive chasm that separates present-day workloads and HPC architectures from those that might exist in the future. From a contemporary perspective, it's fruitless to imagine what might be made possible by a computer so powerful.

We might imagine zettascale machines will be used to process workloads similar to those tackled by modern supercomputers, only more quickly. But it's possible - even likely - that the arrival of zettascale computing will open doors that do not and cannot exist today, so extraordinary is the leap.

In a future in which computers are 2,000+ times as fast as the most powerful machine today, philosophical and ethical debates surrounding the intelligence of man versus machine are bound to be played out in greater detail - and with greater consequence.

It is impossible to directly compare the workings of a human brain with those of a computer - i.e. to assign a FLOPS value to the human mind. However, it is not unreasonable to ask how many FLOPS must be achieved before a machine reaches a level of performance that might be loosely comparable to the brain.

Back in 2013, scientists used the K supercomputer to conduct a neuronal network simulation using open source simulation software NEST. The team simulated a network made up of 1.73 billion nerve cells connected by 10.4 trillion synapses.

While ginormous, the simulation represented only 1% of the human brain's neuronal network and took 40 minutes to replicate one second's worth of neuronal network activity.

However, the K computer reached a maximum computational power of only 10 petaFLOPS. A basic extrapolation (ignoring inevitable complexities), then, would suggest Fugaku could simulate circa 40% of the human brain, while a zettascale computer would be capable of performing a full simulation many times over.
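The extrapolation above is linear: if 10 petaFLOPS covered 1% of the brain, a full simulation needs about 1 exaFLOPS. Under that admittedly crude assumption - ignoring the fact that the K run was 2,400x slower than real time, and taking Fugaku's roughly 442-petaFLOPS HPL result as an assumed figure not given in this excerpt - the numbers work out as follows:

```python
# Naive linear scaling from the 2013 K-computer experiment.
PETA = 10**15
k_flops = 10 * PETA            # K computer: ~10 petaFLOPS...
brain_fraction_on_k = 0.01     # ...simulated 1% of the brain's neuronal network

flops_per_full_brain = k_flops / brain_fraction_on_k
print(flops_per_full_brain / 10**18)  # 1.0 - a full brain needs ~1 exaFLOPS

fugaku_flops = 442 * PETA      # assumed Fugaku HPL figure
print(round(fugaku_flops / flops_per_full_brain, 2))  # 0.44 - circa 40% of the brain

zetta_flops = 10**21
print(zetta_flops / flops_per_full_brain)  # 1000.0 - a thousand full-brain simulations
```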

Digital neuromorphic hardware (supercomputers created specifically to simulate the human brain) like SpiNNaker 1 and 2 will also continue to develop in the post-exascale future. Instead of sending information from point A to B, these machines will be designed to replicate the parallel communication architecture of the brain, sending information simultaneously to many different locations.

Modern iterations are already used to help neuroscientists better understand the mysteries of the brain and future versions, aided by advances in artificial intelligence, will inevitably be used to construct a faithful and fully-functional replica.

The ethical debates that will arise with the arrival of such a machine - surrounding the perception of consciousness, the definition of thought and what an artificial uber-brain could or should be used for - are manifold and could take generations to unpick.

The inability to foresee what a zettascale computer might be capable of is also an inability to plan for the moral quandaries that might come hand-in-hand.

Whether a future supercomputer might be powerful enough to simulate human-like thought is not in question, but whether researchers should aspire to bringing an artificial brain into existence is a subject worthy of discussion.

See the article here:

I confess, I'm scared of the next generation of supercomputers - TechRadar

Written by admin

September 1st, 2020 at 10:55 am

Posted in Quantum Computer

Honeywell Wants To Show What Quantum Computing Can Do For The World – Forbes

Posted: August 14, 2020 at 11:51 pm

without comments

The race for quantum supremacy heated up in June, when Honeywell brought to market the world's highest-performing quantum computer. Honeywell claims it is more accurate (i.e., performs with fewer errors) than competing systems and that its performance will increase by an order of magnitude each year for the next five years.

Inside the chamber of Honeywell's quantum computer

The beauty of quantum computing, says Tony Uttley, President of Honeywell Quantum Solutions, is that once you reach a certain level of accuracy, every time you add a qubit [the basic unit of quantum information] you double the computational capacity. So as the quantum computer scales exponentially, you can scale your problem set exponentially.
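Uttley's doubling claim reflects the mathematics of the quantum state space: an n-qubit register is described by 2^n complex amplitudes. A short illustration of why this growth quickly overwhelms classical simulation:

```python
# Each added qubit doubles the number of complex amplitudes needed to
# describe the register's state.
def state_space_dim(n_qubits: int) -> int:
    return 2 ** n_qubits

assert state_space_dim(11) == 2 * state_space_dim(10)  # one more qubit, twice the space

# Memory to store the full state vector at 16 bytes per complex amplitude:
for n in (10, 30, 50):
    gigabytes = state_space_dim(n) * 16 / 10**9
    print(n, "qubits:", gigabytes, "GB")
# 50 qubits already needs ~18,000,000 GB (~18 PB) - beyond any classical machine.
```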

Tony Uttley, President, Honeywell Quantum Solutions

Uttley sees three distinct eras in the evolution of quantum computing. Today, we are in the emergent era - you can start to prove what kind of things work, what kind of algorithms show the most promise. For example, the Future Lab for Applied Research and Engineering (FLARE) group of JPMorgan Chase published a paper in June summarizing the results of running complex mathematical calculations used in financial trading applications on the Honeywell quantum computer.

The next era Uttley calls classically impractical: running computations on a quantum computer that typically are not run on today's (classical) computers because they take too long, consume too much power, and cost too much. Crossing the threshold from emergent to classically impractical is not very far away, he asserts, probably sometime in the next 18 to 24 months. This is when you build the trust with the organizations you work with that the answer that is coming from your quantum computer is the correct one, says Uttley.

The companies that understand the potential impact of quantum computing on their industries are already looking at what it would take to introduce this new computing capability into their existing processes and what they need to adjust or develop from scratch, according to Uttley. These companies will be ready for the shift from emergent to classically impractical, which is going to be a binary moment, and they will be able to take advantage of it immediately.

The last stage of the quantum evolution will be classically impossible - you couldn't, in the timeframe of the universe, do on the best-performing classical supercomputer the computation that you can on a quantum computer, says Uttley. He mentions quantum chemistry, machine learning, and optimization challenges (warehouse routing, aircraft maintenance) as applications that will benefit from quantum computing. But what shows the most promise right now are hybrid [resources] - you do just one thing, very efficiently, on a quantum computer, and run the other parts of the algorithm or calculation on a classical computer. Uttley predicts that for the foreseeable future we will see co-processing, combining the power of today's computers with the power of emerging quantum computing solutions.

You want to use a quantum computer for the more probabilistic parts [of the algorithm] and a classical computer for the more mundane calculations - that might reduce the number of qubits needed, explains Gavin Towler, vice president and chief technology officer of Honeywell Performance Materials Technologies. Towler leads R&D activities for three of Honeywell's businesses: Advanced Materials (e.g., refrigerants), UOP (equipment and services for the oil and gas sector), and Process Automation (automation, control systems, and software for all the process industries). As such, he is the poster boy for a quantum computing lead-user.
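The division of labor Uttley and Towler describe - the probabilistic piece on a quantum processor, the mundane arithmetic on a classical one - follows the pattern of today's variational hybrid algorithms. A toy sketch of that loop, with the quantum step stood in by a one-qubit formula (cos(theta) is the Z expectation of the state Ry(theta)|0>); no real quantum hardware or SDK is involved:

```python
import math

def quantum_expectation(theta: float) -> float:
    # Stand-in for the quantum co-processor: <Z> for the state Ry(theta)|0>.
    return math.cos(theta)

def classical_update(theta: float, lr: float = 0.1) -> float:
    # Ordinary classical work: a finite-difference gradient descent step.
    eps = 1e-4
    grad = (quantum_expectation(theta + eps) - quantum_expectation(theta - eps)) / (2 * eps)
    return theta - lr * grad

theta = 0.5
for _ in range(200):  # the classical optimizer drives repeated "quantum" evaluations
    theta = classical_update(theta)

print(round(quantum_expectation(theta), 3))  # -1.0: the minimum energy found
```

In a real deployment the `quantum_expectation` call would be the only part dispatched to quantum hardware, which is exactly the co-processing arrangement Uttley predicts.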

Gavin Towler, Vice President and Chief Technology Officer, Honeywell Performance Materials and Technologies

In the space of materials discovery, quantum computing is going to be critical. That's not a might or could be. It is going to be the way people do molecular discovery, says Towler. Molecular simulation is used in the design of new molecules, requiring the designer to understand quantum effects. These are intrinsically probabilistic, as are quantum computers, Towler explains.

An example he provides is a refrigerant Honeywell produces that is used in automotive air conditioning, supermarket refrigeration, and homes. As the chlorinated molecules in the refrigerants were causing the hole in the ozone layer, they were replaced by HFCs, which later turned out to be very potent greenhouse gases. Honeywell has already found a suitable replacement for the refrigerant used in automotive air conditioning, but is searching for similar solutions for other refrigeration applications. Synthesizing in the lab molecules that will prove to have no effect on the ozone layer or global warming and will not be toxic or flammable is costly. Computer simulation replaces some lab work, but ideally, you want to have computer models that will screen things out to identify leads much faster, says Towler.

This is where the speed of a quantum computer will make a difference, starting with simple molecules like the ones found in refrigerants or in solvents that are used to remove CO2 from processes prevalent in the oil and gas industry. These are relatively simple molecules, with 10-20 atoms, amenable to being modeled with [today's] quantum computers, says Towler. In the future, he expects more powerful quantum computers to assist in developing vaccines and finding new drugs, polymers, and biodegradable plastics - things that contain hundreds and thousands of atoms.

There are three ways by which Towler's counterparts in other companies, the lead-users who are interested in experimenting with quantum computing, can currently access Honeywell's solution: running their programs directly on Honeywell's quantum computer; going through Microsoft Azure Quantum services; or working with two startups that Honeywell has invested in, Cambridge Quantum Computing (CQC) and Zapata Computing, both of which assist in turning business challenges into quantum computing and hybrid computing algorithms.

Honeywell brings to the emerging quantum computing market a variety of skills in multiple disciplines, with its decades-long experience with precision control systems possibly the most important one. Any at-scale quantum computer becomes a controls problem, says Uttley, and we have experience in some of the most complex systems integration problems in the world. These past experiences have prepared Honeywell to show what quantum computing can do for the world and to rapidly scale up its solution. We've built a big auditorium but we are filling out just a few seats right now, and we have lots more seats to fill, Uttley sums up this point in time in Honeywell's journey to quantum supremacy.

See the original post here:

Honeywell Wants To Show What Quantum Computing Can Do For The World - Forbes


Quantum Computing for the Next Generation of Computer Scientists and Researchers – Campus Technology

Posted: August 14, 2020 at 11:51 pm

without comments

C-Level View | Feature

A Q&A with Travis Humble

Travis Humble is a distinguished scientist and director of the Quantum Computing Institute at Oak Ridge National Laboratory. The institute is a lab-wide organization that brings together all of ORNL's capabilities to address the development of quantum computers. Humble is also an academic, holding a joint faculty appointment at the University of Tennessee, where he is an assistant professor with the Bredesen Center for Interdisciplinary Research and Graduate Education. In the following Q&A, Humble gives CT his unique perspectives on the advancement of quantum computing and its entry into higher education curricula and research.

"It's an exciting area that's largely understaffed. There are far more opportunities than there are people currently qualified to approach quantum computing." Travis Humble

Mary Grush: Working at the Oak Ridge National Laboratory as a scientist and at the University of Tennessee as an academic, you are in a remarkable position to watch both the development of the field of quantum computing and its growing importance in higher education curricula and research. First, let me ask about your role at the Bredesen Center for Interdisciplinary Research and Graduate Education. The Bredesen Center draws on resources from both ORNL and UT. Does the center help move quantum computing into the realm of higher education?

Travis Humble: Yes. The point of the Bredesen Center is to do interdisciplinary research, to educate graduate students, and to address the interfaces and frontiers of science that don't fall within the conventional departments.

For me, those objectives are strongly related to my role at the laboratory, where I am a scientist working in quantum information. And the joint work ORNL and UT do in quantum computing is training the next generation of the workforce that's going to be able to take advantage of the tools and research that we're developing at the laboratory.

Grush: Are ORNL and UT connected to bring students to the national lab to experience quantum computing?

Humble: They are so tightly connected that it works very well for us to have graduate students onsite performing research in these topics, while at the same time advancing their education through the university.

Grush: How does ORNL's Quantum Computing Institute, where you are director, promote quantum computing?

Humble: As part of my work with the Quantum Computing Institute, I manage research portfolios and direct resources towards our most critical needs at the moment. But I also use that responsibility as a gateway to get people involved with quantum computing: It's an exciting area that's largely understaffed. There are far more opportunities than there are people currently qualified to approach quantum computing.

The institute is a kind of storefront through which people from many different areas of science and engineering can become involved in quantum computing. It is there to help them get involved.

Grush: Let's get a bit of perspective on quantum computing why is it important?

Humble: Quantum computing is a new approach to the ways we could build computers and solve problems. This approach uses quantum mechanics, which underpins the most fundamental theories of physics. We've had a lot of success in understanding quantum mechanics - it's the technology that lasers, transistors, and a lot of things that we rely on today were built on.

But it turns out there's a lot of untapped potential there: We could take further advantage of some of the features of quantum physics, by building new types of technologies.

Here is the original post:

Quantum Computing for the Next Generation of Computer Scientists and Researchers - Campus Technology

August 14th, 2020 at 11:51 pm

