
Archive for the ‘Quantum Computer’ Category

Researchers Found Another Impediment for Quantum Computers to Overcome – Dual Dove

Posted: September 1, 2020 at 10:55 am

without comments

Keeping qubits stable will be pivotal to realizing the potential of quantum computing, and now researchers have discovered a new obstacle to this stability: natural radiation.

Natural or background radiation is produced by various sources, both natural and artificial. Cosmic rays produce natural radiation, for instance, and so do concrete buildings. It surrounds us all the time, which poses something of an issue for future quantum computers.

After numerous experiments that modified the level of natural radiation around qubits, physicists could establish that this background noise does indeed push qubits off balance in a way that hinders them from operating properly.

"Our study is the first to show clearly that low-level ionizing radiation in the environment degrades the performance of superconducting qubits," says physicist John Orrell, from the Pacific Northwest National Laboratory (PNNL). "These findings suggest that radiation shielding will be necessary to attain long-sought performance in quantum computers of this design."

Natural radiation is by no means the most important or the only threat to qubit stability, which is known as coherence; everything from temperature fluctuations to electromagnetic fields can knock a qubit off balance.

However, scientists say that if we're to reach a future where quantum computers handle most of our advanced computing needs, this interference from natural radiation will have to be addressed.

After the team behind the study ran into issues with superconducting qubit decoherence, it decided to investigate natural radiation as a possible culprit. The researchers discovered that it breaks up a key quantum bond known as the Cooper pair of electrons.

"The radiation breaks apart matched pairs of electrons that typically carry electric current without resistance in a superconductor," says physicist Brent VanDevender, from PNNL. "The resistance of those unpaired electrons destroys the delicately prepared state of a qubit."

Regular computers can be disrupted by the same issues that affect qubits, but quantum states are far more delicate and sensitive. One of the reasons we don't have true full-scale quantum computers yet is that there's no way to keep qubits stable for more than a few milliseconds at a time.

If we can build on that, the benefits in computing power could be enormous: while a classical bit can only be set to 1 or 0, a qubit can be set to 1, 0, or both at the same time, a state known as superposition.
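The superposition described above can be sketched with a two-amplitude state vector. This minimal sketch uses only the standard library; the names ZERO, ONE and PLUS are illustrative labels, not anything from the article:

```python
import math
import random

# A qubit is described by two complex amplitudes (a, b) for the outcomes
# 0 and 1, normalized so |a|^2 + |b|^2 = 1.
ZERO = (1 + 0j, 0 + 0j)                                 # definitely 0
ONE = (0 + 0j, 1 + 0j)                                  # definitely 1
PLUS = (1 / math.sqrt(2) + 0j, 1 / math.sqrt(2) + 0j)   # equal superposition

def probabilities(state):
    """Born rule: probabilities of measuring 0 and 1."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

def measure(state):
    """Measurement collapses the superposition to a definite 0 or 1."""
    p0, _ = probabilities(state)
    return 0 if random.random() < p0 else 1

print(probabilities(PLUS))  # roughly (0.5, 0.5): both values at once until measured
```

Decoherence, in this picture, is anything (radiation included) that scrambles those amplitudes before the computation is done with them.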

Researchers have managed to achieve this, but only for very short periods and in extremely controlled settings. The good news, however, is that scientists like those at PNNL are dedicated to the challenge of making quantum computers a reality, and with this new finding, we know a bit more about what we've got to overcome.

"Practical quantum computing with these devices will not be possible unless we address the radiation issue," says VanDevender. "Without mitigation, radiation will limit the coherence time of superconducting qubits to a few milliseconds, which is insufficient for practical quantum computing."

A paper detailing the research has been published in the journal Nature.

Known for her passion for writing, Paula contributes to both Science and Health niches here at Dual Dove.

See original here:

Researchers Found Another Impediment for Quantum Computers to Overcome - Dual Dove

Written by admin

September 1st, 2020 at 10:55 am

Posted in Quantum Computer

Quantum Cryptography Market Research Analysis Including Growth Factors, Types And Application By Regions From 2024 – Kentucky Journal 24

Posted: at 10:55 am

without comments


Quantum cryptography is a new method for secret communications that assures the security of digital data. It is primarily based on the use of individual particles/waves of light (photons) and their essential quantum properties to build an unbreakable cryptosystem, chiefly because it is impossible to measure the quantum state of any system without disturbing it.
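The measurement-disturbance principle the paragraph describes is what protocols like BB84 exploit. Below is a deliberately simplified classical toy model (the function names and numbers are illustrative, not from the report) showing why an eavesdropper who measures photons in randomly guessed bases produces a detectable error rate:

```python
import random

# Toy BB84-style sketch (illustrative only). A photon is prepared in one
# of two polarization bases, "+" or "x". Measuring in the matching basis
# reproduces the encoded bit; measuring in the wrong basis yields a
# random result and disturbs the photon's state.

def measure(bit, prep_basis, meas_basis):
    """Return the encoded bit if bases match, else a random outcome."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def error_rate(rounds, eavesdrop):
    """Fraction of mismatched bits on rounds where Alice's and Bob's bases agree."""
    random.seed(0)  # deterministic for the example
    errors = matched = 0
    for _ in range(rounds):
        alice_bit = random.randint(0, 1)
        alice_basis = random.choice("+x")
        bit, basis = alice_bit, alice_basis
        if eavesdrop:
            eve_basis = random.choice("+x")
            bit = measure(bit, basis, eve_basis)  # Eve's measurement disturbs the state
            basis = eve_basis                     # the photon travels on in Eve's basis
        bob_basis = random.choice("+x")
        bob_bit = measure(bit, basis, bob_basis)
        if bob_basis == alice_basis:
            matched += 1
            errors += int(bob_bit != alice_bit)
    return errors / matched

print(error_rate(4000, eavesdrop=False))  # 0.0: clean channel
print(error_rate(4000, eavesdrop=True))   # ~0.25: the eavesdropper is visible
```

In the ideal protocol, Eve's wrong-basis guesses (half the time) randomize the photon, so about a quarter of the compared bits disagree; a clean channel shows no such errors.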


It is hypothetically possible that other particles could be used, but photons offer all the necessary qualities: their behavior is comparatively well understood, and they are the information carriers in optical fiber cables, the most promising medium for very high-bandwidth communications.

Quantum computing focuses on the emerging computer technology built on quantum theory, which describes the nature and behavior of energy and matter at the quantum level. Quantum mechanics is gaining prominence in cryptography because its principles are used extensively in the encryption of information. Quantum cryptography allows the most critical data to be transmitted at the most secure level, which in turn propels the growth of the quantum computing market. Quantum computing has a huge array of applications.

Market Analysis:

According to Infoholic Research, the global quantum cryptography market is expected to reach $1.53 billion by 2023, growing at a CAGR of around 26.13% during the forecast period. The market is experiencing growth due to increasing data security and privacy concerns. In addition, the growing adoption of cloud storage and computing technologies is driving the market forward. However, low customer awareness of quantum cryptography is hindering market growth. Rising demand for security solutions across different verticals is expected to create lucrative opportunities for the market.

Market Segmentation Analysis:

The report provides a wide-ranging evaluation of the market, with in-depth qualitative insights, historical data, and supportable projections and assumptions about the market size. The projections featured in the report have been derived using proven research methodologies and assumptions based on the vendors' portfolios, blogs, whitepapers, and presentations. The research report thus covers every side of the market and is segmented based on regional markets, type, applications, and end-users.

Countries and Vertical Analysis:

The report contains an in-depth analysis of the vendor profiles, which include financial health, business units, key business priorities, SWOT, strategy, and views, as well as the competitive landscape. The prominent vendors covered in the report include ID Quantique, MagiQ Technologies, Nucrypt, Infineon Technologies, Qutools, QuintessenceLabs, Crypta Labs, PQ Solutions, and Qubitekk, among others. The vendors have been identified based on their portfolios, geographical presence, marketing and distribution channels, revenue generation, and significant investments in R&D.


Competitive Analysis

The report covers and analyzes the global quantum cryptography market. Various strategies, such as joint ventures, partnerships, collaborations, and contracts, have been considered. In addition, as customers search for better solutions, a rising number of strategic partnerships for better product development is expected. The number of mergers, acquisitions, and strategic partnerships is likely to increase during the forecast period.

Companies such as Nucrypt, Crypta Labs, Qutools, and MagiQ Technologies are the key players in the global quantum cryptography market. Nucrypt has developed technologies for emerging applications in metrology and communication, and has also produced and manufactured electronic and optical pulsers. Crypta Labs, meanwhile, works in application security for devices, providing Quantum Random Number Generator products and solutions for the Internet of Things (IoT). The major sectors the company is targeting are transport, military and medical.

The report includes complete insight into the industry and aims to give emerging and established players an opportunity to understand the market trends, current scenario, government initiatives, and the latest technologies related to the market. In addition, it helps venture capitalists understand the companies better and make informed decisions.

Regional Analysis

The Americas held the largest chunk of market share in 2017 and is expected to dominate the quantum cryptography market during the forecast period. The region has always been a hub for high investments in research and development (R&D) activities, thus contributing to the development of new technologies. The growing concerns for the security of IT infrastructure and complex data in America have directed the enterprises in this region to adopt quantum cryptography and reliable authentication solutions.



The report provides an in-depth analysis of the global quantum cryptography market, aiming to reduce time to market for products and services, cut operational costs, and improve accuracy and operational performance. With the help of quantum cryptography, organizations can secure their crucial information and increase productivity and efficiency. In addition, the solutions are proven to be reliable and improve scalability. The report discusses the types, applications, and regions related to this market, as well as the major challenges impacting its growth.

Read more here:

Quantum Cryptography Market Research Analysis Including Growth Factors, Types And Application By Regions From 2024 - Kentucky Journal 24

Written by admin

September 1st, 2020 at 10:55 am

Posted in Quantum Computer

Q-NEXT collaboration awarded National Quantum Initiative funding – University of Wisconsin-Madison

Posted: at 10:55 am

without comments

The University of Wisconsin–Madison solidified its standing as a leader in the field of quantum information science when the U.S. Department of Energy (DOE) and the White House announced the Q-NEXT collaboration as a funded Quantum Information Science Research Center through the National Quantum Initiative Act. The five-year, $115 million collaboration was one of five centers announced today.

Q-NEXT, a next-generation quantum science and engineering collaboration led by the DOE's Argonne National Laboratory, brings together nearly 100 world-class researchers from three national laboratories, 10 universities including UW–Madison, and 10 leading U.S. technology companies to develop the science and technology to control and distribute quantum information.

"The main goals for Q-NEXT are first to deliver quantum interconnects, to find ways to quantum mechanically connect distant objects," says Mark Eriksson, the John Bardeen Professor of Physics at UW–Madison and a Q-NEXT thrust lead. "And next, to establish a national resource to both develop and provide pristine materials for quantum science and technology."

Q-NEXT will focus on three core quantum technologies:

Eriksson is leading the Materials and Integration thrust, one of six Q-NEXT focus areas that features researchers from across the collaboration. This thrust aims to: develop high-coherence materials, including for silicon and superconducting qubits, which is an essential component of preserving entanglement; develop a silicon-based optical quantum memory, which is important in developing a quantum repeater; and improve color-center quantum bits, which are used in both communication and sensing.

"One of the key goals in Materials and Integration is to not just improve the materials but also to improve how you integrate those materials together so that, in the end, quantum devices maintain coherence and preserve entanglement," Eriksson says. "The integration part of the name is really important. You may have a material that on its own is really good at preserving coherence, yet you only make something useful when you integrate materials together."

Six other UW–Madison and Wisconsin Quantum Institute faculty members are Q-NEXT investigators: physics professors Victor Brar, Shimon Kolkowitz, Robert McDermott, and Mark Saffman; electrical and computer engineering professor Mikhail Kats; and chemistry professor Randall Goldsmith. UW–Madison researchers are involved in five of the six research thrusts.

"I'm excited about Q-NEXT because of the connections and collaborations it provides to national labs, other universities, and industry partners," Eriksson says. "When you're talking about research, it's those connections that often lead to the breakthroughs."

The potential impacts of Q-NEXT research include the creation of a first-ever National Quantum Devices Database that will promote the development and fabrication of next-generation quantum devices, as well as the development of the components and systems that enable quantum communications across distances ranging from microns to kilometers.

"This funding helps ensure that the Q-NEXT collaboration will lead the way in future developments in quantum science and engineering," says Steve Ackerman, UW–Madison vice chancellor for research and graduate education. "Q-NEXT is the epitome of the Wisconsin Idea as we work together to transfer new quantum technologies to the marketplace and support U.S. economic competitiveness in this growing field."

Read more here:

Q-NEXT collaboration awarded National Quantum Initiative funding - University of Wisconsin-Madison

Written by admin

September 1st, 2020 at 10:55 am

Posted in Quantum Computer

This Equation Calculates The Chances We Live In A Computer Simulation – Discover Magazine

Posted: at 10:55 am

without comments

Credit: metamorworks/Shutterstock

Sign up for our email newsletter for the latest science news

The Drake equation is one of the more famous reckonings in science. It calculates the likelihood that we are not alone in the universe by estimating the number of other intelligent civilizations in our galaxy that might exist now.

Some of the terms in this equation are well known or becoming better understood, such as the number of stars in our galaxy and the proportion that have planets in the habitable zone. But others are unknown, such as the proportion of planets that develop intelligent life; and some may never be known, such as the proportion of civilizations that destroy themselves before they can be discovered.

Nevertheless, the Drake equation allows scientists to place important bounds on the numbers of intelligent civilizations that might be out there.

However, there is another sense in which humanity could be linked with an alien intelligence: our world may just be a simulation inside a massively powerful supercomputer run by such a species. Indeed, various scientists, philosophers and visionaries have said that the probability of such a scenario could be close to one. In other words, we probably are living in a simulation.

The accuracy of these claims is somewhat controversial. So a better way to determine the probability that we live in a simulation would be much appreciated.

Enter Alexandre Bibeau-Delisle and Gilles Brassard at the University of Montreal in Canada. These researchers have derived a Drake-like equation that calculates the chances that we live in a computer simulation. And the results throw up some counterintuitive ideas that are likely to change the way we think about simulations, how we might determine whether we are in one and whether we could ever escape.

Bibeau-Delisle and Brassard begin with a fundamental estimate of the computing power available to create a simulation. They say, for example, that a kilogram of matter, fully exploited for computation, could perform 10^50 operations per second.

By comparison, the human brain, which is also kilogram-sized, performs up to 10^16 operations per second. "It may thus be possible for a single computer the mass of a human brain to simulate the real-time evolution of 1.4 × 10^25 virtual brains," they say.

In our society, a significant number of computers already simulate entire civilizations, in games such as Civilization VI, Hearts of Iron IV and Humankind. So it may be reasonable to assume that in a sufficiently advanced civilization, individuals will be able to run games that simulate societies like ours, populated with sentient, conscious beings.

So an interesting question is this: of all the sentient beings in existence, what fraction are likely to be simulations? To derive the answer, Bibeau-Delisle and Brassard start with the total number of real sentient beings N_Re; multiply that by the fraction with access to the necessary computing power, f_Civ; multiply this by the fraction of that power that is devoted to simulating consciousness, f_Ded (because these beings are likely to be using their computers for other purposes too); and then multiply this by the number of brains they could simulate, R_Cal.

The resulting equation, where f_Sim is the fraction of simulated brains, is:

f_Sim = (f_Civ · f_Ded · R_Cal) / (f_Civ · f_Ded · R_Cal + 1)

Here R_Cal is the huge number of brains that fully exploited matter should be able to simulate.

The sheer size of this number, ~10^25, pushes Bibeau-Delisle and Brassard towards an inescapable conclusion. "It is mathematically inescapable from [the above] equation and the colossal scale of R_Cal that f_Sim ≈ 1 unless f_Civ · f_Ded ≈ 0," they say.
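Plugging numbers into the Drake-like estimate described above makes this conclusion concrete. The functional form follows the derivation in the text; the sample values for f_Civ and f_Ded below are invented purely for illustration:

```python
# Numeric check of the fraction-of-simulated-brains estimate. R_Cal
# (~1.4e25 simulated brains per brain-sized computer) dominates the
# expression unless f_Civ * f_Ded is vanishingly small.

def f_sim(f_civ, f_ded, r_cal=1.4e25):
    """Fraction of sentient beings that are simulated."""
    simulated_per_real = f_civ * f_ded * r_cal
    return simulated_per_real / (simulated_per_real + 1)

print(f_sim(0.01, 0.001))   # still essentially 1.0
print(f_sim(1e-30, 1e-30))  # only absurdly tiny factors pull it toward 0
```

Even if only 1% of civilizations have the computing power and only 0.1% of that power simulates consciousness, the fraction remains indistinguishable from 1; that is the whole force of the argument.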

So there are two possible outcomes. Either we live in a simulation or a vanishingly small proportion of advanced computing power is devoted to simulating brains.

It's not hard to imagine why the second option might be true. "A society of beings similar to us (but with a much greater technological development) could indeed decide it is not very ethical to simulate beings with enough precision to make them conscious while fooling them and keeping them cut off from the real world," say Bibeau-Delisle and Brassard.

Another possibility is that advanced civilizations never get to the stage where their technology is powerful enough to perform these kinds of computations. Perhaps they destroy themselves through war or disease or climate change long before then. There is no way of knowing.

But suppose we are in a simulation. Bibeau-Delisle and Brassard ask whether we might escape while somehow hiding our intentions from our overlords. They assume that the simulating technology will be quantum in nature. "If quantum phenomena are as difficult to compute on classical systems as we believe them to be, a simulation containing our world would most probably run on quantum computing power," they say.

This raises the possibility that it may be possible to detect our alien overlords since they cannot measure the quantum nature of our world without revealing their presence. Quantum cryptography uses the same principle; indeed, Brassard is one of the pioneers of this technology.

That would seem to make it possible for us to make encrypted plans that are hidden from the overlords, such as secretly transferring ourselves into our own simulations.

However, the overlords have a way to foil this. All they need to do is rewire their simulation to make it look as if we are able to hide information, even though they are aware of it all the time. "If the simulators are particularly angry at our attempted escape, they could also send us to a simulated hell, in which case we would at least have the confirmation we were truly living inside a simulation and our paranoia was not unjustified," conclude Bibeau-Delisle and Brassard, with their tongues firmly in their cheeks.

In that sense, we are the ultimate laboratory guinea pigs: forever trapped and forever fooled by the evil genius of our omnipotent masters.

Time for another game of Civilization VI.

Ref: Probability and Consequences of Living Inside a Computer Simulation

Here is the original post:

This Equation Calculates The Chances We Live In A Computer Simulation - Discover Magazine

Written by admin

September 1st, 2020 at 10:55 am

Posted in Quantum Computer

I confess, I’m scared of the next generation of supercomputers – TechRadar

Posted: at 10:55 am

without comments

Earlier this year, a Japanese supercomputer built on Arm-based Fujitsu A64FX processors snatched the crown of world's fastest machine, blowing incumbent leader IBM Summit out of the water.

Fugaku, as the machine is known, achieved 415.5 petaFLOPS on the popular High Performance Linpack (HPL) benchmark, almost three times the score of the IBM machine (148.5 petaFLOPS).

It also topped the rankings for the Graph500, HPL-AI and HPCG workloads - a feat never before achieved in the world of high performance computing (HPC).

Modern supercomputers are now edging ever-closer to the landmark figure of one exaFLOPS (equal to 1,000 petaFLOPS), commonly described as the exascale barrier. In fact, Fugaku itself can already achieve one exaFLOPS, but only in lower precision modes.

The consensus among the experts we spoke to is that a single machine will breach the exascale barrier within the next 6 - 24 months, unlocking a wealth of possibilities in the fields of medical research, climate forecasting, cybersecurity and more.

But what is an exaFLOPS? And what will it mean to break the exascale milestone, pursued doggedly for more than a decade?

To understand what it means to achieve exascale computing, it's important to first understand what is meant by FLOPS, which stands for floating point operations per second.

A floating point operation is any mathematical calculation (i.e. addition, subtraction, multiplication or division) that involves a number containing a decimal (e.g. 3.0 - a floating point number), as opposed to a number without a decimal (e.g. 3 - a binary integer). Calculations involving decimals are typically more complex and therefore take longer to solve.

An exascale computer can perform 10^18 (one quintillion, or 1,000,000,000,000,000,000) of these mathematical calculations every second.

For context, to equal the number of calculations an exascale computer can process in a single second, an individual would have to perform one sum every second for 31,688,765,000 years.

The PC I'm using right now, meanwhile, is able to reach 147 billion FLOPS (or 0.00000014723 exaFLOPS), outperforming the fastest supercomputer of 1993, the Intel Paragon (143.4 billion FLOPS).
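The unit arithmetic behind the figures above can be checked in a few lines. The constants are the ones quoted in the article; the script itself is just an illustrative sketch:

```python
# FLOPS unit arithmetic for the figures quoted in the text.
PETA, EXA = 10**15, 10**18

fugaku = 415.5 * PETA          # Fugaku's HPL score
summit = 148.5 * PETA          # IBM Summit's HPL score
print(fugaku / summit)         # ~2.8x: "almost three times"

# One person doing one calculation per second would need roughly
# 31.7 billion years to match a single second of exascale computing.
seconds_per_year = 365.25 * 24 * 3600
print(EXA / seconds_per_year)  # ~3.17e10 years

pc = 147e9                     # 147 billion FLOPS
print(pc / EXA)                # ~1.47e-7 exaFLOPS
```

The same conversions explain the historical comparison: a modern desktop at 147 gigaFLOPS sits just above the 143.4 gigaFLOPS Intel Paragon of 1993, yet nearly seven orders of magnitude below an exascale machine.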

This both underscores how far computing has come in the last three decades and puts into perspective the extreme performance levels attained by the leading supercomputers today.

The key to building a machine capable of reaching one exaFLOPS is optimization at the processing, storage and software layers.

The hardware must be small and powerful enough to pack together and reach the necessary speeds, the storage capacious and fast enough to serve up the data and the software scalable and programmable enough to make full use of the hardware.

For example, there comes a point at which adding more processors to a supercomputer will no longer affect its speed, because the application is not sufficiently optimized. The only way governments and private businesses will realize a full return on HPC hardware investment is through an equivalent investment in software.

Organizations such as the Exascale Computing Project (ECP) and the ExCALIBUR programme are interested in solving precisely this problem. Those involved claim a renewed focus on algorithm and application development is required in order to harness the full power and scope of exascale.

Achieving the delicate balance between software and hardware, in an energy efficient manner and avoiding an impractically low mean time between failures (MTBF) score (the time that elapses before a system breaks down under strain) is the challenge facing the HPC industry.

"15 years ago, as we started the discussion on exascale, we hypothesized that it would need to be done in 20 megawatts (MW); later that was changed to 40 MW. With Fugaku, we see that we are about halfway to a 64-bit exaFLOPS at the 40 MW power envelope, which shows that an exaFLOPS is in reach today," explained Brent Gorda, Senior Director HPC at UK-based chip designer Arm.

"We could hit an exaFLOPS now with sufficient funding to build and run a system. [But] the size of the system is likely to be such that MTBF is measured in a single-digit number of days, based on today's technologies and the number of components necessary to reach these levels of performance."

When it comes to building a machine capable of breaching the exascale barrier, there are a number of other factors at play, beyond technological feasibility. An exascale computer can only come into being once an equilibrium has been reached at the intersection of technology, economics and politics.

"One could in theory build an exascale system today by packing in enough CPUs, GPUs and DPUs. But what about economic viability?" said Gilad Shainer of NVIDIA Mellanox, the firm behind the InfiniBand technology (the fabric that links the various hardware components) found in seven of the ten fastest supercomputers.

"Improvements in computing technologies, silicon processing, more efficient use of power and so on all help to increase efficiency and make exascale computing an economic objective as opposed to a sort of sporting achievement."

According to Paul Calleja, who heads up computing research at the University of Cambridge and is working with Dell on the Open Exascale Lab, Fugaku is an excellent example of what is theoretically possible today, but is also impractical by virtually any other metric.

"If you look back at Japanese supercomputers, historically there's only ever been one of them made. They have beautifully exquisite architectures, but they're so stupidly expensive and proprietary that no one else could afford one," he told TechRadar Pro.

"[Japanese organizations] like these really large technology demonstrators, which are very useful in industry because it shows the direction of travel and pushes advancements, but those kinds of advancements are very expensive and not sustainable, scalable or replicable."

So, in this sense, there are two separate exascale landmarks: the theoretical barrier, which will likely be met first by a machine of Fugaku's ilk (a technology demonstrator), and the practical barrier, which will see exascale computing deployed en masse.

Geopolitical factors will also play a role in how quickly the exascale barrier is breached. Researchers and engineers might focus exclusively on the technological feat, but the institutions and governments funding HPC research are likely motivated by different considerations.

"Exascale computing is not just about reaching theoretical targets, it is about creating the ability to tackle problems that have been previously intractable," said Andy Grant, Vice President HPC & Big Data at IT services firm Atos, influential in the fields of HPC and quantum computing.

"Those that are developing exascale technologies are not doing it merely to have the fastest supercomputer in the world, but to maintain international competitiveness, security and defence."

"In Japan, their new machine is roughly 2.8x more powerful than the now-second-place system. In broad terms, that will enable Japanese researchers to address problems that are 2.8x more complex. In the context of international competitiveness, that creates a significant advantage."

In years gone by, rival nations fought it out in the trenches or competed to see who could place the first human on the moon. But computing may well become the frontier at which the next arms race takes place; supremacy in the field of HPC might prove just as politically important as military strength.

Once exascale computers become an established resource - available for businesses, scientists and academics to draw upon - a wealth of possibilities will be unlocked across a wide variety of sectors.

HPC could prove revelatory in the fields of clinical medicine and genomics, for example, which require vast amounts of compute power to conduct molecular modelling, simulate interactions between compounds and sequence genomes.

In fact, IBM Summit and a host of other modern supercomputers are being used to identify chemical compounds that could contribute to the fight against coronavirus. The Covid-19 High Performance Computing Consortium assembled 16 supercomputers, accounting for an aggregate of 330 petaFLOPS - but imagine how much more quickly research could be conducted using a fleet of machines capable of reaching 1,000 petaFLOPS on their own.

Artificial intelligence, meanwhile, is another cross-disciplinary domain that will be transformed with the arrival of exascale computing. The ability to analyze ever-larger datasets will improve the ability of AI models to make accurate forecasts (contingent on the quality of data fed into the system) that could be applied to virtually any industry, from cybersecurity to e-commerce, manufacturing, logistics, banking, education and many more.

As explained by Rashid Mansoor, CTO at UK supercomputing startup Hadean, the value of supercomputing lies in the ability to make an accurate projection (of any variety).

"The primary purpose of a supercomputer is to compute some real-world phenomenon to provide a prediction. The prediction could be the way proteins interact, the way a disease spreads through the population, how air moves over an aerofoil or how electromagnetic fields interact with a spacecraft during re-entry," he told TechRadar Pro.

"Raw performance such as the HPL benchmark simply indicates that we can model bigger and more complex systems to a greater degree of accuracy. One thing that the history of computing has shown us is that the demand for computing power is insatiable."

Other commonly cited areas that will benefit significantly from the arrival of exascale include brain mapping, weather and climate forecasting, product design and astronomy, but it's also likely that brand new use cases will emerge as well.

"The desired workloads and the technology to perform them form a virtuous circle. The faster and more performant the computers, the more complex problems we can solve and the faster the discovery of new problems," explained Shainer.

"What we can be sure of is that we will see continuous needs, or ever-growing demands, for more performance capabilities in order to solve the unsolvable. Once this is solved, we will find the new unsolvable."

By all accounts, the exascale barrier will likely fall within the next two years, but the HPC industry will then turn its attention to the next objective, because the work is never done.

Some might point to quantum computers, which approach problem solving in an entirely different way to classical machines (exploiting symmetries to speed up processing), allowing for far greater scale. However, there are also problems to which quantum computing cannot be applied.

"Mid-term (10-year) prospects for quantum computing are starting to shape up, as are other technologies. These will be more specialized, where a quantum computer will very likely show up as an application accelerator for problems that relate to logistics first. They won't completely replace the need for current architectures for IT/data processing," explained Gorda.

As Mansoor puts it, on certain problems even a small quantum computer can be exponentially faster than all of the classical computing power on Earth combined, yet on other problems a quantum computer could be slower than a pocket calculator.

The next logical landmark for traditional computing, then, would be one zettaFLOPS, equal to 1,000 exaFLOPS or 1,000,000 petaFLOPS.

Chinese researchers predicted in 2018 that the first zettascale system will come online in 2035, paving the way for new computing paradigms. The paper itself reads like science fiction, at least for the layman:

"To realize these metrics, micro-architectures will evolve to consist of more diverse and heterogeneous components. Many forms of specialized accelerators are likely to co-exist to boost HPC in a joint effort. Enabled by new interconnect materials such as photonic crystal, fully optical interconnecting systems may come into use."

Assuming one exaFLOPS is reached by 2022, 14 years will have elapsed between the creation of the first petascale and first exascale systems. The first terascale machine, meanwhile, was constructed in 1996, 12 years before the petascale barrier was breached.

If this pattern were to continue, the Chinese researchers' estimate would look relatively sensible, but there are firm question marks over the validity of zettascale projections.
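As a rough check on that pattern, the milestone years cited here (terascale in 1996, petascale in 2008, and exascale assumed for 2022) can be extrapolated naively - a back-of-the-envelope sketch, not a forecast:

```python
# Years in which each 1,000x performance barrier fell (exascale assumed
# for 2022, per the article).
milestones = {"tera": 1996, "peta": 2008, "exa": 2022}

gaps = [2008 - 1996, 2022 - 2008]  # 12 and 14 years per 1,000x step
avg_gap = sum(gaps) / len(gaps)    # 13.0 years on average

zetta_estimate = milestones["exa"] + avg_gap
print(zetta_estimate)  # 2035.0 - in line with the Chinese researchers' prediction
```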

While experts are confident in their predicted exascale timelines, none would venture a guess at when zettascale might arrive without prefacing their estimate with a long list of caveats.

"Is that an interesting subject? Because to be honest with you, it's so not obtainable. To imagine how we could go 1000x beyond [one exaFLOPS] is not a conversation anyone could have, unless they're just making it up," said Calleja, asked about the concept of zettascale.

Others were more willing to theorize, but equally reticent to guess at a specific timeline. According to Grant, the way zettascale machines process information will be unlike any supercomputer in existence today.

"[Zettascale systems] will be data-centric, meaning components will move to the data rather than the other way around, as data volumes are likely to be so large that moving data will be too expensive. Regardless, predicting what they might look like is all guesswork for now," he said.

It is also possible that the decentralized model might be the fastest route to achieving zettascale, with millions of less powerful devices working in unison to form a collective supercomputer more powerful than any single machine (as put into practice by projects such as SETI@home).

As noted by Saurabh Vij, CEO of distributed supercomputing firm Q Blocks, decentralized systems address a number of problems facing the HPC industry today, namely those surrounding building and maintenance costs. They are also accessible to a much wider range of users and therefore democratize access to supercomputing resources in a way that is not otherwise possible.

"There are benefits to a centralized architecture, but the cost and maintenance barrier overshadows them. [Centralized systems] also alienate a large base of customer groups that could benefit," he said.

"We think a better way is to connect distributed nodes together in a reliable and secure manner. It wouldn't be too aggressive to say that, 5 years from now, your smartphone could be part of a giant distributed supercomputer, making money for you while you sleep by solving computational problems for industry," he added.

However, incentivizing network nodes to remain active for a long period is challenging and a high rate of turnover can lead to reliability issues. Network latency and capacity problems would also need to be addressed before distributed supercomputing can rise to prominence.

Ultimately, the difficulty in making firm predictions about zettascale lies in the massive chasm that separates present-day workloads and HPC architectures from those that might exist in the future. From a contemporary perspective, it's fruitless to imagine what might be made possible by a computer so powerful.

We might imagine zettascale machines will be used to process workloads similar to those tackled by modern supercomputers, only more quickly. But it's possible - even likely - that the arrival of zettascale computing will open doors that do not and cannot exist today, so extraordinary is the leap.

In a future in which computers are 2,000+ times as fast as the most powerful machine today, philosophical and ethical debates surrounding the intelligence of man versus machine are bound to be played out in greater detail - and with greater consequence.

It is impossible to directly compare the workings of a human brain with that of a computer - i.e. to assign a FLOPS value to the human mind. However, it is not unreasonable to ask how many FLOPS must be achieved before a machine reaches a level of performance that might be loosely comparable to the brain.

Back in 2013, scientists used the K supercomputer to conduct a neuronal network simulation using the open-source simulation software NEST. The team simulated a network made up of 1.73 billion nerve cells connected by 10.4 trillion synapses.

While ginormous, the simulation represented only 1% of the human brain's neuronal network and took 40 minutes to replicate one second's worth of neuronal network activity.

However, the K computer reached a maximum computational power of only 10 petaFLOPS. A basic extrapolation (ignoring inevitable complexities), then, would suggest Fugaku could simulate circa 40% of the human brain, while a zettascale computer would be capable of performing a full simulation many times over.
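That extrapolation can be written out explicitly. This is the same naive linear scaling (ignoring the inevitable complexities, as noted above); Fugaku's ~442 petaFLOPS figure is an assumption taken from public TOP500 results rather than from the article:

```python
K_FLOPS = 10e15        # the K computer's ~10 petaFLOPS
BRAIN_FRACTION = 0.01  # share of the brain's network K simulated
SLOWDOWN = 40 * 60     # 40 minutes of compute per second of brain activity

def brain_fraction(flops):
    """Fraction of the brain simulable at K's (non-real-time) pace."""
    return BRAIN_FRACTION * flops / K_FLOPS

print(brain_fraction(442e15))  # Fugaku (assumed ~442 petaFLOPS): ~0.44
print(brain_fraction(1e21))    # zettascale: ~1,000 brain-equivalents

# FLOPS needed for one full brain at real-time speed under this scaling:
realtime_full = K_FLOPS * (1 / BRAIN_FRACTION) * SLOWDOWN
print(realtime_full / 1e21)    # ~2.4 zettaFLOPS
```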

Digital neuromorphic hardware (supercomputers created specifically to simulate the human brain) like SpiNNaker 1 and 2 will also continue to develop in the post-exascale future. Instead of sending information from point A to B, these machines will be designed to replicate the parallel communication architecture of the brain, sending information simultaneously to many different locations.

Modern iterations are already used to help neuroscientists better understand the mysteries of the brain and future versions, aided by advances in artificial intelligence, will inevitably be used to construct a faithful and fully-functional replica.

The ethical debates that will arise with the arrival of such a machine - surrounding the perception of consciousness, the definition of thought and what an artificial uber-brain could or should be used for - are manifold and could take generations to unpick.

The inability to foresee what a zettascale computer might be capable of is also an inability to plan for the moral quandaries that might come hand-in-hand.

Whether a future supercomputer might be powerful enough to simulate human-like thought is not in question, but whether researchers should aspire to bring an artificial brain into existence is a subject worthy of discussion.

See the article here:

I confess, I'm scared of the next generation of supercomputers - TechRadar

Written by admin

September 1st, 2020 at 10:55 am

Posted in Quantum Computer

Honeywell Wants To Show What Quantum Computing Can Do For The World – Forbes

Posted: August 14, 2020 at 11:51 pm


The race for quantum supremacy heated up in June, when Honeywell brought to market the world's highest-performing quantum computer. Honeywell claims it is more accurate (i.e., performs with fewer errors) than competing systems and that its performance will increase by an order of magnitude each year for the next five years.

Inside the chamber of Honeywell's quantum computer

The beauty of quantum computing, says Tony Uttley, President of Honeywell Quantum Solutions, is that "once you reach a certain level of accuracy, every time you add a qubit [the basic unit of quantum information] you double the computational capacity. So as the quantum computer scales exponentially, you can scale your problem set exponentially."
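Uttley's doubling claim reflects the exponential growth of the quantum state space: n qubits require 2^n complex amplitudes to describe classically. A minimal illustration of the arithmetic (not Honeywell code):

```python
# Each added qubit doubles the number of amplitudes needed to describe
# the quantum state classically - the source of the exponential scaling.
def state_space(n_qubits):
    return 2 ** n_qubits

for n in (1, 2, 10, 50):
    print(n, state_space(n))
# At 50 qubits there are already ~1.1e15 amplitudes, which is why
# classical simulation of larger devices quickly becomes infeasible.
```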

Tony Uttley, President, Honeywell Quantum Solutions

Uttley sees three distinct eras in the evolution of quantum computing. Today, we are in the emergent era - "you can start to prove what kind of things work, what kind of algorithms show the most promise." For example, the Future Lab for Applied Research and Engineering (FLARE) group of JPMorgan Chase published a paper in June summarizing the results of running, on the Honeywell quantum computer, complex mathematical calculations used in financial trading applications.

The next era Uttley calls "classically impractical": running computations on a quantum computer that typically are not run on today's (classical) computers because they take too long, consume too much power, and cost too much. Crossing the threshold from emergent to classically impractical is not very far away, he asserts, probably sometime in the next 18 to 24 months. "This is when you build the trust with the organizations you work with that the answer that is coming from your quantum computer is the correct one," says Uttley.

The companies that understand the potential impact of quantum computing on their industries are already looking at what it would take to introduce this new computing capability into their existing processes and what they need to adjust or develop from scratch, according to Uttley. These companies will be ready for the shift from emergent to classically impractical, which is going to be a binary moment, and they will be able to take advantage of it immediately.

The last stage of the quantum evolution will be "classically impossible": "you couldn't in the timeframe of the universe do this computation on a classical best-performing supercomputer that you can on a quantum computer," says Uttley. He mentions quantum chemistry, machine learning, and optimization challenges (warehouse routing, aircraft maintenance) as applications that will benefit from quantum computing. But what shows the most promise right now are hybrid [resources] - "you do just one thing, very efficiently, on a quantum computer, and run the other parts of the algorithm or calculation on a classical computer." Uttley predicts that for the foreseeable future we will see co-processing, combining the power of today's computers with the power of emerging quantum computing solutions.

"You want to use a quantum computer for the more probabilistic parts [of the algorithm] and a classical computer for the more mundane calculations - that might reduce the number of qubits needed," explains Gavin Towler, vice president and chief technology officer of Honeywell Performance Materials Technologies. Towler leads R&D activities for three of Honeywell's businesses: Advanced Materials (e.g., refrigerants), UOP (equipment and services for the oil and gas sector), and Process Automation (automation, control systems, and software for all the process industries). As such, he is the poster boy for a quantum computing lead-user.

Gavin Towler, Vice President and Chief Technology Officer, Honeywell Performance Materials and Technologies

"In the space of materials discovery, quantum computing is going to be critical. That's not a might or could be. It is going to be the way people do molecular discovery," says Towler. Molecular simulation is used in the design of new molecules, requiring the designer to understand quantum effects. These are intrinsically probabilistic, as are quantum computers, Towler explains.

An example he provides is a refrigerant Honeywell produces that is used in automotive air conditioning, supermarket refrigeration, and homes. As the chlorinated molecules in earlier refrigerants were causing the hole in the ozone layer, they were replaced by HFCs, which later turned out to be very potent greenhouse gases. Honeywell has already found a suitable replacement for the refrigerant used in automotive air conditioning, but is searching for similar solutions for other refrigeration applications. Synthesizing in the lab molecules that will prove to have no effect on the ozone layer or global warming and will not be toxic or flammable is costly. Computer simulation replaces lab work, but ideally "you want to have computer models that will screen things out to identify leads much faster," says Towler.

This is where the speed of a quantum computer will make a difference, starting with simple molecules like the ones found in refrigerants or in solvents that are used to remove CO2 from processes prevalent in the oil and gas industry. These are relatively simple molecules, with 10-20 atoms, amenable to being modeled with [today's] quantum computers, says Towler. In the future, he expects more powerful quantum computers to assist in developing vaccines and finding new drugs, polymers, and biodegradable plastics - things that contain hundreds or thousands of atoms.

There are three ways by which Towler's counterparts in other companies - the lead-users who are interested in experimenting with quantum computing - can currently access Honeywell's solution: running their programs directly on Honeywell's quantum computer; through Microsoft Azure Quantum services; or by working with two startups that Honeywell has invested in, Cambridge Quantum Computing (CQC) and Zapata Computing, both of which assist in turning business challenges into quantum computing and hybrid computing algorithms.

Honeywell brings to the emerging quantum computing market a variety of skills in multiple disciplines, with its decades-long experience with precision control systems possibly the most important one. "Any at-scale quantum computer becomes a controls problem," says Uttley, "and we have experience in some of the most complex systems integration problems in the world." These past experiences have prepared Honeywell to show what quantum computing can do for the world and to rapidly scale up its solution. "We've built a big auditorium but we are filling out just a few seats right now, and we have lots more seats to fill," Uttley sums up this point in time in Honeywell's journey to quantum supremacy.

See the original post here:

Honeywell Wants To Show What Quantum Computing Can Do For The World - Forbes

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer

Quantum Computing for the Next Generation of Computer Scientists and Researchers – Campus Technology

Posted: at 11:51 pm


C-Level View | Feature

A Q&A with Travis Humble

Travis Humble is a distinguished scientist and director of the Quantum Computing Institute at Oak Ridge National Laboratory. The institute is a lab-wide organization that brings together all of ORNL's capabilities to address the development of quantum computers. Humble is also an academic, holding a joint faculty appointment at the University of Tennessee, where he is an assistant professor with the Bredesen Center for Interdisciplinary Research and Graduate Education. In the following Q&A, Humble gives CT his unique perspectives on the advancement of quantum computing and its entry into higher education curricula and research.

"It's an exciting area that's largely understaffed. There are far more opportunities than there are people currently qualified to approach quantum computing." Travis Humble

Mary Grush: Working at the Oak Ridge National Laboratory as a scientist and at the University of Tennessee as an academic, you are in a remarkable position to watch both the development of the field of quantum computing and its growing importance in higher education curricula and research. First, let me ask about your role at the Bredesen Center for Interdisciplinary Research and Graduate Education. The Bredesen Center draws on resources from both ORNL and UT. Does the center help move quantum computing into the realm of higher education?

Travis Humble: Yes. The point of the Bredesen Center is to do interdisciplinary research, to educate graduate students, and to address the interfaces and frontiers of science that don't fall within the conventional departments.

For me, those objectives are strongly related to my role at the laboratory, where I am a scientist working in quantum information. And the joint work ORNL and UT do in quantum computing is training the next generation of the workforce that's going to be able to take advantage of the tools and research that we're developing at the laboratory.

Grush: Are ORNL and UT connected to bring students to the national lab to experience quantum computing?

Humble: They are so tightly connected that it works very well for us to have graduate students onsite performing research in these topics, while at the same time advancing their education through the university.

Grush: How does ORNL's Quantum Computing Institute, where you are director, promote quantum computing?

Humble: As part of my work with the Quantum Computing Institute, I manage research portfolios and direct resources towards our most critical needs at the moment. But I also use that responsibility as a gateway to get people involved with quantum computing: It's an exciting area that's largely understaffed. There are far more opportunities than there are people currently qualified to approach quantum computing.

The institute is a kind of storefront through which people from many different areas of science and engineering can become involved in quantum computing. It is there to help them get involved.

Grush: Let's get a bit of perspective on quantum computing - why is it important?

Humble: Quantum computing is a new approach to the ways we could build computers and solve problems. This approach uses the quantum mechanics that supports our most fundamental theories of physics. We've had a lot of success in understanding quantum mechanics - it's the technology that lasers, transistors, and a lot of things that we rely on today were built on.

But it turns out there's a lot of untapped potential there: We could take further advantage of some of the features of quantum physics, by building new types of technologies.

Here is the original post:

Quantum Computing for the Next Generation of Computer Scientists and Researchers - Campus Technology


Quantum mechanics is immune to the butterfly effect – The Economist

Posted: at 11:51 pm


That could help with the design of quantum computers

Aug 15th 2020

IN RAY BRADBURY's science-fiction story "A Sound of Thunder", a character time-travels far into the past and inadvertently crushes a butterfly underfoot. The consequences of that minuscule change ripple through reality such that, upon the time-traveller's return, the present has been dramatically changed.

The butterfly effect describes the high sensitivity of many systems to tiny changes in their starting conditions. But while it is a feature of classical physics, it has been unclear whether it also applies to quantum mechanics, which governs the interactions of tiny objects like atoms and fundamental particles. Bin Yan and Nikolai Sinitsyn, a pair of physicists at Los Alamos National Laboratory, decided to find out. As they report in Physical Review Letters, quantum-mechanical systems seem to be more resilient than classical ones. Strangely, they seem to have the capacity to repair damage done in the past as time unfolds.

To perform their experiment, Drs Yan and Sinitsyn ran simulations on a small quantum computer made by IBM. They constructed a simple quantum system consisting of qubits - the quantum analogue of the familiar one-or-zero bits used by classical computers. Like an ordinary bit, a qubit can be either one or zero. But it can also exist in superposition, a chimerical mix of both states at once.

Having established the system, the authors prepared a particular qubit by setting its state to zero. That qubit was then allowed to interact with the others in a process called quantum scrambling which, in this case, mimics the effect of evolving a quantum system backwards in time. Once this virtual foray into the past was completed, the authors disturbed the chosen qubit, destroying its local information and its correlations with the other qubits. Finally, the authors performed a reversed scrambling process on the now-damaged system. This was analogous to running the quantum system all the way forwards in time to where it all began.

They then checked to see how similar the final state of the chosen qubit was to the zero-state it had been assigned at the beginning of the experiment. The classical butterfly effect suggests that the researchers' meddling should have changed it quite drastically. In the event, the qubit's original state had been almost entirely recovered. Its state was not quite zero, but it was, in quantum-mechanical terms, 98.3% of the way there, a difference that was deemed insignificant. "The final output state after the forward evolution is essentially the same as the input state before backward evolution," says Dr Sinitsyn. "It can be viewed as the same input state plus some small background noise." Oddest of all was the fact that the further back in simulated time the damage was done, the greater the rate of recovery - as if the quantum system was repairing itself with time.

The mechanism behind all this is known as entanglement. As quantum objects interact, their states become highly correlated - entangled - in a way that serves to diffuse localised information about the state of one quantum object across the system as a whole. Damage to one part of the system does not destroy information in the same way as it would with a classical system. Instead of losing your work when your laptop crashes, having a highly entangled system is a bit like having back-ups stashed in every room of the house. Even though the information held in the disturbed qubit is lost, its links with the other qubits in the system can act to restore it.
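The contrast can be illustrated numerically. The sketch below is not the Yan-Sinitsyn protocol (which involves scrambling, local damage, and unscrambling on real hardware); it only shows the underlying intuition: a classical chaotic map amplifies a tiny perturbation, while quantum evolution is unitary and therefore exactly reversible. It assumes NumPy is available.

```python
import numpy as np

# Classical butterfly effect: the logistic map with r=4 is chaotic, so a
# 1e-9 perturbation of the starting point grows rapidly.
def logistic(x, steps=50):
    for _ in range(steps):
        x = 4 * x * (1 - x)
    return x

a, b = logistic(0.3), logistic(0.3 + 1e-9)
print(abs(a - b))  # large divergence despite the tiny initial difference

# Quantum evolution is unitary, hence exactly reversible: scrambling a
# state with a random unitary and then applying its inverse recovers the
# original state with fidelity 1.
rng = np.random.default_rng(0)
dim = 2 ** 3  # three qubits
m = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
u, _ = np.linalg.qr(m)                # random unitary "scrambler"
psi0 = np.zeros(dim, dtype=complex)
psi0[0] = 1.0
psi = u.conj().T @ (u @ psi0)         # scramble forward, then reverse
print(abs(np.vdot(psi0, psi)) ** 2)   # fidelity ~= 1.0
```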

The upshot is that the butterfly effect seems not to apply to quantum systems. Besides making life safe for tiny time-travellers, that may have implications for quantum computing, too, a field into which companies and countries are investing billions of dollars. "We think of quantum systems, especially in quantum computing, as very fragile," says Natalia Ares, a physicist at the University of Oxford. That this result demonstrates quantum systems can in fact be unexpectedly robust is an encouraging finding, and bodes well for potential future advances in the field.

This article appeared in the Science & technology section of the print edition under the headline "A flutter in time"

Read more:

Quantum mechanics is immune to the butterfly effect - The Economist


Major quantum computational breakthrough is shaking up physics and maths – The Conversation UK

Posted: at 11:51 pm


MIP* = RE is not a typo. It is a groundbreaking discovery and the catchy title of a recent paper in the field of quantum complexity theory. Complexity theory is a zoo of "complexity classes" - collections of computational problems - of which MIP* and RE are but two.

The 165-page paper shows that these two classes are the same. That may seem like an insignificant detail in an abstract theory without any real-world application. But physicists and mathematicians are flocking to visit the zoo, even though they probably don't understand it all. Because it turns out the discovery has astonishing consequences for their own disciplines.

In 1936, Alan Turing showed that the Halting Problem - algorithmically deciding whether a computer program halts or loops forever - cannot be solved. Modern computer science was born. Its success gave the impression that soon all practical problems would yield to the tremendous power of the computer.

But it soon became apparent that, while some problems can be solved algorithmically, the actual computation would last long after our Sun has engulfed the computer performing it. Figuring out how to solve a problem algorithmically was not enough. It was vital to classify solutions by efficiency. Complexity theory classifies problems according to how hard it is to solve them. The hardness of a problem is measured in terms of how long the computation lasts.

RE stands for problems that can be solved by a computer. It is the zoo. Let's have a look at some subclasses.

The class P consists of problems which a known algorithm can solve quickly (technically, in polynomial time). For instance, multiplying two numbers belongs to P since long multiplication is an efficient algorithm to solve the problem. The problem of finding the prime factors of a number is not known to be in P; the problem can certainly be solved by a computer but no known algorithm can do so efficiently. A related problem, deciding if a given number is a prime, was in similar limbo until 2004 when an efficient algorithm showed that this problem is in P.

Another complexity class is NP. Imagine a maze. "Is there a way out of this maze?" is a yes/no question. If the answer is yes, then there is a simple way to convince us: simply give us the directions, we'll follow them, and we'll find the exit. If the answer is no, however, we'd have to traverse the entire maze without ever finding a way out to be convinced.

Such yes/no problems - for which, if the answer is yes, we can efficiently demonstrate it - belong to NP. Any solution to a problem serves to convince us of the answer, and so P is contained in NP. Surprisingly, a million-dollar question is whether P=NP. Nobody knows.
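The maze example makes the asymmetry concrete: checking a proposed escape route takes time proportional to its length, even though finding one may require exploring the maze. A small sketch (the maze and the certificate format are made up for illustration):

```python
# '#' is a wall, '.' is open floor. A "yes" certificate is a string of
# moves; verifying it is a single linear scan over the moves, which is
# what places the maze question in NP.
MAZE = [
    "#####",
    "#...#",
    "#.#.#",
    "#...#",
    "###.#",
]
MOVES = {"U": (-1, 0), "D": (1, 0), "L": (0, -1), "R": (0, 1)}

def verify(start, path, goal):
    """Return True iff `path` walks from `start` to `goal` over open cells."""
    r, c = start
    for step in path:
        dr, dc = MOVES[step]
        r, c = r + dr, c + dc
        if MAZE[r][c] != ".":
            return False  # the certificate walked into a wall
    return (r, c) == goal

print(verify((1, 1), "DDRRD", (4, 3)))  # True: easy to check the "yes" answer
print(verify((1, 1), "DDD", (4, 3)))    # False: this route hits a wall
```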

The classes described so far represent problems faced by a normal computer. But computers are fundamentally changing - quantum computers are being developed. If a new type of computer comes along and claims to solve one of our problems, how can we trust it is correct?

Imagine an interaction between two entities, an interrogator and a prover. In a police interrogation, the prover may be a suspect attempting to prove their innocence. The interrogator must decide whether the prover is sufficiently convincing. There is an imbalance; knowledge-wise the interrogator is in an inferior position.

In complexity theory, the interrogator is the person, with limited computational power, trying to solve the problem. The prover is the new computer, which is assumed to have immense computational power. An interactive proof system is a protocol that the interrogator can use in order to determine, at least with high probability, whether the prover should be believed. By analogy, these are crimes that the police may not be able to solve, but at least innocents can convince the police of their innocence. This is the class IP.

If multiple provers can be interrogated, and the provers are not allowed to coordinate their answers (as is typically the case when the police interrogate multiple suspects), then we get to the class MIP. Such interrogations, via cross-examining the provers' responses, provide the interrogator with greater power, so MIP contains IP.

Quantum communication is a new form of communication carried out with qubits. Entanglement - a quantum feature in which qubits are spookily entangled, even if separated - makes quantum communication fundamentally different to ordinary communication. Allowing the provers of MIP to share an entangled qubit leads to the class MIP*.

It seems obvious that communication between the provers can only serve to help them coordinate lies rather than assist the interrogator in discovering truth. For that reason, nobody expected that allowing more communication would make computational problems more reliably solvable. Surprisingly, we now know that MIP* = RE. This means that quantum communication behaves wildly differently to normal communication.

In the 1970s, Alain Connes formulated what became known as the Connes Embedding Problem. Grossly simplified, this asked whether infinite matrices can be approximated by finite matrices. The new paper has now proved that this isn't possible - an important finding for pure mathematicians.

In 1993, meanwhile, Boris Tsirelson pinpointed a problem in physics now known as Tsirelson's Problem. This concerned two different mathematical formalisms of a single situation in quantum mechanics - to date an incredibly successful theory that explains the subatomic world. Being two different descriptions of the same phenomenon, the two formalisms were expected to be mathematically equivalent.

But the new paper now shows that they aren't. Exactly how they can both still yield the same results and both describe the same physical reality is unknown, but it is why physicists are also suddenly taking an interest.

Time will tell what other unanswered scientific questions will yield to the study of complexity. Undoubtedly, MIP* = RE is a great leap forward.

See more here:

Major quantum computational breakthrough is shaking up physics and maths - The Conversation UK


IEEE International Conference on Quantum Computing and Engineering (QCE20) Transitions to All-Virtual Event – PRNewswire

Posted: at 11:51 pm


The exciting QCE20 conference program features over 270 hours of programming. Each day the QCE20 conference, also known as IEEE Quantum Week, will virtually deliver 9-10 parallel tracks of world-class keynotes, workforce-building tutorials, community-building workshops, technical paper presentations, innovative posters, and thought-provoking panels through a digital combination of pre-recorded and live-streamed sessions. Attendees will be able to participate in live Q&A sessions with keynote speakers and panelists, paper and poster authors, as well as tutorial and workshop speakers. Birds of a Feather, Networking, and Beautiful Colorado sessions spice up the program between technical sessions. The recorded QCE20 sessions will be available on-demand until November 30.

"With our expansive technical program and lineup of incredible presentations from thought-leaders all over the globe, this is shaping up to be the quantum event of the year," said Hausi Müller, QCE20 General Chair, IEEE Quantum Initiative Co-Chair. "I encourage all professionals and enthusiasts to become a quantum computing champion by engaging and participating in the inaugural IEEE International Conference on Quantum Computing & Engineering (QCE20)."

Workshops and tutorials will be conducted according to their pre-determined schedule in a live, virtual format. The QCE20 tutorials program features 16 tutorials by leading experts aimed squarely at workforce development and training considerations, and 21 QCE20 workshops provide forums for group discussions on topics in quantum research, practice, education, and applications.

Ten outstanding keynote speakers will address quantum computing and engineering topics at the beginning and at the end of each conference day, providing insights to stimulate discussion for the networking sessions and exhibits.

QCE20 panel sessions will explore various perspectives of quantum topics, including quantum education and training, quantum hardware and software, quantum engineering challenges, fault-tolerant quantum computers, quantum error correction, quantum intermediate language representation, hardware-software co-design, and hybrid quantum-classical computing platforms. Visit Enabling and Growing the Quantum Industry to view the newest addition to the lineup.

Over 20 QCE20 exhibitors and sponsors - including Platinum sponsors IBM, Microsoft, and Honeywell, and Gold sponsors Quantropi and Zapata - will be featured Monday through Friday in virtual exhibit rooms offering numerous opportunities for networking.

QCE20 is co-sponsored by the IEEE Computer Society, IEEE Communications Society, IEEE Photonics Society, IEEE Council on Superconductivity, IEEE Electronics Packaging Society, IEEE Future Directions Quantum Initiative, and IEEE Technology and Engineering Management Society.

Register to be a part of the highly anticipated virtual IEEE Quantum Week 2020.

Visit for all program details, as well as sponsorship and exhibitor opportunities.

About the IEEE Computer Society The IEEE Computer Society is the world's home for computer science, engineering, and technology. A global leader in providing access to computer science research, analysis, and information, the IEEE Computer Society offers a comprehensive array of unmatched products, services, and opportunities for individuals at all stages of their professional career. Known as the premier organization that empowers the people who drive technology, the IEEE Computer Society offers international conferences, peer-reviewed publications, a unique digital library, and training programs. Visit for more information.

About the IEEE Communications Society
The IEEE Communications Society promotes technological innovation and fosters creation and sharing of information among the global technical community. The Society provides services to members for their technical and professional advancement and forums for technical exchanges among professionals in academia, industry, and public institutions.

About the IEEE Photonics Society
The IEEE Photonics Society forms the hub of a vibrant technical community of more than 100,000 professionals dedicated to transforming breakthroughs in quantum physics into the devices, systems, and products that revolutionize our daily lives. From ubiquitous and inexpensive global communications via fiber optics, to lasers for medical and other applications, to flat-screen displays, to photovoltaic devices for solar energy, to LEDs for energy-efficient illumination, there are myriad examples of the Society's impact on the world around us.

About the IEEE Council on Superconductivity
The IEEE Council on Superconductivity and its activities and programs cover the science and technology of superconductors and their applications, including materials and their applications for electronics, magnetics, and power systems, where the superconductor properties are central to the application.

About the IEEE Electronics Packaging Society
The IEEE Electronics Packaging Society is the leading international forum for scientists and engineers engaged in the research, design, and development of revolutionary advances in microsystems packaging and manufacturing.

About the IEEE Future Directions Quantum Initiative
IEEE Quantum is an IEEE Future Directions initiative launched in 2019 that serves as IEEE's leading community for all projects and activities on quantum technologies. IEEE Quantum is supported by leadership and representation across IEEE Societies and OUs. The initiative addresses the current landscape of quantum technologies, identifies challenges and opportunities, leverages and collaborates with existing initiatives, and engages the quantum community at large.

About the IEEE Technology and Engineering Management Society
IEEE TEMS encompasses the management sciences and practices required for defining, implementing, and managing engineering and technology.

SOURCE IEEE Computer Society

Excerpt from:

IEEE International Conference on Quantum Computing and Engineering (QCE20) Transitions to All-Virtual Event - PRNewswire

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer
