
Archive for the ‘Quantum Computer’ Category

Will Quantum Computing Ever Live Up to Its Hype? – Scientific American

Posted: April 24, 2021 at 1:56 am



Quantum computers have been on my mind a lot lately. A friend who likes investing in tech, and who knows about my attempt to learn quantum mechanics, has been sending me articles on how quantum computers might help solve some of the biggest and most complex challenges we face as humans, as a Forbes commentator declared recently. My friend asks: What do you think, Mr. Science Writer? Are quantum computers really the next big thing?

I've also had exchanges with two quantum-computing experts with distinct perspectives on the technology's prospects. One is computer scientist Scott Aaronson, who has, as I once put it, one of the highest intelligence/pretension ratios I've ever encountered. Not to embarrass him further, but I see Aaronson as the conscience of quantum computing, someone who helps keep the field honest.

The other expert is physicist Terry Rudolph. He is a co-author, the R, of the PBR theorem, which, along with its better-known predecessor, Bell's theorem, lays bare the peculiarities of quantum behavior. In 2011 Nature described the PBR theorem as "the most important general theorem relating to the foundations of quantum mechanics" since Bell's theorem was published in 1964. Rudolph is also the author of Q Is for Quantum and co-founder of the quantum-computing startup PsiQuantum. Aaronson and Rudolph are on friendly terms; they co-authored a paper in 2007, and Rudolph wrote about Q Is for Quantum on Aaronson's blog. In this column, I'll summarize their views and try to reach a coherent conclusion.

First, a little background. Quantum computers exploit superposition (a particle inhabits two or more mutually exclusive states at the same time) and entanglement (a special form of superposition, in which two or more particles influence each other in spooky ways) to do things that ordinary computers can't. A bit, the basic unit of information of a conventional computer, can be in one of two states, representing a one or a zero. Quantum computers, in contrast, traffic in qubits, which are constructed out of superposed particles that embody numerous states simultaneously.
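The bit-versus-qubit distinction can be made concrete with a few lines of linear algebra. This sketch of a single qubit's state vector is a generic illustration, not tied to any vendor's toolkit:

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit's state is a unit vector of
# two complex amplitudes; measuring it yields 0 or 1 with probability
# equal to the squared magnitude of the corresponding amplitude.
zero = np.array([1, 0], dtype=complex)   # |0>
one = np.array([0, 1], dtype=complex)    # |1>

# An equal superposition, (|0> + |1>) / sqrt(2): the qubit is not
# "secretly" 0 or 1 but genuinely embodies both until measured.
plus = (zero + one) / np.sqrt(2)

probabilities = np.abs(plus) ** 2
print(probabilities)  # [0.5 0.5]
```

With n such qubits entangled, the state vector holds 2^n amplitudes, which is why even a few dozen qubits outgrow what a classical simulation can track.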

For decades, quantum computing has been little more than a hypothesis, or laboratory curiosity, as researchers wrestled with the technical complexities of maintaining superposition and entanglement for long enough to perform useful calculations. (Remember that as soon as you look at an electron or cat, its superposition vanishes.) Now, tech giants like IBM, Amazon, Microsoft and Google have invested in quantum computing, as have many smaller companies, 193 by one count. In March, the startup IonQ announced a $2 billion deal that would make it the first publicly traded firm dedicated to quantum computers.

The Wall Street Journal reports that IonQ plans to produce a device roughly the size of an Xbox videogame console by 2023. Quantum computing, the Journal states, could speed up calculations related to finance, drug and materials discovery, artificial intelligence and others, and crack many of the defenses used to secure the internet. According to Business Insider, quantum machines could help us cure cancer, and even take steps to reverse climate change.

This is the sort of hype that bugs Scott Aaronson. He became a computer scientist because he believes in the potential of quantum computing and wants to help develop it. He'd love to see someone build a machine that proves the naysayers wrong. But he worries that researchers are making promises they can't keep. Last month, Aaronson fretted on his blog Shtetl-Optimized that the hype, which he has been countering for years, has gotten especially egregious lately.

"What's new," Aaronson wrote, "is that millions of dollars are now potentially available to quantum computing researchers, along with equity, stock options, and whatever else causes ka-ching sound effects and bulging eyes with dollar signs." And in many cases, to have a shot at such riches, all an expert needs to do is profess optimism that quantum computing will have revolutionary, world-changing applications, and have them soon. Or at least, not object too strongly when others say that. Aaronson elaborated on his concerns in a two-hour discussion on the media platform Clubhouse. Below I summarize a few of his points.

Quantum-computing enthusiasts have declared that the technology will supercharge machine learning. It will revolutionize the simulation of complex phenomena in chemistry, neuroscience, medicine, economics and other fields. It will solve the traveling-salesman problem and other conundrums that resist solution by conventional computers. It's still not clear whether quantum computing will achieve these goals, Aaronson says, adding that optimists might be in for a rude awakening.

Popular accounts often imply that quantum computers, because superposition and entanglement allow them to carry out multiple computations at the same time, are simply faster versions of conventional computers. Those accounts are misleading, Aaronson says. Compared to conventional computers, quantum computers are unnatural devices that might be best suited to a relatively narrow range of applications, notably simulating systems dominated by quantum effects.

The ability of a quantum computer to surpass the fastest conventional machine is known as quantum supremacy, a phrase coined by physicist John Preskill in 2012. Demonstrating quantum supremacy is extremely difficult. Even in conventional computing, proving that your algorithm beats mine isn't straightforward. You must pick a task that represents a fair test and choose valid methods of measuring speed and accuracy. The outcomes of tests are also prone to misinterpretation and confirmation bias. Testing creates an enormous space for mischief, Aaronson says.

Moreover, the hardware and software of conventional computers keep improving. By the time quantum computers are ready for the marketplace, they might lose potential customers if, for example, classical computers become powerful enough to simulate the quantum systems that chemists and materials scientists actually care about in real life, Aaronson says. Although quantum computers would retain their theoretical advantage, their practical impact would be less.

As quantum computing attracts more attention and funding, Aaronson says, researchers may mislead investors, government agencies, journalists, the public and, worst of all, themselves about their work's potential. If researchers can't keep their promises, excitement might give way to doubt, disappointment and anger, Aaronson warns. The field might lose funding and talent and lapse into a quantum-computer winter like those that have plagued artificial intelligence.

Lots of other technologies (genetic engineering, high-temperature superconductors, nanotechnology and fusion energy come to mind) have gone through phases of irrational exuberance. But something about quantum computing makes it especially prone to hype, Aaronson suggests, perhaps because "quantum" stands for something cool you shouldn't be able to understand.

And that brings me back to Terry Rudolph. In January, after reading about my struggle to understand the Schrödinger equation, Rudolph emailed me to suggest that I read Q Is for Quantum. The 153-page book explains quantum mechanics with a little arithmetic and algebra and lots of diagrams of black-and-white balls going in and out of boxes. Q Is for Quantum has given me more insight into quantum mechanics, and quantum computing, than anything I've ever read.

Rudolph begins by outlining simple rules underlying conventional computing, which allow for the manipulation of bits. He then shifts to the odd rules of quantum computing, which stem from superposition and entanglement. He details how quantum computing can solve a specific problem, one involving thieves stealing code-protected gold bars from a vault, much more readily than conventional computing. But he emphasizes, like Aaronson, that the technology has limits; it cannot compute the uncomputable.

After I read Q Is for Quantum, Rudolph patiently answered my questions about it. You can find our exchange (which assumes familiarity with the book) here. He also answered my questions about PsiQuantum, the firm he co-founded in 2016, which until recently has avoided publicity. Although he is wittily modest about his talents as a physicist (which adds to the charm of Q Is for Quantum), Rudolph is boosterish about PsiQuantum. He shares Aaronson's concerns about hype, and the difficulties of establishing quantum supremacy, but he says those concerns do not apply to PsiQuantum.

The company, he says, is closer than any other firm by a very large margin to building a useful quantum computer, one that solves an impactful problem that we would not have been able to solve otherwise (e.g., something from quantum chemistry which has real-world uses). He adds: "Obviously, I have biases, and people will naturally discount my opinions. But I have spent a lot of time quantitatively comparing what we are doing to others."

Rudolph and other experts contend that a useful quantum computer with robust error-correction will require millions of qubits. PsiQuantum, which constructs qubits out of light, expects by the middle of the decade to be building fault-tolerant quantum computers with fully manufactured components capable of scaling to a million or more qubits, Rudolph says. PsiQuantum has partnered with the semiconductor manufacturer GlobalFoundries to achieve its goal. The machines will be room-sized, comparable to supercomputers or data centers. Most users will access the computers remotely.

Could PsiQuantum really be leading all the competition by a wide margin, as Rudolph claims? Can it really produce a commercially viable machine by 2025? I don't know. Quantum mechanics and quantum computing still baffle me. I'm certainly not going to advise my friend or anyone else to invest in quantum computers. But I trust Rudolph, just as I trust Aaronson.

Way back in 1994, I wrote a brief report for Scientific American on quantum computers, noting that they could, in principle, perform tasks beyond the range of any classical device. I've been intrigued by quantum computing ever since. If this technology gives scientists more powerful tools for simulating complex phenomena, and especially the quantum weirdness at the heart of things, maybe it will give science the jump-start it badly needs. Who knows? I hope PsiQuantum helps quantum computing live up to the hype.

This is an opinion and analysis article.

Further Reading:

Will Artificial Intelligence Ever Live Up to Its Hype?

Is the Schrödinger Equation True?

Quantum Mechanics, the Chinese Room Experiment and the Limits of Understanding

Quantum Mechanics, the Mind-Body Problem and Negative Theology

For more ruminations on quantum mechanics, see my new book Pay Attention: Sex, Death, and Science, and "Tragedy and Telepathy," a chapter in my free online book Mind-Body Problems.

View original post here:

Will Quantum Computing Ever Live Up to Its Hype? - Scientific American

Written by admin

April 24th, 2021 at 1:56 am

Posted in Quantum Computer

Cambridge Quantum pushes into NLP and quantum computing with new head of AI – VentureBeat

Posted: at 1:56 am


without comments


Cambridge Quantum Computing (CQC) hiring Stephen Clark as head of AI last week could be a sign the company is boosting research into ways quantum computing could be used for natural language processing.

Quantum computing is still in its infancy but promises such significant results that dozens of companies are pursuing new quantum architectures. Researchers at technology giants such as IBM, Google, and Honeywell are making measured progress on demonstrating quantum supremacy for narrowly defined problems. "Quantum computers with 50-100 qubits may be able to perform tasks that surpass the capabilities of today's classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably," California Institute of Technology theoretical physics professor John Preskill wrote in a recent paper. "We may feel confident that quantum technology will have a substantial impact on society in the decades ahead, but we cannot be nearly so confident about the commercial potential of quantum technology in the near term, say the next 5 to 10 years."

CQC has been selling software focused on specific use cases, such as in cybersecurity and pharmaceutical and drug delivery, as the hardware becomes available. "We are very different from the other quantum software companies that we are aware of, which are primarily focused on consulting-based revenues," CQC CEO Ilyas Khan told VentureBeat.

For example, amid concerns that improvements in quantum hardware will make it easier to break existing algorithms used in modern cryptography, CQC devised a method to generate quantum-resistant cryptographic keys that cannot be cracked by today's methods. CQC partners with pharmaceutical and drug discovery companies to develop quantum algorithms for improving material discovery, such as working with Roche on drug development, Total on new materials for carbon capture and storage solutions, and CrownBio on novel cancer-treatment biomarker discovery.

The addition of Clark to CQCs team signals the company will be shifting some of its research and development efforts toward quantum natural language processing (QNLP). Humans are good at composing meanings, but this process is not well understood. Recent research established that quantum computers, even with their current limitations, could learn to reason with the uncertainty that is part of real-world scenarios.

"We do not know how we compose meaning, and therefore we have not been sure how this process can be carried over to machines/computers," Khan said.

QNLP could enable grammar-aware representations of language that make sense of text at a deeper level than is currently available with state-of-the-art NLP models like BERT and GPT-3. The company has already demonstrated some early success in representing and processing text using quantum computers, suggesting that QNLP is within reach.

Clark was previously senior staff research scientist at DeepMind and led a team working on grounded language learning in virtual environments. He has a long history with CQC chief scientist Bob Coecke, with whom he collaborated 15 years ago to devise a novel approach for processing language. That research stalled due to the limitations of classical computers. "Quantum computing could help address these bottlenecks, and there are plans to continue that research program," Clark said in a statement.

"The methods we developed to demonstrate this could improve a broad range of applications where reasoning in complex systems and quantifying uncertainty are crucial, including medical diagnoses, fault-detection in mission-critical machines, and financial forecasting for investment management," Khan said.

Continued here:

Cambridge Quantum pushes into NLP and quantum computing with new head of AI - VentureBeat


Are We Doomed to Repeat History? The Looming Quantum Computer Event Horizon – Electronic Design

Posted: at 1:56 am



What you'll learn:

A couple of examples from history highlight our failure to secure the technology that's playing an increasingly large role in both our personal lives and business. When computers were first connected to the internet, we had no idea of the Pandora's Box that was being opened, and cybersecurity wasn't even considered a thing. We failed to learn our lesson when mobile phones exploded onto the world, and again with IoT, still making fast-to-market more important than security. This has constantly left cybersecurity behind the 8-ball in the ongoing effort to secure data.

As we race to quantum computing, we'll see another, and perhaps the greatest, fundamental shift in the way computing is done. Quantum computers promise to deliver an increase in computing power that could spur enormous breakthroughs in disease research, understanding global climate, and delving into the origins of the universe.

As a result, the goal to further advance quantum-computing research has rightfully attracted a lot of attention and funding including $625 million from the U.S. government.1 However, it also will make many of our trusted security techniques inadequate, enabling encryption to be broken in minutes or hours instead of the thousands of years it currently takes.

Two important algorithms that serve as the basis for the security of most commonly used public-key algorithms today will be broken by quantum computers:

As we prepare for a post-quantum world, we have another opportunity to get security right. The challenge of replacing the existing public-key cryptography in these applications with quantum-computer-resistant cryptography is going to be formidable.

Today's state-of-the-art quantum computers are so limited that while they can break toy examples, they don't endanger commercially used key sizes (such as those specified in NIST SP 800-57). However, most experts agree it's only a matter of time until quantum computers evolve to the point of being able to break today's cryptography.
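To make the toy-versus-commercial gap concrete: the canonical quantum factoring demos factor numbers like 15, which classical trial division handles instantly, while commercial RSA moduli run to 2048 bits and beyond. A sketch for illustration only (the function and examples are mine, not from the article):

```python
import math

def trial_division(n: int) -> int:
    """Return the smallest prime factor of n (or n itself if n is prime)."""
    for p in range(2, math.isqrt(n) + 1):
        if n % p == 0:
            return p
    return n

# 15 = 3 * 5 is the classic quantum factoring demo; trivial classically.
print(trial_division(15))    # 3

# Trial division scales roughly as sqrt(n), so a 2048-bit modulus would
# take on the order of 2**1024 steps: hopeless classically, which is
# exactly what Shor's algorithm on a large quantum computer would change.
```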

Cryptographers around the world have been studying the issue of post-quantum cryptography (PQC), and NIST has started a standardization process. However, even though we're likely five to 10 years away from quantum computers becoming widely available, we're approaching what can be described as the event horizon.

Data that has been cryptographically protected by quantum-broken algorithms up to Day 0 of the PQC deployment will likely need to remain secure for years, decades in some cases, after quantum computers are in use. This is known as Mosca's Theorem (see figure).

[Figure: Illustration of a bad outcome under Mosca's Theorem, where a quantum adversary can break the security requirements for recorded messages. The adversary could, for example, break the encryption on a recorded message, or alter a legal document and generate a fake signature indistinguishable from a valid signature.]
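Mosca's Theorem boils down to a single inequality: with x the number of years the data must remain secure, y the years a migration to PQC will take, and z the years until a cryptographically relevant quantum computer arrives, recorded data is at risk whenever x + y > z. A minimal sketch (the variable names are mine):

```python
def at_risk(x_secrecy_years: float, y_migration_years: float,
            z_quantum_arrival_years: float) -> bool:
    """Mosca's inequality: data recorded today can be broken while it
    still matters if x + y > z."""
    return x_secrecy_years + y_migration_years > z_quantum_arrival_years

# Records that must stay confidential for 15 years, a 5-year migration,
# and a capable quantum computer in 10 years: already past the horizon.
print(at_risk(15, 5, 10))   # True
print(at_risk(2, 3, 10))    # False
```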

Deploying any secure solution takes time. Given the inherent longer development time of chips compared to software, chip-based security becomes even more pressing. Throw in the added challenge that PQC depends on entirely new algorithms, and our ability to protect against quantum computers will take many years to deploy. All this adds up to make PQC a moving target.

The good news is that, and I take heart in this, we seem to have learned from previous mistakes, and NIST's PQC standardization process is working. The effort has been underway for more than four years and has narrowed entrants from 69 to seven (four in the category of public-key encryption and three in the category of digital signatures) over three rounds.

However, in late January 2021, NIST started reevaluating a couple of the current finalists and is considering adding new entries as well as some of the candidates from the stand-by list. As mentioned previously, addressing PQC isn't an incremental step. We're learning as we go, which makes it difficult to know what you don't know.

The current finalists were heavily skewed toward lattice-based schemes. What NIST's potential new direction indicates is that, as the community has continued studying the algorithms, lattice-based schemes may not be the holy grail we first hoped they were.

Someone outside the industry may look at that as a failure, but I would argue that's an incorrect conclusion. Only by trial and error, facing failure and course-correcting along the way, can we hope to develop effective PQC algorithms before quantum computers open another, potentially worse, cybersecurity Pandora's box. If we fail to secure it, we risk more catastrophic security vulnerabilities than we've ever seen: aggressors could cripple governments, economies, hospitals, and other critical infrastructure in a matter of hours.

While it's old hat to say, "It's time the world took notice of security and gave it a seat at the table," the time to deliver on that sentiment is now.

Reference

1. Reuters, U.S. to spend $625 million in five quantum information research hubs

Continued here:

Are We Doomed to Repeat History? The Looming Quantum Computer Event Horizon - Electronic Design


Quantum: It’s still not clear what it’s good for, but Amazon and QCI will help developers find out – ZDNet

Posted: at 1:56 am



When it comes to practical problems, including things such as the traveling salesman problem, a classic in optimization, the value of quantum is still to be decided, say Richard Moulds, left, head of Amazon's Braket quantum computing service, and Robert Liscouski, head of Quantum Computing Inc., which makes Qatalyst software to do optimization on both classical and quantum machines.

It's easy to imagine a problem for which, if one had a computer that magically leapt across steps of the computation, your life would be much better.

Say, for example, a computer that auto-magically searches through a vast space of possible solutions much faster than you can with a CPU or GPU.

That's the premise of quantum computing, and surprisingly, for all the hype, it's not clear if that premise is true.

"I don't think we've seen any evidence yet that a quantum machine can do anything that's commercially interesting faster or cheaper than a classical machine," Richard Moulds, head of Amazon Braket, the cloud giant's quantum computing service, said in an interview with ZDNet. "The industry is waiting for that to arrive."

It is the question of the "quantum advantage," the notion that the entangled quantum states in a quantum computer will perform better on a given workload than an electronic system.

"We haven't seen it yet," Robert Liscouski, CEO of Quantum Computing Inc, said of the quantum advantage, in the same Zoom interview with Moulds.

That aporia, the as-yet-unproven quantum advantage, is in fact the premise for a partnership announced this month, whereby QCI's Qatalyst software program will run as a cloud service on top of Braket.

QCI's corporate tag line is "ready-to-run quantum software," and the Qatalyst program is meant to dramatically simplify sending a computing task to the qubits of a quantum hardware machine: the quantum processing units, or QPUs, multiple instances of which are offered through Braket, including D-Wave, IonQ, and Rigetti.

The idea is to get more people working with quantum machines precisely to find out what they might be good for.

"Our platform basically allows the democratization of quantum computing to extend to the user community," said Liscouski.

"If you look back on the quantum industry since it started, it's traditionally been very difficult to get access to quantum hardware," said Moulds, including some machines that are "totally unavailable unless you have a personal relationship with the physicist that built it."

"We're trying to make it easy for everyone to have access to the same machinery; it shouldn't be those that have and those that have not, it should be everyone on the same flywheel," he said.

The spectrum of users who will be working with quantum comprise "two important communities" today, said Moulds, those that want to twiddle qubits at the hardware level, and those that want to spend time on particular problems in order to see if they actually gain any benefit when exposed to the quantum hardware.

"There's a lot of researchers focused on building better hardware, that is the defining force in this industry," said Moulds. "Those types of researchers need to be in the weeds, playing at the qubit level, tweaking the frequencies of the pulses sent to the chip inside the fridge."

On the other hand, "the other class of users is much more geared to Robert's view of the world: they don't really care how it gets done, they just want to understand how to program their problem so that it can be most easily solved."

That second class of users are "all about abstraction, all about getting away from the technology." As quantum evolves, "maybe it slides under so that customers don't even know it's there," mused Moulds.

When it comes to those practical problems, the value of quantum is still to be decided.

There has been academic work showing quantum can speed up tasks, but "that's not been applied to a problem that anybody cares about," said Moulds.

The entire quantum industry is "still finding its way to what applications are really useful," he said. "You tend to see this list of potential applications, a heralded era of quantum computing, but I don't think we really know," he said.

The Qatalyst software from QCI focuses on the kinds of problems that are of perennial interest, generally in the category of optimization, particularly constrained optimization, where a solution to a given loss function or objective function is made more complicated by having to narrow the solution to a bunch of variables that have a constraint of some sort enforced, such as bounded values.

"They are described at a high level as the traveling salesman problem, where you have multi-variate sort of outcomes," said Liscouski. "But it's supply-chain logistics, it's inventory management, it's scheduling, it's things that businesses do today that quantum can really accelerate the outcomes in the very near future."
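Optimization problems like these are commonly recast as QUBO (quadratic unconstrained binary optimization) instances before being handed to an annealer or a classical solver. The tiny brute-force solver below is an illustrative sketch, not QCI's Qatalyst API; real instances are far too large to enumerate, which is precisely where quantum hardware hopes to help:

```python
import itertools

def solve_qubo(Q):
    """Minimize x^T Q x over binary vectors x by exhaustive search.
    Q is an upper-triangular matrix given as a list of lists."""
    n = len(Q)
    best_x, best_e = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n):
        e = sum(Q[i][j] * bits[i] * bits[j]
                for i in range(n) for j in range(n))
        if e < best_e:
            best_x, best_e = bits, e
    return best_x, best_e

# Toy instance: the diagonal rewards setting each variable, while the
# off-diagonal term penalizes setting both, so the minimum picks
# exactly one of the two.
Q = [[-1, 2],
     [0, -1]]
print(solve_qubo(Q))   # ((0, 1), -1)
```

Exhaustive search doubles in cost with every added variable, which is the combinatorial wall that both quantum annealers and "quantum-inspired" classical heuristics try to get around.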

Such problems are "a very important use case," said Moulds. Quantum computers are "potentially good at narrowing the field in problem spaces, searching through large potential combinations in a wide variety of optimization problems," he said.

However, "classical will probably give you the better result" at this time, said Liscouski.

One of the reasons quantum advantage is not yet certain is because the deep phenomena at the heart of the discipline, things such as entanglement, make the field much more complex than early digital computing.

"A lot of people draw the analogy between where we are and the emergence of the transistor," said Moulds.

"I think that's not true: this is not just a case of making the computers we have today smaller and faster and cheaper, we're not anywhere near that regime, that Moore's Law notion of just scaling these things up."

"There's fundamental scientific discoveries that have to be made to build machines that can tackle these sorts of problems on the grand scale that we've been talking about."

Beyond the machines' evolution, there is an evolution implicit for programmers. Quantum brings a fundamentally different approach to programming. "These are physics-based machines, they're not just computational engines that add ones and zeros together, it's not just a faster slide rule," said Moulds.

That different way of programming may, in fact, point the way to some near-term payoff for the Qatalyst software, and Braket. Both Liscouski and Moulds expressed enthusiasm for taking lessons learned from quantum and back-loading them into classical computers.

"Typically, access to quantum computing is through toolkits and resources that require some pretty sophisticated capabilities to program to ultimately get to some result that involves a quantum computer," observed Liscouski.

"With Braket, the platform provides both access to QPUs and classical computing at the same time, and the quantum techniques that we use in the platform will get results for both," said Liscouski.

"It isn't necessarily a black and white decision between quantum and classical," said Moulds. "There's an emerging area, particularly in optimization, where so-called quantum-inspired approaches are used."

"What that means is, looking at the ways that quantum computers actually work and applying that as a new class of algorithms that run on classical machines," he said.

"So, there's a sort of a morphing going on," he said.

An advantage to working with QCI, said Moulds, is that "they bring domain expertise that we don't have," things such as the optimization expertise.

"We've coined the phrase, 'Build on Braket'," said Moulds. "We're trying to build a quantum platform, and we look to companies like QCI to bring domain expertise to use that platform and apply it to problems that customers have really got."

Also important is operational stability and reliability, said Moulds. For a first-tier Web service with tons of users, the priority for Amazon is "running a professional service, a platform that is reliable and secure and durable" on which companies can "build businesses and solve problems."

Although there are "experimental" aspects, he said, "this is not intended to be a best-effort showcase."

Although the quantum advantage is not certain, Moulds holds out the possibility someone working with the technology will find it, perhaps even someone working on Braket.

"The only way we can move this industry forward is by pulling the curtains apart and giving folks the chance to actually see what's real," he said.

"And, boy, the day we see a quantum computer doing something that is materially advantageous from a commercial point of view, you will not miss that moment, I guarantee."

Originally posted here:

Quantum: It's still not clear what it's good for, but Amazon and QCI will help developers find out - ZDNet


Australia and India team up on critical technology – ComputerWeekly.com

Posted: at 1:56 am




Published: 22 Apr 2021 7:07

Australia and India have joined hands to advance the development of critical and emerging technologies such as artificial intelligence (AI), 5G networks, the internet of things (IoT) and quantum computing through a research grant programme.

Through the programme, the two countries hope to help shape a global technology environment that meets Australia and India's shared vision of an open, free, rules-based Indo-Pacific region.

The first three projects in the initial round of the programme, which prioritised proposals focused on strengthening understanding of ethical frameworks and developing technical standards for critical technologies, were recently announced by Australia's Department of Foreign Affairs and Trade.

The first project, led by the Centre for International Security Studies at the University of Sydney, with experts such as Rajeshwari Rajagopalan of the Delhi-based Observer Research Foundation and quantum physicist Shohini Ghose, aims to develop quantum accords to shape international governance of quantum technologies.

The team will build guiding principles on ethics, best practices and progressive applications of quantum technologies.

But rather than propose a formal set of universal rules, they will seek consensus among key stakeholders on what constitutes ethical or unethical behaviour, good or bad practices, productive or destructive applications for emerging quantum technologies.

The second project, spearheaded by La Trobe University and the Indian Institute of Technology Kanpur, will provide Australian and Indian businesses with an ethics and policy framework for outsourcing their technology to Indian providers.

It will do so by improving the understanding of how companies translate being signatories of ethical codes into actual practice. The project will also analyse the emotions and views of stakeholders, expressed on social media, about the ethical issues found to be important through business surveys.

In doing so, the project intends to advance knowledge in AI, cyber and critical technology, ethics, sustainability and risk by bringing together disciplines in business management and ethics, computer science and engineering, and AI and business analytics.

The outcomes expected include recommendations on revised ethical codes and practices and a framework for using AI and advanced analytics to review ethical practices of companies.

The explosive growth in wireless network usage and IoT systems is expected to accelerate. While 5G networks offer significant improvements in terms of capacity, data rates, and potential energy efficiency, there is a need to address critical privacy and security challenges.

The work will focus on the issues that arise from wireless tracking systems that rely on detecting variations in the channel state information (CSI) due to the user's physical activities and wireless networking.

Based on a series of experiments in Australia and India, the project will develop a comprehensive understanding of the extent of private information and metadata exposed and related inferences. This will be used to engage with standards and regulatory agencies and government bodies to strengthen data protection regimes in Australia, India and globally.

The research will be the basis for a whitepaper detailing the emerging wireless network privacy and security threat landscape. This will be followed up with a workshop in Bangalore with key regulators, standards body officials, policy makers and researchers, with the goal of initiating action to effectively address the emerging threats.

The work will be led by the University of Sydney, the University of New South Wales, Orbit Australia, Reliance Jio Infocomm, the Indian Institute of Technology Madras and Calligo Technologies.


See original here:

Australia and India team up on critical technology - ComputerWeekly.com

Written by admin

April 24th, 2021 at 1:56 am

Posted in Quantum Computer

Cleveland Clinic and IBM hope their tech partnership could help prevent the next pandemic – WTHITV.com

Posted: at 1:56 am


without comments

After a year in which scientists raced to understand Covid-19 and to develop treatments and vaccines to stop its spread, Cleveland Clinic is partnering with IBM to use next-generation technologies to advance healthcare research and potentially prevent the next public health crisis.

The two organizations on Tuesday announced the creation of the "Discovery Accelerator," which will apply technologies such as quantum computing and artificial intelligence to pressing life sciences research questions. As part of the partnership, Cleveland Clinic will become the first private-sector institution to buy and operate an on-site IBM quantum computer, called the Q System One. Currently, such machines only exist in IBM labs and data centers.

Quantum computing is expected to expedite the rate of discovery and help tackle problems with which existing computers struggle.

The accelerator is part of Cleveland Clinic's new Global Center for Pathogen Research & Human Health, a facility introduced in January on the heels of a $500 million investment by the clinic, the state of Ohio and economic development nonprofit JobsOhio to spur innovation in the Cleveland area.

The new center is dedicated to researching and developing treatments for viruses and other disease-causing organisms. That will include some research on Covid-19, including why it causes ongoing symptoms (also called "long Covid") for some who have been infected.

"Covid-19 is an example" of how the center and its new technologies will be used, said Dr. Lara Jehi, chief research information officer at the Cleveland Clinic.

"But ... what we want is to prevent the next Covid-19," Jehi told CNN Business. "Or if it happens, to be ready for it so that we don't have to, as a country, put everything on hold and put all of our resources into just treating this emergency. We want to be proactive and not reactive."

Quantum computers process information in a fundamentally different way from regular computers, so they will be able to solve problems that today's computers can't. They can, for example, explore multiple candidate solutions to a problem at once, making it possible to come up with an answer in a fraction of the time it would take a conventional machine.
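The "multiple states at once" idea has a concrete mathematical form: a qubit's state is a vector of complex amplitudes, and n qubits require 2**n amplitudes to describe. A minimal NumPy sketch, purely illustrative and not tied to any real quantum hardware:

```python
import numpy as np

# A qubit's state is a 2-component complex vector: |0> = [1, 0], |1> = [0, 1].
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0                # state (|0> + |1>)/sqrt(2)
probs = np.abs(psi) ** 2      # Born rule: each measurement outcome has probability 0.5

# n qubits live in a 2**n-dimensional space, which is why simulating
# large quantum systems quickly overwhelms classical machines.
n_qubits = 3
state = ket0
for _ in range(n_qubits - 1):
    state = np.kron(state, ket0)

print(probs)        # [0.5 0.5]
print(state.size)   # 8 amplitudes for 3 qubits
```

Doubling the qubit count squares the number of amplitudes a classical simulator must track, which is the intuition behind the speedup claims quoted in these articles.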

Applied to healthcare research, that capability is expected to be useful for modeling molecules and how they interact, which could accelerate the development of new pharmaceuticals. Quantum computers could also improve genetic sequencing to help with cancer research, and design more efficient, effective clinical trials for new drugs, Jehi said.

Ultimately, Cleveland Clinic and IBM expect that applying quantum and other advanced technologies to healthcare research will speed up the rate of discovery and product development. Currently, the average time from scientific discovery in a lab to getting a drug to a patient is around 17 years, according to the National Institutes of Health.

"We really need to accelerate," Jehi said. "What we learned with the Covid-19 pandemic is that we cannot afford, as a human race, to just drop everything and focus on one emergency at a time."

Part of the problem: it takes a long time to process and analyze the massive amount of data generated by healthcare, research and trials, something that AI, quantum computing and high-performance computing (a more powerful version of traditional computing) can help with. Quantum computers do that by "simulating the world," said Dario Gil, director of IBM Research.

"Instead of conducting physical experiments, you're conducting them virtually, and because you're doing them virtually through computers, it's much faster," Gil said.

For IBM, the partnership represents an important proof point for commercial applications of quantum computing. IBM currently offers access to quantum computers via the cloud to 134 institutions, including Goldman Sachs and Daimler, but building a dedicated machine on-site for one organization is a big step forward.

"What we're seeing is the emergency of quantum as a new industry within the world of information technology and computing," Gil said. "What we're seeing here in the context of Cleveland Clinic is ... a partner that says, 'I want the entire capacity of a full quantum computer to be [dedicated] to my research mission."

The partnership also includes a training element that will help educate people on how to use quantum computing for research, which is likely to further grow the ecosystem around the new technology.

Cleveland Clinic and IBM declined to detail the cost of the quantum system being installed on the clinic's campus, but representatives from both organizations called it a "significant investment." Quantum computers are complex machines to build and maintain because they must be stored at extremely cold temperatures (think: 200 times colder than outer space).

The Cleveland Clinic will start by using IBM's quantum computing cloud offering while waiting for its on-premises machine to be built, which is expected to take about a year. IBM plans to install a more advanced version of its quantum computer at the clinic once it is developed in the coming years.

Jehi, the Cleveland Clinic research lead, acknowledged that quantum computing technology is still nascent, but said the organization wanted to get in on the ground floor.

"It naturally needs nurturing and growing so that we can figure out what are its applications in healthcare," Jehi said. "It was important to us that we design those applications and we learn them ourselves, rather than waiting for others to develop them."

Continue reading here:

Cleveland Clinic and IBM hope their tech partnership could help prevent the next pandemic - WTHITV.com

Written by admin

April 24th, 2021 at 1:56 am

Posted in Quantum Computer

Synopsys Rolls Out All-in-One Tool to Speed Up IC Simulation – Electronic Design

Posted: at 1:56 am


without comments

Synopsys, one of the largest vendors of electronic-design-automation (EDA) software, rolled out a unified suite of simulation software that promises to speed up the design of systems-on-a-chip (SoCs), systems-in-package (SiPs), and memory chips for use in data centers, 5G, automotive, artificial intelligence (AI), and other areas.

Today, the most advanced chips have billions of transistors, but it is impossible for engineers to verify by hand every single facet of the chip before it is manufactured. Failure to accurately test the blueprint of a chip for mistakes can drag out the development process and raise the possibility of a premature failure in the device in the future, which can damage a company's reputation.

"EDA is the unknown soldier of the semiconductor design process," said Hany Elhak, who handles product management and marketing for the custom IC and physical verification group at Synopsys. But as chips have become vastly more complicated in recent years, circuit simulation software has become an indispensable part of every engineer's toolbox (Fig.1).

[Figure 1. (Image courtesy of Synopsys.)]

Synopsys sells software tools based on the industry-standard SPICE simulation technology. SPICE is used to create a computer model of an analog or other electronic circuit and put it through its paces to test whether it works as intended. SPICE can also be used to identify potential areas for improvement and test planned changes to the design without being forced to prototype it.
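At its core, a SPICE transient analysis time-steps the differential equations of a circuit model. The sketch below is a toy stand-in for that idea, not the actual SPICE algorithm: it integrates an RC low-pass filter driven by a 1 V step, with arbitrary component values chosen for illustration.

```python
# Toy illustration of what a SPICE-style transient analysis does: implicit
# (backward Euler) time-stepping of the RC low-pass equation
#   C * dv/dt = (Vin - v) / R,  with Vin = 1 V step input.
R, C = 1e3, 1e-6           # 1 kOhm, 1 uF -> time constant tau = 1 ms
tau = R * C
dt = 1e-5                  # 10 us time step
n_steps = 500              # 500 * 10 us = 5 ms, about five time constants

v = 0.0                    # capacitor starts discharged
trace = []
for _ in range(n_steps):
    # Backward Euler update: v_new = (v + dt/tau * Vin) / (1 + dt/tau)
    v = (v + (dt / tau) * 1.0) / (1 + dt / tau)
    trace.append(v)

# After ~5 time constants the output has settled close to the 1 V input.
print(round(trace[-1], 3))   # prints 0.993
```

Real SPICE engines do this for nonlinear networks with millions of coupled equations, which is why the runtimes discussed later in the article become a sign-off bottleneck.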

But when it comes to SoCs or SiPs consisting of memory, analog, radio frequency (RF), digital, and other blocks of intellectual property (IP) on the same silicon die or package, vendors have had to use different design and verification tools for every part of the IC. According to Synopsys, these disparate tools are not cut out for the huge amount of complexity in modern chips.

The electronic design software giant said that it integrated all its simulation software into a single solution, PrimeSim Continuum, aimed at analog, mixed-signal, RF, and custom digital memory designs. The all-in-one system allows its customers to mix and match different simulation engines (Fig. 2) to simulate different parts of the SoC and run them all from the same environment.

[Figure 2. (Image courtesy of Synopsys.)]

To boost productivity, Synopsys said it enhanced the SPICE and FastSPICE architectures at the heart of the software, giving it the speed and capacity to test semiconductor designs up to 10 times faster than previously without giving up the accuracy of the analysis. Synopsys said PrimeSim Continuum can shorten the time it takes to bring products to market and, in turn, reduce costs.

As the semiconductor industry crams more and more transistors on tiny squares of silicon, Synopsys is trying to keep up with the needs of chip vendors with faster and more accurate simulation software.

Synopsys said more of its customers are bringing power management ICs, radio frequency ICs, and other analog chips previously slapped on the circuit board (PCB) in a smartphone or other device onto the same slice of silicon as the CPU, I/O and memory. These increasingly heterogeneous SoCs are also housing larger slices of embedded memory and faster I/O (Fig. 3).

[Figure 3. (Image courtesy of Synopsys.)]

Another problem on the semiconductor industry's plate is increased parasitics (unwanted resistance, inductance, or capacitance in electronic circuits) as these types of chips scale to smaller and smaller nodes. The analog parts of the IC are also more vulnerable to variations that occur as a result of the IC production process. These slight aberrations can cause bugs or a complete failure of the IC in the future, adding to the challenges of verification.

Instead of loading all the different components of a smartphone or other device on a single die, other vendors are rolling out chips based on a system-in-package, or SiP, approach. That opens the door for vendors to create many different chips based on different nodes and then seal them all up together to wring out more performance, reduce power, or add new features.

"It is both scale complexity and system complexity that have been increasing," Elhak said. "You need to simulate not only the chip itself but at the same time all its interactions with other chips in the package," he added. The result in more simulations with longer runtimes and higher levels of accuracy to weed out potential weaknesses in the blueprint of the chip.

Synopsys said its latest solution brings together a wide range of different simulation engines in a single environment that is engineered for ease of use and improved productivity (Fig. 4).

[Figure 4. (Image courtesy of Synopsys.)]

The all-in-one solution includes its PrimeSim SPICE technology for analog, radio frequency, and digital verification; PrimeSim HSPICE, its gold-standard signoff software for foundation IP as well as signal and power integrity; PrimeSim XA, a FastSPICE tool for mixed-signal and SRAM designs; and PrimeSim Pro, its latest FastSPICE architecture for DRAM and flash-memory chips. Linking them all together is PrimeWave, its new design environment.

"All of these engines are combined in a single, unified solution," Elhak said. "We allow you to use the right engine for any of the technologies you are verifying. Synopsys said PrimeSim is one of the cornerstones of its custom design platform, and it is also integrated with its suite of verification software so that customers can resolve problems that turn up in PrimeSim.

Synopsys said the tools are currently being used by Samsung Electronics, NVIDIA, and other early-access customers. The company's major rivals are EDA heavyweights Cadence Design Systems and Siemens EDA.

[Figure 5. (Image courtesy of Synopsys.)]

EDA software uses huge amounts of computational horsepower, and semiconductor giants maintain colossal data centers or rent computing power over the cloud to run it. But creating computer models of electronic circuits with millions to billions of elements and then testing them all out can take a day or more. "SPICE is the bottleneck for signing off any large chip design," Elhak warned.

Today, semiconductor firms run thousands of simulations on the most intricately designed chips before sending the final blueprint to a foundry to be manufactured. That further drags out the chip design process.

Synopsys is trying to solve the speed bottleneck with its state-of-the-art SPICE architecture. The company said that it delivers up to three times faster performance for analog, memory, RF, and other IC designs by scaling to more CPU cores. Synopsys said it can wring out up to 10 times more performance by taking advantage of accelerated computing on NVIDIA GPUs, without giving up accuracy (Fig. 5).

[Figure 6. (Image courtesy of Synopsys.)]

"As modern compute workloads evolve, the scale and complexity of analog IC designs have moved beyond the capacity of traditional circuit simulators," said Edward Lee, vice president of mixed-signal design at NVIDIA, in a statement. He said that the improvements in PrimeSim SPICE shortens the time it takes to carry out verification on analog ICs from days to hours.

Synopsys said it upgraded its underlying FastSPICE architecture to model more advanced 3D DRAM (including the high-bandwidth memory, or HBM, used in data centers) and flash-memory chip designs. The PrimeSim Pro tool uses advanced partitioning and modeling technologies to split simulations into more manageable parts, promising two to five times the speed of other solutions on the market (Fig. 6).

"Relentless technology scaling and innovations around DRAM architecture have resulted in larger and more complex memory designs requiring higher simulation performance and capacity," said Jung Yun Choi, corporate vice president of memory design technology at Samsung. He added that PrimeSim Pro could "keep pace with the capacity needs of our advanced memory designsand allow us to meet our aggressive time-to-results targets."

Synopsys said PrimeSim XA, PrimeSim HSPICE, PrimeSim SPICE, and PrimeSim Pro are all supported by leading foundries, including TSMC and Samsung, on advanced process nodes.

See the article here:

Synopsys Rolls Out All-in-One Tool to Speed Up IC Simulation - Electronic Design

Written by admin

April 24th, 2021 at 1:56 am

Posted in Quantum Computer

Quantum Computing Market Share Current and Future Industry Trends, 2020 to 2027 – The Courier

Posted: at 1:56 am


without comments

Quantum Computing Market is a professional and detailed report focusing on primary and secondary drivers, market share, leading segments and geographical analysis. The analysis examines the market segments expected to see the fastest growth over the estimated forecast period. The report encompasses market definition, currency and pricing, market segmentation, market overview, premium insights, key insights and company profiles of the key market players. The report also helps in understanding the types of consumers, their responses and views about particular products, and their thoughts on improving a product.

Quantum computing is an emerging computer technology based on quantum mechanics and quantum theory, and quantum computers follow the concepts of quantum physics. Quantum computing differs from classical computing in speed, in its basic unit of information and in how it handles data: classical computing uses bits that are either 0 or 1, whereas quantum computing uses qubits that can exist in superpositions of 0 and 1, which enables better results and higher speed. Quantum computing is used mostly in research for comparing numerous solutions and finding an optimum solution to a complex problem, in sectors such as chemicals, utilities, defence, healthcare & pharmaceuticals and various others. It is applied to cryptography, machine learning, algorithms, quantum simulation, quantum parallelism and more, on the basis of qubit technologies such as superconducting qubits, trapped-ion qubits and semiconductor qubits.

Since the technology is still in its growing phase, many research efforts are under way at organizations and universities, including studies on quantum computing aimed at providing advanced and modified solutions for different applications. For instance, Mercedes-Benz has been researching how quantum computing can be used to discover new battery materials for the advanced batteries used in electric cars. Mercedes-Benz has been working with IBM through the IBM Q Network program, which allows companies to access IBM's Q Network and early-stage quantum computing systems over the cloud. The global quantum computing market is projected to register a healthy CAGR of 29.5% over the forecast period of 2019 to 2026.

Download Sample Copy of the Report to understand the structure of the complete report (Including Full TOC, Table & Figures) @https://www.databridgemarketresearch.com/request-a-sample/?dbmr=global-quantum-computing-market&Somesh

Quantum Computing Market Scope and Segmentation:

Global quantum computing market is segmented into seven notable segments which are system, qubits, deployment model, component, application, logic gates and vertical.

Quantum Computing Market Country Level Analysis

For detailed insights on the global quantum computing market size, a competitive landscape is provided: revenue share analysis (million USD) by player and revenue market share (%) by player, along with a qualitative analysis of market concentration rate, product differentiation and new entrants, also considered in a heat-map concentration view.

New Business Strategies, Challenges & Policies are mentioned in Table of Content, Request TOC at @https://www.databridgemarketresearch.com/toc/?dbmr=global-quantum-computing-market&Somesh

Leading Key Players Operating in the Quantum Computing Market Includes:

Some of the major players operating in this market are Honeywell International, Inc., Accenture, Fujitsu, Rigetti & Co, Inc., 1QB Information Technologies, Inc., IonQ, Atom Computing, ID Quantique, QuintessenceLabs, Toshiba Research Europe Ltd, Google, Inc., Microsoft Corporation, Xanadu, Magiq Technologies, Inc., QX Branch, NEC Corporation, Anyon Systems, Inc., Cambridge Quantum Computing Limited, QC Ware Corp, Intel Corporation and others.

Product Launch

The Quantum Computing Market research covers a comprehensive analysis of the following facts:

Table of Content:

PART 01: EXECUTIVE SUMMARY

PART 02: SCOPE OF THE REPORT

PART 03: RESEARCH METHODOLOGY

PART 04: INTRODUCTION

PART 05: MARKET LANDSCAPE

PART 06: MARKET SIZING

PART 07: FIVE FORCES ANALYSIS

PART 08: MARKET SEGMENTATION BY PRODUCT

PART 09: MARKET SEGMENTATION BY DISTRIBUTION CHANNEL

PART 10: CUSTOMER LANDSCAPE

PART 11: MARKET SEGMENTATION BY END-USER

PART 12: REGIONAL LANDSCAPE

PART 13: DECISION FRAMEWORK

PART 14: DRIVERS AND CHALLENGES

PART 15: MARKET TRENDS

PART 16: COMPETITIVE LANDSCAPE

PART 17: COMPANY PROFILES

PART 18: APPENDIX

Inquire Before Buying This Research Report:https://www.databridgemarketresearch.com/inquire-before-buying/?dbmr=global-quantum-computing-market&Somesh

About Us:

An absolute way to forecast what the future holds is to comprehend the trend today!

Data Bridge Market Research has set itself forth as an unconventional and neoteric market research and consulting firm with an unparalleled level of resilience and integrated approaches. We are determined to unearth the best market opportunities and foster efficient information for your business to thrive in the market. Data Bridge Market Research provides appropriate solutions to complex business challenges and initiates an effortless decision-making process.

Contact:

US: +1 888 387 2818

UK: +44 208 089 1725

Hong Kong: +852 8192 7475

corporatesales@databridgemarketresearch.com

Excerpt from:

Quantum Computing Market Share Current and Future Industry Trends, 2020 to 2027 - The Courier

Written by admin

April 24th, 2021 at 1:56 am

Posted in Quantum Computer

Newly Invented Device Controls Thousands of Qubits – Unite.AI

Posted: February 9, 2021 at 6:53 am


without comments

A team of scientists and engineers at the University of Sydney and Microsoft Corporation has developed a new device with big implications for quantum computing. The single chip operates at temperatures 40 times colder than deep space and is capable of controlling signals for thousands of qubits, the fundamental building blocks of quantum computers.

The results were published in Nature Electronics.

Professor David Reilly, who holds a joint position with the University of Sydney and Microsoft, led the design of the chip.

"To realise the potential of quantum computing, machines will need to operate thousands if not millions of qubits," said Professor Reilly.

"The world's biggest quantum computers currently operate with just 50 or so qubits," he continued. "This small scale is partly because of limits to the physical architecture that controls the qubits. Our new chip puts an end to those limits."

One of the main requirements of an efficient quantum system is that qubits operate at temperatures near absolute zero, or -273.15 degrees Celsius. This requirement exists so that the qubits do not lose their quantum character of matter or light, which quantum computers need to perform specialized applications.

One of the reasons quantum systems often involve many wires is that they operate based on instructions, which come in the form of electrical signals sent and received.

Professor Reilly is also the Chief Investigator at the ARC Centre for Engineered Quantum Systems (EQUS).

"Current machines create a beautiful array of wires to control the signals; they look like an inverted gilded bird's nest or chandelier. They're pretty, but fundamentally impractical. It means we can't scale the machines up to perform useful calculations. There is a real input-output bottleneck," said Professor Reilly.

According to Dr. Kushal Das, Microsoft Senior Hardware Engineer and joint inventor of the device, "Our device does away with all those cables. With just two wires carrying information as input, it can generate control signals for thousands of qubits. This changes everything for quantum computing."

The new chip was invented at the Microsoft Quantum Laboratories, located at the University of Sydney. The partnership brings together two different worlds, academia and industry, to find innovative approaches to engineering challenges.

"Building a quantum computer is perhaps the most challenging engineering task of the 21st century. This can't be achieved working with a small team in a university laboratory in a single country but needs the scale afforded by a global tech giant like Microsoft," Professor Reilly said.

"Through our partnership with Microsoft, we haven't just suggested a theoretical architecture to overcome the input-output bottleneck, we've built it."

"We have demonstrated this by designing a custom silicon chip and coupling it to a quantum system," he said. "I'm confident to say this is the most advanced integrated circuit ever built to operate at deep cryogenic temperatures."

The newly developed chip could play a major role in advancing quantum computers, which are among the most revolutionary technologies within our grasp. Quantum computers can solve problems that classical computers cannot, in fields such as cryptography, medicine, AI and more.

More:

Newly Invented Device Controls Thousands of Qubits - Unite.AI

Written by admin

February 9th, 2021 at 6:53 am

Posted in Quantum Computer

Universities are Building the Future of Quantum Internet – EdTech Magazine: Focus on Higher Education

Posted: at 6:53 am


without comments

In late 2019, Google, in partnership with NASA, said that its quantum computer performed in 200 seconds a computation that would take the world's fastest supercomputer thousands of years. Even so, quantum computers need quantum networks to communicate, and today's internet doesn't cut it.

In hot pursuit of a quantum internet is the University of Arizona in Tucson, which the National Science Foundation selected last summer to receive a five-year, $26 million grant to establish the Center for Quantum Networks. CQN's director and principal investigator, Saikat Guha, a professor in the university's College of Optical Sciences, will lead a team that brings together leading researchers from Howard University, the University of Massachusetts Amherst, the University of Oregon, Northern Arizona University, the University of Chicago and Brigham Young University.

One of the CQN projects will involve building a test bed in Tucson: a quantum network spanning six buildings and 10 laboratory sites on campus. On the East Coast, CQN's partner universities, including Harvard and the Massachusetts Institute of Technology, will build a Boston-area test bed to explore quantum communications "in a conceptually simple network setting over metropolitan-scale distances," Guha says.

Whenever it arrives, the quantum internet will not replace the classical internet. Instead, users will see an upgrade with a new service: that of quantum communication. The quantum internet would initially be used for research and targeted applications by government, academia and industry users, including national defense, banking and finance, the cloud computing industry, and pharmaceutical research and development, Guha explains. A biomedical researcher could use the quantum internet to simulate a new synthetic molecule. Eventually, a student could open a quantum cloud computing app on a handheld device to perform computations.

"The biggest impact on academia that I foresee is creating a transdisciplinary bridge and collaboration among researchers in disciplines that would not have otherwise worked together," Guha says.

Quantum internet research could spawn a new generation of IT innovation. Source: University of Arizona

Other teams across the globe are similarly exploring quantum networking. The European Quantum Internet Alliance, formed in 2018 from 12 universities in eight countries, announced in October a major development from the Sorbonne University team toward the scalability of a quantum internet. And in the U.S., a collaboration between Stony Brook University in New York and Brookhaven National Laboratory recently demonstrated that quantum bits (qubits) from two distant quantum computers can be entangled in a third location.
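The entanglement behind such demonstrations can be illustrated at the state-vector level. Below is a NumPy sketch of an ideal Bell pair; it is purely illustrative and ignores the photonics, loss and timing that make the real experiments hard:

```python
import numpy as np

# Prepare the Bell state (|00> + |11>)/sqrt(2): apply a Hadamard to the
# first qubit, then a CNOT with the first qubit as control.
ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1.0

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I2 = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

bell = CNOT @ np.kron(H, I2) @ ket00
probs = np.abs(bell) ** 2

# All probability sits on |00> and |11>: measuring either qubit
# determines the other's outcome, however far apart the qubits are.
print(np.round(probs, 3))
```

Distributing one half of such a pair to each end of a network link is the basic resource a quantum internet would manage; the experimental challenge is doing this over real fiber without destroying the fragile correlations.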

"There will be new apps that use this new service for things we do not know today," Guha says. "The quantum internet, when available to the average home, will spawn a whole new generation of IT innovators and app developers who will come up with new ways the powerful new service of quantum communication can be used."

See the rest here:

Universities are Building the Future of Quantum Internet - EdTech Magazine: Focus on Higher Education

Written by admin

February 9th, 2021 at 6:53 am

Posted in Quantum Computer

