
Archive for the ‘Quantum Computing’ Category

Classical vs. quantum computing: What are the differences? – TechTarget

Posted: December 21, 2022 at 12:15 am



As new technologies develop and gain traction, the public tends to divide into two groups: those who believe they will make an impact and grow, and those who don't. The former group tends to be correct, so it is crucial to understand how future technologies differ from the status quo to prepare for their adoption en masse.

Classical computing has been the norm for decades, but in recent years, quantum computing has continued to develop rapidly. The technology is still in its early stages, but it already has uses, and many more potential ones, in AI/ML, cybersecurity, modeling and other applications.

It might be years before quantum computing is widely implemented. However, exploring the differences between classical and quantum computing now builds the understanding you will need should the technology become more widespread.

Quantum computers typically must operate under more regulated physical conditions than classical computers because of quantum mechanics. Classical computers have less compute power than quantum computers and cannot scale as easily. They also use different units of data -- classical computers use bits and quantum computers use qubits.

In classical computers, data is processed in a binary manner.

Classical computers use bits -- eight bits make up one byte -- as their basic unit of data. Classical computers encode data in binary, as a 1 or a 0. Simply put, these 1s and 0s indicate the state of on or off, respectively. They can also indicate true or false, or yes or no, for example.

Processing bits this way is known as serial processing, which is successive in nature: one operation must complete before the next begins. Many computing systems use parallel processing, an expansion of classical processing that can perform simultaneous computing tasks. Classical computers also return a single, repeatable result, because binary operations are deterministic.

Quantum computing, however, follows a different set of rules. Quantum computers use qubits as their unit of data. Qubits, unlike bits, can be a value of 1 or 0, but can also be 1 and 0 at the same time, existing in multiple states at once. This is known as superposition, where properties are not defined until they are measured.

According to IBM, "Groups of qubits in superposition can create complex, multidimensional computational spaces," which enables more complex computations. When qubits become entangled, the state of one qubit is directly correlated with the state of the other, a property quantum algorithms exploit when processing information across qubits.

In classical computers, algorithms need a lot of parallel computations to solve problems. Quantum computers can account for multiple outcomes when they analyze data with a large set of constraints. The outputs have an associated probability, and quantum computers can perform more difficult compute tasks than classical computers can.

Most classical computers operate on Boolean logic and algebra, and their power increases linearly with the number of transistors in the system. The relationship is direct: in a classical computer, power scales roughly 1:1 with the number of transistors.

Because quantum computers' qubits can represent a 1 and 0 at the same time, a quantum computer's power increases exponentially in relation to the number of qubits. Because of superposition, the number of states a quantum computer can represent is 2^N, where N is the number of qubits.
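As a rough illustration of superposition, entanglement and this 2^N scaling, here is a minimal Python sketch using NumPy. It is a toy example of ours, not code from the TechTarget article; it simply treats quantum states as vectors of amplitudes whose squared magnitudes give measurement probabilities.

# Toy model: an N-qubit state is a vector of 2**N complex amplitudes;
# the squared magnitude of an amplitude gives that outcome's probability.
import numpy as np

plus = np.array([1, 1]) / np.sqrt(2)       # one qubit in equal superposition of 0 and 1
print(np.abs(plus) ** 2)                   # -> [0.5 0.5]: either outcome, 50% probability

bell = np.zeros(4)                         # two entangled qubits (a Bell state):
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)   # only 00 and 11 have nonzero amplitude,
print(np.abs(bell) ** 2)                   # -> [0.5 0 0 0.5], so measurements always agree

for n in (10, 30, 50):                     # state-space growth with qubit count
    print(n, "qubits ->", 2 ** n, "amplitudes")

Even at 50 qubits the state vector already holds about 10^15 amplitudes, which is why simulating quantum machines on classical hardware becomes intractable so quickly.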

Classical computers are well-suited for everyday use and normal conditions. Consider something as simple as a standard laptop. Most people can take their computer out of their briefcase and use it in an air-conditioned café or on the porch during a sunny summer day. In these environments, performance won't take a hit for normal uses like web browsing and sending emails over short periods of time.


Data centers and larger computing systems are more complex and sensitive to temperature, but still operate within what most people would consider "reasonable" temperatures, such as room temperature. For example, ASHRAE recommends A1 to A4 class hardware stays at 18 to 27 degrees Celsius, or 64.4 to 80.6 degrees Fahrenheit.

Some quantum computers, however, need to reside in heavily regulated and stringent physical environments. Some need to be kept near absolute zero, around -273.15 degrees Celsius or -459.67 degrees Fahrenheit, although Quantum Brilliance recently developed the first room-temperature quantum computer.

The reason for the cold operating environments is that qubits are extremely sensitive to mechanical and thermal influences. Disturbances can cause the atoms to lose their quantum coherence -- essentially, the ability of a qubit to represent both a 1 and a 0 -- which can cause errors in computations.

Like most technologies, quantum computing poses opportunities and risks. While it might be a while before quantum computers really take off, start conversations with leadership now and develop plans for quantum computing.

Organizations that don't plan on implementing quantum computing in their own business will still need to prepare for the external threats quantum computing might pose. Firstly, quantum computers can potentially crack even the most powerful and advanced security measures. For example, a sufficiently motivated hacker could, in theory, use quantum computing to quickly break the cryptographic keys commonly used in encryption.

In addition, organizations that are considering quantum computers for their data centers or certain applications will have to prepare facilities. Like any other piece of infrastructure, quantum computers need space, electricity supply and resources to operate. Begin examining the options available to accommodate them, looking at budget, space, facility and staffing needs to start planning.

See the article here:

Classical vs. quantum computing: What are the differences? - TechTarget

Written by admin

December 21st, 2022 at 12:15 am

Posted in Quantum Computing

The state of quantum computing in 2023 – Verdict

Posted: at 12:15 am



The quantum computing market is expected to reach between $1 billion and $5 billion by 2025, according to GlobalData's Tech, Media and Telecom Predictions 2023 report. Quantum computing uses principles of quantum physics to store and compute data. Superposition describes the ability of a quantum bit (qubit) to exist in an on and off state simultaneously. Qubits must be isolated from their external environment to achieve a state of superposition known as coherence. This is where much of the scientific and technological challenge exists, as qubits often decohere before the computation is completed. Current quantum computers (QCs) are said to be in the noisy intermediate-scale quantum (NISQ) stage of development.

IBM is a world leader in quantum computing research and development. In November 2022, it unveiled its latest QC, Osprey. Boasting 433 qubits, Osprey is currently the highest-qubit-count QC in the world, more than triple IBM's previous record of the 127-qubit Eagle QC. IBM is on track to develop 4,000-qubit QCs by 2025, with a roadmap of 1,121 qubits in 2023 (Condor QC) and 1,386 qubits in 2024 (Flamingo QC). GlobalData predicts that full-scale commercial quantum computing will likely begin in 2027.

Big Tech firms have begun offering quantum-as-a-service (QaaS), notably Microsoft's Azure Quantum, which provides users with access to hardware from other companies such as IonQ, Toshiba, and Honeywell's Quantinuum. The QaaS market will continue to grow as more companies invest in quantum.

JP Morgan, Volkswagen, and Lockheed Martin are already investing in their quantum infrastructure in preparation for widespread adoption. These companies are well-positioned to benefit from a quantum advantage in financial modeling, process optimization, cybersecurity, and military research and development.

2023 will see the quantum capabilities gap continue to narrow between the US and China as the tech war intensifies. Of the 62 quantum computing start-ups listed on GlobalData's companies database, 29% were headquartered in the US, followed by the UK (13%) and China (10%). These countries will become hubs of activity in quantum computing, attracting both domestic and international investment.

Quantum computing is the latest arena of competition between the US and China as both countries strive for technological supremacy. Major US tech firms currently dominate quantum computing. In China, Alibaba leads ahead of Huawei and Baidu. Alibaba is at the forefront of China's quantum strategy; it has invested $15 billion into the DAMO science and technology research center and partnered with the Chinese Academy of Sciences.

Though currently estimated to be lagging five years behind the US in quantum computing, China surpasses the US in quantum satellite communications. The Chinese government has invested $10 billion into the construction of the National Laboratory for Quantum Information Science. When completed, its research focus will be on quantum technologies with direct military application. China benefits from an autocratic economic model: it can pool resources from institutions, corporations, and the government, all working collectively to achieve a single aim.

In contrast, US tech firms compete against each other. The US government is increasingly involving itself in quantum development. The US CHIPS and Science Act, which was signed into law in August 2022, details $153 million of domestic funding to support US quantum computing initiatives, including discovery, infrastructure, and workforce. This support package will be implemented between 2023 and 2027.

Read more here:

The state of quantum computing in 2023 - Verdict

Written by admin

December 21st, 2022 at 12:15 am

Posted in Quantum Computing

U Toronto and Fujitsu team use quantum-inspired computing to … – Green Car Congress

Posted: at 12:15 am



Researchers from the University of Toronto's Faculty of Applied Science & Engineering and Fujitsu have applied quantum-inspired computing to find the promising, previously unexplored chemical family of Ru-Cr-Mn-Sb-O2 as acidic oxygen evolution reaction catalysts for hydrogen production.

The best catalyst shows a mass activity eight times higher than state-of-the-art RuO2 and maintains performance for 180 h. A paper on their work appears in the journal Matter.

Choubisa et al.

Scaling up the production of what we call green hydrogen is a priority for researchers around the world because it offers a carbon-free way to store electricity from any source. This work provides proof-of-concept for a new approach to overcoming one of the key remaining challenges, which is the lack of highly active catalyst materials to speed up the critical reactions.

Ted Sargent, senior author

Nearly all commercial hydrogen is produced from natural gas. The process produces carbon dioxide as a byproduct; if the CO2 is vented to the atmosphere, the product is known as grey hydrogen, but if the CO2 is captured and stored, it is called blue hydrogen. Green hydrogen is a carbon-free method that uses an electrolyzer to split water into hydrogen and oxygen gas. The low efficiency of available electrolyzers means that most of the energy in the water-splitting step is wasted as heat, rather than being captured in the hydrogen.

Researchers around the world are striving to find better catalyst materials that can improve this efficiency. Because each potential catalyst material can be made of several different chemical elements, combined in a variety of ways, the number of possible permutations quickly becomes overwhelming.

One way to do it is by human intuition, by researching what materials other groups have made and trying something similar, but that's pretty slow. Another way is to use a computer model to simulate the chemical properties of all the potential materials we might try, starting from first principles. But in this case, the calculations get really complex, and the computational power needed to run the model becomes enormous.

Jehad Abed, co-lead author

To find a way through, the team turned to the emerging field of quantum-inspired computing. They made use of the Digital Annealer, a tool that was created as the result of a long-standing collaboration between U of T Engineering and Fujitsu Research. This collaboration has also resulted in the creation of the Fujitsu Co-Creation Research Laboratory at the University of Toronto.

Digital Annealer (DA) is a computer architecture developed to solve large-scale combinatorial optimization problems rapidly using CMOS digital technology. DA is unique in that it uses a digital circuit design inspired by quantum phenomena and can solve problems that are very difficult and time-consuming or even impossible for classical computers to address.

Digital Annealer is inspired by quantum mechanics but, unlike quantum computers, does not require cryogenic temperatures. DA makes use of a method called annealing, named after the annealing process used in metallurgy, in which metal is heated to a high temperature and then slowly cooled so that its structure settles into a lower-energy, more stable state.

Using the analogy of placing blocks in a box, in the classical computational approach, the blocks are placed in sequence. If a solution is not found, the process is restarted and repeated until a solution is found. With the annealing approach, the blocks are placed randomly and the entire system is shaken. As the shaking is gradually reduced, the system becomes more stable as the shapes quickly fit together.

DA is designed to solve fully connected quadratic unconstrained binary optimization (QUBO) problems and is implemented on CMOS hardware. The second-generation Digital Annealer expands the scale of problems that can be solved from the 1,024 bits of the first generation, launched in May 2018, to 8,192 bits and an increase in computational precision.

This leads to substantial gains in precision and performance for enhanced problem-solving and new applications, expanding the complexity that the second-generation Digital Annealer can tackle by a factor of one hundred. Its algorithm is based on simulated annealing, but it also takes advantage of massive parallelization enabled by the custom application-specific CMOS hardware.

The Digital Annealer is a hybrid of unique hardware and software designed to be highly efficient at solving combinatorial optimization problems. These problems include finding the most efficient route between multiple locations across a transportation network, or selecting a set of stocks to make up a balanced portfolio. Searching through different combinations of chemical elements to find a catalyst with desired properties is another example, and it was a perfect challenge for our Digital Annealer to address.

Hidetoshi Matsumura, senior researcher at Fujitsu Consulting (Canada)
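To make the QUBO formulation concrete, here is a minimal simulated-annealing sketch in Python. It is an illustrative toy rather than Fujitsu's implementation: the random eight-variable matrix, the cooling schedule and every parameter are invented, and the Digital Annealer runs a massively parallel hardware version of this kind of search.

# Toy simulated annealing on a QUBO: minimize x^T Q x over bit strings x.
import numpy as np

rng = np.random.default_rng(0)
Q = rng.normal(size=(8, 8))          # random symmetric QUBO matrix (hypothetical)
Q = (Q + Q.T) / 2

def energy(x):
    return x @ Q @ x                 # the QUBO objective for a 0/1 vector x

x = rng.integers(0, 2, size=8)       # random starting bit string
for temperature in np.geomspace(5.0, 0.01, 2000):   # gradual "cooling" schedule
    candidate = x.copy()
    candidate[rng.integers(8)] ^= 1  # propose flipping one randomly chosen bit
    delta = energy(candidate) - energy(x)
    # Always accept improvements; accept uphill moves with a probability that
    # shrinks as the temperature drops, which helps escape local minima early on.
    if delta < 0 or rng.random() < np.exp(-delta / temperature):
        x = candidate
print("best bit string:", x, "energy:", energy(x))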

In the paper, the researchers used a technique called cluster expansion to analyze an enormous number of potential catalyst material designs; they estimate the total as a number on the order of hundreds of quadrillions. For perspective, one quadrillion is approximately the number of seconds that would pass by in 32 million years.

Quantum annealers and similar quantum-inspired optimizers have the potential to provide accelerated computation for certain combinatorial optimization challenges. However, they have not been exploited for materials discovery because of the absence of compatible optimization mapping methods. Here, by combining cluster expansion with a quantum-inspired superposition technique, we leverage quantum annealers in chemical space exploration for the first time. This approach enables us to accelerate the search of materials with desirable properties 10-50 times faster than genetic algorithms and Bayesian optimizations, with a significant improvement in ground state prediction accuracy.

Choubisa et al.

The results pointed toward a promising family of materials composed of ruthenium, chromium, manganese, antimony and oxygen, which had not been previously explored by other research groups.

The team synthesized several of these compounds and found that the best of them demonstrated a mass activity that was approximately eight times higher than some of the best catalysts currently available.

The new catalyst has other advantages too: it operates well in acidic conditions, which is a requirement of state-of-the-art electrolyzer designs. Currently, these electrolyzers depend on catalysts made largely of iridium, which is a rare element that is costly to obtain. In comparison, ruthenium, the main component of the new catalyst, is more abundant and has a lower market price.

The team aims to further optimize the stability of the new catalyst before it can be tested in an electrolyzer. Still, the latest work serves as a demonstration of the effectiveness of the new approach to searching chemical space.

I think whats exciting about this project is that it shows how you can solve really complex and important problems by combining expertise from different fields. For a long time, materials scientists have been looking for these more efficient catalysts, and computational scientists have been designing more efficient algorithms, but the two efforts have been disconnected. When we brought them together, we were able to find a promising solution very quickly. I think there are a lot more useful discoveries to be made this way.

Hitarth Choubisa, co-lead author

Resources

Hitarth Choubisa, Jehad Abed, Douglas Mendoza, Hidetoshi Matsumura, Masahiko Sugimura, Zhenpeng Yao, Ziyun Wang, Brandon R. Sutherland, Alán Aspuru-Guzik, Edward H. Sargent (2022) "Accelerated chemical space search using a quantum-inspired cluster expansion approach," Matter. doi: 10.1016/j.matt.2022.11.031

Originally posted here:

U Toronto and Fujitsu team use quantum-inspired computing to ... - Green Car Congress

Written by admin

December 21st, 2022 at 12:15 am

Posted in Quantum Computing

Europe’s first-ever exascale supercomputer will launch in Germany … – TNW

Posted: at 12:15 am



JUPITER is set to become the first European supercomputer to make the leap into the exascale era. This means it'll be capable of performing more than an exaflop, or one quintillion operations, per second. In other words, the device's computing power will surpass that of 5 million laptops or PCs combined.
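That laptop comparison implies a sustained throughput of roughly 200 billion operations per second per machine, a figure we are assuming here rather than one stated by TNW:

exaflop = 1e18            # one quintillion operations per second
laptop = 200e9            # assumed sustained throughput of a single laptop
print(exaflop / laptop)   # -> 5000000.0, i.e., 5 million laptops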

The European High Performance Computing Joint Undertaking (EuroHPC JU), which is behind the project, has now signed a hosting agreement with the Jülich Supercomputing Centre (JSC) in Germany, where JUPITER will be located.

Under the terms of the agreement, JUPITER (which stands for Joint Undertaking Pioneer for Innovative and Transformative Exascale Research) will be installed on the campus of the Forschungszentrum Jlich research institute in 2023. The machine will be operated by the JSC.

This new supercomputer will be backed by a €500 million budget, split equally between the EuroHPC JU and German federal and state sources.

JUPITER's remarkable power will support the development of high-precision models of complex systems. The machine will be used to analyse key societal issues in Europe, such as health, biology, climate, energy, security, and materials. It will also support intensive use of AI and analysis of enormous data volumes.

Experts expect the computer to improve research quality (while reducing costs), and integrate future technologies such as quantum computing. The device will be available to a wide range of European users in the scientific community, industry, and public sector.

Along with its outstanding computing power, JUPITER will feature a dynamic, modular architecture, which will enable optimal use of the various computing modules used during complex simulations. Notably, JUPITER has been designed as a green supercomputer and will be powered by green electricity, supported by a warm-water cooling system. At the same time, its average power consumption is anticipated to be up to 15 megawatts, approximately six megawatts less than the US Frontier exascale supercomputer.

Upon completion, JUPITER will become the ninth (and best) supercomputer the EuroHPC JU has provided to Europe. Three are expected to be available shortly, and five are already operational. Among them is LUMI, which has been ranked the fastest in the EU and third fastest in the world.

Read more here:

Europe's first-ever exascale supercomputer will launch in Germany ... - TNW

Written by admin

December 21st, 2022 at 12:15 am

Posted in Quantum Computing

2022 Year in Review – Caltech

Posted: at 12:15 am



As the end of 2022 quickly approaches, Caltech News looks back at our coverage of the research, discoveries, events, and experiences that shaped the Institute. Here are some highlights.

Caltech researchers used data gathered both in space by the Mars Reconnaissance Orbiter (MRO) and on the ground by the Mars Perseverance Rover to continue to probe the Red Planet's past and any potential signs it was previously hospitable to life. In January, MRO survey data revealed that liquid water was on Mars about one billion years earlier than suspected. Meanwhile, Perseverance made a beeline across the floor of Jezero Crater during spring 2022, arriving at an ancient river delta in April. The delta is thought to be one of the best possible places to search for past signs of life; there, Perseverance found signs of past water along with evidence of possible organic compounds in the igneous rocks on the crater floor. After a few months at the delta, Perseverance project scientist Ken Farley announced in September the discovery of a class of organic molecules in two samples of mudstone rock collected from a feature called Wildcat Ridge. While these organic molecules can be produced through nonliving chemical processes, some of the molecules themselves are among the building blocks of life.

Not all eyes aimed toward space are set on Mars, however. New instruments and surveys provided insights related to other celestial bodies in our Milky Way galaxy, such as asteroids, and helped discover an abundance of planets outside of our solar system.

In March, the NASA Exoplanet Archive, an official catalog for exoplanets (planets that circle other stars beyond our sun) housed at Caltech's IPAC astronomy center, officially hit a new milestone: 5,000 exoplanets.

Looking even farther out into the universe from planet Earth, Caltech researchers made several discoveries, including a tight-knit pair of supermassive black holes locked in an epic waltz, and a new "black widow" star system, spotted by the Zwicky Transient Facility (ZTF), in which a rapidly spinning dead star called a pulsar is slowly evaporating its companion.

Caltech's ZTF sky survey instrument, based at Palomar Observatory, had previously discovered the first known asteroid to circle entirely within the orbit of Venus. To honor the Pauma band of Indigenous peoples whose ancestral lands include Palomar Mountain, the ZTF team asked the band to name the asteroid. They chose 'Ayl'chaxnim, which means "Venus girl" in their native language of Luiseño.

And far closer to home, new faculty member and historian Lisa Ruth Rand set her sights on the debris we have left in Earth's orbit (and beyond), and what it can tell us about humanity and our evolving relationship with space.

Caltech astronomers continue to lead the way in the development of ever more powerful instruments for answering fundamental questions about our place in the universe. The new Keck Planet Finder, led by astronomer Andrew Howard, will take advantage of the W. M. Keck Observatory's giant telescopes to search for and characterize hundreds, and ultimately, thousands of exoplanets, including Earth-size planets that may harbor conditions suitable for life.

NASA has also selected the UltraViolet EXplorer (UVEX) proposal, led by astronomer Fiona Harrison, for further study. If selected to become a mission, UVEX would conduct a deep survey of the whole sky in ultraviolet light to provide new insights into galaxy evolution and the life cycle of stars. Harrison's current NASA mission, NuSTAR (Nuclear Spectroscopic Telescope Array), an X-ray telescope that hunts black holes, celebrated 10 years in space. Meanwhile, the development of NASA's SPHEREx (Spectro-Photometer for the History of the Universe, Epoch of Reionization and Ices Explorer), led by astronomer Jamie Bock, is forging ahead with a customized test chamber delivered this year to Caltech.

As new telescopes continue to come together, a venerable Caltech telescope is being taken apart atop Maunakea in Hawaii. The Caltech Submillimeter Observatory (CSO) received the final permits to begin its decommissioning process. Scientists plan to ultimately repurpose the telescope and put it back together in Chile.

Caltech's fundamental quest for understanding life and our origins also inspires many research efforts and innovations with the potential to improve human health and well-being.

Continuing work that began with the COVID-19 pandemic, Pamela Björkman and colleagues developed a new type of vaccine that protects against the virus that causes COVID-19 and closely related viruses, while Sarkis Mazmanian has shown how an imbalance of gut microbes can cause binge eating. Meanwhile, other researchers made real what would have seemed like science fiction only a few years ago: Caltech medical engineer Wei Gao created an artificial skin for robots that interfaces with human skin and allows a human operator to "feel" what the robot is sensing; chemical engineer Mikhail Shapiro engineered a strain of remote-controlled bacteria that seek out tumors inside the human body to deliver targeted drugs on command; and neuroscientist Richard Andersen and colleagues developed a brain-machine interface that can read a person's brain activity and translate it into the words the person was thinking, technology that may one day allow people with full-body paralysis to speak. Additionally, Caltech researchers created a "synthetic" mouse embryo, complete with brain and beating heart; completed a 20-year quest to decode one of the most complex and important pieces of machinery in our cells; and discovered how fruit flies' extremely sensitive noses help them find food.

In 2022, Caltech paid tribute to its long history of advances in sustainability and then looked forward to pioneering new initiatives and technologies that will reduce humanity's footprint on Earth's fragile environment. Through the newly launched Caltech Heritage Project, a series of oral histories published this year captured the pivotal role Caltech alumni played in the electric car revolution. Meanwhile, in April, Caltech hosted the Caltech Energy 10 (CE10) conference, bringing thought leaders to campus to chart a path toward achieving the Biden administration's stated goal to cut U.S. global warming gas emissions by 50 percent within the next 10 years.

Caltech researchers continue to contribute to research to generate cleaner energy, ranging from work in the laboratory of John Dabiri (MS '03, PhD '05) to optimize wind farms to efforts to create and commercialize technology for capturing carbon already released into the atmosphere (which earned a Caltech-based startup an XPrize Award).

On campus, Caltech began construction of the Resnick Sustainability Center, scheduled to open in 2024, which will bring together talent from across campus to tackle issues related to climate change and other human impacts on the natural environment. And as the year wraps up, the Space-based Solar Power Project is preparing to launch a demonstration into space to test three key elements of its ambitious plan to harvest solar energy in space (where there are no cloudy days) and beam it wirelessly down to Earth.

As the AI4Science Initiative continually demonstrates and the Caltech Science Exchange recently highlighted, artificial intelligence (AI) and machine learning (ML) have applications that reach every corner of campus. In 2022, AI was used to generate the first-ever picture of the black hole at the center of our own galaxy (only the second image of a black hole ever created), to pave the way to improve aircraft design, to help drones fly autonomously in real-weather conditions, and to fight COVID-19. This election year, researchers from Caltech discussed how machine learning can both combat misinformation and fight online bullying.

Caltech continues its role as a major hub of quantum research. The newly announced Dr. Allen and Charlotte Ginsburg Center for Quantum Precision Measurement will unite a diverse community of theorists and experimentalists devoted to understanding quantum systems and their potential uses (see a video about the new center). The 25th annual Conference on Quantum Information Processing, or QIP, the world's largest gathering of researchers in the field of quantum information, a discipline that unites quantum physics and computer science, was held in Pasadena for the first time and represented the first major collaboration between Caltech and the new AWS Center for Quantum Computing on campus.

Fundamental research in the quantum sciences charged ahead, with findings that included a quantum computer-based experiment to test theoretical wormholes and new demonstrations showing how graphene can be used in flexible and wearable electronics.

This year, members of the Caltech community received recognition for expanding the boundaries of scientific knowledge, but also for humanitarian endeavors and for blazing new educational and occupational paths for others to follow.

In March, Roman Korol, a Caltech graduate student, launched a project to collect and distribute humanitarian aid for families affected by the war in Ukraine.

In April, Jessica Watkins, who worked on the Mars Curiosity rover mission while a postdoc at Caltech, made history as the first Black woman on the International Space Station. From space, she hosted a live Q&A for Caltech students and faculty in Ramo Auditorium and reviewed a paper describing how geology on Mars works in dramatically different ways than on Earth.

In May, alumna Laurie Leshin (MS '89, PhD '95) assumed leadership of JPL, becoming its first female director.

In June, Carver Mead (BS '56, MS '57, PhD '60), one of the fathers of modern computing, received the 2022 Kyoto Prize for leading contributions to the establishment of the guiding principles for very large-scale integration systems design, which enables the basis for integrated computer circuits.

In October, Caltech alumnus John Clauser (BS '64) shared the 2022 Nobel Prize in Physics "for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science." The same month, Edward Stone retired as the project scientist for NASA's Voyager mission a half-century after taking on the role. Under his guidance, the Voyager probes explored the solar system's four gas-giant planets and became the first human-made objects to reach interstellar space, the region between stars containing material generated by the death of nearby stars. Also, Tracy Dennison began her term as the new Ronald and Maxine Linde Leadership Chair of the Division of the Humanities and Social Sciences.

In November, 50 years after they entered Caltech as the Institute's first Black female students, Karen Maples, MD (BS '76); Deanna Hunt (BS '76); and Lauretta Carroll (BS '77) reflected on the challenges and successes they experienced then and in the years that followed.

Throughout the year, the Institute took steps to implement new programs and bolster existing ones that underscore Caltech's guiding values, such as supporting students and postdoctoral scholars, creating a more inclusive environment, and celebrating and accounting for its history.

To create more opportunities for students and increase interdisciplinary research, Caltech created a new graduate education track that combines medical engineering and electrical engineering. To further boost interdisciplinary research and expand Caltech's prominence as a hub for mathematics, the Institute became the new home of the American Institute of Mathematics, an independent nonprofit organization funded in part by the National Science Foundation.

The Institute, which this year kicked off a partnership with the Carnegie Institution for Science, also became a charter member of SEA Change, an initiative of the American Association for the Advancement of Science that supports educational institutions as they systemically transform to improve diversity, equity, accessibility, and inclusion in science, technology, engineering, mathematics, and medicine.

The Institute expanded its Presidential Postdoctoral Fellowship, which supports efforts to diversify academia by recruiting and supporting promising postdoctoral scholars from underrepresented communities.

On campus, Caltech marked the dedication of the Grant D. Venerable House, honoring its namesake alumnus, who was the first Black undergraduate student to graduate from Caltech and an active student leader and athlete during his time on campus. It also celebrated the dedication of the Lee F. Browne Dining Hall, honoring the late Lee Franke Browne, a former Caltech employee and lecturer who dedicated his life and career to efforts that expanded students' access to STEM and who advanced human rights.

With the return of in-person events, the Institute was able to reestablish and strengthen ties to the local community through educational programs for area students, and through cultural events and lectures whose online components often reached even broader audiences across the world.

This year, the Institute celebrated the centennial of the Caltech Seismological Laboratory, marking an unparalleled century at the forefront of earthquake science and geophysics.

Caltech also celebrated the 100th anniversary of the Watson Lectures, which launched in 1922 as a way to benefit the public through education and outreach. Continuing that tradition, Caltech partnered with local schools to bring high school students to campus to see the lectures and engaged young students through other educational outreach programs, including the new Caltech Earthquake Fellows program and the Caltech Planet Finder Academy, both of which launched this year. Other programs designed to bolster science education for young students included Summer Research Connection, a program that invites high school students and teachers from Pasadena Unified School District and other nearby schools into Caltech laboratories, and the National Science Olympiad Tournament, which Caltech hosted this year for the first time and whose students played the main role in conducting the event.

For the campus community, TechFest returned to campus for the first time since the start of the COVID-19 pandemic, welcoming students with an in-person block party on Beckman Mall complete with games and fireworks.

Caltech's Public Programming was able to re-engage with the community through in-person events, including CaltechLive! events such as the performance of Nobuntu, a female a cappella quintet from Zimbabwe; and lectures from the Science Journeys, Movies that Matter and Behind the Book series that showcased such varied topics as a journey to the center of Jupiter, a discussion of the science of cooking, and how climate migration will reshape the world.

See the original post here:

2022 Year in Review - Caltech

Written by admin

December 21st, 2022 at 12:15 am

Posted in Quantum Computing

VC Fund Nemesis Technologies To Add More Liquidity By Connecting Investors With Opportunities In AI, – Crowdfund Insider

Posted: at 12:15 am



Written by admin

December 21st, 2022 at 12:15 am

Posted in Quantum Computing

The Future of Sensing and Imaging Using Quantum Microscopy – AZoOptics

Posted: at 12:15 am



Quantum theory is used in a variety of microscopy techniques. Quantum microscopy enables the measurement and imaging of tiny features of matter and quantum particles. This article provides an overview of how quantum microscopy can drive the future of sensing and imaging.


Modern research extensively uses optical microscopy and spectroscopy in various fields, from fundamental physics to chemistry, material science, and life sciences. It is fascinating to see how advances in understanding light properties have prompted new imaging applications over time.

Understanding diffraction and interference requires considering light as a wave. At the beginning of the twentieth century, the basic realization that light exists as discrete energy units called quanta sparked the first quantum revolution, which built the whole laser and photonics industry. In the second quantum revolution, quantum states that can display entanglement and superposition are used for quantum technology applications. Due to these new findings, various innovative sensing and imaging methods are now feasible.

One approach to overcoming some of the constraints of conventional imaging systems, where entanglement plays a key role, is to use the quantum features of light. The energy, momentum, and position correlations of the entangled photon pairs are particularly important. They enable imaging and spectroscopy in spectral bands where effective detection is not feasible.

By employing certain quantum states of light and their associated photon-number statistics, sensing and imaging become possible beyond classical restrictions such as the shot-noise limit. Additionally, two-photon fluorescence microscopy may be performed at very low light intensities when using quantum light, opening up new perspectives for photosensitive biological probes.
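As a rough illustration of that shot-noise limit, the following Python sketch (our own toy example, not from the article) simulates classical photon counting. Counts from classical light follow Poisson statistics, so the relative noise of a measurement falls only as one over the square root of the photon number; suitably engineered quantum states of light can beat this scaling.

# Shot noise in classical photon counting: relative noise ~ 1/sqrt(N).
import numpy as np

rng = np.random.default_rng(1)
for mean_photons in (100, 10_000, 1_000_000):
    counts = rng.poisson(mean_photons, size=10_000)  # many repeated measurements
    relative_noise = counts.std() / counts.mean()
    print(mean_photons, round(relative_noise, 5), round(1 / np.sqrt(mean_photons), 5))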

There are several ways to go beyond the traditional restrictions of sensitivity and resolution in optical microscopy, thanks to the principles of quantum optics. Imaging a biological sample has remained difficult despite using several concepts in proof-of-concept tests, primarily because of the intrinsically weak signal recorded and the fragility of quantum states of light. However, in theory, these quantum protocols may increase the capabilities of current super-resolution methods by introducing new information without erasing the conventional information.

Bright sources of entangled photons have sparked a revival in quantum optical interferometry. Quantum metrology, quantum computing logic gates, quantum lithography, quantum cryptography and quantum teleportation are some of the unique concepts related to quantum entanglement that have been implemented using optical interferometry to test the fundamentals of quantum mechanics.

In order to overcome the shot-noise limit in quantum metrology, new techniques have been developed. For example, these techniques may be employed in fiber optical gyroscopes and sensors for biological or chemical targets. Furthermore, imaging techniques like LIDAR and optical lithography may surpass the Rayleigh diffraction limit by using this entanglement.

Image scanning microscopy (ISM) is a new super-resolution technique that improves reliable resolution without lowering the signal intensity. Recently, researchers developed quantum image scanning microscopy (Q-ISM), which increases the resolution of ISM up to twofold, four times above the diffraction limit, by combining ISM with the measurement of quantum photon correlation. They developed the Q-ISM concept and used photon antibunching, a quantum phenomenon, as a resolution-enhancing contrast mechanism to produce super-resolved optical pictures of a biological material dyed with fluorescent quantum dots.

A quantum microscope platform created by University of Technology Sydney (UTS) researchers provides new techniques to examine material characteristics and physical processes.

Due to their propensity to react to electromagnetic fields or other stimuli, quantum sensors based on diamond nitrogen-vacancy centres are recognized as potentially sensitive devices for monitoring specific physical attributes. However, reliance on quantum defects housed in stiff 3D crystals like diamond has made it challenging to interact intimately with a sample when employing solid-state spin sensors as microscopy tools up to this point.


Instead of a larger crystal, this novel method takes advantage of point defects embedded inside a tiny layer of hexagonal boron nitride (hBN). As a van der Waals material, hBN comprises two-dimensional layers held together by weak forces. As a result, van der Waals sensors might make it possible to apply quantum microscopy to materials and targets that were not previously reachable.

Quantum microscopy enables the measurement and imaging of tiny features of matter and quantum particles. Due to quantum microscopy, several novel sensing and imaging techniques are now possible. The specifics covered in this article strongly imply that quantum microscopy will play a significant part in future sensing and imaging. The development of technologies like hBN-based quantum microscopes and quantum image scanning microscopy has the potential to enhance resolution significantly. Future MRI and NMR imaging of chemical processes, as well as imaging and remote sensing applications, may all be done using hBN-based quantum microscopes.


Gilaberte Basset, M., Setzpfandt, F., Steinlechner, F., Beckert, E., Pertsch, T., & Gräfe, M. (2019). Perspectives for applications of quantum imaging. Laser & Photonics Reviews. https://onlinelibrary.wiley.com/doi/pdfdirect/10.1002/lpor.201900097

Healey, A. J., Scholten, S. C., Yang, T., Scott, J. A., Abrahams, G. J., Robertson, I. O., ... & Tetienne, J. P. (2022). Quantum microscopy with van der Waals heterostructures. Nature Physics. https://www.nature.com/articles/s41567-022-01815-5

Jonathan P. Dowling and Kaushik P. Seshadreesan (2015) Quantum Optical Technologies for Metrology, Sensing, and Imaging. Journal of Lightwave Technology. https://opg.optica.org/jlt/abstract.cfm?URI=jlt-33-12-2359

Quantum microscopy prototype points to novel sensing and imaging (2022) Optics.org. Available at: https://optics.org/news/13/11/13 (Accessed: November 28, 2022)

Tenne, R., Rossman, U., Rephael, B., Israel, Y., Krupinski-Ptaszek, A., Lapkiewicz, R., ... & Oron, D. (2019). Super-resolution enhancement by quantum image scanning microscopy. Nature Photonics. https://arxiv.org/ftp/arxiv/papers/1806/1806.07661.pdf


Go here to read the rest:

The Future of Sensing and Imaging Using Quantum Microscopy - AZoOptics

Written by admin

December 21st, 2022 at 12:15 am

Posted in Quantum Computing

Quantum computing use cases are getting real–what you need to know – McKinsey

Posted: December 15, 2021 at 1:55 am



Accelerating advances in quantum computing are serving as powerful reminders that the technology is rapidly advancing toward commercial viability. In just the past few months, for example, a research center in Japan announced a breakthrough in entangling qubits (the basic unit of information in quantum computing, akin to bits in conventional computers) that could improve error correction in quantum systems and potentially make large-scale quantum computers possible. And one company in Australia has developed software that has been shown in experiments to improve the performance of any quantum-computing hardware.

As breakthroughs accelerate, investment dollars are pouring in, and quantum-computing start-ups are proliferating. Major technology companies continue to develop their quantum capabilities as well: companies such as Alibaba, Amazon, IBM, Google, and Microsoft have already launched commercial quantum-computing cloud services.

Of course, all this activity does not necessarily translate into commercial results. While quantum computing promises to help businesses solve problems that are beyond the reach and speed of conventional high-performance computers, use cases are largely experimental and hypothetical at this early stage. Indeed, experts are still debating the most foundational topics for the field (for more on these open questions, see sidebar, Debates in quantum computing).

Still, the activity suggests that chief information officers and other leaders who have been keeping an eye out for quantum-computing news can no longer be mere bystanders. Leaders should start to formulate their quantum-computing strategies, especially in industries, such as pharmaceuticals, that may reap the early benefits of commercial quantum computing. Change may come as early as 2030, as several companies predict they will launch usable quantum systems by that time.

To help leaders start planning, we conducted extensive research and interviewed 47 experts around the globe about quantum hardware, software, and applications; the emerging quantum-computing ecosystem; possible business use cases; and the most important drivers of the quantum-computing market. In the report Quantum computing: An emerging ecosystem and industry use cases, we discuss the evolution of the quantum-computing industry and dive into the technology's possible commercial uses in pharmaceuticals, chemicals, automotive, and finance (fields that may derive significant value from quantum computing in the near term). We then outline a path forward and how industry decision makers can start their efforts in quantum computing.

An ecosystem that can sustain a quantum-computing industry has begun to unfold. Our research indicates that the value at stake for quantum-computing players is nearly $80 billion (not to be confused with the value that quantum-computing use cases could generate).

Because quantum computing is still a young field, the majority of funding for basic research in the area still comes from public sources (Exhibit 1).


However, private funding is increasing rapidly. In 2021 alone, announced investments in quantum-computing start-ups have surpassed $1.7 billion, more than double the amount raised in 2020 (Exhibit 2). We expect private funding to continue increasing significantly as quantum-computing commercialization gains traction.


Hardware is a significant bottleneck in the ecosystem. The challenge is both technical and structural. First, there is the matter of scaling the number of qubits in a quantum computer while achieving a sufficient level of qubit quality. Hardware also has a high barrier to entry because it requires a rare combination of capital, experience in experimental and theoretical quantum physics, and deep knowledge, especially domain knowledge of the relevant options for implementation.

Multiple quantum-computing hardware platforms are under development. The most important milestone will be the achievement of fully error-corrected, fault-tolerant quantum computing, without which a quantum computer cannot provide exact, mathematically accurate results (Exhibit 3).


Experts disagree on whether quantum computers can create significant business value before they are fully fault tolerant. However, many say that imperfect fault tolerance does not necessarily make quantum-computing systems unusable.

When might we reach fault tolerance? Most hardware players are hesitant to reveal their development road maps, but a few have publicly shared their plans. Five manufacturers have announced plans to have fault-tolerant quantum-computing hardware by 2030. If this timeline holds, the industry will likely establish a clear quantum advantage for many use cases by then.

The number of software-focused start-ups is increasing faster than any other segment of the quantum-computing value chain. In software, industry participants currently offer customized services and aim to develop turnkey services when the industry is more mature. As quantum-computing software continues to develop, organizations will be able to upgrade their software tools and eventually use fully quantum tools. In the meantime, quantum computing requires a new programming paradigm and software stack. To build communities of developers around their offerings, the larger industry participants often provide their software-development kits free of charge.

In the end, cloud-based quantum-computing services may become the most valuable part of the ecosystem and can create outsize rewards for those who control them. Most providers of cloud-computing services now offer access to quantum computers on their platforms, which allows potential users to experiment with the technology. Since personal or mobile quantum computing is unlikely this decade, the cloud may be the main way for early users to experience the technology until the larger ecosystem matures.

Most known use cases fit into four archetypes: quantum simulation, quantum linear algebra for AI and machine learning, quantum optimization and search, and quantum factorization. We describe these fully in the report, as well as outline questions leaders should consider as they evaluate potential use cases.

We focus on potential use cases in a few industries that research suggests could reap the greatest short-term benefits from the technology: pharmaceuticals, chemicals, automotive, and finance. Collectively (and conservatively), the value at stake for these industries could be between roughly $300 billion and $700 billion (Exhibit 4).


Quantum computing has the potential to revolutionize the research and development of molecular structures in the biopharmaceuticals industry as well as provide value in production and further down the value chain. In R&D, for example, new drugs cost an average of $2 billion and take more than ten years to reach the market after discovery. Quantum computing could make R&D dramatically faster and more targeted and precise by making target identification, drug design, and toxicity testing less dependent on trial and error and therefore more efficient. A faster R&D timeline could get products to the right patients more quickly and more efficiently; in short, it would improve more patients' quality of life. Production, logistics, and supply chain could also benefit from quantum computing. While it is difficult to estimate how much revenue or patient impact such advances could create, in a $1.5 trillion industry with average margins in earnings before interest and taxes (EBIT) of 16 percent (by our calculations), even a 1 to 5 percent revenue increase would result in $15 billion to $75 billion of additional revenues and $2 billion to $12 billion in EBIT.

Quantum computing can improve R&D, production, and supply-chain optimization in chemicals. Consider that quantum computing can be used in production to improve catalyst designs. New and improved catalysts, for example, could enable energy savings on existing production processes (a single catalyst can produce up to 15 percent in efficiency gains), and innovative catalysts may enable the replacement of petrochemicals by more sustainable feedstock or the breakdown of carbon for CO2 usage. In the context of the chemicals industry, which spends $800 billion on production every year (half of which relies on catalysis), a realistic 5 to 10 percent efficiency gain would mean a gain of $20 billion to $40 billion in value.

The automotive industry can benefit from quantum computing in its R&D, product design, supply-chain management, production, and mobility and traffic management. The technology could, for example, be applied to decrease manufacturing process-related costs and shorten cycle times by optimizing elements such as path planning in complex multirobot processes (the path a robot follows to complete a task), including welding, gluing, and painting. Even a 2 to 5 percent productivity gain, in the context of an industry that spends $500 billion per year on manufacturing costs, would create $10 billion to $25 billion of value per year.

Finally, quantum-computing use cases in finance are a bit further in the future, and the advantages of possible short-term uses are speculative. However, we believe that the most promising use cases of quantum computing in finance are in portfolio and risk management. For example, efficiently quantum-optimized loan portfolios that focus on collateral could allow lenders to improve their offerings, possibly lowering interest rates and freeing up capital. It is early, and complicated, to estimate the value potential of quantum computing-enhanced collateral management, but as of 2021, the global lending market stands at $6.9 trillion, which suggests significant potential impact from quantum optimization.
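The ranges quoted in the four industry paragraphs above follow from straightforward arithmetic on the figures cited; this short Python sketch reproduces them:

# Pharma: $1.5T revenue, 16% EBIT margin, 1-5% revenue uplift
pharma = 1.5e12
print(0.01 * pharma, 0.05 * pharma)                  # $15B-$75B extra revenue
print(0.16 * 0.01 * pharma, 0.16 * 0.05 * pharma)    # ~$2.4B-$12B extra EBIT

# Chemicals: half of the $800B production spend relies on catalysis, 5-10% gain
catalysis = 800e9 * 0.5
print(0.05 * catalysis, 0.10 * catalysis)            # $20B-$40B

# Automotive: $500B annual manufacturing costs, 2-5% productivity gain
auto = 500e9
print(0.02 * auto, 0.05 * auto)                      # $10B-$25B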

In the meantime, business leaders in every sector should prepare for the maturation of quantum computing.

Until about 2030, we believe that quantum-computing use cases will have a hybrid operating model that is a cross between quantum and conventional high-performance computing. For example, conventional high-performance computers may benefit from quantum-inspired algorithms.

Beyond 2030, intense ongoing research by private companies and public institutions will remain vital to improve quantum hardware and enable more, and more complex, use cases. Six key factors (funding, accessibility, standardization, industry consortia, talent, and digital infrastructure) will determine the technology's path to commercialization.

Leaders outside the quantum-computing industry can take five concrete steps to prepare for the maturation of quantum computing:

Leaders in every industry have an uncommon opportunity to stay alert to a generation-defining technology. Strategic insights and soaring business value could be the prize.

Excerpt from:

Quantum computing use cases are getting real--what you need to know - McKinsey

Written by admin

December 15th, 2021 at 1:55 am

Posted in Quantum Computing

Running the international quantum race – Axios

Posted: at 1:55 am



The race for quantum supremacy isn't just between tech companies, but between nation-states as well.

Why it matters: The first country to produce effective, working quantum computers will have a key advantage in economics, defense and cybersecurity, and the U.S., China, and Europe are all competing.

What's happening: Last month, the Commerce Department added a dozen Chinese companies to a trade blacklist in an effort to prevent emerging U.S. technologies from being used for quantum computing efforts that would boost Beijing's military.

The big picture: One of the clearest uses of quantum computing is to eventually break the complex mathematical problems used to encrypt information of all kinds on the internet, including sensitive government data.

Between the lines: While U.S. companies generally have the lead on building better quantum computers, China has invested massively in the industry, including an $11 billion national laboratory for quantum information sciences.

What to watch: Progress on American efforts to develop post-quantum cryptography standards that would resist more powerful quantum computers, as well as research from the five new quantum institutes created by the White House last year.

The bottom line: "The economy for the next hundred years will be driven by quantum," says IonQ CEO Peter Chapman. "So it's not a game we want to lose."

View original post here:

Running the international quantum race - Axios

Written by admin

December 15th, 2021 at 1:55 am

Posted in Quantum Computing

In the race for the best quantum computer there are two sides, and they are not the United States and China: they are the two most advanced qubit…

Posted: at 1:55 am



The race to lead in quantum computing is not a contest between just two players. Currently the United States and China have the upper hand, but other countries, including Germany, France and the United Kingdom, are also making important contributions with a very clear purpose: to build a solid technological foundation in this discipline.

In the medium term, if quantum computing continues to develop as it has over the last decade and gradually overcomes the challenges that remain to be solved, it will make a difference not only in scientific research but also in telecommunications, economics and the very sensitive field of cryptography, among other areas that are critical for many nations.

Quantum computing will make a difference not only in scientific research, but also in telecommunications, economics and the very sensitive field of cryptography

The countries mentioned in the first paragraph of this article, and some others, have embarked on a long-distance race to avoid falling behind. But beyond this international contest there is a strictly technical struggle that is going relatively unnoticed outside the scientific community.

The interesting thing is that in this context the leading role is played not by the countries seeking to lead in quantum computing, but by the most advanced technologies being used to build qubits. And having several options on the table in a field with so much left to do is great news: far from being a problem, any innovation that allows us to develop more and better qubits is welcome.

Having quantum computers with many qubits is crucial. It matters not only because increasing the number of qubits makes it possible to carry out many more calculations simultaneously, but also because making these machines capable of correcting their own errors requires more qubits. Many more.
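A back-of-the-envelope calculation shows why error correction inflates qubit counts so dramatically. A common rule of thumb for the surface code is roughly 2·d² physical qubits per logical qubit at code distance d; the distance and logical-qubit count below are illustrative assumptions, not figures for any real machine.

```python
# Rough surface-code overhead: ~2 * d^2 physical qubits per logical
# qubit at code distance d (a common textbook approximation).
code_distance = 25        # assumed distance for a low enough logical error rate
logical_qubits = 1_000    # assumed logical qubits for a serious workload

per_logical = 2 * code_distance ** 2
total = logical_qubits * per_logical
print(f"{per_logical} physical qubits per logical qubit")
print(f"{total:,} physical qubits in total")   # 1,250,000
```

Even these made-up but plausible numbers land in the millions of physical qubits, which is consistent with the estimates Ignacio Cirac gives below.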

Eagle, the most advanced quantum processor to date, is from IBM and has no fewer than 127 qubits

The most advanced quantum processor developed to date, known as Eagle, was unveiled by IBM in mid-November and has 127 qubits. The company plans to have a 433-qubit quantum chip ready in 2022, and one with no fewer than 1,121 qubits in 2023. If this progression holds, whether it comes from IBM or any other company, the first quantum processors with more than a million qubits will arrive within a few years, and at that moment quantum computers will reach a tipping point.
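Taking IBM's announced roadmap at face value, the implied growth rate and the time to a million qubits take only a few lines of arithmetic. This is a naive constant-growth extrapolation for illustration, not a forecast.

```python
import math

# IBM roadmap points cited above: 127 qubits (2021) -> 1,121 (2023).
growth = (1121 / 127) ** (1 / 2)          # implied ~3x per year
years = math.log(1_000_000 / 1121) / math.log(growth)
print(f"~{growth:.1f}x per year; ~{years:.0f} more years "
      f"from 2023 to reach a million qubits")
```

That crude curve says "about six more years", but only if nothing about the scaling gets harder, and, as the rest of this article explains, plenty does.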

In a conversation we had last June with Ignacio Cirac, a Spanish scientist unanimously considered one of the founding fathers of quantum computing, he explained how many qubits a quantum processor must have to be capable of solving truly significant problems and implementing the long-awaited error correction:

The number of qubits will depend on the type of problems we want to solve with quantum computers. To tackle symbolic problems we will need to have several million qubits. Probably even hundreds of millions of qubits. Right now we are talking about a hundred qubits, so there is a long way to go. There are people who say that 100,000 qubits may solve a specific problem, but it really takes a lot of qubits.

There is no doubt that much remains to be done. Very much. But researchers are on it, and they are not following a single path. Currently two technologies are proving to have enormous potential, not only because they allow the number of qubits in quantum processors to be increased, but also because they are allowing researchers to fine-tune higher-quality qubits.

The higher a qubit's quality, the greater its ability to resist quantum decoherence, the phenomenon that appears when the quantum effects that give these computers their insurmountable advantage over classical supercomputers vanish. This is why it is crucial not only to have processors with more qubits, but also to develop higher-quality qubits.

Intel is working to increase the scalability of its quantum processors by applying the knowledge the company has accumulated over decades of manufacturing CMOS devices.

The technology that companies such as IBM, Google and Intel, among others, are using to manufacture their quantum processors relies on superconducting qubits, which operate at a temperature of about 20 millikelvins, approximately -273 degrees Celsius. It is imperative that they run at such an astonishingly low temperature and with the highest possible degree of isolation from the environment.

IBM aims to have a quantum processor with 1,121 qubits by 2023, and its competitors are likely to be on its heels.

This minimal energy level allows them to extend the time during which the system's quantum states are maintained and, at the same time, postpone the moment at which quantum decoherence appears. Quantum states persist only for a limited period, and that period is precisely the window available to carry out quantum logic operations with the computer's qubits.
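The practical consequence can be put in numbers: a qubit's rough gate budget is its coherence time divided by its gate duration. The figures below are ballpark values from the literature, not measurements of any specific machine, but they show why trapped ions can stay coherent through more operations even though their gates are slower.

```python
# Rough gate budget ~ coherence time / gate duration.
# Ballpark literature figures; real devices vary widely.
platforms = {
    "superconducting": {"coherence_s": 100e-6, "gate_s": 50e-9},
    "trapped ion":     {"coherence_s": 1.0,    "gate_s": 10e-6},
}
for name, p in platforms.items():
    budget = p["coherence_s"] / p["gate_s"]
    print(f"{name}: ~{budget:,.0f} gates before decoherence dominates")
```

This single ratio is the quantitative core of the superconducting-versus-ion-trap comparison that the rest of the article develops.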

One of the greatest successes of superconducting qubits is precisely how quickly they are allowing the number of quantum bits to be scaled. As we've seen, IBM aims to have a 1,121-qubit quantum processor by 2023, and the quantum chips being developed by Intel, Google and China will likely follow a similar trajectory.

Intel announced just a few days ago that it is working to increase the scalability of its quantum processors by applying to their manufacture all the knowledge the company has accumulated over decades of producing CMOS devices. The background that both Intel and IBM have in semiconductor production works in their favor: all that knowledge is proving very useful in the progressive refinement of their superconducting qubits.

Ion traps are the path that IonQ and Honeywell are taking, and they seem to be getting good results. In this article we are not going to delve into how this technology works so as not to overcomplicate things (we can do so in another report if you are interested in this topic and confirm it in the comments), but it is worth sketching, very broadly, how ion-trap qubits differ from superconducting ones.

Ion trap qubits use positively charged ionized atoms, keeping them confined and isolated in an electromagnetic field

These qubits use ionized atoms, which therefore have a net electrical charge that allows them to be isolated and confined within an electromagnetic field. This is the starting point of the technology; from here, the strategies that IonQ and Honeywell, the two companies traveling this path with the most impetus, use to manipulate these ionized atoms and carry out logic operations with them differ slightly.

The ion-trap qubits that IonQ and Honeywell are using are more robust than superconducting qubits, allowing them to stave off quantum decoherence for longer.

IonQ acts on the quantum state of its ion-trap qubits by cooling them to reduce computational noise and then manipulating them with lasers. It does not use a single laser: it uses one for each ion, plus a global laser that acts on all of them simultaneously. Honeywell also uses ionized atoms and lasers, but the procedure it follows to entangle two ions and act on them with a laser differs from IonQ's.

In any case, the most interesting thing is that both Honeywell and IonQ assert that their ion-trap qubits are more robust than the superconducting qubits used by their competitors. As we saw a few lines above, this means they preserve the stability of a quantum state for longer, which, according to these companies, allows them to carry out more operations with their qubits before quantum decoherence appears.

Although, as we have seen, superconducting qubits and those using ion traps are currently the most highly developed options, they are not the only technologies within our grasp. Many research groups are working in this area, and some promising lines of research propose ideas different from the two we have just examined.

There are experts in Spain who work on quantum computing with molecules: they implant ions in macromolecules, store information in them, and can perform small calculations. It is a singular line of research, both in Europe and worldwide, that could be strengthened. Cirac also weighed for us the greater robustness of ion-trap qubits against superconducting qubits. And, incidentally, he told us about another very promising technology, a reminder that, fortunately, several attractive lines of research are open that seek to develop more robust and stable qubits:

Superconducting qubits will probably help us to have more qubits, but we think they will have more errors than ion qubits. There is also a third technology, neutral atoms, on which several research groups are working and which is managing to gather more qubits while maintaining the accuracy and low error rates of the other systems. I hope that very soon we will be able to develop more advanced technologies that surpass those we have today.

Images | Honeywell | IonQ | Intel


Read the original here:

In the race for the best quantum computer there are two sides, and they are not the United States and China: they are the two most advanced qubit...

Written by admin

December 15th, 2021 at 1:55 am

Posted in Quantum Computing

