
Archive for the ‘Quantum Computing’ Category

Chevron invests in quantum computing development for oil and gas market – WorldOil

Posted: March 9, 2024 at 2:40 am


without comments

(WO) Oxford Quantum Circuits (OQC) announced that Chevron Technology Ventures, part of Chevron Corporation, has joined its $100m Series B funding round.

Quantum computing in the oil and gas market is expected to grow at a CAGR of 37.9%, owing to the increasing demand for efficient optimization and simulation across the sector. Chevron's investment marks a significant move by a supermajor into the rapidly evolving field of quantum computing.

"OQC's development of the quantum computer has the potential to change the information processing landscape by merging the bounds of engineering and physics," said Jim Gable, Vice President, Innovation and President of Technology Ventures at Chevron. "This is the latest investment from our Core Energy Fund, which focuses on high-tech, high-growth startups and breakthrough technologies that could improve Chevron's core oil and gas business performance as well as create new opportunities for growth."

A quantum future for oil and gas. OQC's technology offers several potentially groundbreaking opportunities for the oil and gas sector, including the development and optimization of catalysts and improvements to the efficiency of transportation and distribution networks. Quantum computing is anticipated to accelerate the industry's discovery and development of new materials, through the simulation of complex molecules, in support of lower-carbon products.

To realize this future, the oil and gas industry requires secure, accessible and powerful quantum computing that is integrated with existing high-performance computing. Prior to the launch of OQC Toshiko, quantum computers were only available in labs, making secure access for companies and integration with existing high-performance computing the largest barriers to wider business adoption of this groundbreaking technology.

Commenting on the news, Ilana Wisby, Chief Executive Officer at OQC, said, "Chevron's investment marks a significant milestone in harnessing quantum computing for the energy sector. We're excited to drive innovation and efficiency in exploration and renewables and pioneer enterprise-ready quantum in the energy sector."

Read the original post:

Chevron invests in quantum computing development for oil and gas market - WorldOil

Written by admin

March 9th, 2024 at 2:40 am

Posted in Quantum Computing


Why the QPU Is the Next GPU – Built In

Posted: at 2:40 am


without comments

The computational demands of various sectors, such as drug discovery, materials science, and AI, are skyrocketing. Graphics processing units (GPUs) have been at the forefront of this journey, serving as the backbone for tasks demanding high parallel processing capabilities. Their integration into data centers has marked a significant advancement in computational technology.

As we push the boundaries of what's computationally possible, however, the limitations of GPUs become apparent, especially when facing problems that classical computing struggles to solve efficiently. Enter the quantum processing unit (QPU), a technology that promises not just to complement but potentially transcend the capabilities of GPUs, heralding a new era in computational science.

A quantum processing unit, or QPU, uses qubits and quantum circuit model architecture to solve problems that are too computationally intensive for classical computing. Its potential is analogous to the transformational impact the GPU had on computing in the 2000s.

More From Yuval Boger: What Role Will Open-Source Development Play in Quantum Computing?

The binary system is at the core of classical computing, with bits that exist in one of two states: zero or one. Through logic gates within the von Neumann architecture (an architecture that includes a CPU, memory, I/O, and data bus), this binary processing has propelled technological progress for decades. GPUs, enhancing this system, offer parallel processing by managing thousands of threads simultaneously, significantly outpacing traditional CPUs for specific tasks.
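To make the contrast concrete, here is a minimal sketch (ours, not from the article) of the data-parallel style GPUs exploit, with NumPy's vectorized arithmetic standing in for the thousands of threads a GPU would run:

```python
# A toy contrast between sequential and data-parallel execution. The
# vectorized multiply is the pattern a GPU distributes across thousands of
# threads; NumPy here is only a stand-in to illustrate the idea.
import numpy as np

a = np.random.rand(1_000_000)
b = np.random.rand(1_000_000)

# Sequential: one element at a time, the way a single scalar core works.
out_loop = [x * y for x, y in zip(a, b)]

# Data-parallel: one operation applied to every element at once.
out_vec = a * b

assert np.allclose(out_loop, out_vec)
```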

Despite their prowess, GPUs are still bound by the linear progression of classical algorithms and the binary limitation of bits, making some complex problems inefficient and energy-intensive to solve. A key reason for this linear progression limitation is that a classical algorithm can only process one possible solution at a time.
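A small sketch of that limitation (again ours, not the article's): a classical brute-force search must check candidates one at a time, needing up to N oracle queries, whereas Grover's quantum algorithm needs only on the order of the square root of N.

```python
# Classical brute-force search: candidates are examined sequentially, so the
# worst case costs N oracle queries. Grover's quantum algorithm, by contrast,
# finds a marked item in roughly sqrt(N) queries.
import math

def classical_search(oracle, n_candidates):
    for candidate in range(n_candidates):
        if oracle(candidate):
            return candidate
    return None

N = 1_000_000
target = 837_291  # arbitrary example value
found = classical_search(lambda x: x == target, N)
print(f"found {found} after up to {N:,} classical queries")
print(f"Grover's algorithm: roughly {int(math.sqrt(N)):,} queries")
```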

The integration of GPUs into data centers began in the late 1990s and early 2000s, initially focused on graphics rendering. NVIDIA's GeForce 256, released in 1999 and billed as the world's first GPU, marked a significant shift towards GPUs as programmable units rather than merely graphics accelerators. Their general-purpose computing potential was realized in the mid-2000s with NVIDIA's introduction of CUDA in 2006, enabling GPUs to handle computational tasks beyond graphics, such as simulations and financial modeling.

The democratization of GPU computing spurred its adoption for scientific computing and AI, particularly benefiting from GPUs' parallel processing capabilities. This led to wider use in research and high-performance computing, driving significant advancements in GPU architecture.

By the early 2010s, the demand for big data processing and AI applications accelerated GPU adoption in cloud services. This period also saw the rise of specialized AI data centers optimized for GPU clusters, enhancing the training of complex neural networks.

The 2020s have seen continued growth in GPU demand, driven by deep learning applications in natural language processing, computer vision, and speech recognition. Modern deep learning frameworks and the introduction of specialized AI accelerators, such as Google's TPU and NVIDIA's Tensor Core GPUs, underscore the critical role of GPUs in AI development and the evolving landscape of computational hardware in data centers.

Despite these developments, GPUs did not displace traditional CPUs. Rather, they ran side by side. We saw the rise of heterogeneous computing: the increasingly popular integration of GPUs with CPUs and other specialized hardware within a single system. This allows different processors to handle tasks best suited to their strengths, leading to improved overall efficiency and performance.

Quantum computing introduces a transformative approach to computing with the concept of qubits. Unlike classical bits, qubits can exist in a state of superposition, embodying both zero and one simultaneously. This characteristic, along with quantum entanglement, enables quantum computers to process information on a scale that classical machines can't match. Quantum gates manipulate these qubits, facilitating parallel processing across exponentially larger data sets.

Quantum gates are the fundamental building blocks of quantum circuits, analogous to logic gates in classical computing, but designed for operations on qubits instead of classical bits. Quantum gates manipulate the state of qubits according to the principles of quantum mechanics, enabling the execution of quantum algorithms. Some quantum gates operate only on a single qubit, whereas others operate on two or more qubits. Multi-qubit gates are critical to exploiting the entanglement and superposition properties of quantum computing.
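As a concrete illustration, here is a minimal NumPy sketch (not from the article) of both gate types just described: a single-qubit Hadamard gate creating superposition, followed by a two-qubit CNOT gate creating entanglement.

```python
# A minimal sketch of quantum gates acting on a two-qubit state vector.
# Qubit 0 is taken as the most significant bit of the index.
import numpy as np

state = np.zeros(4)
state[0] = 1.0                        # start in |00>

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)  # Hadamard: |0> -> (|0>+|1>)/sqrt(2)
I = np.eye(2)

# Apply H to qubit 0 only, via the tensor (Kronecker) product with identity.
state = np.kron(H, I) @ state

# CNOT: flip qubit 1 when qubit 0 is |1>; this is what entangles the pair.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
state = CNOT @ state

# Result is the Bell state (|00> + |11>)/sqrt(2): measuring one qubit
# instantly fixes the other, the correlation called entanglement.
print(state)  # [0.7071 0.     0.     0.7071]
```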

The quantum computing field is grappling with challenges like qubit stability and effective quantum error correction, however, which are crucial for achieving scalable quantum computing. Qubits are inherently fragile and can be affected by a variety of environmental conditions. Therefore, maintaining a stable qubit state is challenging, and researchers still must develop special techniques to detect and correct unwanted changes in the qubit state.

QPU technology is poised to revolutionize areas where classical computing reaches its limits. In drug discovery, for instance, QPUs could simulate molecular interactions at scales never before possible, expediting the creation of new therapeutics. Materials science could benefit from the design of novel materials with tailored properties. In finance, QPUs could enhance complex model optimizations and risk analysis. In AI, they could lead to algorithms that learn more efficiently from less data. QPUs could thus tackle problems that CPUs and GPUs never will, opening new frontiers of discovery and innovation.

Although GPUs have revolutionized data center operations, they also bring formidable challenges. The voracious GPU appetite for power generates significant heat, which demands sophisticated and often expensive cooling systems to maintain optimal performance levels. This not only increases the operational costs but also raises environmental concerns due to the high energy consumption required for both running the units and cooling them.

In addition to these physical constraints, the technological landscape in which GPUs operate is rapidly evolving. The constant need for updates and upgrades to accommodate new software demands and improve processing capabilities presents substantial logistical and financial hurdles. This strains resources and complicates long-term planning for data center infrastructure.

QPUs promise to address many of these challenges. QPUs perform computations in ways fundamentally different from classical systems. Specifically, the intrinsic ability of qubits to exist in multiple states simultaneously allows QPUs to tackle complex problems more effectively, reducing the need for constant hardware upgrades. This promises not only a leap in computational power but also a move towards more sustainable and cost-effective computing solutions, directly addressing the critical limitations faced by GPUs in today's data centers.

The journey toward QPU adoption in computational infrastructures is laden with hurdles, though. Achieving stable, large-scale quantum systems and ensuring reliable computations through quantum error correction are paramount challenges. Some types of quantum computers require special cooling and environmental conditions that are uncommon in data centers and thus require adaptation.

Additionally, the quantum software development field is in its infancy, necessitating the creation of new programming tools and languages. To make use of the quantum properties of QPUs, just translating classical algorithms is insufficient. Instead, we will need to invent new types of algorithms. Just like GPUs allow us to leverage parallel processing, QPUs allow us to execute code differently. Despite these obstacles, ongoing research and development are gradually paving the way for QPUs to play a central role in future computational tasks.

Today, QPU integration into broader computational infrastructures and their practical application in industry and research is still in the nascent stages. The development and commercial availability of quantum computers is growing, with several companies and research institutions demonstrating quantum advantage and offering cloud-based quantum computing services.

How close are QPUs to taking a prime position next to GPUs? In other words, if we were to compare the development of QPUs with the historical development of GPUs, what year would we be in now?

Drawing a parallel with the GPU timeline, the current stage of QPU integration closely mirrors the GPU landscape in the mid-2000s, when GPUs became general-purpose computing machines that were adopted for niche applications.

Given these considerations, the current stage of QPU integration might be analogous to the GPU industry around 2006-2007. That was a time of pivotal change, where the foundational technologies and programming models that would enable widespread adoption were just being established. For QPUs, the development of quantum algorithms, error correction techniques, and qubit coherence are akin to the early challenges faced by GPUs in transitioning to general-purpose computing.

More on Quantum Computing: Are You Prepared for the Quantum Revolution?

In summary, although GPUs continue to play a critical role in advancing computational capacities, the integration of QPUs into data centers holds the promise of overcoming the operational and environmental challenges posed by current technologies. With their potential for lower power consumption, reduced heat output, and diminished need for frequent upgrades, QPUs represent a hopeful horizon in the quest for more efficient, sustainable, and powerful computing solutions. QPUs won't replace GPUs, just like GPUs did not eliminate classical CPUs. Instead, the data center of the future will include all three computing methods.

Original post:

Why the QPU Is the Next GPU - Built In

Written by admin

March 9th, 2024 at 2:40 am

Posted in Quantum Computing


What is quantum computing good for? XPRIZE and Google offer cash for answers – Network World

Posted: at 2:40 am


without comments

The sponsors of a new $5 million prize want to boost the quantum computing industry by encouraging developers to write new algorithms to help the emerging technology solve real-world problems.

The new Quantum for Real-World Impact contest, from the XPRIZE Foundation, aims to speed the development of quantum computing algorithms focused on sustainability, health, and other societal issues. The three-year contest, sponsored by Google Quantum AI and the Geneva Science and Diplomacy Anticipator Foundation, wants to unleash the potential of quantum computing, according to the contest site.

"Currently, quantum computers are not sufficiently advanced to solve real-world societal problems that classical computers cannot," the contest site says. However, as the technology advances, relatively few companies and university researchers are focused on translating quantum algorithms into real-world application scenarios and assessing their feasibility for addressing global challenges once sufficiently powerful hardware is available.

The new contest is crucial for the advancement of quantum computing, said Rebecca Krauthamer, co-founder and chief product officer at QuSecure, a vendor of quantum-resilient cybersecurity tools.

"XPRIZE has a powerful history of pushing forward advancements in cutting-edge technology in spaceflight, conservation, advanced medicine, and more," she said. "The contest signifies we're in a truly exciting time for quantum computing."

Quantum computing hardware development still has a significant road ahead, she added, but much of the innovation from the technology will come from new algorithms and the application of quantum computers to real-world problems.

The contest recognizes the great potential of quantum computing for both commercial and societal gain, she added.

Contestants can write new algorithms that solve new problems using quantum computing, show how existing algorithms can be applied to previously unknown applications of quantum computing, or demonstrate ways to reduce the computing resources a quantum computer needs to run already established algorithms or applications.


The contest is a good starting point for quantum computing in business models, said Jim Ingraham, vice president of strategic research at EPB of Chattanooga, a power and telecommunications company that launched a quantum-powered network in late 2022. "Commercialization is the next essential step for bringing quantum technologies out of the lab and into the real world," he said.

The EPB Quantum Network was another step forward, he added. "The network provides access to the necessary proving ground for quantum technologists to show investment worthiness and commercial viability," he said. "This is a necessary step to help companies, government agencies and researchers accelerate the development of their technologies."

The contest may assist companies that haven't found a way to profit from quantum computing innovation, added Lawrence Gasman, founder and president of Inside Quantum Technology, a quantum research firm.

"It may bring in firms that could otherwise not survive," he said. "This implies that the use of money is carefully vetted and only goes to firms that can make money in the short-to-medium term."

While quantum computing is not yet mainstream, that day is coming, said QuSecure's Krauthamer.

"When you see a news headline stating that quantum computers have been used to solve a problem that you recognize, something like enhancing battery technology, or optimizing financial portfolios, or improving greenhouse emissions, that's when you'll know that quantum computing has gone mainstream," she said. "We will begin seeing these headlines more in the next couple of years."

View original post here:

What is quantum computing good for? XPRIZE and Google offer cash for answers - Network World

Written by admin

March 9th, 2024 at 2:40 am

Posted in Quantum Computing


3 Quantum Computing Stocks to Buy for Real-World Breakthrough – InvestorPlace

Posted: at 2:40 am


without comments

The quantum computing industry is experiencing significant growth, with advancements in both hardware and software making it a key consideration for organizations looking to invest in cutting-edge technology. To this end, we look at some of the top quantum computing stocks to buy as businesses utilize this next-gen technology across various industries.

Major tech players are increasingly interested in making significant investments in quantum computing to keep pace with rapid technological advancement and to meet current customer demand for innovative computational solutions.

Drawing on data from the quantum market and insights from industry thought leaders gathered in the fourth quarter of 2023, the recent State of Quantum 2024 report noted the transition from theoretical exploration to practical application, highlighted by the emergence of full-stack quantum computer deliveries in national labs and quantum centers.

In 2022, venture investments in quantum technology soared to over $2 billion amid strong investor confidence in this burgeoning field. However, by 2023, these investments saw a sharp 50% drop, sparking debates about a potential quantum winter.

Industry experts argue the decline reflects broader venture capital trends and not a loss of faith in the quantum sector's prospects. Government funding has increasingly filled the gap private investors left, mitigating concerns over the investment slowdown.

The bottom line is the quantum industry is still advancing, albeit at a moderate pace. This emphasizes the need for realistic expectations and a sustained commitment to research and development. Despite the recent dip in investment, the sector's insiders remain cautiously optimistic about its future. This suggests the industry is far from stagnating.

Let's take a closer look at leading quantum computing stocks to buy.

Intel (NASDAQ:INTC), the semiconductor giant, is actively pursuing a turnaround strategy to regain its leadership in the technology industry. The plan involves a significant restructuring of its operations, investment in advanced chip manufacturing technologies and a renewed focus on innovation.

Among other things, Intel is pushing hard to develop its quantum computing products. The chipmaker introduced Tunnel Falls, a quantum computing chip leveraging the company's cutting-edge manufacturing techniques.

The company has collaborated with various government and academic research entities to facilitate the testing of Tunnel Falls. According to Intel, the new chip has a 95% yield rate across the wafer and voltage uniformity.

Quantum computing isn't the core focus of Intel's strategy to reclaim its semiconductor industry leadership. However, the initiative represents a potential growth area. Success in quantum computing research could position Intel as a key player in this innovative technology domain in the future. This could make Intel one of the top quantum computing stocks to buy.

Similarly to Intel, Alphabet (NASDAQ:GOOGL, NASDAQ:GOOG) is making significant strides in quantum computing through its subsidiary, Quantum AI. Focusing on developing quantum processors and algorithms, Googles parent company aims to harness quantum technology for breakthroughs in computing power.

Alphabet recently exceeded Q4 earnings expectations with a net income of $20.69 billion and a 13% revenue increase to $86.3 billion. Its advertising revenue of $65.52 billion slightly missed analyst projections.

While fighting Microsoft (NASDAQ:MSFT) on the AI front, Google has also ventured into the quantum computing realm with its proprietary quantum computing chips, Sycamore. In a strategic move, Google spun off its quantum computing software division into a standalone startup, SandboxAQ, in March 2022.

Its dominant position in search drives Google's foray into quantum computing. It aims to develop more efficient, faster and intelligent solutions. The company plays a crucial role in managing vast volumes of digital information. It can gain immensely by enabling various organizations to harness the transformative power of quantum computing and AI.

FormFactor (NASDAQ:FORM), a leading provider in the semiconductor industry, specializes in the design, development and manufacture of advanced wafer probe cards. These probe cards are essential for the electrical testing of semiconductor wafers before cutting them into individual chips.

FormFactor is strategically positioned within the quantum computing ecosystem through its semiconductor test and measurement solutions expertise. The company provides advanced systems essential for developing and testing quantum computing chips. These systems are designed to operate at extremely low temperatures, a fundamental requirement for quantum computing experiments where qubits must be maintained in a coherent state.

Its flagship products include precision engineering solutions like the Advanced Matrix series for high-density applications and the TouchMatrix series for touchscreen panels. FormFactors products enable semiconductor manufacturers to perform reliable and accurate testing at various stages of the production process. This ensures the functionality and quality of the final semiconductor products.

Last month, FormFactor reported a modest top-line year-over-year increase of 1.3%, reaching $168.2 million. Looking ahead, expectations for the first quarter are aligned with the recent quarterly performance, with projected revenue of around $165 million.

On the date of publication, Shane Neagle did not hold (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Shane Neagle is fascinated by the ways in which technology is poised to disrupt investing. He specializes in fundamental analysis and growth investing.

More here:

3 Quantum Computing Stocks to Buy for Real-World Breakthrough - InvestorPlace

Written by admin

March 9th, 2024 at 2:40 am

Posted in Quantum Computing


Longer coherence: How the quantum computing industry is maturing – DatacenterDynamics

Posted: at 2:40 am


without comments

Quantum computing theory dates back to the 1980s, but it's really only in the last five to ten years or so that we've seen it advance to the point where it could realistically become a commercial enterprise.

Most quantum computing companies have been academic-led science ventures; companies founded by PhDs leading teams of PhDs. But, as the industry matures and companies look towards a future of manufacturing and operating quantum computers at a production-scale, the employee demographics are changing.

While R&D will always play a core part in every technology company, making quantum computers viable out in the real world means these startups are thinking about how to build, maintain, and operate SLA-bound systems in production environments.

This new phase in the industry requires companies to change mindset, technology, and staff.


At quantum computing firm Atom Computing, around 40 of the company's 70 employees have PhDs, many joining straight out of academia. This kind of academic-heavy employee demographic is commonplace across the quantum industry.

"I'd venture that over half of our company doesn't have experience working at a company previously," says Rob Hays, CEO of Atom. "So there's an interesting bridge between the academic culture versus the Silicon Valley tech startup; those are two different worlds, and trying to bridge people from one world to the other is challenging. And it's something you have to focus and work on openly and actively."

Maturing from small startups into large companies with demanding customers and shareholders is a well-trodden path for hundreds of technology companies in Silicon Valley and across the world.

And quantum computing companies are getting there: the likes of IonQ, Rigetti, and D-Wave are already listed on the Nasdaq and New York Stock Exchange, although the latter two companies have at various times had to deal with the prospect of being de-listed due to low stock prices.

Most of the quantum companies DCD spoke to for this piece are undergoing a transition from pure R&D mode to a more operational and engineering phase.

"When I first joined four years ago, the company was entirely PhDs," says Peter Chapman, IonQ CEO. "We're now in the middle of a cultural change from an academic organization and moving to an engineering organization. We've stopped hiring PhDs; most of the people we're hiring nowadays are software, mechanical, and hardware engineers. And the next phase is to a customer-focused product company."

Chapman points to the hirings of the likes of Pat Tan and Dean Kassmann, previously at Amazon's hardware-focused Lab126 and rocket firm Blue Origin, respectively, as evidence of the company moving to a more product- and engineering-focused workforce.

2023 also saw Chris Monroe, IonQ co-founder and chief scientist, leave the company to return to academia at North Carolina's Duke University.

During the earnings call announcing Monroe's departure, Chapman said: "Chris would be the first one to tell you that the physics behind what IonQ is doing is now solved. It's [now] largely an engineering problem."

Atom's Hays notes a lot of the engineering work that the company is doing to get ready for cloud services and applications is software-based, meaning the company is looking for software engineers.

"We are mostly looking for people that have worked at cloud service providers or large software companies and have an interest in either learning or already some foundational knowledge of the underlying physics and science," he says. "But we're kind of fortunate that those people self-select and find us. We have a pretty high number of software engineers who have physics undergrads and an extreme interest in quantum mechanics, even though by trade and experience they're software engineers."

On-premise quantum computers are currently rarities largely reserved for national computing labs and academic institutions. Most quantum processing unit (QPU) providers offer access to their systems via their own web portals and through public cloud providers.

But today's systems are rarely expected (or contracted) to run with the five-9s resiliency and redundancy we might expect from tried and tested silicon hardware.

"Right now, quantum systems are more like supercomputers and they're managed with a queue; they're probably not online 24 hours. Users enter jobs into a queue and get answers back as the queue executes," says Atom's Hays.

"We are approaching how we get closer to 24/7 and how we build in redundancy and failover so that if one system has come offline for maintenance, there's another one available at all times. How do we build a system, architecturally and engineering-wise, where we can do hot swaps or upgrades or changes with as little downtime as possible?"

Other providers are going through similar teething phases of how to make their systems, which are currently sensitive, temperamental, and complicated, enterprise-ready for the data centers of the world.

"I already have a firm SLA with the cloud guys around the amount of time that we do jobs on a daily basis, and the timeframes to be able to do that," says Chapman. "We are moving that SLA to 24/7 and being able to do that without having an operator present. It's not perfect, but it's getting better. In three or four years from now, you'll only need an on-call when a component dies."

Rigetti CTO David Rivas says his company is also working towards higher uptimes.

"The systems themselves are becoming more and more lights-out every quarter," he says, "as we outfit them for that kind of remote operation and ensure that the production facilities can be outfitted for that kind of operation."


Manufacturing and repair of these systems is also maturing from the first PhD-built generations of quantum computers. These will never be mass-produced, but the industry needs to move away from one-off artisanal machines to a more production-line-like approach.

"A lot of the hardware does get built with the assistance of electronics engineers, mechanical engineers," says Atom's Hays, but much is still built by experimental physicists.

IonQ's Chapman adds: "In our first-generation systems, you needed a physicist with a screwdriver to tune the machine to be able to run your application. But every generation of hardware puts more under software control."

"Everywhere a screwdriver could be turned, there's now a stepper motor under software control, and the operating system is now doing the tuning."

Simon Phillips, CTO of the UK's Oxford Quantum Circuits, says OQC is focused on how it hires staff and works with partners to roll out QPUs into colocation data centers.

"And the first part of that starts with: if we put 10 QPUs in 10 locations around the world, how do we do that without having an army of 100 quantum engineers on each installation?"

"And the first part of that starts with having a separate deployment team and a site reliability engineering team that can then run the SLA on that machine."

He adds: "Not all problems are quantum problems. It can't just be quantum engineers; it's not scalable if it's the same people doing everything."

"It's about training and understanding where the first and second lines of support sit, having a cascading system, and utilizing any smart hands so we can train people who already exist in data centers."


While the quantum startups are undergoing their own maturing process, their suppliers are also being forced to learn about the needs of commercial operators and what it means to deploy in a production data center.

For years, the supply chain, including for the dilution refrigerators that keep many quantum computers supercooled, has dealt with largely self-reliant academic customers in lab spaces.

Richard Moulds, general manager of Amazon Braket at AWS, told DCD the dilution refrigerator market is a cottage industry with few suppliers.

One of the main fridge suppliers is Oxford Instruments, an Oxford University spin-out from the late 1950s that released the first commercial dilution unit back in 1966. The other large incumbent, Bluefors, was spun out of what is now the Low Temperature Laboratory at Aalto University in Finland 15 years ago.

Prior to the quantum computing rush, the biggest change in recent years was the introduction of pulse tube technology. Instead of a cryostat inserted into a bath of liquid helium-4, quantum computers could now use a closed-loop system (aka a dry fridge/cryostat).

This meant the systems could become smaller, more efficient, more software-controlled, and more user-friendly.

"With the wet dilution fridge (or wet cryostat), you need two-floor rooms for ceiling height. You need technicians to top up helium and run liquefiers, you need to buy helium to keep topping up," says Harriet van der Vliet, product segment manager, quantum technologies, at Oxford Instruments.

"It was quite a manual process, and it would take maybe a week just to pre-cool, and that would not even be getting to base temperature."

For years, the fridges were the preserve of academics doing materials science; they were more likely to win a Nobel prize than be part of a computing contract.

"Historically, it's been a lab product. Our customers were ultra-low temperature (ULT) experts; if anything went wrong, they would fix it themselves," says van der Vliet. "Now our customers have moved from being simply academics to being commercial players who need user-friendly systems that are push-button."

While the company declined to break out numbers, Oxford said it has seen a noticeable change in the customer demographic towards commercial quantum computing customers in recent years, but also a change in buying trends. QPU companies are more likely to buy multiple fridges at once, rather than a single unit every few years for an academic research lab.

"The commercial part is growing for sure," adds David Gunnarsson, CTO at Bluefors. The company has expanded factory capacity to almost double production capabilities to meet growing demand.

"There have been more and more attempts to create revenue on quantum computing technology. They are buying our systems to actually deploy or have an application that they think they can create money from. We welcome discussion with data centers so they can understand our technology from the cryogenics perspective."

And while the industry is working towards minimizing form factors as much as possible, for the foreseeable future the industry has settled on essentially brute force supercooling with bigger fridges. Both companies have released new dilution fridges designed for quantum computers.

Smaller fridges (and lower qubit-count systems) may be able to fit into racks, but most larger qubit-count supercooled systems require a much larger footprint than traditional racks. Bluefors' largest Kide system can cool around 1,000 qubits: the system is just under three meters in height and 2.5 meters in diameter, and the floor beneath it needs to be able to take about 7,000 kilograms of weight.

"It has changed the way we do our product," says Gunnarsson. "They were lab tools before; uptime wasn't discussed much before. Now we are making a lot of changes to our product line to ensure that you can be more certain about what the uptime of your system will be."

Part of the uptime challenge suppliers face around fridges, an area where Gunnarsson notes there is still something of a mismatch, is the warm-up/cool-down cycle of the machines.

While previously the wet bath systems could take a week to get to the required temperatures, the new dry systems might only take a day or two each way. That is important, because cooling down and warming up cycles are effectively downtime; a dirty word when talking about service availability.

"The speed with which you can get to temperature is almost as important as the size of the chip that you can actually chill," says AWS' Moulds. "Today, if you want to change the device's physical silicon, you have got to warm this device up and then chill it back down again; that's a four-day cycle. That's a problem; it means machines are offline for a long time for relatively minor changes."

While this might not be an issue for in-operation machines (Rigetti CTO Rivas says its machines can be in service for months at a time, while Oxford Instruments says an OQC system was in operation non-stop for more than a year), the long warm-up/cool-down cycle is a barrier to rapid testing.

"From a production perspective, the systems remain cold for a relatively long time," says Rivas. "But we're constantly running chips through test systems as we innovate and grow capacity, and 48 hours to cool a chip down is a long time in an overall development cycle."

Oxford Instruments and Bluefors might be the incumbents, but there are a growing number of new players entering the fridge space, some specifically focusing on quantum computing.

"The market has grown for dilution fridges, so there are lots more startups in the space as well making different cooling systems," says van der Vliet. "There are many more players, but the market is growing."

"I think it's really healthy that there's loads of players in the field, particularly new players who are doing things a little bit differently to how we've always done it."

The incumbents are well-placed to continue their lead in the market, but QPU operators are hopeful that competition will result in better products.

"There will be genuine intellectual property that will emerge in this area and you'll definitely start to see custom designs and proprietary systems that can maintain temperature in the face of increasing power."

Atom's Hays notes that, for laser-based quantum systems, the lasers themselves are probably the largest constraint in the supply chain. Like the dilution fridges, these are still largely scientific technologies made by a handful of suppliers.

"We need relatively high-powered lasers that need to be very quiet and very precise," he says. "Ours are off the shelf, but they're semi-custom and the manufacturer builds to order. That means that there's long lead times; in some cases up to a year."

He adds that many of the photonic integrated circuits are still relatively small (the size of nickels and dimes) but hopes they can shrink down to semiconductor size in future to help reduce the footprint.

For now, the quantum industry is still enjoying what might be the autumn of its happy-go-lucky academic days. The next phase may well lead to quantum supremacy and a new phase in high-performance computing, but it will likely lead to a less open industry.

"I think it's nice that the industry is still sort of in that mode," says AWS' Moulds. "The industry is still taking a relatively open approach to the development. We're not yet in the mode of everybody working in their secret bunkers, building secret machines. But history shows that once there's a clear opportunity, there's a risk of the shutters coming down, and it becoming a more cut-throat industry."

"In the end, that's good for customers; it drives down costs and drives up reliability and performance. But it might feel a little bit brutal for some of the academics that are in the industry now."

Read more:

Longer coherence: How the quantum computing industry is maturing - DatacenterDynamics

Written by admin

March 9th, 2024 at 2:40 am

Posted in Quantum Computing


Quantum Attack Protection Added to HP Business PCs – SecurityWeek

Posted: at 2:40 am


without comments

HP announced on Thursday that several of its business PCs now benefit from protection against quantum computer attacks thanks to a new security chip.

The tech giant said the 5th generation of its Endpoint Security Controller (ESC) chip, which is built into some of its computers, can protect the integrity of the device's firmware using quantum-resistant cryptography.
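The article does not say which quantum-resistant algorithms HP's chip uses. As a hedged illustration of the general idea, here is a Lamport one-time signature, a classic hash-based scheme from the family often proposed for firmware signing, whose security rests only on the hash function rather than on the factoring or discrete-log problems that quantum computers threaten:

```python
# A textbook Lamport one-time signature, NOT HP's implementation: a sketch of
# how hash-based (quantum-resistant) signatures can attest firmware integrity.
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # One pair of random secrets per digest bit; the public key is their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits_of(digest: bytes):
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, message: bytes):
    # Reveal one secret from each pair, selected by the message-digest bits.
    return [sk[i][bit] for i, bit in enumerate(bits_of(H(message)))]

def verify(pk, message: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][bit]
               for i, bit in enumerate(bits_of(H(message))))

sk, pk = keygen()
firmware = b"example firmware image"
sig = sign(sk, firmware)
print(verify(pk, firmware, sig))           # True
print(verify(pk, b"tampered image", sig))  # False
```

A real deployment would use a many-time descendant of this idea, such as LMS or SPHINCS+, which standards bodies recommend for exactly this firmware-signing use case.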

According to HP, the 5th generation ESC is currently available in ZBook Firefly, Power and Studio workstations; EliteBook 1000 series, 800 series and some 600 series notebooks; and some 400 series ProBook notebooks.

"By embedding protection against quantum computer hacks at the chip level, HP is today setting a new standard in hardware and firmware security with our 5th generation ESC chip," HP said. "By isolating the chip from the processor and OS, the ESC provides a hardware platform that reduces the risk of data breaches and improves productivity by preventing downtime."

[ Read: Cyber Insights 2024: Quantum and the Cryptopocalypse ]

While practical quantum computer attacks may still be at least a decade away, major tech companies have already started taking steps to ensure that the cryptography used in their products will be able to provide protection against quantum attacks when that day comes.

Apple, for instance, recently announced adding post-quantum encryption to iMessage to protect communications against quantum computing attacks.

Governments have also started taking steps to tackle the theoretical threats posed by quantum computing before they become a reality.

HP urges businesses to immediately start planning for the future and begin migrating their fleets. The company recommends identifying the highest priority use cases, finding out what technology providers are planning in regards to quantum protections, and creating a plan to ensure protection is rolled out in the required timeframe.

Related: AI Helps Crack NIST-Recommended Post-Quantum Encryption Algorithm

Related: In Other News: WEF's Unsurprising Cybersecurity Findings, KyberSlash Cryptography Flaw

Read the original here:

Quantum Attack Protection Added to HP Business PCs - SecurityWeek

Written by admin

March 9th, 2024 at 2:40 am

Posted in Quantum Computing


Quantum Computing Takes a Giant Leap With Light-Based Processors – SciTechDaily

Posted: at 2:40 am


without comments

Researchers have developed a groundbreaking light-based processor that enhances the efficiency and scalability of quantum computing and communication. By minimizing light losses, the processor promises significant advancements in secure data transmission and sensing applications. Credit: SciTechDaily.com

A team of scientists has created a reprogrammable light-based quantum processor, reducing light losses and enabling advancements in quantum computing and secure communications.

Scientists have created a reprogrammable light-based processor, a world-first, that they say could usher in a new era of quantum computing and communication.

Technologies in these emerging fields that operate at the atomic level are already realizing big benefits for drug discovery and other small-scale applications.

In the future, large-scale quantum computers promise to be able to solve complex problems that would be impossible for today's computers.

Lead researcher Professor Alberto Peruzzo from RMIT University in Australia said the team's processor, a photonics device which uses light particles to carry information, could help enable successful quantum computations by minimizing light losses.

"Our design makes the photonic quantum computer more efficient in terms of light losses, which is critical for being able to keep the computation going," said Peruzzo, who heads the ARC Centre of Excellence for Quantum Computation and Communication Technology (CQC2T) node at RMIT.

"If you lose light, you have to restart the computation."

Other potential advances included improved data transmission capabilities for unhackable communications systems and enhanced sensing applications in environmental monitoring and healthcare, Peruzzo said.

The team's reprogrammable light-based processor. Credit: Will Wright, RMIT University

The team reprogrammed a photonics processor in a range of experiments by applying varying voltages, achieving performance equivalent to 2,500 devices. Their results and analysis are published in Nature Communications.

"This innovation could lead to a more compact and scalable platform for quantum photonic processors," Peruzzo said.

Yang Yang, lead author and RMIT PhD scholar, said the device was fully controllable, enabled fast reprogramming with reduced power consumption, and replaced the need for making many tailored devices.

"We experimentally demonstrated different physical dynamics on a single device," he said.

"It's like having a switch to control how particles behave, which is useful for both understanding the quantum world and creating new quantum technologies."

Professor Mirko Lobino from the University of Trento in Italy made the innovative photonic device, using a crystal called lithium niobate, and Professor Yogesh Joglekar from Indiana University-Purdue University Indianapolis in the United States brought his expertise in condensed matter physics.

Lithium niobate has unique optical and electro-optic properties, making it ideal for various applications in optics and photonics.

"My group was involved in the fabrication of the device, which was particularly challenging because we had to miniaturize a large number of electrodes on top of the waveguides to achieve this level of reconfigurability," Lobino said.

"Programmable photonic processors offer a new route to explore a range of phenomena in these devices that will potentially unlock incredible advancements in technology and science," Joglekar said.

Meanwhile, Peruzzos team has also developed a world-first hybrid system that combines machine learning with modeling to program photonic processors and help control the quantum devices.

Peruzzo said the control of a quantum computer was crucial to ensure the accuracy and efficiency of data processing.

"One of the biggest challenges to the device's output accuracy is noise, which describes the interference in the quantum environment that impacts how qubits perform," he said.

Qubits are the basic units of quantum computing.

"There are a whole range of industries that are developing full-scale quantum computing, but they are still fighting against the errors and inefficiencies caused by noise," Peruzzo said.

Attempts to control qubits typically relied on assumptions about what noise was and what caused it, Peruzzo said.

"Rather than make assumptions, we developed a protocol that uses machine learning to study the noise while also using modelling to predict what the system does in response to the noise," he said.
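The team's actual protocol is more involved, but a toy regression in the same "graybox" spirit shows the division of labor: a known physics model supplies the ideal response, and a learned component absorbs the unmodeled noise. Everything below (the sine model, the drift term) is an invented illustration, not the authors' system:

```python
# A toy "graybox" fit: whitebox physics model plus a learned residual model.
import numpy as np

rng = np.random.default_rng(0)

# Control parameter (think: applied voltage) and the device's true response:
# known physics plus a slow systematic drift the model does not capture.
x = np.linspace(0, 2 * np.pi, 200)
ideal = np.sin(x)                              # whitebox physics model
drift = 0.3 * x / (2 * np.pi)                  # unmodeled "noise"
measured = ideal + drift + 0.02 * rng.standard_normal(x.size)

# Blackbox part: fit a low-order polynomial to the residual the physics
# model leaves behind, standing in for the machine-learning component.
coeffs = np.polyfit(x, measured - ideal, deg=3)
graybox = ideal + np.polyval(coeffs, x)

print("physics-only RMS error:", np.sqrt(np.mean((measured - ideal) ** 2)))
print("graybox RMS error:    ", np.sqrt(np.mean((measured - graybox) ** 2)))
```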

With the use of the quantum photonic processors, Peruzzo said this hybrid method could help quantum computers perform more precisely and efficiently, impacting how we control quantum devices in the future.

"We believe our new hybrid method has the potential to become the mainstream control approach in quantum computing," Peruzzo said.

Lead author Dr. Akram Youssry, from RMIT, said the results of the newly developed approach showed significant improvement over traditional methods of modelling and control, and could be applied to other quantum devices beyond photonic processors.

"The method helped us uncover and understand aspects of our devices that are beyond the known physical models of this technology," he said.

"This will help us design even better devices in the future."

This work is published in npj Quantum Information.

Peruzzo said startup companies in quantum computing could be created around his team's photonic device design and quantum control method, which they would continue to study in terms of applications and their full potential.

"Quantum photonics is one of the most promising quantum industries, because the photonics industry and manufacturing infrastructure are very well established," he said.

Quantum machine-learning algorithms have potential advantages over other methods in certain tasks, especially when dealing with large datasets.

Imagine a world where computers work millions of times faster than they do today, where we can send information securely without any fear of it being intercepted, and where we can solve problems in seconds that would currently take years.

This isn't just fantasy; it's the potential future powered by quantum technologies, and research like ours is paving the way.

References:

"Programmable high-dimensional Hamiltonian in a photonic waveguide array" by Yang Yang, Robert J. Chapman, Ben Haylock, Francesco Lenzini, Yogesh N. Joglekar, Mirko Lobino and Alberto Peruzzo, 2 January 2024, Nature Communications. DOI: 10.1038/s41467-023-44185-z

"Experimental graybox quantum system identification and control" by Akram Youssry, Yang Yang, Robert J. Chapman, Ben Haylock, Francesco Lenzini, Mirko Lobino and Alberto Peruzzo, 13 January 2024, npj Quantum Information. DOI: 10.1038/s41534-023-00795-5

Go here to read the rest:

Quantum Computing Takes a Giant Leap With Light-Based Processors - SciTechDaily

Written by admin

March 9th, 2024 at 2:40 am

Posted in Quantum Computing


Google Is Offering $5 Million in a Quantum Computing Contest – Entrepreneur

Posted: at 2:40 am


without comments

Google, the Geneva Science and Diplomacy Anticipator (GESDA), and XPRIZE launched a competition Monday that will award $5 million over three years to teams who can find real-life applications for quantum computers.

Quantum computers process information differently from the regular, classical computers in use today, which allows them to complete certain tasks in shorter periods of time. Google researchers found in 2019 that a quantum computer took 200 seconds to complete a task that a high-performing supercomputer, which IBM estimates can have a million times more processing power than a standard laptop, would take 10,000 years to complete.

The problem that the XPRIZE competition sets out to solve is the disconnect between quantum algorithms and the real world. Applicants should be working on quantum algorithms that address sustainability and social impact.

The contest is open to anyone across the world working in any field. Winners will have submissions that "most accelerate" quantum algorithms for "positive real-world applications," according to the competition guidelines.

Applicants can submit a new quantum algorithm, a new application of an existing algorithm, or enhanced performance in the form of fewer resources to run an established algorithm. The University of Chicago, IBM, Microsoft, and Purdue University are some of the many institutions that offer courses on quantum computing.

Registration is open on the XPRIZE website.

Related: Quantum Computing Threatens Everything. Could It Be Worse Than the Apocalypse?

A cryostat from a quantum computer stands during a press tour of the Leibniz Computing Center. Photo: Sven Hoppe/dpa (picture alliance via Getty Images)

Quantum computing is a focus area for many tech giants, with McKinsey estimating a record $2.35 billion in investments in 2022. The McKinsey report further suggests that four industries are likely to see the earliest benefits of quantum computing: automotive, chemicals, financial services, and life sciences.

Related: Why This Technology Will Surge This Year and How You Can Capitalize On It

IBM CEO Arvind Krishna spoke to the Duke Fuqua School of Business last April about the benefits of quantum computing, and about how business minds were essential to determine the right use cases for the technology.

"So, you need to work on what kind of algorithms, which use case can leverage those algorithms, and the technology," Krishna told the outlet.

IBM and Google gave $150 million last year to advance quantum computing research at the University of Chicago and the University of Tokyo.

Read this article:

Google Is Offering $5 Million in a Quantum Computing Contest - Entrepreneur

Written by admin

March 9th, 2024 at 2:40 am

Posted in Quantum Computing


Google sets up $5 million competition to find out what quantum computers can really do – The Indian Express

Posted: at 2:40 am


without comments

If you have no idea what quantum computers can do, do not worry. You are in the same club as search and advertising giant Google. The company on Monday announced it is launching the three-year, $5 million XPRIZE Quantum Applications competition to solve real-world challenges with the technology.

The competition is soliciting quantum computing algorithms that can potentially be used to achieve what Google refers to as "societally beneficial goals," like the United Nations Sustainable Development Goals. This is in line with Google Quantum AI's mission to build a large-scale, error-corrected quantum computer and develop useful quantum computing applications.

The device you are reading this on, whether it is a personal computer, a smartphone or even a VR headset, is powered by classical computing. Classical computers store information in binary bits, which can take one of two values: 0 or 1. Quantum computers encode information in what is known as a quantum bit, or qubit.

Quantum computers are machines that use the properties of quantum physics to store data and perform computations. This can be extremely advantageous for certain tasks where they could vastly outperform even our best supercomputers.

Classical computers, which include smartphones and laptops, encode information in binary bits that can either be 0s or 1s. In a quantum computer, the basic unit of memory is a quantum bit or qubit. Just like a classical computing bit, a qubit can have two distinct states and these can be used to represent either a 0 or a 1. But unlike a classical bit which can only exist in one of these states, a qubit can exist in superposition states or even be entangled with other quantum bits.

This, in theory, makes qubits much more powerful than classical bits, therefore making quantum computers much more powerful than classical computers, depending on the application.
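One way to see why, in a back-of-the-envelope sketch of our own rather than anything from the article: simulating n qubits classically requires storing 2^n complex amplitudes, so the memory needed doubles with each added qubit.

```python
# An n-qubit state vector holds 2**n complex amplitudes (16 bytes each as
# complex128), so classical simulation memory doubles with every qubit added.
for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2**30
    print(f"{n:>2} qubits -> {amplitudes:>16,} amplitudes, ~{gib:,.1f} GiB")

# 30 qubits already need ~16 GiB; 50 qubits need ~16 million GiB, far beyond
# any classical machine, which is the gap the article describes.
```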

A majority of the effort spent in research is directed at actually building viable quantum computers. As such, most quantum algorithms are mainly studied in the context of abstract mathematical problems.

Scientists have many reasons to be optimistic about the potential of quantum computing, but they are still in the dark about the full scope of this technology and its real-world applications, especially during its early stages. Google is hoping that this prize will incentivise the quantum computing community to come up with the answer to the most pressing question: what do we do with quantum computers once they are built?

IE Online Media Services Pvt Ltd

First uploaded on: 06-03-2024 at 14:17 IST

See the original post:

Google sets up $5 million competition to find out what quantum computers can really do - The Indian Express

Written by admin

March 9th, 2024 at 2:40 am

Posted in Quantum Computing


How Quantum AI Adapts to Changing Market Trends – Native News Online

Posted: at 2:40 am


without comments


In today's fast-paced and ever-changing world, staying ahead of market trends is crucial for businesses to thrive. Traditional methods of market analysis and prediction are often limited by their reliance on classical computing algorithms. However, a revolutionary technology called Quantum AI is changing the game, offering unprecedented capabilities for adapting to changing market dynamics.

The essence of Quantum AI lies in the convergence of quantum computing and artificial intelligence. To comprehend the power of Quantum AI in market analysis, it is imperative to grasp the fundamental concepts behind quantum computing.

Quantum AI represents a cutting-edge technological frontier that holds immense promise for revolutionizing various industries, and with this knowledge in mind, your Quantum AI journey begins. By leveraging the principles of quantum mechanics and artificial intelligence, Quantum AI opens up a realm of possibilities that were previously unimaginable. The fusion of these two advanced fields not only enhances computational capabilities but also paves the way for groundbreaking advancements in data analysis, machine learning, and predictive modeling.

Quantum computing is based on the principles of quantum mechanics, which deal with the behavior of matter and energy at the atomic and subatomic levels. Unlike classical computers that use bits to represent information as either a 0 or a 1, quantum computers utilize quantum bits, or qubits, which can exist in multiple states simultaneously. This enables quantum computers to perform complex calculations at an unimaginable speed.

At the core of quantum computing is the phenomenon of superposition, where qubits can exist in a state of 0, 1, or any quantum superposition of these states. This unique property allows quantum computers to explore multiple solutions to a problem simultaneously, leading to exponential speedup in certain computational tasks. Additionally, quantum entanglement, another key principle in quantum mechanics, enables qubits to be interconnected in such a way that the state of one qubit is dependent on the state of another, regardless of the physical distance between them.

Artificial intelligence, on the other hand, focuses on the development of intelligent machines capable of simulating human-like behavior. By combining the computational power of quantum computers with AI algorithms, Quantum AI harnesses the potential for enhanced problem-solving, optimization, and prediction capabilities.

Quantum AI algorithms have the capacity to process and analyze vast amounts of data with unprecedented efficiency, leading to more accurate insights and predictions. The synergy between quantum computing and AI not only accelerates the pace of innovation but also unlocks new frontiers in machine learning, natural language processing, and robotics. As Quantum AI continues to evolve, it is poised to redefine the boundaries of what is possible in the realm of advanced computing and artificial intelligence.

Market analysis involves examining vast amounts of data to identify patterns, make predictions, and inform decision-making. Quantum AI offers unique advantages in this regard, revolutionizing traditional approaches and opening up new possibilities.

Quantum AI combines the principles of quantum mechanics with artificial intelligence, creating a powerful tool for market analysis. By harnessing the properties of superposition and entanglement, Quantum AI can process and analyze data in ways that classical computers cannot. This allows for more sophisticated modeling of market dynamics and more accurate predictions of future trends.

One of the key strengths of Quantum AI lies in its predictive capabilities. By leveraging the immense computational power of quantum computers, it becomes possible to analyze extensive historical data and identify complex patterns and trends. Quantum AI algorithms can uncover hidden correlations and make highly accurate predictions, empowering businesses with actionable insights.

Furthermore, Quantum AI can handle non-linear relationships and high-dimensional data with ease, providing a more comprehensive understanding of market behavior. This enhanced predictive ability enables businesses to anticipate market shifts, optimize investment strategies, and mitigate risks effectively.

Another significant advantage of Quantum AI is its ability to adapt to real-time market changes and make informed decisions on the fly. Traditional market analysis methods often struggle to keep up with rapidly evolving trends. Quantum AI, however, excels in handling large volumes of data in real-time, enabling businesses to react promptly to emerging opportunities and risks.

Moreover, Quantum AI's adaptive nature allows for dynamic decision-making processes that can adjust strategies in response to changing market conditions. This agility is crucial in today's fast-paced and volatile business environment, where timely decisions can mean the difference between success and failure.

Understanding and capitalizing on market trends is vital for businesses to stay competitive. Quantum AI offers unique advantages in identifying and leveraging emerging market trends.

By processing vast amounts of data from multiple sources, Quantum AI can detect subtle shifts and patterns that may indicate emerging market trends. This provides businesses with a competitive edge, allowing them to anticipate changes and adapt their strategies proactively.

Market forecasting is an essential aspect of market analysis, helping businesses make informed decisions about future market conditions. Quantum AI's ability to process vast amounts of data and identify hidden patterns and correlations allows for more accurate and reliable market forecasting. This assists businesses in making strategic decisions to drive growth and profitability.

As Quantum AI continues to evolve, its potential applications in market analysis are vast. However, several challenges and considerations need to be addressed to fully realize the benefits of this revolutionary technology.

One of the main challenges in adopting Quantum AI for market analysis is the need for highly specialized skills and resources. Quantum computing is a complex field that requires expertise in quantum physics and computer science. Collaboration between different disciplines and investments in research and development are crucial to overcoming these challenges.

The impact of Quantum AI in market analysis extends across various industries. From finance and healthcare to retail and manufacturing, businesses can leverage the power of Quantum AI to gain a competitive advantage. The ability to gather valuable insights and make informed decisions based on accurate market analysis has the potential to transform industries and reshape market dynamics.

In conclusion, Quantum AI represents a monumental leap forward in market analysis. By combining the computational power of quantum computing with the intelligence of AI algorithms, businesses can enhance their ability to adapt to changing market trends. The predictive capabilities, real-time adaptability, and accurate market analysis offered by Quantum AI can empower businesses to make informed decisions and stay ahead of the competition. Embracing Quantum AI is not only an investment in the future but also a means to drive innovation and growth in an ever-evolving market landscape.

DISCLAIMER: Branded Voices features paid content from our marketing partners. Articles are not created by Native News Online staff and have not been fact-checked for accuracy. The information presented and views and opinions expressed in the Branded Voices stories are those of the authors and do not necessarily reflect the official policy or position of Native News Online or its ownership. Any content provided by our bloggers or authors are their property, may include their opinions, and are not intended to malign any religion, ethnic group, club, organization, company, individual or anyone or anything.

Continued here:

How Quantum AI Adapts to Changing Market Trends - Native News Online

Written by admin

March 9th, 2024 at 2:40 am

Posted in Quantum Computing


