
Archive for the ‘Quantum Computing’ Category

The Quantum Dream: Are We There Yet? – Toolbox

Posted: September 2, 2020 at 1:57 am

without comments


The emergence of quantum computing has led industry heavyweights to fast-track their research and innovation. This week, Google conducted the largest chemical simulation on a quantum computer to date. Meanwhile, the U.S. Department of Energy launched five new Quantum Information Science (QIS) Research Centers. Will this accelerate quantum computing's progress?

Quantum technology is the next big wave in the tech landscape. In traditional computers, all information (emails, tweets, YouTube videos, Facebook photos) is stored as streams of electrical pulses representing binary digits, 1s and 0s. Quantum computers instead rely on quantum bits, or qubits, to store information. Qubits are based on subatomic particles, such as electrons or photons, which can occupy multiple states at once; a qubit can therefore be 1 and 0 at the same time. This enables quantum computers to run complex computational tasks simultaneously, far faster than digital computers, mainframes, and servers.
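
In code, a qubit's "1 and 0 at the same time" behaviour is just a pair of complex amplitudes whose squared magnitudes give measurement probabilities. A minimal NumPy sketch (illustrative only, not real quantum hardware):

```python
import numpy as np

# A qubit's state is a vector of two complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Equal amplitudes mean "0 and 1 at once".
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
psi = (ket0 + ket1) / np.sqrt(2)  # equal superposition

# On measurement the qubit collapses to 0 or 1 with these probabilities:
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]: a 50/50 chance of reading 0 or 1
```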

First proposed in the 1980s, quantum computing can unlock complex problems across different industries much faster than traditional computers. A quantum computer could decipher the encryption systems on which digital banking, cryptocurrencies, and e-commerce heavily depend. Quantum computers could also expedite the discovery of new medicines, aid the fight against climate change, power AI, transform logistics, and help design new materials. In the U.S., technology giants including IBM, Google, Honeywell, Microsoft, Intel, IonQ, and Rigetti Computing are leading the race to build quantum computers and gain a foothold in the space; in China, Alibaba, Baidu, and Huawei lead the field.

For a long time, the U.S. and its allies, such as Japan and Germany, had been working hard to compete with China to dominate the quantum technology space. In 2018, the U.S. government released the National Strategy Overview for Quantum Information Science to reduce technical skills gaps and accelerate quantum computing research and development.

In 2019, Google claimed quantum supremacy when the company's Sycamore processor performed in 200 seconds a specific task that would have taken a supercomputer 10,000 years to complete. In the same year, Intel rolled out Horse Ridge, a cryogenic quantum control chip, to reduce quantum computing's complexity and accelerate quantum practicality.

Tech news: Is Data Portability the Answer To Anti-Competitive Practices?

What's 2020 Looking Like for Quantum Computing?

In July 2020, IBM announced a research partnership with Japanese businesses and academia to advance quantum computing innovation. The alliance will deepen ties between the two countries and build an ecosystem to improve quantum skills and advance research and development.

In June 2020, Honeywell announced the development of the world's highest-performing quantum computer. AWS, Microsoft, and several other IaaS providers have announced quantum cloud services to advance quantum computing adoption; in August 2020, AWS announced the general availability of Amazon Braket, a quantum cloud service that allows developers to design, develop, test, and run quantum algorithms.

Since last year, auto manufacturers such as Daimler and Volkswagen have been leveraging quantum computers to identify new ways to improve electric-vehicle battery performance. Pharmaceutical companies are also using the technology to develop new medicines and drugs.

Last week, the Google AI Quantum team used its quantum processor, Sycamore, to simulate changes in the configuration of the chemical molecule diazene. During the process, the computer accurately described the changes in the positions of the hydrogen atoms, as well as the binding energy of hydrogen in larger chains.

If quantum computers develop the ability to predict chemical processes, they would advance the development of a wide range of new materials with as-yet-unknown properties. Current quantum computers, unfortunately, lack the scale required for such a task, but computer scientists hope to get there in the near future as tech giants like Google invest in quantum computing research.

Tech news: Will Googles Nearby Share Have Anything Transformative to Offer?

It therefore came as a relief to many computer scientists when the U.S. Department of Energy announced an investment of $625 million over the next five years in five newly formed Quantum Information Science (QIS) Research Centers in the U.S. The new hubs are an amalgam of research universities, national labs, and quantum computing tech titans. The five hubs are led by the Energy Department's Argonne, Oak Ridge, Brookhaven, Fermi, and Lawrence Berkeley national laboratories, with industry partners including Microsoft, IBM, Intel, Rigetti, and ColdQuanta. The partnership aims to advance the commercialization of quantum computing.

Chetan Nayak, general manager of Quantum Hardware at Microsoft, says, "While quantum computing will someday have a profound impact, today's quantum computing systems are still nascent technologies. To scale these systems, we must overcome a number of scientific challenges. Microsoft has been tackling these challenges head-on through our work towards developing topological qubits, classical information processing devices for quantum control, new quantum algorithms, and simulations."

At the start of this year, Daniel Newman, principal analyst and founding partner at Futurum Research, predicted that 2020 would be a big year for investors and Silicon Valley to put money into quantum computing companies. He said, "It will be incredibly impactful over the next decade, and 2020 should be a big year for advancement and investment."

Quantum computing is still in the development phase, and a shortage of suppliers and skilled researchers could slow its progress. However, if tech giants and researchers continue to collaborate, quantum technology can turbocharge innovation at scale.

What are your thoughts on the progress of quantum computing? Comment below or let us know on LinkedIn, Twitter, or Facebook. We'd love to hear from you!

Go here to read the rest:

The Quantum Dream: Are We There Yet? - Toolbox

Written by admin

September 2nd, 2020 at 1:57 am

Posted in Quantum Computing

Bipartisan Bill Calls for Government-Led Studies Into Emerging Tech Impacts – Nextgov

Posted: at 1:57 am

without comments

Commerce Department and Federal Trade Commission-led studies, diving deep into America's pursuit, use, and governance of multiple emerging technologies and resulting in recommendations for national strategies to advance each and secure supply chains, would be required under a bipartisan bill introduced Friday.

The American Competitiveness on More Productive Emerging Tech Economy, or COMPETE, Act, set forth by Reps. Cathy McMorris Rodgers, R-Wash., and Bobby Rush, D-Ill., is a legislative package of several previously introduced bills focused on boosting Congress's grasp of the tech landscape.

If passed, it would mandate new research into confronting online harms, and advancing eight buzzy areas of on-the-rise emerging technology: artificial intelligence, quantum computing, blockchain, new and advanced materials, unmanned delivery services, 3D printing, the internet of things, and IoT in manufacturing.

Such tech has "expanded the horizons of humankind, drastically changing the way we exchange information and interact with the world around us," Rush said in a statement, adding that, as these technologies develop and become more prolific, "it is imperative that the U.S. take the lead in appreciating both the benefits and risks associated with [them], and ensure that we remain competitive on the world stage."

Referred to the House Committee on Energy and Commerce upon introduction, the 36-page bill incorporates the Advancing Blockchain Act, initially introduced by Rep. Brett Guthrie, R-Ky., the Advancing Quantum Computing Act from Rep. Morgan Griffith, R-Va., and almost 10 other previously introduced pieces of legislation calling for research into contemporary technologies' impact on commerce and society. The bill calls for year-long, agency-led investigations into each of the listed burgeoning technological industries and areas, with explicit instructions for the type of information the agencies would need to report back to Congress. The work would entail developing lists of public-private partnerships promoting the various techs' adoption, exploring standards and policies implemented by those tapping into each, identifying near- and long-term risks among supply chains, pinpointing the tech industry's impacts on the U.S. economy, and much more.

"Studies are studies, and from a Congressional standpoint they are generally used to inform oversight and legislative activity. That's likely the case here," Mike Hettinger, founder of Hettinger Strategy Group and former House Oversight Committee staffer, told Nextgov Tuesday. "On [its] face, the bill is not going to change any existing policy related to any of the areas on which it is focused. That said, the more we know, the better off we will be."

Agencies involved in producing the reports would also need to craft recommendations for policies and legislation that would advance the expeditious adoption of these technologies, according to the act.

Hettinger noted that the bill could signal that the participating lawmakers are teeing up potential legislative action.

"That's the thing to watch, because for the most part when you have emerging technology you want to be very careful not to over-regulate it in a way that would hinder innovation," he said, noting that "what we need more than anything in these areas is continued robust federal investment in related research and development."

"You hope that by studying these areas in-depth first, you'll avoid any knee-jerk regulation that could harm innovation," he added.

On top of homing in on each specific emerging technology, the bill also includes a section that Hettinger said he's particularly intrigued by: the full text of what was originally introduced as the Countering Online Harms Act. In the COMPETE Act, this portion mandates a study to consider whether and how artificial intelligence may be used to identify, remove, or take any other appropriate action necessary to address online harms, such as manipulated content like deepfakes used to mislead people, disinformation campaigns, fraudulent content intended to scam, and beyond.

"The issue of deceptive content and deepfakes is front and center today as the 2020 election moves into full swing," Hettinger said. "Being able to identify what content is authentic and what has been manipulated is increasingly critical for protecting the integrity of our electoral process."

The bills included in the legislative bundle were put forth prior by several other lawmakers, all of whom contributed to what Hettinger suggested marks a unique approach. He pointed out that outside of the Smart IoT Act, most pieces of legislation included in COMPETE were formerly introduced on the same date this summer, May 19, and their language is strikingly similar, at times nearly identical.

"This suggests to me that this was a coordinated approach from the outset, and part of an innovation agenda," Hettinger said. "I don't know the behind-the-scenes posturing that's going on, but we do expect to see a lot of legislative activity between now and the end of the year, so I assume the plan is to try and pass this combined package in the House before Congress adjourns for the year."

Go here to read the rest:

Bipartisan Bill Calls for Government-Led Studies Into Emerging Tech Impacts - Nextgov

Written by admin

September 2nd, 2020 at 1:57 am

Posted in Quantum Computing

Two Pune Research Institutes Are Building India’s First Optical Atomic Clocks – The Wire Science

Posted: at 1:57 am

without comments

Students of IISER Pune next to the setup of the strontium-based optical atomic clock. Photo: IISER Pune.

Pune/Bengaluru: Two Pune-based premier research institutes, the Inter-University Centre for Astronomy and Astrophysics (IUCAA) and the Indian Institute of Science Education and Research (IISER), have joined hands to build India's first two optical atomic clocks.

The institutes will build one clock each, with help from the Government of India. If the project is successful, India will join a small global club of countries with the ability to build these ultra-precise timekeeping devices.

According to the scientists involved, the clocks will lose only one second in more than 13.8 billion years, roughly the age of our universe.

"Since the middle of the 20th century till now, there have been tremendous efforts in the field of atomic clocks, making time the most accurately measured physical quantity," the authors of a paper published in 2014 wrote.

Optical atomic clocks have a few well-known applications. Foremost, of course, is accurate timekeeping, which in turn has multiple applications of its own, according to Subhadeep De, an associate professor and expert in optical physics at IUCAA and one of the members of the project.

For example, GPS satellites use radio signals to determine the position of an object on the ground. However, there is a time lag, both because of the time the signals take to travel between the ground and the satellites and because the satellites are moving relative to the object through Earth's gravitational field, incurring really tiny but significant time delays predicted by the theories of relativity.
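
Those tiny relativistic delays can be estimated with a back-of-envelope calculation. The constants below are standard textbook values, not figures from the article; the well-known net result is that a GPS satellite clock runs fast by roughly 38 microseconds per day relative to a clock on the ground:

```python
# Back-of-envelope relativistic corrections for a GPS satellite clock.
GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8      # speed of light, m/s
R_earth = 6.371e6     # mean Earth radius, m
r_orbit = 2.656e7     # GPS orbital radius (~20,200 km altitude), m

v2 = GM / r_orbit                         # orbital speed squared (circular orbit)
sr_per_day = v2 / (2 * c**2) * 86400      # special relativity: satellite clock runs slow
gr_per_day = GM / c**2 * (1/R_earth - 1/r_orbit) * 86400  # weaker gravity: runs fast

net = gr_per_day - sr_per_day
print(f"net drift: {net*1e6:.1f} microseconds per day")  # ~38.5, textbook value ~38
```

Uncorrected, a drift of tens of microseconds per day would translate into kilometres of positioning error, which is why GPS clocks are deliberately adjusted for relativity.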

The world's prevailing frequency standard for measuring time is derived from caesium atomic clocks. Here, caesium atoms are imparted energy (by different means in different designs) and forced to jump from one energy level to a slightly higher one, between the atom's hyperfine ground states. Shortly after, each atom drops back to its previous state by emitting microwave radiation at 9,192,631,770 Hz.

Hz here is hertz, the SI unit of frequency, defined as one cycle per second. So when a detector counts 9,192,631,770 cycles of this microwave emission coming from the caesium atoms, one second will have passed.
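
The definition above reduces timekeeping to counting: elapsed time is simply the number of cycles counted divided by 9,192,631,770. A trivial sketch:

```python
CS_FREQ = 9_192_631_770  # Hz: cycles of the caesium hyperfine emission per SI second

def elapsed_seconds(cycles_counted: int) -> float:
    """Convert a count of caesium microwave cycles into elapsed time."""
    return cycles_counted / CS_FREQ

# Counting exactly five times the defining frequency means 5 seconds have passed.
print(elapsed_seconds(5 * CS_FREQ))  # 5.0
```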

According to the Mechatronics Handbook (2002), all timekeeping machines have three parts: an energy source, a resonator and a counter. In a household wall clock, the energy source is an AA or AAA battery; the resonator, in this case the clock's gears, is the system that moves in a periodic manner; and the counter is the display. The energy source and resonator are together called an oscillator.

In atomic clocks, the oscillator is, say, a laser imparting energy to a caesium atom ticking between the two hyperfine ground states. The radiation the atom releases is the resonator. The detector is the counter.

The clocks being built by IUCAA and IISER rely on the same underlying principle but use more advanced technologies. Indeed, optical atomic clocks are considered the next step in the evolution of atomic clocks and are likely to replace caesium clocks as the world's time standard in the future. A glimpse of the underlying engineering shows us why.

First, confining the atoms or ions is very difficult. To keep the clock precise, its operators need to ensure the atoms don't combine to form molecules, bump into each other, or react with the container's walls. So instead of confining them in material containers, the IUCAA and IISER teams are using optical and electromagnetic traps.

"Specifically, neutral atoms are confined in an optically created storage basket known as an optical lattice, which is created by interfering two counter-propagating laser beams," Umakant Rapol, an associate professor at IISER, said. The ions are confined by oscillating electric fields.

Second, once the particles have been confined, they will be laser-cooled to nearly absolute zero (the coldest temperature possible, 0 K or −273.15° C). In their simplest form, laser-cooling techniques force atoms to lose their kinetic energy and come very nearly to a standstill. Since the temperature of a macroscopic body is nothing but the collective kinetic energy of its atoms, a container of nearly still atoms is bound to feel very cold. And once most of the atoms' kinetic energy has been removed, their quantum physical effects become more noticeable, allowing the clock to be more precise.
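
"Nearly at a standstill" can be quantified with the ideal-gas relation v_rms = sqrt(3·k_B·T/m). Treating a strontium-88 atom as an ideal-gas particle (an illustrative simplification), cooling from room temperature to a microkelvin slows it from hundreds of metres per second to centimetres per second:

```python
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K
M_SR = 88 * 1.66053907e-27  # mass of a strontium-88 atom, kg

def rms_speed(temp_kelvin: float) -> float:
    """Root-mean-square speed of an ideal-gas atom at a given temperature."""
    return math.sqrt(3 * K_B * temp_kelvin / M_SR)

print(f"{rms_speed(300):.0f} m/s at room temperature")       # ~292 m/s
print(f"{rms_speed(1e-6)*100:.1f} cm/s at one microkelvin")  # ~1.7 cm/s
```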

The choice of atoms to use in the clock is dictated by whether they can be cooled to a few microkelvin above absolute zero using laser-cooling, and whether their switching between the two energy states is immune to stray magnetic fields, electric fields, the temperature of the background, etc., Rapol said.

Ytterbium and strontium atoms check both these boxes. IUCAA will be building a ytterbium-ion clock. In this clock, a single ytterbium ion will be used to produce the resonating radiation. Using multiple ions gives rise to an effect called a Coulomb shift, which interferes with the clock design. IISER will be building a strontium-atom clock.

When a caesium atom swings between its two hyperfine ground states, it emits a specific amount of energy as microwave radiation. When ytterbium and strontium atoms swing between two of their energy states, they emit energy as optical radiation. Both elements have highly stable optical emissions, at wavelengths of 467 nm and 698.4 nm, corresponding to 642,121,496,772,645 Hz for the ytterbium ion and 429,228,066,418,009 Hz for the strontium atom, respectively.

These high frequencies, tens of thousands of times higher than the microwave frequency of caesium clocks, are the source of the clocks' ability to lose less than one second in 13.8 billion years.

(The makers of an optical strontium clock reported in 2014 that their device wouldn't miss one second in 15 billion years!)
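
The headline numbers are easy to check: one second over 13.8 billion years corresponds to a fractional error of about 2×10⁻¹⁸, and the strontium optical transition oscillates roughly 47,000 times faster than the caesium microwave standard:

```python
SECONDS_PER_YEAR = 3.156e7  # one year, roughly
age_of_universe = 13.8e9 * SECONDS_PER_YEAR  # in seconds

frac_error = 1 / age_of_universe
print(f"fractional error: {frac_error:.1e}")  # 2.3e-18

SR_OPTICAL = 429_228_066_418_009  # Hz, strontium clock transition
CS_MICROWAVE = 9_192_631_770      # Hz, caesium standard
ratio = SR_OPTICAL / CS_MICROWAVE
print(f"frequency ratio: {ratio:,.0f}")  # 46,693: between 4 and 5 orders of magnitude
```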

Also read: Experimenting with Cold, Magnetic Materials in Indore

However, taking advantage of this stable emission means accurately detecting the high-frequency optical radiation. That is, if researchers need to build optical atomic clocks, they also need to be able to build and operate state-of-the-art frequency measurement systems. These devices in the form of frequency combs constitute the third feature of the IUCAA and IISER clocks.

A frequency comb is an advanced laser whose output radiation lies at multiple, evenly spaced frequencies. This output can be used to convert high-frequency optical signals into more easily countable lower-frequency microwave signals.
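
A comb's teeth sit at frequencies f_n = f_ceo + n·f_rep, where f_rep is the repetition rate and f_ceo the carrier-envelope offset. An unknown optical frequency is then pinned down by finding the nearest tooth and counting the leftover microwave-frequency beat note. The comb parameters below are assumed example values, not those of the IUCAA or IISER instruments:

```python
# Illustrative frequency-comb bookkeeping with assumed example parameters.
F_REP = 1.0e9  # tooth spacing (repetition rate), Hz
F_CEO = 2.5e8  # carrier-envelope offset frequency, Hz

optical = 429_228_066_418_009  # Hz: strontium clock light to be measured

# Nearest comb tooth index, and the residual beat note against that tooth:
n = round((optical - F_CEO) / F_REP)
tooth = F_CEO + n * F_REP
beat = optical - tooth

print(n)          # 429228: the tooth nearest the optical line
print(abs(beat))  # 183581991.0 Hz: a microwave beat, easily counted electronically
```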

The principal challenge before India is to build all these devices from scratch. Rapol said the teams plan to develop most of the required technologies in Pune. They require expertise in the fields of optics, instrumentation, electronics, ultra-high vacuums, and mechanical and software engineering, among others.

"National collaborations such as [us] working together with our next-door neighbour IISER will be beneficial," De said. Rapol echoed this opinion: "We are going to share expertise with IUCAA and are already working [together] to create an ion trap."

Rapol also said one clock is half-ready: "We have laser-cooled the strontium atoms and are ready to load these atoms into one-dimensional chains, to increase the signal-to-noise ratio, and will have the optical clock soon," he said. The team is also waiting to fit in the frequency comb.

He estimated that once the funds and equipment have been procured, it should take two years or less to build the clock at IISER. The IUCAA clock is expected to be ready in four or five years.

Once both clocks are operational, they will be linked together.

Grander applications

There are multiple open problems in physics at the moment. Four of the more prominent ones include the search for new physics, the reconciliation of quantum mechanics and relativity, an explanation for what happened to the universes antimatter, and the nature of dark matter.

De noted that various experiments designed to help answer these questions and others besides require researchers to be able to measure time in different contexts with increasingly higher precision and accuracy.

Rapol also expressed excitement about measuring changes in the values of fundamental constants. Constants are so called because their values don't change, but some constants' values could be changing too slightly for existing clocks to notice.

For example, the fine-structure constant is a number that determines the strength with which a charged particle, like an electron or a ytterbium ion, couples with an electromagnetic field. If this number increases or decreases with time, there could be implications for the whole universe, everywhere charged particles interact with each other.
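
For reference, the fine-structure constant can be computed from its standard definition, α = e²/(4π·ε₀·ħ·c), using CODATA 2018 values for the constants:

```python
import math

E = 1.602176634e-19      # elementary charge, C
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 2.99792458e8         # speed of light, m/s

alpha = E**2 / (4 * math.pi * EPS0 * HBAR * C)
print(f"alpha = 1/{1/alpha:.3f}")  # 1/137.036
```

It is tiny fractional drifts in this dimensionless number, if any exist, that comparing a ytterbium clock against a strontium clock could reveal.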

According to De, the ytterbium ion is more sensitive to the fine-structure constant than strontium atoms. So if the constant's value changes with time, the ytterbium clock's transition frequency will vary at a much faster rate than that of the strontium clock. "This [difference] will eventually allow us to measure time variation of the fundamental constant, if there is any at all."

For a different example, physicists who study particles called neutrinos sometimes need to beam these particles from a source to a detector hundreds of kilometres away, through the atmosphere (these particles are entirely harmless). In 2011, physicists in Italy found that some neutrinos that had been beamed from a facility near Geneva and detected at their instrument, called OPERA, had travelled faster than light. The claim became a major source of controversy because faster-than-light travel violates the special theory of relativity.

The problem was found a few months later: the OPERA master clock had glitched, measuring the neutrinos' time of arrival wrong by just 75 nanoseconds.

Other applications of atomic clocks include GPS systems, gravity-aided navigation, astronomy and geology.

Also read: Listen | Tick-tock, Tick-tock, Say Hello To the Doomsday Clock

More immediate concerns

The clocks also bring deeper opportunities for India's scientists and engineers.

In 2017, the Department of Science and Technology mooted its Quantum-Enabled Science & Technology programme. Its aim, the principal scientific adviser told The Print in 2019, was to ramp up research and development activities related to quantum computing. In the 2020 Union budget, finance minister Nirmala Sitharaman announced the Centre would invest Rs 8,000 crore over the next five years under a new national mission for quantum technologies.

As such, there are both interest and funds available at the moment to develop concepts and technologies addressing a variety of applications. "At present, we are using conventional technologies in our daily life for commercial and navigational purposes," De said. "The world is moving towards quantum computers, quantum communication systems and the quantum internet."

"In this regard, we can import the clock, but [operating it] will need highly skilled professionals. On the other hand, being able to build optical atomic clocks could help us become self-sustained and develop skilled human resources in the process," De noted.

And of course, there's the pride. A few years ago, a team at the National Physical Laboratory of India, New Delhi, led by Poonam Arora, built India's first atomic clock with caesium atoms (its members are the authors of the 2014 paper quoted earlier). This clock is India's current frequency standard, the machine that defines how time is measured in the country. The researchers acknowledge in their paper that they expect optical frequency standards will "replace the [caesium fountain clock] as primary frequency standards in the next few years."

De, Rapol and their colleagues and students at IUCAA and IISER are now attempting to bring India to this next threshold.

"Japan is the only country in the Asia-Pacific to have built [optical atomic clocks], and China is working hard, among other nations like Australia, Taiwan, Thailand, South Korea, Singapore and Russia," according to De.

Himanshu N. is a freelance journalist. Vasudevan Mukunth is editor, The Wire Science.

Read more:

Two Pune Research Institutes Are Building India's First Optical Atomic Clocks - The Wire Science

Written by admin

September 2nd, 2020 at 1:57 am

Posted in Quantum Computing

Vitalik Buterin highlights major threats to Bitcoin BTC and Ethereum ETH – Digital Market News

Posted: at 1:57 am

without comments

Bitcoin (BTC), Ethereum (ETH), and the rest of the crypto market are off to a good start. But the major concern is what might prevent Bitcoin and Ethereum from surging. Ethereum co-founder Vitalik Buterin holds an answer to that question.


Recently, Buterin appeared on the What Bitcoin Did podcast, where he weighed in on threats that Bitcoin and the rest of the market may encounter soon. Buterin seemed quite curious while speaking about quantum computing.


Buterin said:

"So the thing that I tend to worry about, I mean, one is that there's always this kind of black-swan risk of technical failure. What if the NSA comes out with a quantum computer out of the blue and just steals a bunch of coins before you can do anything about it?

"[There's also] political failure. So what if governments banned Bitcoin, commandeered the mining pools, and used that to do what I call a 51% spawn-camping attack, attacking the chain over and over again until it becomes non-viable? And meanwhile, the prices are low because the thing's banned and there's a crisis of confidence?"


For Bitcoin especially, he was concerned about whether it will keep attracting investors' interest in the long run.

Buterin added:

"Bitcoin doesn't have what I call functionality escape velocity; basically, sufficient functionality to serve as a trustless base layer for a lot of different applications. As a result of this, there's a possibility that over time people will find Bitcoin less and less interesting and other platforms more interesting."

He further addressed the notion of BTC/USD and ETH/USD becoming the norm and being used as a new form of money. Although Bitcoin and Ethereum have outlasted their detractors and proved their importance, it depends on one's definition of what makes a currency.

Buterin further added:

"The word money does combine a lot of different concepts. For example, people talk about the unit of account, a medium of exchange, a store of value. For the unit of account, ETH is not that, and BTC is not that either. As a medium of exchange, Bitcoin is used like that, and ETH is used as that sometimes. ETH is a store of value; that is something that people use ETH for."

Read this article:

Vitalik Buterin highlights major threats to Bitcoin BTC and Ethereum ETH - Digital Market News

Written by admin

September 2nd, 2020 at 1:57 am

Posted in Quantum Computing

What Is Quantum Supremacy And Quantum Computing? (And How Excited Should We Be?) – Forbes

Posted: August 23, 2020 at 10:57 pm

without comments

In 2019, Google announced with much fanfare that it had achieved quantum supremacy: the point at which a quantum computer can perform a task that would be impossible for a conventional computer (or would take so long as to be entirely impractical).


To achieve quantum supremacy, Google's quantum computer completed a calculation in 200 seconds that Google claimed would have taken even the most powerful supercomputer 10,000 years. IBM loudly protested this claim, stating that Google had massively underestimated the capacity of its supercomputers (hardly surprising, since IBM also has skin in the quantum computing game). Nonetheless, Google's announcement was hailed as a significant milestone in the quantum computing journey.

But what exactly is quantum computing?

Not sure what quantum computing is? Don't worry, you're not alone. In very simple terms, quantum computers are unimaginably fast computers capable of solving seemingly unsolvable problems. If you think your smartphone makes computers from the 1980s seem painfully old-fashioned, quantum computers will make our current state-of-the-art technology look like something out of the Stone Age. That's how big a leap quantum computing represents.

Traditional computers are, at their heart, very fast versions of the simplest electronic calculators. They are only capable of processing one bit of information at a time, in the form of a binary 1 or 0. Each bit is like an on/off switch, with 0 meaning "off" and 1 meaning "on." Every task you complete on a traditional computer, no matter how complex, ultimately uses millions of bits, each representing either a 0 or a 1.

But quantum computers don't rely on bits; they use qubits. And qubits, thanks to the marvels of quantum mechanics, aren't limited to being either on or off. They can be both at the same time, or exist somewhere in between. That's because quantum computing harnesses the peculiar phenomena that take place at a sub-atomic level, in particular the ability of quantum particles to exist in multiple states at the same time (known as superposition).

This allows quantum computers to look at many different variables at the same time, which means they can crunch through more scenarios in a much shorter space of time than even the fastest computers available today.
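
The "many variables at the same time" idea can be made concrete: simulating n qubits classically requires tracking 2ⁿ complex amplitudes, and a Hadamard gate on each qubit spreads the register evenly across all of them. A small NumPy sketch (illustrative only; a real quantum computer doesn't store this vector explicitly, which is precisely its advantage):

```python
import numpy as np

def equal_superposition(n_qubits: int) -> np.ndarray:
    """State vector after applying a Hadamard gate to each of n qubits."""
    h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
    state = np.array([1.0])                       # start in |00...0>
    for _ in range(n_qubits):
        state = np.kron(state, h @ np.array([1.0, 0.0]))
    return state

state = equal_superposition(3)
print(len(state))           # 8 = 2**3 amplitudes tracked at once
print(abs(state[0]) ** 2)   # 0.125: each basis state equally likely
```

Doubling the qubit count squares the number of amplitudes a classical simulation must track, which is why even a few dozen qubits strain conventional supercomputers.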

What does this mean for our everyday lives?

Reaching quantum supremacy is clearly an important milestone, yet we're still a long way from commercially available quantum computers hitting the market. Right now, quantum computing work is limited to labs and major tech players like Google, IBM, and Microsoft.

Most technology experts, myself included, would admit we don't yet fully understand how quantum computing will transform our world; we just know that it will. It's like trying to imagine how the internet or social media would transform our world before they were introduced.

Here are just some of the ways in which quantum computers could be put to good use:

Strengthening cyber security. Quantum computers could change the landscape of data security by creating virtually unbreakable encryption.

Accelerating artificial intelligence. Quantum computing could provide a massive boost to AI, since these superfast computers will prove far more effective at recognizing patterns in data.

Modeling traffic flows to improve our cities. Modeling traffic is an enormously complex process with a huge number of variables, but researchers at Volkswagen have been running quantum pilot programs to model and optimize the flow of traffic through city centers in Beijing, Barcelona, and Lisbon.

Making the weather forecast more accurate. Just about anything that involves complex modeling could be made more efficient with quantum computing. The UK's Met Office has said it believes quantum computers offer the potential for far more advanced modeling than is possible today, and they are one of the avenues being explored for building next-generation forecasting systems.

Developing new medicines. Biotech startup ProteinQure has been exploring the potential of quantum computing in modeling proteins, a key route in drug development. In other words, quantum computing could lead to the discovery of effective new drugs for some of the world's biggest killers, including cancer and heart disease.

Most experts agree that truly useful quantum computing is not likely to be a feature of everyday life for some time. And even when quantum computers are commercially available, we as individuals will hardly be lining up to buy one. For most of the tasks we carry out on computers and smartphones, a traditional binary computer or smartphone will be all we need. But at an industry and society level, quantum computing could bring many exciting opportunities in the future.

Quantum computing is just one of 25 technology trends that I believe will transform our society. Read more about these key trends, including plenty of real-world examples, in my new book, Tech Trends in Practice: The 25 Technologies That Are Driving The 4th Industrial Revolution.

View original post here:

What Is Quantum Supremacy And Quantum Computing? (And How Excited Should We Be?) - Forbes

Written by admin

August 23rd, 2020 at 10:57 pm

Posted in Quantum Computing

Has the world’s most powerful computer arrived? – The National

Posted: at 10:57 pm

without comments

The quest to build the ultimate computer has taken a big step forward following breakthroughs in ensuring its answers can be trusted.

Known as a quantum computer, such a machine exploits bizarre effects in the sub-atomic world to perform calculations beyond the reach of conventional computers.

First proposed almost 40 years ago, quantum computing is expected to transform fields ranging from weather forecasting and drug design to artificial intelligence, and tech giants Microsoft, Google and IBM are among those racing to exploit its power.

The power of quantum computers comes from their use of so-called qubits, the quantum equivalent of the 1s and 0s used by conventional number-crunchers.

Unlike bits, qubits exploit a quantum effect allowing them to be both 1s and 0s at the same time. The impact on processing power is astonishing. Instead of processing, say, 100 bits in one go, a quantum computer could crunch 100 qubits, equivalent to 2 to the power 100, or a million trillion trillion bits.
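The scaling behind that claim can be made concrete: describing an n-qubit state on a classical machine takes 2^n amplitudes, so the bookkeeping doubles with every added qubit. A minimal Python sketch (the function name is ours, purely illustrative):

```python
# Number of classical amplitudes needed to describe an n-qubit state.
def state_space_size(n_qubits: int) -> int:
    return 2 ** n_qubits

print(state_space_size(1))    # a single qubit: 2 amplitudes
print(state_space_size(100))  # 1267650600228229401496703205376 (~1.27e30)
```

That figure of roughly 1.27 × 10^30 is the "million trillion trillion" quoted above, and it is why simulating even a 100-qubit machine on classical hardware is out of reach.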

At least, that is the theory. The problem is that the property of qubits that gives them their abilities, known as quantum superposition, is very unstable.

Once created, even the slightest vibration, temperature shift or electromagnetic signal can disturb the qubits, causing errors in calculations. Unless the superposition can be maintained long enough, the quantum computer either does a few calculations well or a vast amount badly.

For years, the biggest achievement of any quantum computer involved using a few qubits to find the prime factors of 15 (which every schoolchild knows are 3 and 5).

Using complex shielding methods, researchers can now stabilise around 50 qubits long enough to perform impressive calculations.

Last October, Google claimed to have built a quantum computer that solved in 200 seconds a maths problem that would have taken an ultra-fast conventional computer more than 10,000 years.

Yet even this billion-fold speed-up is just a shadow of what would be possible if qubits could be kept stable for longer. At present, many of the qubits are wasted on spotting and fixing errors.

Now two teams of researchers have independently found new ways of tackling the error problem.

Physicists at the University of Chicago have found a way of keeping qubits stable for longer not by blocking disturbances, but by blurring them.


In some quantum computers, the qubits take the form of electrons whose direction of spin is a superposition of both up and down. By adding a constantly flipping magnetic field, the team found that the electrons rotated so quickly that they barely noticed outside disturbances. The researchers explain the trick with an analogy: "It's like sitting on a merry-go-round with people yelling all around you," says team member Dr Kevin Miao. "When the ride is still, you can hear them perfectly, but if you're rapidly spinning, the noise blurs into a background."

Describing their work in the journal Science, the team reported keeping the qubits working for about 1/50th of a second, around 10,000 times longer than their lifetime if left unshielded. According to the team, the technique is simple to use but effective against all the standard sources of disturbance.

Meanwhile, researchers at the University of Sydney have come up with an algorithm that allows a quantum computer to work out how its qubits are being affected by disturbances and fix the resulting errors. Reporting their discovery in Nature Physics, the team says their method is ready for use with current quantum computers, and could work with up to 100 qubits.

These breakthroughs come at a key moment for quantum computing. Even without them, the technology is already spreading beyond research laboratories.

In June, the title of world's most powerful quantum computer was claimed not by a tech giant but by Honeywell, a company perhaps best known for central heating thermostats.

Needless to say, the claim is contested by some, not least because the machine is reported to have only six qubits. But Honeywell points out that it has focused its research on making those qubits ultra-stable, which allows them to work reliably for far longer than rival systems. Numbers of qubits alone, in other words, are not everything.

And the company insists this is just the start. It plans to boost the performance of its quantum computer ten-fold each year for the next five years, making it 100,000 times more powerful still.

But apart from bragging rights, why is a company like Honeywell trying to take on the tech giants in the race for the ultimate computer?

A key clue can be found in remarks made by Honeywell insiders to Forbes magazine earlier this month. These reveal that the company wants to use quantum computers to discover new kinds of materials.

Doing this involves working out how different molecules interact together to form materials with the right properties. That's something conventional computers are already used for. But quantum computers won't just bring extra number-crunching power to bear. Crucially, like molecules themselves, their behaviour reflects the bizarre laws of quantum theory. And this makes them ideal for creating accurate simulations of quantum phenomena like the creation of new materials.

This often-overlooked feature of quantum computers was, in fact, the original motivation of the brilliant American physicist Richard Feynman, who first proposed their development in 1981.

Honeywell already has plans to use quantum computers to identify better refrigerants. These compounds were once notorious for attacking the Earth's ozone layer, but replacements still have unwanted environmental effects. Because refrigerants are relatively simple chemicals, the search for better ones is already within the reach of current quantum computers.

But Honeywell sees a time when far more complex molecules such as drugs will also be discovered using the technology.

For the time being, no quantum computer can match the all-round number-crunching power of standard computers. Just as Honeywell made its claim, the Japanese computer maker Fujitsu unveiled a supercomputer capable of over 500 million billion calculations a second.

Even so, the quantum computer is now a reality and before long it will make even the fastest supercomputer seem like an abacus.

Robert Matthews is Visiting Professor of Science at Aston University, Birmingham, UK

Updated: August 21, 2020 12:06 PM

Visit link:

Has the world's most powerful computer arrived? - The National

Written by admin

August 23rd, 2020 at 10:57 pm

Posted in Quantum Computing

Will Quantum Computers Really Destroy Bitcoin? A Look at the Future of Crypto, According to Quantum Physicist Anastasia Marchenkova – The Daily Hodl

Posted: at 10:57 pm

without comments

A quantum physicist is laying out the real-world impact of quantum computers on cryptography and cryptocurrency.

In a YouTube video, quantum physicist Anastasia Marchenkova shares her two cents about the race to break encryption technology with quantum computers.

Shor's [quantum] algorithm can break RSA and elliptic curve cryptography, which is a problem because a lot of our data these days is encrypted with those two algorithms. Quantum computers are not faster at everything. They're just faster at certain problems, and it just happens that RSA and elliptic curve encryption fall under that umbrella.

But there are other encryption algorithms that are not affected by quantum computers and we have to discover them and then actually implement them and put them into action before a large enough quantum computer actually emerges. [Breaking cryptography] requires a huge amount of qubits, something like 10 million qubits estimated. But it was one of the first discoveries of what practical application that quantum computers can actually do.

[Quantum computing] harnesses quantum properties to actually factor numbers a lot faster, and that's the whole core of the security behind RSA encryption. The consequence of this is that our data is not going to be secure anymore if we get a big enough quantum computer. So we're going to have to do something about it.

Quantum computing has recently grabbed headlines as it poses a serious threat to the cryptographic algorithms that keep cryptocurrencies and the internet secure. Quantum computers can crack certain complex mathematical problems because qubits, or quantum bits, can maintain a superposition, being in two states at a given time.
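To see why factoring is the crux, here is a toy RSA keypair built from deliberately tiny primes (purely illustrative; real keys use primes hundreds of digits long, far beyond classical factoring). Anyone who can factor the public modulus recovers the private key, and factoring is precisely what Shor's algorithm accelerates:

```python
# Toy RSA with tiny primes -- illustrative only; real moduli are ~2048 bits.
p, q = 61, 53                      # secret primes
n = p * q                          # public modulus: 3233
phi = (p - 1) * (q - 1)            # Euler's totient: 3120
e = 17                             # public exponent
d = pow(e, -1, phi)                # private exponent (Python 3.8+): 2753

msg = 65
cipher = pow(msg, e, n)            # encrypt with the public key: 2790
assert pow(cipher, d, n) == msg    # legitimate decryption

# An attacker who factors n rebuilds the private key outright:
attacker_p = next(k for k in range(2, n) if n % k == 0)
attacker_q = n // attacker_p
cracked_d = pow(e, -1, (attacker_p - 1) * (attacker_q - 1))
assert pow(cipher, cracked_d, n) == msg  # attacker decrypts too
```

With 2048-bit moduli, the brute-force factoring step is classically infeasible; a large, fault-tolerant quantum computer running Shor's algorithm would change that, which is the threat the article describes.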

Meanwhile, Marchenkova doesn't think crypto holders need to move their Bitcoin to a quantum-secure wallet immediately. But she does believe anyone holding crypto should be concerned and keep tabs on the latest developments, because blockchains will one day need to be upgraded to protect against the rise of quantum computing.

Yes, you should worry. But not anytime soon. You don't need to move your Bitcoin today to some other quantum-secure wallet. But in general, how do we upgrade the blockchain?

We can fork it, and moving forward everything will be fine, assuming we find a good quantum-secure algorithm. But what are we going to do with all the old coins, or the coins that have had all their private keys lost? Are we just going to say, sorry, bye, this part of the chain will no longer be valid unless you move it or re-encrypt it? Or are we going to find new technology?


Read the original post:

Will Quantum Computers Really Destroy Bitcoin? A Look at the Future of Crypto, According to Quantum Physicist Anastasia Marchenkova - The Daily Hodl

Written by admin

August 23rd, 2020 at 10:57 pm

Posted in Quantum Computing

This Week’s Awesome Tech Stories From Around the Web (Through August 22) – Singularity Hub

Posted: at 10:57 pm

without comments


IBM Doubles Its Quantum Computer Performance Stephen Shankland | CNET There's now a race afoot to make the fastest quantum computer. What makes the quantum computing competition different from most in the industry is that rivals are taking wildly different approaches. It's like a race pitting a horse against a car against an airplane against a bicycle.

750 Million Genetically Engineered Mosquitos Approved for Release in Florida Keys Sandee LaMotte | CNN The pilot project is designed to test if a genetically modified mosquito is a viable alternative to spraying insecticides to control the Aedes aegypti. It's a species of mosquito that carries several deadly diseases, such as Zika, dengue, chikungunya, and yellow fever.

A Rocket Scientist's Love Algorithm Adds Up During Covid-19 Stephen Marche | Wired Online dating is way up, with more than half of users saying they have been on their dating apps more during lockdown than before. Just as local businesses had to rush onto delivery platforms, and offices had to figure out Zoom meeting schedules, so the hard realities of the disease have pushed love in the direction it was already going: fully online.

How a Designer Used AI and Photoshop to Bring Ancient Roman Emperors Back to Life James Vincent | The Verge Machine learning is a fantastic tool for renovating old photos and videos. So much so that it can even bring ancient statues to life, transforming the chipped stone busts of long-dead Roman emperors into photorealistic faces you could imagine walking past on the street.

What If We Could Live for a Million Years? Avi Loeb | Scientific American With advances in bioscience and technology, one can imagine a post-Covid-19 future when most diseases are cured and our life span will increase substantially. If that happens, how would our goals change, and how would this shape our lives?

A Radical New Model of the Brain Illuminates Its Wiring Grace Huckins | Wired The brain literally is a network, agrees Olaf Sporns, a professor of psychological and brain sciences at Indiana University. It's not a metaphor. I'm not comparing apples and oranges. I think this is literally what it is. And if network neuroscience can produce a clearer, more accurate picture of the way that the brain truly works, it may help us answer questions about cognition and health that have bedeviled scientists since Broca's time.

How Life Could Continue to Evolve Caleb Scharf | Nautilus The ultimate currency of life in the universe may be life itself: the marvelous genetic surprises that biological and technological Darwinian experimentation can come up with given enough diversity of circumstances and time. Perhaps, in the end, our galaxy, and even our universe, is simply the test tube for a vast chemical computation exploring a mathematical terrain of possibilities that stretches on to infinity.

British Grading Debacle Shows Pitfalls of Automating Government Adam Satariano | The New York Times Those who have called for more scrutiny of the British government's use of technology said the testing scandal was a turning point in the debate, a vivid and easy-to-understand example of how software can affect lives.

Image credit: ESA/Hubble & NASA, J. Lee and the PHANGS-HST Team; Acknowledgment: Judy Schmidt (Geckzilla)

More here:

This Week's Awesome Tech Stories From Around the Web (Through August 22) - Singularity Hub

Written by admin

August 23rd, 2020 at 10:57 pm

Posted in Quantum Computing

A Meta-Theory of Physics Could Explain Life, the Universe, Computation, and More – Gizmodo

Posted: at 10:57 pm

without comments

You may think of physics as a way to explain the behaviors of things like black holes, colliding particles, falling apples, and quantum computers. But a small group of physicists today is working on a theory that doesn't just study individual phenomena; it's an entirely new way to describe the universe itself. This theory might solve wide-ranging problems such as why biological evolution is possible and how abstract things like ideas and information seem to possess properties that are independent of any physical system. It's called constructor theory, but as fascinating as it is, there's one glaring problem: how to test it.

"When I first learned of constructor theory, it seemed too bold to be true," said Abel Jansma, a graduate student in physics and genetics at the University of Edinburgh. "The early papers covered life, thermodynamics, and information, which seemed to be too much groundwork for such a young theory. But maybe it's natural to work through the theory in this way. As an outsider, it's exciting to watch."

As a young physics researcher in the 2010s, Chiara Marletto had been interested in problems regarding biological processes. The laws of physics do not say anything about the possibility of life, yet even a slight tweak of any of the constants of physics would render life as we know it impossible. So why is evolution by natural selection possible in the first place? No matter how long you stared at the equations of physics, it would never dawn on you that they allow for biological evolution, and yet, apparently, they do.

Marletto was dissatisfied by this paradox. She wanted to explain why the emergence and evolution of life is possible when the laws of physics contain no hints that it should be. She came across a 2013 paper written by Oxford physicist and quantum computing pioneer David Deutsch, in which he laid the foundation for constructor theory, the fundamental principle of which is that all other laws of physics are expressible entirely in terms of statements about which physical transformations are possible and which are impossible, and why.

Marletto said she suspected that constructor theory had a useful set of tools to address this problem of why evolution is possible despite the laws of physics not explicitly encoding the design of biological adaptations. Intrigued by the possibilities, Marletto soon shifted the focus of her PhD research to constructor theory.

While many theories are concerned with what does happen, constructor theory is about what can possibly happen. In the current paradigm of physics, one seeks to predict the trajectory of, say, a wandering comet, given its initial state and general relativity's equations of motion. Constructor theory, meanwhile, is more general and seeks to explain which trajectories of said comet are possible in principle. For instance, no trajectory in which the comet's velocity exceeds the speed of light is possible, but trajectories in which its velocity remains below this limit are possible, provided that they are also consistent with the laws of relativity.

The prevailing theories of physics today can explain things as titanically violent as the collision of two black holes, but they struggle to explain how and why a tree exists. Because constructor theory is concerned with what can possibly happen, it can explain regularities (any patterns that warrant explanation) in domains that are inherently unpredictable, such as evolution.

Constructor theory can also capture properties of information, which do not depend on the physical system in which they exist: the same song lyrics can be sent over radio waves, conjured in one's mind, or written on a piece of paper, for example. The constructor theory of information also proposes new principles that explain which transformations of information are possible and impossible, and why.

The laws of thermodynamics, too, have been expressed exactly in constructor theory; previously, they'd only been stated as approximations that apply at certain scales. For example, in attempting to capture the Second Law of Thermodynamics (that the entropy of isolated systems can never decrease over time), some models show that a physical system will reach eventual equilibrium (maximum entropy) because that is the most probable configuration of the system. But the scale at which these configurations are measured has traditionally been arbitrary. Would such models work for systems at the nanoscale, or for systems that are composed of merely one particle? By recasting the laws of thermodynamics in terms of possible and impossible transformations, rather than in terms of the time evolution of a physical system, constructor theory has expressed these laws in exact, scale-independent statements. It describes the Second Law of Thermodynamics as allowing some transformation from X to Y to be possible but not its inverse: work can be entirely converted into heat, but heat can never be entirely converted into work without side effects.

Physics has come a long way since the days of the Scientific Revolution. In 1687, Isaac Newton proposed his universal physical theory in his magnum opus, Principia Mathematica. Newton's theory, called classical mechanics, was founded on his famous three laws of motion. It implies that if one knows both the force acting on a system for some time interval as well as the system's initial velocity and position, then one can use classical mechanics' equations of motion to predict the system's velocity and position at any subsequent moment in that time interval. In the first few decades of the 20th century, classical mechanics was shown to be wrong from two directions. Quantum mechanics overturned Newton in explaining the physics of the microscopic world. Einstein's general relativity superseded classical mechanics and deepened our understanding of gravity and the nature of mass, space, and time. Although the details differ between the three theories (classical mechanics, quantum mechanics, and general relativity), they are all nevertheless expressible in terms of initial conditions and dynamical laws of motion that allow one to predict the state of a system's trajectory across time. This general framework is known as the prevailing conception.

But there are many domains in which our best theories are simply not expressible in terms of the prevailing conception of initial conditions plus laws of motion. For instance, quantum computation's laws are not fundamentally about what happens in a quantum system following some initial state but rather about what transformations of information are possible and impossible. The problem of whether or not a so-called universal quantum computer (a quantum computer capable of simulating any physical system to arbitrary accuracy) can possibly be built is utterly foreign to the initial-conditions-plus-laws-of-motion framework. Even in cosmology, the well-known problem of explaining the initial conditions of the universe is difficult in the prevailing conception: we can work backward to understand what happened in the moments after the Big Bang, but we have no explanation for why the universe was in its particular initial state rather than any other. Constructor theory, though, may be able to show that the initial conditions of our universe, at the moment of the Big Bang, can be deduced from the theory's principles. If you only think of physics in terms of the prevailing conception, problems in quantum computation, biology, and the creation of the universe can seem impossible to solve.

The basic ingredients of constructor theory are the constructor, the input substrate, and the output substrate. The constructor is any object that is capable of causing a particular physical transformation and retains its ability to do so again. The input substrate is the physical system that is presented to the constructor, and the output substrate is the physical system that results from the constructors transformation of the input.

For a simple example of how constructor theory might describe a system, consider a smoothie blender. This device takes in ingredients such as milk, fruits, and sugar and outputs a drink in completed, homogenized form. The blender is a constructor, as it is capable of repeating this transformation again and again. The input substrate is the set of ingredients, and the output substrate is the smoothie.
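As a rough illustration (an informal sketch in code, not notation from the constructor-theory literature), the blender example can be phrased as a reusable object: each call maps an input substrate to an output substrate, and the constructor remains able to act again afterward.

```python
# Informal sketch of the constructor / input substrate / output substrate split.
class Blender:
    """A constructor: causes a transformation and retains the
    ability to cause it again."""
    def transform(self, ingredients):
        # input substrate (ingredients) -> output substrate (a smoothie)
        return "smoothie of " + ", ".join(sorted(ingredients))

blender = Blender()                                     # the constructor
first = blender.transform({"milk", "banana", "sugar"})  # one transformation
second = blender.transform({"milk", "mango", "sugar"})  # reusable: acts again
print(first)
print(second)
```

The point of the abstraction is that nothing downstream depends on what the constructor is made of, only on which transformations it can cause, which is exactly the move the theory makes.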

A more cosmic example is our Sun. The Sun acts as a nuclear fusion reactor that takes hydrogen as its input substrate and converts it into helium and light as its output substrate. The Sun itself is the constructor, as it retains its ability to cause another such conversion.

In the prevailing conception, one might take the Sun's initial state and run it through the appropriate algorithm, which would yield a prediction of the Sun's end state once it has run out of fuel. In constructor theory, one instead expresses that the transformation of hydrogen into helium and light is possible. Once it's known that the transformation from hydrogen to helium and light is possible, it follows that a constructor that can cause such a transformation is also possible.

Constructor theory's fundamental principle implies that all laws of physics (those of general relativity, thermodynamics, quantum mechanics, and even information) can be expressed in terms of which physical transformations are possible in principle and which are not.

This setup is, perhaps counterintuitively, extremely general. It includes a chemical reaction in the presence of a catalyst: the chemical catalyst is the constructor, while the reactants are the input substrate and the products are the output substrate. The operation of a computer is also a kind of construction: the computer (and its program) is a constructor, and the informational input and output correspond to constructor theory's input substrate and output substrate. A heat engine is yet another kind of constructor, and so are all forms of self-reproducing life. Think of a bacterium with some genetic code. The cell along with its code is a kind of constructor whose output is an offspring cell with a copy of the parent cell's genetic code.

Because explaining which transformations are possible and which are impossible never relies on the particular form that a constructor takes, it can be abstracted away, leaving statements about transformations as the main focus of constructor theory. This is already extremely advantageous, since, for instance, one could express which computer programs or simulations are realizable and which are not in principle, without having to worry about the details of the computer itself.

How could one show that the evolution of life, with all of its elegant adaptations and appearance of design, is compatible with the laws of physics, which seem to contain no design whatsoever? No amount of inspection of the equations of general relativity and quantum mechanics would result in a eureka moment; they show no hint of the possibility of life. Darwin's theory of evolution by natural selection explains the appearance of design in the biosphere, but it fails to explain why such a process is possible in the first place.

Biological evolution is understood today as a process whereby genes propagate over generations by replicating themselves at the expense of rival, alternative genes called alleles. Furthermore, genes have evolved complex vehicles for themselves that they use to reproduce, such as cells and organisms, including you. The biologist Richard Dawkins is famous for, among other things, popularizing this view of evolution: genes are the fundamental unit of natural selection, and they strive for immortality by copying themselves as strands of DNA, using temporary, protective vehicles to proliferate from generation to generation. Copying is imperfect, which results in genetic mutations and therefore variation in the ability of genes to spread in this great competition with their rivals. The environment of the genes is the arbiter that determines which genes are best able to spread and which are unfit to do so, and it is therefore the source of natural selection.

With this replicator-vehicle logic in mind, one can state the problem more precisely: The laws of physics do not make explicit that the transformations required by evolution and by biological adaptations are possible. Given this, what properties must the laws of physics possess to allow for such a process that demands self-reproduction, the appearance of design, and natural selection?

Note that this question cannot be answered in the prevailing conception, which would force us to try to predict the emergence of life following, say, the initial conditions of the universe. Constructor theory allows us to reframe the problem and consider why and under what conditions life is possible. As Marletto put it in a 2014 paper, the prevailing conception could at most predict the exact number of goats that will (or will probably) appear on Earth given certain initial conditions; in constructor theory, one states instead whether goats are possible and why.

Marletto's paper, Constructor Theory of Life, was published just two years after Deutsch's initial paper. In it, she shows that the evolution of life is compatible with laws of physics that themselves contain no design, provided that they allow for the embodiment of digital information (on Earth, this takes the form of DNA). She also shows that an accurate replicator, such as survivable genes, must use vehicles in order to evolve. In this sense, if constructor theory is true, then temporary vehicles are not merely a contingency of life on our planet but rather mandated by the laws of nature. One interesting prediction that bears on the search for extraterrestrial life is that wherever you find life in the universe, it will necessarily rely on replicators and vehicles. Of course, these may not be the DNA, cells, and organisms with which we are familiar, but replicators and vehicles will be present in some arrangement.

You can think of constructor theory as a theory about theories. By contrast, general relativity explains and predicts the motions of objects as they interact with each other and the arena of space-time. Such a theory can be called an object-level theory. Constructor theory, on the other hand, is a meta-level theory: its statements are laws about laws. So while general relativity mandates the behavior of all stars, both those we've observed and those that we've never seen, constructor theory mandates that all object-level theories, both current and future, conform to its meta-level laws, also called principles. With hindsight, we can see that scientists have already taken such principles seriously, even before the dawn of constructor theory. For example, physicists expect that all as-yet unknown physical theories will conform to the principle of conservation of energy.

General relativity can be tested by observing the motions of stars and galaxies; quantum mechanics can be tested in laboratories like the Large Hadron Collider. But since constructor theory principles do not make direct predictions about the motion of physical systems, how could one test them? Vlatko Vedral, Oxford physicist and professor of quantum information science, has been collaborating with Marletto to do exactly that, by imagining laboratory experiments in which quantum mechanical systems could interact with gravity.

One of the greatest outstanding problems in modern physics is that general relativity and quantum mechanics are incompatible with each other: general relativity does not explain the tiny motions and interactions of atoms, while quantum mechanics does not explain gravity or its effects on massive objects. All sorts of proposals have been formulated that might unify the two pillars under a deeper theory that contains both of them, but these are notoriously difficult to test experimentally. However, one could get around directly testing such theories by instead considering the principles to which they should conform.

In 2014, Marletto and Deutsch published a paper outlining the constructor theory of information, in which they expressed quantities such as information, computation, measurement, and distinguishability in terms of possible and impossible transformations. Importantly, they also showed that all of the accepted features of quantum information follow from their proposed constructor-theoretic principles. An information medium is a physical system in which information is substantiated, such as a computer or a brain. An observable is any physical quantity that can be measured. They defined a superinformation medium as an information medium with at least two information observables whose union is not an information observable. For example, in quantum theory, one can measure exactly a particle's velocity or its position, but never both simultaneously. Quantum information is an example of superinformation. But crucially, the constructor-theoretic concept of superinformation is more general and is expected to hold for any theories that supersede quantum theory and general relativity as well.
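That "never both simultaneously" property has a standard textbook signature: incompatible observables do not commute. A quick numerical check with the Pauli X and Z matrices (a generic qubit example, not taken from the constructor-theory papers):

```python
import numpy as np

# Pauli X and Z: two qubit observables that cannot both be
# measured sharply at the same time.
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

commutator = X @ Z - Z @ X
print(commutator)  # nonzero, so X and Z share no common eigenbasis
```

A nonzero commutator means the two observables have no shared set of definite-valued states, which is the quantum fact the superinformation definition generalizes.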

In a working paper from March 2020, Marletto and Vedral showed that if the constructor-theoretic principles of information are correct, then if two quantum systems, such as two masses, become entangled with each other via a third system, such as a gravitational field, then this third system must itself be quantum (one of their earlier publications on the problem can be found here). So, if one could construct an experiment in which a gravitational field can locally generate entanglement between, say, two qubits, then gravity must be non-classical: it would have two observables that cannot simultaneously be measured with the same precision, as is the case in quantum theory. If such an experiment were to show no entanglement between the qubits, then constructor theory would require an overhaul, or it may be outright false.

Should the experiment show entanglement between the two masses, all current attempts to unify general relativity and quantum mechanics that assume that gravity is classical would be ruled out.
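As an illustrative aside (not part of Marletto and Vedral's paper), whether a two-qubit state is entangled can be checked numerically with the standard Peres-Horodecki partial-transpose criterion. A sketch comparing a maximally entangled Bell state with an unentangled product state:

```python
import numpy as np

def negativity(rho):
    """Entanglement measure for two qubits: sum of the negative
    eigenvalues of the partial transpose (nonzero iff entangled)."""
    # Reshape to (a, b, c, d) and transpose the second subsystem (b <-> d)
    rho_pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)
    eigs = np.linalg.eigvalsh(rho_pt)
    return -eigs[eigs < 0].sum()

# Maximally entangled Bell state (|00> + |11>)/sqrt(2)
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_bell = np.outer(bell, bell.conj())

# Unentangled product state |00>
prod = np.array([1, 0, 0, 0])
rho_prod = np.outer(prod, prod.conj())

print(negativity(rho_bell) > 1e-9)   # True: entangled
print(negativity(rho_prod) > 1e-9)   # False: separable
```

In a gravitationally mediated entanglement test, finding a nonzero value of a witness like this between the two masses would be the signature that rules out classical gravity.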

"There are three versions of how gravity could be made consistent with quantum physics," said Vedral. "One of them is to have a fully quantum gravity." Theories that propose fully quantum gravity include loop quantum gravity, the idea that space is composed of loops of gravitational fields, and string theory, the idea that particles are made up of strings, which move through space and some of whose vibrations correspond to quantum mechanical particles that carry gravitational force.

"These would be consistent with a positive outcome of our proposed experiment," said Vedral. "The ones that would be refuted are the so-called semi-classical theories, such as what's called quantum theory in curved space-time. There is a whole range of these theories. All of them would be ruled out: it would be inconsistent to think of space-time as classical if it's really capable of producing entanglement between two massive particles."

Marletto and Vedral's proposed experiment, unfortunately, faces some major practical challenges.

"I think our experiment is still five or six orders of magnitude away from current technological capabilities," said Vedral. "One issue is that we need to eliminate any sources of noise, like induced electromagnetic interaction... The other issue is that it's very hard to create a near-perfect vacuum. If you have a background bunch of molecules around objects that you want to entangle, even a single collision between one of the background molecules and one of the objects you wish to entangle could be detrimental and cause decoherence. The vacuum has to be so close to perfect as to guarantee that not a single atomic collision happens during the experiment."

Vedral came to constructor theory as an interested outsider, having focused primarily on issues of quantum information. He sometimes thinks about the so-called universal constructor, a theoretical device that is capable of performing all possible tasks that the laws of physics allow.

"While we have models of the universal computer, meaning ideas of how to make a computer that can simulate any physical system, we have no such thing for the universal constructor. A breakthrough might be a set of axioms that capture what it means to be a universal constructor. This is a big open problem. What kind of machine would that be? This excites me a lot. It's a wide-open field. If I was a young researcher, I would jump on that now. It feels like the next revolution."

Samuel Kuypers, a physics graduate student at the University of Oxford who works in the field of quantum information, said that constructor theory "has unequivocally achieved great successes already, such as grounding concepts of information in exact physical terms and rigorously explaining the difference between heat and work in thermodynamics," but it should be judged as "an ongoing project with a set of aims and problems." Thinking of potential future achievements, Kuypers hopes that general relativity can be reformulated in constructor-theoretic terms, "which I think would be extremely fruitful for trying to unify general relativity and quantum mechanics."

Time will tell whether or not constructor theory is a revolution in the making. In the few years since its inception, only a handful of physicists, primarily at Oxford University, have been working on it. Constructor theory is of a different character than other speculative theories, like string theory. It is an entirely different way of thinking about the nature of reality, and its ambitions are perhaps even bolder than those of the more mainstream speculations. If constructor theory continues to solve problems, then physicists may come to adopt a revolutionary new worldview. They will think of reality not as a machine that behaves predictably according to laws of motion, but as a cosmic ocean full of resources capable of being transformed by an appropriate constructor. It would be a reality defined by possibility rather than destiny.

Logan Chipkin is a freelance writer in Philadelphia. His writing focuses on science, philosophy, economics, and history. Links to previous publications can be found at . Follow him on Twitter @ChipkinLogan.

Read the rest here:

A Meta-Theory of Physics Could Explain Life, the Universe, Computation, and More - Gizmodo

Written by admin

August 23rd, 2020 at 10:57 pm

Posted in Quantum Computing

This Twist on Schrödinger's Cat Paradox Has Major Implications for Quantum Theory – Scientific American


What does it feel like to be both alive and dead?

That question irked and inspired Hungarian-American physicist Eugene Wigner in the 1960s. He was frustrated by the paradoxes arising from the vagaries of quantum mechanics, the theory governing the microscopic realm that suggests, among many other counterintuitive things, that until a quantum system is observed, it does not necessarily have definite properties. Take his fellow physicist Erwin Schrödinger's famous thought experiment, in which a cat is trapped in a box with poison that will be released if a radioactive atom decays. Radioactivity is a quantum process, so before the box is opened, the story goes, the atom has both decayed and not decayed, leaving the unfortunate cat in limbo, a so-called superposition between life and death. But does the cat experience being in superposition?

Wigner sharpened the paradox by imagining a (human) friend of his shut in a lab, measuring a quantum system. He argued it was absurd to say his friend exists in a superposition of having seen and not seen a decay unless and until Wigner opens the lab door. "The 'Wigner's friend' thought experiment shows that things can become very weird if the observer is also observed," says Nora Tischler, a quantum physicist at Griffith University in Brisbane, Australia.

Now Tischler and her colleagues have carried out a version of the Wigner's friend test. By combining the classic thought experiment with another quantum head-scratcher called entanglement, a phenomenon that links particles across vast distances, they have also derived a new theorem, which they claim puts the strongest constraints yet on the fundamental nature of reality. Their study, which appeared in Nature Physics on August 17, has implications for the role that consciousness might play in quantum physics, and even for whether quantum theory must be replaced.

The new work is "an important step forward in the field of experimental metaphysics," says quantum physicist Aephraim Steinberg of the University of Toronto, who was not involved in the study. "It's the beginning of what I expect will be a huge program of research."

Until quantum physics came along in the 1920s, physicists expected their theories to be deterministic, generating predictions for the outcome of experiments with certainty. But quantum theory appears to be inherently probabilistic. The textbook version, sometimes called the Copenhagen interpretation, says that until a system's properties are measured, they can encompass myriad values. This superposition only collapses into a single state when the system is observed, and physicists can never precisely predict what that state will be. Wigner held the then popular view that consciousness somehow triggers a superposition to collapse. Thus, his hypothetical friend would discern a definite outcome when she or he made a measurement, and Wigner would never see her or him in superposition.
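The probabilistic character described above can be sketched in a few lines: the Born rule fixes only the outcome statistics, never any individual result. (The amplitudes below are arbitrary illustrative values, not from the study.)

```python
import numpy as np

rng = np.random.default_rng(42)

# A system in superposition: amplitudes for "decayed" and "not decayed"
psi = np.array([0.6, 0.8])      # any normalized amplitudes
probs = np.abs(psi) ** 2        # Born rule: [0.36, 0.64]

# Each observation yields one definite outcome; only the statistics
# over many repeated runs are predictable.
outcomes = rng.choice([0, 1], size=100_000, p=probs)
print(outcomes.mean())  # ~0.64
```

No amount of extra information within standard quantum theory lets you predict which outcome a single run will give, which is exactly the feature that troubled the determinists.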

This view has since fallen out of favor. "People in the foundations of quantum mechanics rapidly dismiss Wigner's view as spooky and ill-defined because it makes observers special," says David Chalmers, a philosopher and cognitive scientist at New York University. Today most physicists concur that inanimate objects can knock quantum systems out of superposition through a process known as decoherence. Certainly, researchers attempting to manipulate complex quantum superpositions in the lab can find their hard work destroyed by speedy air particles colliding with their systems. So they carry out their tests at ultracold temperatures and try to isolate their apparatuses from vibrations.
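A toy model of decoherence (purely illustrative, with an arbitrary damping rate) shows the mechanism: interaction with the environment suppresses the off-diagonal "coherence" terms of a superposition's density matrix, leaving an effectively classical mixture.

```python
import numpy as np

# Equal superposition (|0> + |1>)/sqrt(2) as a density matrix;
# the off-diagonal elements encode the superposition.
psi = np.array([1, 1]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def decohere(rho, gamma):
    """Dephasing: damp off-diagonal elements by exp(-gamma), a simple
    model of the environment monitoring the system in one basis."""
    damp = np.array([[1.0, np.exp(-gamma)], [np.exp(-gamma), 1.0]])
    return rho * damp

rho_late = decohere(rho, gamma=10.0)
# Populations survive, coherences vanish: a classical-looking mixture
print(np.round(rho_late, 3))
```

This is why colliding air molecules are so destructive to lab superpositions: each collision effectively increases the damping, driving the state toward the diagonal.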

Several competing quantum interpretations have sprung up over the decades that employ less mystical mechanisms, such as decoherence, to explain how superpositions break down without invoking consciousness. Other interpretations hold the even more radical position that there is no collapse at all. Each has its own weird and wonderful take on Wigner's test. The most exotic is the "many worlds" view, which says that whenever you make a quantum measurement, reality fractures, creating parallel universes to accommodate every possible outcome. Thus, Wigner's friend would split into two copies, and "with good enough supertechnology, he could indeed measure that person to be in superposition from outside the lab," says quantum physicist and many-worlds fan Lev Vaidman of Tel Aviv University.

The alternative Bohmian theory (named for physicist David Bohm) says that at the fundamental level, quantum systems do have definite properties; we just do not know enough about those systems to precisely predict their behavior. In that case, the friend has a single experience, but Wigner may still measure that individual to be in a superposition because of his own ignorance. In contrast, a relative newcomer on the block called the QBism interpretation embraces the probabilistic element of quantum theory wholeheartedly. (QBism, pronounced "cubism," is actually short for quantum Bayesianism, a reference to 18th-century mathematician Thomas Bayes's work on probability.) QBists argue that a person can only use quantum mechanics to calculate how to calibrate his or her beliefs about what he or she will measure in an experiment. "Measurement outcomes must be regarded as personal to the agent who makes the measurement," says Ruediger Schack of Royal Holloway, University of London, who is one of QBism's founders. According to QBism's tenets, quantum theory cannot tell you anything about the underlying state of reality, nor can Wigner use it to speculate on his friend's experiences.

Another intriguing interpretation, called retrocausality, allows events in the future to influence the past. "In a retrocausal account, Wigner's friend absolutely does experience something," says Ken Wharton, a physicist at San Jose State University, who is an advocate for this time-twisting view. But that "something" the friend experiences at the point of measurement can depend upon Wigner's choice of how to observe that person later.

The trouble is that each interpretation is equally good, or bad, at predicting the outcome of quantum tests, so choosing between them comes down to taste. "No one knows what the solution is," Steinberg says. "We don't even know if the list of potential solutions we have is exhaustive."

Other models, called collapse theories, do make testable predictions. These models tack on a mechanism that forces a quantum system to collapse when it gets too big, explaining why cats, people and other macroscopic objects cannot be in superposition. Experiments are underway to hunt for signatures of such collapses, but as yet they have not found anything. Quantum physicists are also placing ever larger objects into superposition: last year a team in Vienna reported doing so with a 2,000-atom molecule. Most quantum interpretations say there is no reason why these efforts to supersize superpositions should not continue upward forever, presuming researchers can devise the right experiments in pristine lab conditions so that decoherence can be avoided. Collapse theories, however, posit that a limit will one day be reached, regardless of how carefully experiments are prepared. "If you try and manipulate a classical observer, a human, say, and treat it as a quantum system, it would immediately collapse," says Angelo Bassi, a quantum physicist and proponent of collapse theories at the University of Trieste in Italy.

Tischler and her colleagues believed that analyzing and performing a Wigner's friend experiment could shed light on the limits of quantum theory. They were inspired by a new wave of theoretical and experimental papers that have investigated the role of the observer in quantum theory by bringing entanglement into Wigner's classic setup. Say you take two particles of light, or photons, that are polarized so that they can vibrate horizontally or vertically. The photons can also be placed in a superposition of vibrating both horizontally and vertically at the same time, just as Schrödinger's paradoxical cat can be both alive and dead before it is observed.

Such pairs of photons can be prepared together, entangled, so that their polarizations are always found to be in the opposite direction when observed. That may not seem strange, unless you remember that these properties are not fixed until they are measured. Even if one photon is given to a physicist called Alice in Australia, while the other is transported to her colleague Bob in a lab in Vienna, entanglement ensures that as soon as Alice observes her photon and, for instance, finds its polarization to be horizontal, the polarization of Bob's photon instantly syncs to vibrating vertically. Because the two photons appear to communicate faster than the speed of light, something prohibited by his theories of relativity, this phenomenon deeply troubled Albert Einstein, who dubbed it "spooky action at a distance."
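The perfect anticorrelation of such a pair, measured in a shared basis, can be simulated directly. Here is an illustrative sketch (a standard polarization singlet state, not the study's exact preparation) that samples joint outcomes via the Born rule:

```python
import numpy as np

rng = np.random.default_rng(0)

# Singlet state: polarizations always opposite in the same basis.
# Amplitudes over the joint basis {HH, HV, VH, VV}:
singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)
probs = np.abs(singlet) ** 2    # Born rule: joint outcome probabilities

outcomes = rng.choice(4, size=1000, p=probs)
alice = outcomes // 2           # 0 = horizontal, 1 = vertical
bob = outcomes % 2
print(np.all(alice != bob))     # True: perfectly anticorrelated
```

Each individual outcome is random, yet Alice's and Bob's results never agree; the puzzle Bell later formalized is whether any pre-agreed local recipe could reproduce such correlations at all measurement angles.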

These concerns remained theoretical until the 1960s, when physicist John Bell devised a way to test if reality is truly spooky, or if there could be a more mundane explanation behind the correlations between entangled partners. Bell imagined a commonsense theory that was local, that is, one in which influences could not travel between particles instantly. It was also deterministic rather than inherently probabilistic, so experimental results could, in principle, be predicted with certainty, if only physicists understood more about the systems' hidden properties. And it was realistic, which, to a quantum theorist, means that systems would have these definite properties even if nobody looked at them. Then Bell calculated the maximum level of correlations between a series of entangled particles that such a local, deterministic and realistic theory could support. If that threshold was violated in an experiment, then one of the assumptions behind the theory must be false.
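Bell's threshold is easiest to see in the CHSH form of his inequality, where any local, deterministic, realistic theory obeys |S| <= 2, while quantum mechanics predicts values up to 2*sqrt(2). A sketch using the textbook singlet correlation E(a, b) = -cos(2(a - b)) for polarization analyzers at the standard angle choices (an illustration of the bound, not this experiment's settings):

```python
import numpy as np

def correlation(theta_a, theta_b):
    """Quantum prediction E(a, b) = -cos(2(a - b)) for polarization
    measurements on a singlet pair at analyzer angles theta_a, theta_b."""
    return -np.cos(2 * (theta_a - theta_b))

# Standard CHSH angle choices (radians)
a1, a2 = 0.0, np.pi / 4
b1, b2 = np.pi / 8, 3 * np.pi / 8

S = abs(correlation(a1, b1) - correlation(a1, b2)
        + correlation(a2, b1) + correlation(a2, b2))
print(round(S, 3))  # 2.828: beyond the local bound of 2, at 2*sqrt(2)
```

Any measured S above 2 means at least one of locality, determinism, or realism must be abandoned, which is exactly what the watertight Bell tests confirmed.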

Such Bell tests have since been carried out, with a series of watertight versions performed in 2015, and they have confirmed reality's spookiness. "Quantum foundations is a field that was really started experimentally by Bell's [theorem], now over 50 years old. And we've spent a lot of time reimplementing those experiments and discussing what they mean," Steinberg says. "It's very rare that people are able to come up with a new test that moves beyond Bell."

The Brisbane team's aim was to derive and test a new theorem that would do just that, providing even stricter constraints, "local friendliness" bounds, on the nature of reality. Like Bell's theory, the researchers' imaginary one is local. They also explicitly ban superdeterminism, that is, they insist that experimenters are free to choose what to measure without being influenced by events in the future or the distant past. (Bell implicitly assumed that experimenters can make free choices, too.) Finally, the team prescribes that when an observer makes a measurement, the outcome is a real, single event in the world; it is not relative to anyone or anything.

Testing local friendliness requires a cunning setup involving two "superobservers," Alice and Bob (who play the role of Wigner), watching their friends Charlie and Debbie. Alice and Bob each have their own interferometer, an apparatus used to manipulate beams of photons. Before being measured, the photons' polarizations are in a superposition of being both horizontal and vertical. Pairs of entangled photons are prepared such that if the polarization of one is measured to be horizontal, the polarization of its partner should immediately flip to be vertical. One photon from each entangled pair is sent into Alice's interferometer, and its partner is sent to Bob's. Charlie and Debbie are not actually human friends in this test. Rather, they are beam displacers at the front of each interferometer. When Alice's photon hits the displacer, its polarization is effectively measured, and it swerves either left or right, depending on the direction of the polarization it snaps into. This action plays the role of Alice's friend Charlie measuring the polarization. (Debbie similarly resides in Bob's interferometer.)

Alice then has to make a choice: She can measure the photon's new deviated path immediately, which would be the equivalent of opening the lab door and asking Charlie what he saw. Or she can allow the photon to continue on its journey, passing through a second beam displacer that recombines the left and right paths, the equivalent of keeping the lab door closed. Alice can then directly measure her photon's polarization as it exits the interferometer. Throughout the experiment, Alice and Bob independently choose which measurement choices to make and then compare notes to calculate the correlations seen across a series of entangled pairs.

Tischler and her colleagues carried out 90,000 runs of the experiment. As expected, the correlations violated Bell's original bounds, and crucially, they also violated the new local-friendliness threshold. The team could also modify the setup to tune down the degree of entanglement between the photons by sending one of the pair on a detour before it entered its interferometer, gently perturbing the perfect harmony between the partners. When the researchers ran the experiment with this slightly lower level of entanglement, they found a point where the correlations still violated Bell's bound but not local friendliness. This result proved that the two sets of bounds are not equivalent and that the new local-friendliness constraints are stronger, Tischler says. "If you violate them, you learn more about reality," she adds. Namely, if your theory says that "friends" can be treated as quantum systems, then you must either give up locality, accept that measurements do not have a single result that observers must agree on, or allow superdeterminism. Each of these options has profound, and to some physicists distinctly distasteful, implications.

The paper is "an important philosophical study," says Michele Reilly, co-founder of Turing, a quantum-computing company based in New York City, who was not involved in the work. She notes that physicists studying quantum foundations have often struggled to come up with a feasible test to back up their big ideas. "I am thrilled to see an experiment behind philosophical studies," Reilly says. Steinberg calls the experiment "extremely elegant" and praises the team for tackling the mystery of the observer's role in measurement head-on.

Although it is no surprise that quantum mechanics forces us to give up a commonsense assumption (physicists knew that from Bell), "the advance here is that we are narrowing in on which of those assumptions it is," says Wharton, who was also not part of the study. Still, he notes, proponents of most quantum interpretations will not lose any sleep. Fans of retrocausality, such as himself, have already made peace with superdeterminism: in their view, it is not shocking that future measurements affect past results. Meanwhile QBists and many-worlds adherents long ago threw out the requirement that quantum mechanics prescribes a single outcome that every observer must agree on.

And both Bohmian mechanics and spontaneous collapse models already happily ditched locality in response to Bell. Furthermore, collapse models say that a real macroscopic friend cannot be manipulated as a quantum system in the first place.

Vaidman, who was also not involved in the new work, is less enthused by it, however, and criticizes the identification of Wigner's friend with a photon. "The methods used in the paper are ridiculous; the friend has to be macroscopic," he says. Philosopher of physics Tim Maudlin of New York University, who was not part of the study, agrees. "Nobody thinks a photon is an observer, unless you are a panpsychic," he says. Because no physicist questions whether a photon can be put into superposition, Maudlin feels the experiment lacks bite. "It rules something out, just something that nobody ever proposed," he says.

Tischler accepts the criticism. "We don't want to overclaim what we have done," she says. The key for future experiments will be scaling up the size of the "friend," adds team member Howard Wiseman, a physicist at Griffith University. The most dramatic result, he says, would involve using an artificial intelligence, embodied on a quantum computer, as the friend. Some philosophers have mused that such a machine could have humanlike experiences, a position known as the strong AI hypothesis, Wiseman notes, though nobody yet knows whether that idea will turn out to be true. But if the hypothesis holds, this quantum-based artificial general intelligence (AGI) would be microscopic. So from the point of view of spontaneous collapse models, it would not trigger collapse because of its size. If such a test were run, and the local-friendliness bound was not violated, that result would imply that an AGI's consciousness cannot be put into superposition. In turn, that conclusion would suggest that Wigner was right that consciousness causes collapse. "I don't think I will live to see an experiment like this," Wiseman says. "But that would be revolutionary."

Reilly, however, warns that physicists hoping that future AGI will help them home in on the fundamental description of reality are putting the cart before the horse. "It's not inconceivable to me that quantum computers will be the paradigm shift to get us into AGI," she says. "Ultimately, we need a theory of everything in order to build an AGI on a quantum computer, period, full stop."

That requirement may rule out more grandiose plans. But the team also suggests more modest intermediate tests involving machine-learning systems as friends, which appeals to Steinberg. That approach "is interesting and provocative," he says. "It's becoming conceivable that larger- and larger-scale computational devices could, in fact, be measured in a quantum way."

Renato Renner, a quantum physicist at the Swiss Federal Institute of Technology Zurich (ETH Zurich), makes an even stronger claim: regardless of whether future experiments can be carried out, he says, the new theorem tells us that quantum mechanics needs to be replaced. In 2018 Renner and his colleague Daniela Frauchiger, then at ETH Zurich, published a thought experiment based on Wigner's friend and used it to derive a new paradox. Their setup differs from that of the Brisbane team but also involves four observers whose measurements can become entangled. Renner and Frauchiger calculated that if the observers apply quantum laws to one another, they can end up inferring different results in the same experiment.

"The new paper is another confirmation that we have a problem with current quantum theory," says Renner, who was not involved in the work. He argues that none of today's quantum interpretations can worm their way out of the so-called Frauchiger-Renner paradox without proponents admitting they do not care whether quantum theory gives consistent results. QBists offer the most palatable means of escape, because from the outset, they say that quantum theory cannot be used to infer what other observers will measure, Renner says. "It still worries me, though: If everything is just personal to me, how can I say anything relevant to you?" he adds. Renner is now working on a new theory that provides a set of mathematical rules that would allow one observer to work out what another should see in a quantum experiment.

Still, those who strongly believe their favorite interpretation is right see little value in Tischler's study. "If you think quantum mechanics is unhealthy, and it needs replacing, then this is useful because it tells you new constraints," Vaidman says. "But I don't agree that this is the case: many worlds explains everything."

For now, physicists will have to continue to agree to disagree about which interpretation is best, or whether an entirely new theory is needed. "That's where we left off in the early 20th century: we're genuinely confused about this," Reilly says. "But these studies are exactly the right thing to do to think through it."

Disclaimer: The author frequently writes for the Foundational Questions Institute, which sponsors research in physics and cosmology, and partially funded the Brisbane team's study.

See original here:

This Twist on Schrödinger's Cat Paradox Has Major Implications for Quantum Theory - Scientific American
