IBM and MIT kickstarted the age of quantum computing in 1981 – Fast Company


In May 1981, at a conference center housed in a chateau-style mansion outside Boston, a few dozen physicists and computer scientists gathered for a three-day meeting. The assembled brainpower was formidable: One attendee, Caltech's Richard Feynman, was already a Nobel laureate and would earn a widespread reputation for genius when his 1985 memoir Surely You're Joking, Mr. Feynman!: Adventures of a Curious Character became a bestseller. Numerous others, such as Paul Benioff, Arthur Burks, Freeman Dyson, Edward Fredkin, Rolf Landauer, John Wheeler, and Konrad Zuse, were among the most accomplished figures in their respective research areas.

The conference they were attending, The Physics of Computation, was held from May 6 to 8 and cohosted by IBM and MIT's Laboratory for Computer Science. It would come to be regarded as a seminal moment in the history of quantum computing, though no one present grasped that as it was happening.

"It's hard to put yourself back in time," says Charlie Bennett, a distinguished physicist and information theorist who was part of the IBM Research contingent at the event. "If you'd said 'quantum computing,' nobody would have understood what you were talking about."

Why was the conference so significant? According to numerous latter-day accounts, Feynman electrified the gathering by calling for the creation of a quantum computer. "But I don't think he quite put it that way," contends Bennett, who took Feynman's comments less as a call to action than a provocative observation. "He just said the world is quantum," Bennett remembers. "So if you really wanted to build a computer to simulate physics, that should probably be a quantum computer."

[Photo of the 1981 Physics of Computation attendees: courtesy of Charlie Bennett, who isn't in it, because he took it]

Even if Feynman wasn't trying to kick off a moonshot-style effort to build a quantum computer, his talk, and The Physics of Computation conference in general, proved influential in focusing research resources. "Quantum computing was nobody's day job before this conference," says Bennett. "And then some people began considering it important enough to work on."

It turned out to be such a rewarding area for study that Bennett is still working on it in 2021, and he's still at IBM Research, where he's been, aside from the occasional academic sabbatical, since 1972. His contributions have been so significant that he's not only won numerous awards but also had one named after him. (On Thursday, he was among the participants in an online conference on quantum computing's past, present, and future that IBM held to mark the 40th anniversary of the original meeting.)

Charlie Bennett [Photo: courtesy of IBM]

These days, Bennett has plenty of company. In recent years, quantum computing has become one of IBM's biggest bets, as it strives to get the technology to the point where it's capable of performing useful work at scale, particularly for the large organizations that have long been IBM's core customer base. Quantum computing is also a major area of research focus at other tech giants such as Google, Microsoft, Intel, and Honeywell, as well as a bevy of startups.

According to IBM senior VP and director of research Dario Gil, the 1981 Physics of Computation conference played an epoch-shifting role in getting the computing community excited about the possible benefits of quantum physics. Before then, in the context of computing, "it was seen as a source of noise, like a bothersome problem: when dealing with tiny devices, they became less reliable than larger devices," he says. "People understood that this was driven by quantum effects, but it was a bug, not a feature."

Making progress in quantum computing has continued to require setting aside much of what we know about computers in their classical form. From early room-sized mainframe monsters to the smartphone in your pocket, computing has always boiled down to performing math with bits set either to one or zero. But instead of depending on bits, quantum computers leverage quantum mechanics through a basic building block called a quantum bit, or qubit. It can represent a one, a zero, or, in a radical departure from classical computing, both at once.
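The "both at once" idea can be made concrete in a few lines of plain Python. The sketch below (an illustration using NumPy, not anything from IBM's stack) models a qubit as a two-component complex vector and applies a Hadamard gate to put it into an equal superposition of zero and one:

```python
import numpy as np

# A qubit is a unit vector in C^2: |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
zero = np.array([1, 0], dtype=complex)  # the classical bit 0
one = np.array([0, 1], dtype=complex)   # the classical bit 1

# The Hadamard gate turns a definite 0 or 1 into an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ zero  # (|0> + |1>)/sqrt(2): "both at once"

# On measurement, the squared amplitudes give the outcome probabilities.
p0, p1 = abs(psi[0]) ** 2, abs(psi[1]) ** 2
print(p0, p1)  # each is 0.5: an equal chance of reading out 0 or 1
```

Until it is measured, the qubit genuinely carries both amplitudes; measurement forces it to one classical answer, with the probabilities above.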

Dario Gil [Photo: courtesy of IBM]

Qubits give quantum computers the potential to rapidly perform calculations that might be impossibly slow on even the fastest classical computers. That could have transformative benefits for applications ranging from drug discovery to cryptography to financial modeling. But it requires mastering an array of new challenges, including cooling superconducting qubits to a temperature only slightly above absolute zero, or -459.67 degrees Fahrenheit.

Four decades after the 1981 conference, quantum computing remains a research project in progress, albeit one that's lately come tantalizingly close to fruition. Bennett says that timetable isn't surprising or disappointing. For a truly transformative idea, 40 years just isn't that much time: Charles Babbage began working on his Analytical Engine in the 1830s, more than a century before technological progress reached the point where early computers such as IBM's own Automatic Sequence Controlled Calculator could implement his concepts in a workable fashion. And even those machines came nowhere near fulfilling the vision scientists had already developed for computing, "including some things that [computers] failed at miserably for decades, like language translation," says Bennett.

"I think it was the first time ever somebody said the phrase 'quantum information theory.'"

In 1970, as a Harvard PhD candidate, Bennett was brainstorming with fellow physics researcher Stephen Wiesner, a friend from his undergraduate days at Brandeis. Wiesner speculated that quantum physics would make it possible to send, "through a channel with a nominal capacity of one bit, two bits of information; subject however to the constraint that whichever bit the receiver chose to read, the other bit is destroyed," as Bennett jotted in notes which, fortunately for computing history, he preserved.

Charlie Bennett's 1970 notes on Stephen Wiesner's musings about quantum physics and computing. [Photo: courtesy of Charlie Bennett]

"I think it was the first time ever somebody said the phrase 'quantum information theory,'" says Bennett. "The idea that you could do things of not just a physics nature, but an information-processing nature, with quantum effects that you couldn't do with ordinary data processing."
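The physics behind Wiesner's "read one bit, destroy the other" constraint is the disturbance that measurement causes in a conjugate basis. The toy simulation below is not Wiesner's actual conjugate-coding scheme, just a sketch of that underlying tradeoff: a bit encoded in one basis is recovered perfectly when read in the same basis, but reading in the conjugate basis yields a coin flip and collapses the state:

```python
import numpy as np

rng = np.random.default_rng(0)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # rotates between the Z and X bases

def encode(bit, basis):
    """Encode a bit in the Z ('computational') or X ('conjugate') basis."""
    psi = np.array([1.0, 0.0]) if bit == 0 else np.array([0.0, 1.0])
    return H @ psi if basis == "X" else psi

def measure(psi, basis):
    """Measure in the chosen basis; the state collapses, so information
    encoded in the conjugate basis is destroyed."""
    if basis == "X":
        psi = H @ psi  # measuring in X = rotate to Z, then measure
    p0 = abs(psi[0]) ** 2
    return 0 if rng.random() < p0 else 1

# Reading in the matching basis always recovers the bit...
assert all(measure(encode(b, "Z"), "Z") == b for b in (0, 1))

# ...but reading in the conjugate basis gives a 50/50 coin flip: the bit is gone.
outcomes = [measure(encode(0, "Z"), "X") for _ in range(1000)]
print(sum(outcomes) / 1000)  # close to 0.5
```

The receiver must commit to a basis before measuring, which is exactly the choice Wiesner described: whichever bit you read, the other is destroyed.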

Like many technological advances of historic proportions (AI is another example), quantum computing didn't progress from idea to reality in an altogether predictable and efficient way. It took 11 years from Wiesner's observation until enough people took the topic seriously enough to inspire the Physics of Computation conference. Bennett and the University of Montreal's Gilles Brassard published important research on quantum cryptography in 1984; in the 1990s, scientists realized that quantum computers had the potential to be exponentially faster than their classical forebears.

All along, IBM had small teams investigating the technology. According to Gil, however, it wasn't until around 2010 that the company had made enough progress that it began to see quantum computing not just as an intriguing research area but as a powerful business opportunity. "What we've seen since then is this dramatic progress over the last decade, in terms of scale, effort, and investment," he says.

IBM's superconducting qubits need to be kept chilled in a super fridge. [Photo: courtesy of IBM]

As IBM made that progress, it shared it publicly so that interested parties could begin to get their heads around quantum computing at the earliest opportunity. Starting in May 2016, for instance, the company made quantum computing available as a cloud service, allowing outsiders to tinker with the technology in a very early form.

"It is really important that when you put something out, you have a path to deliver."

"One of the things that road maps provide is clarity," Gil says, allowing that "road maps without execution are hallucinations," so "it is really important that when you put something out, you have a path to deliver."

Scaling up quantum computing into a form that can trounce classical computers at ambitious jobs requires increasing the number of reliable qubits that a quantum computer has to work with. When IBM published its quantum hardware road map last September, it had recently deployed the 65-qubit IBM Quantum Hummingbird processor, a considerable advance on its 5- and 27-qubit predecessors. This year, the company plans to complete the 127-qubit IBM Quantum Eagle. And by 2023, it expects to have a 1,000-qubit machine, the IBM Quantum Condor. It's this machine, IBM believes, that may have the muscle to achieve quantum advantage by solving certain real-world problems faster than the world's best supercomputers.

Essential though it is to crank up the supply of qubits, the software side of quantum computing's future is also under construction, and IBM published a separate road map devoted to the topic in February. Gil says that the company is striving to create a frictionless environment in which coders don't have to understand how quantum computing works any more than they currently think about a classical computer's transistors. An IBM software layer will handle the intricacies (and meld quantum resources with classical ones, which will remain indispensable for many tasks).

"You don't need to know quantum mechanics, you don't need to know a special programming language, and you're not going to need to know how to do these gate operations and all that stuff," he explains. "You're just going to program with your favorite language, say, Python. And behind the scenes, there will be the equivalent of libraries that call on these quantum circuits, and then they get delivered to you on demand."
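That library-layer vision can be caricatured in ordinary Python. The function name below is hypothetical and the circuit is simulated with NumPy matrix algebra rather than real quantum hardware; the point is only the division of labor Gil describes: the caller asks for a result, and the gate operations (a Hadamard followed by a CNOT, producing an entangled Bell pair) stay hidden inside the "library":

```python
import numpy as np

def bell_pair_probabilities():
    """A toy 'library call': the caller never sees the gates. Internally this
    runs a two-qubit circuit (H on qubit 0, then CNOT) on a simulator."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])
    psi = np.array([1.0, 0, 0, 0])         # both qubits start in |0>
    psi = CNOT @ np.kron(H, I) @ psi       # the hidden "quantum circuit"
    return {format(i, "02b"): abs(a) ** 2 for i, a in enumerate(psi)}

probs = bell_pair_probabilities()
print(probs)  # '00' and '11' each come out near 0.5; '01' and '10' near 0
```

The caller gets back ordinary Python data, just as Gil envisions quantum circuits being "delivered to you on demand" behind a familiar interface.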

IBM is still working on making quantum computing ready for everyday reality, but it's already worked with designers to make it look good. [Photo: courtesy of IBM]

"In this vision, we think that at the end of this decade, there may be as many as a trillion quantum circuits that are running behind the scenes, making software run better," Gil says.

Even if IBM clearly understands the road ahead, there's plenty left to do. Charlie Bennett says that quantum researchers will overcome remaining challenges in much the same way that he and others confronted past ones. "It's hard to look very far ahead, but the right approach is to maintain a high level of expertise and keep chipping away at the little problems that are causing a thing not to work as well as it could," he says. "And then when you solve that one, there will be another one, which you won't be able to understand until you solve the first one."

As for Bennett's own current work, he says he's particularly interested in the intersection between information theory and cosmology, "not so much because I think I can learn enough about it to make an original research contribution, but just because it's so much fun to do." He's also been making explainer videos about quantum computing, a topic whose reputation for being weird and mysterious he blames on inadequate explanation by others.

"Unfortunately, the majority of science journalists don't understand it," he laments. "And they say confusing things about it: painfully, for me, confusing things."

For IBM Research, Bennett is both a living link to its past and an inspiration for its future. "He's had such a massive impact on the people we have here, so many of our top talent," says Gil. "In my view, we've accrued the most talented group of people in the world, in terms of doing quantum computing. So many of them trace it back to the influence of Charlie." Impressive though Bennett's 49-year tenure at the company is, the fact that he's seen and made so much quantum computing history, including attending the 1981 conference, and is here to talk about it is a reminder of how young the field still is.

Harry McCracken is the technology editor for Fast Company, based in San Francisco. In past lives, he was editor at large for Time magazine, founder and editor of Technologizer, and editor of PC World.

