
Archive for the ‘Quantum Computer’ Category

Google and IBM square off in Schrodinger's catfight over quantum supremacy – The Register

Posted: January 12, 2020 at 8:50 am

without comments

Column Just before Christmas, Google claimed quantum supremacy. The company had configured a quantum computer to produce results that would take conventional computers some 10,000 years to replicate - a landmark event.

Bollocks, said IBM - which also has big investments both in quantum computing and in not letting Google get away with stuff. Using Summit, the world's largest conventional supercomputer, at the Oak Ridge National Laboratory in Tennessee, IBM claimed it could do the same calculation in a smidge over two days.

As befits all things quantum, the truth is a bit of both. IBM's claim is fair enough - but it's right at the edge of Summit's capability and frankly a massive waste of its time. Google could, if it wished, tweak the quantum calculation to move it out of that range. And it might: the calculation was chosen precisely not because it was easy, but because it was hard. Harder is better.

Google's quantum CPU has 54 qubits, quantum bits that can stay in a state of being simultaneously one and zero. The active device itself is remarkably tiny, a silicon chip around a centimetre square, or four times the size of the Z80 die in your childhood ZX Spectrum. On top of the silicon, a nest of aluminium tickled by microwaves hosts the actual qubits. The aluminium becomes superconducting below around 100K, but the very coldest part of the circuit is just 15 millikelvins. At this temperature the qubits have low enough noise to survive long enough to be useful.

By configuring the qubits in a circuit, setting up data and analysing the patterns that emerge when the superpositions are observed and thus collapse to either one or zero, Google can determine the probable correct outcome for the problem the circuit represents. 54 qubits, if represented in conventional computer terms, would need 2^54 bits of RAM to represent each step of the calculation, or two petabytes' worth. Manipulating this much data many times over gives the 10-millennia figure Google claims.
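The arithmetic behind that figure can be checked in a couple of lines. This sketch just reproduces the article's own accounting of 2^54 bits of state; a full classical simulation storing complex amplitudes would need considerably more memory.

```python
# The article's accounting: simulating 54 qubits classically needs 2**54
# bits of RAM per step of the calculation. (A full statevector of complex
# amplitudes would need far more; this only checks the quoted figure.)
n_qubits = 54
bits_needed = 2 ** n_qubits
petabytes = bits_needed / 8 / 1e15   # bits -> bytes -> petabytes
print(f"{petabytes:.2f} PB")         # roughly two petabytes, as quoted
```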

IBM, on the other hand, says that it has just enough disk space on Summit to store the complete calculation. However you do it, though, it's not very useful; the only application is in random number generation. That's a fun, important and curiously nuanced field, but you don't really need a refrigerator stuffed full of qubits to get there. You certainly don't need the 27,648 NVidia Tesla GPUs in Summit chewing through 16 megawatts of power.

What Google is actually doing is known in the trade as "pulling a Steve", from the marketing antics of the late Steve Jobs - in particular, his time at NeXT Inc, the company he started in the late 1980s to annoy Apple and produce idiosyncratic workstations. Hugely expensive to make and even more so to buy, the NeXT systems were never in danger of achieving dominance - but you wouldn't know that from Jobs' pronouncements. He declared market supremacy at every opportunity, albeit in carefully crafted phrases that critics joked defined the market as "black cubic workstations running NeXTOS."

Much the same is true of Google's claim. The calculation is carefully crafted to do precisely the things that Google's quantum computer can do - the important thing isn't the result, but the journey. Perhaps the best analogy is with the Wright Brothers' first flight: of no practical use, but tremendous significance.

What happened to NeXT? It got out of hardware and concentrated on software, then Jobs sold it - and himself - to Apple, which folded some of that software into MacOS development. Oh, and some cat called Berners-Lee built something called the World Wide Web on a NeXT Cube.

Nothing like this will happen with Google's technology. There's no new web waiting to be borne on the wings of supercooled qubits. Even the more plausible applications, like quantum decryption of internet traffic, are a very long way from reality - and, once they arrive, it will be relatively trivial to tweak conventional encryption to defeat them. But the raw demonstration - that a frozen lunchbox consuming virtually no power in its core can outperform a computer chewing through enough wattage to keep a small town going - is a powerful inducement for more work.

That's Google's big achievement. So many new and promising technologies have failed not because they could never live up to expectations but because they can't survive infancy. Existing, established technology has all the advantages: it generates money, it has distribution channels, it has an army of experts behind it, and it can adjust to close down challengers before they get going. To take just one company: Intel has tried for decades to break out of the x86 CPU prison. New wireless standards, new memory technologies, new chip architectures, new display systems, new storage and security ideas - year after year, the company casts about for something new that'll make money. It never gets there.

Google's "quantum supremacy" isn't there either, but it has done enough to protect its infant prince in its superconducting crib. That's worth a bit of hype.


See the rest here:

Google and IBM square off in Schrodinger's catfight over quantum supremacy - The Register

Written by admin

January 12th, 2020 at 8:50 am

Posted in Quantum Computer

Is Quantum Technology The Future Of The World? – The Coin Republic

Posted: at 8:50 am

without comments

Steve Anderrson Saturday, 11 January 2020, 04:58 EST Modified date: Saturday, 11 January 2020, 04:58 EST

At a glance, quantum volume is a measure of the complexity of the problems to which a quantum computer can provide a solution. Quantum volume can also be used to compare the performance of different quantum computers.
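Concretely, IBM defines quantum volume as 2^n, where n is the size of the largest "square" random circuit (n qubits, n layers of gates) the machine can run successfully. A minimal sketch of that definition, with illustrative values of n only:

```python
# Quantum volume per IBM's public definition: QV = 2**n, where n is the
# largest width/depth of a random square circuit the machine passes the
# "heavy output" benchmark on. Values of n below are illustrative.
def quantum_volume(n: int) -> int:
    return 2 ** n

for n in range(2, 7):
    print(n, quantum_volume(n))   # QV of 4, 8, 16, 32, 64
```

Under this definition, doubling the quantum volume each year corresponds to n increasing by one per year.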

Ever since 2016, IBM has doubled this value each year. Quantum computers have been hailed as one of the most important innovations of the 21st century, with potential applications in almost all industries - healthcare, artificial intelligence, and even financial modelling, to name a few.

Recently, quantum computers have also entered a new phase of development, one that can be described as practical. The first real quantum computer was launched in 2009 by Jonathan Holm. Since then, quantum computer development has travelled a long way. At the moment, the industry is driven by a handful of tech giants, including Google and IBM.

Even though IBM's latest advances are viewed as significant, quantum computers can currently only be used for particular tasks. This indicates that they are still far from the general-purpose role that classical computers serve and to which we are accustomed.

Therefore, some people have started worrying that the encryption technology used to protect cryptocurrencies such as Bitcoin may be broken. At present, this worry is unfounded.

As the network is built entirely around secure cryptographic transactions, a powerful quantum computer could eventually crack the encryption technology used to generate Bitcoin's private keys.

However, according to an article published by Martin Roetteler and co-authors in June 2017, such a machine would require approximately 2,500 qubits of processing power to crack the 256-bit encryption used by Bitcoin.

Since the most powerful quantum computer in the world currently has only 72 qubits, one thing is clear: it will take several years before a quantum computer reaches the level of threatening this encryption technology.

With IBM's computing power doubling every year, and Google having achieved quantum supremacy, work may be needed to ensure that Bitcoin can resist potential quantum computing attacks.

Read more:

Is Quantum Technology The Future Of The World? - The Coin Republic

Written by admin

January 12th, 2020 at 8:50 am

Posted in Quantum Computer

We're approaching the limits of computer power - we need new programmers now – The Guardian

Posted: at 8:50 am

without comments

Only so many transistors can fit on a silicon chip. Photograph: Rowan Morgan/Alamy Stock Photo

Way back in the 1960s, Gordon Moore, the co-founder of Intel, observed that the number of transistors that could be fitted on a silicon chip was doubling every two years. Since the transistor count is related to processing power, that meant that computing power was effectively doubling every two years. Thus was born Moore's law, which, for most people working in the computer industry (or at any rate those younger than 40), has provided the kind of bedrock certainty that Newton's laws of motion did for mechanical engineers.
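The compounding implied by that observation is easy to sketch. The baseline figures below are arbitrary examples for illustration, not historical transistor counts:

```python
# Moore's law as stated above: transistor counts double every two years.
# A small compounding sketch; baseline and years are illustrative only.
def projected_transistors(baseline: float, years_elapsed: float,
                          doubling_period: float = 2.0) -> float:
    return baseline * 2 ** (years_elapsed / doubling_period)

# A decade of doubling every two years is five doublings, i.e. 32x:
print(projected_transistors(1e6, 10))   # 32000000.0
```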

There is, however, one difference. Moore's law is just a statement of an empirical correlation observed over a particular period in history, and we are reaching the limits of its application. In 2010, Moore himself predicted that the laws of physics would call a halt to the exponential increases. "In terms of size of transistor," he said, "you can see that we're approaching the size of atoms, which is a fundamental barrier, but it'll be two or three generations before we get that far - but that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit."

We've now reached 2020, and so the certainty that we will always have sufficiently powerful computing hardware for our expanding needs is beginning to look complacent. Since this has been obvious for decades to those in the business, there's been lots of research into ingenious ways of packing more computing power into machines - for example, using multi-core architectures in which a CPU has two or more separate processing units, called cores - in the hope of postponing the awful day when the silicon chip finally runs out of road. (The new Apple Mac Pro, for example, is powered by a 28-core Intel Xeon processor.) And of course there is also a good deal of frenzied research into quantum computing, which could, in principle, be an epochal development.

But computing involves a combination of hardware and software, and one of the predictable consequences of Moore's law is that it made programmers lazier. Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster. In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. He wrote it in assembly language to increase efficiency and save space; there's a legend that for years afterwards he could recite the entire program by heart.

There are thousands of stories like this from the early days of computing. But as Moore's law took hold, the need to write lean, parsimonious code gradually disappeared and incentives changed. Programming became industrialised as software engineering. The construction of sprawling software ecosystems such as operating systems and commercial applications required large teams of developers; these then spawned associated bureaucracies of project managers and executives. Large software projects morphed into the kind of death march memorably chronicled in Fred Brooks's celebrated book, The Mythical Man-Month, which was published in 1975 and has never been out of print, for the very good reason that it's still relevant. And in the process, software became bloated and often inefficient.

But this didn't matter, because the hardware was always delivering the computing power that concealed the bloatware problem. Conscientious programmers were often infuriated by this. "The only consequence of the powerful hardware I see," wrote one, "is that programmers write more and more bloated software on it. They become lazier, because the hardware is fast they do not try to learn algorithms nor to optimise their code - this is crazy!"

It is. In a lecture in 1997, Nathan Myhrvold, who was once Bill Gates's chief technology officer, set out his Four Laws of Software. 1: software is like a gas - it expands to fill its container. 2: software grows until it is limited by Moore's law. 3: software growth makes Moore's law possible - people buy new hardware because the software requires it. And, finally, 4: software is only limited by human ambition and expectation.

As Moore's law reaches the end of its dominion, Myhrvold's laws suggest that we basically have only two options. Either we moderate our ambitions or we go back to writing leaner, more efficient code. In other words: back to the future.

What just happened? Writer and researcher Dan Wang has a remarkable review of the year in technology on his blog, including an informed, detached perspective on the prospects for Chinese domination of new tech.

Algorithm says no There's a provocative essay by Cory Doctorow on the LA Review of Books blog on the innate conservatism of machine-learning.

Fall of the big beasts How to lose a monopoly: Microsoft, IBM and antitrust is a terrific long-view essay about company survival and change by Benedict Evans on his blog.

Read more here:

We're approaching the limits of computer power - we need new programmers now - The Guardian

Written by admin

January 12th, 2020 at 8:50 am

Posted in Quantum Computer

Jeffrey Epstein scandal: MIT professor put on leave, he ‘failed to inform’ college that sex offender made donations – CNBC

Posted: at 8:49 am

without comments

Jeffrey Epstein in 2004.

Rick Friedman | Corbis News | Getty Images

The Massachusetts Institute of Technology said Friday that it had placed one of its tenured professors on paid administrative leave after finding that he "purposefully failed to inform MIT" that convicted sex offender Jeffrey Epstein was the source of two donations in 2012 to support the professor's research, and that the professor got a $60,000 personal gift from Epstein.

A scathing report released by MIT also found that the decision by three administrators to accept donations from Epstein - who pleaded guilty to sex crimes in Florida in 2008, one of which involved a minor girl - "was the result of collective and serious errors in judgment that resulted in serious damage to the MIT community."

The report noted that even as its findings have been made public, "MIT is still without a clear and comprehensive gift policy or a process to properly vet donors." However, the university has begun to develop such a process.

Epstein, a former friend of Presidents Donald Trump and Bill Clinton, donated $850,000 to MIT from 2002 through 2017 in 10 separate gifts, the report said.

That was $50,000 more than MIT had previously reported receiving from Epstein.

"The earliest gift was $100,000 given in 2002 to support the research of the late Professor Marvin Minsky, who died in 2016," MIT said as it released the report, which comes after four months of investigation of Epstein's ties to MIT conducted by the law firm Goodwin Procter.

"The remaining nine donations, all made after Epstein's 2008 conviction, included $525,000 to the Media Lab and $225,000 to" mechanical engineering professor Seth Lloyd, the report said.

The report also found that, "Unbeknownst to any members of MIT's senior leadership ... Epstein visited MIT nine times between 2013 and 2017."

"The fact-finding reveals that these visits and all post-conviction gifts from Epstein were driven by either former Media Lab director Joi Ito or professor of mechanical engineering Seth Lloyd, and not by the MIT administration or the Office of Resource Development."

Ito resigned last year after revelations about Epstein's donations to the Media Lab.

Lloyd received two donations of $50,000 in 2012, and the remaining $125,000 in 2017, according to the report.

"Epstein viewed the 2012 gifts as a trial balloon to test MIT's willingness to accept donations following his conviction" in Florida, MIT said.

"Professor Lloyd knew that donations from Epstein would be controversial and that MIT might reject them," MIT said.

"We conclude that, in concert with Epstein, he purposefully decided not to alert the Institute to Epstein's criminal record, choosing instead to allow mid-level administrators to process the donations without any formal discussion or diligence concerning Epstein."

Seth Lloyd is a professor of mechanical engineering and physics at the Massachusetts Institute of Technology.

Photo:Dmitry Rozhkov | Wikipedia CC

Lloyd was put on paid leave after it was found that he "purposefully failed to" tell MIT that Epstein was the source of the two earliest donations to him.

The report also found that Lloyd had "received a personal gift of $60,000 from Epstein in 2005 or 2006, which he acknowledged was deposited into a personal bank account and not reported to MIT," the university said in a press statement.

Lloyd is an influential thinker in the field of quantum mechanical engineering.

Educated at Harvard College and Cambridge University in England, Lloyd was the first person to propose a "technologically feasible design for a quantum computer," according to his resume. His 2006 book, "Programming the Universe," argues that the universe is a giant quantum computer calculating its own evolution.

"In addition to his own donations, Epstein claimed to have arranged for donations to MIT from other wealthy individuals," the report said. "In 2014, Epstein claimed to have arranged for Microsoft cofounder Bill Gates to provide an anonymous $2 million donation to the Media Lab. He also claimed that same year to have arranged for a $5 million anonymous donation to the Media Lab from Leon Black, the co-founder of Apollo Global Management. Representatives of Bill Gates have told us that Gates flatly denies that Epstein had anything to do with Gates's donation to the Media Lab."

University President L. Rafael Reif had not been aware that MIT was accepting donations from Epstein, who killed himself in a Manhattan jail in August after being arrested the prior month on federal child sex trafficking charges, according to the report.

"But the review finds that three MIT vice presidents learned of Epstein's donations to the MIT Media Lab, and his status as a convicted sex offender, in 2013," the university said in a prepared statement.

"In the absence of any MIT policy regarding controversial gifts, Epstein's subsequent gifts to the Institute were approved under an informal framework developed by the three administrators, R. Gregory Morgan, Jeffrey Newton, and Israel Ruiz."

"Since MIT had no policy or processes for handling controversial donors in place at the time, the decision to accept Epstein's post-conviction donations cannot be judged to be a policy violation," the report said.

"But it is clear that the decision was the result of collective and significant errors in judgment that resulted in serious damage to the MIT community."

Reif, in a letter addressed to the university's community, said, "Today's findings present disturbing new information about Jeffrey Epstein's connections with individuals at MIT: how extensive those ties were and how long they continued. This includes the decision by a lab director to bring this Level 3 sex offender to campus repeatedly."

"That it was possible for Epstein to have so many opportunities to interact with members of our community is distressing and unacceptable; I cannot imagine how painful it must be for survivors of sexual assault and abuse," Reif said.

"Clearly, we must establish policy guardrails to prevent this from happening again."

The report notes, "While Epstein made charitable donations before his 2008 conviction, after that conviction he may have had a second motive for his donations: to launder his reputation by associating himself with reputable individuals and institutions."

See the original post:

Jeffrey Epstein scandal: MIT professor put on leave, he 'failed to inform' college that sex offender made donations - CNBC

Written by admin

January 12th, 2020 at 8:49 am

Posted in Quantum Computer

Charles Hoskinson Predicts Economic Collapse, Rise of Quantum Computing, Space Travel and Cryptocurrency in the 2020s – The Daily Hodl

Posted: at 8:49 am

without comments

The new decade will unfurl a bag of seismic shifts, predicts the creator of Cardano and Ethereum, Charles Hoskinson. And these changes will propel cryptocurrency and blockchain solutions to the forefront as legacy systems buckle, transform or dissolve.

In an ask-me-anything session uploaded on January 3rd, the 11th birthday of Bitcoin, Hoskinson acknowledges how the popular cryptocurrency gave him an eye-opening introduction to the world of global finance, and he recounts how dramatically official attitudes and perceptions have changed.

Every central bank in the world is aware of cryptocurrencies and some are even taking positions in cryptocurrencies. There's really never been a time in human history where one piece of technology has obtained such enormous global relevance without any central coordinated effort, any central coordinated marketing. No company controls it and the revolution is just getting started.

And he expects its emergence to coalesce with other epic changes. In a big picture reveal, Hoskinson plots some of the major events he believes will shape the new decade.

2020 Predictions

Hoskinson says the consequences of these technologies will reach every government service and that cryptocurrencies will gain an opening once another economic collapse similar to 2008 shakes the markets this decade.

I think that means it's a great opening for cryptocurrencies to be ready to start taking over the global economy.

Hoskinson adds that he's happy to be alive to witness all of the changes he anticipates, including a reorganization of the media.

This is the last decade of traditional organized media, in my view. We're probably going to have less CNNs and Fox Newses and Bloombergs and Wall Street Journals and more Joe Rogans, especially as we enter the 2025s and beyond. And I think our space in particular is going to fundamentally change the incentives of journalism. And we'll actually move to a different way of paying for content, curating content.


Featured Image: Shutterstock/Liu zishan

See the original post:

Charles Hoskinson Predicts Economic Collapse, Rise of Quantum Computing, Space Travel and Cryptocurrency in the 2020s - The Daily Hodl

Written by admin

January 12th, 2020 at 8:49 am

Posted in Quantum Computer

This Week in Tech: What on Earth Is a Quantum Computer? – The New York Times

Posted: December 11, 2019 at 4:45 am

without comments

David Bacon, senior software engineer in Google's quantum lab: Quantum computers do computations in parallel universes. This by itself isn't useful. U only get to exist in 1 universe at a time! The trick: quantum computers don't just split universes, they also merge universes. And this merge can add and subtract those other split universes.

David Reilly, principal researcher and director of the Microsoft quantum computing lab in Sydney, Australia: A quantum machine is a kind of analog calculator that computes by encoding information in the ephemeral waves that comprise light and matter at the nanoscale. Quantum entanglement - likely the most counterintuitive thing around - holds it all together, detecting and fixing errors.

Daniel Lidar, professor of electrical and computer engineering, chemistry, and physics and astronomy at the University of Southern California, with his daughter Nina, in haiku:

Quantum computers solve some problems much faster but are prone to noise

Superpositions: to explore multiple paths to the right answer

Interference helps: cancels paths to wrong answers and boosts the right ones

Entanglement makes classical computers sweat, QCs win the race

Scott Aaronson, professor of computer science at the University of Texas at Austin: A quantum computer exploits interference among positive and negative square roots of probabilities to solve certain problems much faster than we think possible classically, in a way that wouldn't be nearly so interesting were it possible to explain in the space of a tweet.
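Aaronson's "positive and negative square roots of probabilities" can be shown in a few lines of simulation. One Hadamard gate puts a qubit into an equal superposition; a second one makes the positive and negative amplitude paths to |1> cancel, returning the qubit to |0> with certainty. A minimal sketch, not tied to any of the quoted researchers' hardware:

```python
import numpy as np

# Interference in miniature: amplitudes (square roots of probabilities)
# can be negative, so computational paths can cancel each other out.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
ket0 = np.array([1.0, 0.0])                    # the |0> state

once = H @ ket0    # amplitudes [0.707..., 0.707...]: a fair quantum coin
twice = H @ once   # the +/- paths to |1> cancel: back to [1, 0] exactly
print(np.round(twice, 12))
```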

Alan Baratz, executive vice president of research and development at D-Wave Systems: If we're honest, everything we currently know about quantum mechanics can't fully describe how a quantum computer works. What's more important, and even more interesting, is what a quantum computer can do: A.I., new molecules, new materials, modeling climate change ...

Original post:

This Week in Tech: What on Earth Is a Quantum Computer? - The New York Times

Written by admin

December 11th, 2019 at 4:45 am

Posted in Quantum Computer

Security leaders fear that quantum computing developments will outpace security technologies – Continuity Central

Posted: at 4:45 am

without comments

Published: Wednesday, 11 December 2019 07:59

More than half (54 percent) of cyber security professionals have expressed concerns that quantum computing will outpace the development of security technologies, according to new research from the Neustar International Security Council (NISC). Keeping a watchful eye on developments, 74 percent of organizations said that they are paying close attention to the technology's evolution, with 21 percent already experimenting with their own quantum computing strategies.

A further 35 percent of experts claimed to be in the process of developing a quantum strategy, while just 16 percent said they were not yet thinking about it. This shift in focus comes as the vast majority (73 percent) of cyber security professionals expect advances in quantum computing to overcome legacy technologies, such as encryption, within the next five years. Almost all respondents (93 percent) believe the next-generation computers will overwhelm existing security technology, with just 7 percent under the impression that true quantum supremacy will never happen.

Despite expressing concerns that other technologies will be overshadowed, an overwhelming number (87 percent) of CISOs, CSOs, CTOs and security directors are excited about the potential positive impact of quantum computing. The remaining 13 percent were more cautious and under the impression that the technology would create more harm than good.

"At the moment, we rely on encryption, which is possible to crack in theory, but impossible to crack in practice, precisely because it would take so long to do so, over timescales of trillions or even quadrillions of years," said Rodney Joffe, Chairman of NISC and Security CTO at Neustar. "Without the protective shield of encryption, a quantum computer in the hands of a malicious actor could launch a cyber attack unlike anything we've ever seen."
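The "trillions of years" intuition is simple arithmetic. The key size and guess rate below are illustrative assumptions chosen for scale, not figures from Joffe or Neustar:

```python
# Why classical brute force is hopeless: even an attacker making a
# billion billion (1e18) key guesses per second needs astronomical time
# to search half of a 128-bit keyspace. Figures here are illustrative.
guesses_per_second = 1e18
keyspace = 2 ** 128
seconds = (keyspace / 2) / guesses_per_second   # expected guesses to hit
years = seconds / (365 * 24 * 3600)
print(f"{years:.1e} years")   # on the order of trillions of years
```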

"For both today's major attacks, and also the small-scale, targeted threats that we are seeing more frequently, it is vital that IT professionals begin responding to quantum immediately. The security community has already launched a research effort into quantum-proof cryptography, but information professionals at every organization holding sensitive data should have quantum on their radar. Quantum computing's ability to solve our great scientific and technological challenges will also be its ability to disrupt everything we know about computer security. Ultimately, IT experts of every stripe will need to work to rebuild the algorithms, strategies, and systems that form our approach to cyber security," added Joffe.

Go here to see the original:

Security leaders fear that quantum computing developments will outpace security technologies - Continuity Central

Written by admin

December 11th, 2019 at 4:45 am

Posted in Quantum Computer

Inside the weird, wild, and wondrous world of quantum video games – Digital Trends

Posted: at 4:45 am

without comments

By Luke Dormehl, December 10, 2019, 3:00AM PST. Image: IBM Research

In 1950, a man named John Bennett, an Australian employee of the now-defunct British technology firm Ferranti, created what may be history's first gaming computer. It could play a game called Nim, a long-forgotten parlor game in which players take turns removing matches from several piles. The player who loses is the one who removes the last match. For his computerized version, Bennett created a vast machine 12 feet wide, 5 feet tall, and 9 feet deep. The majority of this space was taken up by light-up vacuum tubes which depicted the virtual matches.

Bennett's aim wasn't to create a game-playing machine for its own sake - the reason somebody might build a games PC today. As writer Tristan Donovan observed in Replay, his superlative 2010 history of video games: "Despite suggesting Ferranti create a game-playing computer, Bennett's aim was not to entertain but to show off the ability of computers to do [math]."

Jump forward almost 70 years, and a physicist and computer scientist named Dr. James Robin Wootton is using games to demonstrate the capabilities of another new, and equally large, experimental computer. The computer in question is a quantum computer, a dream of scientists since the 1980s, now finally becoming a scientific reality.

Quantum computers encode information as delicate correlations with an incredibly rich structure. This allows for potentially mind-boggling densities of information to be stored and manipulated. Unlike a classical computer, which encodes information as a series of ones and zeroes, the bits (called qubits) in a quantum computer can be either a one, a zero, or both at the same time. These qubits are composed of subatomic particles, which conform to the rules of quantum rather than classical mechanics. They play by their own rules - a little bit like Tom Cruise's character Maverick from Top Gun, if he spent less time buzzing the tower and more time demonstrating properties like superposition and entanglement.

I met Wootton at IBM's research lab in Zurich on a rainy day in late November. Moments prior, I had squeezed into a small room with a gaggle of other excited onlookers, where we stood behind a rope and stared at one of IBM's quantum computers like people waiting to be allowed into an exclusive nightclub. I was reminded of the way that people, in John Bennett's day, talked about the technological priesthood surrounding computers: then enormous mainframes sequestered away in labyrinthine chambers, tended to by highly qualified people in white lab coats. Lacking the necessary seminary training, we quantum computer visitors could only bask in its ambience from a distance, listening in reverent silence to the weird vee-oing vee-oing vee-oing sound of its cooling system.

Wootton's interest in quantum gaming came about from exactly this scenario. In 2016, he attended a quantum computing event at the same Swiss ski resort where, in 1925, Erwin Schrödinger had worked out his famous wave equation while on vacation with a girlfriend. If there is a ground zero for quantum computing, this was it. Wootton was part of a consortium, sponsored by the Swiss government, to do (and help spread the word about) quantum computing.

"At that time quantum computing seemed like it was something that was very far away," he told Digital Trends. "Companies and universities were working on it, but it was a topic of research, rather than something that anyone on the street was likely to get their hands on. We were talking about how to address this."

Wootton has been a gamer since the early 1990s. "I won a Game Boy in a competition in a wrestling magazine," he said. "It was a Slush Puppy competition where you had to come up with a new flavor. My Slush Puppy flavor was called something like Rollin' Redcurrant. I'm not sure if you had to use the adjective. Maybe that's what set me apart."

While perhaps not a straight path, Wootton knew how an interest in gaming could lead people to an interest in other aspects of technology. He suggested that making games using quantum computing might be a good way of raising public awareness of the technology. He applied for support and, for the next year, was given "to my amazement" the chance to go and build an educational computer game about quantum computing. "At the time, a few people warned me that this was not going to be good for my career," he said. "[They told me] I should be writing papers and getting grants, not making games."

But the idea was too tantalizing to pass up.

That same year, IBM launched its Quantum Experience, an online platform granting the general public (at least those with a background in linear algebra) access to IBM's prototype quantum processors via the cloud. Combined with ProjectQ, a quantum SDK capable of running jobs on IBM's devices, this took care of both the hardware and software components of Wootton's project. What he needed now was a game. Wootton's first attempt at creating a quantum game for the public was a version of Rock-Paper-Scissors, named Cat-Box-Scissors after the famous Schrödinger's cat thought experiment. Wootton later dismissed it as "[not] very good ... little more than a random number generator with a story."

But others followed. There was Battleships, his crack at the first multiplayer game made with a quantum computer. There was Quantum Solitaire. There was a text-based dungeon crawler, modeled on 1973's Hunt the Wumpus, called Hunt the Quantpus. Then the messily titled, but significant, Battleships with partial NOT gates, which Wootton considers the first true quantum computer game, rather than just an experiment. And so on. As games, these don't exactly make Red Dead Redemption 2 look like yesterday's news. They're more like Atari 2600 or Commodore 64 games in their aesthetics and gameplay. Still, that's exactly what you'd expect from the embryonic phases of a new computing architecture.

If you'd like to try out a quantum game for yourself, you're best off starting with Hello Quantum, available for both iOS and Android. It reimagines the principles of quantum computing as a puzzle game in which players must flip qubits. It won't make you a quantum expert overnight, but it will help demystify the process a bit. (With every level, players can hit a "learn more" button for a digestible tutorial on quantum basics.)

Quantum gaming isn't just about educational outreach, though. Just as John Bennett imagined Nim as a game that would exist to show off a computer's abilities, only to unwittingly kickstart a $130 billion a year industry, so quantum games are moving beyond just teaching players lessons about quantum computing. Increasingly, Wootton is excited about what he sees as real-world uses for quantum computing. One of the most promising of these is taking advantage of quantum computing's random number generation to create random terrain within computer games. In Zurich, he showed me a three-dimensional virtual landscape reminiscent of Minecraft. However, while much of the world of Minecraft is user generated, in this case the blocky, low-resolution world was generated using a quantum computer.

"Quantum mechanics is known for its randomness, so the easiest possibility is just to use quantum computing as a [random number generator]," Wootton said. "I have a game in which I use only one qubit: the smallest quantum computer you can get. All you can do is apply operations that change the probabilities of getting a zero or one as output. I use that to determine the height of the terrain at any point in the game map."
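The one-qubit trick Wootton describes can be sketched classically: a rotation angle sets the probability of measuring 1, and repeated "measurements" are averaged into a terrain height. The code below is an illustrative classical simulation of that idea only; the coordinate-to-angle mapping, function names, and parameters are hypothetical, not taken from Wootton's actual game.

```python
import math
import random

def qubit_sample(theta: float) -> int:
    """Simulate measuring a qubit rotated by theta: P(1) = sin^2(theta / 2)."""
    return 1 if random.random() < math.sin(theta / 2) ** 2 else 0

def terrain_height(x: int, shots: int = 200, max_height: int = 16) -> int:
    # Hypothetical mapping: map coordinate -> rotation angle, then estimate
    # P(1) from repeated measurements and scale it to a block height.
    theta = (x % 32) / 32 * math.pi
    p1 = sum(qubit_sample(theta) for _ in range(shots)) / shots
    return round(p1 * max_height)

heights = [terrain_height(x) for x in range(8)]
print(heights)  # heights climb from 0 as x (and thus P(1)) increases
```

On real hardware the rotation and measurement would be a job submitted to a quantum processor; here the probabilities are just sampled with the standard library's pseudorandom generator.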

Plenty of games made with classical computers have already included procedurally generated elements over the years. But as the requirements for these elements, ranging from randomly generated enemies to entire maps, increase in complexity, quantum could help.

"Gaming is an industry that is very dependent on how fast things run," he continued. "If there's a factor of 10 difference in how long it takes something to run, that determines whether you can actually use it in a game." He sees today as a great jumping-on point for people in the gaming industry to get involved and help shape the future development of quantum computing. "It's going to be driven by what people want," he explained. "If people find an interesting use-case and everyone wants to use quantum computing for a game where you have to submit a job once per frame, that will help dictate the way that the technology is made."

He's now reached the point where he thinks the race may truly be on to develop the first commercial game using a quantum computer. "We've been working on these proof-of-principle projects, but now I want to work with actual game studios on actual problems that they have," he continued. "That means finding out what they want and how they want the technology to be [directed]."

One thing that's for certain is that Wootton is no longer alone in developing his quantum games. In the last couple of years, a number of quantum game jams have popped up around the world. "What most people have done is to start small," Wootton said. "They often take an existing game and use one or two qubits to help allow you to implement a quantum twist on the game mechanics." Following this mantra, enthusiasts have used quantum computing to make remixed versions of existing games, including Dr. Qubit (a quantum version of Dr. Mario), Quantum Cat-sweeper (a quantum version of Minesweeper), and Quantum Pong (a quantum version of, err, Pong).

The world of quantum gaming has moved beyond its 1950 equivalent of Nim. Now we just have to wait and see what happens next. The decades that followed Nim gave us MIT's legendary Spacewar in the 1960s, the arcade boom of the 1970s and '80s, the console wars of Sega vs. Nintendo, the arrival of the Sony PlayStation in the 1990s, and so on. In the process, classical computers became part of our lives in a way they never were before. As Whole Earth Catalog founder Stewart Brand predicted in Rolling Stone as far back as 1972, in his classic essay on Spacewar: "Ready or not, computers are coming to the people."

At present, quantum gaming's future is at a crossroads. Is it an obscure niche occupied by just a few gaming physics enthusiasts, or a powerful tool that will shape tomorrow's industry? Is it something that will teach us all to appreciate the finer points of quantum physics, or a tool many of us won't even realize is being used, that will nevertheless give us some dope ass games to play?

Like Schrödinger's cat, right now it's both at once. What a superposition to be in.

Continued here:

Inside the weird, wild, and wondrous world of quantum video games - Digital Trends

December 11th, 2019 at 4:45 am

Why Move Fast and Break Things Doesn’t Work Anymore – Harvard Business Review

Posted: at 4:45 am

Executive Summary

Over the next few decades, agility will not come from speed; it will come from the ability to explore multiple domains at once and combine them into something that produces value. This means computer scientists working with cancer scientists, for example, to identify specific genetic markers that could lead to a cure. This change will be profound, and we will need to rethink old notions about how we compete, collaborate, and bring new products to market. Here are three key shifts.

For the past few decades, agility in the technology sector has largely meant moving faster and faster down a predetermined path; innovation has largely been driven by our ability to cram more transistors onto a silicon wafer. With every new generation of chips came new possibilities and new applications. The firms that developed those applications the fastest won.

Over the coming decades, however, agility will take on a new meaning: the ability to explore multiple domains at once and combine them into something that produces value. We'll need computer scientists working with cancer scientists, for example, to identify specific genetic markers that could lead to a cure. To do this, we'll need to learn how to go slower to have a greater impact.

This change will be profound. We will need to rethink old notions about how we compete, collaborate, and bring new products to market. More specifically, we will have to manage three profound shifts that will force us to widen and deepen connections between talent, technology, and information rather than just moving fast and breaking things.

Shift 1: From A Digital To A Post-Digital Age. It's hard to imagine that 30 years ago, most American households didn't have a computer, much less a mobile phone. Yet today, a typical teenager armed with a smartphone has access to more information than a highly trained specialist working at a major institution a generation ago.

What's driven all this advancement has been Moore's Law, our ability to double the power of our computing technology about every 18 months. Yet now Moore's Law is approaching theoretical limits and will most likely come to an end in the next decade. New computing architectures, such as quantum and neuromorphic technologies, have great potential to further advancement, but will be far more complex than digital chips. Make no mistake, the transition will not be seamless.
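The doubling the article leans on compounds quickly. A one-line calculation makes the scale concrete, assuming the commonly quoted 18-month doubling period:

```python
# Back-of-the-envelope: doubling every 18 months means 20 doublings
# over 30 years, i.e. roughly a million-fold increase.
def moore_factor(years: float, doubling_months: float = 18.0) -> float:
    return 2 ** (years * 12 / doubling_months)

print(f"{moore_factor(30):,.0f}x")  # 1,048,576x over 30 years (2^20)
```

That compounding is why a smartphone today outclasses the institutional machines of a generation ago, and why the end of the trend is such a discontinuity.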

At the same time, we're seeing the rise of nascent technologies, such as synthetic biology, advanced materials science, and artificial intelligence. Again, these new technologies represent a significant increase in complexity. We're rapidly moving from an environment where we understand the technologies we use and their implications extremely well to an era in which we do not. If we continue to move fast and break things, we are likely to break something important.

Shift 2: From Rapid Iteration to Exploration. Over the past 30 years, we've had the luxury of working with technologies we understand extremely well. Every generation of microchips opened vast new possibilities, but worked exactly the same way as the last generation, creating minimal switching costs. The main challenge was to design applications.

So it shouldn't be surprising that rapid iteration emerged as a key strategy. When you understand the fundamental technology that underlies a product or service, you can move quickly, trying out nearly endless permutations until you arrive at an optimized solution. That's often far more effective than a more planned, deliberate approach.

Over the next decade or two, however, the challenge will be to advance technology that we don't understand well at all. Quantum and neuromorphic computing are still in their nascent stages. Exponential improvements in genomics and materials science are redefining the boundaries of those fields. There are also ethical issues involved with artificial intelligence and genomics that will require us to tread carefully.

So in the future, we will need to put greater emphasis on exploration. We will need to spend time understanding these new technologies and how they relate to our businesses. Most of all, it's imperative to start exploring early. By the time many of these technologies hit their stride, it may be too late to catch up.

Shift 3: From Hypercompetition to Mass Collaboration. The competitive environment we've become used to has been relatively simple. For each particular industry, there have been distinct ecosystems based on established fields of expertise. Competing firms raced to transform fairly undifferentiated digital inputs (chips, code, components, etc.) into highly differentiated products and services. You needed to move fast to get an edge.

This new era, on the other hand, will be one of mass collaboration in which government partners with academia and industry to explore new technologies in the pre-competitive phase. For example, the Joint Center for Energy Storage Research combines the work of five national labs, a few dozen academic institutions, and hundreds of companies to develop advanced batteries.

Or consider the Manufacturing Institutes, which focus on everything from advanced fabrics and biopharmaceuticals to robotics and composite materials. These active hubs allow companies to collaborate with government labs and top academics to develop the next generation of technologies. They also operate dozens of testing facilities to help bring new products to market faster.

I've visited some of these facilities and have had the opportunity to talk with executives from participating companies. What has struck me is how excited they are for the possibilities of this new era. Agility for them doesn't mean learning to run faster down a chosen course, but to widen and deepen connections throughout a technological ecosystem.

Not so long ago, this kind of mass collaboration, often involving direct competitors, would have seemed strange, if not hopelessly naive. Yet today, high-performing firms, from corporate VCs to corporate accelerators, are increasingly aware that they need to connect or get shut out. One example is especially instructive. When IBM decided to develop the PC in 1980, it sent a team to Boca Raton to work in secret and launched the product a year later. To develop quantum computing, however, it has created the Q Network, which includes several of the National Labs, research universities, potential end users like major banks and manufacturers, as well as startups.

What's becoming increasingly clear is that the breakthrough applications of the future will not be based on a single technology like a digital microchip. These new technologies are far too complex for anyone to develop on their own. That's why we can expect the basis of competition to shift away from design sprints, iterating, and pivoting to building meaningful relationships in order to solve grand challenges. Power in this new era will not sit at the top of industrial hierarchies, but will emanate from the center of networks and ecosystems.

Read the rest here:

Why Move Fast and Break Things Doesn't Work Anymore - Harvard Business Review

December 11th, 2019 at 4:45 am
