
Archive for the ‘Quantum Computing’ Category

Physicists Just Achieved The First-Ever Quantum Teleportation Between Computer Chips – ScienceAlert

Posted: December 30, 2019 at 8:49 pm



As 2019 winds to a close, the journey towards fully realised quantum computing continues: physicists have been able to demonstrate quantum teleportation between two computer chips for the first time.

Put simply, this breakthrough means that information was passed between the chips not by physical electronic connections, but through quantum entanglement by linking two particles across a gap using the principles of quantum physics.

We don't yet understand everything about quantum entanglement (it's the same phenomenon Albert Einstein famously called "spooky action"), but being able to use it to send information between computer chips is significant, even if so far we're confined to a tightly controlled lab environment.

"We were able to demonstrate a high-quality entanglement link across two chips in the lab, where photons on either chip share a single quantum state," explains quantum physicist Dan Llewellyn from the University of Bristol in the UK.

"Each chip was then fully programmed to perform a range of demonstrations which utilise the entanglement."

Hypothetically, quantum entanglement can work over any distance. Two particles get inextricably linked together, which means looking at one tells us something about the other, wherever it is (in this case, on a separate computer chip).

To achieve their result, the team generated pairs of entangled photons, encoding quantum information in a way that ensured low levels of interference and high levels of accuracy. Up to four qubits (the quantum equivalent of classical computing bits) were linked together.

"The flagship demonstration was a two-chip teleportation experiment, whereby the individual quantum state of a particle is transmitted across the two chips after a quantum measurement is performed," says Llewellyn.

"This measurement utilises the strange behaviour of quantum physics, which simultaneously collapses the entanglement link and transfers the particle state to another particle already on the receiver chip."

The researchers were then able to run experiments in which the fidelity reached 91 percent; in other words, almost all the information was accurately transmitted and logged.

Scientists are learning more and more about how quantum entanglement works, but for now it's very hard to control. It's not something you can install inside a laptop: you need a lot of bulky, expensive scientific equipment to get it working.

But the hope is that advances in the lab, such as this one, might one day lead to advances in computing that everyone can take advantage of: super-powerful processing and a next-level internet with built-in hacking protections.

The low data loss and high stability of the teleportation, as well as the high level of control that the scientists were able to get over their experiments, are all promising signs in terms of follow-up research.

It's also a useful study for efforts to get quantum physics working with the silicon chip (Si-chip) tech used in today's computers, and the complementary metal-oxide-semiconductor (CMOS) techniques used to make those chips.

"In the future, a single Si-chip integration of quantum photonic devices and classical electronic controls will open the door for fully chip-based CMOS-compatible quantum communication and information processing networks," says quantum physicist Jianwei Wang, from Peking University in China.

The research has been published in Nature Physics.

See original here:

Physicists Just Achieved The First-Ever Quantum Teleportation Between Computer Chips - ScienceAlert

Written by admin

December 30th, 2019 at 8:49 pm

Posted in Quantum Computing

Quantum Supremacy and the Regulation of Quantum Technologies – The Regulatory Review

Posted: at 8:49 pm



Advancing technology requires regulators to act quickly to develop standards and defenses against cyberattacks.

After a false start in September, Google provided the first peer-reviewed evidence of quantum supremacy a month later in the prestigious journal Nature. The announcement was the latest crescendo in the development of quantum computers, emerging technologies that can efficiently solve complicated computational problems with hardware that takes advantage of quantum mechanics.

With data privacy and national security at stake, agile and adaptive regulatory strategies are needed to manage the risks of fast-approaching quantum computers without thwarting their potential benefits.

Although classical computers use binary bits to perform calculations, devices under development, like Google's, use qubits that are not limited to 1s and 0s when they process information. Instead, through phenomena like superposition and entanglement, groups of qubits can have exponentially more power by not merely being on or off, but also being some blend of on and off at the same time. With the right programming and hardware design, quantum computers should be able to work smarter than classical computers when making sense of large datasets.
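The superposition described above can be made concrete with a toy calculation. This is a hand-rolled sketch for illustration only, not any particular quantum SDK:

```python
import math

# A qubit's state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring it yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def measurement_probabilities(alpha, beta):
    p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0), "state must be normalised"
    return p0, p1

# An equal superposition: the "blend of on and off" described above.
p0, p1 = measurement_probabilities(1 / math.sqrt(2), 1 / math.sqrt(2))
# Each outcome is equally likely: p0 == p1 == 0.5
```

With n such qubits the state vector has 2^n amplitudes, which is the source of the "exponentially more power" the article mentions.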

Demonstrating that a quantum computer can actually solve problems even supercomputers cannot handle, so-called quantum supremacy (or, preferably, the less violent "quantum advantage"), has long been an envied goal in the quantum engineering field. But, as the CEO of leading quantum technology firm Rigetti noted, practical quantum devices will create new risks and could lead to unanticipated policy challenges.

Setting risks aside, quantum technologies do promise exciting near-term benefits. Quantum advantage highlights the raw power of these devices to work with big datasets and could be used to advance drug discovery, business analytics, artificial intelligence, traffic control, and more. Although IBM has moved to cast doubt on the achievement, Google's publication claims the team is only one creative algorithm away from valuable near-term applications. The world could almost be at the dawn of an era of quantum computers with day-to-day applications.

But practical quantum computers could also rip through current cybersecurity infrastructure. The abilities of these emerging technologies create significant national security concerns, both in the United States and for other countries investing heavily in quantum technologies, such as China.

Quantum cyberattacks could also put private or sensitive information at risk or expose corporate intellectual property and trade secrets.

To be sure, one developer showing quantum advantage for a single task does not mean that quantum cyberattacks will start tomorrow, so panic should be avoided. But, despite the hype, attaining quantum advantage does signal an approaching time when these attacks could become possible.

Achieving quantum advantage or supremacy is bittersweet, then, given the potential for both benefit and harm. Even though this is the first report of the achievement in the United States, it is not impossible that this goal has been reached elsewhere or will be soon. With this understanding, what should the regulatory and policy responses look like to manage novel risks while still encouraging benefits?

Three strategies can help prepare for the coming wave of quantum computers without undermining innovation, drawing on technical standards and codes of conduct as regulatory tools.

First, private standards will be useful for responding to quantum concerns. These voluntary, technical standards can give government and industry a common language to speak by creating agreed-upon definitions and ways of measuring quantum computers' performance capabilities. Technical standards can therefore facilitate policy conversations about how powerful quantum computers really are and what types of risks are realistic and deserve policymakers' attention.

The Institute of Electrical and Electronics Engineers Standards Association (IEEE) is currently working on setting standards for terminology and performance metrics in quantum computing. Given the global authority and reputation of IEEE, these standards could become quite influential when adopted and even be helpful for industry. To get ahead of potential quantum cyberattacks, experts from government, industry, academia, and NGOs should participate in standardization efforts to accelerate this work and add different perspectives to make standards more comprehensive and inclusive.

Second, the quantum computing industry itself can be proactive even without government taking the lead. I argue in a recent paper that, to guide responsible development of these powerful new technologies, quantum computing companies could create codes of conduct to detail best practices and principles for the responsible deployment of quantum computing.

Codes of conduct can show that an emerging industry is trying to be responsible and transparent while publicly setting expectations for good behavior. With concerns that quantum computers might be used for nefarious purposes or fall into the wrong hands, the industry should respond by committing to act responsibly through quantum codes, and, as an added benefit, have a chance to help define what responsibility means in this new area.

Finally, the industry should work to support the development of standards for another technology intended to defend from quantum cyberattacks, called post-quantum cryptography. Quantum computers excel at solving problems that require factoring large numbers, which gets right to the heart of current cybersecurity methods. Post-quantum cryptography tries to counter this strength by creating new types of encryption that quantum computers will be less adept at cracking.
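To see why factoring matters to current encryption, here is a deliberately tiny toy example. The primes and helper function below are illustrative only; real keys use numbers hundreds of digits long, far beyond brute force but, in principle, within reach of Shor's algorithm on a practical quantum computer:

```python
# Toy illustration only: RSA-style schemes hide two primes p, q inside
# a public modulus n = p * q. Recovering p and q breaks the key.
def trial_factor(n):
    """Brute-force factoring; hopeless at real key sizes."""
    for p in range(2, int(n ** 0.5) + 1):
        if n % p == 0:
            return p, n // p
    return None

n = 61 * 53             # a "public modulus" built from two tiny primes
p, q = trial_factor(n)  # trivially recovered here; infeasible classically
                        # at real key sizes, which is the gap a quantum
                        # computer could close
```

Post-quantum cryptography replaces factoring-based schemes with problems believed hard even for quantum machines.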

Post-quantum methods still must be fully developed, standardized, and then implemented in critical networks, creating a need for policy and governance efforts to facilitate the transition to a post-quantum world. The National Institute of Standards and Technology has begun to work on post-quantum standards, but these efforts will not finish overnight. The potential urgency of practical quantum computers means that work to standardize and advance post-quantum cryptographic methods deserves greater attention and resources from both the public and private sectors, as well as expert groups and non-governmental organizations.

Google's announcement that it has reached quantum advantage or supremacy is a great achievement in the long push to develop pragmatic quantum computers that can benefit society. But even though this announcement does not mean cybersecurity ends tomorrow, the security and privacy risks of quantum computers deserve policymakers' prompt attention.

Responding to these challenges with public and private standards and codes of conduct should promote responsibility, security, and growth in the development of emerging quantum technologies.

Walter G. Johnson is a J.D. candidate and research assistant at the Sandra Day O'Connor College of Law at Arizona State University, where he also holds a master's degree in science and technology policy.

More:

Quantum Supremacy and the Regulation of Quantum Technologies - The Regulatory Review

Written by admin

December 30th, 2019 at 8:49 pm

Posted in Quantum Computing

How This Breakthrough Makes Silicon-Based Qubit Chips The Future of Quantum Computing – Analytics India Magazine

Posted: at 8:49 pm



Quantum computing has come a long way since its introduction in the 1980s. Researchers have always been on the lookout for better ways to enhance quantum computing systems, whether by making them cheaper or by making present quantum computers last longer. With the latest technological advancements in quantum computing, such as superconducting qubits, a new way of improving silicon quantum computing has come to light, one that makes use of silicon spin qubits for better communication.

Until now, communication between different qubits has been relatively slow. Messages had to be passed from qubit to qubit to relay information to another chip a relatively long distance away.

Now, researchers at Princeton University have explored the idea of two quantum computing silicon components, known as silicon spin qubits, interacting with a relatively large distance between them. The study was presented in the journal Nature on December 25, 2019.

Silicon spin qubits give quantum hardware the ability to interact and transmit messages across a certain distance, which provides the hardware with new capabilities. By transmitting signals over a distance, multiple quantum bits can be arranged in two-dimensional grids that can perform more complex calculations than existing quantum computer hardware. The study will help qubits communicate better, not only within a chip but also from one chip to another, which will have a massive impact on speed.

Quantum computers need as many qubits as possible, communicating effectively with each other, to take full advantage of quantum computing's capabilities. The quantum computers used by Google and IBM contain around 50 qubits and make use of superconducting circuits. Many researchers believe that silicon-based qubit chips are the future of quantum computing in the long run.

The quantum state of silicon spin qubits lasts longer than that of superconducting qubits, which is one of their significant advantages. In addition to lasting longer, silicon, which has many applications in everyday computers, is cheaper, another advantage over superconducting qubits, which cost a great deal of money: a single qubit costs around $10,000, and that's before research and development costs. With these costs in mind, the hardware alone for a universal quantum computer would be at least $10 billion.

But silicon spin qubits have their own challenges, which stem partly from the fact that they are incredibly small; by small, we mean made from a single electron. This is a huge obstacle when it comes to establishing interconnects between multiple qubits while building a large-scale computer.

To counter the problem of interconnecting these extremely small silicon spin qubits, the Princeton team connected them with a wire, similar to the fibre-optic cables that deliver internet to homes, that carries light. The wire carries a photon that picks up a message from one qubit and transmits it to the next. To put this in perspective: if the qubits communicate at half a centimetre apart, in real-world terms it would be as if they were around 750 miles away from each other.

The next step for the study was to establish a way of getting qubits and photons to speak the same language, by tuning both the qubits and the photon to the same frequency. Where previously the device's architecture allowed tuning only one qubit to one photon at a time, the team succeeded in tuning both qubits independently of each other while still coupling them to the photon.

"You have to balance the qubit energies on both sides of the chip with the photon energy to make all three elements talk to each other," says Felix Borjans, a graduate student and first author on the study, describing what he calls the challenging part of the work.

The researchers demonstrated entanglement of electron spins in silicon separated by distances larger than the device housing them, a significant development for wiring up these qubits and laying them out in silicon-based quantum microchips.

The communication between distant silicon-based qubit devices builds on the Petta research team's 2010 work showing how to trap a single electron in quantum wells, a 2012 paper in Nature on transferring quantum information between electron spins, a 2016 paper in Science demonstrating the ability to transmit information from a silicon-based charge qubit to a photon, a 2017 paper in Science on nearest-neighbour exchange of information between qubits, and a 2018 paper in Nature showing that a silicon spin qubit can exchange information with a photon.

This demonstration of interactions between two silicon spin qubits is essential for the further development of quantum technology, and will help technologies like modular quantum computers and quantum networks. The team employed silicon and germanium, both of which are widely available in the market.


Sameer is an aspiring Content Writer. Occasionally writes poems, loves food and is head over heels with Basketball.

Read more from the original source:

How This Breakthrough Makes Silicon-Based Qubit Chips The Future of Quantum Computing - Analytics India Magazine

Written by admin

December 30th, 2019 at 8:49 pm

Posted in Quantum Computing

Information teleported between two computer chips for the first time – New Atlas

Posted: at 8:49 pm



Scientists at the University of Bristol and the Technical University of Denmark have achieved quantum teleportation between two computer chips for the first time. The team managed to send information from one chip to another instantly without them being physically or electronically connected, in a feat that opens the door for quantum computers and quantum internet.

This kind of teleportation is made possible by a phenomenon called quantum entanglement, where two particles become so entwined with each other that they can communicate over long distances. Changing the properties of one particle will cause the other to instantly change too, no matter how much space separates the two of them. In essence, information is being teleported between them.

Hypothetically, there's no limit to the distance over which quantum teleportation can operate, and that raises some strange implications that puzzled even Einstein himself. Our current understanding of physics says that nothing can travel faster than the speed of light, and yet, with quantum teleportation, information appears to break that speed limit. Einstein dubbed it "spooky action at a distance."

Harnessing this phenomenon could clearly be beneficial, and the new study helps bring that closer to reality. The team generated pairs of entangled photons on the chips, and then made a quantum measurement of one. This observation changes the state of the photon, and those changes are then instantly applied to the partner photon in the other chip.

"We were able to demonstrate a high-quality entanglement link across two chips in the lab, where photons on either chip share a single quantum state," says Dan Llewellyn, co-author of the study. "Each chip was then fully programmed to perform a range of demonstrations which utilize the entanglement. The flagship demonstration was a two-chip teleportation experiment, whereby the individual quantum state of a particle is transmitted across the two chips after a quantum measurement is performed. This measurement utilizes the strange behavior of quantum physics, which simultaneously collapses the entanglement link and transfers the particle state to another particle already on the receiver chip."

The team reported a teleportation success rate of 91 percent, and managed to perform some other functions that will be important for quantum computing. That includes entanglement swapping (where states can be passed between particles that have never directly interacted via a mediator), and entangling as many as four photons together.

Information has been teleported over much longer distances before: first across a room, then 25 km (15.5 mi), then 100 km (62 mi), and eventually over 1,200 km (746 mi) via satellite. It's also been done between different parts of a single computer chip, but teleporting between two different chips is a major breakthrough for quantum computing.

The research was published in the journal Nature Physics.

Source: University of Bristol

See original here:

Information teleported between two computer chips for the first time - New Atlas

Written by admin

December 30th, 2019 at 8:49 pm

Posted in Quantum Computing

The 5 Most Important Federal Government Tech Predictions to Watch in 2020 – Nextgov

Posted: at 8:49 pm



It's taken a while for the U.S. federal government to fully climb aboard the emerging technology train, but as 2020 approaches it is clear that more agencies are ready to ride, and steer, the train toward new digital trends.

Which technologies are likely to attract the most attention? It will certainly vary by agency, as each has a unique mission, budget and outlook on the value of various technologies. But generally, in my conversations with government leaders, I'm hearing about a few common areas of interest.

These are my federal government technology predictions for 2020:

1. Quantum computing takes a quantum leap.

It's probably the geekiest of technologies, but that's not going to stop the federal government from continuing to explore the possibilities around quantum computing in the coming year.

Whereas traditional computers are built around 1s and 0s, or what we call bits, quantum computers will use subatomic quantum bits, or qubits. It's thought this still-developing technology could eventually solve problems in minutes rather than thousands of years. In fact, Google claimed it achieved quantum supremacy in October 2019, with its chip completing a task in 200 seconds that researchers estimated would take a current supercomputer 10,000 years or more. This could dramatically accelerate how people create everything from drugs to cars to new food sources.
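The scale of that claim is easy to check with back-of-the-envelope arithmetic:

```python
# Speedup implied by Google's figures: 200 seconds on the quantum chip
# versus an estimated 10,000 years on a classical supercomputer.
SECONDS_PER_YEAR = 365 * 24 * 3600
classical_seconds = 10_000 * SECONDS_PER_YEAR
quantum_seconds = 200

speedup = classical_seconds / quantum_seconds
# roughly 1.6 billion times faster on this one benchmark task
```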

China sees quantum computing as the next front in its economic battle with the United States and is determined to own this next great technological leap. But the U.S. government is positioning to compete. In late 2018, it signed the National Quantum Initiative Act into law, which committed $1.2 billion to quantum information science research. More recently, the Department of Energy said it would provide $40 million for research to develop quantum computing software. And in May, a White House subcommittee issued a request for information seeking outside input on how the U.S. government should further quantum research. Even with this investment, the U.S. is falling behind the rest of the world in this field.

In 2020, expect quantum information science momentum to intensify as governments step up their game.

2. Everything-as-a-service goes mainstream.

As workplace technology needs have grown, it doesn't make sense for government agencies to handle many IT operations internally. Take device management, for example. When an organization purchases computers, it tends to buy them all at once, meaning there's a large investment up front. Then, they have to either staff up internally to manage and secure those devices or hire outside maintenance teams to do the job. On top of that, as those devices start showing signs of age, workers often have to hold onto them until the next budgetary window of opportunity allows them to be updated, which can affect worker productivity and job satisfaction.

But with a device-as-a-service (DaaS) approach, computer purchases become a monthly operating expense, so the investment is spread out over time. This system ensures that customers always have access to the latest devices, which are maintained and secured by outside experts. Agency IT personnel are then free to focus on more strategic matters, such as critical management and operations functions beyond device maintenance. I've seen that agencies are more open to the XaaS and DaaS model and expect adoption to expand in 2020.
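The budgeting shift is simple to sketch. Every number below is hypothetical, chosen only to show how a one-time capital purchase becomes a predictable operating expense:

```python
# Hypothetical figures for illustration only.
devices = 500
purchase_price = 1200     # per device, paid up front when buying outright
daas_rate = 45            # assumed per-device monthly subscription

upfront_outlay = devices * purchase_price   # one large hit: $600,000
monthly_daas_outlay = devices * daas_rate   # steady: $22,500 per month
```

Whether the subscription costs more or less over a device's life depends on the rates negotiated; the point is the cash-flow shape, not the totals.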

3. Supply chain security becomes critical.

One of the greatest concerns of any supply chain, and especially for technology purchased by the U.S. government from international vendors, is the potential for parts suppliers to be compromised by foreign governments. This is an issue of national security, and one that's been in the headlines for most of 2019.

It's a valid concern, and one that I expect to stay top of mind in 2020. In fact, respected security wonk Bruce Schneier, a lecturer at the Harvard Kennedy School, recently asserted that every part of the supply chain can be attacked, including emerging 5G networks and new information systems.

This is why government technology purchasing decisions are so critical. In the past, many budget-minded government agencies have defaulted to purchasing lowest priced technically acceptable, or LPTA, computers and printers, because that's how they've always done it. With cyber threats against government institutions increasing in frequency and maliciousness, every agency should consider purchasing equipment from vendors with trustworthy supply chains.

Next year I would expect more progress around government legislation, such as the recently passed House Resolution 2500 and Senate Bill 1790, which aim to bring greater accountability to the nation's procurement processes and make agencies smarter buyers.

4. Ambient technology energizes workers.

As the office of the future takes shape and employees increasingly work from multiple locations, the technology underlying physical spaces will adapt for remote employees. It will work in the background to invisibly empower people to communicate and collaborate anytime and anywhere.

This approach, called ambient computing, isn't entirely new. The idea has been around since the late 1980s, when Mark Weiser, a scientist at Xerox PARC, described its precursor, ubiquitous computing, in which he imagined people interacting with computers wherever they might be. Of course, the technology didn't really exist back then to make that happen. Gartner's 2020 technology trends refer to it as "multiexperience," framing it as the replacement of technology-literate people with people-literate technology. The rise of mobile and connected devices, and technologies like artificial intelligence, virtual reality and augmented reality, are all ways that ambient technology could become part of everyday life for government workers.

In the near future, technology could help us in every phase of our day: traffic and direction recommendations, connected devices keeping our projects and materials fully updated, and even suggestions on where to stop after work for happy hour.

In 2020, don't be surprised to hear about more breakthroughs in ambient technology and how it's playing an integral role in every office, including government agencies.

5. AI continues its march on Washington.

At times, artificial intelligence sounds like some magical technology that can cure almost any government ill. The fact is, AI algorithms are great at some things and not so good at others. For government agencies, AI will become increasingly important in 2020 because of its ability to automate time-consuming tasks, such as data research, and create efficiencies for government employees. It also presents amazing opportunities for instinctively detecting and guarding against unknown, unforeseen, or zero-day cyberattacks that most IT software wouldn't catch.

Looking ahead to 2020, there are undoubtedly many more trends likely to emerge and influence government spending and use of information technology. One thing is certain, though: the key for every agency in the coming decade will be to ditch the old LPTA procurement model and focus instead on technology that delivers better operational efficiency, productivity and security. Most important for government is acquiring the best security.

Tommy Gardner is chief technology officer of HP Federal.

Continue reading here:

The 5 Most Important Federal Government Tech Predictions to Watch in 2020 - Nextgov

Written by admin

December 30th, 2019 at 8:49 pm

Posted in Quantum Computing

5 open source innovation predictions for the 2020s – TechRepublic

Posted: at 8:49 pm



IBM's CTO of Open Technology also looks back at the innovations of the past decade.

Open source played a significant role in software development over the past decade from containers to microservices, blockchain and serverless.

Chris Ferris, chief technology officer of Open Technology at IBM, discusses some of the open source trends from the past decade and what to expect in 2020 and beyond.

SEE: Deploying containers: Six critical concepts (TechRepublic)

The concepts of containers and microservices were merely concepts before 2010, Ferris said. Then Docker launched in 2013, planting the early seeds of the container industry.

At the same time, microservices and the technologies to make them possible were born in open source through the Netflix OSS project.

Docker went on to become one of the most influential technologies of the 2010s, giving rise to a myriad of new open source projects, including Kubernetes, which launched in 2015.

Today, he noted, Kubernetes is the largest open source project on the planet. Companies are using the platform to transform monolithic application architectures, embracing containerized microservices that are supported by service mesh capabilities of projects such as Istio.

"In the next decade, we anticipate that open source projects such as Istio, Kubernetes and OKD will focus on making containers and microservices smaller and faster to serve the needs of cloud-native development and to reduce the container's attack surface," Ferris said.

OKD is the open source version of Red Hat's OpenShift platform. "Keep an eye on unikernels (executable images that contain system libraries, a language runtime, and necessary applications), which may also gain traction thanks to the open source communities around them."

AWS Lambda was released in 2014 and put all the PaaS services on notice. Lambda's release was followed by IBM OpenWhisk (which became Apache OpenWhisk), among others, in 2016. Both open source, distributed serverless platforms execute functions in response to events at any scale, Ferris said.
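The execution model both platforms share can be sketched as an event-driven handler. The (event, context) signature below follows the AWS Lambda Python convention, and the local call at the end merely simulates the platform delivering an event:

```python
# Serverless model in miniature: the platform, not the programmer,
# decides when and how many times this function runs.
def handler(event, context=None):
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# Simulate the platform invoking the handler for one event:
response = handler({"name": "serverless"})
# response["body"] == "Hello, serverless!"
```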

Kubernetes gained prominence in the latter part of the decade, fueling the desire to extend it with capabilities that would enable serverless. This gave rise to Knative in 2018. Knative has since split into multiple open source projects, including Tekton, each with its own set of innovations, he said.

In the next few years, Ferris said, we can expect to see containers get smaller and faster. "The potential exists to have an environment that can run containers at very little cost, instantaneously," pushing the boundaries of serverless platforms, he said.

IBM Watson made a huge splash when it appeared on "Jeopardy!" in 2011, bringing artificial intelligence into the mainstream. Now, Ferris noted, AI is part of our everyday lives and we interact with Siri and Alexa daily, talk with customer service chatbots regularly, use facial recognition to unlock our gadgets, and are nearing the advent of fully autonomous self-driving cars.

AI and machine learning have powered these innovations and many of the AI advancements came about thanks to open source projects such as TensorFlow and PyTorch, which launched in 2015 and 2016, respectively.

In the next decade, Ferris stressed the importance of not just making AI smarter and more accessible, but also more trustworthy. This will ensure that AI systems make decisions in a fair manner, aren't vulnerable to tampering, and can be explained, he said.

Open source is the key for building this trust into AI. Projects like the Adversarial Robustness 360 Toolkit, AI Fairness 360 Open Source Toolkit, and AI Explainability 360 Open Source Toolkit were created to ensure that trust is built into these systems from the beginning, he said.

Expect to see these projects, and others from the Linux Foundation AI such as ONNX, drive significant innovation related to trusted AI in the future. The ONNX project provides a vendor-neutral interchange format for deep learning and machine learning models.

In 2008, the pseudonymous Satoshi Nakamoto published the now famous paper on bitcoin, which introduced the concept of a blockchain network as the foundation for a decentralized cryptocurrency platform.

That innovation made people start to wonder about different ways that the blockchain concepts and technology might be applied in non-cryptocurrency use cases in asset management, supply chains, healthcare, and identity, among others, Ferris said.

In 2015, IBM contributed its Open Blockchain project to the newly established Hyperledger organization, founded to develop open source blockchain technology for the enterprise. That contribution launched what has arguably become one of the two or three most popular blockchain frameworks: Hyperledger Fabric, he said.

While blockchain's initial uses were confined to cryptocurrency, open source engagement around Hyperledger and Ethereum has expanded the possibilities for how this technology is used.

In the enterprise, different approaches are being explored not only to enhance privacy but also to build the collection of nodes required to confirm a transaction with trust, almost all in open source, he said.

There has been lots of buzz around the promise of quantum computing, and although an app with a "quantum advantage" hasn't been developed yet, the ability for developers to start using quantum processors is growing and will continue to evolve in the next decade, Ferris said.

IBM's open source Qiskit software framework, released in 2017, lets developers code in Python on real quantum hardware for applications in research, education, business, and even games.

"The possibilities for how quantum computing will solve problems and interact with today's technology seem endless: quantum computing could impact a wide range of domains, such as chemistry, finance, artificial intelligence, and others," he said. Making that happen will require a "significant hardware environment," Ferris said.

Open source is the best mechanism to bring about these changes, he maintained. That is what spawned ideas like microservices, which grew out of the virtualization space, and Knative, which grew out of Kubernetes.

"That wouldn't have happened in the closed source space, so it's a matter of everyone building up on everyone else's successes and someone coming along and saying, 'Here's a better idea,'" he said.

Working together, developers have the power to change entire industries, Ferris believes. "I can't think of anything that's been developed exclusively in closed source that didn't eventually come out in open source."



Excerpt from:

5 open source innovation predictions for the 2020s - TechRepublic

Written by admin

December 30th, 2019 at 8:49 pm

Posted in Quantum Computing

20 technologies that could change your life in the next decade – Economic Times

Posted: at 8:49 pm


without comments

The decade that's knocking on our doors now, the 2020s, is likely to be a time when science fiction manifests itself in our homes and roads and skies as viable, everyday technologies. Cars that can drive themselves. Meat that is derived from plants. Robots that can be fantastic companions both in bed and outside.

Implanting kidneys that can be 3-D printed using your own biomaterial. Using gene editing to eradicate diseases, increase crop yield or fix genetic disorders in human beings. Inserting a swarm of nanobots that can cruise through your bloodstream and monitor parameters or unblock arteries. Zipping between Delhi and New York on a hypersonic jet. All of this is likely to become possible or substantially closer to becoming a reality in the next 10 years.

Ideas that have been the staple of science fiction for decades, such as artificial intelligence, universal translators, sex robots, autonomous cars, gene editing and quantum computing, are at the cusp of maturity now. Many are ready to move out of labs and enter the mainstream. Expect the next decade to witness breakout years for the world of technology.

Read on:

The 2020s: A new decade promising miraculous tech innovations

Universal translators: End of language barrier

Climate interventions: Clearing the air from carbon

Personalised learning: Pedagogy gets a reboot with AI

Made in a Printer: 3-D printing going to be a new reality

Digital money: End of cash is near, cashless currencies are in vogue

Singularity: An era where machines will out-think humans

Mach militaries: Redefining warfare in the 2020s

5G & Beyond: Ushering a truly connected world

Technology: Solving the problem of clean water

Quantum computing: Beyond the power of classical computing

Nanotechnology: From science fiction to reality

Power Saver: Energy-storage may be the key to maximise power generation

Secret code: Gene editing could prove to be a game-changer

Love in the time of Robots: The rise of sexbots and artificial human beings

Wheels of the future: Flying cars, hyperloops and e-highways will transform how people travel

New skies, old fears: The good, bad & ugly of drones

Artificial creativity: Computer programs could soon churn out books, movies and music

Meat alternatives: Alternative meat market is expected to grow 10 times by 2029

Intelligent robots & cyborg warriors will lead the charge in battle

Why we first need to focus on the ethical challenges of artificial intelligence

It's time to reflect honestly on our motivations for innovation

India's vital role in new space age

Plastic waste: Environment-friendly packaging technologies will gain traction

Read more:

20 technologies that could change your life in the next decade - Economic Times


Donna Strickland appointed to Order of Canada – University of Rochester

Posted: at 8:49 pm


without comments

December 30, 2019

University of Rochester alumna Donna Strickland '89 (PhD), who shared the 2018 Nobel Prize in Physics, has been appointed to the Order of Canada.

The award recognizes individuals who have made extraordinary contributions to the nation. Strickland was appointed a Companion of the Order, the highest of three levels of the award. There can be no more than 165 living companions at any time.

The professor of physics at the University of Waterloo in Ontario, Canada, is being recognized for her contributions to optical physics and for her innovative developments in ultra-fast optical science.

"I feel so proud and privileged to be Canadian and I am thrilled to receive this recognition from my country," Strickland told CBC News. "It is an exceptional honor for me to be named a companion of the Order of Canada. This award means a great deal to me."

Strickland and Gérard Mourou, former engineering professor and scientist at the University of Rochester's Laboratory for Laser Energetics (LLE), were together recognized with the 2018 Nobel Prize for revolutionizing the field of high-intensity laser physics.

Mourou was Strickland's PhD advisor during the time they pioneered chirped-pulse amplification. Known as CPA, this work was the basis of Strickland's PhD dissertation in optics.

Today, CPA has applications in corrective eye surgeries and other surgical procedures, data storage, and quantum computing.

Tags: alumni, announcement, Institute of Optics, Laboratory for Laser Energetics, Nobel Prize

Category: University News

More:

Donna Strickland appointed to Order of Canada - University of Rochester


The Quantum Computing Decade Is ComingHeres Why You Should Care – Observer

Posted: December 21, 2019 at 9:52 am


without comments

Google's Sycamore quantum processor. Erik Lucero, Research Scientist and Lead Production Quantum Hardware

Multiply 1,048,589 by 1,048,601, and you'll get 1,099,551,473,989. Does this blow your mind? It should, maybe! That 13-digit number, a product of two primes, is the largest ever factored by a quantum computer, one of a series of quantum computing-related breakthroughs (or at least claimed breakthroughs) achieved over the last few months of the decade.
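The multiplication itself is trivial for a classical machine; the hard direction is going from the 13-digit product back to its two factors. A quick plain-Python check (nothing quantum here) of the numbers quoted above:

```python
def is_prime(n: int) -> bool:
    """Trial division; plenty fast for 7-digit candidates."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

p, q = 1_048_589, 1_048_601
product = p * q
print(product)                    # 1099551473989
print(is_prime(p), is_prime(q))   # both factors are prime, so the product is a semiprime
```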

An IBM computer factored this very large number about two months after Google announced that it had achieved quantum supremacy, a clunky term for the claim, disputed by rivals including IBM as well as others, that Google has a quantum machine that performed some math normal computers simply cannot.


Quantum computing remains an arcane field existing mostly in the theoretical, but quantum computers have done enough recently, and are commanding enough very real public and private resources, to be deserving of your attention; not least because if and when the Chinese government becomes master of all your personal data, sometime in the next decade, it will be because a quantum computer cracked the encryption.

Building the quantum computer, it is said, breathlessly, is a race to be won, as important as being the first in space (though, ask the Soviet Union how that worked out) or fielding the first workable atomic weapon (seems to be going OK for the U.S.).

And so here is a post, written in terms as clear and simple as this human could muster, summing up these recent advances and repeating other experts' predictions that the 2020s appear to be the decade when quantum computers begin to contribute to your life, by both making slight improvements to your map app and powering artificial intelligence robust and savvy enough to be a real-life Skynet.

First, the requisite introduction to the concept. Normal computers, such as the device you are using to access and display this content, process information in binary. Everything is either a one or a zero, or a series of ones and zeroes. On, or off. But what if the zero was simultaneously also a one? (Please exit here for your requisite digression into quantum physics and mechanics.)

The idea that a value can be a zero, or a one, or both at the same time is the quantum principle of superposition. Each superposition is a quantum bit, or qubit. The ability to process qubits is what allows a quantum computer to perform functions a binary computer simply cannot, like computations involving 500-digit numbers. To do so quickly and on demand might allow for highly efficient traffic flow. It could also render current encryption keys mere speedbumps for a computer able to replicate them in an instant.
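Superposition can be made concrete by simulating a single qubit's state classically. The sketch below is a plain-Python illustration (not anything the companies mentioned here use): it applies a Hadamard gate, the standard operation that puts |0⟩ into an equal superposition, and reads off the measurement probabilities.

```python
import math

# A single qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring yields 0 with probability |alpha|^2
# and 1 with probability |beta|^2.
ket0 = (1.0, 0.0)  # the classical "zero", written |0> in Dirac notation

def hadamard(state):
    """Apply a Hadamard gate: maps |0> to the superposition (|0> + |1>)/sqrt(2)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

plus = hadamard(ket0)                   # now "zero and one at the same time"
probs = [abs(amp) ** 2 for amp in plus]
print(probs)                            # each outcome has probability ~0.5
```

Until measured, the qubit carries both amplitudes at once; measurement collapses it to a definite 0 or 1 with the probabilities printed above.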

An artist's rendition of Google's Sycamore quantum processor mounted in a cryostat. Forest Stearns, Google AI Quantum Artist in Residence

Why hasn't this been mastered already, and what's holding quantum computers back? Particles like photons only exist in quantum states if they are either compressed very, very small or made very, very cold, with analog engineering techniques. What quantum computers do exist are thus resource-intensive. Google's, for example, involves metals cooled (the verb is inadequate) to 460 degrees below zero, to a state in which particles behave in an erratic and random fashion akin to a quantum state.

And as Subhash Kak, the regents professor of electrical and computer engineering at Oklahoma State University and an expert in the field, recently wrote, the power of a quantum computer can be gauged by how many quantum bits, or qubits, it can process. "The machines built by Google, Microsoft, Intel, IBM and possibly the Chinese all have less than 100 qubits," he wrote. (In Google's case, the company claims to have created a quantum state of 53 qubits.)

To achieve useful computational performance, according to Kak, you probably need machines with hundreds of thousands of qubits. And what qubits a quantum computer can offer are notoriously unstable and prone to error. They need many of the hard-won fixes and advancements that saw binary computers morph from room-sized monstrosities spitting out punch cards to iPhones.

How fast will that happen, and can it happen?

Skeptics, doubters, and haters might note that Google first pledged to achieve quantum supremacy (defined as the point in time at which quantum computers are outperforming binary computers) by the end of 2017, meaning its achievement was almost two full years behind schedule, and meaning other quantum claims, like IBM's Dario Gil's pledge that quantum computers will be useful for commercial and scientific advantage sometime next year, may also be dismissed or at least subject to deserved skepticism.

Dario Gil, director of IBM Research, stands in front of IBM's Q System One quantum computer on October 18, 2019. Misha Friedman/Getty Images

And those of us who can think only in binary may also find confusion in the dispute between quantum rivals. The calculation performed by Google's Sycamore quantum computer in 200 seconds, the company claimed, would take a normal binary supercomputer 10,000 years to solve. Not so, according to IBM, which asserted that the calculation could be done by a binary computer in two and a half days. Either way, as The New York Times wrote, quantum supremacy is still a very arcane experiment that can't necessarily be applied to other things. Google's breakthrough might be the last achievement for a while.

But everybody is trying, including the U.S. government, which is using your money to do it. Commercial spending on quantum computing research is estimated to reach hundreds of millions of dollars sometime in the next decade. A year ago, spooked and shamed by what appeared to be an unanswered flurry of quantum progress in China, Congress dedicated $1.2 billion to the National Quantum Initiative Act, money specifically intended to boost American-based quantum computing projects. According to Bloomberg, China may have already spent 10 times that.

If you walk away with nothing else, know that quantum computer spending is very real, even if the potential is theoretical.

See the rest here:

The Quantum Computing Decade Is ComingHeres Why You Should Care - Observer


2020 and beyond: Tech trends and human outcomes – Accountancy Age

Posted: at 9:52 am


without comments

The next decade promises to offer both incredible opportunity and challenge for all of us. Technologies like artificial intelligence (and its close friend, machine learning) will no longer be considered new but will instead be at the heart of some huge disruptive changes that will run right through our society. In particular, AI will start to enable the automation of many things that were previously deemed too complex or even too human.

We'll see these changes at work: traditional professions like accountancy and law will, over time, see significant portions of what they do taken over by virtual robots. Vocations such as lorry drivers, taxi drivers and even chefs may disappear as machines are introduced to perform the same function but with more consistent results and less risk.

We'll also see these changes at home, as AI will bring a host of new changes to how we live. AI will help us speak any language to anyone in the world, it will help us discover and create new content, and maybe even help us decide what food to eat and when we should rest (and for how long!) in order to help us live lives that are not just more healthy, but more productive and of course more fun.

We'll (hopefully) see these changes at school and in education too, when we finally realise that in the 21st century, simply knowing stuff is no longer enough. Instead we might seek to use AI to build personalised learning schemes that tailor learning for every unique student such that they can reach their true potential regardless of their background, ability to learn or particular strengths and weaknesses. This could also mean the end of exams and tests as we know them, as we move away from the unnecessary stress and futility of a single measure of knowledge taken at a single moment in time to a world of continuous assessment, where the system is able to measure progress as a by-product of the work that the student does every single day.

As for the technology itself, it's going to continue to get quicker, cheaper, more powerful and smaller. Your huge smartphone may not be so huge by the time we get to 2030; in fact it may not be a phone at all, but instead a small implant that you have inserted under your skin, just like the ones we use today for our pets.

We'll also see the introduction of new game-changing technologies like quantum computing. Don't be fooled, this is not just another computer but faster; the power and potential quantum computing offers us is almost unimaginable. Today's quantum computers are limited, complex machines that require an extreme environment in which to run (most early quantum computers need to run at -273 degrees centigrade), so don't think you're going to see one in your office or your home any time soon. But they are important because of the scale at which they operate. In simple terms, the power of today's quantum computers is measured at around 50 qubits (a qubit is a quantum computer's basic unit of power, a bit like the digital equivalent of horse power); scientists believe that when we can get quantum computers to 500 qubits, those computers will be able to consider as many states as there are atoms in the world, and at the same time! This is a kind of computational power that we can't even begin to imagine.
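The scaling claim can be made concrete: an n-qubit register is described by 2**n amplitudes, so the state space doubles with every qubit added. A back-of-the-envelope sketch in plain Python, taking the usual rough estimate of 10**80 atoms in the observable universe:

```python
# The number of basis states for an n-qubit register is 2**n.
states_53 = 2 ** 53      # roughly the scale of today's largest processors
states_500 = 2 ** 500    # the scale the paragraph above speculates about

ATOMS_IN_OBSERVABLE_UNIVERSE = 10 ** 80   # common rough estimate

print(states_53)                                   # 9007199254740992, ~9 quadrillion
print(states_500 > ATOMS_IN_OBSERVABLE_UNIVERSE)   # True, by a huge margin
# In fact 2**n overtakes the atom count at only ~266 qubits:
print(2 ** 266 > ATOMS_IN_OBSERVABLE_UNIVERSE)     # True
```

Of course, having 2**n amplitudes is not the same as getting 2**n answers out; extracting useful results is exactly what quantum algorithms are for.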

Oh, and robots too. These won't be the industrial robots you're used to seeing; they might not even be the science-fiction-looking robots (you know, the ones that start as friends and then take over the world). These robots are going to be not just our friends, they'll be a part of our families. It's already started. If you have a smart speaker at home, you've got an early ancestor of something that will end up becoming your own personal C3PO, not just there to help you but there to provide companionship and friendship while you go about your busy lives.

But all this won't be without some risks.

Massive parts of our current labour market will be challenged by the rise of the machines. Our kids will continue to lack the skills they're going to need to thrive, and we adults are going to struggle to make sense of it all at home and at work.

The machines won't be perfect either. Seeing as they're created by humans, they end up with some human problems as a result: algorithmic bias will be one of the defining challenges of 2020 and beyond, and it's going to take a lot of human effort to get all of us to a point where we can trust our lives to the algorithms alone.

The good news in all of this is that the end result is still ultimately down to us humans. The real answer to what 2020 will hold for technology, and how it affects us in our everyday lives, will continue to be all about how we humans choose to use it. I'm hopeful for a new era in 2020, one where we turn the corner in our relationship with technology and look not for dystopia, but instead seek to ensure everyone has the right skills and ambition to build the utopia we deserve. To get there we need to teach our kids (and ourselves!) to break free of the technology that traps and disconnects us, and instead use the same technology to elevate what we could achieve, not by replacing us, but by freeing us to do all of the amazing things that the technology alone cannot do. The best future awaits those that can combine the best of technological capability with the best of human ability.

Dave Coplin is former Chief Envisioning Officer for Microsoft UK. He has written two books and worked all over the world with organisations, individuals and governments, all with the goal of demystifying technology and championing it as a positive transformation in our society.

Excerpt from:

2020 and beyond: Tech trends and human outcomes - Accountancy Age



Page 13«..11121314