
Archive for the ‘Quantum Computer’ Category

Quantum computing: Photon startup lights up the future of computers and cryptography – ZDNet

Posted: October 8, 2020 at 2:54 am


without comments

A fast-growing UK startup is quietly making strides in the promising field of quantum photonics. Cambridge-based company Nu Quantum is building devices that can emit and detect quantum particles of light, called single photons. With a freshly secured £2.1 million ($2.71 million) seed investment, these devices could one day underpin sophisticated quantum photonic systems, for applications ranging from quantum communications to quantum computing.

The company, spun out of the University of Cambridge after eight years of research at the Cavendish Laboratory, is developing high-performance light-emitting and light-detecting components that operate at the single-photon level and at ambient temperature, and is building a business on the combination of quantum optics, semiconductor photonics, and information theory.

"Any quantum photonic system will start with a source of single photons, and end with a detector of single photons," Carmen Palacios-Berraquero, the CEO of Nu Quantum, tells ZDNet. "These technologies are different things, but we are bringing them together as two ends of a system. Being able to controllably do that is our main focus."

SEE: Hiring Kit: Computer Hardware Engineer (TechRepublic Premium)

As Palacios-Berraquero stresses, even generating single quantum particles of light is very technically demanding.

In fact, even the few quantum computers that exist today, which were designed by companies such as Google and IBM, rely on the quantum states of matter, rather than light. In other words, the superconducting qubits that can be found in those tech giants' devices rely on electrons, not photons.

Yet the superconducting qubits found in current quantum computers are, famously, very unstable. The devices have to operate at temperatures colder than those found in deep space, because thermal vibrations can cause qubits to fall out of their quantum state. Besides being impractical, this makes it a huge challenge to scale up the number of qubits in the computer.

A photonic quantum computer could have huge advantages over its matter-based counterpart. Photons are much less prone to interact with their environment, which means they can retain their quantum state for much longer and over long distances. A photonic quantum computer could, in theory, operate at room temperature and as a result, scale up much faster.

The whole challenge comes from creating the first quantum photon, explains Palacios-Berraquero. "Being able to emit one photon at a time is a ground-breaking achievement. In fact, it has become the Holy Grail of quantum optics."

"But I worked on generating single photons for my PhD. That's the IP I brought to the table."

Carmen Palacios-Berraquero and the Nu Quantum team just secured a £2.1 million ($2.71 million) seed investment.

Combining this intellectual property with improved technologies in nanoscale semiconductor fabrication, Palacios-Berraquero and her team set out to crack the single-photon generation problem.

Nu Quantum's products come in the form of two little boxes: the first one generates the single photons that can be used to build quantum systems for various applications, and the other measures the quantum signals emitted by the first one. The technology, maintains the startup CEO, is bringing quantum one step closer to commercialization and adoption.

"Between the source and the detector of single photons, many things can happen, from the simplest to the most complex," explains Palacios-Berraquero. "The most complex one being a photonic quantum computer, in which you have thousands of photons on one side and thousands of detectors on the other. And in the middle, of course, you have gates, and entanglement, and so on. But that's the most complex example."

A photonic quantum computer is still a very long-term ambition of the startup CEO. A simpler application, which Nu Quantum is already working on delivering commercially with the UK's National Physical Laboratory, is quantum random number generation, a technology that can significantly boost the security of the cryptographic keys that secure data.

The keys that are currently used to encrypt the data exchanged between two parties are generated thanks to classical algorithms. Classical computing is deterministic: a given input will always produce the same output, meaning that complete randomness is fundamentally impossible. As a result, classical algorithms are predictable to an extent. In cryptography, this means that security keys can be cracked fairly easily, given sufficient computing power.
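The determinism point can be made concrete with a minimal Python sketch. This is illustrative only, not a real key-generation scheme: seeding a pseudorandom generator with the same value reproduces the same "key" exactly, which is why an attacker who can narrow down the seed can recover it.

```python
import random

def keygen(seed: int, nbytes: int = 16) -> bytes:
    """Toy key generator: fully determined by its seed
    (illustration only, not how real cryptographic keys are made)."""
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(nbytes))

# A given input always produces the same output: anyone who can
# guess the seed reproduces the "secret" key exactly.
assert keygen(42) == keygen(42)
assert keygen(42) != keygen(43)
```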

Not so much with quantum. A fundamental property of quantum photons is that they behave randomly: for example, if a single photon is sent down a path that splits in two, there is no way of knowing in advance which way the particle will go.

SEE: What is the quantum internet? Everything you need to know about the weird future of quantum networks

The technology that Nu Quantum is developing with the National Physical Laboratory, therefore, consists of a source of single photons, two detectors, and a two-way path linking the three devices. "If we say the right detector is a 1, and the left detector is a 0, you end up with a string of numbers that's totally random," says Palacios-Berraquero. "The more random, the more unpredictable the key is, and the more secure the encryption."
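The detector-labeling scheme Palacios-Berraquero describes can be simulated in a few lines. This is only a classical software stand-in: a real quantum random number generator gets its 50/50 outcomes from photon measurements, not from a pseudorandom generator.

```python
import random

def simulated_qrng_bits(n: int) -> str:
    """Classical *simulation* of the beam-splitter scheme described
    above: each photon reaches the right detector (1) or the left
    detector (0) with equal probability. A real device derives this
    choice from quantum measurement, not from software."""
    return "".join("1" if random.random() < 0.5 else "0" for _ in range(n))

bits = simulated_qrng_bits(1024)
print(bits[:32], "fraction of ones:", bits.count("1") / len(bits))
```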

Nu Quantum is now focusing on commercializing quantum random number generation, but the objective is to build up systems that are increasingly complex as the technology improves. Palacios-Berraquero expects that in four or five years, the company will be able to start focusing on the next step.

One day, she hopes, Nu Quantum's devices could be used to connect quantum devices in a quantum internet, a decades-long project contemplated by scientists in the US, the EU, and China, which would tap the laws of quantum mechanics to almost literally teleport quantum information from one quantum device to the next. Doing so is likely to require single photons to be generated and distributed between senders and receivers, because of the light particles' capacity to travel longer distances.

In the shorter term, the startup will focus on investing the seed money it has just raised. On the radar is a brand-new lab and headquarters in Cambridge, and tripling the size of the team with a recruitment drive for scientists, product team members and business functions.

Read the rest here:

Quantum computing: Photon startup lights up the future of computers and cryptography - ZDNet

Written by admin

October 8th, 2020 at 2:54 am

Posted in Quantum Computer

Canadian quantum computing firms partner to spread the technology – IT World Canada

Posted: at 2:54 am


without comments

In a bid to accelerate this country's efforts in quantum computing, 24 Canadian hardware and software companies specializing in the field are launching an association this week to help their work get commercialized.

Called Quantum Industry Canada, the group says it represents Canada's most commercial-ready technologies, covering applications in quantum computing, sensing, communications, and quantum-safe cryptography.

The group includes Burnaby, B.C., manufacturer D-Wave Systems; Vancouver software developer 1Qbit; Toronto's photonic quantum computer maker Xanadu Quantum Technologies; the Canadian division of software maker Zapata Computing; Waterloo, Ont.-based ISARA, which makes quantum-safe solutions; and others.

"The quantum opportunity has been brewing for many years," association co-chair Michele Mosca of the University of Waterloo's Institute for Quantum Computing and the co-founder of two quantum startups said in an interview, explaining why the new group is starting now. "Canada's been a global leader at building up the global opportunity, the science, the workforce, and we didn't want this chance to pass. We've got over 24 innovative companies, and we wanted to work together to make these companies a commercial success globally."

It's also important to get Canada known as a leader in quantum-related products and services, he added. "This will help assure a strong domestic quantum industry as we enter the final stages of quantum readiness."

And while quantum computing is a fundamentally new tool, Mosca said, it's also important for Canadian organizations to start planning for a quantum computing future, even if the real business value isn't obvious yet. "We don't know exactly when you'll get the real business advantage; you want to be ready for when quantum computers can give you an advantage."

Adib Ghubril, research director at Toronto-based Info-Tech Research Group, said in an interview that the creation of such a group is needed. "When you want to foster innovation you want to gain critical mass, a certain number of people working in different disciplines. It will help motivate them, maybe even to compete."

Researchers from startups and even giants like Google, Microsoft, Honeywell and IBM have been throwing billions at creating quantum computers. So are countries, especially China, but also Australia, the U.K., Germany and Switzerland. Many big-name firms are touting projects with experimental equipment, or hybrid hardware that does accelerated computations but doesn't meet the standard definition of a quantum computer.

True quantum computers may be a decade off, some suggest. Ghubril thinks we're 15 years from what he calls reliable, effective quantum computing. Still, last December IDC predicted that by 2023, one-quarter of the Fortune Global 500 will gain a competitive advantage from emerging quantum computing solutions.


Briefly, quantum computers apply the theory of quantum mechanics to go beyond traditional computation, in which a bit is always either a zero or a one. In a quantum computer the basic element, called a qubit, can be a zero and a one at the same time. With their expected ability to do astonishingly fast computations, quantum computers may be able to help pharmaceutical companies create new drugs, and nation-states break the encryption protecting government secrets.
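A toy model helps make the qubit idea concrete. Below, a qubit is represented by two amplitudes whose squared magnitudes give the measurement probabilities; an equal superposition yields 0 or 1 with probability one half each. This is an illustrative simulation, not how any of the companies above program their hardware.

```python
import math
import random

def measure(alpha: complex, beta: complex) -> int:
    """A qubit state is a pair of amplitudes (alpha, beta) with
    |alpha|^2 + |beta|^2 = 1; measurement yields 0 with
    probability |alpha|^2, otherwise 1."""
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# Equal superposition: unlike a classical bit, the state is "both"
# zero and one until it is measured.
a = b = 1 / math.sqrt(2)
samples = [measure(a, b) for _ in range(10_000)]
print("fraction of ones:", sum(samples) / len(samples))  # close to 0.5
```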

Companies are taking different approaches. D-Wave uses a quantum annealing process to make machines it says are suited to solving real-world computing problems today. Xanadu uses what Mosca calls a more circuit-type computing architecture. "There's certainly the potential that some of the nearer-term technologies will offer businesses advantage, especially as they scale."

We know the road towards a full-fledged quantum computer is long. But there are amazing milestones in that direction.

Ghubril says Canada is in the leading pack of countries working on quantum computing. "The momentum out of China is enormous," he said, but it looks like the country will focus on using quantum for telecommunications and not business solutions.

From his point of view, companies are taking two approaches to quantum computers. Some, like D-Wave, are trying to use quantum ideas to optimize solving modelling problems. "The problem is not every problem is an optimization problem," he said. Other companies are trying for the Grand Poobah: the real (quantum) computer. "So the IBMs of the world are going for the gusto. They want the real deal. They want to solve the material chemistry and biosynthesis and so on. They've gone big, but by doing so they've gone slower. You can't do much on the IBM platform. You can learn a lot, but you can't do much. You can do more on a D-Wave, but you can only do one thing."

Ghubril encourages companies to dabble in the emerging technology.

That's Info-Tech's recommendation: "Just learn about it. Join a forum, open an account, try a few things. Nobody is going to gain a (financial) competitive advantage. It's a learning advantage."


See original here:

Canadian quantum computing firms partner to spread the technology - IT World Canada

Written by admin

October 8th, 2020 at 2:54 am

Posted in Quantum Computer

Google’s Billion Dollar News, Commercial Quantum Computers And More In This Week’s Top News – Analytics India Magazine

Posted: at 2:54 am


without comments

The Dutch and the Finnish are doing their part in shedding the dystopian sci-fi rep that AI usually gets. These European nations often show up on top when it comes to initiatives that take the human aspect seriously. Now they are at it again: Amsterdam and Helsinki are making moves to ensure transparency in AI applications. Not only that, but these cities want their citizens to play an active role going forward. In what may sound like a more sci-fi announcement, quantum computing industry leader D-Wave opens up its tech for business applications, making it the first to do so. There is more news, thanks to Google; find out why in this week's top news, brought to you by Analytics India Magazine.

VMware and NVIDIA are coming together to offer an end-to-end enterprise platform for AI, along with a new architecture for data center, cloud and edge services that use NVIDIA's DPUs. "We are partnering with NVIDIA to bring AI to every enterprise; a true democratization of one of the most powerful technologies," said Pat Gelsinger, CEO of VMware.

The full stack of AI software available on the NVIDIA NGC hub will be integrated into VMware vSphere, VMware Cloud Foundation and VMware Tanzu. This in turn will help accelerate AI adoption across the industry and allow enterprises to deploy AI-ready infrastructure across data centers, cloud and edge.

On Thursday, Google's CEO Sundar Pichai announced that the company would set aside $1 billion to support high-quality journalism. The blog post penned by Pichai underlined Google's mission to organize the world's information and make it universally accessible and useful. Google's News Showcase features the editorial curation of award-winning newsrooms to give readers more insight into the stories that matter, and in the process helps publishers develop deeper relationships with their audiences. Google has already signed partnerships for News Showcase with nearly 200 leading publications across Germany, Brazil, Argentina, Canada, the U.K. and Australia, and will soon expand to India, Belgium and the Netherlands.

On Tuesday, D-Wave Systems, the Canadian quantum computing company, announced the general availability of its next-gen quantum computing platform, which flaunts new hardware, software, and tools to enable and accelerate the delivery of in-production quantum computing applications. The company stated that the platform is available in the Leap quantum cloud service and includes the Advantage quantum system, with more than 5,000 qubits and 15-way qubit connectivity. In addition, there is an expanded hybrid solver service that can run problems with up to one million variables. Together, these services enable users to scale to address real-world problems, enabling businesses to run real-time quantum applications for the first time.
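For readers unfamiliar with the problem class D-Wave's solvers accept, here is a toy, brute-forced instance of a QUBO (quadratic unconstrained binary optimization). The Q matrix below is invented for illustration; real annealers and the hybrid solver service handle instances far too large to enumerate.

```python
from itertools import product

# A QUBO asks for the binary vector x minimizing sum of Q[i,j] * x[i] * x[j].
# This tiny invented instance has 3 variables; we brute-force all 2^3 states,
# which is exactly what becomes infeasible at the ~1M-variable scale
# the hybrid solver service targets.
Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0, (0, 1): 2.0, (1, 2): 2.0}

def energy(x):
    """Objective value of a candidate assignment x."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

best = min(product((0, 1), repeat=3), key=energy)
print(best, energy(best))  # (1, 0, 1) -2.0
```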

The PyTorch team has announced that developers can leverage its libraries on Cloud TPUs. The PyTorch/XLA library, said the team, has reached general availability (GA) on Google Cloud and supports a broad set of entry points for developers. It has a fast-growing community of researchers from MIT, Salesforce Research, Allen AI and elsewhere who train a wide range of models accelerated with Cloud TPUs and Cloud TPU Pods.

According to PyTorch, the aim of this project was to make it as easy as possible for the PyTorch community to leverage the high performance capabilities that Cloud TPUs offer while maintaining the dynamic PyTorch user experience. To enable this workflow, the team created PyTorch / XLA, a package that lets PyTorch connect to Cloud TPUs and use TPU cores as devices.

GitHub announced that its code scanning option, CodeQL, is now generally available to all developers. It scans code as it's created and surfaces actionable security reviews within pull requests and other GitHub experiences you use every day, automating security as part of your workflow. This helps ensure vulnerabilities never make it to production in the first place. Code scanning is powered by CodeQL, which GitHub calls the world's most powerful code analysis engine, and will enable developers to use the 2,000+ CodeQL queries created by GitHub and the community, or create custom queries to easily find and prevent new security concerns.

No two palms are alike. That's the idea behind Amazon One, a new service by the e-commerce giant which allows customers to pay with their palm. Contactless payments were all the rage this pandemic, and Amazon wants to step up its technology at one of its stores. All you need is a credit card, your mobile number and, of course, your palm. Once you're signed up, you can use your palm to enter, identify yourself, and pay where Amazon One is available. Governments around the world have started to ease restrictions on public spaces like malls and stadiums, and services like Amazon One might see a huge rise in demand, because touching surfaces is so 2019!

On Monday, Amsterdam and Helsinki launched AI registries to detail how the respective governments use algorithms to deliver services. The AI Register is a window into the artificial intelligence systems used by these cities. Through the register, citizens can get acquainted with quick overviews of each city's artificial intelligence systems or examine more detailed information based on their own interests. They can also give feedback and thus participate in building human-centred AI.


Go here to read the rest:

Google's Billion Dollar News, Commercial Quantum Computers And More In This Week's Top News - Analytics India Magazine

Written by admin

October 8th, 2020 at 2:54 am

Posted in Quantum Computer

SC20 Invited Speakers Tackle Challenges for the Earth, Its Inhabitants, and Our Security Using ‘More Than HPC’ – HPCwire

Posted: at 2:54 am


without comments

Oct. 5, 2020 – The Invited Talks for SC20 represent the breadth, depth and future outlook of technology and its societal and scientific impact. HPC has always played a critical role in advancing breakthroughs in weather and climate research. This year's invited talks extend this further to data-driven approaches, including biodiversity, geoscience, and quantum computing. Our speakers will also touch on responsible application of HPC and new technological developments to highlight the impact of this potent and versatile technology on a wide range of applications.

Hear these illustrious speakers during SC20 Invited Talks, Tuesday through Thursday, November 17-19.

Lorena Barba (George Washington University) will explore the need for trustworthy computational evidence through transparency and reproducibility. With the explosion of new computational models for vital research, including COVID-19, applications of such importance to society highlight the requirement of building trustworthy computational models. Emphasizing transparency and reproducibility has helped us build more trust in computational findings. How should we adapt our practices for reproducibility to achieve unimpeachable provenance, and reach full accountability of scientific evidence produced via computation?

Shekhar Borkar (Qualcomm Inc.) will speak on the future of computing in the so-called post-Moore's-law era. While speculation about the end of Moore's law has created some level of fear in the community, this ending may not come as soon as we think. This talk will revisit the historic predictions of the end, and discuss promising opportunities and innovations that may further Moore's law and continue to deliver unprecedented performance for years to come.

Dalia A. Conde (University of Southern Denmark) will offer a presentation on fighting the extinction crisis with data. With biodiversity loss identified by the World Economic Forum as one of humanity's greatest challenges, computational methods are urgently needed to secure a healthier planet. We must design and implement effective species conservation strategies, which rely on vast and disparate volumes of data, from genetics and habitat to legislation and human interaction. This talk will introduce the Species Knowledge Index initiative, which aims to map, quantify, analyze, and disseminate open information on animal species to policy makers and conservationists around the globe.

Tom Conte (Georgia Tech) will examine HPC after Moore's law. Whether Moore's law has ended, is about to end, or will never end, the slowing of the semiconductor innovation curve has left the industry looking for alternatives. Different approaches, beyond quantum or neuromorphic computing, may disrupt current algorithms and software development. This talk will preview the road ahead, and suggest some exciting new technologies on the horizon.

Marissa Giustina (Google LLC) will share the challenges and recent discoveries in the development of Google's quantum computer, from both the hardware and quantum-information perspectives. This prototype hardware holds promise as a platform for tackling problems that have been impossible to address with existing HPC systems. The talk will include recent technological developments, as well as some perspective for the future of quantum computing.

Patrick Heimbach (The University of Texas at Austin) will discuss the need for advanced computing to help solve the global ocean state estimation problem. Because of the challenge of observing the full-depth global ocean circulation in its spatial detail, numerical simulations play an essential role in quantifying patterns of climate variability and change. New methods that are being developed at the interface of predictive data science remain underutilized in ocean climate modeling. These methods face considerable practical hurdles in the context of HPC, but will be indispensable for advancing simulation-based contributions to real world problems.

Simon Knowles (Graphcore) will discuss the reinvention of accelerated computing for artificial intelligence. As HPC changes in response to the needs of the growing user community, AI can harness enormous quantities of processing power even as we move towards power-limited computing. To balance these needs, the intelligence processing unit (IPU) architecture is able to capture learning processes and offer massive heterogeneous parallelism. This ground-up reinvention of accelerated computing will show considerable results for real applications.

Ronald P. Luijten (Data Motion Architecture and Consulting GmbH) will offer a presentation on the data-centric architecture of a weather and climate accelerator. Using a co-design approach, a non-von Neumann accelerator targeting weather and climate situations was developed in tandem with the application code to optimize memory bandwidth. This also led to the filing of a patent for a novel CGRA (coarse-grained reconfigurable array) layout that reflects grid points in the physical world. The talk will include benchmarks achieved in the project, and a discussion of next steps.

Catherine (Katie) Schuman (Oak Ridge National Laboratory) will introduce us to the future of AI and HPC, in the form of neuromorphic computing and neural accelerators. These two new types of computing technologies offer significant advantages over traditional approaches, including considerably increased energy efficiency and accelerated neural network-style computing. This talk will illustrate the fundamental computing concepts involved in these new hardware developments, and highlight some initial performance results.

Compton Tucker (NASA Goddard Space Flight Center) will speak on satellite tree enumeration outside of forests at the fifty-centimeter scale. Non-forest trees, which grow isolated outside of forests and are not well documented, nevertheless play a crucial role in biodiversity, carbon storage, food resources, and shelter for humans and animals. This talk will detail the use of HPC and machine learning to enumerate isolated trees globally, to identify localized areas of degradation, and to quantify the role of isolated trees in the global carbon cycle.

Cliff Young (Google LLC) will entertain the question of whether we can build a virtuous cycle between machine learning and HPC. While machine learning draws on many HPC components, the two areas are diverging in precision and programming models. However, it may be possible to construct a positive feedback loop between them. The Tensor Processing Unit (TPU) could provide opportunities to unite these fields to solve common problems through parallelization, mixed precision, and new algorithms.

Source: Melyssa Fratkin, SC20 Communications Chair

Here is the original post:

SC20 Invited Speakers Tackle Challenges for the Earth, Its Inhabitants, and Our Security Using 'More Than HPC' - HPCwire

Written by admin

October 8th, 2020 at 2:54 am

Posted in Quantum Computer

A new claimant for "most powerful quantum computer" – Axios

Posted: October 3, 2020 at 5:59 am


without comments

The startup IonQ today announced what it's calling "the world's most powerful quantum computer."

Why it matters: Quantum is the next frontier in computing, theoretically capable of solving problems beyond the ability of classical computers. IonQ's next-generation computer looks set to push the boundaries of quantum, but it will still take years before the technology becomes truly reliable.

How it works: IonQ reports its new quantum computer system has 32 "perfect" qubits (the basic unit of information in a quantum computer), which the company says give it an expected quantum volume of more than 4,000,000.
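Quantum volume, as commonly defined (for example by IBM), is reported as 2^n, where n is the size of the largest "square" circuit (n qubits, n layers) a machine runs successfully. A quick sanity check on the numbers in this article, assuming that convention:

```python
import math

def quantum_volume(n_effective: int) -> int:
    """Quantum volume under the common 2**n convention, where
    n_effective is the largest square circuit the machine passes."""
    return 2 ** n_effective

# Honeywell's reported quantum volume of 128 corresponds to 7
# effective qubits under this convention...
assert quantum_volume(7) == 128

# ...while IonQ's claimed "expected QV > 4,000,000" would imply
# roughly 22 effective qubits (log2 of 4,000,000 is about 21.9).
print(math.ceil(math.log2(4_000_000)))  # 22
```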

Background: IonQ was co-founded by Chris Monroe, a University of Maryland professor and major figure in the development of quantum computers. In the mid-1990s, he began working on entangling atoms to make more precise atomic clocks, the most accurate timekeeping devices known.

The catch: IonQ hasn't yet released detailed specifications of its new system, and its research needs to be verified.

Context: IonQ's announcement comes in the same week that its competitor Honeywell, which also uses a version of trapped ions, reported achieving a quantum volume of 128, and the Canadian startup D-Wave announced that a 5,000-qubit system, built yet another way, would be available for customers, including via the cloud.

Be smart: Comparing different kinds of quantum computing systems is difficult because they function in fundamentally different ways.

Go here to see the original:

A new claimant for "most powerful quantum computer" - Axios

Written by admin

October 3rd, 2020 at 5:59 am

Posted in Quantum Computer

ESA's Φ-Week: Digital Twin Earth, Quantum Computing and AI Take Center Stage – SciTechDaily

Posted: at 5:59 am


without comments

Digital Twin Earth will help visualize, monitor, and forecast natural and human activity on the planet. The model will be able to monitor the health of the planet, perform simulations of Earth's interconnected system with human behavior, and support the field of sustainable development, therefore reinforcing Europe's efforts for a better environment in order to respond to the urgent challenges and targets addressed by the Green Deal. Credit: ESA

ESA's 2020 Φ-week event kicked off this morning with a series of stimulating speeches on Digital Twin Earth, updates on Φ-sat-1, which was successfully launched into orbit earlier this month, and an exciting new initiative involving quantum computing.

The third edition of the Φ-week event, which is entirely virtual, focuses on how Earth observation can contribute to the concept of Digital Twin Earth: a dynamic, digital replica of our planet which accurately mimics Earth's behavior. Constantly fed with Earth observation data, combined with in situ measurements and artificial intelligence, Digital Twin Earth provides an accurate representation of the past, present, and future changes of our world.


Today's session opened with inspiring statements from ESA's Director General, Jan Wörner; ESA's Director of Earth Observation Programmes, Josef Aschbacher; ECMWF's Director General, Florence Rabier; the European Commission's Deputy Director General for Defence Industry and Space, Pierre Delsaux; as well as the Director General of DG CONNECT at the European Commission, Roberto Viola.

Φ-week 2020 opened on 28 September with inspiring statements from ESA's Director General, Jan Wörner (left) and ESA's Director of Earth Observation Programmes, Josef Aschbacher. Credit: ESA

Pierre Delsaux commented, "As our EU Commission President repeated recently during her State of the Union speech, it's clear we need to address climate change. The Copernicus program offers us some of the best instruments, satellites, to give us a complete picture of our planet's health. But space is not only a monitoring tool, it is also about applied solutions for our economy, to make it more green and more digital."

Roberto Viola said, "Φ-week is the week for disruptive technology, and it is communities like this that our European programmes were designed to support."

Florence Rabier added, "Machine learning and artificial intelligence could improve the realism and efficiency of the Digital Twin Earth, especially for extreme weather events and numerical forecast models."

Jan Wörner concluded, "Φ-week is the perfect example of the New Space approach, focusing on disruptive innovation, artificial intelligence, agility and flexibility."

During the week, experts will come together to discuss the role of artificial intelligence for the Digital Twin Earth concept, its practical implementation, the infrastructure requirements needed to build the Digital Twin Earth, and present ideas on how industries and the science community can contribute.

Cloud mask from Φ-sat-1. Credit: Cosine remote sensing B.V.

Earlier this month, on 3 September, the first artificial intelligence (AI) technology carried onboard a European Earth observation mission, Φ-sat-1, was launched from Europe's spaceport in French Guiana. An enhancement of the Federated Satellite Systems mission (FSSCat), the pioneering artificial intelligence technology is the first experiment to improve the efficiency of sending vast quantities of data back to Earth.

Today, ESA, along with cosine remote sensing, is happy to reveal the first-ever hardware-accelerated AI inference on Earth observation images aboard an in-orbit satellite, performed by a deep convolutional neural network developed by the University of Pisa.

Φ-sat-1 has successfully enabled the pre-filtering of Earth observation data so that only the relevant parts of each image, those with usable information, are downlinked to the ground, thereby improving bandwidth utilization and significantly reducing aggregated downlink costs.

Initial data downlinked from the satellite has shown that the AI-powered automatic cloud detection algorithm has correctly sorted hyperspectral Earth observation imagery from the satellite's sensor into cloudy and non-cloudy pixels.
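The pre-filtering idea can be sketched as follows. All names here are hypothetical, and the placeholder scoring function stands in for the onboard deep CNN that actually classifies cloud cover on Φ-sat-1:

```python
# Minimal sketch (hypothetical names) of onboard pre-filtering:
# score each image tile for cloud cover and downlink only usable ones.

def cloud_fraction(tile: dict) -> float:
    """Placeholder: the real system infers this from hyperspectral
    pixels with a deep CNN running on a Myriad 2 VPU."""
    return tile["clouds"]

def select_for_downlink(tiles, max_cloud: float = 0.3):
    """Keep only tiles clear enough to be worth the downlink bandwidth."""
    return [t for t in tiles if cloud_fraction(t) <= max_cloud]

tiles = [
    {"id": 1, "clouds": 0.9},   # mostly cloud: discarded onboard
    {"id": 2, "clouds": 0.1},   # clear: downlinked
    {"id": 3, "clouds": 0.25},  # acceptably clear: downlinked
]
print([t["id"] for t in select_for_downlink(tiles)])  # [2, 3]
```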

Lake Tharthar, Iraq. Credit: Cosine remote sensing B.V

Massimiliano Pastena, Φ-sat-1 Technical Officer at ESA, commented, "We have just entered the history of space."

Today's successful application of the Ubotica Artificial Intelligence technology, which is powered by the Intel Movidius Myriad 2 Vision Processing Unit, has demonstrated real on-board data processing autonomy.

Aubrey Dunne, Co-Founder and Vice President of Engineering at Ubotica Technologies, said, "We are very excited to be a key part of what is, to our knowledge, the first ever demonstration of AI applied to Earth observation data on a flying satellite. This is a watershed moment both for onboard processing of satellite data, and for the future of AI inference in orbital applications."

As the overall 2017 Copernicus Masters winner, FSSCat was proposed by Spain's Universitat Politècnica de Catalunya and developed by a consortium of European companies and institutes including Tyvak International.

Also in his opening speech this morning, Josef Aschbacher made a special announcement regarding an exciting new ESA initiative, the EOP AI-enhanced Quantum Initiative for EO (QC4EO), in collaboration with the European Organization for Nuclear Research (CERN).

Quantum computing has the potential to improve performance, decrease computational costs and solve previously intractable problems in Earth observation by exploiting quantum phenomena such as superposition, entanglement, and tunneling.

Credit: IBM

The initiative involves creating a quantum computing capability able to solve demanding Earth observation problems, using artificial intelligence to support programmes such as Digital Twin Earth and Copernicus. The initiative will be developed at the Φ-lab, an ESA laboratory at ESA's centre for Earth observation in Italy that embraces transformational innovation in Earth observation.

ESA and CERN enjoy a long-standing collaboration, centered on technological matters and fundamental physics. This collaboration will be extended to link to the CERN Quantum Technology Initiative, which was announced in June 2020 by the CERN Director General, Fabiola Gianotti.

Through this partnership, ESA and CERN will create new synergies, building on their common experience in big data, data mining and pattern recognition.

Giuseppe Borghi, Head of the Φ-lab, said, "Quantum computing together with AI are perhaps the most promising breakthrough to come along in computer technology. In the coming years, we will see more Earth or space science disciplines employing current or future quantum computing techniques to solve geoscience problems."

Josef Aschbacher added, "ESA will exploit the broad range of specialized expertise available at ESA and we will place ourselves in a unique position and take a leading role in the development of quantum technologies in the Earth observation domain."

Alberto Di Meglio, Coordinator of the CERN Quantum Technology Initiative, said, "Quantum technologies are a rapidly growing field of research and their applications have the potential to revolutionize the way we do science. Preparing for that paradigm change, by building knowledge and tools, is essential. This new collaboration on quantum technologies bears great promise."

See the original post:

ESA's Φ-Week: Digital Twin Earth, Quantum Computing and AI Take Center Stage - SciTechDaily

Written by admin

October 3rd, 2020 at 5:59 am

Posted in Quantum Computer

Schrödinger's Web offers a sneak peek at the quantum internet – Science News

Posted: at 5:59 am



Schrödinger's Web. Jonathan P. Dowling. CRC Press, $40.95.

When news broke last year that Google's quantum computer Sycamore had performed a calculation faster than the fastest supercomputers could (SN: 12/16/19), it was the first time many people had ever heard of a quantum computer.

Quantum computers, which harness the strange probabilities of quantum mechanics, may prove revolutionary. They have the potential to achieve an exponential speedup over their classical counterparts, at least when it comes to solving some problems. But for now, these computers are still in their infancy, useful for only a few applications, just as the first digital computers were in the 1940s. So isn't a book about the communications network that will link quantum computers, the quantum internet, more than a little ahead of itself?

Surprisingly, no. As theoretical physicist Jonathan Dowling makes clear in Schrödinger's Web, early versions of the quantum internet are here already (for example, quantum communication has been taking place between Beijing and Shanghai via fiber-optic cables since 2016) and more are coming fast. So now is the perfect time to read up.

Dowling, who helped found the U.S. government's quantum computing program in the 1990s, is the perfect guide. Armed with a seemingly endless supply of outrageous anecdotes, memorable analogies, puns and quips, he makes the thorny theoretical details of the quantum internet both entertaining and accessible.

Readers wanting to dive right into details of the quantum internet will have to be patient. "Photons are the particles that will power the quantum internet, so we had better be sure we know what the heck they are," Dowling writes. Accordingly, the first third of the book is a historical overview of light, from Newton's 17th-century idea of light as corpuscles to experiments probing the quantum reality of photons, or particles of light, in the late 20th century. There are some small historical inaccuracies: the section on the Danish physicist Hans Christian Ørsted repeats an apocryphal tale about his serendipitous discovery of the link between electricity and magnetism, and the footnotes rely too much on Wikipedia. But Dowling accomplishes what he sets out to do: help readers develop an understanding of the quantum nature of light.


Like Dowling's 2013 book on quantum computers, Schrödinger's Killer App, Schrödinger's Web hammers home the nonintuitive truths at the heart of quantum mechanics. For example, key to the quantum internet is entanglement, that "spooky action at a distance" in which particles are linked across time and space, and measuring the properties of one particle instantly reveals the other's properties. Two photons, for instance, can be entangled so they always have the opposite polarization, or angle of oscillation.

In the future, a user in New York could entangle two photons and then send one along a fiber-optic cable to San Francisco, where it would be received by a quantum computer. Because these photons are entangled, measuring the New York photon's polarization would instantly reveal the San Francisco photon's polarization. This strange reality of entanglement is what the quantum internet exploits for neat features, such as unhackable security; any eavesdropper would mess up the delicate entanglement and be revealed. While his previous book contains more detailed explanations of quantum mechanics, Dowling still finds amusing new analogies, such as Fuzz Lightyear, a canine that runs along a superposition, or quantum combination, of two paths into neighbors' yards. Fuzz helps explain physicist John Wheeler's delayed-choice experiment, which illustrates the uncertainty, unreality and nonlocality of the quantum world: Fuzz's path is random, the dog doesn't exist on one path until we measure him, and measuring one path seems to instantly affect which yard Fuzz enters, even if he's light-years away.
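The perfect anti-correlation described above can be illustrated in a few lines. This is a classical stand-in (the function name is ours): it reproduces only the bookkeeping of opposite outcomes, not the Bell-inequality violations that make entanglement genuinely quantum.

```python
import numpy as np

# Classical stand-in for the anti-correlated photon pair described above:
# the two outcomes are always opposite, so learning one reveals the other.
# This sketch reproduces only the bookkeeping of entanglement, not the
# Bell-inequality violations that make it genuinely quantum.

rng = np.random.default_rng(42)

def measure_entangled_pair() -> tuple:
    """Return (new_york, san_francisco) polarization outcomes, 0 or 1."""
    ny = int(rng.integers(0, 2))  # the New York result is random...
    return ny, 1 - ny             # ...and San Francisco's is its opposite

pairs = [measure_entangled_pair() for _ in range(1000)]
print(all(a != b for a, b in pairs))  # True: perfectly anti-correlated
```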

The complexities of the quantum web are saved for last, and even with Dowling's help, the details are not for the faint of heart. Readers will learn how to prepare Bell tests to check that a system of particles is entangled (SN: 8/28/15), navigate bureaucracy in the Department of Defense and send unhackable quantum communications with the dryly named BB84 and E91 protocols. Dowling also goes over some recent milestones in the development of a quantum internet, such as the 2017 quantum-secured video call between scientists in China and Austria via satellite (SN: 9/29/17).
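The sifting step at the heart of BB84, at least, is easy to simulate classically. A minimal sketch (helper names are ours; no eavesdropper, channel noise, or error correction is modeled) in which Alice and Bob keep only the rounds where their randomly chosen measurement bases happened to match:

```python
import secrets

# Minimal sketch of BB84's sifting step: Alice encodes random bits in
# random bases, Bob measures in random bases, and they keep only the
# rounds where the bases matched. No eavesdropper is modeled here.

def random_bits(n: int) -> list:
    return [secrets.randbelow(2) for _ in range(n)]

def bb84_sift(n: int = 256) -> list:
    alice_bits = random_bits(n)
    alice_bases = random_bits(n)  # 0 = rectilinear, 1 = diagonal
    bob_bases = random_bits(n)
    # When bases match, Bob recovers Alice's bit exactly; mismatched
    # rounds are discarded during the public basis comparison.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

key = bb84_sift()
print(f"sifted key: {len(key)} bits kept out of 256 rounds")
```

On average about half the rounds survive sifting; in the real protocol, Alice and Bob would then sacrifice a sample of the key to detect the disturbance an eavesdropper's measurements would introduce.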

"Just like the classical internet, we really won't figure out what the quantum internet is useful for until it is up and running," Dowling writes, "so people can start playing around with it." Some of his prognostications seem improbable. Will people really have quantum computers on their phones and exchange entangled photons across the quantum internet?

Dowling died unexpectedly in June at age 65, before he could see this future come to fruition. Once when I interviewed him, he invoked Arthur C. Clarke's first law to justify why he thought another esteemed scientist was wrong. "The first law is that if a distinguished, elderly scientist tells you something is possible, he's very likely right," he said. "If he tells you something is impossible, he's very likely wrong."

Dowling died too soon to be considered elderly, but he was distinguished, and Schrödinger's Web lays out a powerful case for the possibility of a quantum internet.

Buy Schrödinger's Web from Amazon.com. Science News is a participant in the Amazon Services LLC Associates Program. Please see our FAQ for more details.

See the rest here:

Schrödinger's Web offers a sneak peek at the quantum internet - Science News


Global QC Market Projected to Grow to More Than $800 million by 2024 – HPCwire

Posted: at 5:59 am



The Quantum Economic Development Consortium (QED-C) and Hyperion Research are projecting that the global quantum computing (QC) market, worth an estimated $320 million in 2020, will grow at an anticipated 27% CAGR between 2020 and 2024, reaching approximately $830 million by 2024.
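The headline figures are internally consistent, as a quick compound-growth check shows (the helper function is ours, not from the report):

```python
# Quick compound-growth check of the figures above: $320M in 2020
# growing at a 27% CAGR for the four years to 2024.

def project_market(value_now: float, cagr: float, years: int) -> float:
    """Compound a market value forward at a constant annual growth rate."""
    return value_now * (1.0 + cagr) ** years

print(round(project_market(320.0, 0.27, 4)))  # 832, i.e. roughly $830M
```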

This estimate is based on surveys of 135 US-based quantum computing researchers, developers and suppliers across the academic, commercial and government sectors. Supplemental data and insights came from a companion effort that surveyed 115 current and potential quantum computing users in North America, Europe and the Asia/Pacific region on their expectations, schedules and budgets for the use of quantum computing in their existing and planned computational workloads.

(Keeping track of the various quantum computing organizations is becoming a challenge in itself. The Quantum Economic Development Consortium (QED-C) is a consortium of stakeholders that aims to enable and grow the U.S. quantum industry. QED-C was established with support from the National Institute of Standards and Technology (NIST) as part of the Federal strategy for advancing quantum information science and as called for by the National Quantum Initiative Act enacted in 2018.)

Additional results from the study:

"Based on our study and related forecast, there is a growing, vibrant, and diverse US-based QC research, development, and commercial ecosystem that shows the promise of maturing into a viable, if not profitable and self-sustaining, industry. That said, it is too early to start picking winners and losers from either a technology or commercial perspective," said Bob Sorensen, quantum analyst for Hyperion Research.

"A key driver for commercial success could be the ability of any vendor to ease the requirements needed to integrate QC technology into a larger HPC and enterprise IT user base while still supporting advanced QC-related research for a more targeted, albeit smaller, class of end-user scientists and engineers. This sector is not for the faint of heart, but this forecast gives some sense of what is at stake here, at least for the next few years," noted Sorensen.

Source: QED-C

QED-C commissioned and collaborated with Hyperion Research to develop this market forecast to help inform decision making for QC technology developers and suppliers, national-level QC-related policy makers, and potential QC users in both the advanced computing and enterprise IT marketplace, as well as investors and commercial QC funding organizations. This is a baseline estimate, and Hyperion Research and QED-C are looking to provide periodic updates of their QC market forecast as events, information, or decision-making requirements dictate. Contact: Celia Merzbacher, QED-C Deputy Director, [emailprotected]

See original here:

Global QC Market Projected to Grow to More Than $800 million by 2024 - HPCwire


Berkeley Lab Technologies Honored With 7 R&D 100 Awards – Lawrence Berkeley National Laboratory

Posted: at 5:59 am



Innovative technologies from Lawrence Berkeley National Laboratory (Berkeley Lab) to achieve higher energy efficiency in buildings, make lithium batteries safer and higher performing, and secure quantum communications were some of the inventions honored with R&D 100 Awards by R&D World magazine.

For more than 50 years, the annual R&D 100 Awards have recognized the 100 technologies of the past year deemed most innovative and disruptive by an independent panel of judges. The full list of winners, announced by parent company WTWH Media LLC, is available at the R&D World website.

Berkeley Labs award-winning technologies are described below.

A Tool to Accelerate Electrochemical and Solid-State Innovation

(from left) Adam Weber, Nemanja Danilovic, Douglas Kushner, and John Petrovick (Credit: Berkeley Lab)

Berkeley Lab scientists invented a microelectrode cell to analyze and test electrochemical systems with solid electrolytes. Thanks to significant cost and performance advantages, this tool can accelerate development of critical applications such as energy storage and conversion (fuel cells, batteries, electrolyzers), carbon capture, desalination, and industrial decarbonization.

Solid electrolytes have been displacing liquid electrolytes as the focus of electrochemical innovation because of their performance, safety, and cost advantages. However, the lack of effective methods and equipment for studying solid electrolytes has hindered advancement of the technologies that employ them. This microelectrode cell meets the testing needs, and is already being used by Berkeley Lab scientists.

The development team includes Berkeley Lab researchers Adam Weber, Nemanja Danilovic, Douglas Kushner, and John Petrovick.

Matter-Wave Modulating Secure Quantum Communicator (MMQ-Com)

Information transmitted by MMQ-Com is impervious to security breaches. (Credit: Alexander Stibor/Berkeley Lab)

Quantum communication, cybersecurity, and quantum computing are growing global markets. But the safety of our data is in peril given the rise of quantum computers that can decode classical encryption schemes.

The Matter-Wave Modulating Secure Quantum Communicator (MMQ-Com) technology is a fundamentally new kind of secure quantum information transmitter. It transmits messages by modulating electron matter-waves without changing the pathways of the electrons. This secure communication method is inherently impervious to any interception attempt.

A novel quantum key distribution scheme also ensures that the signal is protected from spying by other quantum devices.

The development team includes Alexander Stibor of Berkeley Lab's Molecular Foundry, along with Robin Röpke and Nicole Kerker of the University of Tübingen in Germany.

Solid Lithium Battery Using Hard and Soft Solid Electrolytes

(from left) Marca Doeff, Guoying Chen, and Eongyu Yi (Credit: Berkeley Lab)

The lithium battery market is expected to grow from more than $37 billion in 2019 to more than $94 billion by 2025. However, the liquid electrolytes used in most commercial lithium-ion batteries are flammable and limit the ability to achieve higher energy densities. Safety issues continue to plague the electronics markets, as often-reported lithium battery fires and explosions result in casualties and financial losses.

In Berkeley Lab's solid lithium battery, the organic electrolytic solution is replaced by two solid electrolytes, one soft and one hard, and lithium metal is used in place of the graphite anode. In addition to eliminating battery fires, incorporation of a lithium metal anode with a capacity 10 times higher than graphite (the conventional anode material in lithium-ion batteries) provides much higher energy densities.

The technology was developed by Berkeley Lab scientists Marca Doeff, Guoying Chen, and Eongyu Yi, along with collaborators at Montana State University.

Porous Graphitic Frameworks for Sustainable High-Performance Li-Ion Batteries

High-resolution transmission electron microscopy images of the Berkeley Lab PGF cathode reveal (at left) a highly ordered honeycomb structure within the 2D plane, and (at right) layered columnar arrays stacked perpendicular to the 2D plane. (Credit: Yi Liu/Berkeley Lab)

The Porous Graphitic Frameworks (PGF) technology is a lithium-ion battery cathode that could outperform today's cathodes in sustainability and performance.

In contrast to commercial cathodes, organic PGFs pose fewer risks to the environment because they are metal-free and composed of earth-abundant, lightweight organic elements such as carbon, hydrogen, and nitrogen. The PGF production process is also more energy-efficient and eco-friendly than other cathode technologies because they are prepared in water at mild temperatures, rather than in toxic solvents at high temperatures.

PGF cathodes also display stable charge-discharge cycles with ultrahigh capacity and record-high energy density, both much higher than those of existing commercial inorganic cathodes and known organic cathodes.

The development team includes Yi Liu and Xinie Li of Berkeley Lab's Molecular Foundry, as well as Hongxia Wang and Hao Chen of Stanford University.

Building Efficiency Targeting Tool for Energy Retrofits (BETTER)

The buildings sector is the largest source of primary energy consumption (40%) and ranks second after the industrial sector as a global source of direct and indirect carbon dioxide emissions from fuel combustion. According to the World Economic Forum, nearly one-half of all energy consumed by buildings could be avoided with new energy-efficient systems and equipment.

(from left) Carolyn Szum (Lead Researcher), Han Li, Chao Ding, Nan Zhou, Xu Liu (Credit: Berkeley Lab)

The Building Efficiency Targeting Tool for Energy Retrofits (BETTER) allows municipalities, building and portfolio owners and managers, and energy service providers to quickly and easily identify the most effective cost-saving and energy-efficiency measures in their buildings. With an open-source, data-driven analytical engine, BETTER uses readily available building and monthly energy data to quantify energy, cost, and greenhouse gas reduction potential, and to recommend efficiency interventions at the building and portfolio levels to capture that potential.

It is estimated that BETTER will help reduce about 165.8 megatons of carbon dioxide equivalent (MtCO2e) globally by 2030. This is equivalent to the CO2 sequestered by growing 2.7 billion tree seedlings for 10 years.
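The seedling equivalence quoted above checks out roughly, under the commonly used assumption (as in the EPA's greenhouse gas equivalency calculator) that one tree seedling grown for 10 years sequesters about 0.06 tonnes of CO2. That conversion factor is our assumption, not stated in the article:

```python
# Back-of-the-envelope check of the seedling equivalence. The 0.06 tCO2
# per seedling grown for 10 years is an assumed conversion factor (in the
# ballpark of EPA equivalency figures), not taken from the article.

avoided_tco2e = 165.8e6                # 165.8 MtCO2e, expressed in tonnes
tco2_per_seedling_10yr = 0.06          # assumed sequestration per seedling
seedlings = avoided_tco2e / tco2_per_seedling_10yr
print(f"{seedlings / 1e9:.1f} billion seedlings")  # 2.8 billion
```

The result, roughly 2.8 billion seedlings, agrees closely with the 2.7 billion quoted.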

The development team includes Berkeley Lab scientists Nan Zhou, Carolyn Szum, Han Li, Chao Ding, Xu Liu, and William Huang, along with collaborators from Johnson Controls and ICF.

AmanziATS: Modeling Environmental Systems Across Scales

Simulated surface and subsurface water from Amanzi-ATS hydrological modeling of the Copper Creek sub-catchment in the East River, Colorado watershed. (Credit: Zexuan Xu/Berkeley Lab, David Moulton/Los Alamos National Laboratory)

Scientists use computer simulations to predict the impact of wildfires on water quality, or to monitor cleanup at nuclear waste remediation sites by portraying fluid flow across Earth compartments. The Amanzi-Advanced Terrestrial Simulator (ATS) enables them to replicate or couple multiple complex and integrated physical processes controlling these flowpaths, making it possible to capture the essential physics of the problem at hand.

"Specific problems require taking an individual approach to simulations," said Sergi Molins, principal investigator at Berkeley Lab, which contributed expertise in geochemical modeling to the software's development. "Physical processes controlling how mountainous watersheds respond to disturbances such as climate- and land-use change, extreme weather, and wildfire are far different than the physical processes at play when an unexpected storm suddenly impacts groundwater contaminant levels in and around a nuclear remediation site. Amanzi-ATS allows scientists to make sense of these interactions in each individual scenario."

The code is open-source and capable of being run on systems ranging from a laptop to a supercomputer. Led by Los Alamos National Laboratory, Amanzi-ATS is jointly developed by researchers from Los Alamos National Laboratory, Oak Ridge National Laboratory, Pacific Northwest National Laboratory, and Berkeley Lab researchers including Sergi Molins, Marcus Day, Carl Steefel, and Zexuan Xu.

Institute for the Design of Advanced Energy Systems (IDAES)

The U.S. Department of Energy's (DOE's) Institute for the Design of Advanced Energy Systems (IDAES) project develops next-generation computational tools for process systems engineering (PSE) of advanced energy systems, enabling their rapid design and optimization.

IDAES Project Team (Credit: Berkeley Lab)

By providing rigorous modeling capabilities, the IDAES Modeling & Optimization Platform helps energy and process companies, technology developers, academic researchers, and DOE to design, develop, scale up, and analyze new and potential PSE technologies and processes to accelerate advances and apply them to address the nation's energy needs. The IDAES platform is also a key component in the National Alliance for Water Innovation, a $100 million, five-year DOE innovation hub led by Berkeley Lab, which will examine the critical technical barriers and research needed to radically lower the cost and energy of desalination.

Led by National Energy Technology Laboratory, IDAES is a collaboration with Sandia National Laboratories, Berkeley Lab, West Virginia University, Carnegie Mellon University, and the University of Notre Dame. The development team at Berkeley Lab includes Deb Agarwal, Oluwamayowa (Mayo) Amusat, Keith Beattie, Ludovico Bianchi, Josh Boverhof, Hamdy Elgammal, Dan Gunter, Julianne Mueller, Jangho Park, Makayla Shepherd, Karen Whitenack, and Perren Yang.

# # #

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 13 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab's facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy's Office of Science.

DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

More here:

Berkeley Lab Technologies Honored With 7 R&D 100 Awards - Lawrence Berkeley National Laboratory


oneAPI Academic Center of Excellence Established at the Heidelberg University Computing Center (URZ) – HPCwire

Posted: at 5:59 am



Sept. 29, 2020 A oneAPI Academic Center of Excellence (CoE) is now established at the Heidelberg University Computing Center (URZ). The new CoE will conduct research supporting the oneAPI industry initiative to create a uniform, open programming model for heterogeneous computer architectures.

A common language for heterogeneous computing

URZ will focus its research and programming efforts on a fundamental high-performance computing (HPC) challenge: modern computers utilize different types of hardware for different calculations. Accelerators, including graphics processing units (GPUs) and field-programmable gate arrays (FPGAs), are used in combination with general-purpose processors (CPUs). Using different types of hardware makes computers very powerful and provides versatility for a wide range of situations and workloads. However, hardware heterogeneity complicates software development for these computers, especially when specialized components from a variety of vendors are used in tandem.

One major reason for this complication is that many accelerated compute architectures require their own programming models. Therefore, software developers need to learn and use a different and sometimes proprietary language for each processing unit in a heterogeneous system, which increases complexity and limits flexibility.

oneAPI's cross-architecture language Data Parallel C++ (DPC++), based on the Khronos Group's SYCL standard for heterogeneous programming in C++, overcomes these challenges with its single, unified open development model for performant and productive heterogeneous programming and cross-vendor support.

Developing for Heterogeneous Systems: advancing features and capabilities, maximizing interoperability

URZ's work as a oneAPI CoE will add advanced DPC++ capabilities into hipSYCL, which supports systems based on AMD GPUs, NVIDIA GPUs, and CPUs. New DPC++ extensions are part of the SYCL 2020 provisional specification, which brings features such as unified shared memory to hipSYCL and the platforms it supports, furthering the promise of oneAPI application support across system architectures and vendors.

URZ HPC technical specialist Aksel Alpay, who created hipSYCL, leads its ongoing development. "The whole project is quite ambitious," says Alpay, venturing a look into the future. "hipSYCL is an academic research project as well as a development project, where the final product will be used in production operations. It is incredibly exciting to bring DPC++ and SYCL 2020 capabilities to additional architectures, such as AMD GPUs."

To expedite the research, URZ researchers and developers will access an international network of experts at Intel and numerous academic and government institutions, a great advantage in advancing hipSYCL's capabilities and furthering the goal of the oneAPI initiative. "For a scientific computing center to have access to this level of expertise and work together on an open standard with partners from around the globe is a wonderful prospect," states Heidelberg University's CIO and URZ director Prof. Dr. Vincent Heuveline, who is a major proponent of the CoE. In addition to being the university's main liaison for the center, he will function as its scientific advisor.

"One of our strategic goals is to make a measurable contribution to the transfer of new technologies from research to industrial application, and of course to continuously expand our expertise and research efforts in the field of supercomputing. The oneAPI CoE will allow us to do both," explains Heuveline.

"oneAPI is a true cross-industry initiative that seeks to simplify development of diverse workloads by streamlining code re-use across a variety of architectures through an open and collaborative approach. URZ's research helps to deliver on the cross-vendor promise of oneAPI by expanding advanced DPC++ application support to other architectures," says Dr. Jeff McVeigh, Intel vice president of Datacenter XPU Products and Solutions.

About oneAPI

oneAPI is an industry initiative to create a single, unified, cross-architecture programming model for CPUs + accelerator architectures. Based on industry standards and its open development approach, the initiative will help streamline software development for high performance computers, increase performance, and provide specifications for efficient and diverse architecture programming.


Source: Heidelberg University

See more here:

oneAPI Academic Center of Excellence Established at the Heidelberg University Computing Center (URZ) - HPCwire


