
Archive for the ‘Quantum Computer’ Category

Toshiba Exits PC Business – 35 Years of IBM Compatible PCs – Electropages

Posted: August 14, 2020 at 11:51 pm


Recently, Toshiba announced that it would sell the remainder of its computer and laptop operations to Sharp after 35 years of working in the sector. Who is Toshiba, what products has it produced, and what will it look towards for its future endeavours?

Toshiba is a Japanese multinational conglomerate headquartered in Minato, Tokyo. Toshiba provides a wide range of services and products across many industries, including semiconductors, discrete electronics, hard drives, printers, and quantum cryptography. Founded in 1890, the company has over 140,000 employees worldwide, with yearly revenue of ¥3.693 trillion and an operating income of ¥35.4 billion. Toshiba is arguably best known for its consumer products, including televisions, laptops, and flash memory.

One of the biggest challenges faced by early computer makers was creating a portable machine that would allow individuals to work while on the move. The difficulty stemmed from a multitude of problems, including heavy batteries, bulky floppy drives, and CRT screens that could easily weigh tens of kilograms. The first portable computer, the Osborne 1, was developed in 1981, but its reliance on a mains plug made it more of a luggable than a portable platform (a battery pack was available, but only as an optional add-on). While the Osborne 1 was the world's first portable computer, the first IBM-compatible laptop was produced by Toshiba in 1985; it ran MS-DOS 2.11 and integrated an Intel 80C88 processor at 4.7 MHz, 256 KB of RAM, an internal 3.5-inch floppy drive, and a 640 x 200 display. Weighing only 4.1 kg, the Toshiba T1100 is considered the first mass-produced laptop computer and set a standard that other manufacturers would quickly follow.

While Toshiba has a long history of producing PC-compatible computers and laptops, the recent fall in sales has led Toshiba to sell the remainder of its stake in Dynabook to Sharp. To appreciate just how far sales have fallen: Toshiba sold over 17 million computers in 2011, but by 2017 that figure had dropped to just 1.9 million. The falling sales prompted Toshiba to pull out of the European market in 2016, but even this move did not help entirely. The reduction in sales cannot be attributed to any one cause, but the mass influx of mobile devices such as tablets and smartphones, along with the introduction of cloud-based applications, means that tasks that would typically be done on a computer can now be done on much smaller, more convenient devices.

Consumer demand for laptops has soared in the last few months because of the Coronavirus pandemic and global lockdowns, but overall, the market for personal computers has been tough for quite a while. Only those who have managed to sustain scale and price (like Lenovo), or have a premium brand (like Apple) have succeeded in the unforgiving PC market, where volumes have been falling for years.

While the PC market is incredibly vast, it is only one of several sectors in which Toshiba specialises. This year (2020), Toshiba announced plans to launch quantum cryptography services, develop affordable solid-state LiDAR, and produce hydrogen fuel cells. Toshiba also continues to develop its other industrial sectors, including electronic storage (flash, HDDs, etc.), building systems (elevators), energy systems, infrastructure, and retail. Such a move makes sense when considering that quantum computers are starting to find real-world applications, governments around the world are trying to move towards green technologies, and the rapid increase in internet usage is putting a strain on data centres.


More here:

Toshiba Exits PC Business - 35 Years of IBM Compatible PCs - Electropages

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer

IBM Z mainframes revived by Red Hat, AI and security – TechTarget

Posted: at 11:51 pm



Published: 13 Aug 2020

Mainframe systems could play a significant role in cybersecurity and artificial intelligence advancements in the years to come, and IBM is investing in those areas to ensure its Z mainframes have a stake in those growing tech markets.

IBM mainframe sales grew some 69% during the second quarter of this year, achieving the highest year-over-year percentage increase of any IBM business unit. Some industry observers attribute the unexpected performance to the fact that the z15, introduced a year ago, is still in its anticipated upcycle. Typically, mainframe sales level off and dip after 12 to 18 months until the release of a new system. But that might not be the case this time around.

Ross Mauri, general manager of IBM's Z and LinuxONE mainframe business, discussed some of the factors that could contribute to sustained growth of the venerable system, including IBM's acquisition of Red Hat, the rise of open source software, and timely technical enhancements.

Mainframe revenues in the second quarter were the fastest-growing of any IBM business unit, something analysts didn't expect to see again. Is this just the typical upcycle for the latest system, or is something else at work?

Ross Mauri: A lot of it has to do with the Red Hat acquisition and the move toward hybrid clouds. Consequently, mainframes are picking up new workloads, which is why you are seeing a lot more MIPS being generated. We set a record for MIPS in last year's fourth quarter.

How much of it has to do with the increase in Linux-based mainframes and the growing popularity of open source software?

Mauri: Yes, there is that plus all the more strategic applications [OpenShift, Ansible] going to the cloud. What also helped was our Capacity On Demand program going live in the second quarter, providing users with four times the [processor] capacity they had a year ago.

Some industries are in slumps, but online sales are up and that means credit card and banking systems are more active than normal. They liked the idea of being able to turn on 'dark' processors remotely.

Some analysts think mainframes are facing the same barrier Intel-based machines are with Moore's Law. Are you running out of real estate on mainframe chips to improve performance?

Mauri: What we have done is made improvements in the instruction set. So, with things like Watson machine learning, users can work to a pretty high level of AI, taking greater advantage of the hardware. We've not run out of real estate on the chips, or out of performance, and I don't think we will. If you think that, we will prove you wrong.

But with the last couple of mainframe releases, performance improvements were in the single digits, compared to the 30% to 40% performance improvements of Power systems.

Mauri: In terms of Z [series mainframes], they are running as fast as Power. We know where [mainframes] are going to be running in the future. As we move to deep learning inference engines in the future, you'll see more AI running on the system to help with fraud analytics and real-time transactions. We haven't played out our whole hand yet. The AI market is still nascent; we are very much at the beginning of it. For instance, we're not anywhere near what we can do with the security of the system.


We have started to put quantum encryption algorithms in the system already, to make sure security was sound given what's going on in the world of cybersecurity. You'll see us continue to invest more in the future when it comes to AI. We'll build on that machine learning base we have already.

Is IBM Research investigating other technologies that would sit between existing mainframes and quantum computers in terms of improving performance?

Mauri: Our [mainframe] systems group is working closely with the quantum team as well as with IBM Research. We are still in the research phase; no one's using them for production.

What we're exploring with IBM Research and clients is trying to determine what algorithms run well on a quantum computer for solving business problems and business processes that now run on mainframes. For instance, we're looking at big financial institutions where we can make use of quantum computers as closely coupled accelerators for the mainframe. We think it can greatly reduce costs and improve business processing speed. It's actually not that complex to do. We're doing active experiments with clients now.

What are you looking at to increase performance?

Mauri: We are looking at a whole range of options right now. We have something we do with clients called Enterprise Design Thinking where they are involved throughout an entire process to make sure we're not putting some technology in that's not going to work for them. We have been doing that since the z14 [mainframe].

Read more:

IBM Z mainframes revived by Red Hat, AI and security - TechTarget

Written by admin

August 14th, 2020 at 11:51 pm

Posted in Quantum Computer

Quantum computing will (eventually) help us discover vaccines in days – VentureBeat

Posted: May 17, 2020 at 10:41 pm


The coronavirus is proving that we have to move faster in identifying and mitigating epidemics before they become pandemics because, in today's global world, viruses spread much faster, further, and more frequently than ever before.

If COVID-19 has taught us anything, it's that while our ability to identify and treat pandemics has improved greatly since the outbreak of the Spanish Flu in 1918, there is still a lot of room for improvement. Over the past few decades, we've taken huge strides to improve quick detection capabilities. It took a mere 12 days to map the outer spike protein of the COVID-19 virus using new techniques. In the 1980s, a similar structural analysis for HIV took four years.

But developing a cure or vaccine still takes a long time and involves such high costs that big pharma doesn't always have an incentive to try.

Drug discovery entrepreneur Prof. Noor Shaker posited: "Whenever a disease is identified, a new journey into the chemical space starts, seeking a medicine that could become useful in contending diseases. The journey takes approximately 15 years and costs $2.6 billion, and starts with a process to filter millions of molecules to identify the promising hundreds with high potential to become medicines. Around 99% of selected leads fail later in the process due to inaccurate prediction of behavior and the limited pool from which they were sampled."

Prof. Shaker highlights one of the main problems with our current drug discovery process: The development of pharmaceuticals is highly empirical. Molecules are made and then tested, without being able to accurately predict performance beforehand. The testing process itself is long, tedious, cumbersome, and may not predict future complications that will surface only when the molecule is deployed at scale, further eroding the cost/benefit ratio of the field. And while AI/ML tools are already being developed and implemented to optimize certain processes, there's a limit to their efficiency at key tasks in the process.

Ideally, a great way to cut down the time and cost would be to transfer the discovery and testing from the expensive and time-inefficient laboratory process (in-vitro) we utilize today, to computer simulations (in-silico). Databases of molecules are already available to us today. If we had infinite computing power we could simply scan these databases and calculate whether each molecule could serve as a cure or vaccine to the COVID-19 virus. We would simply input our factors into the simulation and screen the chemical space for a solution to our problem.
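The screening loop described above can be sketched in a few lines. This is a minimal illustration, not a real pipeline: `binding_score` is a hypothetical stand-in for the physics-based property calculation the article discusses, and the molecule strings are arbitrary SMILES-style examples.

```python
# Minimal sketch of in-silico screening: score every molecule in a database
# and keep the promising candidates. binding_score is a hypothetical
# placeholder for a real physical property calculation.
def binding_score(molecule: str) -> float:
    # placeholder heuristic; a real pipeline would derive this score from a
    # simulation of the molecule's electronic structure
    return len(molecule) / 10.0

def screen(molecules: list[str], threshold: float) -> list[str]:
    # keep only candidates whose predicted score clears the threshold
    return [m for m in molecules if binding_score(m) >= threshold]

candidates = screen(["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"], threshold=0.5)
```

The point of the sketch is the shape of the workflow: with an accurate, fast scoring function, the entire discovery funnel reduces to a filter over a database.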

In principle, this is possible. After all, chemical structures can be measured, and the laws of physics governing chemistry are well known. However, as the great British physicist Paul Dirac observed: "The underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble."

In other words, we simply don't have the computing power to solve the equations, and if we stick to classical computers we never will.

This is a bit of a simplification, but the fundamental problem of chemistry is to figure out where electrons sit inside a molecule and calculate the total energy of such a configuration. With this data, one could calculate the properties of a molecule and predict its behavior. Accurate calculations of these properties will allow the screening of molecular databases for compounds that exhibit particular functions, such as a drug molecule that is able to attach to the coronavirus spike and attack it. Essentially, if we could use a computer to accurately calculate the properties of a molecule and predict its behavior in a given situation, it would speed up the process of identifying a cure and improve its efficiency.

Why are quantum computers much better than classical computers at simulating molecules?

Electrons spread out over the molecule in a strongly correlated fashion, and the characteristics of each electron depend greatly on those of its neighbors. These quantum correlations (or entanglement) are at the heart of the quantum theory and make simulating electrons with a classical computer very tricky.

The electrons of the COVID-19 virus, for example, must be treated in general as being part of a single entity having many degrees of freedom, and the description of this ensemble cannot be divided into the sum of its individual, distinguishable electrons. The electrons, due to their strong correlations, have lost their individuality and must be treated as a whole. So to solve the equations, you need to take into account all of the electrons simultaneously. Although classical computers can in principle simulate such molecules, every multi-electron configuration must be stored in memory separately.

Let's say you have a molecule with only 10 electrons (forget the rest of the atom for now), and each electron can be in two different positions within the molecule. Essentially, you have 2^10 = 1024 different configurations to keep track of, rather than just 10 electrons, which would have been the case if the electrons were individual, distinguishable entities. You'd need 1024 classical bits to store the state of this molecule. Quantum computers, on the other hand, have quantum bits (qubits), which can be made to strongly correlate with one another in the same way electrons within molecules do. So in principle, you would need only about 10 such qubits to represent the strongly correlated electrons in this model system.

The exponentially large parameter space of electron configurations in molecules is exactly the space qubits naturally occupy. Thus, qubits are much better adapted to the simulation of quantum phenomena. This scaling difference between classical and quantum computation gets very big very quickly. For instance, simulating penicillin, a molecule with 41 atoms (and many more electrons), would require 10^86 classical bits, or more bits than the number of atoms in the universe. With a quantum computer, you would only need about 286 qubits. This is still far more qubits than we have today, but certainly a more reasonable and achievable number. The COVID-19 virus outer spike protein, for comparison, contains many thousands of atoms and is thus completely intractable for classical simulation with any degree of accuracy, even on today's most powerful supercomputers. Chemists and pharma companies do simulate molecules with supercomputers (albeit not ones as large as the proteins), but they must resort to very rough molecular models that don't capture the details a full simulation would, leading to large errors in estimation.
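Under the simplified model above (each of n electrons occupying one of two positions), the bookkeeping gap can be checked directly. A short sketch, counting one classical entry per configuration as the text does:

```python
# One entry per electron configuration: 2**n for n two-level electrons,
# versus roughly n qubits on a quantum machine (in this simplified model).
def classical_configs(n_electrons: int) -> int:
    return 2 ** n_electrons

print(classical_configs(10))              # the 10-electron example: 1024
print(classical_configs(286) > 10 ** 86)  # the penicillin-scale comparison
```

Running this confirms the article's figures: 10 electrons already need 1024 classical entries, and 286 two-level systems exceed the quoted 10^86.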

It might take several decades until a sufficiently large quantum computer, capable of simulating molecules as large as proteins, emerges. But when such a computer is available, it will mean a complete revolution in the way the pharma and chemical industries operate.

The holy grail, end-to-end in-silico drug discovery, involves evaluating and breaking down the entire chemical structures of the virus and the cure.

The continued development of quantum computers, if successful, will allow for end-to-end in-silico drug discovery and the discovery of procedures to fabricate the drug. Several decades from now, with the right technology in place, we could move the entire process into a computer simulation, allowing us to reach results with amazing speed. Computer simulations could eliminate 99.9% of false leads in a fraction of the time it now takes with in-vitro methods. With the appearance of a new epidemic, scientists could identify and develop a potential vaccine/drug in a matter of days.

The bottleneck for drug development would then move from drug discovery to the human testing phases, including toxicity and other safety tests. Eventually, even these last-stage tests could potentially be expedited with the help of a large-scale quantum computer, but that would require an even greater level of quantum computing than described here. Tests at this level would require a quantum computer with enough power to contain a simulation of the human body (or part thereof) that could screen candidate compounds and simulate their impact on the body.

Achieving all of these dreams will demand continuous investment in the development of quantum computing as a technology. As Prof. Shohini Ghose said in her 2018 TED Talk: "You cannot build a light bulb by building better and better candles. A light bulb is a different technology based on a deeper scientific understanding." Today's computers are marvels of modern technology and will continue to improve as we move forward. However, we will not be able to solve this task with a more powerful classical computer. It requires new technology, more suited for the task.

(Special thanks to Dr. Ilan Richter, MD MPH, for assuring the accuracy of the medical details in this article.)

Ramon Szmuk is a Quantum Hardware Engineer at Quantum Machines.


Quantum computing will (eventually) help us discover vaccines in days - VentureBeat

Written by admin

May 17th, 2020 at 10:41 pm

Posted in Quantum Computer

Quantum computing analytics: Put this on your IT roadmap – TechRepublic

Posted: at 10:41 pm


Quantum is the next step toward the future of analytics and computing. Is your organization ready for it?

Quantum computing can solve challenges that modern computers can't, or would take them a billion years to solve. It can crack any encryption, and it can also make your data completely safe. Google reports that it has seen a quantum computer perform at least 100 million times faster than any classical computer in its lab.

Quantum computing blows away conventional data processing and algorithms because it operates on electrical circuits that can be in more than one state at once. A quantum computer operates on qubits (quantum bits) instead of the standard bits used in conventional computing.
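As a toy illustration of "more than one state at once" (my example, not the article's): a qubit's state is a pair of complex amplitudes whose squared magnitudes sum to one, so it can carry weight in both basis states simultaneously, whereas a classical bit is the special case where one amplitude is exactly 1.

```python
import math

# A qubit state (a, b) must satisfy |a|**2 + |b|**2 == 1; a classical bit
# corresponds to (1, 0) or (0, 1), with all weight in a single basis state.
def is_valid_qubit(a: complex, b: complex, tol: float = 1e-9) -> bool:
    return abs(abs(a) ** 2 + abs(b) ** 2 - 1.0) < tol

# equal superposition: weight split evenly between |0> and |1>
equal_superposition = (1 / math.sqrt(2), 1 / math.sqrt(2))
```

The normalization constraint is what distinguishes valid quantum states from arbitrary pairs of numbers; (1, 1), for instance, is not a legal qubit state.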


Quantum results can quickly make an impact for life science and pharmaceutical companies, for financial institutions evaluating portfolio risks, and for other organizations that want to expedite time-to-results for processing that would take days to complete on conventional computing platforms.

Few corporate CEOs are comfortable trying to explain to their boards what quantum computing is and why it is important to invest in it.

"There are three major areas where we see immediate corporate engagement with quantum computing," said Christopher Savoie, CEO and co-founder of Zapata Quantum Computing Software Company, a quantum computing solutions provider backed by Honeywell. "These areas are machine learning, optimization problems, and molecular simulation."

Savoie said quantum computing can bring better results in machine learning than conventional computing because of its speed. This rapid processing of data enables a machine learning application to consume large amounts of multi-dimensional data that can generate more sophisticated models of a particular problem or phenomenon under study.


Quantum computing is also well suited for solving problems in optimization. "The mathematics of optimization in supply and distribution chains is highly complex," Savoie said. "You can optimize five nodes of a supply chain with conventional computing, but what about 15 nodes with over 85 million different routes? Add to this the optimization of work processes and people, and you have a very complex problem that can be overwhelming for a conventional computing approach."
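The blow-up Savoie alludes to is combinatorial. A quick sketch of why node count matters so much (the route counts here are the plain factorial orderings of a tour; the article's "85 million" figure presumably reflects additional real-world constraints):

```python
import math

# The number of distinct orderings of a tour through n nodes grows
# factorially, which is why exhaustive optimization becomes infeasible
# so quickly on conventional hardware.
def tour_orderings(n_nodes: int) -> int:
    return math.factorial(n_nodes)

print(tour_orderings(5))    # 120 orderings for five nodes
print(tour_orderings(15))   # over a trillion, far more than 85 million
```

Going from 5 to 15 nodes takes the search space from 120 orderings to more than 1.3 trillion, which is the kind of growth that motivates quantum approaches to optimization.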

A third application area is molecular simulation in chemistry and pharmaceuticals, which can be quite complex.

In each of these cases, models of circumstances, events, and problems can be rapidly developed and evaluated from a variety of dimensions that collate data from many diverse sources into a model.


"The current COVID-19 crisis is a prime example," Savoie said. "Bill Gates knew in 2015 that handling such a pandemic would present enormous challengesbut until recently, we didn't have the models to understand the complexities of those challenges."

For those engaging in quantum computing and analytics today, the relative newness of the technology presents its own share of glitches. This makes it important to have quantum computing experts on board. For this reason, most early adopter companies elect to go to the cloud for their quantum computing, partnering with a vendor that has the specialized expertise needed to run and maintain quantum analytics.


"These companies typically use a Kubernetes cluster and management stack on premises," Savoie said. "They code a quantum circuit that contains information on how operations are to be performed on quantum qubits. From there, the circuit and the prepared data are sent to the cloud, which performs the quantum operations on the data. The data is processed in the cloud and sent back to the on-prem stack, and the process repeats itself until processing is complete."

Savoie estimated that broad adoption of quantum computing for analytics will occur within a three- to five-year timeframe, with early innovators in sectors such as oil and gas and chemistry, which already understand the value of the technology, adopting sooner.

"Whether or not you adopt quantum analytics now, you should minimally have it on your IT roadmap," Savoie said. "Quantum computing is a bit like the COVID-19 crisis. At first, there were only two deaths; then two weeks later, there were ten thousand. Quantum computing and analytics is a highly disruptive technology that can exponentially advance some companies over others."



See the original post here:

Quantum computing analytics: Put this on your IT roadmap - TechRepublic

Written by admin

May 17th, 2020 at 10:41 pm

Posted in Quantum Computer

Video: The Future of Quantum Computing with IBM – insideHPC

Posted: at 10:41 pm


Dario Gil from IBM Research

In this video, Dario Gil from IBM shares results from the IBM Quantum Challenge and describes how you can access and program quantum computers on the IBM Cloud today.

From May 4-8, we invited people from around the world to participate in the IBM Quantum Challenge on the IBM Cloud. We devised the Challenge as a global event to celebrate our fourth anniversary of having a real quantum computer on the cloud. Over those four days, 1,745 people from 45 countries came together to solve four problems ranging from introductory topics in quantum computing, to understanding how to mitigate noise in a real system, to learning about historic work in quantum cryptography, to seeing how close they could come to the best optimization result for a quantum circuit.

Those working in the Challenge joined all those who regularly make use of the 18 quantum computing systems that IBM has on the cloud, including the 10 open systems and the advanced machines available within the IBM Q Network. During the 96 hours of the Challenge, the total use of the 18 IBM Quantum systems on the IBM Cloud exceeded 1 billion circuits a day. Together, we made history: every day, the cloud users of the IBM Quantum systems made and then extended what can absolutely be called a world record in computing.

Every day we extend the science of quantum computing and advance engineering to build more powerful devices and systems. We've put two new systems on the cloud in the last month, so our fleet of quantum systems on the cloud is getting bigger and better. We'll be extending this cloud infrastructure later this year by installing quantum systems in Germany and in Japan. We've also gone more and more digital with our users, with videos, online education, social media, Slack community discussions, and, of course, the Challenge.

Dr. Dario Gil is the Director of IBM Research, one of the world's largest and most influential corporate research labs. IBM Research is a global organization with over 3,000 researchers at 12 laboratories on six continents advancing the future of computing. Dr. Gil leads innovation efforts at IBM, directing research strategies in Quantum, AI, Hybrid Cloud, Security, Industry Solutions, and Semiconductors and Systems. Dr. Gil is the 12th Director in the lab's 74-year history. Prior to his current appointment, Dr. Gil served as Chief Operating Officer of IBM Research and the Vice President of AI and Quantum Computing, areas in which he continues to have broad responsibilities across IBM. Under his leadership, IBM was the first company in the world to build programmable quantum computers and make them universally available through the cloud. An advocate of collaborative research models, he co-chairs the MIT-IBM Watson AI Lab, a pioneering industrial-academic laboratory with a portfolio of more than 50 projects focused on advancing fundamental AI research for the broad benefit of industry and society.


Read the rest here:

Video: The Future of Quantum Computing with IBM - insideHPC

Written by admin

May 17th, 2020 at 10:41 pm

Posted in Quantum Computer

Registration Open for Inaugural IEEE International Conference on Quantum Computing and Engineering – HPCwire

Posted: at 10:41 pm


LOS ALAMITOS, Calif., May 14, 2020 – Registration is now open for the inaugural IEEE International Conference on Quantum Computing and Engineering (QCE20), a multidisciplinary event focusing on quantum technology, research, development, and training. QCE20, also known as IEEE Quantum Week, will deliver a series of world-class keynotes, workforce-building tutorials, community-building workshops, and technical paper presentations and posters on October 12-16 in Denver, Colorado.

"We're thrilled to open registration for the inaugural IEEE Quantum Week, founded by the IEEE Future Directions Initiative and supported by multiple IEEE Societies and organizational units," said Hausi Müller, QCE20 general chair and co-chair of the IEEE Quantum Initiative. "Our initial goal is to address the current landscape of quantum technologies, identify challenges and opportunities, and engage the quantum community. With our current Quantum Week program, we're well on track to deliver a first-rate quantum computing and engineering event."

QCE20's keynote speakers include the following quantum groundbreakers and leaders:

The week-long QCE20 tutorials program features 15 tutorials by leading experts aimed squarely at workforce development and training considerations. The tutorials are ideally suited to developing quantum champions for industry, academia, and government and to building expertise for emerging quantum ecosystems.

Throughout the week, 19 QCE20 workshops provide forums for group discussions on topics in quantum research, practice, education, and applications. The workshops provide unique opportunities to share and discuss quantum computing and engineering ideas, research agendas, roadmaps, and applications.

The deadline for submitting technical papers to the eight technical paper tracks is May 22. Papers accepted by QCE20 will be submitted to the IEEE Xplore Digital Library. The best papers will be invited to the journals IEEE Transactions on Quantum Engineering (TQE) and ACM Transactions on Quantum Computing (TQC).

QCE20 provides attendees a unique opportunity to discuss challenges and opportunities with quantum researchers, scientists, engineers, entrepreneurs, developers, students, practitioners, educators, programmers, and newcomers. QCE20 is co-sponsored by the IEEE Computer Society, IEEE Communications Society, IEEE Council on Superconductivity, IEEE Electronics Packaging Society (EPS), IEEE Future Directions Quantum Initiative, IEEE Photonics Society, and IEEE Technology and Engineering Management Society (TEMS).

Register to be a part of the highly anticipated inaugural IEEE Quantum Week 2020. See the event website for news and all program details, including sponsorship and exhibitor opportunities.

About the IEEE Computer Society

The IEEE Computer Society is the world's home for computer science, engineering, and technology. A global leader in providing access to computer science research, analysis, and information, the IEEE Computer Society offers a comprehensive array of unmatched products, services, and opportunities for individuals at all stages of their professional careers. Known as the premier organization that empowers the people who drive technology, the IEEE Computer Society offers international conferences, peer-reviewed publications, a unique digital library, and training programs.

About the IEEE Communications Society

The IEEE Communications Society promotes technological innovation and fosters the creation and sharing of information among the global technical community. The Society provides services to members for their technical and professional advancement and forums for technical exchanges among professionals in academia, industry, and public institutions.

About the IEEE Council on Superconductivity

The IEEE Council on Superconductivity and its activities and programs cover the science and technology of superconductors and their applications, including materials and their applications for electronics, magnetics, and power systems, where the superconductor properties are central to the application.

About the IEEE Electronics Packaging Society

The IEEE Electronics Packaging Society is the leading international forum for scientists and engineers engaged in the research, design, and development of revolutionary advances in microsystems packaging and manufacturing.

About the IEEE Future Directions Quantum Initiative

IEEE Quantum is an IEEE Future Directions initiative launched in 2019 that serves as IEEE's leading community for all projects and activities on quantum technologies. IEEE Quantum is supported by leadership and representation across IEEE Societies and OUs. The initiative addresses the current landscape of quantum technologies, identifies challenges and opportunities, leverages and collaborates with existing initiatives, and engages the quantum community at large.

About the IEEE Photonics Society

The IEEE Photonics Society forms the hub of a vibrant technical community of more than 100,000 professionals dedicated to transforming breakthroughs in quantum physics into the devices, systems, and products that revolutionize our daily lives. From ubiquitous and inexpensive global communications via fiber optics, to lasers for medical and other applications, to flat-screen displays, to photovoltaic devices for solar energy, to LEDs for energy-efficient illumination, there are myriad examples of the Society's impact on the world around us.

About the IEEE Technology and Engineering Management Society

IEEE TEMS encompasses the management sciences and practices required for defining, implementing, and managing engineering and technology.

Source: IEEE Computer Society

Original post:

Registration Open for Inaugural IEEE International Conference on Quantum Computing and Engineering - HPCwire

Written by admin

May 17th, 2020 at 10:41 pm

Posted in Quantum Computer

Light, fantastic: the path ahead for faster, smaller computer processors – News – The University of Sydney

Posted: at 10:41 pm


Research team: (from left) Associate Professor Stefano Palomba, Dr Alessandro Tuniz, Professor Martijn de Sterke. Photo: Louise Cooper

Light is emerging as the leading vehicle for information processing in computers and telecommunications as our need for energy efficiency and bandwidth increases.

Already the gold standard for intercontinental communication through fibre-optics, photons are replacing electrons as the main carriers of information throughout optical networks and into the very heart of computers themselves.

However, there remain substantial engineering barriers to completing this transformation. Industry-standard silicon circuits that support light are more than an order of magnitude larger than modern electronic transistors. One solution is to compress light using metallic waveguides; however, this would not only require a new manufacturing infrastructure, but the way light interacts with metals on chips also means that photonic information is easily lost.

Now scientists in Australia and Germany have developed a modular method to design nanoscale devices to help overcome these problems, combining the best of traditional chip design with photonic architecture in a hybrid structure. Their research is published today in Nature Communications.

"We have built a bridge between industry-standard silicon photonic systems and the metal-based waveguides that can be made 100 times smaller while retaining efficiency," said lead author Dr Alessandro Tuniz from the University of Sydney Nano Institute and School of Physics.

This hybrid approach allows the manipulation of light at the nanoscale, measured in billionths of a metre. The scientists have shown that they can achieve data manipulation at 100 times smaller than the wavelength of light carrying the information.

"This sort of efficiency and miniaturisation will be essential in transforming computer processing to be based on light. It will also be very useful in the development of quantum-optical information systems, a promising platform for future quantum computers," said Associate Professor Stefano Palomba, a co-author from the University of Sydney and Nanophotonics Leader at Sydney Nano.

"Eventually we expect photonic information will migrate to the CPU, the heart of any modern computer. Such a vision has already been mapped out by IBM."

On-chip nanometre-scale devices that use metals (known as plasmonic devices) allow for functionality that no conventional photonic device allows. Most notably, they efficiently compress light down to a few billionths of a metre and thus achieve hugely enhanced, interference-free, light-to-matter interactions.

"As well as revolutionising general processing, this is very useful for specialised scientific processes such as nano-spectroscopy, atomic-scale sensing and nanoscale detectors," said Dr Tuniz, also from the Sydney Institute of Photonics and Optical Science.

However, their universal functionality was hampered by a reliance on ad hoc designs.

"We have shown that two separate designs can be joined together to enhance a run-of-the-mill chip that previously did nothing special," Dr Tuniz said.

This modular approach allows for rapid rotation of light polarisation in the chip and, because of that rotation, quickly permits nano-focusing down to about 100 times less than the wavelength.

Professor Martijn de Sterke is Director of the Institute of Photonics and Optical Science at the University of Sydney. He said: "The future of information processing is likely to involve photons using metals that allow us to compress light to the nanoscale and integrate these designs into conventional silicon photonics."

This research was supported by the University of Sydney Fellowship Scheme and the German Research Foundation (DFG) under Germany's Excellence Strategy EXC-2123/1. This work was performed in part at the NSW node of the Australian National Fabrication Facility (ANFF).

See more here:

Light, fantastic: the path ahead for faster, smaller computer processors - News - The University of Sydney


May 17th, 2020 at 10:41 pm


VTT to acquire Finland’s first quantum computer seeking to bolster Finland’s and Europe’s competitiveness – Quantaneo, the Quantum Computing Source

Posted: May 12, 2020 at 7:46 am


Quantum technology will revolutionise many industrial sectors, and will already begin spawning new, nationally significant business and research opportunities over the next few years. Advancements in quantum technology, and in particular the technological leap afforded by quantum computers (the so-called "quantum leap"), will enable unprecedented computing power and the ability to solve problems that are impossible for today's supercomputers.

Building this quantum computer will provide Finland with an exceptional level of capabilities in both research and technology, and will safeguard Finland's position at the forefront of new technology. The goal is to create a unique ecosystem for the development and application of quantum technology in Finland, in collaboration with companies and universities. VTT hopes to partner with progressive Finnish companies from a variety of sectors during the various phases of implementation and application.

The development and construction of Finland's quantum computer will be carried out as an innovation partnership that VTT will be opening up for international tender. The project will run for several years, and its total cost is estimated at about EUR 20-25 million.

The project will progress in stages. The first phase will last for about a year and aims to get a minimum five-qubit quantum computer in working order. However, the ultimate goal is a considerably more powerful machine with a larger number of qubits.

"In the future, we'll encounter challenges that cannot be met using current methods. Quantum computing will play an important role in solving these kinds of problems. For example, the quantum computers of the future will be able to accurately model viruses and pharmaceuticals, or design new materials in a way that is impossible with traditional methods," says Antti Vasara, CEO of VTT.

Through this project, VTT is seeking to be a world leader in quantum technology and its application.

"The pandemic has shocked not only Finland's economy but also the entire world economy, and it will take us some time to recover from the consequences. To safeguard economic recovery and future competitiveness, it's now even more important than ever to make investments in innovation and future technologies that will create demand for Finnish companies' products and services," says Vasara.

VTT has lengthy experience and top expertise in both quantum technology research and related fields of science and technology, such as superconductive circuits and cryogenics, microelectronics, and photonics. In Otaniemi, VTT and Aalto University jointly run Micronova, a world-class research infrastructure that enables experimental research and development in quantum technologies. This infrastructure will be further developed to meet the requirements of quantum technologies. Micronova's cleanrooms are already equipped to manufacture components and products based on quantum technologies.

Go here to see the original:

VTT to acquire Finland's first quantum computer seeking to bolster Finland's and Europe's competitiveness - Quantaneo, the Quantum Computing Source


May 12th, 2020 at 7:46 am


IonQ CEO Peter Chapman on how quantum computing will change the future of AI – VentureBeat

Posted: at 7:46 am


Businesses eager to embrace cutting-edge technology are exploring quantum computing, which depends on qubits to perform computations that would be much more difficult, or simply not feasible, on classical computers. The ultimate goal is "quantum advantage," the inflection point at which quantum computers begin to solve useful problems. While that is a long way off (if it can even be achieved), the potential is massive. Applications include everything from cryptography and optimization to machine learning and materials science.
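What makes a qubit different from a classical bit can be sketched in a few lines of plain Python (an illustrative simulation, not code from any of the companies mentioned): a qubit's state is a pair of complex amplitudes, and a gate such as the Hadamard puts it into an equal superposition of 0 and 1.

```python
import math

# A qubit's state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; a measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
def apply_hadamard(state):
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

zero = (1 + 0j, 0 + 0j)       # the |0> basis state, like a classical 0 bit
plus = apply_hadamard(zero)   # equal superposition of |0> and |1>

p0 = abs(plus[0]) ** 2        # probability of measuring 0
p1 = abs(plus[1]) ** 2        # probability of measuring 1
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

Simulating n qubits this way takes 2^n amplitudes, which is exactly why classical machines cannot keep up and why hardware qubits are needed.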

As quantum computing startup IonQ has described it, quantum computing is "a marathon, not a sprint." We had the pleasure of interviewing IonQ CEO Peter Chapman last month to discuss a variety of topics. Among other questions, we asked Chapman about quantum computing's future impact on AI and ML.

The conversation quickly turned to Strong AI, or Artificial General Intelligence (AGI), which does not yet exist. Strong AI is the idea that a machine could one day understand or learn any intellectual task that a human can.

"AI in the Strong AI sense, that I have more of an opinion [about], just because I have more experience in that personally," Chapman told VentureBeat. "And there was a really interesting paper that just recently came out talking about how to use a quantum computer to infer the meaning of words in NLP. And I do think that those kinds of things for Strong AI look quite promising. It's actually one of the reasons I joined IonQ. It's because I think that does have some sort of application."

In a follow-up email, Chapman expanded on his thoughts. "For decades, it was believed that the brain's computational capacity lay in the neuron as a minimal unit," he wrote. "Early efforts by many tried to find a solution using artificial neurons linked together in artificial neural networks with very limited success. This approach was fueled by the thought that the brain is an electrical computer, similar to a classical computer."

"However, since then, I believe we now know the brain is not an electrical computer, but an electrochemical one," he added. "Sadly, today's computers do not have the processing power to be able to simulate the chemical interactions across discrete parts of the neuron, such as the dendrites, the axon, and the synapse. And even with Moore's law, they won't next year or even after a million years."

Chapman then quoted Richard Feynman, who famously said: "Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical. And by golly, it's a wonderful problem, because it doesn't look so easy."

"Similarly, it's likely Strong AI isn't classical, it's quantum mechanical as well," Chapman said.

One of IonQs competitors, D-Wave, argues that quantum computing and machine learning are extremely well matched. Chapman is still on the fence.

"I haven't spent enough time to really understand it," he admitted. "There clearly [are] a lot of people who think that ML and quantum have an overlap. Certainly, if you think of 85% of all ML produces a decision tree, and the depth of that decision tree could easily be optimized with a quantum computer. Clearly, there [are] lots of people that think that generation of the decision tree could be optimized with a quantum computer. Honestly, I don't know if that's the case or not. I think it's still a little early for machine learning, but there clearly [are] so many people that are working on it. It's hard to imagine it doesn't have [an] application."

Chapman continued in a later email: "ML has intimate ties to optimization: Many learning problems are formulated as minimization of some loss function on a training set of examples. Generally, Universal Quantum Computers excel at these kinds of problems."
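The loss-minimization framing Chapman describes can be made concrete with a classical baseline. The sketch below (a toy example with made-up data, not from the interview) fits a line to a few training points by gradient descent on a squared-error loss; it is this kind of iterative minimization that quantum optimization approaches hope to accelerate.

```python
# Fit y = w*x to a toy training set by minimizing the mean squared-error
# loss with plain gradient descent.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # (x, y) pairs, roughly y = 2x

def loss(w):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def grad(w):
    # Derivative of the loss with respect to the single weight w
    return sum(2 * (w * x - y) * x for x, y in data) / len(data)

w = 0.0
for _ in range(200):
    w -= 0.05 * grad(w)  # step against the gradient

print(round(w, 2))  # converges near 2.0
```

Every step evaluates the loss gradient over the whole training set; real ML workloads repeat this over millions of parameters, which is why even modest speedups in the underlying optimization matter.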

He listed three improvements in ML that quantum computing will likely allow:

Whether Strong AI or ML, IonQ isn't particularly interested in either. The company leaves that to its customers and future partners.

"There's so much to be done in quantum," Chapman said. "From education at one end all the way to the quantum computer itself. I think some of our competitors have taken on lots of the entire problem set. We at IonQ are just focused on producing the world's best quantum computer for them. We think that's a large enough task for a little company like us to handle."

"So, for the moment we're kind of happy to let everyone else work on different problems," he added. "We just don't have extra bandwidth or resources to put into working on machine learning algorithms. And luckily, there [are] lots of other companies that think that there [are] applications there. We'll partner with them in the sense that we'll provide the hardware that their algorithms will run on. But we're not in the ML business, per se."

See more here:

IonQ CEO Peter Chapman on how quantum computing will change the future of AI - VentureBeat


May 12th, 2020 at 7:46 am


David Graves to Head New Research at PPPL for Plasma Applications in Industry and Quantum Information Science – HPCwire

Posted: at 7:45 am


May 11, 2020 David Graves, an internationally-known chemical engineer, has been named to lead a new research enterprise that will explore plasma applications in nanotechnology for everything from semiconductor manufacturing to the next generation of super-fast quantum computers.

Graves, a professor at the University of California, Berkeley, since 1986, is an expert in plasma applications in semiconductor manufacturing. He will become the Princeton Plasma Physics Laboratory's (PPPL) first associate laboratory director for Low-Temperature Plasma Surface Interactions, effective June 1. He will likely begin his new position from his home in Lafayette, California, in the East Bay region of San Francisco.

He will lead a collaborative research effort to not only understand and measure how plasma is used in the manufacture of computer chips, but also to explore how plasma could be used to help fabricate powerful quantum computing devices over the next decade.

"This is the apex of our thrust into becoming a multipurpose lab," said Steve Cowley, PPPL director, who recruited Graves. "Working with Princeton University, and with industry and the U.S. Department of Energy (DOE), we are going to make a big push to do research that will help us understand how you can manufacture at the scale of a nanometer." A nanometer, one-billionth of a meter, is about ten thousand times less than the width of a human hair.

The new initiative will draw on PPPL's expertise in low-temperature plasmas, diagnostics, and modeling. At the same time, it will work closely with plasma semiconductor equipment industries and will collaborate with Princeton University experts in various departments, including chemical and biological engineering, electrical engineering, materials science, and physics. In particular, collaborations with PRISM (the Princeton Institute for the Science and Technology of Materials) are planned, Cowley said. "I want to see us more tightly bound to the University in some areas because that way we get cross-fertilization," he said.

Graves will also have an appointment as professor in the Princeton University Department of Chemical and Biological Engineering, starting July 1. He is retiring from his position at Berkeley at the end of this semester. He is currently writing a book (Plasma Biology) on plasma applications in biology and medicine. He said he changed his retirement plans to take the position at PPPL and Princeton University. "This seemed like a great opportunity," Graves said. "There's a lot we can do at a national laboratory where there's bigger scale, world-class colleagues, powerful computers and other world-class facilities."

Exciting new direction for the Lab

Graves is already working with Jon Menard, PPPL deputy director for research, on the strategic plan for the new research initiative over the next five years. "It's a really exciting new direction for the Lab that will build upon our unique expertise in diagnosing and simulating low-temperature plasmas," Menard said. "It also brings us much closer to the university and industry, which is great for everyone."

The staff will grow over the next five years, and PPPL is recruiting for an expert in nano-fabrication and quantum devices. The first planned research would use converted PPPL laboratory space fitted with equipment provided by industry. Subsequent work would use laboratory space at PRISM on Princeton University's campus. In the longer term, researchers in the growing group would have brand new laboratory and office space as a central part of the Princeton Plasma Innovation Center (PPIC), a new building planned at PPPL.

Physicists Yevgeny Raitses, principal investigator for the Princeton Collaborative Low Temperature Plasma Research Facility (PCRF) and head of the Laboratory for Plasma Nanosynthesis, and Igor Kavanovich, co-principal investigator of PCRF, are both internationally-known experts in low temperature plasmas who have forged recent partnerships between PPPL and various industry partners. The new initiative builds on their work, Cowley said.

A priority research area

Research aimed at developing quantum information science (QIS) is a priority for the DOE. Quantum computers could be very powerful in solving complex scientific problems, including simulating quantum behavior in material or chemical systems. QIS could also have applications in quantum communication, especially in encryption, and quantum sensing. It could potentially have an impact in areas such as national security. "A key question is whether plasma-based fabrication tools commonly used today will play a role in fabricating quantum devices in the future," Menard said. "There are huge implications in that area. We want to be part of that."

Graves is an expert on applying molecular dynamics simulations to low-temperature plasma-surface interactions. These simulations are used to understand how plasma-generated ions, atoms and molecules interact with various surfaces. He has extensive research experience in academia and industry in plasma-related semiconductor manufacturing. That expertise will be useful for understanding how to make very fine structures and circuits at the nanometer, sub-nanometer and even atom-by-atom level, Menard said. "David's going to bring a lot of modeling and fundamental understanding to that process. That, paired with our expertise and measurement capabilities, should make us unique in the U.S. in terms of what we can do in this area."

Graves was born in Daytona Beach, Florida, and moved a lot as a child because his father was in the U.S. Air Force. He lived in Homestead, Florida; near Kansas City, Missouri; and in North Bay, Ontario; and finished high school near Phoenix, Arizona.

Graves received bachelor's and master's degrees in chemical engineering from the University of Arizona and went on to pursue a doctoral degree in the subject, graduating with a Ph.D. from the University of Minnesota in 1986. He is a fellow of the Institute of Physics and the American Vacuum Society. He is the author or co-author of more than 280 peer-reviewed publications. During his long career at Berkeley, he has supervised 30 Ph.D. students and 26 post-doctoral students, many of whom are now in leadership positions in industry and academia.

A leader since the 1990s

Graves has been a leader in the use of plasma in the semiconductor industry since the 1990s. In 1996, he co-chaired a National Research Council (NRC) workshop and co-edited the NRC's Database Needs for Modeling and Simulation of Plasma Processing. In 2008, he performed a similar role for a DOE workshop on low-temperature plasma applications, resulting in the report Low Temperature Plasma Science Challenges for the Next Decade.

Graves is an admitted Francophile who speaks (near) fluent French and has spent long stretches of time in France as a researcher. He was named Maître de Recherche (master of research) at the École Polytechnique in Palaiseau, France, in 2006. He was an invited researcher at the University of Perpignan in 2010 and received a chaire d'excellence from the Nanoscience Foundation in Grenoble, France, to study plasma-graphene interactions.

He has received numerous honors during his career. He was appointed the first Lam Research Distinguished Chair in Semiconductor Processing at Berkeley for 2011-2016. More recently, he received the Will Allis Prize in Ionized Gas from the American Physical Society in 2014 and the 2017 Nishizawa Award, associated with the Dry Process Symposium in Japan. In 2019, he was appointed foreign expert at Huazhong University of Science and Technology in Wuhan, China. He served as the first senior editor of IEEE Transactions on Radiation and Plasma Medical Science.

Graves has been married for 35 years to Sue Graves, who recently retired from the City of Lafayette, where she worked in the school bus program. The couple has three adult children. Graves enjoys bicycling and yoga and the couple loves to travel. They also enjoy hiking, visiting museums, listening to jazz music, and going to the theater.

About PPPL

PPPL, on Princeton University's Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas (ultra-hot, charged gases) and to developing practical solutions for the creation of fusion energy. The Laboratory is managed by the University for the U.S. Department of Energy's Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit the PPPL website.

Source: Jeanne Jackson DeVoe, PPPL


David Graves to Head New Research at PPPL for Plasma Applications in Industry and Quantum Information Science - HPCwire


May 12th, 2020 at 7:45 am

