
Archive for the ‘Quantum Computer’ Category

D-Wave makes its quantum computers free to anyone working on the coronavirus crisis – VentureBeat

Posted: April 2, 2020 at 7:49 am

without comments

D-Wave today made its quantum computers available for free to researchers and developers working on responses to the coronavirus (COVID-19) crisis. D-Wave partners and customers Cineca, Denso, Forschungszentrum Jülich, Kyocera, MDR, Menten AI, NEC, OTI Lumionics, QAR Lab at LMU Munich, Sigma-i, Tohoku University, and Volkswagen are also offering to help. They will provide access to their engineering teams with expertise on how to use quantum computers, formulate problems, and develop solutions.

Quantum computing leverages qubits to perform computations that would be much more difficult, or simply not feasible, for a classical computer. Based in Burnaby, Canada, D-Wave was the first company to sell commercial quantum computers, which are built to use quantum annealing. D-Wave says the move to make access free is a response to a cross-industry request from the Canadian government for solutions to the COVID-19 pandemic. Free and unlimited commercial contract-level access to D-Wave's quantum computers is available in 35 countries across North America, Europe, and Asia via Leap, the company's quantum cloud service. Just last month, D-Wave debuted Leap 2, which includes a hybrid solver service and solves problems of up to 10,000 variables.

D-Wave and its partners are hoping the free access to quantum processing resources and quantum expertise will help uncover solutions to the COVID-19 crisis. We asked the company if there were any specific use cases it expects to bear fruit. D-Wave listed analyzing new methods of diagnosis, modeling the spread of the virus, supply distribution, and pharmaceutical combinations. D-Wave CEO Alan Baratz added a few more to the list.

"The D-Wave system, by design, is particularly well-suited to solve a broad range of optimization problems, some of which could be relevant in the context of the COVID-19 pandemic," Baratz told VentureBeat. "Potential applications that could benefit from hybrid quantum/classical computing include drug discovery and interactions, epidemiological modeling, hospital logistics optimization, medical device and supply manufacturing optimization, and beyond."

Earlier this month, Murray Thom, D-Wave's VP of software and cloud services, told us quantum computing and machine learning are "extremely well matched." In today's press release, Prof. Dr. Kristel Michielsen from the Jülich Supercomputing Centre seemed to suggest a similar notion: "To make efficient use of D-Wave's optimization and AI capabilities, we are integrating the system into our modular HPC environment."

Read more:

D-Wave makes its quantum computers free to anyone working on the coronavirus crisis - VentureBeat

Written by admin

April 2nd, 2020 at 7:49 am

Posted in Quantum Computer

We’re Getting Closer to the Quantum Internet, But What Is It? – HowStuffWorks

Back in February 2020, scientists from the U.S. Department of Energy's Argonne National Laboratory and the University of Chicago revealed that they had achieved quantum entanglement, in which the behavior of two tiny particles becomes linked so that their states are identical, over a 52-mile (83.7-kilometer) quantum-loop network in the Chicago suburbs.

If you're not a scientist familiar with quantum mechanics (the behavior of matter and energy at the smallest scale of reality, which is peculiarly different from the world we can see around us), you may be wondering what all the fuss is about.

But the researchers' feat could be an important step in the development of a new, vastly more powerful version of the internet in the next few decades. Instead of the bits that today's network uses, which can only express a value of either 0 or 1, the future quantum internet would utilize qubits of quantum information, which can exist in a continuum of states blending 0 and 1. (A qubit is the unit of information for a quantum computer; it's the quantum analog of a bit in an ordinary computer.)
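The bit-versus-qubit distinction above can be sketched in a few lines of code. This is an illustrative toy, not a real quantum simulator: a qubit is modeled as a pair of amplitudes whose squared magnitudes give the odds of reading out 0 or 1.

```python
import math
import random

# Illustrative toy, not a real quantum simulator: a qubit's state is a
# pair of amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1. The
# state itself varies over a continuum, but reading it out still yields
# only 0 or 1.

def measure(alpha, beta):
    """Collapse the state: 0 with probability |alpha|^2, otherwise 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Equal superposition of 0 and 1 (what a Hadamard gate produces from |0>).
alpha = beta = 1 / math.sqrt(2)
assert math.isclose(abs(alpha) ** 2 + abs(beta) ** 2, 1.0)

random.seed(0)
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1

# Roughly half of the 10,000 readouts give each value.
assert 4500 < counts[0] < 5500
```

The point of the sketch is the asymmetry: the state before measurement lives on a continuum, but any single readout is still a plain classical bit.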

That would give the quantum internet way more bandwidth, which would make it possible to connect super-powerful quantum computers and other devices and run massive applications that simply aren't possible with the internet we have now.

"A quantum internet will be the platform of a quantum ecosystem, where computers, networks, and sensors exchange information in a fundamentally new manner where sensing, communication, and computing literally work together as one entity," explains David Awschalom via email. He's a spintronics and quantum information professor in the Pritzker School of Molecular Engineering at the University of Chicago and a senior scientist at Argonne, who led the quantum-loop project.

So why do we need this and what does it do? For starters, the quantum internet is not a replacement for the regular internet we now have. Rather, it would be a complement to it, or a branch of it, able to take care of some of the problems that plague the current internet. For instance, a quantum internet would offer much greater protection from hackers and cybercriminals. Right now, if Alice in New York sends a message to Bob in California over the internet, that message travels in more or less a straight line from one coast to the other. Along the way, the signals that transmit the message degrade; repeaters read the signals, amplify them, and correct the errors. But this process allows hackers to "break in" and intercept the message.

However, a quantum message wouldn't have that problem. Quantum networks use particles of light (photons) to send messages, which are not vulnerable to cyberattacks. Instead of encrypting a message using mathematical complexity, says Ray Newell, a researcher at Los Alamos National Laboratory, we would rely upon the peculiar rules of quantum physics. With quantum information, "you can't copy it or cut it in half, and you can't even look at it without changing it." In fact, just trying to intercept a message destroys the message, as Wired magazine noted. That would enable encryption that would be vastly more secure than anything available today.

"The easiest way to understand the concept of the quantum internet is through the concept of quantum teleportation," Sumeet Khatri, a researcher at Louisiana State University in Baton Rouge, says in an email. He and colleagues have written a paper about the feasibility of a space-based quantum internet, in which satellites would continually broadcast entangled photons down to Earth's surface, as this Technology Review article describes.

"Quantum teleportation is unlike what a non-scientist's mind might conjure up in terms of what they see in sci-fi movies," Khatri says. "In quantum teleportation, two people who want to communicate share a pair of quantum particles that are entangled. Then, through a sequence of operations, the sender can send any quantum information to the receiver (although it can't be done faster than light speed, a common misconception). This collection of shared entanglement between pairs of people all over the world essentially constitutes the quantum internet. The central research question is how best to distribute these entangled pairs to people distributed all over the world."
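Khatri's description can be made concrete with a small statevector simulation. The sketch below assumes an ideal, noiseless three-qubit system and follows the standard textbook teleportation protocol, keeping only the branch where both of the sender's measurements come out 0 (for that outcome no correction is needed on the receiver's side).

```python
import numpy as np

# Ideal (noiseless) teleportation sketch. Qubit 0 holds the message;
# qubits 1 and 2 form the pre-shared entangled pair. Qubit 0 is the most
# significant bit in the statevector index.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

def apply(gate, qubit, state, n=3):
    """Apply a single-qubit gate to the given qubit of an n-qubit state."""
    ops = [I] * n
    ops[qubit] = gate
    full = ops[0]
    for op in ops[1:]:
        full = np.kron(full, op)
    return full @ state

def cnot(control, target, state, n=3):
    """Apply a controlled-NOT by permuting basis-state amplitudes."""
    new = np.zeros_like(state)
    for idx, amp in enumerate(state):
        bits = [(idx >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        new[sum(b << (n - 1 - q) for q, b in enumerate(bits))] += amp
    return new

# Message state to teleport: alpha|0> + beta|1>.
alpha, beta = 0.6, 0.8

# Start in |message>|0>|0>, then entangle qubits 1 and 2 into a Bell pair.
state = np.kron([alpha, beta], np.kron([1, 0], [1, 0])).astype(complex)
state = apply(H, 1, state)
state = cnot(1, 2, state)

# Sender's operations before measuring qubits 0 and 1.
state = cnot(0, 1, state)
state = apply(H, 0, state)

# Keep the branch where qubits 0 and 1 are both measured as 0, renormalize.
proj = state[:2]
qubit2 = proj / np.linalg.norm(proj)

# The receiver's qubit now carries the original message state.
assert np.allclose(qubit2, [alpha, beta])
```

Note the classical bottleneck the article mentions: in the full protocol the sender must transmit her two measurement results over an ordinary channel before the receiver knows which correction to apply, which is why teleportation cannot beat the speed of light.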

Once it's possible to do that on a large scale, the quantum internet would be so astonishingly fast that far-flung clocks could be synchronized about a thousand times more precisely than the best atomic clocks available today, as Cosmos magazine details. That would make GPS navigation vastly more precise than it is today, and map Earth's gravitational field in such detail that scientists could spot the ripple of gravitational waves. It also could make it possible to teleport photons from distant visible-light telescopes all over Earth and link them into a giant virtual observatory.

"You could potentially see planets around other stars," says Nicholas Peters, group leader of the Quantum Information Science Group at Oak Ridge National Laboratory.

It also would be possible for networks of super-powerful quantum computers across the globe to work together and create incredibly complex simulations. That might enable researchers to better understand the behavior of molecules and proteins, for example, and to develop and test new medications.

It also might help physicists to solve some of the longstanding mysteries of reality. "We don't have a complete picture of how the universe works," says Newell. "We have a very good understanding of how quantum mechanics works, but not a very clear picture of the implications. The picture is blurry where quantum mechanics intersects with our lived experience."

But before any of that can happen, researchers have to figure out how to build a quantum internet, and given the weirdness of quantum mechanics, that's not going to be easy. "In the classical world you can encode information and save it and it doesn't decay," Peters says. "In the quantum world, you encode information and it starts to decay almost immediately."

Another problem is that because the amount of energy that corresponds to quantum information is really low, it's difficult to keep it from interacting with the outside world. Today, "in many cases, quantum systems only work at very low temperatures," Newell says. "Another alternative is to work in a vacuum and pump all the air out."

In order to make a quantum internet function, Newell says, we'll need all sorts of hardware that hasn't been developed yet. So it's hard to say at this point exactly when a quantum internet would be up and running, though one Chinese scientist has envisioned that it could happen as soon as 2030.

Q-CTRL to Host Live Demos of ‘Quantum Control’ Tools – Quantaneo, the Quantum Computing Source

Q-CTRL, a startup that applies the principles of control engineering to accelerate the development of the first useful quantum computers, will host a series of online demonstrations of new quantum control tools designed to enhance the efficiency and stability of quantum computing hardware.

Dr. Michael Hush, Head of Quantum Science and Engineering at Q-CTRL, will provide an overview of the company's cloud-based quantum control engineering software called BOULDER OPAL. This software uses custom machine learning algorithms to create error-robust logical operations in quantum computers. The team will demonstrate - using real quantum computing hardware in real time - how they reduce susceptibility to error by 100X and improve hardware stability in time by 10X, while reducing time-to-solution by 10X against existing software.

Scheduled to accommodate the global quantum computing research base, the demonstrations will take place:

April 16 from 4-4:30 p.m. U.S. Eastern Time (ET)

April 21 from 10-10:30 a.m. Singapore Time (SGT)

April 23 from 10-10:30 a.m. Central European Summer Time (CEST)

To register, visit

Released in Beta by Q-CTRL in March, BOULDER OPAL is an advanced Python-based toolkit for developers and R&D teams using quantum control in their hardware or theoretical research. Technology agnostic and with major computational grunt delivered seamlessly via the cloud, BOULDER OPAL enables a range of essential tasks which improve the performance of quantum computing and quantum sensing hardware. This includes the efficient identification of sources of noise and error, calculating detailed error budgets in real lab environments, creating new error-robust logic operations for even the most complex quantum circuits, and integrating outputs directly into real hardware.

The result for users is greater performance from today's quantum computing hardware, without the need to become an expert in quantum control engineering.

Experimental validations and an overview of the software architecture, developed in collaboration with the University of Sydney, were recently released in an online technical manuscript titled "Software Tools for Quantum Control: Improving Quantum Computer Performance through Noise and Error Suppression."

Disrupt The Datacenter With Orchestration – The Next Platform

Since 1965, the computer industry has relied on Moore's Law to accelerate innovation, pushing more transistors into integrated circuits to improve computation performance. Making transistors smaller helped lift all boats for the entire industry and enabled new applications. At some point, we will reach a physical limit, that is, a limit stemming from physics itself. Even with this looming limit, improvements have kept pace thanks to increased parallelism of computation and consolidation of specialized functions into single chip packages, such as systems-on-chip.

In recent years, we have been nearing another peak. This article proposes to improve computation performance not only by building better hardware, but by changing how we use existing hardware, focusing specifically on how we use existing processor types. I call this approach Compute Orchestration: automatic optimization of machine code to best use modern datacenter hardware (again, with special emphasis on different processor types).

So what is compute orchestration? It is the embracing of hardware diversity to support software.

There are many types of processors: Microprocessors in small devices, general purpose CPUs in computers and servers, GPUs for graphics and compute, and programmable hardware like FPGAs. In recent years, specialized processors like TPUs and neuromorphic processors for machine learning are rapidly entering the datacenter.

There is potential in this variety: Instead of statically utilizing each processor for pre-defined functions, we can use existing processors as a swarm, each processor working on the most suitable workloads. Doing that, we can potentially deliver more computation bandwidth with less power, lower latency, and lower total cost of ownership.
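The swarm idea can be sketched as a toy scheduler. Everything here is hypothetical: the workload classes, the affinity scores, and the greedy policy are invented for illustration, not drawn from any real orchestration system.

```python
# Hypothetical sketch of "swarm" allocation: score each workload against
# the processor types currently free and dispatch it to whichever offers
# the best estimated efficiency. All numbers below are made up.

# Estimated relative efficiency of each processor type per workload class.
AFFINITY = {
    "matrix-multiply": {"cpu": 1.0, "gpu": 8.0, "fpga": 3.0},
    "branchy-logic":   {"cpu": 1.0, "gpu": 0.2, "fpga": 0.5},
    "stream-filter":   {"cpu": 1.0, "gpu": 2.0, "fpga": 6.0},
}

def orchestrate(workloads, free_processors):
    """Greedily assign each workload to the most efficient free processor type."""
    assignments = {}
    for work in workloads:
        scores = AFFINITY[work]
        assignments[work] = max(free_processors, key=lambda p: scores[p])
    return assignments

plan = orchestrate(
    ["matrix-multiply", "branchy-logic", "stream-filter"],
    ["cpu", "gpu", "fpga"],
)
# Dense math lands on the GPU, control-heavy code stays on the CPU,
# and streaming work goes to the FPGA.
```

A production system would of course need live measurements rather than a static affinity table, which is exactly the gap the later generations described below are meant to close.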

Non-standard utilization of existing processors is already happening: GPUs, for example, were already adapted from processors dedicated to graphics into a core enterprise component. Today, GPUs are used for machine learning and cryptocurrency mining, for example.

I call the technology to utilize the processors as a swarm Compute Orchestration. Its tenets can be described in four simple bullets:

Compute orchestration is, in short, automatic adaptation of binary code and automatic allocation to the most suitable processor types available. I split the evolution of compute orchestration into four generations:

Compute Orchestration Gen 1: Static Allocation To Specialized Co-Processors

This type of compute orchestration is everywhere. Most devices today include co-processors to offload some specialized work from the CPU. Usually, the toolchain or runtime environment takes care of assigning workloads to the co-processor. This is seamless to the developer, but also limited in functionality.

The best-known example is the use of cryptographic co-processors for relevant functions. Being liberal in our definition of co-processor, Memory Management Units (MMUs), which manage virtual memory address translation, can also be considered an example.

Compute Orchestration Gen 2: Static Allocation, Heterogeneous Hardware

This is where we are at now. In the second generation, the software relies on libraries, dedicated runtime environments, and VMs to best use the available hardware. Let's call the collection of components that help better use the hardware "frameworks." Current frameworks implement specific code to better use specific processors. Most prevalent are frameworks that know how to utilize GPUs in the cloud. Usually, better allocation to bare metal hosts remains the responsibility of the developer. For example, the developer/DevOps engineer needs to make sure a machine with a GPU is available for the relevant microservice. This phenomenon is what brought me to think of Compute Orchestration in the first place, as it proves there is more slack in our current hardware.

Common frameworks like OpenCL allow programming compute kernels to run on different processors. TensorFlow allows assigning nodes in a computation graph to different processors (devices).

This better use of hardware by using existing frameworks is great. However, I believe there is a bigger edge. Existing frameworks still require effort from the developer to be optimal; they rely on the developer. Also, no legacy code from 2016 (for example) is ever going to utilize a modern datacenter GPU cluster. My view is that by developing automated and dynamic frameworks that adapt to the hardware and workload, we can achieve another leap.

Compute Orchestration Gen 3: Dynamic Allocation To Heterogeneous Hardware

Computation can take an example from the storage industry: Products for better utilization and reliability of storage hardware have innovated for years. Storage startups develop abstraction layers and special filesystems that improve efficiency and reliability of existing storage hardware. Computation, on the other hand, remains a stupid allocation of hardware resources. Smart allocation of computation workloads to hardware could result in better performance and efficiency for big data centers (for example hyperscalers like cloud providers). The infrastructure for such allocation is here, with current data center designs pushing to more resource disaggregation, introduction of diverse accelerators, and increased work on automatic acceleration (for example: Workload-aware Automatic Parallelization for Multi-GPU DNN Training).

For high-level resource management, we already have automatic allocation. For example, the Mesos project (see its paper) focuses on fine-grained resource sharing, Slurm handles cluster management, and several extensions use Kubernetes operators.

To further advance from here would require two steps: automatic mapping of available processors (which we call the compute environment) and workload adaptation. Imagine a situation where the developer doesn't have to optimize her code to the hardware. Rather, the runtime environment identifies the available processing hardware and automatically optimizes the code. Cloud environments are heterogeneous and changing, and the code should change accordingly (in fact it's not the code, but the execution model in the runtime environment of the machine code).

Compute Orchestration Gen 4: Automatic Allocation To Dynamic Hardware

"A thought, even a possibility, can shatter and transform us." (Friedrich Wilhelm Nietzsche)

The quote above is to say that we are far from a practical implementation of the concept described here (as far as I know). We can, however, imagine a technology that dynamically re-designs a data center to serve the needs of running applications. This change in the way whole data centers meet computation needs has already started. FPGAs are used more often and appear in new places (FPGAs in hosts, FPGA machines in AWS, SmartNICs), providing the framework for constant reconfiguration of hardware.

To illustrate the idea, I will use an example: Microsoft initiated Project Catapult, augmenting CPUs with an interconnected and configurable compute layer composed of programmable silicon. The timeline on the project's website is fascinating. The project started off in 2010, aiming to improve search queries by using FPGAs. Quickly, it proposed the use of FPGAs as "bumps in the wire," adding computation in new areas of the data path. Project Catapult also designed an architecture for using FPGAs as a distributed resource pool serving the whole data center. Then, the project spun off Project Brainwave, utilizing FPGAs for accelerating AI/ML workloads.

This was just one example of innovation in how we compute. A quick online search will bring up several academic works on the topic. All we need to reach the fourth generation is some idea synthesis, combining a few concepts together:

Low effort HDL generation (for example Merlin compiler, BORPH)

In essence, what I am proposing is to optimize computation by adding an abstraction layer that:

Automatic allocation on agile hardware is the recipe for best utilizing existing resources: faster, greener, cheaper.

The trends and ideas mentioned in this article can lead to many places. It is very unlikely that we are already working with existing hardware in the optimal way. It is my belief that we are in the midst of the improvement curve. In recent years, we have had increased innovation in basic hardware building blocks, new processors for example, but we still have room to improve in overall allocation and utilization. The more we deploy new processors in the field, the more slack we have in our hardware stack. New concepts, like edge computing and resource disaggregation, bring new opportunities for optimizing legacy code by smarter execution. To achieve that, legacy code can't be expected to be refactored. Developers and DevOps engineers can't be expected to optimize for the cloud configuration. We just need to execute code in a smarter way, and that is the essence of compute orchestration.

The conceptual framework described in this article should be further explored. We first need to find the killer app (what type of software we optimize for which type of hardware). From there, we can generalize. I was recently asked in a round table, "What is the next generation of computation? Quantum computing? Tensor processing units?" I responded that all of the above, but what we really need is better usage of the existing generation.

Guy Harpak is the head of technology at Mercedes-Benz Research & Development in its Tel Aviv, Israel facility. Please feel free to contact him with any thoughts on the topics above. Harpak notes that this contributed article reflects his personal opinion and is in no way related to people or companies that he works with or for.

Related Reading: If you find this article interesting, I would recommend researching the following topics:

Some interesting articles on similar topics:

Return Of The Runtimes: Rethinking The Language Runtime System For The Cloud 3.0 Era

The Deep Learning Revolution And Its Implications For Computer Architecture And Chip Design (by Jeffrey Dean from Google Research)

Beyond SmartNICs: Towards A Fully Programmable Cloud

Hyperscale Cloud: Reimagining Datacenters From Hardware To Applications

Quantum Computing: Will It Actually Produce Jobs? – Dice Insights

Posted: March 19, 2020 at 1:52 pm

If you're interested in tech, you've likely heard about the race to develop quantum computers. These systems compute via qubits, which exist not only as ones and zeros (as you find in traditional processors) but also in an in-between state known as superposition.

For tasks such as cryptography, qubits and superposition would allow a quantum computer to analyze every potential solution simultaneously, making such systems much faster than conventional computers. Microsoft, Google, IBM, and other firms are all throwing tons of resources into quantum-computing research, hoping for a breakthrough that will make them a leader in this nascent industry.
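One caveat worth quantifying: a quantum computer does not literally try every answer at once and read off the best one. For unstructured search, the proven quantum advantage (Grover's algorithm) is quadratic, roughly the square root of N steps instead of N. The toy numbers below just illustrate that scaling.

```python
import math

# Toy comparison of search costs. Classical brute force examines about
# half the candidates on average; Grover's algorithm needs on the order
# of (pi/4) * sqrt(N) quantum queries.

def classical_steps(n_keys):
    """Average number of guesses for classical brute-force search."""
    return n_keys // 2

def grover_steps(n_keys):
    """Approximate query count for Grover's algorithm."""
    return math.ceil((math.pi / 4) * math.sqrt(n_keys))

n = 2 ** 40  # about a trillion candidate keys
assert classical_steps(n) == 2 ** 39           # hundreds of billions of guesses
assert grover_steps(n) < 2 ** 21               # under a couple million queries
```

Even a quadratic speedup is transformative at cryptographic scales, which is part of why key sizes, not just algorithms, are being revisited.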

Questions abound about quantum computing, including whether these systems will actually produce the answers that companies really need. For those in the tech industry, there's a related interest in whether quantum computing will actually produce jobs at scale.

"The large tech companies and research laboratories who are leading the charge on R&D in the pure quantum computing hardware space are looking for people with advanced degrees in key STEM fields like physics, math and engineering," said John Prisco, President & CEO of Quantum Xchange, which markets a quantum-safe key distribution that supposedly will bridge the gap between traditional encryption solutions and quantum computing-driven security. "This is in large part because there are few programs today that actually offer degrees or specializations in quantum technology."

When Prisco was in graduate school, he added, "there were four of us in the electrical engineering program with the kind of physics training this field calls for. More recently, I've seen universities like MIT and Columbia investing in offering this training to current students, but it's going to take a while to produce experts."

There's every chance that increased demand for quantum-skilled technologists could drive even more universities to spin up the right kind of training and education programs. The National Institute of Standards and Technology (NIST) is evaluating post-quantum cryptography that would replace existing methods, including public-key RSA encryption. Time is of the essence when it comes to governments and companies coming up with these post-quantum algorithms; the next evolutions in cryptography will render the current generation pretty much obsolete.
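To see why NIST is in a hurry, it helps to recall what RSA's security rests on. The toy example below (insecure, textbook-sized numbers) works exactly because factoring n back into its primes is hard for classical machines; Shor's algorithm on a large enough quantum computer would remove that barrier.

```python
# Toy RSA with tiny numbers (never use these sizes in practice). The
# private key d is easy to derive for anyone who can factor n, which is
# exactly what a quantum factoring algorithm threatens.

def egcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

p, q = 61, 53               # secret primes (real keys use primes ~1024+ bits)
n = p * q                   # public modulus: 3233
e = 17                      # public exponent
phi = (p - 1) * (q - 1)
d = egcd(e, phi)[1] % phi   # private exponent: modular inverse of e

message = 42
cipher = pow(message, e, n)          # encrypt with the public key
assert pow(cipher, d, n) == message  # decrypt with the private key

# An attacker who factors n = 3233 into 61 * 53 can recompute d and read
# every message; quantum factoring would make that feasible at real key sizes.
```

Harvesting attacks make sense in this light: ciphertexts stolen today stay breakable forever once the factoring step becomes cheap.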

Combine that quest with the current shortage of trained cybersecurity professionals, and you start to see where the talent and education crunch will hit over the next several years. "While hackers weaponizing quantum computers themselves is still a far-off proposal, the threat of harvesting attacks, where nefarious actors steal encrypted data now to decrypt later once quantum computers are available, is already here," Prisco said, pointing at China's 2015 hack of the U.S. Office of Personnel Management, which saw the theft of 21 million government employee records.

"Though that stolen data was encrypted and there is no evidence it has been misused to date, the Chinese government is likely sitting on that trove, waiting for the day they have a quantum computer powerful enough to crack public key encryption," he said. "Organizations that store sensitive data with a long shelf-life need to start preparing now. There is no time to waste."

But what will make a good quantum technologist?

Herman Collins, CEO of StrategicQC, a recruiting agency for the quantum-computing ecosystem, believes that sourcing quantum-related talent at this stage comes down to credentials. "Because advanced quantum expertise is rare, the biggest sign that a candidate is qualified is whether they have a degree in one of the fields of study that relates to quantum computing," he said. "I would say that degrees, particularly advanced degrees, such as quantum physics obviously, physics theory, math or computer science are a good start. A focus on machine learning or artificial intelligence would be excellent as part of an augmented dynamic quantum skill set."

Although Google, IBM, and the U.S. government have seemingly limitless amounts of money to throw at talent, smaller companies are occasionally posting jobs for quantum-computing talent. Collins thinks that, despite the relative lack of resources, these small companies have at least a few advantages when it comes to attracting the right kind of very highly specialized talent.

"Smaller firms and startups can often speak about the ability to do interesting work that will impact generations to come, and perhaps some equity participation," he said. Likewise, some applicants may be interested in working with smaller firms to build quantum-related technology from the ground up. Others might prefer the more close-knit team environment that smaller firms may offer.

Some 20 percent of the quantum-related positions, Collins continued, are in marketing, sales, management, tech support, and operations. Even if you havent spent years studying quantum computing, in other words, you can still potentially land a job at a quantum-computing firm, doing all the things necessary to ensure that the overall tech stack keeps operating.

"It is equally important for companies in industries where quantum can have impactful results in the nearer term to begin to recruit and staff quantum expertise now," Collins said. "Companies competing in financial services, aerospace, defense, healthcare, telecommunications, energy, transportation, agriculture and others should recognize the vital importance of looking very closely at quantum and adding some skilled in-house capability."

Given the amount of money and research-hours already invested in quantum computing, as well as some recent (and somewhat controversial) breakthroughs, there's every chance the tech industry could see an uptick in demand for jobs related to quantum computing. Even for those who don't plan on specializing in this esoteric field, there may be opportunities to contribute.

Quantum computing is right around the corner, but cooling is a problem. What are the options? – Diginomica

Why would you be thinking about quantum computing? Yes, it may be two years or more before quantum computing will be widely available, but there are already quite a few organizations that are pressing ahead. I'll get into those use cases, but first, let's start with the basics:

Classical computers require built-in fans and other ways to dissipate heat, and quantum computers are no different. Instead of working with bits of information that can be either 0 or 1, as in a classical machine, a quantum computer relies on "qubits," which can be in both states simultaneously (called a superposition) thanks to the quirks of quantum mechanics. Those qubits must be shielded from all external noise, since the slightest interference will destroy the superposition, resulting in calculation errors. Well-isolated qubits heat up quickly, so keeping them cool is a challenge.

The current operating temperature of quantum computers is 0.015 kelvin, or about -273 degrees C (-460 degrees F). That is the only way to slow down the movement of atoms enough for a qubit to hold a value.

There have been some creative solutions proposed for this problem, such as the "nanofridge," which builds a circuit with an energy gap dividing two channels: a superconducting fast lane, where electrons can zip along with zero resistance, and a slow resistive (non-superconducting) lane. Only electrons with sufficient energy to jump across that gap can get to the superconductor highway; the rest are stuck in the slow lane. This has a cooling effect.

Just one problem though: the inventor, Mikko Möttönen, is confident enough in the eventual success that he has applied for a patent for the device. However, "maybe in 10 to 15 years, this might be commercially useful," he said. "It's going to take some time, but I'm pretty sure we'll get there."

Ten to fifteen years? It may be two years or more before quantum computing will be widely available, but there are already quite a few organizations that are pressing ahead in the following sectors:

An excellent, detailed report on the quantum computing ecosystem is "The Next Decade in Quantum Computing - and How to Play."

But the cooling problem must get sorted. It may be diamonds that finally solve some of the commercial and operational/cost issues in quantum computing: synthetic diamonds, also known as lab-grown diamonds.

The first synthetic diamond was grown by GE in 1954. It was an ugly little brown thing. By the '70s, GE and others were growing up to 1-carat off-color diamonds for industrial use. By the '90s, a company called Gemesis (renamed Pure Grown Diamonds) successfully created one-carat flawless diamonds graded IIa, meaning perfect. Today designer diamonds come in all sizes and colors: adding boron makes them blue, and nitrogen makes them yellow.

Diamonds have unique properties. They have high thermal conductivity (meaning they don't melt the way silicon does); in fact, the thermal conductivity of a pure diamond is the highest of any known solid. They are also excellent electrical insulators. Some diamonds contain an impurity called an N-V (nitrogen-vacancy) center, where a nitrogen atom replaces a carbon atom next to a vacant site in the lattice, leaving an unpaired electron that can be excited or polarized by a laser. When excited, the electron gives off a single photon as it drops to a lower energy state. Somehow, and I admit I don't completely understand this, the particle is placed into a quantum superposition. In quantum-speak, that means it can be two things, two values, two places at once, with both spin up and spin down. That is the essence of quantum computing: the creation of a "qubit," something that can be both 0 and 1 at the same time.
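The state of a single qubit can be written down classically as a two-amplitude vector. A minimal NumPy sketch of an equal superposition, the "both 0 and 1 at once" state described above:

```python
import numpy as np

# A qubit is a unit vector in C^2: |psi> = alpha|0> + beta|1>.
# An equal superposition has |alpha|^2 = |beta|^2 = 0.5.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared amplitude magnitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- equally likely to read 0 or 1
```

Simulating one qubit this way is trivial; the point of real quantum hardware is that this bookkeeping explodes exponentially as qubits are added.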

If that isn't weird enough, there is the issue of entanglement. A microwave pulse can be directed at a pair of qubits, placing them both in the same state. But you can also "entangle" them so that their states are always correlated: change the state of one and the other changes too, even if great distances separate them, a phenomenon Einstein dubbed "spooky action at a distance." Entangled photons don't need bulky equipment to keep them in their quantum state, and they can transmit quantum information across long distances.
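Entanglement can also be simulated in a few lines. Applying a Hadamard gate and then a CNOT to two qubits starting in |00> produces the Bell state, whose only possible measurement outcomes are 00 and 11, so reading one qubit fixes the other:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start both qubits in |00>, apply H to the first, then CNOT.
state = np.kron(np.array([1.0, 0.0]), np.array([1.0, 0.0]))  # |00>
state = CNOT @ (np.kron(H, I) @ state)

# Result is the Bell state (|00> + |11>)/sqrt(2): the amplitudes for
# |01> and |10> are exactly zero, so the qubits' outcomes always match.
print(np.round(state, 3))  # [0.707 0.    0.    0.707]
```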

In theory, because of entanglement, each added qubit doubles a quantum computer's computing power. In telecommunications, for example, entangled photons that span the traditional telecommunications spectrum have enormous potential for multi-channel quantum communication.

News flash: physicists have just demonstrated three-particle entanglement, which increases the capacity of quantum computing geometrically.

The cooling of qubits is the stumbling block. Diamonds seem to offer a solution, one that could bring quantum computing into the mainstream. The impurities in synthetic diamonds can be manipulated, the state of a qubit can be held at room temperature, unlike in other potential quantum computing systems, and N-V-center qubits (described above) are long-lived. There are still many issues to unravel before quantum computers become feasible, but for today, unless you have a refrigerator at home that can operate near absolute zero, hang on to that laptop.

But don't diamonds in computers sound expensive, flagrant, excessive? It begs the question: what is anything worth? Synthetic diamonds for jewelry are not as expensive as mined gems, but the price one pays at retail is burdened by the effects of monopoly and of the many intermediaries: distributors, jewelry companies, and retailers.

A recent book explored the value of fine things and explains that their perceived value has only a psychological basis. In the 1930s, De Beers, which had a monopoly on the world diamond market and too many diamonds for the weak demand, engaged the N. W. Ayer advertising agency, realizing that diamonds were being sold only to the very rich while everyone else was buying cars and appliances. Together they created a market for diamond engagement rings and introduced the idea that a man should spend at least three months' salary on a diamond for his betrothed.

And in a classic case of selling an idea, not a brand, they used earworm taglines like "diamonds are forever." Those four iconic words have appeared in every single De Beers advertisement since 1948, and AdAge named the phrase the #1 slogan of the century in 1999. Incidentally, diamonds aren't forever: that diamond on your finger is slowly evaporating.

The worldwide outrage over the blood diamond scandal is increasing demand for synthetic diamonds in fine jewelry. If quantum computers take off and a diamond-based architecture becomes a standard, it will spawn a synthetic diamond production boom, increasing supply and drastically lowering costs, making the technology feasible.

Many thanks to my daughter, Aja Raden, an author, jeweler, and behavioral economist, for her insights about the diamond trade.

Here is the original post:

Quantum computing is right around the corner, but cooling is a problem. What are the options? - Diginomica

Written by admin

March 19th, 2020 at 1:52 pm

Posted in Quantum Computer

Quantum Computing for Everyone – The Startup – Medium

Posted: at 1:52 pm

without comments

Qubits can be exponentially faster than bits at several computing problems, such as database searches and factoring (which, as we will discuss soon, may break your Internet encryption).

An important thing to realize is that qubits can hold much more information than bits can. One qubit holds the same amount of information as one bit: each can hold only one value. However, four bits must be used to store the same amount of information as two qubits. A two-qubit system in equal superposition holds values for four states, which on a classical computer would need at least four bits to hold. Eight bits are needed to store the same amount of information as three qubits, since a three-qubit system can store eight states: 000, 001, 010, 011, 100, 101, 110, and 111. This pattern continues.
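The doubling pattern above can be seen directly by writing out the state vector a classical computer must store to simulate an n-qubit register in equal superposition:

```python
import numpy as np

# An n-qubit superposition is described by 2**n amplitudes, so the
# classical bookkeeping doubles with every qubit added.
for n in range(1, 6):
    amplitudes = np.full(2**n, 1 / np.sqrt(2**n))  # equal superposition
    print(n, "qubits ->", len(amplitudes), "basis states")
```

The loop prints 2, 4, 8, 16, and 32 basis states for one through five qubits.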

The graph below provides a visual for the computing power of qubits. The x-axis represents the number of qubits used to hold a certain amount of information. The blue line's y-value is the number of bits needed to hold the same amount of information as x qubits, or 2 to the power of x. The red line's y-value is the number of qubits needed to hold the same amount of information as x qubits (y = x).

Imagine the exponential speedup quantum computing can provide! A gigabyte (8e9 bits) worth of information can be represented with log(8e9)/log(2) = 33 qubits (rounded up from 32.9).
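That figure follows from a one-line calculation:

```python
import math

# Number of qubits whose 2**n state space covers a gigabyte of bits.
bits_in_gigabyte = 8e9
qubits = math.ceil(math.log2(bits_in_gigabyte))
print(qubits)  # 33
```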

Quantum computers are also great at factoring numbers, which leads us to RSA encryption. The security protocol that secures Medium, and probably any other website you've been on, is known as RSA encryption. It relies on the fact that, with current computing resources, it would take a very, very long time to factor a number m that is hundreds of digits long and whose only factorization is p times q, where both p and q are large prime numbers. However, dividing m by p or q is computationally much easier, and since m divided by q returns p and vice versa, this provides a quick key-verification system.
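The asymmetry can be sketched with deliberately tiny primes (nothing like real RSA key sizes, where p and q are hundreds of digits each): multiplying is instant, while recovering the factors by trial division is the direction that blows up as the numbers grow.

```python
def factor(m):
    """Naive trial division: feasible here only because m is tiny."""
    d = 2
    while d * d <= m:
        if m % d == 0:
            return d, m // d
        d += 1
    return m, 1

p, q = 104729, 1299709        # small primes for illustration
m = p * q                     # the easy direction
print(factor(m))              # the hard direction recovers (p, q)
```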

A quantum algorithm called Shor's algorithm has shown exponential speedup in factoring numbers, which could one day break RSA encryption. But don't buy into the hype yet: as of this writing, the largest number factored by a quantum computer running Shor's algorithm is 21 (into 3 and 7). The hardware has not been developed for quantum computers to factor 30-digit numbers, or even 10-digit numbers. And even if quantum computers one day do break RSA encryption, a security protocol called BB84, which itself relies on quantum properties, is believed to be safe from quantum computers.
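Shor's algorithm factors N by finding the period r of a^x mod N and then taking two gcds. The classical skeleton can be sketched for N = 21; here the period is found by brute force, which is exactly the step a quantum computer would accelerate:

```python
from math import gcd

N, a = 21, 2
# Find the period r: the smallest r > 0 with a**r = 1 (mod N).
# This brute-force search is the part Shor's algorithm speeds up.
r = 1
while pow(a, r, N) != 1:
    r += 1

# With r even (and a**(r//2) not equal to -1 mod N), the gcds
# of a**(r//2) +/- 1 with N reveal the prime factors.
print(r)                            # period r = 6
print(gcd(pow(a, r // 2) - 1, N))   # 7
print(gcd(pow(a, r // 2) + 1, N))   # 3
```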

So will quantum computers ever completely replace the classical PC? Not in the foreseeable future.

Quantum computing, while developing very rapidly, is still in its infancy, with research conducted semi-competitively by large corporations such as Google, Microsoft, and IBM. Much of the hardware needed to accelerate quantum computing does not yet exist. There are several obstacles to a quantum future, a major one being gate errors and maintaining the integrity of a qubit's state.

However, given the amount of innovation that has happened in the past few years, it seems inevitable that quantum computing will make huge strides during our lifetimes. At the same time, complexity theory has shown that there are several cases where classical computers perform better than quantum computers, and IBM's quantum computer developers state that quantum computing will probably never completely eliminate classical computers. Instead, in the future we may see a hybrid chip that relies on quantum transistors for certain tasks and classical transistors for others, depending on which is more appropriate.

Read more:

Quantum Computing for Everyone - The Startup - Medium

Written by admin

March 19th, 2020 at 1:52 pm

Posted in Quantum Computer

Work from home: Improve your security with MFA – We Live Security

Posted: at 1:52 pm

without comments

Remote work can be much safer with the right cyberhygiene practices in place; multifactor authentication is one of them.

If you happen to be working from home due to the COVID-19 pandemic, you should beef up your logins with multi-factor authentication (MFA), sometimes called two-factor authentication (2FA). That way, you don't have to entrust your security to a password alone. Easy to hack, steal, leak, rinse and repeat, passwords have become passé in the security world; it's time to dial in your MFA.

That means you have something besides just a password. You may have seen MFA in action when you log into your bank and receive an access code on your smartphone that you must also enter to verify it's really you. While it's an extra step, it makes it exponentially more difficult for bad guys to get into your account, even if they have a password that was compromised in a breach or otherwise.

The good news is that MFA is no longer super-tough to use. Here, we look at a few different popular ways to use it. If you need to work remotely now and log into a central office to collaborate with co-workers, this is a nice way to beef up the security of those connections.

This means you have something like a key fob or security USB key that can generate a very secure passcode that's all but impossible to break (unless you have a quantum computer handy). Nowadays, devices like the YubiKey or Thetis are available for less than US$50 and are very widely supported, whether you're logging into your own corporate office technology, online office applications, or a host of other cloud applications. Your normal login will ask for a password, plus the code generated by your device. These devices are often physically small enough to get lost in a pants pocket, so some folks hang them on their keychain for safekeeping.

Nowadays you probably carry a mobile device around most of the time, which is a good argument for using it to boost your MFA security stance. For example, you can download an authentication app such as Authy, Google Authenticator, or ESET Secure Authentication. Whatever you choose, make sure it has a solid security history, since it will reside on your smartphone, which we now know can itself be compromised, undermining your other security efforts.
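Under the hood, these authenticator apps derive their rotating codes with the TOTP scheme (RFC 6238): an HMAC over the current 30-second time step, truncated to a short numeric code. A minimal sketch, using an arbitrary example secret:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32, for_time=None, period=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 over the current time step,
    dynamically truncated (RFC 4226) to a short numeric code."""
    key = base64.b32decode(secret_b32)
    step = int(time.time() if for_time is None else for_time) // period
    digest = hmac.new(key, struct.pack(">Q", step), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# Server and phone share the secret once (the QR code you scan at
# setup), then each derives matching codes from the clock.
print(totp("JBSWY3DPEHPK3PXP"))  # a six-digit code that rotates every 30 s
```

The shared secret here is an arbitrary example; each account gets its own.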

RELATED READING: Work from home: How to set up a VPN

It's worth noting that spam SMS messages on your smartphone can trick some users into voluntarily compromising their own accounts, so stay on the lookout if you use this method. Of course, reputable mobile security software can help if you're concerned about security problems on the platform itself.

It's very hard to fake a fingerprint or retinal scan, which makes biometrics a solid factor in MFA. Nowadays, lots of devices have built-in biometric readers that can capture an image of your face or scan your fingerprint, so it's not hard to implement this on a device you probably already have. Some folks steer away due to privacy concerns, which promises to be an ongoing conversation. Also, while you can reset a password, if a provider gets hacked it is notoriously difficult to reset your face (old spy-movie plots, anyone?).

The important thing with MFA is to pick a method that suits your goals and is easy to fold into your routine. I have a very good lock on my front door, but it's very hard to use, so my wife often catches me leaving the door open, which isn't very secure, is it? Good security you don't use can't protect you.

MFA can offer side benefits in the event of a breach, too. If you are notified that your password has been compromised, there's a very good chance the attackers don't also have one of your other factors, so successful hack attacks should drop precipitously when MFA is correctly implemented. Use an MFA solution and enjoy technology more safely.

Read the rest here:

Work from home: Improve your security with MFA - We Live Security

Written by admin

March 19th, 2020 at 1:52 pm

Posted in Quantum Computer

Career navigation Be at the core or be at the edge – The Financial Express BD

Posted: at 1:52 pm

without comments

Radi Shafiq | Published: March 19, 2020 11:02:35

In 2009, for aspiring engineering students, electrical engineering was the best subject to study. By the end of 2014 it seemed to be computer science; now it seems to be data science and statistics. There is no telling what is to come in five years. Maybe it is quantum computing, or a new era emphasising mental well-being; maybe biochemistry, or philosophy, suddenly takes centre stage in every endeavour.

Today, the market is shifting at an ever-increasing pace. It is easy to feel lost while navigating a career, looking for the best path to climb the ladder. Young professionals are essentially trying to be good enough to remain relevant, even vital, in 20-30 years. However, most of today's buzz-worthy careers were not even around 10 years ago, so how can one prepare for something 20 years down the line?

Here the author has found a framework of thinking very helpful. It can be called the "be at the core or be at the edge" way of thinking about jobs. Every company has some core functions that are time-tested and relatively stable: for some it is manufacturing, for some sales, for others field management. These functions have well-defined roles, a hierarchy, and a history to go with them. If someone is good at this core work, the job is more secure, with little chance of unpredictable trouble. A clear hierarchy means the career will also have a defined progression, although at a predictable pace, with new openings created only by seniors moving out or up, or by company growth.

On the other hand, there are the functions at the edge of the company. These are the new things: perhaps a new data section, a digital marketing wing, or a small research team that is yet to make an impact. At the edge are people who often keep a low profile while staying flexible enough to take initiatives in creative new directions. They introduce new programmes and explore sudden new flows of value or revenue. They can often be deemed unnecessary by the more core-minded people in the organisation.

However, since this is a time of maximum change in the market landscape, the people at the edge have the best chance of adapting to a new reality and introducing the functions that take the company to the next level. This can suddenly make the edge people the core people, or at least a vital support function for the core to survive and thrive. Think of the way Adobe stopped regular software sales in favour of subscription services, how newspapers increasingly emphasise the web version over print, or how all the TV shows now work overtime on YouTube clips.

People who are over-invested in their core function and their way of doing things can become stiff and slow to look into new avenues, as looking at anything outside can understandably feel like a waste of time. Why would anyone stop doing what makes the most money and instead dabble in stuff with no proven market? This thinking cuts them off from dynamic learning possibilities. Then sudden changes are brought about by one company, and in the aftermath the whole market adapts, quickly changing the old core people's position in the market hierarchy. Suddenly the market demands that they learn new tricks to stay relevant in what was a secure place for years.

Very often, though, there is no harm in digging deep into the core of the company. It can be a very safe bet, as most businesses do not change so dramatically.

But to reduce the risk of suddenly being left irrelevant in the market, it is best that everyone invest a portion of their time in projects at the edge of their organisation, or at the edge of their skill set, all throughout their career. This flexibility will keep them in touch with the changing tides and ensure they can ride the wave, or at least not be taken by surprise when the change finally comes.

This thinking works at any stage of life. When the author was a student, he did digital art just for fun, but ultimately it helped him land his first three part-time jobs; having those skills was a bonus on top of his studies. He had friends whose outside interest in videography while studying computer science ended up shaping their whole careers. In the author's office, he has seen a colleague's occasional contribution to a new initiative become 50 per cent of her duties within a year, leading to a promotion and recognition.

So, think again, at the office, are you at the core or at the edge? Why not both? Keep learning. Keep creating.

Radi Shafiq is a development professional and artist. He can be reached at

Continued here:

Career navigation Be at the core or be at the edge - The Financial Express BD

Written by admin

March 19th, 2020 at 1:52 pm

Posted in Quantum Computer

TensorFlow gets its quantum of solace, lid lifted on ‘all-seeing crime-detecting’ AI upstart, and more – The Register

Posted: March 17, 2020 at 5:44 am

without comments

Roundup Here's a handy little roundup of all the bits of AI news that you may have missed.

Uh oh, another surveillance company has been secretly purloining data from social media: Banjo, the AI startup that believes its software can detect and surface crimes and other activities in real time from all kinds of data feeds, also scraped information from people's public social media profiles.

However, it wasn't as brazen as Clearview, the controversial upstart known for downloading over three billion photos from Facebook, Instagram, YouTube, Twitter, and more to put together a massive dataset for facial recognition. Banjo apparently created a shadow company called Pink Unicorn Labs, according to Vice.

Pink Unicorn Labs went on to develop three apps aimed at fans of things like the British boyband One Direction, EDM music, and Formula One racing. These apps asked users to connect and sign in using their accounts on social media platforms like Facebook, Twitter, Instagram, Google Plus, and FourSquare, as well as VK and Sina Weibo, commonly used in Russia and China. Linking the Pink Unicorn Labs apps to people's accounts made it possible to scrape those netizens' data, such as images or location history.

Code across all three apps contained links to Banjo's website. Both companies were registered at the same address in Redwood City, California, and headed by Banjo's CEO, Damien Patton.

Pink Unicorn Labs' apps were removed from the Google Play Store in 2016. Even though the data might be publicly posted on people's accounts, scraping it for commercial purposes is against the terms of service of these platforms.

AI helps historians read messages carved on ancient bones: Researchers from Southwest University in China used a convolutional neural network to classify and read ancient scripts carved on bones dating back more than 3,000 years, to between 1600 and 1046 BC.

The Chinese characters were written in Yi script, whose oldest examples show it was used in the Middle Kingdom from the 15th century BC. Studying these ancient texts is difficult: not only does it require extensive knowledge of the language and its history, but the messages imprinted on these bones have cracked and worn away over time.

Here's where the machine learning bit comes in. A convolutional neural network was trained on images of these texts, with each character labelled, so that it could recognize scripts carved on other types of bones, according to a paper published in IEEE Computer Graphics and Applications.

The researchers used a dataset consisting of 1,476 tortoise shell rubbings and 300 ox bone rubbings, from which they chose one-third as the test set and two-thirds as the training set. "Experiment results show the proposed method reaches a level close to that of oracle experts," Synched explained this week.

"As I said, classification is the first step," Shanxiong Chen, first author of the paper and an associate professor of computer and information science, told Synched.

"This study specifically focused on telling apart animal bones and tortoise shells, and we're continuously working with Capital Normal University's Center for Oracle Bone Studies on further classifying different types of animal bones."

ICLR 2020 goes virtual: Tech conferences are dropping like flies amid the current coronavirus outbreak. Now the International Conference on Learning Representations (ICLR), a top academic machine learning conference, has decided to cancel its physical event, due to take place in Addis Ababa, Ethiopia, next month.

"Due to growing concerns about COVID-19, ICLR 2020 will cancel its physical conference this year, instead shifting to a fully virtual conference," it announced this week. "We were very excited to hold ICLR in Addis Ababa, and it is disappointing that we will not all be able to come together in person in April."

Organisers have asked all academics with accepted papers to create a five-minute video presenting their work as part of a virtual poster session. For those invited to give a talk, the video is extended to 15 minutes and the information should be conveyed in a series of slides. Workshops are a little trickier to put together; ICLR is currently contacting speakers to coordinate.

All registration fees and travel purchases for the conference will be reimbursed, and the price to attend the digital conference has dropped to $50 for students and $100 for non-students.

New TensorFlow library! If you're bored at home, social distancing from all your friends, family, and colleagues, then try this: TensorFlow's latest library lets you build quantum AI models.

Your brain will probably turn to mush trying to understand and combine both quantum computing and machine learning. The library, known as TensorFlow Quantum (TFQ), was built by folks over at Google, the University of Waterloo, X, and Volkswagen to give developers tools to process data that could, theoretically, run on quantum computers.

"We announce the release of TensorFlow Quantum (TFQ), an open-source library for the rapid prototyping of quantum ML models," the Chocolate Factory said this week. "TFQ provides the tools necessary for bringing the quantum computing and machine learning research communities together to control and model natural or artificial quantum systems."

View post:

TensorFlow gets its quantum of solace, lid lifted on 'all-seeing crime-detecting' AI upstart, and more - The Register

Written by admin

March 17th, 2020 at 5:44 am

Posted in Quantum Computer

Page 1123