AXOGEN : MANAGEMENT’S DISCUSSION AND ANALYSIS OF FINANCIAL CONDITION AND RESULTS OF OPERATIONS (form 10-Q) – marketscreener.com
Posted: May 9, 2021 at 1:53 am
Unless the context otherwise requires, all references in this report to "Axogen," the "Company," "we," "us" and "our" refer to Axogen, Inc., and its wholly owned subsidiaries Axogen Corporation ("AC"), Axogen Processing Corporation, and Axogen Europe GmbH.

OVERVIEW

We are the leading company focused specifically on the science, development and commercialization of technologies for peripheral nerve regeneration and repair. We are passionate about helping to restore peripheral nerve function and quality of life to patients with physical damage or transection to peripheral nerves by providing innovative, clinically proven and economically effective repair solutions for surgeons and health care providers. Peripheral nerves provide the pathways for both motor and sensory signals throughout the body. Every day people suffer traumatic injuries or undergo surgical procedures that impact the function of their peripheral nerves. Physical damage to a peripheral nerve, or the inability to properly reconnect peripheral nerves, can result in the loss of muscle or organ function, the loss of sensory feeling or the initiation of pain.

Our platform for peripheral nerve repair features a comprehensive portfolio of products, including Avance Nerve Graft, a biologically active off-the-shelf processed human nerve allograft for bridging severed peripheral nerves without the comorbidities associated with a second surgical site; Axoguard Nerve Connector, a porcine submucosa extracellular matrix ("ECM") coaptation aid for tensionless repair of severed peripheral nerves; Axoguard Nerve Protector, a porcine submucosa ECM product used to wrap and protect injured peripheral nerves and reinforce the nerve reconstruction while preventing soft tissue attachments; Axoguard Nerve Cap, a porcine submucosa ECM product used to protect a peripheral nerve end and separate the nerve from the surrounding environment to reduce the development of symptomatic or painful neuroma; and Avive Soft Tissue Membrane, a processed human umbilical cord intended for surgical use as a resorbable soft tissue barrier. Along with these core surgical products, we also offer the Axotouch Two-Point Discriminator, used to measure the innervation density of any surface area of the skin. Our portfolio of products is available in the United States, Canada, the United Kingdom, several European countries, South Korea and other international markets.

Revenue from the distribution of our nerve repair products, Avance Nerve Graft, Axoguard Nerve Connector, Axoguard Nerve Protector, Axoguard Nerve Cap and Avive Soft Tissue Membrane, in the United States is the main contributor to our total reported sales and has been the key component of our growth to date.

We have found that surgeons are initially cautious adopters of nerve repair products. Surgeons typically start with a few cases and then wait and review the results of these initial cases. Active accounts are usually past this wait period and have developed some level of product reorder. These active accounts have typically gone through the committee approval process, have at least one surgeon who has converted a portion of his or her treatment algorithms of peripheral nerve repair to our portfolio, and have ordered our products at least six times in the last 12 months. In the first quarter, we had 919 active accounts, an increase of 26% from 731 one year ago. Active accounts represent approximately 85% of our revenue, and the top 10% of these active accounts continue to represent approximately 35% of our revenue.
As our business continues to grow, we will transition to reporting a new account metric that we believe demonstrates the strength of adoption and potential revenue growth in accounts that have developed a more consistent use of Axogen products in their nerve repair algorithm. We refer to these as "Core Accounts," which we define as active accounts that have purchased at least $100,000 in the past 12 months. In the first quarter, we had 274 Core Accounts, an increase of 13% from 243 one year ago. These Core Accounts represented approximately 60% of our revenue in the quarter, a proportion that has remained consistent over the past two years.

As such, revenue growth primarily occurs from increased purchasing from active accounts, followed by revenue growth from new accounts. During the COVID-19 pandemic, we kept our sales team and broader commercial organization intact and took the opportunity to provide extensive sales training. Our sales team developed skills and shared best practices around remote case support where hospital access has been restricted. We believe this remote support has been appreciated by customers and has expanded our sales team's ability to support surgeons and their patients during COVID-19 and beyond.

There have been no significant changes to our critical accounting policies from those disclosed in our 2020 Annual Report on Form 10-K.

Results of Operations

Comparison of the Three Months Ended March 31, 2021 and 2020

                                              Three Months Ended March 31,
                                               2021                  2020
                                          Amount    % of       Amount    % of
                                                    Revenue              Revenue
                                                 (dollars in thousands)
Revenues                                $ 31,037    100.0 %   $ 24,261   100.0 %
Cost of goods sold                         5,172     16.7        4,816    19.9
Gross Profit                              25,865     83.3       19,445    80.1
Costs and expenses
  Sales and marketing                     17,973     57.9       17,838    73.5
  Research and development                 5,748     18.5        4,614    19.0
  General and administrative               8,364     26.9        5,502    22.7
Total costs and expenses                  32,085    103.4       27,954   115.2
Loss from operations                      (6,220)   (20.0)      (8,509)  (35.1)
Other (expense) income:
  Investment income                           34      0.1          311     1.3
  Interest expense                          (444)    (1.4)         (31)    0.0
  Change in fair value of derivatives          -        -            -     0.0
  Other expense                              (30)    (0.1)          37     0.1
Total other (expense) income, net           (440)    (1.4)         317     1.4
Net Loss                                $ (6,660)   (21.5) %  $ (8,192)  (33.8) %

Revenues

Revenues for the three months ended March 31, 2021 increased 28% to $31,037 as compared to $24,261 for the three months ended March 31, 2020. Revenue growth was driven by an increase in unit volume of approximately 22%, as well as the net impact of changes in prices and product mix of approximately 6%. The growth in unit volume was attributable to unit growth in our active accounts and also reflects the initial negative impact of the COVID-19 pandemic, which began to reduce procedure volumes and revenue in March of 2020.

Gross Profit

Gross profit for the three months ended March 31, 2021 increased 33% to $25,865 as compared to $19,445 for the three months ended March 31, 2020. Gross margin increased to 83% in the three months ended March 31, 2021 compared to 80% for the three months ended March 31, 2020. Prior year gross margin was negatively impacted by excess inventory write-downs.

Costs and Expenses

Total costs and expenses increased 15% to $32,085 for the three months ended March 31, 2021, as compared to $27,954 for the three months ended March 31, 2020.
The prior year included a credit of $1,697 to stock compensation expense, primarily reflecting lower estimates of the performance stock awards expected to be earned as a result of the impact of COVID-19 on the performance metrics for these awards. Additionally, the increase over the prior year includes higher compensation and litigation expenses, as well as increased expenses for our new Tampa facility. These expenses were partially offset by decreases in travel, in-person conferences and surgeon education programs due to COVID-19 related restrictions. As a percentage of total revenues, total costs and expenses decreased to 103% for the three months ended March 31, 2021, as compared to 115% for the three months ended March 31, 2020.

Sales and marketing expenses increased less than 1% to $17,973 for the three months ended March 31, 2021, as compared to $17,838 for the three months ended March 31, 2020. This increase was primarily due to higher compensation-related expenses, including sales commissions, offset by a decrease in travel and symposium expense due to pandemic-related restrictions. As a percentage of total revenues, sales and marketing expenses decreased to 58% for the three months ended March 31, 2021 as compared to 74% for the three months ended March 31, 2020.

Research and development expenses increased 25% to $5,748 for the three months ended March 31, 2021, as compared to $4,614 for the three months ended March 31, 2020. Research and development costs include both product development and clinical trials; product development spending reflects a number of specific programs, including our efforts related to the BLA for Avance Nerve Graft and a next-generation Avance product. Product development expenses represented approximately 66% of total research and development expense in the three months ended March 31, 2021 as compared to 50% in the prior year period. Clinical trial expenses represented approximately 34% of research and development expense in the three months ended March 31, 2021 as compared to 50% in the prior year period. The increase in product development expenses reflects increased spending in specific programs, including our efforts related to the BLA for Avance Nerve Graft and a next-generation Avance product. Additionally, pandemic-related restrictions lowered spending on certain of our clinical study programs. In the first quarter of 2021, we reinitiated activities in our Sensation-NOW and Rethink Pain Registries, and we expect that these and other clinical activities will continue to increase across the coming quarters. As a percentage of total revenues, research and development expenses remained flat at 19% for both the three months ended March 31, 2021 and 2020.

General and administrative expenses increased 52% to $8,364 for the three months ended March 31, 2021, as compared to $5,502 for the three months ended March 31, 2020. The prior year quarter included a $1,800 reduction in non-cash stock compensation, primarily related to lower estimates of the performance stock units expected to be earned as a result of the impact of COVID-19 on the performance metrics for these awards. Additionally, current year general and administrative expenses include litigation charges of $837.
As a percentage of total revenues, general and administrative expenses increased to 27% for the three months ended March 31, 2021, as compared to 23% for the three months ended March 31, 2020.

Other Income and Expenses

We recognized total other expense of $440 for the three months ended March 31, 2021, compared to other income of $317 for the three months ended March 31, 2020. The change is primarily due to interest expense recognized in the current period on our current financing agreement with Oberland Capital (the "Oberland Facility"), which began June 30, 2020, and lower investment income from our asset management program as we lowered our investment balances and increased cash reserves.

Income Taxes

We had no income tax expense or benefit for each of the three months ended March 31, 2021 and 2020, due to the incurrence of net operating losses in each of these periods, the benefits of which have been fully reserved. We do not believe that there are any additional tax expenses or benefits currently available.

Liquidity and Capital Resources

Cash Flow Information

As of March 31, 2021, we had cash, cash equivalents, and restricted cash of $46,176, a decrease of $9,433 from $55,609 at December 31, 2020, primarily as a result of capital expenditures related to the biologics processing center in Vandalia, Ohio, and other operating activities.

We had working capital of $112,872 and a current ratio of 6.1x at March 31, 2021, compared to working capital of $122,420 and a current ratio of 6.4x at December 31, 2020. The decrease in working capital and the current ratio at March 31, 2021, as compared to December 31, 2020, was primarily due to cash payments in the quarter, partially offset by higher receivables and inventory balances at the end of the quarter due to increasing sales. We believe we have sufficient cash resources to meet our liquidity requirements for at least the next 12 months based on our expected level of operations.

Our future capital requirements depend on a number of factors including, without limitation, revenue increases consistent with our business plan, cost of products and acquisition and/or development of new products. We could face increasing capital needs. Such capital needs could be substantial depending on the extent to which we are unable to increase revenue.

If we need additional capital in the future, we could draw additional debt proceeds of up to $40,000 from our current financing agreement with Oberland Capital, subject to certain restrictions as set forth in the agreement and described in Note 10 - Long Term Debt in the Notes to Condensed Consolidated Financial Statements. If necessary, we may raise additional funds through public or private equity offerings, debt financings or from other sources. The sale of additional equity would result in dilution to our shareholders. There is no assurance that we will be able to secure funding on terms acceptable to us, or at all. The increasing need for capital could also make it more difficult to obtain funding through either equity or debt. Should additional capital not become available to us as needed, we may be required to take certain actions, such as slowing sales and marketing expansion, delaying regulatory approvals or reducing headcount.
Edgar Online, source Glimpses
See the original post:
AXOGEN : MANAGEMENT'S DISCUSSION AND ANALYSIS OF FINANCIAL CONDITION AND RESULTS OF OPERATIONS (form 10-Q) - marketscreener.com
Texas is about to allow residents to carry handguns without a license or training – Yahoo News
Posted: at 1:53 am
The Texas Senate on Wednesday voted to allow most Texans to carry handguns without any sort of permit or training, sending the legislation to a conference committee with the House, which already passed a similar measure. Gov. Greg Abbott (R) said last week he will sign the bill. The Senate passed permitless carry on a party-line 18-13 vote, "less than a week after it sailed out of a committee created specifically to tackle the legislation," The Texas Tribune reports. Every Republican voted for it, but several voiced concerns about the legislation during debate.
The legislation, considered too fringe during previous legislative sessions, faced opposition from law enforcement groups, firearms instructors, and Democrats. Currently, Texans must undergo four to six hours of training, pass a written exam and shooting proficiency test, and get fingerprinted to carry a handgun.
State Sen. Charles Schwertner (R), who sponsored the bill in the Senate, argued that gun safety is a personal responsibility. "The obligation on the part of the citizen who owns a potentially dangerous weapon to understand gun laws, to become proficient in their handling of their gun, is not absolved," he said. One Republican who showed up to vote for unlicensed carry despite injuries from a car accident collapsed on the Senate floor during debate.
Texans oppose unlicensed carry, 59 percent to 34 percent, according to a University of Texas/Texas Tribune poll from April. When asked, 46 percent of Texans would make gun laws stricter while 30 percent would leave them untouched and 20 percent would loosen them further, the poll found. Three-quarters favor requiring criminal and mental background checks before all gun sales.
"A lot of the [legislative] agenda right now seems at odds with public opinion," said James Henson, co-director of UT/Texas Tribune poll. "Guns is the best example" of Republican lawmakers chasing policies that "come from the most conservative wing of the majority party," he added. "But this is also notable on the abortion questions."
Read the original post:
Texas is about to allow residents to carry handguns without a license or training - Yahoo News
IBM and MIT kickstarted the age of quantum computing in 1981 – Fast Company
Posted: at 1:52 am
In May 1981, at a conference center housed in a chateau-style mansion outside Boston, a few dozen physicists and computer scientists gathered for a three-day meeting. The assembled brainpower was formidable: One attendee, Caltech's Richard Feynman, was already a Nobel laureate and would earn a widespread reputation for genius when his 1985 memoir "Surely You're Joking, Mr. Feynman!: Adventures of a Curious Character" became a bestseller. Numerous others, such as Paul Benioff, Arthur Burks, Freeman Dyson, Edward Fredkin, Rolf Landauer, John Wheeler, and Konrad Zuse, were among the most accomplished figures in their respective research areas.
The conference they were attending, The Physics of Computation, was held from May 6 to 8 and cohosted by IBM and MIT's Laboratory for Computer Science. It would come to be regarded as a seminal moment in the history of quantum computing, not that anyone present grasped that as it was happening.
"It's hard to put yourself back in time," says Charlie Bennett, a distinguished physicist and information theorist who was part of the IBM Research contingent at the event. "If you'd said 'quantum computing,' nobody would have understood what you were talking about."
Why was the conference so significant? According to numerous latter-day accounts, Feynman electrified the gathering by calling for the creation of a quantum computer. "But I don't think he quite put it that way," contends Bennett, who took Feynman's comments less as a call to action than a provocative observation. "He just said the world is quantum," Bennett remembers. "So if you really wanted to build a computer to simulate physics, that should probably be a quantum computer."
[Photo: attendees of the 1981 Physics of Computation conference, courtesy of Charlie Bennett, who isn't in the picture because he took it] Even if Feynman wasn't trying to kick off a moonshot-style effort to build a quantum computer, his talk, and The Physics of Computation conference in general, proved influential in focusing research resources. "Quantum computing was nobody's day job before this conference," says Bennett. "And then some people began considering it important enough to work on."
It turned out to be such a rewarding area for study that Bennett is still working on it in 2021, and he's still at IBM Research, where he's been, aside from the occasional academic sabbatical, since 1972. His contributions have been so significant that he's not only won numerous awards but also had one named after him. (On Thursday, he was among the participants in an online conference on quantum computing's past, present, and future that IBM held to mark the 40th anniversary of the original meeting.)
[Photo: Charlie Bennett, courtesy of IBM] These days, Bennett has plenty of company. In recent years, quantum computing has become one of IBM's biggest bets, as it strives to get the technology to the point where it's capable of performing useful work at scale, particularly for the large organizations that have long been IBM's core customer base. Quantum computing is also a major area of research focus at other tech giants such as Google, Microsoft, Intel, and Honeywell, as well as a bevy of startups.
According to IBM senior VP and director of research Dario Gil, the 1981 Physics of Computation conference played an epoch-shifting role in getting the computing community excited about the possible benefits of quantum physics. "Before then, in the context of computing, it was seen as a source of noise, like a bothersome problem that when dealing with tiny devices, they became less reliable than larger devices," he says. "People understood that this was driven by quantum effects, but it was a bug, not a feature."
Making progress in quantum computing has continued to require setting aside much of what we know about computers in their classical form. From early room-sized mainframe monsters to the smartphone in your pocket, computing has always boiled down to performing math with bits set either to one or zero. But instead of depending on bits, quantum computers leverage quantum mechanics through a basic building block called a quantum bit, or qubit. It can represent a one, a zero, or, in a radical departure from classical computing, both at once.
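That superposition idea can be made concrete with a few lines of ordinary linear algebra. The sketch below, a plain NumPy simulation rather than IBM's tooling or a real quantum device, puts a single simulated qubit into an equal superposition and then samples it, at which point each shot comes out as a definite zero or one.

```python
import numpy as np

# A qubit's state is a length-2 complex vector; |0> is [1, 0].
ket0 = np.array([1.0, 0.0], dtype=complex)

# A Hadamard gate puts |0> into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2            # -> [0.5, 0.5]

# Each measurement "collapses" the qubit to a definite 0 or 1.
shots = np.random.choice([0, 1], size=10, p=probs)
print(probs, shots)
```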
[Photo: Dario Gil, courtesy of IBM] Qubits give quantum computers the potential to rapidly perform calculations that might be impossibly slow on even the fastest classical computers. That could have transformative benefits for applications ranging from drug discovery to cryptography to financial modeling. But it requires mastering an array of new challenges, including cooling superconducting qubits to a temperature only slightly above absolute zero, or -459.67 degrees Fahrenheit.
Four decades after the 1981 conference, quantum computing remains a research project in progress, albeit one that's lately come tantalizingly close to fruition. Bennett says that timetable isn't surprising or disappointing. For a truly transformative idea, 40 years just isn't that much time: Charles Babbage began working on his Analytical Engine in the 1830s, more than a century before technological progress reached the point where early computers such as IBM's own Automatic Sequence Controlled Calculator could implement his concepts in a workable fashion. And even those machines came nowhere near fulfilling the vision scientists had already developed for computing, including "some things that [computers] failed at miserably for decades, like language translation," says Bennett.
"I think that was the first time ever somebody said the phrase 'quantum information theory.'"
In 1970, as a Harvard PhD candidate, Bennett was brainstorming with fellow physics researcher Stephen Wiesner, a friend from his undergraduate days at Brandeis. Wiesner speculated that quantum physics would make it possible to send, "through a channel with a nominal capacity of one bit, two bits of information; subject however to the constraint that whichever bit the receiver choose to read, the other bit is destroyed," as Bennett jotted in notes which, fortunately for computing history, he preserved.
[Photo: Charlie Bennett's 1970 notes on Stephen Wiesner's musings about quantum physics and computing, courtesy of Charlie Bennett] "I think that was the first time ever somebody said the phrase 'quantum information theory,'" says Bennett. "The idea that you could do things of not just a physics nature, but an information processing nature with quantum effects that you couldn't do with ordinary data processing."
Like many technological advances of historic proportions (AI is another example), quantum computing didn't progress from idea to reality in an altogether predictable and efficient way. It took 11 years from Wiesner's observation until enough people took the topic seriously enough to inspire the Physics of Computation conference. Bennett and the University of Montreal's Gilles Brassard published important research on quantum cryptography in 1984; in the 1990s, scientists realized that quantum computers had the potential to be exponentially faster than their classical forebears.
All along, IBM had small teams investigating the technology. According to Gil, however, it wasn't until around 2010 that the company had made enough progress that it began to see quantum computing not just as an intriguing research area but as a powerful business opportunity. "What we've seen since then is this dramatic progress over the last decade, in terms of scale, effort, and investment," he says.
[Photo: IBM's superconducting qubits need to be kept chilled in a super fridge, courtesy of IBM] As IBM made that progress, it shared it publicly so that interested parties could begin to get their heads around quantum computing at the earliest opportunity. Starting in May 2016, for instance, the company made quantum computing available as a cloud service, allowing outsiders to tinker with the technology in a very early form.
"It is really important that when you put something out, you have a path to deliver."
"One of the things that road maps provide is clarity," he says, allowing that "road maps without execution are hallucinations, so it is really important that when you put something out, you have a path to deliver."
Scaling up quantum computing into a form that can trounce classical computers at ambitious jobs requires increasing the number of reliable qubits that a quantum computer has to work with. When IBM published its quantum hardware road map last September, it had recently deployed the 65-qubit IBM Quantum Hummingbird processor, a considerable advance on its 5- and 27-qubit predecessors. This year, the company plans to complete the 127-qubit IBM Quantum Eagle. And by 2023, it expects to have a 1,000-qubit machine, the IBM Quantum Condor. It's this machine, IBM believes, that may have the muscle to achieve quantum advantage by solving certain real-world problems faster than the world's best supercomputers.
Essential though it is to crank up the supply of qubits, the software side of quantum computing's future is also under construction, and IBM published a separate road map devoted to the topic in February. Gil says that the company is striving to create a frictionless environment in which coders don't have to understand how quantum computing works any more than they currently think about a classical computer's transistors. An IBM software layer will handle the intricacies (and meld quantum resources with classical ones, which will remain indispensable for many tasks).
"You don't need to know quantum mechanics, you don't need to know a special programming language, and you're not going to need to know how to do these gate operations and all that stuff," he explains. "You're just going to program with your favorite language, say, Python. And behind the scenes, there will be the equivalent of libraries that call on these quantum circuits, and then they get delivered to you on demand."
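As a rough illustration of the frictionless model Gil describes, the sketch below imagines what calling a quantum-backed routine from everyday Python might look like. Every name in it is invented for illustration and does not correspond to a real IBM API; the point is only the shape of the abstraction, in which a library decides behind the scenes whether quantum circuits or classical code serve the request.

```python
from typing import Sequence

# Hypothetical facade, for illustration only: none of these names are real APIs.
def estimate_risk(portfolio: Sequence[float]) -> float:
    """Ordinary Python entry point; callers never touch gates or circuits."""
    if _worth_running_on_quantum(portfolio):
        # Behind the scenes a library would build circuits, submit them to a
        # cloud quantum backend, and post-process the sampled results.
        return _submit_to_quantum_service(portfolio)
    # Classical fallback for problem sizes where circuits offer no advantage.
    return sum(w * w for w in portfolio) ** 0.5

def _worth_running_on_quantum(portfolio: Sequence[float]) -> bool:
    return len(portfolio) > 10_000          # placeholder heuristic

def _submit_to_quantum_service(portfolio: Sequence[float]) -> float:
    raise NotImplementedError("stand-in for a managed quantum cloud call")

print(estimate_risk([0.25, 0.25, 0.5]))     # small case, served classically
```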
[Photo: IBM is still working on making quantum computing ready for everyday reality, but it's already worked with designers to make it look good; courtesy of IBM] "In this vision, we think that at the end of this decade, there may be as many as a trillion quantum circuits that are running behind the scenes, making software run better," Gil says.
Even if IBM clearly understands the road ahead, there's plenty left to do. Charlie Bennett says that quantum researchers will overcome remaining challenges in much the same way that he and others confronted past ones. "It's hard to look very far ahead, but the right approach is to maintain a high level of expertise and keep chipping away at the little problems that are causing a thing not to work as well as it could," he says. "And then when you solve that one, there will be another one, which you won't be able to understand until you solve the first one."
As for Bennett's own current work, he says he's particularly interested in the intersection between information theory and cosmology, "not so much because I think I can learn enough about it to make an original research contribution, but just because it's so much fun to do." He's also been making explainer videos about quantum computing, a topic whose reputation for being weird and mysterious he blames on inadequate explanation by others.
"Unfortunately, the majority of science journalists don't understand it," he laments. "And they say confusing things about it; painfully, for me, confusing things."
For IBM Research, Bennett is both a living link to its past and an inspiration for its future. "He's had such a massive impact on the people we have here, so many of our top talent," says Gil. "In my view, we've accrued the most talented group of people in the world, in terms of doing quantum computing. So many of them trace it back to the influence of Charlie." Impressive though Bennett's 49-year tenure at the company is, the fact that he's seen and made so much quantum computing history, including attending the 1981 conference, and is here to talk about it is a reminder of how young the field still is.
Harry McCracken is the technology editor for Fast Company, based in San Francisco. In past lives, he was editor at large for Time magazine, founder and editor of Technologizer, and editor of PC World.
Visit link:
IBM and MIT kickstarted the age of quantum computing in 1981 - Fast Company
Here’s the lowdown on how quantum computing affects the Middle East – SCOOP EMPIRE
Posted: at 1:52 am
By Sherif Awad
In 1980, American physicist Paul Benioff set the first milestone for developing a new kind of computer, the quantum computer, that is far more powerful than our normal computers: he demonstrated the theoretical possibility of quantum computers.
Quantum computers are based on quantum mechanics, and they can perform computations much faster than normal computers. They can solve complex problems that the fastest supercomputer cannot solve. A quantum computer can solve in one second a problem that would take one week on a normal computer; in one demonstration, a real quantum computer solved in 200 seconds a problem that would take the world's fastest supercomputer 10,000 years. Quantum computers can be used to tackle the most complex problems in fields from finance and security to cancer research. Scientists expect to have real uses of quantum computers by the end of next year, full applications by 2026, and commercial use by 2030. Reaching commercial-scale quantum computers is going to require revolutionary discoveries in physics, material science, computer science, and mathematics.
Communication on the internet is protected by cryptography, which shields our information as it travels over and is stored on the internet. Quantum computers are so powerful that they could break the world's most complex cryptography in seconds, and that poses a threat to the world: they could break into government, enterprise, or global organization systems. IT organizations around the world are working on creating new cryptographic methods that cannot be broken by quantum computers, and the US National Institute of Standards and Technology (NIST) is working on standardizing cryptographic algorithms that resist quantum attack.
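To put rough numbers on that threat: Shor's algorithm would break today's widely used public-key schemes such as RSA outright, while Grover's algorithm "only" square-roots the work of brute-forcing a symmetric key, roughly halving its effective bit strength. The sketch below is back-of-the-envelope arithmetic, not a security analysis.

```python
# Rough effect of Grover's algorithm on symmetric keys: searching a space of
# size 2**n takes on the order of 2**(n/2) quantum steps, so the effective
# strength is roughly halved. (Shor's algorithm breaks RSA/ECC outright.)
def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

for bits in (128, 192, 256):
    print(f"AES-{bits}: ~2^{grover_effective_bits(bits)} quantum search steps")

# A common mitigation is to use larger symmetric keys and to move public-key
# cryptography to the post-quantum algorithms NIST is standardizing.
```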
Quantum computers exist today, but they are not as powerful as we need them to be in order to solve the most complex problems that we have. A quantum computer's capacity can be measured in qubits, the basic unit of quantum information, analogous to bits in normal computers. IT giants such as IBM, Google, and Microsoft are battling for quantum supremacy. Google claimed quantum supremacy in 2019 by building a quantum computer with 53 qubits. In 2020, a team of Chinese scientists developed a quantum computer able to perform a single task 100 trillion times faster than the world's fastest supercomputer, and China has invested $10 billion in the country's National Laboratory for Quantum Information Sciences. That does not mean they have reached quantum supremacy in a general sense, as their machine is specialized to do a single task really fast, unlike Google's more general quantum computer.
Any security that we have today will be useless by 2030. That means the IT systems we rely on today, such as electricity, networks, hospitals and supply chains, could be brought down in seconds. Governments, enterprises and global organizations need to change their security systems to be post-quantum proof, which means investing heavily in new, agile security systems that can adapt to new security protocols as they arise.
The US Congress passed the National Quantum Initiative Act, which requires that the president be advised about developments in the field, and the World Economic Forum has advised that we need to build quantum literacy programs in governments.
Imagine that some encrypted, secure data got stolen from you today. In a few years, quantum computers will be able to decrypt that stolen data. If your data becomes irrelevant in five years, then you shouldn't care about it being decrypted in five years. But if it is government data, then you definitely need to start thinking about quantum security today.
Go here to see the original:
Here's the lowdown on how quantum computing affects the Middle East - SCOOP EMPIRE
IBM Extends HBCU Initiatives Through New Industry Collaborations – PRNewswire
Posted: at 1:52 am
ARMONK, N.Y., May 7, 2021 /PRNewswire/ --IBM (NYSE: IBM) announced today it has extended its IBM Global University Program with historically black colleges and universities (HBCUs) to 40 schools.
IBM is now working with the American Association of Blacks in Higher Education (AABHE), 100 Black Men of America, Inc., Advancing Minorities' Interest in Engineering (AMIE) and the United Negro College Fund (UNCF) to better prepare HBCU students for in-demand jobs in the digital economy.
In parallel, the IBM Institute for Business Value released a new report with broad-ranging recommendations on how businesses can cultivate more diverse, inclusive workforces by establishing similar programs and deepening engagement with HBCUs.
IBM's HBCU program momentum has been strong in an environment where only 43% of leaders across industry and academia believe higher education prepares students with necessary workforce skills.* In September 2020, IBM announced the investment of $100 million in assets, technology and resources to HBCUs across the United States. Through IBM Global University Programs, which include the continuously enhanced IBM Academic Initiative and IBM Skills Academy, IBM has now:
Building on this work, IBM and key HBCU ecosystem partners are now collaborating to expedite faculty and student access and use of IBM's industry resources.
In its new report, "Investing in Black Technical Talent: The Power of Partnering with HBCUs," IBM describes how HBCUs succeed in realizing their mission and innovate to produce an exceptional talent pipeline, despite serious funding challenges. IBM explains its approach to broad-based HBCU collaboration with a series of best-practices for industry organizations.
IBM's series of best practices include:
To download the full report, please visit: LINK.
HBCU students continue to engage with IBM on a wide range of opportunities. These include students taking artificial intelligence, cybersecurity or cloud e-learning courses and receiving a foundational industry badge certificate in four hours. Many also attend IBM's virtual student Wednesday seminars with leading experts, such as IBM neuroscientists who discuss the implications of ethics in neurotechnology.
Statements from Collaborators "HBCUs typically deliver a high return on investment. They have less money in their endowments, faculty is responsible for teaching a larger volume of classes per term and they receive less revenue per student than non-HBCUs. Yet, HBCUs produce almost a third of all African-American STEM graduates,"** said Valinda Kennedy, HBCU Program Manager, IBM Global University Programs and co-author of "Investing in Black Technical Talent: The Power of Partnering with HBCUs.""It is both a racial equity and an economic imperative for U.S. industry competitiveness to develop the most in-demand skills and jobs for all students and seek out HBCU students who are typically underrepresented in many of the most high-demand areas."
"100 Black Men of America, Inc. is proud to collaboratewith IBM to deliver these exceptional and needed resources to the HBCU community and students attending these institutions. The 100 has long supported and sought to identify mechanisms that aid in the sustainability of historically black colleges and universities. This collaboration and the access and opportunities provided by IBM will make great strides in advancing that goal," stated 100 Black Men of America Chairman Thomas W. Dortch, Jr.
"The American Association of Blacks in Higher Education is proud to collaborate with IBM," said Dereck Rovaris, President, AABHE. "Our mission to be the premier organization to drive leadership development, access and vital issues concerning Blacks in higher education works perfectly with IBM's mission to lead in the creation, development and manufacture of the industry's most advanced information technologies.Togetherthis collaboration will enhance both organizations and the many people we serve."
"IBM is a strong AMIE partnerwhose role is strategic and support is significant in developing a diverse engineering workforce through AMIE and our HBCU community.IBM's presence on AMIE's Board of Directors provides leadership for AMIE's strategies,key initiatives and programsto achieve our goal of a diverse engineering workforce," said Veronica Nelson, Executive Director, AMIE."IBM programslike the IBM Academic Initiative and the IBM Skills Academyprovideaccess, assets and opportunities for our HBCU faculty and students to gain high-demand skills in areas like AI, cybersecurity, blockchain, quantum computing and cloud computing. IBM is a key sponsor of the annual AMIE Design Challenge introducing students to new and emerging technologies through industry collaborations and providing experiential activities like IBM Enterprise Design Thinking, which is the foundational platform for the Design Challenge. The IBM Masters and PhD Fellowship Awards program supports our HBCU students with mentoring, collaboration opportunities on disruptive technologies as well as a financial award. The IBM Blue Movement HBCU Coding Boot Camp enables and recognizes programming competencies. IBM also sponsors scholarships for the students at the 15 HBCU Schools of Engineering to support their educational pursuits. IBM continues to evolve its engagement with AMIE and the HBCU Schools of Engineering."
"The IBM Skills Academy is timely in providing resources that support the creativity of my students in the Dual Degree Engineering Program at Clark Atlanta University," said Dr. Olugbemiga A. Olatidoye, Professor, Dual Degree Engineering and Director, Visualization, Stimulation and Design Laboratory, Clark Atlanta University. "It also allows my students to be skillful in their design thinking process, which resulted in an IBM digital badge certificate and a stackable credential for their future endeavors."
"We truly value the IBM skills programs and have benefitted from the Academic Initiative, Skills Academy and Global University Awards across all five campuses," saidDr. Derrick Warren, Interim Associate Dean and MBA Director, Southern University. "Over 24 faculty and staff have received instructor training and more than 300 students now have micro-certifications in AI, cloud, cybersecurity, data science, design thinking, Internet of Things, quantum computing and other offerings."
"At UNCF, we have a history of supporting HBCUs as they amplify their outsized impact on the Black community, and our work would not be possible without transformational partnerships with organizations like IBM and their IBM Global University Programs," said Ed Smith-Lewis, Executive Director of UNCF's Institute for Capacity Building. "We are excited to bring the resources of IBM to HBCUs, their faculty, and their students."
"IBM Skills Academy is an ideal platform for faculty to teach their students the latest in computing and internet technologies," said Dr. Sridhar Malkaram, West Virginia State University. "It helped the students in my Applied Data Mining course experience the state of the art in data science methods and analysis tools. The course completion badge/certificate has been an additional and useful incentive for students, which promoted their interest. The Skills Academy courses can be advantageously adapted by faculty, either as stand-alone courses or as part of existing courses."
About IBM: IBM is a leading global hybrid cloud, AI and business services provider. We help clients in more than 175 countries capitalize on insights from their data, streamline business processes, reduce costs and gain the competitive edge in their industries. For more information visit: https://newsroom.ibm.com/home.
*King, Michael, Anthony Marshall, Dave Zaharchuk. "Pursuit of relevance: How higher education remains viable in today's dynamic world." IBM Institute for Business Value. Accessed March 23, 2021. https://www.ibm.com/thought-leadership/institute-business-value/report/education-relevance
**Source: National Center for Education Statistics, Integrated Postsecondary Education Data System
IBM Media Relations Contact: Carrie Bendzsa, [email protected], +1 613-796-3880
SOURCE IBM
Read more from the original source:
IBM Extends HBCU Initiatives Through New Industry Collaborations - PRNewswire
Here comes the worlds first ever multi-node quantum network – TelecomTV
Posted: at 1:52 am
Dutch scientists working at the quantum research institute QuTech in the city of Delft, southeast of The Hague in the Netherlands, have built the first ever multi-node quantum network by managing to connect three quantum processors. The nodes can both store and process qubits (quantum bits) and the researchers have provided a proof of concept that quantum networks are not only achievable but capable of being scaled-up in size eventually to provide humanity with a quantum Internet.
When that happens the world will become a very different place. With massive new computing capabilities made available via the power of sub-atomic particles, intractable problems that would currently take many years to solve (if they could be solved at all) using conventional silicon-based supercomputers will be solved within seconds.
The ultimate goal is to enable the construction of a world-wide quantum Internet wherein quantum mechanics will permit quantum devices to communicate and conjoin to create large quantum clusters of exponentially great power easily capable of solving currently unsolvable problems at enormous speed.
Qubits, the basic building blocks of quantum computers, exist in a quantum state where, unlike traditional binary computing, in which a bit represents a value of either zero or one, a qubit can exist as both zero and one simultaneously. Thus quantum computers can perform an incredible number of calculations at once, but, due to the inherent instability of the quantum state, qubits can collapse the instant they are exposed to an outside environment and must "decide" to take the value of a zero or a one. This creates the strong possibility that qubit calculations may or may not be reliable and verifiable, and so a great deal of research is underway on error correction systems that would guarantee that the results arrived at in quantum calculations are true.
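That error-correction work builds on an old idea: store the same logical information redundantly and take a vote. The sketch below simulates only the classical three-copy repetition code; real quantum error correction is subtler, since qubits cannot simply be copied, but the redundancy-and-majority-vote intuition carries over.

```python
import random

def encode(bit):                     # repeat the logical bit three times
    return [bit, bit, bit]

def noisy_channel(bits, p=0.1):      # each copy flips independently with prob p
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):                    # majority vote recovers the logical bit
    return int(sum(bits) >= 2)

trials = 100_000
failures = sum(decode(noisy_channel(encode(0))) != 0 for _ in range(trials))
print(f"logical error rate ~{failures / trials:.3f} vs raw error rate 0.100")
# Expect roughly 0.028: the code only fails when two or more copies flip.
```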
Say hello to Bob, Alice and Charlie, just don't look at them
A quantum Internet will come into being and continue to exist because of quantum entanglement, a remarkable physical property whereby a group of particles interact or share spatial proximity such that the quantum state of each particle cannot be determined independently of the state of the others, even when the particles are physically separated by great distances.
In other words, quantum particles can be coupled into a single fundamental connection regardless of how far apart they might be. The entanglement means that a change applied to one of the particles will instantly be echoed in the other. In quantum Internet communications, entangled particles can instantly transmit information from a qubit to its entangled other even though that other is in a quantum device on the other side of the world, or the other side of the universe come to that.
For this desired state of affairs to maintain itself, entanglement must be achieved and maintained for as long as is required. There have already been many laboratory demonstrations, commonly using fibre optics, of a physical link between two quantum devices, but two nodes do not a network make. That's why QuTech's achievement is so important. In a system configuration reminiscent of the role routers play in a traditional network environment, the Dutch scientists placed a third node with a physical connection to the two others, enabling entanglement between it and them. Thus a network was born. The researchers christened the three nodes Bob, Alice and Charlie.
So, Bob has two qubits: a memory qubit that permits the storage of an established quantum link (in this case with Alice) and a communications qubit that permits a link with node Charlie. Once the links with Alice and Charlie are established, Bob locally connects its own two qubits, with the result that an entangled three-node network exists and Alice and Charlie are linked at the quantum level despite there being no physical link between them. QuTech has also invented the world's first quantum network protocol, which flags up a message to the research scientists when entanglement is successfully completed.
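The three-node trick QuTech demonstrated is known as entanglement swapping. The sketch below builds the idealized textbook circuit with Qiskit's circuit-construction calls, with qubit 0 standing in for Alice, qubits 1 and 2 for Bob's memory and communication qubits, and qubit 3 for Charlie. It is a toy model of the concept, not QuTech's hardware protocol, and the final corrections that depend on Bob's measurement outcomes are left as a comment.

```python
from qiskit import QuantumCircuit

qc = QuantumCircuit(4, 2)

# Bell pair between Alice (qubit 0) and Bob's memory qubit (1).
qc.h(0)
qc.cx(0, 1)

# Bell pair between Bob's communication qubit (2) and Charlie (3).
qc.h(2)
qc.cx(2, 3)

# Bob performs a Bell-state measurement on his two qubits ...
qc.cx(1, 2)
qc.h(1)
qc.measure([1, 2], [0, 1])

# ... which leaves Alice (0) and Charlie (3) sharing a Bell pair, up to X/Z
# corrections chosen from Bob's two classical outcomes, even though their
# qubits never interacted directly.
print(qc.draw())
```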
The next step will be to add more qubits to Bob, Alice and Charlie and develop hardware, software and a full set of protocols that will form the foundation blocks of a quantum Internet. That will be laboratory work but later on the network will be tested over real-world, operational telco fibre. Research will also be conducted into creating compatibility with data structures already in use today.
Another problem to be solved is how to enable the creation of a large-scale quantum network by increasing the distance that entanglement can be maintained. Until very recently that limit was 100 kilometres but researchers in Chinese universities have just ramped it up to 1,200 kilometres.
The greater the distance of travel, the more quantum devices and intermediary nodes can be deployed and the more powerful and resilient a quantum network and Internet will become. That will enable new applications such as quantum cryptography, completely secure, utterly private and unhackable comms and cloud computing, the discovery of new drugs and other applications in fields such as finance, education, astrophysics, aeronautics, telecoms, medicine, chemistry and many others that haven't even been thought of yet.
It might even provide answers to the riddle of the universal oneness of which we are all a minuscule part. Maybe the answer to the question of life, the universe and everything will be 43, as calculated by the supercomputer Deep Thought, rather than the 42 postulated by Douglas Adams in "The Hitchhiker's Guide to the Galaxy". Even if that is the case, given localised quantum relativity effects and Heisenberg's Uncertainty Principle it could easily be another number, until you look at it, when it turns into a living/dead cat.
Read the original post:
Here comes the worlds first ever multi-node quantum network - TelecomTV
Crystal Ball Gazing at Nvidia: R&D Chief Bill Dally Talks Targets and Approach – HPCwire
Posted: at 1:52 am
There's no quibbling with Nvidia's success. Entrenched atop the GPU market, Nvidia has ridden its own inventiveness and growing demand for accelerated computing to meet the needs of HPC and AI. Recently it embarked on an ambitious expansion by acquiring Mellanox (interconnect) and is now working to complete the purchase of Arm (processor IP). Along the way, it jumped into the systems business with its DGX line. What was mostly a GPU company is suddenly quite a bit more.
Bill Dally, chief scientist and senior vice president, research, argues that R&D has been and remains a key player in Nvidia's current and long-term success. At GTC21 this spring Dally provided a glimpse into Nvidia's R&D organization and a couple of high priority projects. Like Nvidia writ large, Dally's research group is expanding. It recently added a GPU storage systems effort and just started an autonomous vehicle research group, said Dally.
Presented here is a snapshot of the Nvidia R&D organization and a little about its current efforts as told by Dally plus a few of his Q&A responses at the end of the article.
"[We] are loosely organized into a supply side and demand side. The supply side of the research lab tries to develop technology that goes directly to supply our product needs to make better GPUs; [these are] VLSI design methodologies to architect the GPUs, better GPU architectures, better networking technology to connect CPUs together and into the larger datacenter, programming systems, and we recently started a new GPU storage systems group," said Dally.
The demand side of Nvidia Research aims to drive demand for GPUs. "We actually have three different graphics research groups, because one thing we have to continually do is raise the bar for what is good real-time graphics. If it ever becomes good enough, eventually, the integrated graphics that you get for free with certain CPUs will become good enough. And then there'll be no demand for our discrete GPUs anymore. But by introducing ray tracing, by introducing better illumination, both direct and indirect, we're able to constantly raise the bar on what people demand for good real-time graphics."
Not surprisingly, AI has quickly become a priority. "We have actually five different AI labs because AI has become such a huge driver for demand for GPUs," he said. A couple of years ago the company opened a robotics lab. "We believe that Nvidia GPUs will be the brains of all future robots, and we want to lead that revolution as robots go from being very active positioning machines to being things that interact with their environments and interact with humans. We've also just started an autonomous vehicle research group to look at technology that will lead the way for our DRIVE products."
Occasionally, said Dally, Nvidia will pull people together from the different research groups for what are called moonshots or high-impact projects. "We did one of those that developed the TTU [tree traversal unit], what is now called the RT core, to introduce ray tracing to real-time graphics. We did one for a research GPU that later turned into Volta. [Moonshots] are typically larger projects that try to push technology further ahead, integrating concepts from many of the different disciplines," said Dally.
A clear focus on productizing R&D has consistently paid off for Nvidia contends Dally, Over the years, weve had a huge influence on Nvidia technology. Almost all of ray tracing at Nvidia started within a Nvidia Research. Starting with the development of optics and the software ray tracer that forms the core of our professional graphics offering. More recently developing the RT cores that have brought ray tracing to real time and consumer graphics. We got Nvidia into networking when we developed NVSwitch originally as a research project back in about 2012. And we got Nvidia into deep learning and AI on a collaborative project with Stanford that led to the development of cuDNN, he said.
So much for history. Today, like many others, Nvidia is investigating optical communications technology to overcome speedbumps imposed by existing wire-based technology. Dally discussed some of Nvidia's current efforts.
"When we started working on NVLink and NVSwitch, it was because we had this vision that we're not just building one GPU, but we're building a system that incorporates many GPUs, switches and connections to the larger datacenter. To do this, we need technology that allows our GPUs to communicate with each other and other elements of the system, and this is becoming harder to do for two reasons," he said.
Slowing switching times and wiring constraints are the main culprits. For example, said Dally, using 26-gauge cable you can go at different bit rates (25, 50, 100, 200 Gbps), but at 200 Gbps you're down to a one-meter reach, which is barely enough to reach a top-of-rack switch from a GPU; if you speed up to 400 Gbps, it's going to be half a meter.
"What we want is to get as many bits per second off a millimeter of chip edge as we can, because if you look forward, we're going to be building 100 terabit switches, and we need to get 100 terabits per second off of that switch. So we'd like to be at more than a terabit per second per millimeter of chip edge, and we'd like to be able to reach at least 10 meters. It turns out if you're building something like a DGX SuperPod, you actually need very few cables longer than that. And we'd like to have the energy per bit be down in the one picojoule per bit range. The technology that seems most promising to do this is dense wavelength division multiplexing with integrated silicon photonics."
Conceptually the idea is pretty straightforward.
"This chart (below) shows the general architecture. We start with a laser comb source. This is a laser that produces a number of different colors of light. I say different colors [but they] are imperceptibly different, by like 100 gigahertz in frequency, but it produces these different colors of light and sends them over a supply fiber to our transmitter. In the transmitter, we have a number of ring resonators that are able to individually modulate (on-and-off) the different colors of light. So we can take one color of light and modulate it at some bit rate on and off. We do this simultaneously in parallel on all of the other colors and get a bit rate which is a product of the number of colors we have and the bit rate we're switching per color. We send that over a fiber with a reach of 10-to-100 meters to our receiving integrated circuit. [There] we pick off with ring resonators the different colors that are now either on or off with a bitstream and send that to photodetectors and transimpedance amplifiers and on up to the receiver," described Dally.
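The arithmetic behind that scheme is simple: aggregate bandwidth is the number of comb colors times the modulation rate per color, and link power is energy-per-bit times bit rate. The channel count and per-wavelength rate below are illustrative assumptions, not Nvidia's design; only the roughly one picojoule per bit target comes from Dally's talk.

```python
# Illustrative DWDM link budget (assumed figures, not Nvidia specifications).
wavelengths       = 32       # colors supplied by the laser comb source
gbps_per_lambda   = 32       # on/off modulation rate of each ring resonator

link_gbps = wavelengths * gbps_per_lambda          # ~1 Tb/s on one fiber

energy_pj_per_bit = 1.0      # the ~1 pJ/bit target Dally mentions
link_power_watts  = link_gbps * 1e9 * energy_pj_per_bit * 1e-12

print(f"{link_gbps} Gb/s per fiber, ~{link_power_watts:.1f} W of link power")
```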
Dally envisions a future optical DGX where a GPU will communicate via an organic package to an electrical integrated circuit that basically takes that GPU link and modulates the individual ring resonators on the photonic integrated circuit. The photonic integrated circuit accepts the supply fiber from the laser, has the ring resonator modulators, and drives that fiber to the receiver. The receiver will have an NVSwitch and has the same photonic integrated circuit, but now we're on the receive side, where the ring resonators pick the wavelengths off to the electrical integrated circuit, and it drives the switch.
"The key to this is that optical engine," he said, which has a couple of components on it. It has the host electrical interface that receives a short-reach electrical interface from the GPU. It has modulator drivers to modulate the ring resonators, as well as control circuitry, for example, to maintain the temperature of the ring resonators, [which must be at] a very accurate temperature to keep the frequency stable. It then has waveguides to grating couplers that couple that energy into the fiber that goes to the switch.
Many electronic system and device makers are grappling with the interconnect bandwidth issue. Likely at a future GTC, one of Dally's colleagues from product management will be showcasing new optical interconnect systems while the Nvidia R&D team wrestles with a new set of projects.
"I hope that the projects I described for you today [will achieve] future success, but we never know. Some of our projects become the next RT core. Some of our projects [don't work as planned, and] we quietly declare success and move on to the next one. But we are trying to do everything that we think could have impact on Nvidia's future."
POST SCRIPTS: Dally Quick Hits During Q&A
Nvidia R&D Reach Go Where the Talent Is
We are already geographically very, very diverse. I have a map; of course, it's not in the slide deck (shrugs). We're all over North America and Europe. And a couple of years ago, actually, even before the Mellanox acquisition, we opened an office in Tel Aviv. What's driven this geographic expansion has been talent; we find smart people. And there are a lot of smart people who don't want to move to Santa Clara, California. So we basically create an office where they are. I think there are certainly some gaps. One gap I see as a big gap is an office in Asia; there are an awful lot of smart people in Asia, a lot of interesting work coming out of there. And I think Africa and South America clearly have talent pools we want to be tapping as well.
On Fab Technology's Future
So what will be the future of computing when the fab processing technology becomes near sub-nanometer scaling, with respect to quantum computing? That's a good question, but I don't know that I've given that much thought. I think we've got a couple of generations to go. Ampere's in seven nanometers and we see our way clearly to five nanometers and three nanometers, and the devices there operate very classically. Quantum computing, I think if we move there, it's not going to be, you know, with conventional fabs. It's going to be with these Josephson-junction-based technologies that a lot of people are experimenting with, or with photonics, or with trapped ions. We have done a study group to look at quantum computing and have seen it as a technology that is pretty far out. But our strategy is to enable [quantum] by things like the recently announced cuQuantum (SDK) so that we can both help people simulate quantum algorithms until quantum computers are available, and ultimately run the classical part of those quantum computers on our GPUs.
Not Betting on Neuromorphic Tech
The next one is: do you see Nvidia developing neuromorphic hardware to support spiking neural networks? The short answer is no. I've actually spent a lot of time looking at neuromorphic computing. I spend a lot of time looking at a lot of emerging technologies and try to ask the question, could these technologies make a difference for Nvidia? For neuromorphic computing the answer is no, and it sort of comes down to three things. One of them is the spiking representation, which is actually a pretty inefficient representation of data, because you're toggling a line up and down multiple times to signal a number. To have, say, 256 levels of dynamic range, on average you'd have to toggle 128 times, and that [requires] probably 64 times more energy than an integer representation. Then there's the analog computation; we've looked at analog computation and found it to be less energy efficient once you consider the need to convert back to digital to store the results. And then there are the different models they typically come up with. If those models were better than models like BERT for language or ResNet for imaging, people would be using them, but they don't win the competitions. So we're not looking at spiking things right now.
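To put rough numbers on the spiking-versus-integer comparison, the sketch below counts the expected signal events for a rate-coded 8-bit value against the bit toggles of a plain binary encoding. The toggle-count ratio it reports differs from Dally's 64x figure because the energy multiplier depends on the signaling cost model assumed.

```python
# Back-of-the-envelope comparison of a rate-coded spiking representation
# with a plain 8-bit binary encoding. Assumptions (not from the talk):
# values are uniform over 0..255, and each spike or bit toggle costs
# roughly the same energy.

import random

random.seed(0)
values = [random.randrange(256) for _ in range(100_000)]

# Rate coding: one spike per unit of magnitude, so the expected spike count
# per value is the mean of the distribution (about 128).
avg_spikes = sum(values) / len(values)

# Binary coding: 8 bit lines, of which roughly half toggle between two
# successive random values.
avg_bit_toggles = sum(
    bin(a ^ b).count("1") for a, b in zip(values, values[1:])
) / (len(values) - 1)

print(f"average spikes per value:      {avg_spikes:.1f}")       # ~127.5
print(f"average bit toggles per value: {avg_bit_toggles:.1f}")  # ~4.0
print(f"event-count ratio:             {avg_spikes / avg_bit_toggles:.0f}x")
```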
Can DL Leverage Sparsity? Yes.
The next question here is: can deep learning techniques that leverage sparsity, for example the sparse Adam optimizer or sparse attention, take advantage of the sparse matrix multiplication mechanisms in the Ampere tensor cores? That's a bit off topic, but the short answer is yes. I mean, neural networks are fundamentally sparse. [A colleague and] I had a paper at NeurIPS in 2015 where we showed that you can basically prune most convolution layers down to 30 percent density and most fully-connected layers down to 10 percent or less density with no loss of accuracy. So I think that getting to the 50 percent you need to exploit the sparse matrix multiply units in Ampere is actually very easy. And I think we're going to see, actually we've already seen, that applied kind of across the board; on the matrix multiply it gives you a 2x improvement. But over the whole application, which includes all the things that aren't matrix multiply, like the normalization step, the nonlinear operator and the pooling, even considering all of that and Amdahl's law, we still get a 1.5x speedup on BERT applying the sparse tensor cores.
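Dally's 2x-on-matmul versus 1.5x-end-to-end figures are an Amdahl's law calculation. The sketch below works it through, with the matmul fraction of runtime inferred from those two numbers rather than taken from the talk.

```python
# Amdahl's-law sketch for the sparsity numbers Dally quotes: a 2x speedup on
# the matrix-multiply portion of BERT and roughly 1.5x end to end. The matmul
# fraction f is inferred from those two figures, not stated in the talk.

def overall_speedup(f: float, s: float) -> float:
    """Amdahl's law: fraction f of the runtime is accelerated by factor s."""
    return 1.0 / ((1.0 - f) + f / s)

s = 2.0  # sparse tensor cores double matmul throughput
for f in (0.5, 2 / 3, 0.8):
    print(f"matmul fraction {f:.2f} -> overall speedup {overall_speedup(f, s):.2f}x")

# A 1.5x overall speedup with s = 2 corresponds to f = 2/3, i.e. about two
# thirds of BERT's runtime spent in matrix multiplies.
```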
Continue reading here:
Crystal Ball Gazing at Nvidia: R&D Chief Bill Dally Talks Targets and Approach - HPCwire
Harnessing the power of machine learning with MLOps – VentureBeat
Posted: at 1:51 am
MLOps, a compound of machine learning and information technology operations, is a newer discipline involving collaboration between data scientists and IT professionals with the aim of productizing machine learning algorithms. The market for such solutions could grow from a nascent $350 million to $4 billion by 2025, according to Cognilytica. But certain nuances can make implementing MLOps a challenge. A survey by NewVantage Partners found that only 15% of leading enterprises have deployed AI capabilities into production at any scale.
Still, the business value of MLOps can't be ignored. A robust data strategy enables enterprises to respond to changing circumstances, in part by frequently building and testing machine learning technologies and releasing them into production. MLOps essentially aims to capture and expand on previous operational practices while extending these practices to manage the unique challenges of machine learning.
MLOps, which was born at the intersection of DevOps, data engineering, and machine learning, is similar to DevOps but differs in execution. MLOps combines different skill sets: those of data scientists, who specialize in algorithms, mathematics, simulations, and developer tools, and those of operations administrators, who focus on tasks like upgrades, production deployments, resource and data management, and security.
One goal of MLOps is to roll out new models and algorithms seamlessly, without incurring downtime. Because production data can change due to unexpected events, and machine learning models respond well only to scenarios they have seen before, frequent retraining or even continuous online training can make the difference between an optimal and a suboptimal prediction.
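A minimal sketch of that retraining trigger follows, using a two-sample Kolmogorov-Smirnov test as the drift signal. The threshold and the retrain() callback are illustrative assumptions, not part of any specific MLOps product.

```python
# Retrain only when a live feature's distribution drifts away from the
# distribution the model was trained on. Threshold and helpers are assumed.

import numpy as np
from scipy.stats import ks_2samp

DRIFT_P_VALUE = 0.01  # assumed threshold for declaring distribution drift

def has_drifted(reference: np.ndarray, live: np.ndarray) -> bool:
    """True when the KS test rejects 'same distribution' for a feature."""
    _, p_value = ks_2samp(reference, live)
    return p_value < DRIFT_P_VALUE

def maybe_retrain(model, reference_feature, live_feature, retrain):
    """Return a freshly retrained model only if drift is detected."""
    if has_drifted(reference_feature, live_feature):
        return retrain()
    return model

# Example: simulate a feature whose distribution shifts in production.
rng = np.random.default_rng(0)
training_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)
production_feature = rng.normal(loc=0.6, scale=1.0, size=5_000)
print("drift detected:", has_drifted(training_feature, production_feature))  # True
```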
A typical MLOps software stack might span data sources and the datasets created from them, as well as a repository of AI models tagged with their histories and attributes. Organizations practicing MLOps might also have automated pipelines that manage datasets, models, experiments, and software containers, typically based on Kubernetes, to make running these jobs simpler.
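The following sketch shows what a bare-bones version of such a model repository might look like, with each model version tagged with its history and attributes. The layout is an assumption for illustration, not a description of any particular tool.

```python
# Minimal model registry: every registered model version carries the dataset
# it was trained on, its metrics, and arbitrary tags, so its history can be
# queried later. Names and fields are illustrative assumptions.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ModelRecord:
    name: str
    version: str
    dataset_id: str    # which dataset version produced the model
    metrics: dict      # e.g. {"auc": 0.91}
    tags: dict = field(default_factory=dict)
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class ModelRegistry:
    def __init__(self):
        self._records: list[ModelRecord] = []

    def register(self, record: ModelRecord) -> None:
        self._records.append(record)

    def history(self, name: str) -> list[ModelRecord]:
        """Return every registered version of a model, oldest first."""
        return [r for r in self._records if r.name == name]

registry = ModelRegistry()
registry.register(ModelRecord("churn", "1.0.0", "customers-2021-04",
                              {"auc": 0.91}, {"owner": "data-science"}))
print(registry.history("churn"))
```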
At Nvidia, developers running jobs on internal infrastructure must perform checks to guarantee they're adhering to MLOps best practices. First, everything must run in a container to consolidate the libraries and runtimes necessary for AI apps. Jobs must also launch containers with an approved mechanism and run across multiple servers, as well as show performance data to expose potential bottlenecks.
Another company embracing MLOps, software startup GreenStream, incorporates code dependency management and machine learning model testing into its development workflows. GreenStream automates model training and evaluation and leverages a consistent method of deploying and serving each model while keeping humans in the loop.
Given all the elements involved with MLOps, it isn't surprising that companies adopting it often run into roadblocks. Data scientists have to tweak hyperparameters, parameters, and models while managing the codebase for reproducible results. They also need to engage in model validation, in addition to conventional code tests such as unit testing and integration testing. And they have to use a multistep pipeline to retrain and deploy a model, particularly if there's a risk of reduced performance.
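As an example of model validation sitting alongside conventional code tests, the pytest-style check below trains a small model on synthetic data and fails the build if accuracy drops below an assumed floor. The threshold and dataset are illustrative, not drawn from the article.

```python
# A model-validation test that runs next to unit and integration tests: the
# candidate model must clear an agreed accuracy floor on held-out data.
# The 0.80 floor and the synthetic dataset are illustrative assumptions.

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

MIN_ACCURACY = 0.80

def test_candidate_model_meets_accuracy_floor():
    X, y = make_classification(
        n_samples=2_000, n_features=20, n_informative=10, random_state=0
    )
    X_train, X_holdout, y_train, y_holdout = train_test_split(
        X, y, test_size=0.25, random_state=0
    )
    model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)
    accuracy = accuracy_score(y_holdout, model.predict(X_holdout))
    # The build fails if the candidate model regresses below the agreed floor.
    assert accuracy >= MIN_ACCURACY

if __name__ == "__main__":
    test_candidate_model_meets_accuracy_floor()
    print("model validation passed")
```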
When formulating an MLOps strategy, it helps to begin by deriving machine learning objectives from business growth objectives. These objectives, which typically come in the form of KPIs, can have certain performance measures, budgets, technical requirements, and so on. From there, organizations can work toward identifying input data and the kinds of models to use for that data. This is followed by data preparation and processing, which includes tasks like cleaning data and selecting relevant features (i.e., the variables the model uses to make predictions).
The importance of data selection and prep can't be overstated. In a recent Alation survey, a clear majority of employees pegged data quality issues as the reason their organizations failed to successfully implement AI and machine learning. Eighty-seven percent of professionals said inherent biases in the data being used in their AI systems produce discriminatory results that create compliance risks for their organizations.
At this stage, MLOps extends to model training and experimentation. Capabilities like version control can help keep track of data and model qualities as they change throughout testing, and help scale models across distributed architectures. Once machine learning pipelines are built and automated, deployment into production can proceed, followed by the monitoring, optimization, and maintenance of models.
A critical part of monitoring models is governance, which here means adding control measures to ensure the models deliver on their responsibilities. A study by Capgemini found that customers and employees will reward organizations that practice ethical AI with greater loyalty, more business, and even a willingness to advocate for them, and will punish those that don't. The study suggests companies that don't approach the issue thoughtfully can incur both reputational risk and a direct hit to their bottom line.
In sum, MLOps applies to the entire machine learning lifecycle, including data gathering, model creation, orchestration, deployment, health, diagnostics, governance, and business metrics. If successfully executed, MLOps can bring business interest to the fore of AI projects while allowing data scientists to work with clear direction and measurable benchmarks.
Enterprises that ignore MLOps do so at their own peril. There's a shortage of data scientists skilled at developing apps, and it's hard to keep up with evolving business objectives, a challenge exacerbated by communication gaps. According to a 2019 IDC survey, skills shortages and unrealistic expectations from the C-suite are the top reasons for failure in machine learning projects. In 2018, Element AI estimated that of the 22,000 Ph.D.-educated researchers working globally on AI development and research, only 25% are well-versed enough in the technology to work with teams to take it from research to application.
There's also the fact that models frequently drift away from what they were intended to accomplish. Assessing the risk of these failures as part of MLOps is a key step, not only for regulatory purposes but to protect against business impacts. For example, the cost of an inaccurate video recommendation on YouTube is much lower than that of flagging an innocent person for fraud and blocking their account or declining their loan application.
The advantage of MLOps is that it puts operations teams at the forefront of best practices within an organization. The bottleneck created by machine learning work eases with a smarter division of expertise and closer collaboration between operations and data teams, and MLOps tightens that loop.
Here is the original post:
Harnessing the power of machine learning with MLOps - VentureBeat
How Machine Learning is Beneficial to the Police Departments? – CIOReview
Posted: at 1:51 am
It is important to understand the basic nature of machines like computers in order to understand what machine learning is. Computers are devices that follow instructions, and machine learning brings in an interesting outlook, where a computer can learn from experience without the need for explicit programming. Machine learning transports computers to another level, where they can learn intuitively in a manner similar to humans. It has several applications, including virtual assistants, predictive traffic systems, surveillance systems, face recognition, spam and malware filtering, fraud detection, and so on.
The police can utilize machine learning effectively to resolve the challenges that they face. Machine learning helps in predictive policing, where they can prevent crimes and improve public safety. Here are a few ways the police can leverage machine learning to achieve better results.
Pattern recognition
One of the most robust applications of machine learning in policing is in the field of pattern recognition. Crimes can be related and might either be done by the same person or use the same modus operandi. The police can gain an advantage if they can spot the patterns in crimes. The data that the police gather from crimes is essentially unstructured. This data must be organized and sifted through to find the patterns.
Machine learning can help achieve this easily. Machine learning tools can compare numerous crimes and generate a similarity score for each pair. The software can then use these scores to determine whether there are common patterns. The New York Police Department is implementing this, and the tool has been used to crack cases effectively.
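A minimal sketch of that similarity-scoring idea appears below: free-text crime reports are vectorized and compared pairwise. It is an illustration only, not a description of the NYPD's actual tool.

```python
# Compare free-text crime reports pairwise and print a similarity score for
# each pair; high-scoring pairs are candidates for a shared pattern. The
# sample reports are invented for illustration.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

reports = [
    "Burglary through rear window, jewelry taken, overnight",
    "Rear window forced, jewelry and cash stolen while residents away",
    "Vehicle stolen from parking garage, keys left inside",
]

# Vectorize the unstructured report text, then score every pair of reports.
tfidf = TfidfVectorizer(stop_words="english").fit_transform(reports)
scores = cosine_similarity(tfidf)

for i in range(len(reports)):
    for j in range(i + 1, len(reports)):
        print(f"report {i} vs report {j}: similarity {scores[i, j]:.2f}")
```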
Cybersecurity
Cybersecurity is a vital area in today's world. With the internet in extensive use everywhere, cybercriminals are targeting computer systems around the globe. Cybersecurity is critical not just for solving cases but for preventing them proactively, and it can be enhanced with the use of machine learning. Tools that use machine learning can strengthen cybersecurity and proactively prevent crimes.
Predictive analytics
Another machine learning application that can help the police is predictive analytics. This is a powerful application that the police can leverage to achieve substantial results. A tool with predictive analytics features uses machine learning to help the police improve public safety. These tools focus on crime trends; when trends are spotted, law enforcement can proactively take action.
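As a simple illustration of trend spotting, the sketch below flags districts whose recent weekly incident counts run well above their longer-term baseline. The counts and the 20% threshold are invented for the example.

```python
# Flag districts whose recent weekly incident counts exceed their baseline
# by an assumed margin, a crude stand-in for the crime-trend analysis
# described above. All numbers are illustrative.

weekly_counts = {
    "district_1": [12, 11, 13, 12, 14, 19, 21, 22],
    "district_2": [30, 29, 31, 28, 30, 29, 31, 30],
}

RECENT_WEEKS = 3
THRESHOLD = 1.2  # flag when the recent average exceeds the baseline by 20%

for district, counts in weekly_counts.items():
    baseline = sum(counts[:-RECENT_WEEKS]) / (len(counts) - RECENT_WEEKS)
    recent = sum(counts[-RECENT_WEEKS:]) / RECENT_WEEKS
    if recent > THRESHOLD * baseline:
        print(f"{district}: upward trend ({recent:.1f} vs baseline {baseline:.1f})")
```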
Continued here:
How Machine Learning is Beneficial to the Police Departments? - CIOReview
4 Stocks to Watch Amid Rising Adoption of Machine Learning – Zacks.com
Posted: at 1:51 am
Machine learning (ML) has been gaining precedence over the past few years as organizations are rapidly implementing ML solutions to increase efficiency by delivering more accurate results as well as providing a better customer experience. Notably, when it comes to automation, ML has become a driving force as it involves training the Artificial Intelligence (AI) to learn a task and carry it out efficiently, minimizing the need for human intervention.
In any case, ML was already witnessing rapid adoption and the outbreak of the COVID-19 pandemic last year helped in accelerating that demand, as organizations began to rely heavily on automation to carry out their operations.
Markedly, ML is gradually becoming an integral part of various sectors as the trend of digitization picks up. Notably, ML is finding application in the finance sector, where, among other uses, it helps with fraud detection and enables automated trading for investors. Meanwhile, ML is also making its way into healthcare: with the help of algorithms, big volumes of data like healthcare records can be studied to identify patterns related to diseases, allowing practitioners to deliver more efficient and precise treatments.
Moreover, the retail segment has been using ML to optimize the experience of their customers by providing streamlined recommendations. Interestingly, ML also helps retailers in gauging the current market situation and determine the prices of their products accordingly, thereby increasing their competitiveness. Meanwhile, virtual voice assistants are also utilizing ML to learn from previous interactions and in turn, provide a much-improved user experience over time.
In its Top 10 Strategic Technology Trends for 2020 report, Gartner mentioned hyperautomation as one of the top-most technological trends. Notably, it involves the use of advanced technologies like AI and ML to automate processes and augment humans. This means that in tasks where hyperautomation will be implemented, the need for human involvement will gradually reduce as decision-making will increasingly become AI-driven.
Reflective of the positive developments that ML is bringing to various organizations spread across multiple sectors, the ML market looks set to grow. A report by Verified Market Research stated that the ML market is estimated to witness a CAGR of 44.9% from 2020 to 2027. Moreover, businesses are also using Machine Learning as a Service (MLaaS) models to customize their applications with the help of available ML tools. Notably, a report by Orion Market Reports stated that the MLaaS is estimated to grow at an annual average of 43% from 2021 to 2027, as mentioned in a WhaTech article.
Machine learning has been taking the world of technology by storm, allowing computers to learn by studying huge volumes of data and deliver improved results while reducing the need for human intervention. This makes it a good time to look at companies that can make the most of this ongoing trend. Notably, we have selected four such stocks that carry a Zacks Rank #1 (Strong Buy), 2 (Buy) or 3 (Hold). You can see the complete list of today's Zacks #1 Rank stocks here.
Alphabet Inc.'s (GOOGL) Google has been using ML across various applications like YouTube, Gmail, Google Photos, Google Voice Assistant and so on, to optimize the user experience. Moreover, Google's Cloud AutoML allows developers to train high-quality models suited to their business needs. The company currently has a Zacks Rank #1. The Zacks Consensus Estimate for its current-year earnings increased 27.3% over the past 60 days. The company's expected earnings growth rate for the current year is nearly 50%.
NVIDIA Corporation (NVDA) offers ML and analytics software libraries to accelerate the ML operations of businesses. The company currently has a Zacks Rank #2. The Zacks Consensus Estimate for its current-year earnings increased 2.2% over the past 60 days. The company's expected earnings growth rate for the current year is 35.6%.
Microsoft Corporation (MSFT) provides its Azure platform for ML, allowing developers to build, train and deploy ML models. The company currently has a Zacks Rank #2. The Zacks Consensus Estimate for its current-year earnings increased 5.8% over the past 60 days. The company's expected earnings growth rate for the current year is 35.4%.
Amazon.com, Inc. (AMZN) makes use of ML models to train its virtual voice assistant Alexa. Moreover, Amazon's AWS platform offers ML services to suit specific business needs. The company currently has a Zacks Rank #3. The Zacks Consensus Estimate for its current-year earnings increased 11.3% over the past 60 days. The company's expected earnings growth rate for the current year is 31.7%.
Read the rest here:
4 Stocks to Watch Amid Rising Adoption of Machine Learning - Zacks.com