A16z's hyped-up orange balls revealed to be an L2 rollup client – Cointelegraph
Posted: April 25, 2023 at 12:08 am
A series of cryptic tweets depicting orange balls was revealed to be building up hype for a new rollup client for Optimism (OP) called Magi, from the crypto arm of venture capital firm Andreessen Horowitz (A16z).
An April 19 tweet from a16z engineer Noah Citron explained that Magi is written in the programming language Rust and will help improve the client diversity and resilience of the entire OP Stack ecosystem.
The OP Stack refers to the set of software that powers the Ethereum layer-2 solution Optimism. Among the other benefits it provides, it helps simplify the process of creating layer-2 blockchains.
Citron explained that Magi takes the place of a consensus client (often called a rollup client) in the OP Stack and works alongside an execution client, such as op-geth, to sync, allowing the chain to advance by feeding new blocks to the execution client.
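To picture that division of labor, here is a minimal, purely illustrative sketch of a rollup driver loop in Python. Every name in it (fetch_l1_batch, derive_l2_blocks, ExecutionClient) is hypothetical; this is not Magi's code or API (Magi itself is written in Rust), only the pattern described above:

import time

class ExecutionClient:
    """Stand-in for an execution client such as op-geth, reached over RPC."""
    def apply_block(self, block):
        print(f"executing L2 block {block['number']}")

def fetch_l1_batch(l1_block):
    """Hypothetical: read the batch data posted to Ethereum for one L1 block."""
    return {"l1_block": l1_block, "txs": []}

def derive_l2_blocks(batch, next_number):
    """Hypothetical: turn one L1 batch into an ordered list of L2 blocks."""
    return [{"number": next_number, "txs": batch["txs"]}]

def drive_chain(execution, n_l1_blocks=3):
    """The consensus client's main loop: derive blocks and feed them onward."""
    l2_number = 0
    for l1_block in range(n_l1_blocks):
        batch = fetch_l1_batch(l1_block)
        for block in derive_l2_blocks(batch, l2_number):
            execution.apply_block(block)  # this is where the chain advances
            l2_number += 1
        time.sleep(0.1)  # in reality: wait for new L1 data

drive_chain(ExecutionClient())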
The lead engineer for Coinbase's layer-2 solution Base, Jesse Pollak, also chimed in on the announcement, tweeting that Magi "means more decentralization, security, and scale for the OP Stack."
In an April 19 blog post, Citron opined that decentralization increases network security, which is critically important for rollups just as it is for the base layer of Ethereum.
A16z's cryptic orange-circle hype tweets echoed the way Coinbase hyped and introduced its own layer-2 network called Base, which instead featured tweets of a blue circle.
Related: US share of global crypto developers fell 26% in 5 years: a16z
Citron kicked off the hype train with a tweet of an orange circle on April 18 bearing the phrase "coming soon."
Its similarity to the hype before the announcement of Base prompted the crypto community to theorize that another Ethereum layer-2 solution was imminent, before a16z's chief technology officer, Eddy Lazzarin, quashed the rumors.
Citron also noted that Magi is still in development, and while it can currently sync to the Optimism testnet, it will be some months before it is production-ready.
Asia Express: Bitcoin glory on Chinese TikTok, 30M mainland users, Justin Sun saga
Read more:
A16z's hyped-up orange balls revealed to be an L2 rollup client - Cointelegraph
What Is Bitcoin Mining Centralization and Why Is It a Concern? – MUO – MakeUseOf
Posted: at 12:08 am
When building Bitcoin, Satoshi Nakamoto envisioned a decentralized digital currency that could operate without the need for centralized institutions such as banks and governments.
Satoshi did not picture a situation where a few entities controlled a significant portion of the entire network, essentially centralizing power and influence.
Bitcoin mining centralization, a result of market competition over the years, goes against the fundamental principle of cryptocurrency: decentralization.
Bitcoin mining centralization is the concentration of mining power among a few dominant players. Originally, anyone with a computer and internet connection could mine Bitcoin. However, the network grew with time, and as a result, mining became more competitive.
This led to the development of specialized chips known as application-specific integrated circuits (ASICs), which outperformed GPUs and CPUs by being more efficient. Unfortunately, ASICs are expensive and out of reach for most people, and the release of newer, better, but more costly versions exacerbates the situation.
Miners began to form pools to combine their computing power and share the rewards earned. The largest pools also acquire the latest technology to stay ahead of the competition, causing others who can't keep up to drop off.
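To see why pooling appeals to small miners, consider the payout arithmetic. The numbers below are made up for illustration (they are not live network figures); the sketch shows how a proportional pool converts contributed hash power into a steady expected payout:

# Illustrative proportional-payout arithmetic; all numbers are made up.
BLOCK_REWARD = 6.25      # BTC per block (example figure)
pool_hashrate = 100_000  # total pool hash rate, arbitrary units
my_hashrate = 50         # one small miner's contribution

# Solo, a miner this small almost never finds a block; in a pool, every
# block the pool finds pays out pro rata to the work contributed.
my_share = my_hashrate / pool_hashrate
print(f"payout per pool block: {BLOCK_REWARD * my_share:.6f} BTC")  # 0.003125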
Over time, a few large mining pools, including Foundry USA, Antpool, and F2Pool, have come to dominate the Bitcoin mining industry, controlling a significant percentage of the total hash rate at any given time. This defeats the logic of cryptocurrency, which is supposed to distribute power among many players.
Several factors contribute to the centralization of Bitcoin mining, and most of them, such as hardware costs and economies of scale, also apply in a typical competitive market.
While Bitcoin mining centralization is a natural process inspired by competition, it presents a few challenges to the network and ecosystem.
All these challenges require careful consideration and action if the integrity and security of the Bitcoin ecosystem are to be preserved. But how?
Over time, various parties have suggested ways to solve the centralization issue.
Bitcoin Core developer Matt Corallo proposed the BetterHash protocol, which decentralizes the selection of transactions going into a block to individual hardware operators. However, it didn't provide a mechanism to ensure miners would choose transactions that create a balanced difficulty for the Bitcoin network, opening another loophole for centralization. It also introduced inefficiencies due to the need to constantly monitor the network, which made it hard to adopt.
Meanwhile, the crypto mining pool P2Pool suggested decentralizing payouts to address the issue. However, decentralized payouts would disadvantage small miners who rely on consistent payouts to cover costs. The scheme also required low-latency connections between miners and the P2Pool server, meaning that whenever a miner experienced high latency, their mining performance would suffer. For these reasons, there was little incentive to adopt it.
The most direct way to solve Bitcoin mining centralization is to decentralize the mining pools. This can be achieved through incentives that encourage the use of smaller and more decentralized mining pools. A practical incentive would be to fund innovation and experimentation by small miners, leading to better and more competitive mining strategies.
Notably, former Twitter CEO Jack Dorsey's payment company, Block, started working on an open Bitcoin mining system to make the network more decentralized and permissionless. Block aimed to build its own high-performance open-source ASIC and a Bitcoin wallet to make Bitcoin custody more mainstream.
Nevertheless, incentives alone may not be enough to encourage decentralization. Regulatory policies, network upgrades, and community initiatives may also be necessary to encourage the growth of smaller and more decentralized mining pools.
It's difficult to predict whether Bitcoin mining will become more decentralized. As mining becomes more expensive, mining power is likely to remain concentrated among the dominant players.
Due to economies of scale and other bottlenecks, smaller miners continue to struggle against the big dogs. As a result, it would take a tremendous effort by the rest of the Bitcoin network to implement strategies and solutions that address Bitcoin mining centralization.
See the rest here:
What Is Bitcoin Mining Centralization and Why Is It a Concern? - MUO - MakeUseOf
GO ART! presents $210K grants for artists, concerts and cultural … – Orleans Hub
Posted: at 12:08 am
By Tom Rivers, Editor Posted 24 April 2023 at 10:35 am
Photos by Tom Rivers: Gregory Hallock (right), executive director of the Genesee-Orleans Regional Arts Council greets about 75 people during an announcement on Saturday for $210,000 in grants to local arts programs. He is joined at Hoag Library in Albion by Mary Jo Whitman (left), the education and Statewide Community Regrant Program coordinator; and Jodi Fisher (center), the GO ART! administrative assistant.
ALBION Local artists and organizations will see a big increase in funding for cultural programming this year. The Genesee-Orleans Regional Arts Council on Saturday presented grants totaling $210,000 to about 50 different organizations, municipalities and artists for events and projects.
That is about double the $107,800 from a year ago and triple the $70,000 in decentralization grants from the state in 2019.
There was more money available from the state through the Statewide Community Regrant Program. GO ART! has shown it can administer the funding, and the two local counties have shown there is demand for the programs, said Gregory Hallock, GO ART! executive director.
He addressed about 75 people on Saturday at Hoag Library, when the checks were distributed for the programs.
"It's absolutely phenomenal to get this kind of money to give out," he said.
GO ART! officials on Saturday presented checks for $210,000 to about 50 different artists, community organizations and municipalities to support cultural programs in 2023. The funding was presented to about 75 people at the Hoag Library in Albion. Last year there was $107,800 available.
The result is bigger grants, up to $5,000, for many of the projects, and some first-time recipients.
The Statewide Community Regrant Program is funded through the New York State Council on the Arts. Money is available in all 62 counties, with funding regranted by local arts agencies through a peer panel funding process. The Statewide Community Regrant Program consists of three different grants: Reach, Ripple and Spark.
REACH: The GO ART! Community Arts Grants (Reach Grants) provide seed grants to individual artists, collectives and arts organizations for projects and activities that enable Genesee and Orleans communities to experience and engage with the performing, literary, media, and visual arts. Each year the program supports arts projects, including concerts, performances, public art, exhibitions, screenings, festivals, workshops, readings and more.
Veronica Morgan accepts a grant for $5,000 to go toward the "I Was a Hoggee on the Erie Canal" program planned for Oct. 6-7, which will include a canal boat in Orleans County, artists and other entertainers.
ORLEANS COUNTY REACH GRANTEES:
Tony Barry, an artist from Holley, receives a grant to paint a mural on the back of the Community Free Library building. The mural will have an Erie Canal theme and will include a portrait of Myron Holley, the village's namesake and an early champion of the canal.
GENESEE COUNTY REACH GRANTEES:
Sara Vacin, executive director of GLOW Out, said a grant will help fund the GLOW Pride Fest on June 9 in Batavia.
RIPPLE: The GO ART! Individual Artist Commission (Ripple Grant) supports local, artist-initiated activity, and highlights the role of artists as important members of the community. The Commission is for artistic projects with outstanding artistic merit that work within a community setting.
ORLEANS COUNTY RIPPLE GRANTEES:
GENESEE COUNTY RIPPLE GRANTEES:
Alex Fitzak, a member of the band Vette, accepted a grant on behalf of an Albion concert series organized by the Albion Merchants Association.
SPARK: The Arts Education Program (Spark Grant) supports arts education projects for youth and/or senior learners. Emphasis is placed on the depth and quality of the creative process through which participants learn through or about the arts. Projects must focus on the exploration of art and the artistic process.
ORLEANS COUNTY SPARK GRANTEES:
GENESEE COUNTY SPARK GRANTEES:
Todd Bensley, one of the leaders of Friends of Boxwood, accepts a grant to support the Boxwood at Night event this summer at Boxwood Cemetery in Medina. Boxwood is a first-time recipient of a GO ART! grant. The Boxwood at Night event includes a cemetery tour, ghost walk and music.
See original here:
GO ART! presents $210K grants for artists, concerts and cultural ... - Orleans Hub
Unlocking agricultures full potential with blockchain and innovative tech – Cointelegraph
Posted: at 12:08 am
Blockchain has been recognized for its potential to transform finance and other industries that rely on data, but what happens when innovation meets the world's oldest industry: agriculture? It turns out that blockchain has a lot to offer to the food and agriculture sectors, especially when merged with other innovative technologies such as artificial intelligence (AI), satellites and the Internet of Things (IoT).
The agricultural sector can join the tech revolution to upgrade every aspect that has to do with transactions and data. For example, blockchain could streamline processes related to the supply chain by increasing traceability and bringing automation to the table.
A report from InsightAce Analytics found that blockchain in the agriculture and food supply chain is a market valued at over $280 million as of 2022 and is expected to grow to over $7 billion by 2031, demonstrating a compound annual growth rate (CAGR) of 43.76% during that period.
Thanks to its unique architecture that involves decentralization, blockchain ensures the highest possible degree of transparency and traceability, which are key elements in the agricultural sector. Decentralized networks enable participants, including farmers, producers, retailers and exporters, to monitor and address major challenges showing up in the supply chain. Eventually, blockchain records can be used for analysis purposes to improve various aspects of the supply chain.
The adoption of blockchain in agriculture can also help regulatory compliance and reporting. By ensuring the provision of accurate, up-to-date, tamper-proof data, stakeholders can make better-informed decisions and implement proper corporate governance. Decentralized networks also simplify the distribution of certification data among relevant parties.
Besides transparency, blockchain can facilitate other advancements in the agricultural sector. For instance, it can enable better management of land rights, more efficient food safety tracking, and enhanced traceability of inputs like seeds and fertilizers.
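As a rough illustration of the traceability idea, each supply-chain event can embed a hash of the previous record, so tampering with any earlier entry breaks every later link. The sketch below is generic Python and assumes nothing about any particular platform's data model:

import hashlib, json

def add_event(chain, event):
    """Append a record that cryptographically references the previous one."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"event": event, "prev_hash": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)

chain = []
add_event(chain, {"actor": "farmer", "action": "harvest", "lot": "A1"})
add_event(chain, {"actor": "exporter", "action": "ship", "lot": "A1"})
# Editing an earlier record changes its hash, which breaks every later
# record's prev_hash link; that is what makes the log traceable.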
Tech giants have realized the potential of decentralized ledger technology for agriculture. For example, IBM provides businesses with a permissioned blockchain platform called IBM Food Trust, which offers multiple features, including proof of origin, traceability, fraud monitoring and documentation, among others.
Agricultural companies can also leverage blockchain solutions that rely on public networks, which ensure a higher degree of decentralization and security. One example is Dimitra, an AgTech company that aims to help farmers reduce the amount of labor required to complete manual tasks by integrating its technology stack, which combines blockchain, AI, IoT, drones and satellites.
Source: Dimitra
Dimitra offers digital solutions to help farmers gather data to make smarter and faster decisions to improve their crop yields and increase sustainability.
For Dimitra CEO Jon Trask, the integration of blockchain and other innovative technologies into agricultural processes is natural and imperative. He said: "Every smallholder farmer, regardless of economic status, should be able to benefit from simple, beautiful and useful technology, because when farmers thrive, economies thrive."
Dimitra offers four main AgTech applications:
Source: Dimitra
The Dimitra ecosystem is fueled by its proprietary Ethereum-based token, DMTR. It acts as a utility token for the Connected Farmer app that helps farmers worldwide increase sustainability and make informed decisions.
To spread its mission and technology, Dimitra is also working with governments, agencies, NGOs and for-profit organizations. The company was awarded a contract from the OBC Indian Chamber of Commerce, Industries and Agriculture for deploying its Connected Farmer app to 1.3 million farms for soil assessment and remediation. Elsewhere, Dimitra partnered with an organization in the world's third-largest fruit-producing country: the Brazilian Association of Fruit Producers and Exporters. Its members represent more than 85% of the total fruit exported by Brazil.
Dimitra has demonstrated that integrating blockchain with other innovations like AI, satellites and IoT can revolutionize the agricultural sector. By increasing transparency, traceability and efficiency, these advancements offer major opportunities for improving supply chain management, regulatory compliance and land rights management.
Disclaimer. Cointelegraph does not endorse any content or product on this page. While we aim to provide you with all the important information we could obtain in this sponsored article, readers should do their own research before taking any action related to the company and carry full responsibility for their decisions; nor can this article be considered investment advice.
Continued here:
Unlocking agricultures full potential with blockchain and innovative tech - Cointelegraph
Three elected to WDCs without a vote – Avas.mv
Posted: at 12:07 am
Three people have been elected to the Women's Development Committees (WDCs) of three islands without a vote.
The three candidates were the only candidates to apply to run in the annual by-election for the open vacancy on their respective islands' WDCs. Therefore, they were elected to the committees by default.
The members elected without a vote are Aishath Haneefa to Sh. Maaungoodhoo WDC, Waseema Ibrahim to R. Meedhoo WDC, and Liusha Ibrahim to GDh. Vaadhoo WDC. All three members are from the opposition Progressive Party of Maldives (PPM).
When the Elections Commission (EC) opened applications to contest the annual WDC by-election for ten islands, no candidates applied from F. Nilandhoo. Therefore, the by-election for the remaining six islands will take place on April 17. The islands are HDh. Finey, K. Dhiffushi, M. Veyvah, Th. Gaadhiffushi, GDh. Hoadeddhoo, and Ga. Maamendhoo.
Under the Decentralization Act, elections were previously held within 60 days of the resignation of a WDC member. However, with the amendment to the Decentralization Act and the WDC election rules last year, the WDC elections will now have to be held once a year. However, if a WDC does not have a quorum, elections will be held within 60 days of the member's resignation.
The decision to hold the by-elections of the WDCs once a year was taken to find a solution to the high expenses incurred for recurrent elections.
See the article here:
Three elected to WDCs without a vote - Avas.mv
Web3 interoperability highlighted in RadixLayerZero partnership – Cointelegraph
Posted: at 12:07 am
Web3 interoperability is important to building a decentralized ecosystem that is scalable, secure and provides a seamless user experience.
Radix, a layer-1 protocol, has partnered with LayerZero, an interoperability protocol, to integrate LayerZero with the Radix Babylon public network. The integration encourages cross-chain communication and asset transfers to the Radix ecosystem, benefitting both platforms and their users.
Web3 interoperability refers to the ability of different decentralized applications (DApps) and blockchain networks to communicate with one another. Put simply, interoperability is a state where blockchains can listen to each other, allow users to transfer digital assets and data, and enable better collaboration.
LayerZero's technology enables decentralized applications to send messages between different blockchains. By integrating LayerZero into the Radix ecosystem, Radix users and developers will have a connected experience, making it possible for DApps and assets to unlock omnichain functionality.
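In broad strokes, omnichain messaging reduces to a shared send-and-deliver interface that applications on different chains register against. The sketch below illustrates only that pattern; the class and method names are invented for this example and are not LayerZero's or Radix's actual API:

class Endpoint:
    """One chain's messaging endpoint; registered handlers play the DApps."""
    def __init__(self, chain_id, network):
        self.chain_id = chain_id
        self.network = network   # toy stand-in for the transport layer
        self.handlers = {}
        network[chain_id] = self

    def register(self, app, handler):
        self.handlers[app] = handler

    def send(self, dst_chain, dst_app, payload):
        # A real protocol verifies delivery (e.g., via oracles and relayers)
        # before invoking the destination application; here we call directly.
        self.network[dst_chain].handlers[dst_app](self.chain_id, payload)

network = {}
radix = Endpoint("radix-babylon", network)
evm = Endpoint("some-evm-chain", network)

radix.register("bridge", lambda src, msg: print(f"message from {src}: {msg}"))
evm.send("radix-babylon", "bridge", {"asset": "XRD", "amount": 10})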
Interoperability provides various advantages, one of which is the enhancement of functionality. With Web3 interoperability, various DApps can collaborate and integrate, broadening the scope of features they offer. This means that the integration of a decentralized finance protocol with a nonfungible token marketplace, for instance, enables users to utilize their NFTs as loan collateral.
Additionally, through interoperability, DApps can also leverage benefits such as increased liquidity sharing, which can result in a more comprehensive liquidity pool for the decentralized ecosystem. This, in turn, can minimize the fragmentation of liquidity across various blockchain networks. Moreover, DApps can exploit the security features of multiple blockchain networks, enhancing their overall security posture through interoperability.
Related: Web3 security: How to identify the risks and use protection tools?
Piers Ridyard, the Radix Foundation's director, expressed enthusiasm about the integration, stating that it would showcase the potential of cross-chain interoperability. One of the fundamental principles of Web3 is decentralization, and greater decentralization is achievable with interoperability, as it can reduce the dominance of a single blockchain network.
The integration is set to launch in the second half of 2023, and it is expected to benefit the users of both platforms.
Magazine: Web3 Gamer: Shrapnel wows at GDC, Undead Blocks hot take, Second Trip
Read more from the original source:
Web3 interoperability highlighted in RadixLayerZero partnership - Cointelegraph
2 Top Web3 Stocks to Buy in April – The Motley Fool
Posted: at 12:07 am
Web3 is like the internet's cool younger sibling, all about decentralization, security, and empowering its users. Constructed on the solid foundation of ultra-secure blockchain ledgers, it offers a more transparent and secure method for storing data, without any single puppet master pulling the strings.
These Web3 companies are the trailblazers, harnessing blockchain technology to create innovative products and services, all while staying true to the principles of decentralization and user autonomy.
Powered by blockchain technology, the decentralized spirit of Web3 has the potential to rewrite the rules of internet engagement. Picture this: users owning their data, being true masters of their own privacy, and having a say in online platform governance. Web3 could be the key that unlocks a fairer and more open digital playground for everyone -- and it's a game-changing idea that spells "business opportunity" in 30-foot-tall letters of fire.
So let me show you two companies that are ahead of the Web3 game in 2023. Investing in these stocks today should set you up to make serious money as the Web3 revolution plays out and the companies below reap the benefits of their ambitious planning.
Let's start right at the epicenter of the action. You can't really talk about the Web3 market without including cryptocurrency trading expert Coinbase Global (COIN -7.27%).
Here's how Coinbase CEO Brian Armstrong discussed the Web3 market in last November's third-quarter earnings call:
"We're leaning into Web3 usage, building a lot of this functionality natively into our app," Armstrong said. "We're trying to make crypto easier to use, and that's how we're going to get one billion people and eventually half the world onto using crypto and benefiting from it. So I think Coinbase has been a leader in terms of ease of use and design."
Coinbase's popular and sophisticated platform for cryptocurrency ownership also forms a fantastic basis for Web3 solutions. Once you're logged in to your Coinbase account, you have dozens of crypto coins and tokens at your command and other decentralized applications can build their own Web3 tools around that treasure trove. From logins and financial transactions to asset-tracking and digital wallet management, Coinbase can help with everything.
The target market is literally billions of people; the business opportunity is massive.
If you know where your towel is, Coinbase should be one of your first investments in the Web3 space.
Did you know that FedEx (FDX 0.75%) has had a blockchain strategist on its payroll since 2018? Well, it's true. The global logistics and shipping veteran started exploring this space years ago. Dale Chrystie still runs that office five years later.
And that makes perfect sense, of course. Blockchain ledgers are perfect tools for keeping track of goods across the transport and delivery process. These ledgers are decentralized, meaning you can access and update them on the fly from anywhere. They are also tamper-resistant, thanks to several layers of cryptographic protection and sophisticated networks that confirm every new transaction or event recorded on the blockchain.
So of course FedEx had to dive into this technology early on, leading the charge toward digital asset-tracking on the blockchain. Company founder and chairman Fred Smith said as much in the 2018 edition of the Decentralization Deciphered (D2) conference:
"We're quite confident that blockchain has big implications in supply chain, transportation, and logistics," Smith said. "It's the next frontier that's going to completely change worldwide supply chains."
And when it comes right down to it, the blockchain ledger is at the heart of the Web3 philosophy. As our old friend Dale Chrystie said at the same D2 conference last summer:
"We believe that peer-to-peer technology, smart contract, and other attributes from a blockchain or Web3 scenario are going to be transformative" in the logistics industry.
This sector stands on the threshold of a huge sea change, where Web3 ideas will disrupt old industry standards and age-old business tenets. FedEx has no intention of missing the boat, with all-in support from experts like Dale Chrystie and absolute top-level leaders including Fred Smith. This makes FedEx a mover and a shaker in the upcoming Web3 era.
With FedEx shares trading at just 12.5 times forward earnings right now, the stock is a low-priced bet on the Web3 future. You shouldn't let this opportunity hang in the air in much the same way that bricks don't.
As Web3 rewrites the rules of internet engagement, allowing users to own their data, control their privacy, and participate in online platform governance, you wouldn't want to miss out on this game-changing opportunity. The day may come when Web3 dominates the digital landscape, and the centralized internet we're so used to nowadays becomes a distant memory.
If that day arrives and you're left empty-handed on the sidelines, there's not much left to say, except we apologize for the inconvenience. I'm trying to warn you.
So, take action and invest in Web3 stocks today. Embrace the decentralized future and ensure your portfolio isn't left behind, missing out on the countless opportunities that Web3 promises to deliver.
See the article here:
Astronomers used AI to generate picture of black hole spotted in 2019 – Business Insider
Posted: April 17, 2023 at 12:13 am
A comparison of the original image (left), captured in 2019, and a new version supplemented by artificial intelligence that scientists believe is closer to what the black hole may actually look like. Lia Medeiros via The Associated Press
A group of astronomers released what they believe is a more accurate depiction of the M87 black hole, images they created using artificial intelligence to fill in the gaps from photos first released by researchers in 2019.
The new images, published Thursday in The Astrophysical Journal Letters, could provide important information for scientists studying the M87 black hole and others in the future, researchers said.
The original image first captured by the Event Horizon Telescope in 2017 was taken using a collection of high-powered telescopes around the globe focused on the black hole at the center of the Messier 87 galaxy. The hole is about 54 million light years away from Earth and located within the constellation Virgo.
However, since the globe cannot be covered in telescopes to capture a clearer image, researchers developed a machine learning algorithm that interprets data from thousands of simulated images of what black holes should look like, based on decades of calculations, to fill in the gaps in the 2019 images, the researchers said.
"With our new machine learning technique, PRIMO, we were able to achieve the maximum resolution of the current array," lead author Dr. Lia Medeiros said in a statement. "Since we cannot study black holes up-close, the detail of an image plays a critical role in our ability to understand its behavior."
Researchers said the thinner orange line around the black hole is produced by the emissions of hot gas falling into the black hole, and noted the new images still align with data captured by the Event Horizon Telescope and theoretical expectations.
They said the accuracy of the technology in analyzing the M87 black hole could allow researchers to use it to study other astronomical objects that have been captured by the Event Horizon Telescope, including Sagittarius A*, the central black hole in our own Milky Way galaxy.
Go here to see the original:
Astronomers used AI to generate picture of black hole spotted in 2019 - Business Insider
Machine learning used to sharpen the first image of a black hole – Digital Trends
Posted: at 12:13 am
The world watched in delight when scientists revealed the first-ever image of a black hole in 2019, showing the huge black hole at the center of galaxy Messier 87. Now, that image has been refined and sharpened using machine learning techniques. The approach, called PRIMO or principal-component interferometric modeling, was developed by some of the same researchers that worked on the original Event Horizon Telescope project that took the photo of the black hole.
That image combined data from seven radio telescopes around the globe which worked together to form a virtual Earth-sized array. While that approach was amazingly effective at seeing such a distant object located 55 million light-years away, it did mean that there were some gaps in the original data. The new machine learning approach has been used to fill in those gaps, which allows for a more sharp and more precise final image.
"With our new machine-learning technique, PRIMO, we were able to achieve the maximum resolution of the current array," said lead author of the research, Lia Medeiros of the Institute for Advanced Study, in a statement. "Since we cannot study black holes up close, the detail in an image plays a critical role in our ability to understand its behavior. The width of the ring in the image is now smaller by about a factor of two, which will be a powerful constraint for our theoretical models and tests of gravity."
PRIMO was trained using tens of thousands of example images which were created from simulations of gas accreting onto a black hole. By analyzing the pictures that resulted from these simulations for patterns, PRIMO was able to refine the data for the EHT image. The plan is that the same technique can be used for future observations from the EHT collaboration as well.
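While the real PRIMO pipeline is far more sophisticated, the core principal-component idea can be sketched: learn a basis of components from many simulated images, then fit an incomplete observation's coefficients using only the measured values, letting the components supply the rest. A toy version with scikit-learn on synthetic data (nothing below is EHT data or PRIMO's actual code):

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Stand-in for simulated black-hole images: 10,000 samples of 64 "pixels"
# that genuinely live in a 20-dimensional subspace.
sims = rng.normal(size=(10_000, 20)) @ rng.normal(size=(20, 64))
pca = PCA(n_components=20).fit(sims)   # learn the principal components

truth = sims[0]
mask = rng.random(64) < 0.6            # pretend only ~60% of pixels measured

# Fit component coefficients using only the measured pixels; the learned
# components then fill in the structure of the unmeasured ones.
B = pca.components_.T                  # (64, 20) basis matrix
coeffs, *_ = np.linalg.lstsq(B[mask], (truth - pca.mean_)[mask], rcond=None)
reconstruction = pca.mean_ + B @ coeffs
print(np.max(np.abs(reconstruction - truth)))  # tiny: the gaps are filled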
"PRIMO is a new approach to the difficult task of constructing images from EHT observations," said another of the researchers, Tod Lauer of NSF's NOIRLab. "It provides a way to compensate for the missing information about the object being observed, which is required to generate the image that would have been seen using a single gigantic radio telescope the size of the Earth."
In 2022, the EHT collaboration followed up its image of the black hole in M87 with a stunning image of the black hole at the heart of the Milky Way, so that image could be the next target for sharpening using this technique.
"The 2019 image was just the beginning," said Medeiros. "If a picture is worth a thousand words, the data underlying that image have many more stories to tell. PRIMO will continue to be a critical tool in extracting such insights."
The research is published in The Astrophysical Journal Letters.
See original here:
Machine learning used to sharpen the first image of a black hole - Digital Trends
Revolutionize Your Factor Investing with Machine Learning | by … – DataDrivenInvestor
Posted: at 12:13 am
Zero to one in financial ML development with scikit-learn
Factor investing has gained significant popularity in the field of modern portfolio management. It refers to a systematic investment approach that focuses on specific risk factors or investment characteristics, such as value, size, momentum, and quality, to construct portfolios. The blossoming of machine learning in factor investing has its source at the confluence of three favorable developments: data availability, computational capacity, and economic groundings. So this article will discuss Factor Investing with Machine Learning.
These factors are believed to have historically exhibited persistent risk premia that can be exploited to achieve better risk-adjusted returns.
The historical roots of factor investing can be traced back several decades. Pioneering academics Eugene F. Fama and Kenneth R. French conducted groundbreaking research, culminating in their work of the early 1990s, that laid the foundation for modern factor investing. They identified specific risk factors that could explain the variation in stock returns, such as the value factor (stocks with low price-to-book ratios tend to outperform), the size factor (small-cap stocks tend to outperform large-cap stocks), and the momentum factor (stocks with recent positive price trends tend to continue outperforming).
By incorporating factor-based strategies into their portfolios, investors can aim to achieve enhanced diversification, improved risk-adjusted returns, and better risk management. Factor investing provides an alternative approach to traditional market-cap-weighted strategies, allowing investors to potentially capture excess returns by focusing on specific risk factors that have demonstrated historical performance.
The main reason factor investing is so popular may be that it is easy to explain in the context of investment management.
In my last article, I talked about financial machine learning with scikit-learn and used the 200+ Financial Indicators of US Stocks (2014–2018) dataset from Kaggle. In this article, I will use the same dataset, so you can view the details in my last article.
We follow the common procedure, the one used in Fama and French (1992). The idea is simple: on one date, sort stocks on a characteristic, split them into groups, and compare the groups' subsequent performance.
We will start with a simple factor: the size factor. We will split stocks into two groups, big cap and small cap, and we'll see if size can generate alpha.
filter_value = df_2014[["Market Cap"]].median()[0]  # median market cap

# Filter rows based on the variable
Big_Cap = df_2014[df_2014['Market Cap'] > filter_value]
Small_Cap = df_2014[df_2014['Market Cap'] < filter_value]

print("Big cap alpha", Big_Cap["Alpha"].mean())
print("Small cap alpha", Small_Cap["Alpha"].mean())
We can see that in 2014, big-cap stocks had an average alpha of 3.91, while small-cap stocks had -3.28.
Let's look at some other years.
Year_lst = ["2014", "2015", "2016", "2017", "2018"]
dic_alpha_bigcap = {}  # year -> mean big-cap alpha

for Year in Year_lst:
    data = df_all[df_all['Year'] == Year]
    filter_value = data[["Market Cap"]].median()[0]
    Big_Cap = data[data['Market Cap'] > filter_value]

    print("Year : ", Year)
    print("Big cap alpha", Big_Cap["Alpha"].mean())

    dic_alpha_bigcap[Year] = Big_Cap["Alpha"].mean()
dic_alpha_Small_Cap = {}  # year -> mean small-cap alpha

for Year in Year_lst:
    data = df_all[df_all['Year'] == Year]
    filter_value = data[["Market Cap"]].median()[0]
    Small_Cap = data[data['Market Cap'] < filter_value]

    print("Year : ", Year)
    print("Small cap alpha", Small_Cap["Alpha"].mean())

    dic_alpha_Small_Cap[Year] = Small_Cap["Alpha"].mean()
At this point, we may conclude that over 2014–2018, big caps outperformed small caps. We will now do this for all variables.
Year_lst = ["2014", "2015", "2016", "2017", "2018"] dic_alpha = {}
for Year in Year_lst :data = df_all[df_all['Year'] == Year]filter_value = data[[column]].median()[0]
filterdata = data[data[column] > filter_value]
print("Year : ", Year )print(column, filterdata["Alpha"].mean())
dic_alpha[Year] = filterdata["Alpha"].mean()
return dic_alpha
def filter_data_below(column, df_all):

    Year_lst = ["2014", "2015", "2016", "2017", "2018"]
    dic_alpha = {}

    for Year in Year_lst:
        data = df_all[df_all['Year'] == Year]
        filter_value = data[[column]].median()[0]

        filterdata = data[data[column] < filter_value]

        print("Year : ", Year)
        print(column, filterdata["Alpha"].mean())

        dic_alpha[Year] = filterdata["Alpha"].mean()

    return dic_alpha
import pandas as pd

lst_alpha_all_top = []  # collect one alpha column per factor

for factor in df_all.columns[2:]:
    try:
        print(factor)
        dic_alpha = filter_data_top(column=factor, df_all=df_all)

        df__alpha = pd.DataFrame.from_dict(dic_alpha, orient='index', columns=[factor])
        lst_alpha_all_top.append(df__alpha)
    except:
        print("error")
Finally, we get an alpha table.
You can plot it.
When analyzing stock returns, suppose all returns of all stocks are stacked in a vector r, and x is a lagged variable that exhibits predictive power in a regression analysis. It may be tempting to conclude that x is a good predictor of returns if the estimated coefficient b-hat is statistically significant at a specified threshold. To test the importance of x as a factor in predicting returns, we can use factor importance tests, where x is treated as the factor and y as the alpha. In the Fama and French equation, y is typically the return, but it can also be interpreted as the excess return (Ri minus Rf), which, once the factors are controlled for, is essentially the alpha. We won't delve into the details of this section here, but you can refer to the article at this link if you are interested in learning more about the topic.
Note: We need to convert the variables to percentile ranks.
dic_rank = {}

for Year in Year_lst:
    data = df_all[df_all['Year'] == Year]
    df_rank = data.rank(pct=True)  # rank within each year, as percentiles

    dic_rank[Year] = df_rank

df_rank = pd.concat(dic_rank)
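With the percentile ranks in hand, a concrete version of the significance test described above is to regress alpha on a single ranked characteristic and inspect the t-statistic of b-hat. A minimal sketch with statsmodels (column names follow this article's dataset; treat it as illustrative):

import statsmodels.api as sm

x = df_rank["PE ratio"].fillna(df_rank["PE ratio"].mean())  # lagged factor
y = df_rank["Alpha"]

res = sm.OLS(y, sm.add_constant(x), missing="drop").fit()
print(res.params)   # intercept and b-hat
print(res.tvalues)  # |t| above roughly 2 is the usual significance bar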
I'll introduce another technique called the Maximal Information Coefficient. The Maximal Information Coefficient (MIC) is a statistical measure that quantifies the strength and non-linearity of the association between two variables. It is used to assess the relationship between two variables and determine whether there is any significant mutual information or dependence between them. MIC is particularly useful in cases where the relationship between two variables is not linear, as it can capture non-linear associations that may be missed by linear methods such as correlation.
MIC is considered important because it offers several advantages over purely linear measures.
We use minepy to compute MIC.
from minepy import MINE

mine = MINE(est="mic_approx")
mine.compute_score(X_, y)
print(mine.mic())
We have previously explained how factor investing works. Machine learning (ML) techniques can be used to improve factor investing in various ways. ML algorithms can help identify relevant factors with predictive power, optimize the combination of factors, determine the optimal timing of factor exposures, enhance risk management, optimize portfolio construction, and incorporate alternative data sources. By analyzing historical data and applying ML algorithms, investors can identify factors, optimize their combination, and dynamically adjust exposures based on market conditions. ML techniques can also enhance risk management measures and incorporate alternative data for better insights.
We perform linear regression with the LinearRegression class from the sklearn.linear_model module in Python. The goal is to fit a linear regression model to the data in the df_rank DataFrame to predict the values of the dependent variable, denoted as y, from the values of the independent variable, denoted as X, which is derived from the "PE ratio" column of df_rank.
Import LinearRegression:

from sklearn.linear_model import LinearRegression
This line imports the LinearRegression class from the sklearn.linear_model module, which provides implementation of linear regression in scikit-learn, a popular machine learning library in Python.
Define the dependent variable:

y = df_all["Alpha"]
This line assigns the Alpha column of df_all DataFrame to the variable y, which represents the dependent variable in the linear regression model.
Define the independent variable:

X = df_rank[["PE ratio"]].fillna(value=df_rank["PE ratio"].mean())

This line assigns a DataFrame containing the "PE ratio" column of the df_rank DataFrame to the variable X, which represents the independent variable in the linear regression model. The fillna() method is used to fill any missing values in the "PE ratio" column with the mean value of the column.
Create the model:

PElin_reg = LinearRegression()

This line creates an instance of the LinearRegression class and assigns it to the variable PElin_reg.
Fit the linear regression model:

PElin_reg.fit(X, y)
This line fits the linear regression model to the data, using the values of X as the independent variable and y as the dependent variable.
Extract the model coefficients:

PElin_reg.intercept_, PElin_reg.coef_
These lines extract the intercept and coefficient(s) of the linear regression model, which represent the estimated parameters of the model. The intercept is accessed using the intercept_ attribute, and the coefficient(s) are accessed using the coef_ attribute. These values can provide insights into the relationship between the independent variable(s) and the dependent variable in the linear regression model.
Repeat with ROE
X = df_rank[["ROE"]].fillna(value=df_rank["ROE"].mean())  # swap in ROE

ROElin_reg = LinearRegression()
ROElin_reg.fit(X, y)
ROElin_reg.intercept_, ROElin_reg.coef_
And plot
X1 = df_rank[["PE ratio"]].fillna(value=df_rank["PE ratio"].mean() )X2 = df_rank[["ROE"]].fillna(value=df_rank["ROE"].mean() )
plt.figure(figsize=(6, 4)) # extra code not needed, just formattingplt.plot( X1, PElin_reg.predict(X1), "r-", label="PE")plt.plot(X2 , ROElin_reg.predict(X2) , "b-", label="ROE")
# extra code beautifies and saves Figure 42plt.xlabel("$x_1$")plt.ylabel("$y$", rotation=0)# plt.axis([1, 1, ])plt.grid()plt.legend(loc="upper left")
plt.show()
We can see that the PE line has a higher slope (coef_). Therefore, we can say that PE can predict an alpha value. Suppose we have three stocks and we want to know which one will perform best.
Here we will try to predict the alpha's rank from 10 variables.
We repeat the same steps:
# featureScores comes from the feature-selection step in my last article
top_fest = featureScores.nlargest(10, 'Score')["Specs"].tolist()
y = df_rank["Alpha"]X = df_rank[top_fest].fillna(value=df_rank[top_fest].mean() )
lin_reg = LinearRegression()lin_reg.fit(X, y)
lin_reg.predict(X)
Handle data
Print the MSE:

from sklearn.metrics import mean_squared_error

mean_squared_error(Y1, Y2)
An MSE of 0.07 indicates that, on average, the squared differences between the predicted values and the actual values (i.e., the residuals) are relatively small. A lower MSE generally indicates better model performance, as it signifies that the model's predictions are closer to the actual values.
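As a quick sanity check on what MSE measures, here it is computed by hand on three toy values:

import numpy as np

y_true = np.array([0.50, 0.20, 0.90])   # toy actual alpha ranks
y_pred = np.array([0.45, 0.35, 0.80])   # toy predictions
print(np.mean((y_true - y_pred) ** 2))  # (0.05² + 0.15² + 0.10²) / 3 ≈ 0.0117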
But,
X = df_["Unnamed: 0"].head(30)Y1 = df_["Alpha_rank"].head(30)Y2 = df_["Predict_Alpha"].head(30)
plt.figure(figsize=(6, 6)) # extra code not needed, just formattingplt.plot( X, Y1, "r.", label="Alpha")plt.plot( X , Y2 , "b.", label="Predict_Alpha")
# extra code beautifies and saves Figure 42plt.xlabel("$x_1$")plt.ylabel("$y$", rotation=0)# plt.axis([1, 1, ])plt.grid()plt.legend(loc="upper left")
plt.show()
We can see that, although the MSE is low, the predicted values cluster toward the middle. That is because of the limitations of the linear regression model.
Now we will look at a very different way to train a linear regression model, which is better suited for cases where there are a large number of features or too many training instances to fit in memory.
Gradient descent is an optimization algorithm commonly used in machine learning to minimize a loss or cost function during the training process of a model. It is an iterative optimization algorithm that adjusts the model parameters in the direction of the negative gradient of the cost function in order to find the optimal values for the parameters that minimize the cost function.
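To make the update rule concrete, here is a minimal hand-rolled batch gradient descent for linear regression's MSE loss. It is only a sketch of the idea that SGDRegressor (used below) automates; SGDRegressor additionally processes instances stochastically rather than in full batches:

import numpy as np

def batch_gradient_descent(X, y, eta=0.01, n_iters=1000):
    """Repeat: theta <- theta - eta * gradient of the MSE loss."""
    X_b = np.c_[np.ones(len(X)), X]  # prepend a bias column of ones
    theta = np.zeros(X_b.shape[1])
    for _ in range(n_iters):
        gradients = 2 / len(X_b) * X_b.T @ (X_b @ theta - y)
        theta -= eta * gradients     # step against the gradient
    return theta                     # [intercept, coefficients...]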
from sklearn.linear_model import SGDRegressor

sgd_reg = SGDRegressor(max_iter=100000, tol=1e-5, penalty=None, eta0=0.01,
                       n_iter_no_change=100, random_state=42)
sgd_reg.fit(X, y.ravel())  # y.ravel() because fit() expects 1D targets
Sometimes we have too many variables, and we want to fit the training data much better than plain linear regression can. In this case we use 100 variables, and we use the learning_curve function to help us.
import numpy as np
from sklearn.model_selection import learning_curve

train_sizes, train_scores, valid_scores = learning_curve(
    LinearRegression(), X, y,
    train_sizes=np.linspace(0.01, 1.0, 40), cv=5,
    scoring="neg_root_mean_squared_error")
train_errors = -train_scores.mean(axis=1)
valid_errors = -valid_scores.mean(axis=1)

plt.figure(figsize=(6, 4))  # extra code, just formatting
plt.plot(train_sizes, train_errors, "r-+", linewidth=2, label="train")
plt.plot(train_sizes, valid_errors, "b-", linewidth=3, label="valid")

# extra code that beautifies the figure
plt.xlabel("Training set size")
plt.ylabel("RMSE")
plt.grid()
plt.legend(loc="upper right")
plt.show()
The optimal training set size is where the red (training) curve comes close to the blue (validation) curve. If the training error is much lower than the validation error, the model is overfitting; if both errors plateau high and close together, the model is underfitting.
At this point, we get the coefficients, and the coefficients can tell us what factors determine returns. We have still skipped a lot of details (normalization, factor engineering, etc.), other algorithms (regularized linear models, lasso regression, elastic net), and the use of ML in various other steps.
Machine Learning for Factor Investing by Guillaume Coqueret
https://colab.research.google.com/drive/1bXpSC-rln-yhmqd7Et06Y-ZkhmhC8joP?usp=sharing
See more here:
Revolutionize Your Factor Investing with Machine Learning | by ... - DataDrivenInvestor