Asset Tokenization to Become a $2 Trillion Industry, Says McKinsey
Analysts from the consulting firm McKinsey & Company claim that the asset tokenization sector has the potential to grow into a $2 trillion industry despite its cold start. The analysts also projected that wider adoption of the sector may take a long time, though some projects are bound to take off faster than others.
Digital Asset Value Could Reach $4 Trillion
The analysts observed that while the tokenization sector had a cold start, it has the potential to become a $2 trillion sector by 2030. The analysis also noted that, given bullish momentum, the value of the underlying digital assets could reach $4 trillion.
The report, published on June 20, added a caveat to this scenario: the figure could be lower if market participants prove less optimistic. McKinsey further stated that a great deal of tokenization activity is currently under way.
However, broader adoption remains some way off. The analysts attributed the delay to the challenges of upgrading existing financial infrastructure and to the heavily regulated nature of the sector.
The analysts expect cash, bonds, exchange-traded notes (ETNs), exchange-traded funds (ETFs), loans, deposits, securitization, and mutual funds to form the $100 billion portion of the tokenized market cap by 2030.
McKinsey Analysts Link Lack of Tokenization with Lack of User Value
The analysts excluded various forms of cryptocurrency, such as stablecoins, CBDCs, and tokenized deposits, from the $100 billion market cap figure.
At the same time, the analysts attributed the sector's cold start to a chicken-and-egg problem: users need to contribute more value to the sector before it can boom. The biggest problem facing tokenization is limited liquidity, which constrains new token issuance.
Fear of losing market share has also deterred investors, prompting token issuers to issue in parallel on legacy networks. According to the analysts, tokenization adoption is conditional on real utility: it must offer a visible advantage over traditional financial infrastructure in order to generate demand.
One such example is bond tokenization. The analysts noted that a new tokenized bond project is announced every few weeks.
McKinsey further stated that billions of dollars of tokenized bonds are currently outstanding. However, the advantage over traditional issuance is limited, and secondary trading activity remains rare.
The analysts added that the slow start could be offset by offering higher mobility, faster settlement, and additional liquidity. They projected that early investors in tokenization may benefit from expanding market share.
Institutional Investors Still Participating in Tokenization as Observers
McKinsey analysts further noted that the presence of institutional investors in the tokenization sector is still nominal. As per the analysis, institutional investors have assumed the position of observers in the sector.
The analysts further relayed that indicators suggest the tipping point for tokenization, when it arrives, will require blockchains capable of handling trillions of dollars in trading volume, regulatory clarity, policy upgrades, and robust network connectivity.
In related news, a recent report indicated that the Oracle Protocol (ORA) has raised $20 million in a funding round. The investors include Polygon, Hashkey Capital, HF0, and others.
The funding will go towards building a tokenization project focused on on-chain AI models. ORA intends to use the funds for research and development of its oracle programs and blockchain design.
The tokenized AI models generated through this project carry verifiability and traceability layers for data entry and verified output processing. ORA has also introduced a new system called Initial Model Offering (IMO), which tokenizes AI models as ERC-20 tokens.
Under this feature, token holders can share in the profits generated by an AI model. An ORA spokesperson revealed that developers are working on new use cases for the technology, such as on-chain insurance claims, logical outputs, dispute settlements, and anomaly detection.
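The general idea behind this kind of ERC-20 profit sharing can be sketched in a few lines: revenue is split among holders in proportion to their token balance. The following Python snippet is a hypothetical illustration of that pro-rata mechanism, not ORA's actual IMO contract; the function name and data layout are assumptions.

```python
# Hypothetical sketch of pro-rata profit sharing among token holders,
# illustrating the general ERC-20-style revenue split described above.
# This is an assumption-based example, not ORA's real contract logic.

def distribute_profits(balances: dict, profit: int) -> dict:
    """Split `profit` among holders proportionally to token balance.

    Amounts are integers (like wei on-chain), so remainders are
    truncated by floor division, mirroring on-chain arithmetic.
    """
    total_supply = sum(balances.values())
    if total_supply == 0:
        return {holder: 0 for holder in balances}
    return {
        holder: profit * bal // total_supply
        for holder, bal in balances.items()
    }

# Example: a model earns 1,000 units; holders own 60%, 30%, and 10%.
holders = {"alice": 600, "bob": 300, "carol": 100}
payouts = distribute_profits(holders, 1_000)
# payouts == {"alice": 600, "bob": 300, "carol": 100}
```

In a real ERC-20 deployment this split would be enforced by the token contract itself; the sketch only shows the proportional math.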