Programmable Liquidity

Tokenization and the Quest for Liquid Markets

 

Financial markets have always grappled with the challenge of illiquidity.

People have long sought access to assets that are valuable, yet simultaneously difficult to issue, structure, and trade in an efficient manner. Decades ago, traditional financial markets experienced a step-function improvement with the securitization of many asset classes. This created more accessible, regulated, and efficient markets for interested parties. Securitization also produced consequential secondary effects, including lower costs of capital and the ability to engineer new financial products.

Powered by the unique premises of blockchain infrastructure, tokenization represents a similar transformation in the way liquid markets might be created for traditionally illiquid assets. This piece aims to analyze the framework of tokenization through the lens of traditional financial markets, drawing parallels from the evolution of previously illiquid assets.

A Primer on Securitization

Broadly speaking, financial assets can be categorized into two groups: those that are securitized and those that are not. This seemingly simple distinction is fundamental to understanding our existing financial systems. Assets that have undergone securitization possess two key characteristics that non-securitized assets lack: a standardized identification code and a centralized settlement mechanism.

First and foremost, securitized assets are assigned a unique identification code. In North America, this is the CUSIP (Committee on Uniform Securities Identification Procedures), a 9-character alphanumeric code used for financial instruments like stocks, bonds, and other securities. Globally, the ISIN (International Securities Identification Number) is used, which incorporates the CUSIP for North American securities within its broader 12-character code. These standardized codes play a crucial role in fostering trust and efficiency within the financial system by ensuring that all participants can identify and operate on the same basis for a given security.

Secondly, securitized assets settle almost universally through a canonical clearing house. In the United States, and globally to a significant extent, this central entity is the Depository Trust & Clearing Corporation (DTCC), along with its subsidiaries. The primary function of the DTCC is to ensure the smooth clearing and settlement of securities trades. For example, when an investor purchases shares of a publicly traded company, the DTCC's National Securities Clearing Corporation (NSCC) acts as an intermediary to guarantee that both the buyer and seller fulfill their obligations. Subsequently, the Depository Trust Company (DTC), another arm of the DTCC, facilitates the actual transfer of funds and securities, typically on a T+1 (trade date plus one day) basis. This process involves moving the buyer's funds to the seller and transferring ownership of the securities to the buyer's brokerage account held at the DTC.
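The netting step at the heart of this clearing process is easy to see in miniature. The sketch below is illustrative only (real NSCC netting spans thousands of participants and securities, with the clearing house novating itself as the counterparty to each side); it simply collapses each participant's gross trades into a single net share and cash obligation:

```python
from collections import defaultdict

def net_obligations(trades):
    """Multilateral netting: collapse each participant's gross trades
    into one net share position and one net cash obligation.

    Each trade is (buyer, seller, shares, cash). Positive cash means
    the participant is owed money at settlement; negative means they owe.
    """
    net = defaultdict(lambda: {"shares": 0, "cash": 0.0})
    for buyer, seller, shares, cash in trades:
        net[buyer]["shares"] += shares   # buyer receives shares...
        net[buyer]["cash"] -= cash       # ...and owes cash
        net[seller]["shares"] -= shares  # seller delivers shares...
        net[seller]["cash"] += cash      # ...and is owed cash
    return dict(net)

trades = [
    ("A", "B", 100, 5_000.0),  # A buys 100 shares from B for $5,000
    ("B", "A", 60, 3_060.0),   # B buys 60 back from A for $3,060
]
# Two gross trades settle as one net movement:
# A is due 40 shares and owes a net $1,940; B is the mirror image.
print(net_obligations(trades))
```

Netting like this is one reason the T+1 buffer exists: batching and offsetting trades dramatically reduces the value that actually has to move at settlement.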

 

Retail Trade Settlement Flow (source: Federal Register)

 

While the use of a centralized clearing and settlement system like the DTCC provides universal stability and risk management for securities, it also introduces certain costs. These can include:

  • Operational Costs: The DTCC and its participants incur significant operational expenses related to operating and maintaining their clearing and settlement infrastructure network. These costs are ultimately passed on to the end-user in various forms.

  • Time Delays: The standard T+1 settlement cycle—while providing a buffer for liquidity management and netting of trades—inherently introduces a delay between the trade execution and the finalization of the transaction. This impacts capital efficiency and introduces counterparty risk over the settlement period.

  • Intermediary Fees: Participants like brokers and the clearing house itself often charge fees for their services, adding to the overall cost of trading or investing in these securitized assets.

  • Regulatory Compliance Costs: The highly regulated nature of a securities market and the DTCC necessitates significant investment in compliance measures by all participants, further contributing to the overall cost structure.

While this clearing and settlement system is firmly entrenched in our markets, it clearly imposes costs that are ultimately borne by investors.

The Mortgage-Backed Security (MBS)

The challenges of illiquidity and the desire for more efficient markets are not new. The securitization of various illiquid assets served as a key solution in the past, and the emergence of mortgage-backed securities (MBSs) provides a compelling historical parallel to the tokenization potential we see today.

Liquifying Mortgages

The concept of the mortgage-backed security emerged in the early 1970s during a rather turbulent mortgage market. At the time, banks and other financial institutions tended to operate regionally — taking in deposits from local residents and making loans to local homeowners or businesses. This model meant that, in some parts of the country, demand for mortgages and other funding exceeded the supply of deposits, leaving banks without the resources to make otherwise attractive loans. Conversely, mortgage supply exceeded demand in other regions, meaning that mortgage rates tended to reflect local market conditions — with 100+ bps of variance depending on the region. Financial institutions were also frustratingly susceptible to regional downturns because they could not hold a geographically diversified loan portfolio, instead stuck holding local loans on their balance sheets.

Securitization provided a unique solution at the time. In 1970, the Government National Mortgage Association (GNMA, or “Ginnie Mae”) pooled together a bundle of single-family mortgages as collateral and sold securities against it, termed “pass-throughs” (or mortgage-backed securities). For these products, the principal and interest on the underlying mortgages would be collected by the loan-originating entity and then passed through to investors, minus a servicing fee. Default risk was eliminated through GNMA’s direct guarantee of timely payment. Importantly, this allowed loan originators to offload these loans from their balance sheets and receive fresh capital in return, kicking off a new originate-to-distribute model. In just 10 years, the total amount of pass-throughs outstanding had risen to $100 billion, and by 1990 had experienced a further 10x to $1 trillion.

 

MBS Security Structure (source: ResearchGate)

 

The securitization of mortgages into pass-throughs/MBSs had a few consequential effects. For one, investors in these pass-throughs could make use of the diversification inherent in a large number of pooled loans, reducing their need to gather information and monitor the payment history of the underlying cash flows. Loans that would have otherwise been too expensive or cumbersome to sell on their own could now be bundled and sold much more easily and at a fraction of the cost. And by creating tradable instruments out of these illiquid assets, a lively secondary market commenced that soon became a major enterprise on Wall Street.

One of the most significant unlocks from the creation of the MBS was the reduction in the liquidity premium inherent in illiquid assets, leading to lower overall capital costs. With a more competitive pool of investors, the secondary market allowed for better price discovery of loan pricing. As this market grew, financial institutions also began to specialize in various functions of the loan process. Some focused on originating loans, others serviced and monitored these loans, and others provided funds in the process. Importantly, each assumed tasks based on their core capabilities and the relative efficiency for their business model. As a result, the secondary market lowered origination fees and other costs in obtaining mortgages. Early research showed securitized mortgage loans were associated with interest rates 25-50bps lower than their non-securitized counterparts. In other words, making these assets liquid lowered the liquidity premium associated with consumers taking out a mortgage loan, and thus is credited with making homeownership more affordable for many consumers.

The Financial Engineering Unlock

Like any novel advancement, the MBS opened the door for further innovation in design and product offering. Financial institutions witnessed the success of these instruments and began to devise solutions around the existing headaches they encountered in the market.

Pass-throughs had a few notable features that limited their attractiveness to buyers. For one, their cash flow was unpredictable, and they tended to have relatively long maturities since the security was not retired until the last homeowner paid off their mortgage. For investors, the Achilles’ heel was prepayment risk. Most mortgages allow homeowners to pay off any part of their principal ahead of schedule without penalty, and there are a variety of reasons why borrowers may choose to do so. The most notable is a drop in interest rates, since many homeowners will look to refinance, forcing investors to take their cash and reinvest at lower rates. As such, there were profit opportunities and incentives for financial institutions to devise new securitization designs with greater predictability, flexibility in maturity, and other innovative methods to manage risk.

In 1983, the Federal Home Loan Mortgage Corporation (FHLMC, or “Freddie Mac”) issued the first collateralized mortgage obligation (CMO). Like a traditional pass-through, CMOs are backed by a pool of guaranteed mortgages, but their payment structure provides additional flexibility to the investor. In its simplest form, a CMO has several payment tiers known as “tranches,” with bonds in each tranche receiving interest payments but principal payments instead following a waterfall-like payment structure. Principal payments flow to bonds in the top tier until they are entirely repaid, and this continues downstream until all tiers are retired sequentially. As a result, the upper tiers have shorter and more certain maturities, while bottom tiers have longer maturities and assume more prepayment risk in exchange for a higher return. In essence, a CMO constitutes several bonds with varying maturities and payment schedules, giving investors the ability to customize their preference for yield and risk. Importantly, CMOs can’t eliminate prepayment risk entirely, but rather shift it from one tranche to another.
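The sequential-pay waterfall described above can be sketched in a few lines. This is a stylized example of the principal allocation only — real CMOs also pay interest on each tranche and come in many structural variants:

```python
def sequential_pay(tranches, principal_payment):
    """Allocate a principal payment down a sequential-pay CMO waterfall.

    `tranches` is an ordered list of outstanding balances, top tier first.
    The top tranche absorbs all principal until fully retired, then the
    next tranche begins receiving principal, and so on down the stack.
    Returns the remaining balances after the payment is applied.
    """
    balances = list(tranches)
    remaining = principal_payment
    for i, bal in enumerate(balances):
        paid = min(bal, remaining)   # this tier takes what it can absorb
        balances[i] = bal - paid
        remaining -= paid
        if remaining == 0:
            break
    return balances

# $70 of principal fully retires tranche A ($50) and pays down tranche B
print(sequential_pay([50, 100, 100], 70))  # -> [0, 80, 100]
```

A prepayment surge simply pushes a larger `principal_payment` through the same waterfall, which is why the top tiers shorten first while the bottom tiers absorb the timing uncertainty.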

 

CMO Security Structure (source: ResearchGate)

 

After their introduction, the CMO market grew steadily, from $20 billion issued in 1985 to nearly $320 billion in 1993. The data attests to their widespread appeal, particularly among large-scale investment institutions like pension funds and insurance companies. What’s more, the idea of splitting cash flows and tranching them set off a wave of innovation in the market in which participants built off this idea to structure a seemingly endless number of variations (e.g. PAC bonds, IO bonds, floaters, superfloaters).

Another innovation frontier was the securitization of new asset classes. The financial community quickly recognized the similarities between mortgages and other assets, prompting them to spin out new securitized products on a variety of collateral. Any cash stream that appeared reasonably predictable and had a history that could be analyzed was a possible candidate to be securitized, including car loans, municipal property tax liens, and even computer leases.

Much like the impact on the mortgage market, securitization of these alternative assets usually meant industry access to capital at rates that wouldn’t have been possible with traditional financial relationships. Securitization transformed opaque, relationship-based lending into a more transparent, market-driven process. By standardizing and making these assets tradable, it connected borrowers with a much larger and more diverse pool of capital, decreasing liquidity premiums in the process.

The Tokenization Moment

Just as traditional securitization unlocked liquidity in various illiquid markets, tokenization promises to do something similar for an even broader range of assets. This makes it reasonable to analyze the tokenization thesis through the historical lens of traditional securitization. And while tokenization improves liquidity in a similar way, it provides a variety of step-function improvements over securitization simply from the premise of using blockchain infrastructure.

 

Token Access from Tokenization Platform (source: Amazon AWS)

 

The wedge for tokenization to surpass prior methods emerges precisely where traditional securitization fails. There are a few important reasons why securitization is challenging (or nearly impossible) for a variety of assets. Logically, securitization requires a certain homogeneity in the assets involved. In order to aggregate and standardize them into tradable securities, assets must possess a shared functional form and standardized characteristics. Thinking about the vast array of illiquid assets today (e.g. collectibles, pieces of intellectual property, royalty contracts), it’s clear that each has its own unique features, making it nearly impossible to aggregate and securitize them under a universal framework.

Another issue arises with barriers to entry. The process to securitize an asset can take upwards of six months and cost issuers $2 million or more in fees. These expenses may originate from high structuring and legal fees, credit rating agency fees, underwriting and distribution costs, and servicing costs. While some of these fees and steps are necessary to ensure proper regulatory compliance and investor protection, the process as a whole is excessively long and costly.

Asset Ledgers and Programmability

By its very nature, a blockchain is an asset ledger that is inherently programmable. This makes it a uniquely powerful piece of infrastructure and the ideal substrate from which to liquify assets of varying forms.

While the securitization process has high friction and barriers to entry (complex legal structures, specialized intermediaries, lengthy timelines, substantial upfront/ongoing costs, etc.), tokenizing assets on-chain incurs far lower costs in both time and capital. Digitizing assets on a network is streamlined in comparison through the use of smart contracts and direct on-chain ownership. What’s more, tokenization itself doesn’t need to come at the expense of regulatory compliance, since compliance logic can be programmed directly into the assets themselves. The composability of smart contracts allows virtually any requirement to be embedded directly into an asset on-chain. This could include user access, transfer restrictions, holding periods, dividend distributions, tax compliance, trading limitations, and many more.
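As a rough illustration of compliance logic living inside the asset itself, the toy token below enforces a KYC whitelist and a holding period directly in its transfer function. All names and rules here are hypothetical, and the sketch is in Python rather than an actual smart-contract language:

```python
class RestrictedToken:
    """Toy token whose transfer function enforces compliance rules,
    mirroring how an on-chain token can embed a whitelist and lockup.
    """
    HOLDING_PERIOD = 365 * 24 * 3600  # hypothetical one-year lockup (seconds)

    def __init__(self, whitelist):
        self.whitelist = set(whitelist)  # KYC-approved addresses
        self.balances = {}
        self.acquired_at = {}            # when each holder last received tokens

    def mint(self, to, amount, now):
        self.balances[to] = self.balances.get(to, 0) + amount
        self.acquired_at[to] = now

    def transfer(self, sender, recipient, amount, now):
        # Compliance checks run on every transfer, not at a broker or registrar
        if recipient not in self.whitelist:
            raise PermissionError("recipient not KYC-approved")
        if now - self.acquired_at[sender] < self.HOLDING_PERIOD:
            raise PermissionError("holding period not elapsed")
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount
        self.acquired_at[recipient] = now
```

The key design point is that a non-compliant transfer doesn't get flagged after the fact — it simply cannot execute.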

Beyond regulatory compliance, the programmability of blockchains allows for a wealth of other nuances to be embedded directly into assets via tokenization. This opens up possibilities far beyond the traditional limitations of legacy securities. Some possibilities include:

  • Automated Economic Mechanisms: Smart contracts can automate economic tasks like revenue sharing to equity holders, fee collection and distribution to stakeholders, and staking that automatically rewards holders with native value accrual.

  • Dynamic Rights and Privileges: Smart contracts can grant tiered access to privileges or asset rewards based on requirements, conditional feature-unlocks based on predefined logic, and dynamic asset functionality that evolves over time or as triggered by specific events.

  • Integration With the Physical World (via Oracles): For tokenized physical assets, smart contracts could be linked to real-world events through oracles for condition-based release of value (think car titles transferring upon confirmation of payment and successful inspection).
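The first of these — automated revenue sharing — reduces to a simple pro-rata rule that a contract could execute on each payment cycle. A minimal sketch, with hypothetical holders and amounts:

```python
def distribute_revenue(holdings, revenue):
    """Pro-rata revenue sharing: split `revenue` among token holders in
    proportion to their holdings -- the kind of rule a smart contract
    could run automatically whenever revenue arrives. Illustrative only.
    """
    total = sum(holdings.values())
    return {holder: revenue * units / total
            for holder, units in holdings.items()}

# Holders of 600, 300, and 100 units split $1,000 proportionally
print(distribute_revenue({"alice": 600, "bob": 300, "carol": 100}, 1_000.0))
# -> {'alice': 600.0, 'bob': 300.0, 'carol': 100.0}
```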

While these examples merely scratch the surface, they suggest that tokenization has the ability to transform assets from static representations of value into programmable entities capable of interacting with both the digital and physical world in unprecedented ways. Naturally, this programmability allows for the creation of entirely new asset classes and business models, promising even more than the securitization unlock.

Unifying the Infrastructure Layer

Blockchain ledgers don’t just enhance the issuance and capabilities of assets but also provide a unique back-end infrastructure layer upon which diverse front-end applications and marketplaces can be built. Tokenizing assets on shared, transparent, and immutable asset ledgers eliminates the data silos and reconciliation challenges inherent in traditional financial systems. These legacy systems are composed of different institutions, each generally with its own databases or ledgers to record and update asset ownership. This naturally creates inefficiencies as data is siloed, fragmented, and sometimes inconsistent across systems at any given time, making it more difficult to reconcile.

In contrast, a blockchain’s architecture allows it to be somewhat of a universally agreed-upon record book for assets. Because such a record book is reliable and consistent, it provides a trustworthy base layer for parties to build liquid marketplaces or valuable applications on top of. Imagine a scenario where ownership of various asset classes—from real estate fractions and digitized commodities to tokenized debt instruments and digital art—is recorded on the same underlying ledger. This shared infrastructure enables the development of user-friendly front-end platforms that can aggregate these diverse assets into a single, accessible marketplace. Investors could then browse, discover, and trade a wide range of tokenized assets through a unified interface, regardless of their underlying nature or origin.

There are many benefits to this sharing of back-end infrastructure, including:

  • Verifiable Asset Ownership: A ledger definitively records who owns what tokenized asset and the history of its ownership. This eliminates ambiguity and the need for each marketplace or application to maintain its own separate record of ownership. It also reduces the cost of reconciling records across systems in real time.

  • Standardized Asset Representation: Once an asset is tokenized on a particular ledger, it adheres to certain technical standards, making it easier for different applications and marketplaces to understand and interact with the asset under one scope of rules.

  • Programmable Functionality: The ability of smart contracts to embed logic directly into assets can be leveraged by front-end applications to offer a wide range of functionalities. For example, a lending platform can interact with an asset's smart contract to automatically manage collateral, interest accrual, and loan repayment based on pre-defined rules.

  • Interoperability and Network Effects: As more assets are tokenized on a given ledger and more marketplaces are built around them, the underlying infrastructure becomes increasingly valuable and parties are incentivized to build new interconnected services.

The Canvas for Innovation

The financial engineering unlock from early securitization naturally raises questions about what new innovations might be possible through tokenization. With the proper framework in place, the canvas for innovation is vast, and we’ve likely only scratched the surface of what’s possible in engineering new blockchain-based financial primitives. The innovation that occurred at the MBS level spun out new financial products with clear value-add (e.g. the tranching of yield and risk via CMOs). Similarly, the development possibilities with programmable ledgers are extensive and extend beyond merely liquifying currently illiquid assets. We can speculate on potential engineering unlocks, and a few ideas might include:

  • Dynamic Risk and Return Profiling: Similar to how securitization created tranches with varying risk-return profiles, tokenization can enable the creation of programmable tranches within a single asset pool. Smart contracts can define the priority of cash flow allocation and loss absorption for different classes of tokens representing the same underlying asset.

  • Automated and Conditional Cash Flow Distribution: Smart contracts can automate the distribution of cash flows based on pre-defined rules and conditions embedded directly in the token itself. This might include tiered distributions based on holding size, early participation bonuses, or even penalties for certain actions. Things like dividend payouts might be linked to the performance of the underlying asset, adjusting automatically based on real-time data.

  • Adding Parametric Triggers to Insurance-Based Assets: Tokenized insurance claims might have payouts that are triggered automatically by specific, measurable parameters (e.g. rainfall levels, flight delays, temperature thresholds) verified by oracles. Contracts manage the pooling of premiums and the automated distribution of claims. This would theoretically eliminate the need for lengthy manual claims investigations and enable the creation of affordable insurance products for niche risks based on verifiable data.
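The parametric-trigger idea is simple enough to sketch: compare an oracle-verified reading against a policy threshold and pay out automatically when it is breached. Field names here are hypothetical, and a real contract would also need to verify the oracle's signature and manage a pooled premium reserve:

```python
def settle_parametric_claim(policy, oracle_reading):
    """Parametric insurance payout: if the oracle-verified metric is at
    or below the policy threshold, pay out automatically; otherwise pay
    nothing. No claims adjuster, no manual investigation.
    """
    if oracle_reading[policy["metric"]] <= policy["threshold"]:
        return policy["payout"]  # trigger breached: automatic payout
    return 0.0

# A hypothetical drought policy: pay $5,000 if rainfall is 20mm or less
drought_policy = {"metric": "rainfall_mm", "threshold": 20, "payout": 5_000.0}
print(settle_parametric_claim(drought_policy, {"rainfall_mm": 12}))  # -> 5000.0
print(settle_parametric_claim(drought_policy, {"rainfall_mm": 45}))  # -> 0.0
```

Because settlement depends only on an objectively measurable input, the policy can be priced and paid without any subjective loss assessment — which is what makes small, niche risks economical to insure.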

Again, these examples are surface-level ideas about the potential for innovation via programmable token contracts. Ultimately, tokenization not only expands the scope of assets that can be liquified but also suggests there will be programmable innovations on top of these assets to bring new value to the marketplace. We’ve seen a lot of deep innovation in this area so far, and I suspect it will only accelerate as infrastructure continues to mature and compliance frameworks crystallize.

Effects on Capital Markets

The creation of programmable liquid markets for niche asset classes has important second-order effects. These naturally follow from the benefits of traditional securitization.

Improved Price Discovery

For one, the transformation of previously illiquid assets into fractionalized, digitally-tradable tokens fosters the development of more robust price discovery mechanisms. Tokenized markets benefit from continuous trading activity and a more readily available order book, leading to more accurate real-time pricing as well as:

  • Enhanced Transparency: The on-chain nature of tokenized asset trading provides a transparent record of transactions, helping to reduce information asymmetry and contribute to more efficient price formation.   

  • Discovery of Latent Value: By creating liquid markets for previously hard-to-value assets, tokenization can unlock latent value that was previously inaccessible due to the difficulty of price discovery and exchange.

Competitive Markets and Barriers to Entry

Tokenization has the potential to significantly increase competition within capital markets by lowering barriers to entry for both issuers and investors. The streamlined processes and reduced intermediary costs associated with tokenization can make it more feasible for smaller businesses and individuals to raise capital by tokenizing their assets or future revenue streams. This increased supply of investment opportunities fosters greater competition for investor capital.

On the flip side, fractional ownership and lower minimum investment thresholds make previously inaccessible asset classes available to a wider range of investors, including retail participants. This increased demand can lead to more competitive pricing and better terms for issuers.

The Cost of Capital

Just as the securitization of the mortgage has lowered the borrowing costs for homeowners, new liquid marketplaces for certain assets suggest cheaper access to capital for parties borrowing against liquid collateral. The combined effects of increased liquidity (reducing liquidity premiums), broader investor access, and reduced intermediary costs can lead to an overall lower cost of capital for issuers.

Tokenizing the Long Tail

The tendency for emerging technologies to initially find traction in the long tail or within niche areas is a common adoption pattern. Logically—and consistent with much of what we’ve seen already—tokenization will likely follow the same path.

Long-tail assets can generally be thought of as those that individually have low liquidity, infrequent trading volume, and often higher transaction costs in traditional markets. They represent a vast collection of niche or less standardized items such as unique collectibles, illiquid private debt, or revenue streams from small-scale projects.

The focus on long-tail assets as a starting point for tokenization aligns with common patterns observed in the adoption of most emerging technologies. First, emerging technologies often gain initial traction by solving problems that existing solutions don't adequately address. Long-tail assets represent a significant area of unmet need in traditional finance due to their illiquidity and high transaction costs. Tokenization offers a compelling solution to these pain points.

Moreover, emerging technologies often find their first enthusiastic adopters within niche communities or among those that are specifically struggling with the limitations of existing solutions. The owners and potential investors in long-tail assets are often highly motivated to find new ways to unlock value—which could certainly be achieved through newfound liquidity and programmability.

Focusing on these long-tail assets also allows for iterative development and refinement of tokenization methods, smart contract standards, and market infrastructure in a potentially less high-stakes environment. Lessons learned from these early applications could then be applied to the tokenization of more complex and regulated assets.

Navigating around entrenched systems is also important here. From a competition standpoint, utilizing a blockchain ledger to create liquid marketplaces stands directly at odds with incumbent securitization. Considering the players involved in the securitization process, it’s relatively clear that tokenization suggests only marginal improvements (for now) for securities that already enjoy liquid markets. Much of this is due to the canonical clearing system that already exists for these securitized assets, as well as the entrenched position of key players like the DTCC and the consortium of banks and broker-dealers who collectively agree on this standardized system. By initially focusing on markets that are less efficiently served by traditional financial methods, tokenization can avoid direct and immediate competition with entrenched systems and their established players. This allows the thesis to mature and build momentum before potentially disrupting more mainstream markets down the road.

Moreover, tokenization’s focus on addressing more niche areas of the market is akin to the traditional technological pattern of initially seeking success by providing solutions to unique challenges. Throughout history, we’ve seen a similar pattern with other technologies, where a more concentrated focus allowed them to scale their thesis from an experimental improvement to a globally-accepted idea. One example is in IoT, as early applications of connected devices were often focused on specific industrial and commercial needs (as well as a small segment of tech-savvy consumers). But as the technology matured over time, costs decreased, and standardization improved, allowing IoT to expand significantly beyond these niches.

Other examples can be found in 3D printing—in which rapid prototyping for industrial design and engineering expanded to mass utilization in manufacturing—and even in PCs, which initially appealed only to hobbyists, engineers, and very small businesses for specific tasks but achieved widespread adoption through user-friendly operating systems, affordable hardware, and a wider range of software applications. It seems only logical that an initially-narrow focus for tokenization on niche asset classes and use cases could help it achieve a similar scale down the road.

Looking Ahead

Going forward, tokenizing more illiquid assets seems inevitable, particularly with the structural advantages that blockchains provide as programmable asset ledgers. The liquidity unlock that comes from this movement will carry important second-order effects since liquid secondary markets promote robust price discovery, enhanced transparency and accessibility, and valuable decreases in the cost of capital within various asset classes.

The parallels from an earlier time of securitization still offer valuable insights. The underlying principle of unlocking value through standardization and market creation remains relevant, but tokenization promises even more possibility in comparison to this preceding movement. By learning from the successes and challenges of traditional securitization, and by leveraging the unique programmability of a blockchain, tokenization has the potential to democratize access to an even wider range of assets, fostering more efficient and competitive markets in the process.

The push for this thesis is only accelerating, and while the future is still unclear, we’re nonetheless witnessing a monumental shift in our financial markets.
