Blockchains and the Future of AI

Danny Sursock
November 9, 2023

Platform Shift, Meet Platform Shift…

The world is shaped by periods in which extraordinary upheavals in technology or infrastructure coincide, unleashing a generational step function in innovation. Think telegraphs and railroads, fiber-optic cables and the internet, or mobile phones and 3G. 

Our belief is that the intersection of two groundbreaking frontiers – Artificial Intelligence (AI) and blockchains – represents a similarly transformative moment. 

Three important pillars underpin this thesis:

Blockchains Can Offer a Superior Design Space

AI’s high-impact areas are numerous but can broadly be summarized into three main categories:

In particular, Generative AI introduces unique challenges and opportunities that we believe play to the strengths of blockchain technology.

To understand why, it’s important to consider the core inputs that drive the evolution of intelligent systems. Machine Learning (ML) is fundamentally powered by data (quantity, and increasingly quality), feedback mechanisms, and compute power.

Dominant players in AI/ML like OpenAI (backed by Microsoft) and Anthropic (with Google and Amazon) are already consolidating resources and building walls around their models and data. But despite early advantages in compute, data, and distribution, this approach risks stifling momentum by fragmenting the collaborative development cycles that birthed the industry in the first place.

Offering a viable counter to this are blockchains like Ethereum, which have emerged as credibly neutral systems of data and compute fueling open-source innovation. Blockchains already underpin a range of digitally native primitives that are well positioned to serve critical roles in a world increasingly shaped by generative AI.

Our belief is that there is a major opportunity for blockchains to become the primary domain upon which open-source research & development in AI compounds.

The State of Today’s Market

A tremendous amount has already been invested in this year’s generative AI frenzy across core infrastructure, the model layer, and even user-facing applications like chatbots, customer support, and coding assistants. Despite that, where (and to whom) value accrues across the traditional stack in the long run isn’t obvious.

In the current paradigm, AI risks being a centralizing force that extends the dominance of web2 market leaders. At the infrastructure and model layers in particular, the name of the game is scale – in hardware and capital resources, access to data, distribution channels, and unique partnerships.

Many of these players – from cloud service providers like AWS to hardware manufacturers like Nvidia to longstanding heavyweights like Microsoft – are going full-stack, whether vertically via M&A or through proprietary partnerships.

The titans at the top are competing for scale and accuracy at the margin, but the market for ultra-expensive, high-accuracy enterprise API models may well be constrained by economics, emerging performance parity of open-source, or even a trend towards lower-latency workload needs.

Meanwhile, a large portion of the middle market is already commoditizing into a collection of ‘OpenAI API wrappers’ with indistinguishable, albeit sufficient, functionality.

Building on Open-Source Momentum

Open-source datasets for pretraining, training and finetuning, as well as freely accessible foundational models and tools, are already encouraging enterprises of all sizes to get creative with open systems & tooling directly.

A leaked paper from Google outlined just how quickly the gap is closing between the closed- and open-source worlds. Notably, 96% of today’s code bases already use open-source software, with the trend particularly evident across Big Data, AI, and machine learning.

Meanwhile, the cloud services oligopoly may be ripe for disruption anyway.

Historically, the big three of AWS, Google Cloud, and Azure have come to own the market by layering on tools and services to entrench themselves deep within the enterprise stack. This dominance has led to a number of challenges for companies, ranging from restrictive operational dependence to excessive costs associated with cloud infrastructure, especially given the premium charged by the major providers.

The pressure on incumbent companies to restructure operating expenses, coupled with a desire to experiment with and integrate the growing range of open-source AI, will create a window to reimagine the stack with decentralized alternatives.

The emerging intersection of open-source AI and blockchain technology therefore presents an extraordinary domain for experimentation and investment.


Crypto x AI: A Mutually Valuable Relationship

We’re profoundly excited by the potential symbiosis between AI and blockchains.

Crypto middleware can drastically improve inputs across the supply side of AI by establishing efficient markets for compute and data (provision, labeling, or finetuning), as well as tools for attestation or privacy.

In turn, decentralized applications and protocols will reach new heights by ingesting the fruits of that labor. 

Undeniably, crypto has come a long way, but protocols and applications still suffer from tooling and user interfaces that remain unintuitive for mainstream users. Likewise, smart contracts themselves can be constricting, both in the manual workload they demand of developers and in their overall functional fluidity.

Web3 developers are a remarkably productive bunch. At its peak, the field counted just ~7.5K full-time developers, who have nonetheless built a multi-trillion-dollar industry. Coding assistants and DevOps augmented by ML promise to supercharge existing efforts, while no-code tooling is rapidly empowering a new class of builders.

As ML capabilities get integrated into smart contracts and brought onchain, developers will be able to design more seamless and expressive user experiences and, eventually, net-new killer apps. That step function improvement in the onchain experience will attract a new – and likely much larger – audience, catalyzing an important adoption-feedback flywheel.

Generative AI may prove to be crypto’s missing link, transforming UI/UX and catalyzing a major wave of renewed technical development. In turn, blockchain technology will harness, contextualize, and accelerate AI’s potential.

Using Blockchains to Build a Better Market for Data

Data is ML’s Foundational Input

Yes, huge improvements in compute infrastructure have been instrumental, but enormous repositories of data like Common Crawl and The Pile are what made possible the foundation models captivating the world today.

Moreover, it’ll be data with which companies refine the models underpinning their product offerings or build competitive moats going forward. And ultimately, data will be the bridge between users and personal models that run locally and continuously adapt to individual needs.

The competition for data is therefore an essential frontier, and one where blockchains can carve an edge – especially as quality becomes the prized attribute shaping the market for data.

Quality over Quantity

Early research suggests that up to 90% of online content may be synthetically generated in the coming years. While synthetic training data offers advantages, it also introduces material risks around deteriorating model quality as well as the reinforcement of biases.

There’s a real risk that Machine Learning models may deplete non-synthetic data sources in the next few years. Crypto’s coordination mechanisms and attestation primitives are inherently optimized to support decentralized marketplaces where users can share, own, or monetize their data for training or fine-tuning domain-specific models.

As a result, web3 may prove to be a better and more efficient source of human-generated training and fine-tuning data overall.

Compounding Progress

Decentralized training, finetuning, and inference processes enabled by blockchains can also better preserve and compound open-source intelligence.

Smaller open-source models refined using efficient fine-tuning processes are already rivaling their larger peers in output accuracy. The tide is therefore starting to shift from quantity to quality in terms of source & fine-tuning data.

The ability to track and verify the lifecycle of both original and derivative data enables reproducibility and transparency that will fuel higher quality models & inputs.

Source: Will Henshall / Epoch (TIME)

Blockchains can build a durable moat as the primary domain with diverse, verifiable, and tailored datasets. This can be particularly valuable as traditional solutions over-index on algorithmic progress to counter data shortfalls.
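The tracking-and-verification idea above can be sketched as a content-addressed lineage chain, where every derivative dataset commits to the hash of its parent record. This is an illustrative stand-in only (plain hashes, no actual chain, consensus, or attestation protocol), not any specific project’s format:

```python
import hashlib
import json

def record_hash(record):
    """Deterministic hash of a lineage record (sorted keys for stability)."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def make_record(content, parent=None):
    """A dataset (or derivative) commits to its raw bytes and its parent record."""
    return {"content_hash": hashlib.sha256(content).hexdigest(), "parent": parent}

def verify_lineage(records):
    """Each record's `parent` must equal the hash of the record before it."""
    return all(cur["parent"] == record_hash(prev)
               for prev, cur in zip(records, records[1:]))

# An original corpus, then a fine-tuning derivative that points back at it.
base = make_record(b"raw web corpus")
derived = make_record(b"filtered + labeled subset", parent=record_hash(base))
assert verify_lineage([base, derived])       # intact lineage verifies
derived["parent"] = "0" * 64                 # tampered provenance...
assert not verify_lineage([base, derived])   # ...is detected
```

Anchoring such records onchain is what would make the lineage publicly reproducible rather than trusted to a single custodian.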

The Content Tsunami

The coming tidal wave of AI-generated content is another place where crypto’s early-mover advantage will excel.

This new technological paradigm will empower digital content creators at unprecedented scale, and Web3 offers plug-and-play foundations to make sense of it all. Crypto has homecourt advantage thanks to years of development around primitives that establish ownership and immutable provenance of digital assets and content in the form of NFTs.

NFTs can capture the entire content creation lifecycle, but can also represent digitally-native identity, virtual assets, or even streams of cashflows.

As a result, NFTs make possible new user experiences like digital asset marketplaces (OpenSea, Blur), while also rethinking business models around written content (Mirror), social media (Farcaster, Lens), gaming (Dapper Labs, Immutable), and even financial infrastructure (Upshot, NFTFi).

The technology may even combat deep fakes and computational manipulation more reliably than the alternative: using algorithms to do the work. In one glaring example, OpenAI’s detection tool was shut down because of accuracy failures.

A final point: advancements in succinct and verifiable compute will also upgrade the dynamism of NFTs as they incorporate ML outputs to drive more intelligent, evolving metadata. Our belief is that AI-powered tooling and interfaces atop blockchain technology will unleash full-stack value and reshape the digital content landscape.

Harnessing ML’s Infinite Knowledge with Zero Knowledge

The blockchain industry’s search for technical solutions enabling resource-efficient compute while preserving trustless dynamics has led to substantial progress in zero-knowledge (ZK) cryptography.

Though initially designed to tackle resource bottlenecks inherent to systems like the Ethereum Virtual Machine (EVM), ZK proofs offer a range of valuable use cases related to AI.

An obvious one is simply an extension of an existing unlock: efficiently and succinctly verifying compute-intensive processes, like running an ML model offchain, so that the end product, like a model’s inference, can be ingested onchain by smart contracts in the form of a ZK proof.

Storage proofs paired with coprocessing can take this a step further, materially enhancing onchain applications by giving them trustless visibility into historical onchain state, without introducing new trust assumptions.
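The offchain-compute, onchain-verify flow described above can be sketched as follows. Note the hedge: a hash commitment is not zero-knowledge (it hides nothing and forces the verifier to recompute), whereas a real zkML system produces a succinct proof the contract can check cheaply without re-running the model. The model, names, and values here are all hypothetical; only the shape of the flow is the point:

```python
import hashlib

MODEL_ID = "sentiment-v1"  # hypothetical model identifier

def run_model(x):
    """Toy offchain 'inference' standing in for a real ML model."""
    return 1.0 if x > 0 else 0.0

def commit(model_id, x, y):
    """Bind (model, input, output) together; a stand-in for a ZK proof."""
    return hashlib.sha256(f"{model_id}|{x}|{y}".encode()).hexdigest()

# Offchain: the prover runs the model and publishes a commitment.
x = 3.5
y = run_model(x)
proof = commit(MODEL_ID, x, y)

# "Onchain": the contract accepts a claimed inference only if it
# re-derives the same commitment from the claimed values.
def contract_accepts(model_id, x_claim, y_claim, proof_claim):
    return commit(model_id, x_claim, y_claim) == proof_claim

assert contract_accepts(MODEL_ID, 3.5, 1.0, proof)       # honest claim
assert not contract_accepts(MODEL_ID, 3.5, 0.0, proof)   # forged output rejected
```

Replacing the hash commitment with a SNARK is what removes the recomputation and confidentiality limitations while preserving this accept/reject interface.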

The implications allow for net-new functions as well.

ZK cryptography can be used to verify that a specific model or pool of data was in fact used in generating inferences when called via an API. It can also conceal the specific weights or data consumed by a model in client-sensitive industries like healthcare or insurance.

Companies can even collaborate more effectively by exchanging data or IP, benefiting from shared learnings while still keeping their resources proprietary.

And finally, ZKPs have real applicability in the increasingly relevant (and challenging) realm discussed earlier: differentiating between human-generated and synthetically generated data.

Some of these use cases still hinge on further development of the technical implementations and on finding sustainable economics at scale, but zkML has the potential to be uniquely impactful on the trajectory of AI.

Long Tail Assets & Latent Value

Crypto has already demonstrated its role as a superior architect of value flow across legacy markets like music and art. Over the last couple of years, onchain liquid markets representing offchain, tangible assets like wine and sneakers have also emerged.

The natural successor will involve advanced ML capabilities as AI is brought onchain and made accessible to smart contracts. 

ML models, in combination with blockchain rails, will rework the underwriting process behind illiquid assets previously inaccessible due to a lack of data or buyer depth.

One method will see ML algorithms query a massive range of variables to assess hidden relationships and minimize the attack surface of manipulative actors. Web3 is already experimenting with creating markets around novel concepts like social media connections and wallet usernames.

Similar to the impact AMMs had on unlocking liquidity for long-tail tokens, ML will revolutionize price discovery by ingesting massive amounts of quantitative and qualitative data to derive nonobvious patterns. These new insights can then form the basis for smart-contract based markets.
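For context on the AMM analogy: a constant-product pool (the design popularized by Uniswap v2) quotes prices for any asset pair purely from its reserves, with no order book, which is what unlocked liquidity for long-tail tokens. A minimal sketch with hypothetical reserves and the conventional 0.3% fee:

```python
def get_amount_out(amount_in, reserve_in, reserve_out, fee=0.003):
    """Constant-product swap: (x + dx)(y - dy) = x * y, with a fee on input."""
    dx = amount_in * (1 - fee)
    return reserve_out * dx / (reserve_in + dx)

# A thin pool for a long-tail token: 10 ETH against 1,000 TOKEN.
out = get_amount_out(1.0, 10.0, 1_000.0)
print(f"1 ETH buys {out:.2f} TOKEN")  # well below the 100 TOKEN spot rate,
                                      # since a 1 ETH trade moves a thin pool
```

The formula prices assets continuously from reserve ratios alone; ML-driven markets for illiquid assets would aim for an analogous feat, deriving prices where no deep order flow exists.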

AI’s analytical capabilities will plug into decentralized financial infrastructure to uncover dormant value in long tail assets.

Decentralizing the Infrastructure Layer

Crypto’s advantages around attracting and monetizing higher quality data address one side of the equation. The other side – the supporting infrastructure behind AI – holds similar promise.

Decentralized Physical Infrastructure Networks (DePINs) like Filecoin or Arweave have already built systems for storage that natively incorporate blockchain technology.

Others like Gensyn and Together are tackling the challenge of model training across a distributed network, while Akash has launched an impressive P2P marketplace connecting supply and demand around excess computing resources.

Beyond that, Ritual is building the foundation for open AI infrastructure in the form of an incentivized network and suite of models, connecting distributed computing devices for users to run inference and fine-tuning against.

Crucially, DePINs like Ritual, Filecoin or Akash can also create a much larger and more efficient market. They do this by opening up the supply side to a much broader domain that includes passive providers able to unlock latent economic value, or by consolidating less-performant hardware into pools that rival their sophisticated peers.

Each part of the stack involves different constraints and value preferences, and significant work remains to be done in battle-testing these layers at scale (in particular, the emerging fields of decentralized model training and compute).

However, the foundations exist for blockchain-based solutions for compute, storage, and even model training that can eventually compete with conventional markets.

What It All Means

Crypto x AI is quickly becoming one of the most inspiring design spaces. The respective fields are already impacting everything from content creation and cultural expression to enterprise workflows and financial infrastructure.

Together, we believe these technologies will reshape the world in the coming decades. The best teams are natively incorporating permissionless infrastructure and cryptoeconomics alongside AI to upgrade performance, enable net-new behaviors, or achieve competitive cost structures.

Crypto introduces unprecedented scale, depth, and granularity of standardized data into coordination networks, often without an obvious means for deriving utility from that data.

Meanwhile, AI converts pools of information into vectors of relevant context or relationships.

When paired together, these two frontiers can form a uniquely reciprocal relationship that sets the stage for builders of the decentralized future.

*A huge thank you to Niraj Pant, Akilesh Potti, Jason Morton, Dante Camuto, David Wong, Ismael Hishon-Rezaizadeh, Illia Polosukhin, and others for their work at the forefront of this space, invaluable insights, and inspiration – all of which make possible not only this article but crypto’s bright future.

—————

Disclaimer:

This post is for general information purposes only. It does not constitute investment advice or a recommendation or solicitation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision. It should not be relied upon for accounting, legal or tax advice or investment recommendations. You should consult your own advisers as to legal, business, tax, and other related matters concerning any investment or legal matters. Certain information contained in here has been obtained from third-party sources, including from portfolio companies of funds managed by Archetype. This post reflects the current opinions of the authors and is not made on behalf of Archetype or its affiliates and does not necessarily reflect the opinions of Archetype, its affiliates or individuals associated with Archetype. The opinions reflected herein are subject to change without being updated.
