Data Marketplaces: Value Capture in Web 3.0

Marketplaces are a critical element of the entire Convergence Ecosystem: the element that incentivises data to be collected, shared and utilised. We now have the ability to open up the machine economy and need to think of ‘trading’ beyond the scope of human interaction. The sorts of marketplaces being developed in the crypto community are decentralised, automated and tokenised. These marketplaces are made possible by the distributed ledgers, consensus mechanisms and interoperability protocols at the lower levels.

We will see the emergence of a whole host of new types of marketplaces beyond today’s cryptocurrency exchanges like Binance or Coinbase. We are already seeing data exchanges that work with specific types of data: machine data from IoT networks, artificial intelligence data, personal data, and complex digital assets like crypto-collectables (pioneered by CryptoKitties using ERC 721 non-fungible tokens) and bots. It’s likely that over time marketplaces will expand to enable all types of data, and if that occurs we could end up with a dominant data and digital asset marketplace for Web 3.0, much as Amazon is for Web 2.0. It’s interesting to think where the points of leverage will be in Web 3.0, especially if value and data are interoperable across blockchains. For now, at least, we see four types of decentralised data marketplace.

IoT Data Marketplaces

IoT data is already being collected in vast quantities, but the sprawl of devices has created a fragmented ecosystem. On the consumer side, operating system providers like Apple, Google and Amazon are attempting to leverage their dominant positions in smartphones and retail to sell more devices and collect more data. The Apple Watch and CarPlay, Google Home and Nest, Amazon Echo and Dot: these are all attempts to grow their walled gardens of data. Smaller consumer IoT device makers like Fitbit, Wink, or GreenIQ struggle to collect enough data to do meaningful machine learning and improve their products as quickly as the tech giants.

On the enterprise side, the same dynamics are at work. The internet of things (IoT) and industrial internet in the United States, Industrie 4.0 in Germany, and 物联网 (wù lián wăng) in China all promise to use low-cost sensors and big data analytics to dramatically improve productivity and usher in a new age of data-driven manufacturing. But the promise has not been realised, for a number of reasons. Core to the failure has been the lack of data sharing, and this has been the case across all industries trying to utilise IoT technologies, including aviation, agriculture, and utilities. The problem, as we have already highlighted, is that there is no incentive to share data because it is seen as a competitive advantage to be protected. Current data infrastructure is coarse: data is either hoarded and valuable, or shared with limited commercial viability. IoT marketplaces begin to offer new business models for the monetisation of machine data. The IOTA data marketplace, Streamr, Datum and Databroker DAO are all examples of these marketplaces emerging to enable the sharing of sensor and machine data.

AI Data Marketplaces

Just like IoT data, or any data for that matter, data for AI algorithms tends to be accumulated by the largest companies. Society is becoming reliant on data, and as it is applied to AI algorithms, we are facing a situation in which a select group of organisations amass vast datasets and build unassailable AI capabilities. With the emergence of deep learning as the most useful machine learning technique for a range of AI applications like computer vision and natural language processing, data has become like digital oil. Digital monopolies like Facebook, Google and Amazon today get data from users for free. Every like, search and purchase feeds the learning system to further improve the algorithms, in turn bringing more customers and engagement. In value chain terms, data is supply, and AI algorithms are demand. Digital monopolies are searching everywhere for more and more data to feed their algorithms: Facebook buying WhatsApp and Instagram, Google with self-driving cars and Google Home, and Amazon with Alexa Echos and Dots.

“Traditionally proprietary data and technology have been significant defensibility mechanisms for companies. In the blockchain industry this is all open source, leading to an incredibly rapid innovation cycle, but also shifting defensibility more towards the sheer size of the community and thus the distribution power. This is an industry where increasingly users decide what technologies they want to use.” — Teemu Paivinen, Founder, Equilibrium Labs & Author of Thin Protocols

Decentralised AI data marketplaces will reduce, and eventually remove, the competitive advantage of hoarding private data by enabling anybody to monetise data. Again in value chain terms, these marketplaces increase supply. An AI data marketplace will make it easy for people, and increasingly for agents and bots, to recommend, price and therefore find value in different types of data. A market for data will lead to a more efficient allocation of data than giving it away for free or not using it at all. As more machines, individuals and organisations upload data to sell on a data marketplace, it becomes more attractive to data buyers. As this data commons grows with more datasets, it will attract more data buyers, creating powerful network effects. More than anything, decentralised AI data marketplaces are a bulwark against the rapacious AI data monopolies that have the potential to become the most powerful organisations ever built (if they aren’t already), controlling ever-increasing numbers of industries and markets with their superior AI capabilities. It is for this reason that we invested in Ocean Protocol, whose mission is “to unlock data, for more equitable outcomes for users of data, using a thoughtful application of both technology and governance.”

“The aim of Ocean Protocol is to equalize the opportunity to access data, so that a much broader range of AI practitioners can create value from it, and in turn spread the power of data. To respect privacy needs, we must include privacy-preserving compute. Our practical goal is to deploy a tokenised ecosystem that incentivizes making AI data & services available. This network can be used as a foundational substrate to power a new ecosystem of data marketplaces, and more broadly, data sharing for the public good.” — Trent McConaghy, Co-Founder, BigchainDB & Ocean Protocol

We also expect to see these marketplaces become ever more automated and efficient. Another of our portfolio companies, Fetch, is building a solution that uses decentralised machine learning to enable marketplaces to self-evolve around popular or valuable datasets, improving discoverability. In some senses, Fetch is embedding marketplaces directly into the ledger to truly enable the machine economy.

Personal Data Marketplaces

After peer-to-peer payments, control of personal data has been one of the most talked-about applications for blockchains. This is related to, but separate from, self-sovereign identity and SSI networks like Sovrin, in the sense that once an individual controls their own identity, they can choose who has access to it. The same principle can be applied to other personal data. This choice puts the individual in the position of the seller, and the party who wants access to the data in that of the buyer. Personal data is an economic asset that we currently give up in return for services. Some data is handed over consciously, like entering an email address or a telephone number; other data is captured without us knowing about it: likes, tweets, our online behaviour and other forms of digital data exhaust. The value comes (albeit much less understood by individuals) when different datasets are aggregated and an individual psycho-demographic profile is created and sold to all sorts of organisations like insurers, market researchers, and political organisations. A multi-billion dollar data industry exists just to trade personal data.

Individual pieces of personal data are not particularly valuable on their own. According to the Financial Times, general information such as age, gender or location is worth just $0.0005 per person, while buyers have to fork out 26 cents per person for lists of people with specific health conditions. Genomic data would likely fetch much more. The challenge is that at an individual level there is very little economic value; value comes in aggregate. This is where blockchains, self-sovereign identity, and personal data wallets combine.
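
To make the aggregation point concrete, here is a quick back-of-the-envelope calculation using the Financial Times figures above; the 10-million-person dataset size is an illustrative assumption:

```python
# Back-of-the-envelope on the FT figures above: individual data points are
# near-worthless, but aggregated datasets command real prices. The
# 10-million-person dataset size is an illustrative assumption.
GENERAL_INFO_PER_PERSON = 0.0005  # USD: age, gender, location
HEALTH_LIST_PER_PERSON = 0.26     # USD: specific health conditions

people = 10_000_000

print(f"General info, one person: ${GENERAL_INFO_PER_PERSON:.4f}")
print(f"General info, 10M people: ${GENERAL_INFO_PER_PERSON * people:,.0f}")  # $5,000
print(f"Health lists, 10M people: ${HEALTH_LIST_PER_PERSON * people:,.0f}")   # $2,600,000
```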

In today’s Web 2.0 paradigm, Google, Facebook and other data monopolists capture the profit. In the future, blockchain infrastructure, self-sovereign identity and personal data marketplaces will empower individuals. They can choose to allow Google and Facebook to use their data, or they can auction it off to get the best price. They might decide to sell only general information, but not their genomic data. Others will rent access to genomic data to cancer research charities but not to insurers. New business models will emerge, such as buyers giving sellers discounts for aggregated family data, and new startups will differentiate on consumer trust. Metâme is a UK-based startup working on a universal unit of trade that enables bundles of personal data to be packaged and exchanged. A data marketplace is not necessarily about making the most money; it is about giving individuals choice and control over how they invest their most valuable economic asset.

“Self-sovereign personal data marketplaces need to address two key hurdles before they can take off: 1) the need for a universal unit of trade that transforms personal data into assets which people can tangibly trade and own, 2) ensuring anonymity and then incentivising consented identifiability as new legislation like GDPR effectively calls for anonymity by default. Without solutions to these problems personal data marketplaces cannot scale sustainably.” — Dele Atanda, CEO, Metâme Labs

Digital Assets Marketplaces

The final category of marketplace we expect to evolve is digital asset marketplaces. Unlike traditional physical assets or money, distributed ledger-based crypto-tokens can be programmable. This gives them more flexibility and variety than their physical counterparts. Cryptocurrencies, that is, tokens designed to be a medium of exchange, are already reasonably well defined, and projects are innovating on the optimal token design for that use, tweaking rules around supply, distribution, privacy, and other attributes. Strictly, the term ‘cryptocurrency’ implies the crypto-token is a medium of exchange, yet most tokens are incorrectly referred to as cryptocurrencies. This is because Bitcoin began life as a cryptocurrency and has, over the last ten years, become more of a crypto-asset, predominantly because of its programmed deflationary economics (Layer 2 solutions like the Lightning Network may change this classification, however). But currencies and assets require different economic designs: currencies need high velocity; assets need to retain and ideally increase value, resulting in low velocity.
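
The velocity distinction can be made concrete with the equation of exchange (MV = PQ) that cryptoeconomists commonly adapt to token networks; a minimal sketch, with purely illustrative numbers:

```python
# Equation of exchange adapted to token networks: M * V = P * Q, so
# velocity V = (annual on-chain transaction volume) / (network value).
# All figures below are illustrative assumptions, not real market data.

def token_velocity(annual_txn_volume_usd: float, network_value_usd: float) -> float:
    """How many times the average token changes hands per year."""
    return annual_txn_volume_usd / network_value_usd

# Medium-of-exchange token: heavy spending relative to its market cap.
currency_v = token_velocity(annual_txn_volume_usd=50e9, network_value_usd=5e9)

# Store-of-value token: holders hoard, so little of the cap turns over.
asset_v = token_velocity(annual_txn_volume_usd=2e9, network_value_usd=20e9)

print(f"currency-style velocity: {currency_v:.1f}")  # 10.0 (high V)
print(f"asset-style velocity:    {asset_v:.1f}")     # 0.1 (low V)
```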

Broader than cryptocurrencies, digital assets will come to include all digital assets that use distributed ledgers to create scarcity. Today there isn’t a clear distinction between cryptocurrencies and crypto-assets, but as the market matures it will become more evident which tokens are designed to be a medium of exchange and which are designed to be a store of value. It is challenging to be both. Ether, for example, is intended to be used as a medium of exchange to redeem decentralised services from applications. But as its price rises, it becomes more of a store of value and less of a medium of exchange, as holders refrain from redeeming Ether in anticipation of value appreciation. A further, non-fungible subclass of crypto-assets will be designed to be collectables, deriving value through exclusivity and proof-of-ownership. Tooling for this is already emerging with ERC 721 NFTs.
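
To see the fungible/non-fungible distinction in data-structure terms, a minimal sketch follows; it is deliberately far simpler than the actual ERC 721 interface, and the token IDs and owners are made up:

```python
# Minimal sketch of fungible vs non-fungible ledgers. A fungible ledger
# tracks interchangeable amounts; a non-fungible ledger tracks which
# specific, unique token each owner holds. Far simpler than the real
# ERC 721 interface; IDs and owners are made up.

fungible_balances = {"alice": 100, "bob": 50}  # any 10 units are identical

nft_owners = {
    1337: "alice",  # each token ID is one-of-a-kind, e.g. a crypto-collectable
    1338: "bob",
}

def transfer_nft(token_id: int, frm: str, to: str) -> None:
    assert nft_owners[token_id] == frm, "only the current owner can transfer"
    nft_owners[token_id] = to  # exactly one owner per token: proof-of-ownership

transfer_nft(1337, "alice", "bob")  # the specific collectable moves, not an amount
```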

We expect to see a whole new ecosystem of digital assets: in-game weapons or costumes for gaming; AI bots and virtual avatar templates, such as those provided by SEED; virtual reality land, as in Decentraland; objects with real-world counterparts, like digital twins from Spherity; and even digital-to-physical assets like 3D-printed items, many of which will be collaboratively made and collectively owned. With digital scarcity comes the ability to artificially limit supply, which until now has been almost impossible with existing digital and Internet technologies.

The possibilities are endless and we are at the very beginning of a whole new age of digital assets created, bought, licensed, rented and sold in decentralised peer-to-peer marketplaces.


This excerpt is the latest in a series from our Convergence Ecosystem vision paper, which can be read in full here. Or take a look at the previous excerpts, which have covered core themes from the paper.

Building the Internet of Blockchains

I have said it before, so I will say it again: the new data value ecosystem sees data captured by the Internet of Things, managed by blockchains, automated by artificial intelligence, and all incentivised using crypto-tokens. For a summary of the thesis, take a look at the introductory blog.

Clickbait headline aside, if we aren’t careful we are going to end up replicating the data silos we hoped blockchains and decentralised technologies would remove. This is why the transport layer of our investment thesis, the Convergence Ecosystem, is so important. For us, the transport layer includes, but is not limited to, four components: data interoperability, like Haja Networks; value interoperability, like Polkadot, AION, Cosmos and atomic swaps; transport and messaging protocols, like Telehash and Whisper; and state communication protocols, like the Lightning Network for Bitcoin, the Raiden Network for Ethereum and IOTA’s Flash Channels.

The technologies of this layer are less mature than those of the layers below, but will become ever more critical as blockchains and DLTs proliferate, if we are to avoid the same data silos that exist today in the Web 2.0 era. It is at this layer that interoperability protocols are developing for messaging, value, data and state, and we are beginning to see the contours of a so-called ‘Internet of blockchains’. In the full paper we explore each of the interoperability protocols; this blog is an extract covering value and data interoperability.

Value Interoperability

Value interoperability across multiple blockchains refers to the ability of digital assets on one blockchain to interact with assets on another. The most straightforward example of an interoperable transaction is one in which an individual transfers a cryptocurrency on one blockchain in exchange for cryptocurrency on another, for example, Bitcoin exchanged for Litecoin or XRP. Interoperability matters because it enables multiple ledgers to compound the benefits offered by each. By limiting the flow of value to a single ledger, one risks creating new “decentralised” DLT-based silos that cannot interact with each other at scale. By enabling ledgers to interact with one another through a communication protocol layer, improvements in security, speed, and transaction cost can be attained.

There are multiple approaches to obtaining interoperability, each with a focus on a specific function. One of the simplest forms is a relayer. These utilities check for transactions on one chain and “relay” that information to another. BTC Relay, for instance, allows Ethereum smart contracts to verify a Bitcoin transaction without any intermediary, enabling Ethereum DApps and smart contracts to accept Bitcoin payments. A new generation of cross-chain transaction enablers allows exchanges to occur without a centralised party. Atomic cross-chain swaps use hash time-locked contracts (HTLCs) to enable two parties to exchange tokens from different ledgers without the need for an intermediary.
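
To illustrate the mechanism, here is a minimal, non-production sketch of a hash time-locked contract and the two-legged swap built from it; the parties, amounts and timeouts are illustrative assumptions:

```python
import hashlib
import time

# Minimal sketch of a hash time-locked contract (HTLC), the building block
# of atomic swaps. Real HTLCs are scripts/smart contracts enforced by each
# chain; this class only demonstrates the core claim/refund logic.

class HTLC:
    def __init__(self, hashlock: bytes, sender: str, recipient: str, timeout: float):
        self.hashlock = hashlock    # sha256 of a secret chosen by the initiator
        self.sender = sender        # party locking the funds
        self.recipient = recipient  # party who may claim with the secret
        self.timeout = timeout      # unix time after which the sender can refund
        self.claimed = False

    def claim(self, preimage: bytes, caller: str) -> bool:
        """Recipient claims the funds by revealing the secret before timeout."""
        ok = (caller == self.recipient and not self.claimed
              and time.time() < self.timeout
              and hashlib.sha256(preimage).digest() == self.hashlock)
        self.claimed = self.claimed or ok
        return ok

    def refund(self, caller: str) -> bool:
        """Sender reclaims the funds if the secret was never revealed in time."""
        return caller == self.sender and not self.claimed and time.time() >= self.timeout

# Atomic swap: both legs are locked under the *same* hashlock, with the
# initiator's leg given the longer timeout.
secret = b"alice's secret"
lock = hashlib.sha256(secret).digest()
btc_leg = HTLC(lock, sender="alice", recipient="bob", timeout=time.time() + 7200)
ltc_leg = HTLC(lock, sender="bob", recipient="alice", timeout=time.time() + 3600)

assert ltc_leg.claim(secret, "alice")  # Alice claims LTC, revealing the secret...
assert btc_leg.claim(secret, "bob")    # ...which Bob reuses to claim the BTC.
```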

Atomic cross-chain swaps will be crucial in creating a new generation of decentralised exchanges; Cosmos, Polkadot and Komodo are a handful of projects with an explicit focus on the space. Interoperability protocols also often enhance privacy through zero-knowledge proofs, which enable verifying the accuracy of a computation without knowing the variables involved. By sending a transaction across multiple ledgers, tracking its source and recipient can be made drastically more difficult. One could also consider decentralised exchanges such as EtherDelta as interoperability enablers. Although restricted to ERC20 tokens, they allow individuals to trade one token for another without relying on a central authority. One could trade Storj tokens, received as payment for leasing out spare storage space, for INS tokens that give discounts at a retail outlet, without moving coins out of their wallet, with the help of the likes of 0x and Kyber. While decentralised exchanges come with new challenges — especially liquidity — they offer the promise of significant security improvements over centralised exchanges.

Value interoperability will allow value stored in siloed blockchains to break free. This applies equally to value stored in public and private blockchains. NEO is already enabling cross-chain asset agreements with NeoX. Users will not need to set up wallets for every blockchain they want to use, or rely on third parties every time they interact on a different chain. Interoperability protocols further add value to the Convergence Ecosystem by allowing multiple industry-oriented tokens to communicate with each other. For instance, one could make payments in MIOTA for leasing IoT-based sensors that pass on data using the Ocean Protocol OCN token. Similar protocols could be used to connect and incentivise functions in mobility and robotics. A machine can pay for access to a resource in the native token of one ledger and receive the resource itself through another ledger. As projects and protocols start delivering real-world utility at scale, the need for exchange infrastructure will increase. One could compare these protocols to hubs that route value without an intermediary.

In a world of seamless value interoperability, one can expect a complex interplay between users holding tokens for particular service utility and others holding them as a store of value; the wallet or ‘portfolio’ balance will likely be optimised by a personal AI. This AI will be personalised by risk appetite, values and services, the weighting of which will lead to a new field of TPO (token portfolio optimisation), an extension of search engine optimisation (SEO) and social media optimisation (SMO). If purchasing and holding tokens is a reflection of one’s values, it’s interesting to think that token portfolios could become a new sort of social or political badge.
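
Purely as an illustration of what such a personal AI might optimise, a toy sketch of a ‘TPO’ weighting; the tokens, scores and the scoring model itself are entirely hypothetical:

```python
# Toy sketch of 'token portfolio optimisation' (TPO): weight holdings by a
# blend of service utility (tokens you actually redeem) and store-of-value
# conviction, scaled by personal risk appetite. The tokens, scores and the
# scoring model itself are entirely hypothetical.

def tpo_weights(tokens: dict, risk_appetite: float) -> dict:
    """risk_appetite in [0, 1]: 0 = utility-only wallet, 1 = speculation-heavy."""
    raw = {
        name: (1 - risk_appetite) * s["utility"] + risk_appetite * s["store_of_value"]
        for name, s in tokens.items()
    }
    total = sum(raw.values())
    return {name: round(score / total, 3) for name, score in raw.items()}

portfolio = {
    "payments-token": {"utility": 0.9, "store_of_value": 0.2},
    "storage-token":  {"utility": 0.6, "store_of_value": 0.3},
    "apex-token":     {"utility": 0.1, "store_of_value": 0.9},
}

print(tpo_weights(portfolio, risk_appetite=0.3))  # a utility-tilted wallet
```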

Data Interoperability (Off-chain)

Today, incredible amounts of data are stored on the private servers of a relatively small number of organisations. The internet’s client-server architecture makes data-sharing inconvenient, while privacy and data protection laws limit the cases where it can be done legally. Even if this were not the case, there is no rational economic incentive for individuals to do anything other than give away their data. While strides are being made towards increased data accessibility, such as open Application Programming Interfaces (APIs) and open-data regulations like PSD2, the benefits are one-sided. Users can now benefit from open data, but there is still no market, and data contributors remain largely unpaid. So, are blockchains the solution?

Blockchains are not databases; they are ledgers. It sounds almost flippant to say that, but the distinction is essential to understanding why data interoperability is just as important as value interoperability. Value interoperability means tokens can be moved across chains; data interoperability allows data to move across databases. Blockchains must be lightweight, with limited on-chain storage, so that “anyone” can download a full history of the blockchain. If blockchains become too large, fewer people will be able to participate in the network, reducing its decentralisation and overall security. When it is said that “blockchains will enable large datasets to be shared or stored”, it is not actually the blockchain where the data itself will be stored; we are talking about decentralised and distributed data stores like IPFS and Swarm. Each blockchain implementation uses different storage for “off-chain” data, and the balance between “on-chain” and “off-chain” data depends on the use case requirements. Just like the design of the Internet and the internet protocol suite, we expect blockchains to remain as light as possible to ensure speed and reliability; it will be the “off-chain” storage that holds the majority of the data.
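
In practice, this on-chain/off-chain split usually means anchoring a small content hash on the ledger while the payload lives in a store like IPFS or Swarm. A minimal sketch, with a plain dictionary standing in for the off-chain store:

```python
import hashlib

# Minimal sketch of the on-chain/off-chain split: the ledger stores only a
# small content hash; the bulk payload lives in an off-chain store such as
# IPFS or Swarm. A plain dict stands in for that store here.

off_chain_store = {}   # stand-in for IPFS/Swarm: content hash -> data
on_chain_ledger = []   # stand-in for the blockchain: light anchor records

def publish(data: bytes, owner: str) -> str:
    cid = hashlib.sha256(data).hexdigest()                # content-addressed ID
    off_chain_store[cid] = data                           # heavy data off-chain
    on_chain_ledger.append({"owner": owner, "cid": cid})  # light anchor on-chain
    return cid

def fetch_and_verify(cid: str) -> bytes:
    data = off_chain_store[cid]
    # The on-chain hash makes off-chain tampering detectable.
    assert hashlib.sha256(data).hexdigest() == cid, "off-chain data was tampered with"
    return data

cid = publish(b"gigabytes of IoT sensor readings...", owner="alice")
assert fetch_and_verify(cid) == b"gigabytes of IoT sensor readings..."
```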

But what we must avoid is a world in which value is interoperable but the underlying data is not, leading to the same monopolistic market dynamics we have today. Projects like Haja Networks are vital in enabling data sharing throughout the ecosystem. We need protocols that permit data to be shared seamlessly across both centralised and decentralised databases. Innovations in cryptography such as zero-knowledge proofs, differential privacy, fully homomorphic encryption (FHE), and secure multi-party computation (MPC) will enable data to remain private and secure while still moving through public networks. Without data interoperability, the Convergence Ecosystem does not work.
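
As one concrete example of these techniques, differential privacy’s Laplace mechanism adds calibrated noise to an aggregate so it can be shared without exposing any individual record. A minimal sketch, with illustrative data:

```python
import math
import random

# Minimal sketch of differential privacy's Laplace mechanism: publish a
# noisy count so that no single individual's record can be inferred.
# epsilon is the privacy budget: smaller epsilon = more noise = more privacy.

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records: list, epsilon: float) -> float:
    """Differentially private count; the sensitivity of a count query is 1."""
    return sum(records) + laplace_noise(scale=1.0 / epsilon)

# 1,000 hypothetical users: did each visit a given location this week?
data = [random.random() < 0.3 for _ in range(1000)]
print(private_count(data, epsilon=0.5))  # roughly 300, plus calibrated noise
```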

Only when both value and data can be shared securely can marketplaces be built that will drive the Convergence Ecosystem. Tune in next week for more on the importance of data marketplaces to the future Web 3.0 vision.


A Real Use Case for Blockchains: A Global Data Commons

This week I am continuing our series of excerpts from our Convergence Ecosystem vision paper, which can be read here. The previous instalments can be found here.

Today, I want to talk about a use case that only blockchain technology can deliver. Such use cases are few and far between, and usually revolve around applications that demand censorship-resistance. Indeed, a public data commons demands just that.

The Convergence Ecosystem will lead to a global data commons. It’s not inevitable, but blockchains and distributed ledgers are disruptive technologies that change the structure of the data value chain. Yes, I know that word is overused, but in this case it is true: the point of value capture in the value chain will change. Instead of web companies capturing value and profit by controlling data, data could be stored on decentralised file systems and blockchains, making it accessible to all, not just the select few platforms that collected it.

In the Convergence Ecosystem, a few different technologies interconnect to form an authentication, validation and security layer: blockchains and distributed ledgers, decentralised consensus protocols, self-sovereign identity and reputation, and decentralised storage and data integrity. We believe developments in these four areas are contributing to the creation of a data commons. These decentralised technologies are critical to the creation of a true global public utility. A public utility must be citizen-owned and must not be controllable by a single entity, whether government or corporation. The transparency, usage provenance, tamper-evidence and censorship resistance of blockchain technology are perfect for a global public utility.

Public blockchains are the ideal foundation for a data commons

Public blockchains are in many ways worse than existing databases. They are slower, have less storage, use more energy, and are less private. Sure, sharding, proof-of-stake and other proof-of-x schemes, and privacy-protecting tools like multi-party computation and zk-SNARKs are attempting to address some of these issues. But the key thing to remember is that the original Bitcoin blockchain was designed specifically as a peer-to-peer digital cash system, and it is perfectly designed for that use case. The design choices were made to maximise one feature: censorship resistance. Public blockchains aren’t owned or managed by one government or company that can choose who views or uses them. This is what crypto people mean when they say blockchains cut out the middleman (although the so-called middleman will almost certainly integrate at another point in the value chain, so it’s more accurate to say blockchains change where the middlemen make their money). Governments have traditionally had the ability to censor information and communication; today, Silicon Valley tech monopolies do so on a global scale. Twitter, Facebook and Google have all come under fire recently because of their decisions to limit freedom of speech. If you control a network, you pick and choose who uses it. This is too much power for a single entity.

We now have the tools to ensure no single entity controls data. With all communications, money, and health becoming digital, data infrastructure will be too valuable to be controlled by one nation or company. In fact, for individuals and society more broadly, global data infrastructure, just like the Internet, should be a public good. Never has so much data been available for collection and analysis, and everyone wants it. As sensors are embedded in everyday objects and we move to a world of ubiquitous computing, everybody is fighting over who ‘owns’ the data. This is yesterday’s war. Public blockchains offer an open-source, decentralised, shared database that anyone can view and interact with based on programmable rules.

We are seeing the emergence of this new data infrastructure. We aren’t there yet: we still need to process more transactions at faster speeds while using less energy. Data needs to be private, stored in an accessible way, and shared across different blockchain flavours. We also need a way for individuals, organisations and machines to buy and sell data in a marketplace. The storage of, and access to, data is important, but it will be the data marketplaces that finally provide a business model for data creators. There will finally be a way for people and machines to make the most of the data they collect. A marketplace provides an economic incentive for the more efficient allocation of data: individuals can sell it instead of giving it away for free; organisations can monetise it instead of letting it sit idle in databases; and machines can buy and sell data automatically to increase their utility. In my view, a peer-to-peer marketplace for data is the second most important idea to come from the blockchain industry, after peer-to-peer electronic cash.

A data commons gives control back to users and limits monopoly control of the most valuable resource in the digital economy

2018 will see the beginnings of this global data sharing and monetisation network. Data creators will begin to earn money from uploads, likes and retweets. This is a far more profound change than it may seem. Disruption has typically come from startups offering seemingly inferior products that serve a niche underserved by the incumbent. Blockchain-based networks won’t just disrupt particular companies; they go much further and disrupt a digital norm: the assumption that we should give away personal data for free. Digital monopolies including Facebook, Google and Amazon get data from users for free. Every like, search and purchase feeds the learning system to further improve the algorithms, in turn bringing more customers and engagement. In value chain terms, data is supply, and AI algorithms are demand. Digital monopolies are searching everywhere for more and more data to feed their algorithms: Facebook buying WhatsApp and Instagram, Google with self-driving cars and Google Home, and Amazon with Alexa Echos and Dots.

Blockchains and decentralised public infrastructure change the game. They reduce the value of hoards of private data, making proprietary datasets much less valuable, because as more machines, individuals and organisations use a public data infrastructure, a global data commons becomes more attractive to data sellers. As this data commons grows with more datasets, it will attract more data buyers, creating powerful network effects. In other words, data becomes more of a commodity and is no longer the source of value in and of itself. Firms that control supply — data — no longer dominate markets. The point of value capture in the value chain will shift from data to brand and trust.

As data becomes less valuable, the customer relationship becomes ever more important. Startups and incumbents alike will compete for customers’ data based on trust. The global data commons will mean individuals choose where their data is sold or rented. It will at first attract individuals who care about privacy and self-sovereign data. Machines will soon follow, as machine operators and owners look for new revenue streams. Some organisations, especially in the public sector, will be attracted by the non-corporate-controlled nature of decentralised infrastructure, as well as the cost and liability reductions of not storing consumer data. Smaller organisations and startups will sign up to access standardised data that would otherwise take too long or cost too much to acquire. Today, data is siloed, with no business model for creators to monetise it. Blockchain technology and other decentralised infrastructure are emerging as a new data infrastructure to support machines, individuals and organisations in getting paid for the data they generate. Blockchain-based data infrastructure, including data exchanges, will commoditise data and help realise the vision of a data commons and the first real global public utility.


Building a New Data Infrastructure with Blockchains and the Internet of Things

The Convergence Ecosystem is open-source, distributed, decentralised, automated and tokenised, and we believe it is nothing less than an economic paradigm shift.

We are excited to introduce the Outlier Ventures vision of the future and our investment thesis: the Convergence Ecosystem. The new data value ecosystem sees data captured by the Internet of Things, managed by blockchains, automated by artificial intelligence, and all incentivised using crypto-tokens. For a summary of the thesis take a look at the introductory blog, and for a deeper look into blockchains, community, and crypto governance have a read of my last post here. Today, though, I want to talk specifically about the convergence of blockchains and the Internet of Things.

Complexity versus simplicity

As the graphic above shows, data enters the ecosystem at the ‘data collection’ layer through either hardware (the Internet of Things) or software (web, VR, AR, etc.). In fact, early feedback on the paper has suggested that what we are really talking about here is a new data value chain, and I agree with that to some extent. But of course, this is just a snapshot, a simplification of the emerging data value chain.

If your first thought upon reading the paper or looking at the graphic was “buzzword salad” or “this is too abstract, what are the actual products and protocols that need to be built?” well you are not alone. Indeed, thinking through the Convergence Ecosystem was a constant tension between complexity and simplification.

I felt it was more important that non-technical people understood that all these seemingly disparate technologies are connected than that I go into detail about the technical differences between Cosmos and Polkadot in addressing blockchain interoperability. This simplification can be seen at the data collection layer, where I note the Internet of Things and software as the two entry points for data. This is purposefully broad. I made another attempt which separated hardware into types of devices — mobile, wearables, IoT devices, learning robots — but ultimately the ecosystem became too complex and overwhelming for the layperson to understand. With that in mind, I decided that any sensor measuring the external environment should be bundled together under the umbrella term the ‘Internet of Things’; this includes all the sensors in smartphones and wearables, such as gyroscopes, accelerometers, and proximity sensors, as well as hundreds of other sensors measuring our world. As for software, this is broad enough to include any data created in the digital environment regardless of application — augmented reality and virtual reality worlds, our digital exhaust from online activity, and gaming are just a few examples.

The key exercise isn’t to define exactly where data will come from. The key message is that the amount of data created annually will reach 180 zettabytes (one zettabyte is equal to one trillion gigabytes) by 2025, up from 4.4 zettabytes in 2013, and that the average person will interact with connected devices every 18 seconds (nearly 4,800 times a day).

The so-called Internet of ‘Things’

If you thought the blockchain industry lacked a clear definition, the internet of so-called ‘things’ is even worse. The industry lacks a standard definition of the IoT; in its broadest sense, it will come to include every physical object that has a sensor, microcontroller and Internet connection. Today that mainly means connected home devices like Amazon Echos, wearables like the Apple Watch, industrial and agricultural connected sensors, and smart meters measuring home energy usage. But the range of applications is growing, and it has been estimated that by 2024 the automotive industry will account for almost a third of all IoT connections, followed by consumer electronics and FMCG (fast-moving consumer goods) and the utility sector. Other sectors, including smart cities, supply chain, manufacturing and healthcare, will make up a relatively small proportion of the connections. The IoT market intersects with the robotics market in the sense that a robot has the same features as an IoT device, with the addition of actuators and the means to move and respond to the environment. We would consider connected vehicles, service robots and other types of robots to be data-collecting machines.

The IoT market is often measured in the number of connections — roughly 30 billion by the end of the decade — or in economic impact — 11 trillion dollars over the next decade, says McKinsey. A less-asked question is: what happens to all the data? The same McKinsey study found we may be using as little as 1% of the data being generated. As well as under-utilising data, how data is being used is unclear: in a survey by the Ponemon Institute, 82% of respondents said IoT manufacturers had not provided any details about how their personal information is handled.

The emergence of distributed systems like IPFS, Filecoin, and other blockchains offers a potential new model for data storage and utilisation. It has long been expected that data would be fought over by device makers, software providers, cloud providers and data analytics companies. In fact, the reluctance of car makers to put Android Auto or Apple CarPlay into their cars reflects an awareness that they would lose control of valuable data.

So the key value proposition for distributed and decentralised systems in many cases isn’t actually ‘censorship resistance’ or ‘unstoppable payments’; it is a shared (but private) dataset of industry data, transactional and otherwise. We are still early in the development of the blockchain industry: we still need to prove and scale privacy technologies like zero-knowledge proofs, secure multi-party computation, and differential privacy, as well as increase the throughput of blockchains and link them robustly with off-chain databases to handle the volumes of data we expect the IoT to generate.

Very broadly speaking, decentralised technologies can provide shared data infrastructure in which data use isn’t a zero-sum game. It is no longer a case of generating data under a use-it-or-lose-it model. The stack of technologies, including blockchain-based marketplaces, enables IoT data creators — machine-owned or human-owned — to buy and sell data.

Software is eating the world, and throwing off valuable data

Adding to the tens of billions of IoT connections, we also need to add digital media and human-generated digital data. We are on our way to quantifying and digitising our external world, and we are even further along in gathering data on our digital lives. We use the term ‘software’ as a producer of data broadly, to capture all personal and business data produced through interaction with databases, operating systems, applications and APIs. These interactions build up digital dossiers, including cookie and web-browsing data as well as active traces like social media and messaging.

On the business side, as we continue to digitise and bring online our offline interactions and documents like electronic health records and learning records, key sectors will have an overwhelming amount of data to handle, which they do not have the capabilities to utilise. On the consumer side, digitally-created and digitally-augmented environments with augmented reality (AR) or virtual reality (VR) will lead the growth in available personal information.

Half the world’s population — 3.82 billion people — will have an Internet connection by the end of 2018, rising to 4.17 billion by 2020. Mobile data traffic will grow to 49 exabytes per month by 2021, a sevenfold increase over 2016, according to Cisco. We are creating unfathomable amounts of data, and the growth shows no sign of abating. Adoption of AR and VR will further increase the amount and granularity of data we can collect, enabling deeper insights into individual and collective behaviours. Whether it’s from the IoT or from software, we have a massive data problem.

IoT needs blockchains

We are creating and collecting more data than ever, but we are storing it in insecure private databases with no incentives to share the data. Data breaches and hacks are commonplace, and the data can be censored or tampered with. Software-generated data is lost, hoarded or latent. There is no reason for consumers to do anything other than to give data away for free and for corporations to hoard it.

Open-source, distributed, decentralised, automated and tokenised infrastructure offers a solution.


For more on how communities and tokens will integrate with the Internet of Things and Artificial Intelligence, read the full paper here.

The End of Scale: Blockchains, Community & Crypto Governance

We are excited to introduce the Outlier Ventures vision of the future and our investment thesis: the Convergence Ecosystem. The new data value ecosystem sees data captured by the Internet of Things, managed by blockchains, automated by artificial intelligence, and all incentivised using crypto-tokens. For a summary of the thesis, take a look at the introductory blog; today I want to take a deeper look at the important role communities, governance and politics will play in this new era.

The industry is rapidly experimenting with new (and old) consensus mechanisms and decision-making techniques to coordinate and govern the emerging token economy. This experimentation began with Bitcoin and spawned thousands of tokens each with different rules to encourage or discourage behaviours within the network and allocate resources. Tokens and automated decision-making tools allow for the mass decentralisation of entire industries through a distributed coordination network. These networks are birthing new types of resource allocation structures such as decentralised and autonomous organisations, pushing forward our conception of what an organisation should be.

Crypto Tokens

Cryptographically secure and digitally scarce tokens are the key innovation that makes a group of technologies into a living, breathing ecosystem.

Tokens are a native digital coordination mechanism for the Convergence Ecosystem. Until now we have been retrofitting a financial infrastructure designed for cash and cheques to the digital, software-defined era. Ever since the emergence of Bitcoin, it has been clear that distributed ledgers with automated consensus held the potential for new forms of asset and value exchange. It was not until the ERC20 smart contract on Ethereum that experimentation around digital and programmable money began at significant scale. There is now a mechanism to fund open-source protocols that would have previously struggled to raise financing because open-source lacked a business model. As Albert Wenger has noted: “Now, however, we have a new way of providing incentives for the creation of protocols and for governing their evolution.” In early 2018, we are still at the very beginning of this evolution.
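
To ground what ‘programmable money’ means, here is a minimal token ledger in the spirit of, but far simpler than, an ERC20 contract (the real standard also defines approvals, allowances and transfer events); the names and amounts are illustrative:

```python
# Minimal token ledger in the spirit of an ERC20 contract; the real
# standard also defines approvals, allowances and transfer events.
# Account names and amounts are illustrative.

class Token:
    def __init__(self, supply: int, creator: str):
        self.balances = {creator: supply}  # whole initial supply to the creator

    def balance_of(self, account: str) -> int:
        return self.balances.get(account, 0)

    def transfer(self, sender: str, to: str, amount: int) -> None:
        if self.balance_of(sender) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[to] = self.balance_of(to) + amount

# 'Programmable money': the supply schedule and transfer rules are code,
# so a protocol can encode whatever monetary policy its community agrees.
token = Token(supply=1_000_000, creator="protocol-treasury")
token.transfer("protocol-treasury", "contributor", 5_000)
assert token.balance_of("contributor") == 5_000
```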

Over the next year or so, we expect to see a much clearer delineation between two types of tokens: crypto-assets and cryptocurrencies.

Cryptocurrencies will be designed to be a medium of exchange, and crypto-assets will be designed to be a store of value and to offer utility in a digital economy. Although dominant token ecosystems have an element of both, design challenges abound when attempting to incentivise usage with digital scarcity. It is unclear whether single-token systems like Bitcoin and Ethereum can provide a sustainable balance; it is likely we will instead see multi-token systems as a more effective mechanism.

Experimentation is happening at a rapid pace on both the supply and demand side.

We have tokens with deflationary economics, tokens with scheduled inflation, and others that let the community vote on how and when new tokens are minted and/or burned. That is just programmable money supply; we are also experimenting with demand-side economics: variable transaction fees, demurrage charges, interoperability, and different consensus rules. Non-fungible tokens, such as CryptoKitties built on the new Ethereum ERC 721 standard, will also impact demand by incorporating historical ownership, creating a subclass of crypto-assets called crypto-collectables. In addition, a currently under-utilised token model is the crypto-consumable: a token programmed to reduce in value over time using a decay or burn function. This could be a continuous decline in value, like a used car, or a step decline, like a ticket to a live event. This sort of token would not be a store of value and would be a powerful way to increase network token velocity. Our head of cryptoeconomics, Eden Dhaliwal, is working with Imperial College London and our portfolio companies to experiment with and design sustainable token economies.
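
The two consumable shapes described above are easiest to see as functions of time; a sketch with made-up parameters:

```python
import math

# Sketch of the two crypto-consumable shapes described above, with
# made-up parameters: continuous exponential decay (like a used car)
# and a step decline (like a ticket that is worthless after showtime).

def continuous_decay(face_value: float, half_life_days: float, t_days: float) -> float:
    return face_value * math.exp(-math.log(2.0) * t_days / half_life_days)

def step_decline(face_value: float, event_day: float, t_days: float) -> float:
    return face_value if t_days < event_day else 0.0

for day in (0, 30, 90):
    print(day,
          round(continuous_decay(100.0, half_life_days=30.0, t_days=day), 2),  # 100.0, 50.0, 12.5
          step_decline(100.0, event_day=60.0, t_days=day))                     # 100.0, 100.0, 0.0
```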

Today the industry is focused on the initial distribution of these tokens in generation events. But the initial distribution is just one stage of building a sustainable ecosystem.

Token distribution schedules will become more sophisticated over time to include staged releases, like traditional equity fundraises, and mechanisms such as airdrops and token faucets. Continued network engagement will separate successful networks from unsuccessful ones. 2018 and beyond will show that much of the ICO class of 2017 was prepared for initial distributions but underprepared for sustainable growth and utility. It must be remembered that prior to 2017, tokens were distributed to the network in exchange for utility; Bitcoin distributes bitcoin as a reward for the secure clearing and settling of Bitcoin transactions. By giving away the majority of tokens upfront, many 2017 ICO projects are left with few tokens to reinvigorate demand later down the line. Most projects will fail, but the open-source nature of the ecosystem means learnings and code will be available to all. We can learn and build faster than ever. Unlike economic modelling or theory, the industry is testing economic theories in real time with real money. It is the greatest experiment in socio-economics we have ever seen.

Tokens are the first native coordination mechanism for the digital and now machine economy.

We expect tokens to be issued at each layer of the stack to incentivise behaviours within each particular network and to connect with the broader ecosystem through a series of exchanges and interoperability protocols. The model would be similar to today’s global economy, in which each nation issues and uses its own currency within its own borders and trades foreign currency with other countries for the products and services it needs. If Bitcoin is indeed the digital store of value in the same way gold is the physical store of value, it is likely we will see a digital hierarchy of money emerge, with Bitcoin as an apex token, protocol tokens like Ethereum, NEO and Cardano below Bitcoin, and utility or application tokens below the protocol tokens. As the Convergence economy develops and core infrastructure is built out, tokens will become increasingly liquid and frictionless, leading to extraordinarily complex economic dynamics.


Communities & Governance

Tokens themselves are simply a type of value instrument; the rules under which these instruments are generated, distributed and managed are decided by community members through agreed governance rules.

These governance rules are set by community members using different forms of decision-making. For protocol tokens like Bitcoin, Ethereum et al., governance includes decision-making on changes to the network. The explosion of tokens and blockchain-based networks has led to a renaissance in thinking about governance, especially decentralised governance.

The Bitcoin network has a strong libertarian value-system, valuing decentralisation above all else. There is therefore a separation of ‘powers’ between developers, miners and users; no one stakeholder group can ‘force’ a decision on the others. This results in a very slow-moving but stable network. Ethereum, while still aiming to be a decentralised network, does not have quite as strong a libertarian streak, but does have more leadership in Vitalik Buterin, who is often able to push through changes because the community follows his lead (see the summer 2016 fork to return funds lost through the DAO bug).

New projects are experimenting with automated governance in an attempt to avoid messy human decision-making.

Tezos hopes to enable governance to be ‘upgraded’ through community voting. Dfinity is doing something similar, but allowing retroactive changes to the ledger. These types of ‘on-chain’ governance, as they are known, are still technically immature and open up a whole new attack vector. Other projects like Augur and Gnosis are testing futarchy, a voting model in which the community defines a set of values and prediction markets are then used to decide which decisions will maximise those values.
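
Futarchy is often summarised as ‘vote on values, bet on beliefs’: the community fixes a welfare metric, and conditional prediction markets then choose the policy. A toy sketch, with made-up market prices:

```python
# Toy futarchy sketch: 'vote on values, bet on beliefs'. The community
# votes once to agree a welfare metric; conditional prediction markets
# then estimate that metric under each proposal, and the proposal with
# the highest market estimate is adopted. Prices below are made up.

def futarchy_decide(conditional_markets: dict) -> str:
    """conditional_markets maps proposal -> market-implied expected value
    of the agreed metric *if* that proposal were adopted."""
    return max(conditional_markets, key=conditional_markets.get)

markets = {
    "raise block size":    1.42,  # expected network-usage index if adopted
    "keep status quo":     1.31,
    "add privacy feature": 1.55,
}

assert futarchy_decide(markets) == "add privacy feature"
```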

We are also seeing exciting experiments with curation markets and reputation staking from projects like Colony and Ocean Protocol. This type of decentralised and automated model is extended further with decentralised autonomous organisations (DAOs). In these organisations, all decision-making is offloaded to smart contracts, and decisions are automated based on the rules encoded in those contracts. One of the first examples was, of course, The DAO, a DAO for venture funding that was never able to allocate capital after a bug was exploited. Other live examples include Dash, a privacy-focused cryptocurrency; Digix, a gold payment system project; and Aragon, a platform hoping to provide an entire governance service for other token projects.

The end-point of blockchain-based automation will come through AI DAOs, as articulated by Trent McConaghy. These theoretical organisations will be managed and owned by AI algorithms, enabling AI to participate in the economy by earning and spending tokens. An AI could own a fleet of self-driving vehicles, charging fares which it then uses to pay for maintenance, tolls, insurance, and taxes.

Blockchains and tokens will be issued, distributed, governed and owned in increasingly diverse ways. Governing models will evolve, and we are likely to see an industry with multiple types of governance, each co-evolving around the belief-systems of the community it serves.

Bitcoin will remain staunchly libertarian; Ethereum has more of a central leadership, which appeals to pragmatic developers; and self-sovereign identity underpins the value-system of the Sovrin Foundation blockchain. We will soon see more projects with social democratic values that prioritise wealth redistribution through ‘network’ (read: state) intervention or pre-agreed taxation rules. Others will prioritise ethical and environmental values, with green-friendly policies that use non-consumption-based consensus mechanisms (e.g. Chia) and focus on common ownership and resource sharing.

“The Convergence Ecosystem should support a diverse range of different governing models that support different communities. There is no optimal model of governance; only a perpetual tension to maintain alignment amongst stakeholders.”

We have millennia of literature exploring politics and governance, everything from Plato’s five regimes to John Locke’s libertarianism to Jeremy Bentham’s utilitarianism. Philosophers and political scientists will never settle on an ‘optimal’ governance model, because ‘optimal’ can only exist for individuals in limited contexts, never for society at large.

As with almost all information and communications technologies that have come before, blockchain technology was born decentralised.

Bitcoin, the first blockchain implementation, was a libertarian movement created as a direct reaction to a centralised financial system, and early adopters shared this value-system. As more blockchains and tokens are created, the industry attracts audiences with different belief-systems. As it continues to mature, different communities will have unique objectives and priorities that require specific design trade-offs. The financial community requires more and faster transactions and will sacrifice decentralised consensus to achieve that, as can be seen with Ripple and its XRP token. The healthcare community must adhere to privacy regulations and so will require more privacy than public blockchains currently afford. The ecosystem will support a variety of communities using different governance models, with differing levels of decentralisation and automation depending on the values of the community and the needs of the market.

We are in the very early stages of understanding how to design token economies and the governance models that support them.

As an industry, we must be more supportive of new ideas and implementations. It is not a zero-sum game in a growing market. Some tokens, communities and governance experiments will fail. Let’s learn quickly from their failures and compound learnings.

The biggest advantage the decentralisation community has is momentum: the brightest minds from around the world are working together to solve tough problems. Communities will co-exist and thrive. Let’s be inclusive and supportive.


For more on how crypto-communities and crypto-tokens will integrate with the Internet of Things and Artificial Intelligence, read the full paper here.

VC for The Decentralised Future: Introducing the Convergence Ecosystem

Today we are introducing the Outlier Ventures vision of the future and refined thesis: The Convergence Ecosystem.

The Ecosystem sees data captured by the Internet of Things, managed by blockchains, automated by artificial intelligence, and all incentivised using crypto-tokens. The Convergence Ecosystem is open-source, distributed, decentralised, automated and tokenised, and we believe it is nothing less than an economic paradigm shift.

How We Got Here: The Outlier Journey

From Blockchain-enabled Convergence

In late 2016, we published a paper titled ‘Blockchain-enabled Convergence’, outlining our investment strategy. The paper was the result of over three years’ experience researching, investing in and building blockchain-based businesses. Our insight was that blockchains are not just a secure ledger for cryptocurrencies and other digital assets; they represent something more transformative: a decentralised data infrastructure that could solve technical and market problems across a variety of emerging technologies like artificial intelligence, autonomous robotics, the Internet of Things, 3D printing, and augmented and virtual reality.

In 2017, crypto-tokens proved they are the first digitally-native mass coordination mechanism

2017 saw a vast change in the cryptocurrency and blockchain markets, rising to arguably the peak of inflated expectations on the Gartner Hype Cycle. The ERC20 smart contract industrialised the token sale crowdfunding model, raising over 4 billion dollars in funding. Despite misplaced energy and too much focus on token prices, it is now clear, in a way that wasn’t in late 2016, that crypto-tokens are a critical missing component of decentralised networks — the first digitally-native mass coordination mechanism for humans, bots and machines. Recognising the underlying importance of crypto-tokens in creating an ecosystem of converging technologies, we started investing.

From IOTA, Botanic & SEED, Evernym & Sovrin, to Fetch and Ocean

Over the last year we have partnered with and invested in IOTA, a foundation building Internet of Things infrastructure with a new type of decentralised data structure; Botanic, and the SEED Vault foundation it founded, creating a platform for developers to publish trusted software bots; Evernym, a company using the Sovrin Network and Protocol to establish self-sovereign identity; Fetch, a startup building an emergent intelligence protocol combining distributed ledgers with machine learning; and most recently Ocean Protocol, which is developing a decentralised data exchange protocol to unlock data for AI. Each of these investments was strategically chosen because each is a complementary piece of the decentralised infrastructure required to create the Convergence Ecosystem.

Why We Need The Convergence Ecosystem

Centralised Web 2.0 has failed…

Centralised Web 2.0 digital infrastructure has failed. Too many hacks and data leaks. No individual privacy. Monopoly control over global information and communication networks. The Internet of Things is creating an unmanageable data environment, and artificial intelligence is giving those who control the most data more power than any company in history. As Tim Berners-Lee, the creator of the Web, recently wrote:

“What’s more, the fact that power is concentrated among so few companies has made it possible to weaponise the web at scale. In recent years, we’ve seen conspiracy theories trend on social media platforms, fake Twitter and Facebook accounts stoke social tensions, external actors interfere in elections, and criminals steal troves of personal data.”

Something must change.

We are 10 years into the decentralisation revolution

It has been ten years since the publication of Satoshi’s seminal paper and the introduction of the first viable decentralised solution to the problem of double-spend in digital networks. Bitcoin sparked interest and innovation in other cryptographic and decentralised technologies including blockchains and crypto-tokens. We are in a rapid period of experimentation around decentralised technologies including consensus mechanisms, identity, data structures, crypto-economic designs and smart contracts. Taken together, we see the foundations of a new data infrastructure.

Our Vision: The Convergence Ecosystem

Introducing the Convergence Ecosystem

We believe that future decentralised data infrastructure will come from the convergence of the Internet of Things (data production), blockchains (data distribution), and artificial intelligence (data consumption). The integration of these technologies will see markets become increasingly open-source, distributed, decentralised, automated, and tokenised.

The Convergence Ecosystem consists of four parts: governance, production, distribution and consumption. Each of these is explored in the paper, and we will be publishing further analysis of each part throughout the year.

Governance — How are protocols and communities governed and incentivised?

  • Data flow through the ecosystem is coordinated and incentivised using crypto-assets, crypto-currencies and crypto-consumables designed to incentivise behaviours, for people, machines, devices and agents, that benefit the overall ecosystem. These new asset types will see continued rapid experimentation with supply and demand policies, including fungibility mechanisms such as Ethereum’s ERC-721 non-fungible tokens (see the sketch after this list).
  • Emergent governance models will have differing levels of decentralisation and automation depending on the values of the community. Some will value censorship-resistance, others self-sovereign identity. Some new decentralised projects will be guided by social-democratic values that prioritise wealth redistribution through ‘network’ (read: state) intervention or pre-agreed taxation rules. Others will prioritise ethical and environmental values, with green-friendly policies that use non-consumption-based consensus mechanisms (e.g. Chia) and focus on common ownership and resource sharing. Communities will continue to experiment with traditional governance models like corporations as well as newer structures like decentralised organisations and decentralised autonomous organisations (DAOs).
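
To make that fungibility distinction concrete, here is a minimal sketch in plain Python (all class and variable names are hypothetical, not a real ERC implementation): an ERC-20-style token tracks interchangeable balances, while an ERC-721-style token tracks ownership of unique IDs.

```python
# Toy illustration of fungible vs non-fungible tokens.
# Hypothetical names only; not a real ERC-20/ERC-721 implementation.

class FungibleToken:
    """ERC-20 style: balances are interchangeable quantities."""
    def __init__(self):
        self.balances = {}  # address -> amount

    def transfer(self, sender, receiver, amount):
        assert self.balances.get(sender, 0) >= amount, "insufficient balance"
        self.balances[sender] = self.balances.get(sender, 0) - amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

class NonFungibleToken:
    """ERC-721 style: each token ID is unique and individually owned."""
    def __init__(self):
        self.owner_of = {}  # token_id -> address

    def transfer(self, sender, receiver, token_id):
        assert self.owner_of.get(token_id) == sender, "not the owner"
        self.owner_of[token_id] = receiver

data_credit = FungibleToken()
data_credit.balances["alice"] = 100
data_credit.transfer("alice", "bob", 40)      # any 40 units are the same

kitty = NonFungibleToken()
kitty.owner_of["kitty-1337"] = "alice"
kitty.transfer("alice", "bob", "kitty-1337")  # this specific asset moves
```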

Production — How is data produced?

  • Data is brought into the ecosystem by either hardware connected to the Internet of Things or software such as digital, virtual or augmented spaces.
  • We are creating and collecting more data than ever, but we are storing it in insecure private databases with no incentives to share the data. Data breaches and hacks are commonplace, and the data can be censored or tampered with. Software-generated data is lost, hoarded or latent. There is no reason for consumers to do anything other than to give data away for free and for corporations to hoard it. Decentralised infrastructure offers a solution.

Distribution — How is data authenticated, validated, secured and stored? How is it transported across databases and blockchains, and how is it exchanged?

  • Once data is in the ecosystem, it needs to be authenticated, validated and secured. This is where blockchains, or more specifically distributed ledgers, consensus mechanisms, self-sovereign identity and reputation, and decentralised storage and data-integrity solutions are valuable tools (see the sketch after this list).
  • Using new data distribution protocols for transport and messaging, state communication, and value and data interoperability, data can be moved efficiently from storage across networks and protocols to marketplaces.
  • Marketplaces are already developing beyond cryptocurrencies to support the buying and selling of many other data types, including Internet of Things data, artificial intelligence data, personal data, and a range of newly emerging digital assets, including but not limited to crypto-collectables such as CryptoKitties.
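
As a rough illustration of the authentication and integrity step, the sketch below (plain Python; the ledger is just a list and all names are hypothetical) shows a publisher anchoring a dataset fingerprint that a buyer can later verify:

```python
import hashlib
import json

def fingerprint(dataset: bytes) -> str:
    """Content hash that acts as a tamper-evident fingerprint."""
    return hashlib.sha256(dataset).hexdigest()

# Publisher registers the fingerprint (in practice, on a distributed ledger).
ledger = []  # append-only list standing in for a ledger
reading = json.dumps({"sensor": "turbine-7", "temp_c": 81.4}).encode()
ledger.append({"publisher": "acme-iot", "hash": fingerprint(reading)})

# Buyer receives the data out-of-band and verifies it against the ledger.
received = reading  # try mutating this to see verification fail
assert fingerprint(received) == ledger[-1]["hash"], "data was tampered with"
print("data verified against on-ledger fingerprint")
```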

Consumption — How is data turned into insight?

  • Finally, data is processed, analysed and automated using a range of technologies including distributed computation, decentralised machine learning and smart contracts (one such technique is sketched after this list).
  • This is where data is transformed into insight and action using traditional and distributed computing techniques, as well as newer types of computing such as quantum computing. It is at this layer that blockchains and artificial intelligence blur, and it becomes clear they are intertwined and interconnected. Both smart contracts and machine learning offer differing levels of automation and decentralisation depending on the type of input data and the level of trust the use case demands.
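
One concrete flavour of decentralised machine learning is federated averaging: nodes train on their own data and share only model parameters, never the raw data. A minimal sketch, assuming a toy one-parameter linear model and made-up data:

```python
# Toy federated averaging: each node fits a one-parameter linear model
# y = w * x on its private data, then only the weights are shared.

def local_fit(points):
    """Least-squares slope through the origin on one node's private data."""
    num = sum(x * y for x, y in points)
    den = sum(x * x for x, _ in points)
    return num / den

node_data = [
    [(1.0, 2.1), (2.0, 3.9)],   # node A's private data
    [(1.0, 1.8), (3.0, 6.3)],   # node B's private data
    [(2.0, 4.2), (4.0, 7.8)],   # node C's private data
]

local_weights = [local_fit(points) for points in node_data]
# Simple unweighted average; raw data never leaves a node.
global_weight = sum(local_weights) / len(local_weights)
print(f"global model weight: {global_weight:.3f}")  # ~2.01
```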

Winners will differentiate on values and trust

The open-source nature of the technology, the ease of forking, near-zero costs of digital distribution, and interoperability protocols mean projects will struggle to differentiate on technology in the long term. Successful projects will differentiate through political values such as libertarianism, self-sovereignty and egalitarianism, as well as through trust. This makes the Convergence Ecosystem structurally different from other markets, in which value capture happens at friction points. With very few friction points and little lock-in, we are unlikely to see the market consolidation dynamic that has dominated previous digital markets. When technology and data are open and free, lock-in will come from brand and values. There will be as many protocols as there are value systems and personal priorities.

There will not be one chain to rule them all. In a world of scarcity, competition is the optimal strategy. In a world of abundance, we must change our mental models. The Convergence Ecosystem drives collaboration rather than competition.

Outlier Ventures: VC for The Decentralised Future

The Convergence Ecosystem is our vision of the future. We expect the Ecosystem to support hundreds of communities that will, over time, outcompete Web 2.0 incumbents for developers and users using tokenised business models. This shift will not occur overnight. People will continue to focus on the price of crypto-assets and worry about the regulatory implications of public token sales.

But behind the scenes, a decentralised infrastructure is being built.

Network by network.

Protocol by protocol.

We want to invest and partner with tokenised communities to build decentralised economies. Join us to help build the decentralised future!

Download the full paper here

A big thanks to the Outlier Ventures team, including Joel John, Harry McLaverty and Shaquile Noir, for their work putting this together, and to Jamie Burke, Aron van Ammers, Eden Dhaliwal, Anesu Machoko and Geoff Le Fevre for their contributions and feedback.

And a massive thank you to all of the people outside Outlier who contributed to the paper:

Dele Atanda — CEO, metâme

Chris Burniske — Partner, Placeholder & Author of Cryptoassets

Dr Rose Chan — Founder, Ladyboss.world & Former Head of Blockchain Working Group, World Bank (@I_am_rose)

Professor David Lee Kuo Chuen — Professor, FinTech & Blockchain, Singapore University of Social Sciences (@DavidKChuenLEE)

Matt Chwierut — Director of Research, Smith + Crown (@Skryptical)

Dr Anne Hsu — Assistant Professor, Department of Computer Science, Queen Mary University

Dr Stylianos Kampakis — Research Fellow, UCL Centre for Blockchain Technology (@s_kampakis)

Samuel Klein — Fellow, Berkman Centre for Internet & Society at Harvard University

Professor William Knottenbelt — Director, Imperial College Centre for Cryptocurrency Research and Engineering (@will_wjk)

Dr Robert M. Learney — Lead Technologist Blockchain & DLT, Digital Catapult

Trent McConaghy — Co-founder, BigchainDB & @Oceanprotocol

Mark Stephen Meadows — CEO & Founder, Botanic.io & SEED Token

Teemu Paivinen — Founder, Equilibrium Labs & Author of Thin Protocols

Samuli Poyhtari — Founder, OrbitDB (@haadcode)

Drummond Reed — Chief Trust Officer, Evernym

Toby Simpson — Co-founder, Fetch.AI

Dr Phillip J. Windley — Chairman, Sovrin Foundation

Disrupting Tech Monopolies & AI Tycoons — Part 2

The combination of blockchains and artificial intelligence is more than just a technical innovation: it’s an economic paradigm shift 💰💰💰

This is Part 2 of Disrupting Tech Monopolies & AI Tycoons — Part 1 outlines how we will get here.


  • Blockchains combined with AI will create the conditions for the disruption of platform monopolies.

  • As data stops being a competitive advantage, powerful token-driven network effects will lead to AI agents using blockchains to accumulate tokens.

  • This will lead to profound questions about how we govern non-human entities in the economy and society. 🤔 🤔


End of AI Platform Monopolies

So, now we have a global data sharing and monetization network

Right, so where were we? Oh yeah: we now have a global network of interconnected blockchains and DLTs that share value seamlessly, with easy-to-use data exchanges (see Part 1). Hopefully, as the industry begins to focus on usability and user design, we will be in a world in which anybody can publish data with the press of a button or a voice command. Payments in Bitcoin or other tokens are seamless and automated, based on rules coded into smart contracts. The average user has simply agreed to conditionally share data, as they do today with Facebook and other systems, and the next thing they know they have tokens to spend however they want. They can convert to a national currency or merrily purchase their preferred goods and services.
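
A minimal sketch of that conditional flow, assuming a toy ledger and hypothetical rules (in a real system the rule would live in an on-chain smart contract): a user shares a data record, the rule checks the agreed conditions, and tokens are credited automatically.

```python
# Toy version of "share data, get paid": a rule coded like a smart contract
# credits tokens when the user's shared record meets agreed conditions.

balances = {"user-42": 0, "marketplace": 1000}
PRICE_PER_RECORD = 5  # hypothetical token price per record

def sharing_contract(record: dict) -> bool:
    """The agreed conditions: user consented and record has a location field."""
    return record.get("consent") is True and "location" in record

record = {"consent": True, "location": "51.5N,0.1W", "steps": 9312}
if sharing_contract(record):
    balances["marketplace"] -= PRICE_PER_RECORD
    balances["user-42"] += PRICE_PER_RECORD  # tokens arrive automatically

print(balances)  # {'user-42': 5, 'marketplace': 995}
```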

All that matters today for AI platforms is data…

How does this lead to the end of AI platform monopolies? Well, in 2017, the only thing that matters is data. Platforms like Google, Facebook and Baidu collect data to feed their AI algorithms (specifically, their deep learning algorithms), improving their products. More data improves the products, which in turn brings more customers and engagement, which in turn generates more data.

When AI is the driver of product improvements, proprietary data is the most valuable asset a platform has. In fact, access to proprietary datasets is one of the most important assets for an AI startup. The way to think about it: data is the supply and AI algorithms are the demand. And deep learning models are hungry.

But hold on a moment, blockchains aggregate and commoditize data. That means…

Here is the knockout: blockchains aggregate the supply side, (almost) free, for all. Of course, there will be some transaction fees and other friction points, but compared to existing data infrastructure, an open, shared data layer essentially commoditizes data. Or at the very least makes proprietary datasets much less valuable.

But that means control of data is no longer the leverage point in value chains…

Firms that control supply — data — no longer dominate markets. Data stops being a moat and a competitive advantage. Now the demand side becomes the most valuable place in the ecosystem. This is where the customer relationship is won, with trust. Well, trust, and a simple, easy-to-use interface, maybe a conversational or voice UX. The only thing that matters in 2020: the customer relationship. (Side note: the EU’s General Data Protection Regulation, or GDPR, will reinforce this.)


Beginning of Blockchain-enabled AI

So we have this global shared data layer, right…

A second, longer-term implication of a global shared data layer is blockchain-enabled AI. Agents (not even particularly intelligent agents) can use blockchains as a ‘substrate’, as Fred Ehrsam has put it.

Deploying and using agents on blockchains rather than using proprietary tools and platforms like Facebook Messenger will be more attractive to developers, users and regulators.

Well, developers are going to love it…

For developers: first, they have access to a vast amount of free (on public chains, anyway) data that they would never be able to buy or generate themselves at the outset. Second, that data is structured and high-quality (right now, just transaction data, but increasingly all sorts of value storage and exchange). Third, native automation tools in smart contracts, and hopefully very soon production sidechains, make it easier to build agents that can perform reasonably complex actions. Finally, developers and companies that deploy agents have a native payment channel and can be paid almost immediately based on all sorts of conditions, such as usage or user utility. Business models with tokens and smart contracts are not limited to up-front payment or a paywall; all sorts of new business models will be available to experimental developers.
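
To illustrate the payment-channel point, here is a sketch of a pay-per-use agent (plain Python; the fee, names and the agent’s capability are all hypothetical): every call moves tokens from caller to developer, instead of an up-front licence or paywall.

```python
# Toy pay-per-use agent: every call moves tokens from caller to developer,
# approximating a payment channel settled per interaction.

balances = {"caller": 50, "developer": 0}
FEE_PER_CALL = 2  # hypothetical token fee

def agent_answer(query: str) -> str:
    """Stand-in for the agent's actual capability."""
    return f"answer to: {query}"

def metered_call(caller: str, query: str) -> str:
    assert balances[caller] >= FEE_PER_CALL, "caller cannot afford the call"
    balances[caller] -= FEE_PER_CALL
    balances["developer"] += FEE_PER_CALL  # immediate, usage-based payment
    return agent_answer(query)

print(metered_call("caller", "best route home?"))
print(balances)  # {'caller': 48, 'developer': 2}
```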

Users will love it, too. They can get paid just to use agents…

Users benefit because, unlike in other environments, they have direct access to token capital, investment and a real stake in the system. When users use a Facebook Messenger bot, they get some utility. When they use an agent on a blockchain, they can be rewarded or paid with tokens. Depending on the token economics, a user can ‘own’ a stake in the agent or the company behind it. The more the user uses or evangelises the product, the stronger the product and the underlying blockchain get. Network effects, with a direct monetary reward thrown in. In a sense, a user is no longer a passive consumer of a service; they are a stakeholder. This model begins to look more and more like a digital cooperative. (Something we are actively exploring at Outlier Ventures with the tokenization of Botanic Technologies’ bot platform: a project named SEED that allows the fair exchange of information between AI and people.)
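
A toy model of that stakeholder dynamic, with entirely hypothetical numbers: usage earns the user tokens, and a crude price function grows with total network usage, so active early users benefit twice.

```python
# Toy stakeholder model: usage mints a reward to the user, and a crude
# "token price" grows with total network usage (the network effect).

usage_count = 0
holdings = {"user": 0.0}

def use_agent():
    global usage_count
    usage_count += 1
    holdings["user"] += 1.0  # hypothetical reward per interaction

def token_price():
    # Hypothetical: price proportional to total network usage.
    return 0.10 * usage_count

for _ in range(20):
    use_agent()

stake_value = holdings["user"] * token_price()
print(f"user holds {holdings['user']} tokens worth {stake_value:.2f}")  # 40.00
```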

Regulators will love it the most; they might even force data and AI agents onto blockchains…

The last stakeholder, and potentially the deciding factor, will be regulators and governments that demand some element of control over, or access to, AI algorithms. The public and political tide is turning against technology companies. Certainly, many governments around the world are waking up to the power amassed by large US-based tech firms through their exploitation of data. Without overselling it, it seems to me that an open-source, auditable data structure would be an ideal technical solution for regulators that want a window into AI decision-making and the data used to train models. This would at the very least allow scrutiny of training data to check for bias, as well as potentially providing an audit trail for investigation if an agent makes a bad decision. It’s not a leap to imagine regulators actually mandating the use of either a public blockchain, or demanding a node in private networks, for the auditability of AI.
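
A sketch of what that audit window might look like, assuming training batches are hashed into an append-only log that a regulator’s node can replay (all names and data hypothetical):

```python
import hashlib
import json
import time

audit_log = []  # append-only log standing in for an auditable ledger

def log_training_batch(model_id: str, batch: list) -> None:
    """Record a tamper-evident fingerprint of each training batch."""
    digest = hashlib.sha256(json.dumps(batch, sort_keys=True).encode()).hexdigest()
    audit_log.append({"model": model_id, "batch_hash": digest, "ts": time.time()})

# Training side: log each batch before the model consumes it.
log_training_batch("credit-scorer-v1", [{"age": 34, "income": 52000, "label": 1}])
log_training_batch("credit-scorer-v1", [{"age": 61, "income": 23000, "label": 0}])

# Regulator side: verify a disclosed batch really matches a logged hash.
disclosed = [{"age": 34, "income": 52000, "label": 1}]
digest = hashlib.sha256(json.dumps(disclosed, sort_keys=True).encode()).hexdigest()
print(any(entry["batch_hash"] == digest for entry in audit_log))  # True
```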

So now we have the perfect environment for autonomous agents…

If this scenario plays out, you have more developers, more users and happy regulators. There are many different names for what comes next; I like Autonomous Economic Agents (AEAs). These new types of decentralised AI are the logical next step when autonomous agents start using blockchains (something FETCH.ai, an Outlier portfolio company, is working on). The level of human involvement will vary: some AIs will be managed by traditional organisations, others by decentralised autonomous organisations (DAOs). Regardless of the human involvement, the fact is that AIs will be accumulating tokens (seen another way, wealth). For example, an autonomous vehicle can be paid in tokens for rides and can pay for recharging and servicing with tokens. Or an AI DAO could manage a neighbourhood distributed energy grid in which energy is exchanged using smart contracts based on real-time supply and demand.
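
The autonomous-vehicle example can be sketched as a simple earn-and-spend loop (plain Python; fares and costs are made up): the agent holds its own wallet, earns per ride and pays for charging without a human in the loop.

```python
# Toy autonomous economic agent: a vehicle with its own token wallet
# that earns from rides and spends on charging, no human in the loop.

wallet = 0.0
RIDE_FARE = 12.0    # hypothetical tokens earned per ride
CHARGE_COST = 25.0  # hypothetical tokens paid per recharge

def complete_ride():
    global wallet
    wallet += RIDE_FARE  # paid in tokens by the passenger's contract

def maybe_recharge() -> bool:
    global wallet
    if wallet >= CHARGE_COST:
        wallet -= CHARGE_COST  # pays the charging station autonomously
        return True
    return False

for ride in range(5):
    complete_ride()
    if maybe_recharge():
        print(f"recharged after ride {ride + 1}, wallet now {wallet}")

print(f"accumulated wealth: {wallet} tokens")
```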

Cool yeah, this sounds like a pretty big deal…


I don’t think many people have truly thought through the implications of this. A non-human, non-human-controlled entity will have the ability to acquire resources and wealth. When people talk about exponential growth, this is exactly what they are talking about. Society and politics are simply not ready to even begin a discussion about these issues. Can an autonomous agent generate wealth? What is the optimal level of taxation that doesn’t act as a disincentive to activity? We already have enough trouble collecting taxes as it is; how, and from whom, will we collect taxes on an AI DAO?

Blockchain-enabled AI might seem pie in the sky. But unlike, say, artificial general intelligence (AGI), we know exactly which problems need to be solved to bring this vision to reality. There are already rudimentary versions of these agents available today. For more on AI DAOs, you must read Trent McConaghy’s AI DAOs, and Three Paths to Get There.

Yeah, it’s a big deal alright, possibly an economic paradigm shift…

The combination of blockchains and artificial intelligence is more than just a technical innovation: it’s an economic paradigm shift. The political philosophy written in the next 10 years will be as important as that of the socialist and labour movements of the 19th and 20th centuries.

Thanks for reading 👍. If you enjoyed it, please hit the 👏 button and share on Twitter and LinkedIn. Honestly though, Ev Williams, surely tokenizing claps is the perfect business model for Medium?


This is a working thesis and a high-level description of the work we are doing at Outlier Ventures. I am looking for feedback, so please tweet me (Lawrence Lundy). The thesis can certainly be improved upon. I am particularly keen to explore the potential impact of improved unsupervised learning and reinforcement learning algorithms on the need for large datasets. If large datasets were no longer required, data would become less valuable, which would still erode the data moat, but it wouldn’t in itself enable blockchain-enabled AI.

Thanks to Aron van Ammers, Joel John & Jason Manolopoulos of Outlier Ventures; Jeremy Barnett, a barrister at St Paul’s Chambers; Omar Rahim of Energi Mine; Toby Simpson of FETCH.ai (an Outlier portfolio company); and Mark Stephen Meadows of Botanic Technologies (an Outlier portfolio company) for reviewing and providing valuable feedback. Also to Trent McConaghy for driving and pushing forward the thinking on AI DAOs.