Data Marketplaces: Value Capture in Web 3.0

Marketplaces are critical elements of the entire Convergence Ecosystem: they are what incentivises data to be collected, shared and utilised. We now have the ability to open up the machine economy, and we need to think of ‘trading’ beyond the scope of human interaction. The marketplaces being developed in the crypto community are decentralised, automated and tokenised. These marketplaces are made possible by the distributed ledgers, consensus mechanisms and interoperability protocols at the lower levels.

We will see the emergence of a whole host of new types of marketplaces beyond today’s cryptocurrency exchanges like Binance or Coinbase. We are seeing the emergence of data exchanges that work with specific types of data: machine data from IoT networks, artificial intelligence data, personal data, and complex digital assets like crypto-collectables (pioneered by ERC 721 non-fungible tokens like CryptoKitties) and bots. It’s likely that over time marketplaces will expand to enable all types of data, and if that occurs we could end up with a dominant data and digital asset marketplace for Web 3.0, much as Amazon is for Web 2.0. It’s interesting to consider where the points of leverage will be in Web 3.0, especially if value and data are interoperable across blockchains. For now, at least, we see four types of decentralised data marketplace.

IoT Data Marketplaces

IoT data is already being collected in vast quantities, but the sprawl of devices has created a fragmented ecosystem. On the consumer side, operating system providers like Apple, Google and Amazon are attempting to leverage their dominant positions in smartphones and retail to sell more devices to collect more data. The Apple Watch and CarPlay, Google Home and Nest, Amazon Echo and Dot: these are all attempts to grow their walled gardens of data. Smaller consumer IoT device makers like Fitbit, Wink, or GreenIQ struggle to collect enough data to do the meaningful machine learning needed to improve their products as quickly as the tech giants do.

On the enterprise side, the same dynamics are at work. The internet of things (IoT) and industrial internet in the United States, Industrie 4.0 in Germany, and 物联网 (wù lián wăng) in China all promise to use low-cost sensors and big data analytics to dramatically improve productivity and usher in a new age of data-driven manufacturing. But the promise has not been realised, for a number of reasons. Core to the failure has been the lack of data sharing, across all industries trying to utilise IoT technologies, including aviation, agriculture, and utilities. The problem, as we have already highlighted, is that there is no incentive to share data, because data is seen as a competitive advantage to be protected. Current data infrastructure is coarse: data is either hoarded and valuable, or shared with limited commercial viability. IoT marketplaces begin to offer new business models for the monetisation of machine data. The IOTA data marketplace, Streamr, Datum and Databroker DAO are all examples of these marketplaces emerging to enable the sharing of sensor and machine data.

AI Data Marketplaces

Just like IoT data, or any data for that matter, data for AI algorithms tends to be accumulated by the largest companies. Society is becoming reliant on data, and as it is applied to AI algorithms, we are facing a situation in which a select group of organisations amass vast datasets and build unassailable AI capabilities. With the emergence of deep learning as the most useful machine learning technique for a range of AI applications like computer vision and natural language processing, data has become like digital oil. Digital monopolies like Facebook, Google and Amazon today get data from users for free. Every like, search and purchase feeds the learning system to further improve the algorithms, in turn bringing more customers and engagement. In value chain terms, data is supply, and AI algorithms are demand. Digital monopolies are searching everywhere for more and more data to feed their algorithms: Facebook buying WhatsApp and Instagram, Google with self-driving cars and Google Home, and Amazon with Alexa Echos and Dots.

“Traditionally proprietary data and technology have been significant defensibility mechanisms for companies. In the blockchain industry this is all open source, leading to an incredibly rapid innovation cycle, but also shifting defensibility more towards the sheer size of the community and thus the distribution power. This is an industry where increasingly users decide what technologies they want to use.” — Teemu Paivinen, Founder, Equilibrium Labs & Author of Thin Protocols

Decentralised AI data marketplaces will reduce, and eventually remove, the competitive advantage of hoarding private data by enabling anybody to monetise data. Again in value chain terms, these marketplaces increase supply. An AI data marketplace will make it easy for people, and increasingly for agents and bots, to recommend, price and therefore find value in different types of data. A market for data will lead to a more efficient allocation of data than giving it away for free or not using it at all. As more machines, individuals and organisations upload data to sell on a data marketplace, it becomes more attractive to data buyers. As this data commons grows with more datasets, it will attract more data buyers, creating powerful network effects. More than anything, decentralised AI data marketplaces are a bulwark against the rapacious AI data monopolies that have the potential to become the most powerful organisations ever built (if they aren’t already), controlling ever-increasing numbers of industries and markets with their superior AI capabilities. It is for this reason that we invested in Ocean Protocol, whose mission is “to unlock data, for more equitable outcomes for users of data, using a thoughtful application of both technology and governance.”

“The aim of Ocean Protocol is to equalize the opportunity to access data, so that a much broader range of AI practitioners can create value from it, and in turn spread the power of data. To respect privacy needs, we must include privacy-preserving compute. Our practical goal is to deploy a tokenised ecosystem that incentivizes making AI data & services available. This network can be used as a foundational substrate to power a new ecosystem of data marketplaces, and more broadly, data sharing for the public good.” — Trent McConaghy, Co-Founder, BigChainDB & Ocean Protocol

We also expect to see these marketplaces become ever more automated and efficient. Another of our portfolio companies, Fetch, is building a solution that uses decentralised machine learning to enable marketplaces to self-evolve around popular or valuable datasets, improving discoverability. In some senses they are embedding marketplaces directly into the ledger to truly enable the machine economy.

Personal Data Marketplaces

After peer-to-peer payments, control of personal data has been one of the most talked-about applications for blockchains. This is related to, but separate from, self-sovereign identity and SSI networks like Sovrin, in the sense that once individuals control their own identity, they can choose who has access to it. The same principle can be applied to other personal data. This choice puts the individual in the position of the seller and the party who wants access to the data in that of the buyer. Personal data is an economic asset that we currently give up in return for services. Some data is handed over consciously, like an email address or a telephone number; other data is captured without our knowledge: likes, tweets, our online behaviour and other forms of digital data exhaust. The value comes (although it is much less well understood by individuals) when different datasets are aggregated and an individual psycho-demographic profile is created and sold to all sorts of organisations, such as insurers, market researchers, and political organisations. A multi-billion-dollar data industry exists just to trade personal data.

Individual pieces of personal data are not particularly valuable on their own. According to the Financial Times, general information such as age, gender or location is worth just 0.0005 dollars per person. Buyers will have to fork out 26 cents per person for lists of people with specific health conditions. Genomic data would likely fetch much more. The challenge is that at an individual level, there is very little economic value. Value comes in aggregate. This is where blockchains, self-sovereign identity, and personal data wallets combine.
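The jump from negligible individual value to meaningful aggregate value is easy to see with the figures quoted above. A rough back-of-the-envelope calculation (the per-person prices are the FT figures from the text; the audience size is a made-up illustration):

```python
# Per-person prices for personal data, as quoted above (FT figures).
PRICE_GENERAL = 0.0005   # age, gender or location - dollars per person
PRICE_HEALTH = 0.26      # lists of people with specific health conditions

# An individual's general profile is worth a twentieth of a cent...
individual_value = PRICE_GENERAL
print(f"One general profile: ${individual_value:.4f}")

# ...but a broker selling a list covering millions of people captures real money.
audience = 10_000_000    # hypothetical list size
print(f"General data, 10M people: ${PRICE_GENERAL * audience:,.0f}")
print(f"Health data, 10M people:  ${PRICE_HEALTH * audience:,.0f}")
```

At the individual level the numbers are trivial; multiplied across an aggregated audience they become a business, which is exactly why the value currently accrues to the aggregators rather than the individuals.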

In today’s Web 2.0 paradigm, Google, Facebook and other data monopolists capture the profit. In the future, blockchain infrastructure, self-sovereign identity and personal data marketplaces will empower individuals. They can choose to allow Google and Facebook to use their data, or they can auction it off to get the best price. They might decide to sell only general information, but not their genomic data. Others will rent access to genomic data to cancer research charities but not to insurers. New business models will emerge, as buyers give sellers discounts for aggregating family data, for instance, and new startups will emerge that differentiate on consumer trust. Metâme is a UK-based startup working on a universal unit of trade that enables bundles of personal data to be packaged and exchanged. A data marketplace is not necessarily about making the most money; it is about giving individuals choice and control over how they invest their most valuable economic asset.
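The choice-and-control model described above can be sketched as a simple consent ledger: the individual decides, per buyer and per data category, who may see what. Everything here (the class, buyer names, and categories) is a hypothetical illustration, not any particular project’s API:

```python
# A minimal sketch of the consent model described above: the individual
# decides, per buyer and per data category, who may access what.
# All names and categories are hypothetical illustrations.

class PersonalDataWallet:
    def __init__(self):
        self._data = {}       # category -> data
        self._grants = set()  # (buyer, category) pairs the owner has approved

    def store(self, category, value):
        self._data[category] = value

    def grant(self, buyer, category):
        self._grants.add((buyer, category))

    def revoke(self, buyer, category):
        self._grants.discard((buyer, category))

    def request(self, buyer, category):
        """A buyer only sees data the owner has explicitly consented to share."""
        if (buyer, category) in self._grants:
            return self._data.get(category)
        return None  # no consent, no data

wallet = PersonalDataWallet()
wallet.store("general", {"age_band": "30-39", "region": "UK"})
wallet.store("genomic", "<sequence data>")

wallet.grant("cancer_charity", "genomic")  # rent genomic data to research...
wallet.grant("advertiser", "general")      # ...sell only general info to advertisers

print(wallet.request("cancer_charity", "genomic"))  # shared
print(wallet.request("advertiser", "genomic"))      # None: never consented
```

In a real marketplace the grants would be recorded on a ledger and paired with payment, but the access rule is the same: no consent, no data.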

“Self-sovereign personal data marketplaces need to address two key hurdles before they can take off: 1) the need for a universal unit of trade that transforms personal data into assets which people can tangibly trade and own, 2) ensuring anonymity and then incentivising consented identifiability as new legislation like GDPR effectively calls for anonymity by default. Without solutions to these problems personal data marketplaces cannot scale sustainably.” — Dele Atanda, CEO, Metâme Labs

Digital Assets Marketplaces

The final category of marketplace we expect to evolve is the digital asset marketplace. Unlike traditional physical assets or money, distributed ledger-based crypto-tokens can be programmable, which gives them more flexibility and variety than their physical counterparts. Cryptocurrencies, tokens designed to be a medium of exchange, are already reasonably well defined, and projects are innovating around how to create the optimal token for that use, tweaking rules around supply, distribution, privacy, and other attributes. Strictly, the term ‘cryptocurrency’ only applies to crypto-tokens that serve as a medium of exchange, yet most tokens are incorrectly referred to as cryptocurrencies. This is because Bitcoin began life as a cryptocurrency and has, over the last ten years, become more of a crypto-asset, predominantly because of its programmed deflationary economics (Layer 2 solutions like Lightning may change this classification, however). Currencies and assets require different economic designs: currencies need a high velocity; assets need to retain and ideally increase value, resulting in low velocity.

Beyond cryptocurrencies, the category of digital assets will come to include anything digital that uses distributed ledgers to create scarcity. Today there isn’t a clear distinction between cryptocurrencies and crypto-assets, but as the market matures, it will become more evident which tokens are designed to be a medium of exchange and which are designed to be a store of value. It is challenging to be both. Ether, for example, is intended to be used as a medium of exchange to redeem decentralised services from applications. But as its price rises, it becomes more of a store of value and less of a medium of exchange, as holders refrain from redeeming Ether in anticipation of value appreciation. A further, non-fungible subclass of crypto-assets will be designed as collectables and derive value through exclusivity and proof-of-ownership. Tooling for this is already emerging with ERC 721 NFTs.
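The exclusivity and proof-of-ownership properties that give collectables their value can be sketched in a few lines. Real NFTs are smart contracts implementing the ERC 721 interface; this Python sketch only mirrors the core idea of a registry of unique, singly-owned tokens:

```python
# A toy non-fungible token registry mirroring the core ERC 721 idea:
# each token ID is unique, has exactly one owner, and only the current
# owner can transfer it. (Real NFTs are smart contracts; this is a sketch.)

class NFTRegistry:
    def __init__(self):
        self._owner_of = {}  # token_id -> owner address

    def mint(self, token_id, owner):
        if token_id in self._owner_of:
            raise ValueError("token already exists - scarcity is enforced")
        self._owner_of[token_id] = owner

    def owner_of(self, token_id):
        return self._owner_of[token_id]

    def transfer(self, sender, recipient, token_id):
        if self._owner_of.get(token_id) != sender:
            raise PermissionError("only the current owner can transfer")
        self._owner_of[token_id] = recipient

registry = NFTRegistry()
registry.mint("kitty-42", "alice")           # a one-of-a-kind collectable
registry.transfer("alice", "bob", "kitty-42")
print(registry.owner_of("kitty-42"))         # bob
```

Scarcity comes from refusing duplicate token IDs, and proof-of-ownership from the registry being the single authoritative record; on a public blockchain that record is replicated and tamper-evident rather than held in one process.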

We expect to see a whole new ecosystem of digital assets: in-game weapons and costumes; AI bots and virtual avatar templates, such as those provided by SEED; virtual reality land, such as Decentraland; objects with real-world counterparts, like digital twins from Spherity; and even digital-to-physical assets like 3D-printed items, many of which will be collaboratively made and collectively owned. Digital scarcity brings the ability to artificially limit supply, which until now has been almost impossible with existing digital and Internet technologies.

The possibilities are endless and we are at the very beginning of a whole new age of digital assets created, bought, licensed, rented and sold in decentralised peer-to-peer marketplaces.


This excerpt is the latest in a series from the Convergence Ecosystem vision paper, which can be read in full here. Or take a look at the previous excerpts, which have covered core themes from the paper:

Building the Internet of Blockchains

I have said it before so I will say it again: the new data value ecosystem sees data captured by the Internet of Things, managed by blockchains, automated by artificial intelligence, and all incentivised using crypto-tokens. For a summary of the thesis, take a look at the introductory blog.

Clickbait headline aside, if we aren’t careful we are going to end up replicating the data silos we hoped blockchains and decentralised technologies would remove. This is why the transport layer of our investment thesis, the Convergence Ecosystem, is so important. For us, the transport layer includes, but is not limited to, four components: data interoperability, like Haja Networks; value interoperability, like Polkadot, AION, Cosmos and atomic swaps; transport and messaging protocols, like Telehash and Whisper; and state communication protocols, like the Lightning Network for Bitcoin, the Raiden Network for Ethereum and IOTA’s Flash Channels.

The technologies of this layer are less mature than those of the layers below, but they will become ever more critical as blockchains and DLTs proliferate, if we are to avoid the same data silos that exist today in the Web 2.0 era. It is at this layer that interoperability protocols are developing for messaging, value, data and state, and we are beginning to see the contours of a so-called ‘Internet of blockchains’. In the full paper we explore each of the interoperability protocols; this blog is an extract covering value and data interoperability.

Value Interoperability

Value interoperability across multiple blockchains refers to the ability of digital assets on one blockchain to interact with assets on another. The most straightforward example of an interoperable transaction would be one in which an individual transfers a cryptocurrency on one blockchain in exchange for a cryptocurrency on another, for example Bitcoin exchanged for Litecoin or XRP. Interoperability matters because it enables multiple ledgers to compound the benefits offered by each. By limiting the flow of value to a single ledger, one risks creating new “decentralised” DLT-based silos that cannot interact with each other at scale. By enabling ledgers to interact with one another through a communication protocol layer, improvements in the security, speed, and cost of transactions can be attained.

There are multiple approaches to achieving interoperability, each with a focus on a specific function. One of the simplest is the relayer. These utilities check for transactions on one chain and “relay” that information to another. BTC Relay, for instance, allows Ethereum smart contracts to verify a Bitcoin transaction without any intermediary, enabling Ethereum DApps and smart contracts to accept Bitcoin payments. A new generation of cross-chain transaction enablers allows exchanges to occur without a centralised party. Atomic cross-chain swaps use hash time-locked contracts to enable two parties to exchange tokens from different ledgers without the need for an intermediary.
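The hash time-locked contract (HTLC) mechanic behind atomic swaps can be simulated in a few lines. This is a sketch, not a real implementation: both “chains” live in one process, and the timeout/refund branch that real HTLCs need is omitted. The point is that claiming one side reveals the preimage that unlocks the other, so the swap either completes on both chains or on neither:

```python
import hashlib
import secrets

# A simulation of the HTLC mechanic behind atomic swaps. Both "chains"
# lock funds against the same hash; claiming either side requires
# revealing the preimage, which then unlocks the other side too.

class HTLC:
    def __init__(self, amount, hashlock, recipient):
        self.amount, self.hashlock, self.recipient = amount, hashlock, recipient
        self.claimed = False

    def claim(self, preimage):
        if hashlib.sha256(preimage).hexdigest() != self.hashlock:
            raise ValueError("wrong preimage")
        self.claimed = True
        return self.amount

# Alice generates a secret and shares only its hash with Bob.
secret = secrets.token_bytes(32)
hashlock = hashlib.sha256(secret).hexdigest()

btc_side = HTLC(amount=1.0, hashlock=hashlock, recipient="bob")      # chain A
ltc_side = HTLC(amount=100.0, hashlock=hashlock, recipient="alice")  # chain B

# Alice claims on chain B, which reveals the secret on-chain...
ltc_side.claim(secret)
# ...letting Bob use the now-public preimage to claim on chain A.
btc_side.claim(secret)

print(btc_side.claimed and ltc_side.claimed)  # True: both sides settle
```

In a real swap each contract also carries a time lock so that if the counterparty never claims, the original owner can reclaim their funds after the timeout.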

Atomic cross-chain swaps will be crucial in creating a new generation of decentralised exchanges. Cosmos, Polkadot and Komodo are a handful of projects with an explicit focus on this space. Interoperability protocols also often enhance privacy through zero-knowledge proofs, which enable verifying the accuracy of a computation without knowing the variables involved. By sending a transaction across multiple ledgers, tracking its source and recipient can be made drastically more difficult. One could also consider decentralised exchanges such as EtherDelta as interoperability enablers. Although restricted to ERC20 tokens, they allow individuals to trade one token for another without relying on a central authority. With the help of the likes of 0x and Kyber, one could trade Storj tokens received as payment for leasing out one’s computer’s storage space, and buy INS tokens to receive discounts at a retail outlet, without having to move coins from one’s wallet. While decentralised exchanges come with new challenges — especially liquidity — they offer the promise of significant security improvements over centralised exchanges.

Value interoperability will allow value stored in siloed blockchains to break free. This applies equally to public and private blockchains. NEO is already enabling cross-chain asset agreements with NeoX. Users will not need to set up wallets for every blockchain they want to use, or rely on third parties every time they interact on a different chain. Interoperability protocols further add value to the Convergence Ecosystem by allowing multiple industry-oriented tokens to communicate with each other. For instance, one could make payments in MIOTAs for leasing IoT-based sensors that pass on data using the Ocean Protocol OCN token. Similar protocols would be used to connect and incentivise functions in mobility and robotics. A machine can pay for access to a resource in the native token of one ledger and receive the resource itself through another ledger. As projects and protocols start delivering real-world utility at scale, the need for exchange infrastructure will increase. One could compare these protocols to hubs that route value without an intermediary.

In a world of seamless value interoperability one can expect a complex interplay between users holding tokens for particular service utility and others for store-of-value; the wallet or ‘portfolio’ balance likely optimised by a personal AI. This AI will be personalised by risk appetite, values and services; the weighting of which will lead to a new field of TPO (token portfolio optimisation) an extension of search engine optimisation (SEO) and social media optimisation (SMO). If purchasing and holding tokens is a reflection of one’s values, it’s interesting to think that token portfolios could become a new sort of social or political badge.

Data Interoperability (Off-chain)

Today, incredible amounts of data are stored on the private servers of a relatively small number of organisations. The internet’s client-server architecture makes data sharing inconvenient, while privacy and data protection laws limit the cases where it can be done legally. Even if this were not the case, there is no rational economic incentive for individuals to do anything other than give away their data. While strides are being made towards increased data accessibility, such as open Application Programming Interfaces (APIs) and open-data regulations like PSD2, the benefits are one-sided: users can now benefit from open data, but there is still no market, and data contributors remain largely unpaid. So, are blockchains the solution?

Blockchains are not databases; they are ledgers. It sounds almost flippant to say that, but the distinction is essential in understanding why data interoperability is just as important as value interoperability. Value interoperability means tokens can be moved across chains; data interoperability allows data to move across databases. Blockchains must be lightweight, with limited on-chain storage, so that “anyone” can download a full history of the blockchain. If blockchains become too large, fewer people will be able to participate in the network, reducing its decentralisation and overall security. So when it is said that “blockchains will enable large datasets to be shared or stored”, it is not actually the blockchain where the data itself will be stored: that role falls to decentralised and distributed data stores like IPFS and Swarm. Each blockchain implementation uses different storage for “off-chain” data, and the balance between “on-chain” and “off-chain” data depends on the requirements of the use case. Just like the design of the Internet and the internet protocol suite, we expect blockchains to remain as light as possible to ensure speed and reliability; it will be the “off-chain” storage that holds the majority of the data.
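The on-chain/off-chain split described above is typically implemented with content addressing: the data lives in a store like IPFS or Swarm, keyed by its own hash, and only that hash (plus metadata) is written to the ledger. A minimal sketch, with plain Python dictionaries standing in for the store and the chain:

```python
import hashlib
import json

# Content-addressed off-chain store: data is keyed by its own hash,
# the way IPFS/Swarm address content. Only the hash goes "on-chain".
off_chain_store = {}
chain = []  # stand-in for a blockchain: an append-only list of small records

def store_off_chain(data: bytes) -> str:
    digest = hashlib.sha256(data).hexdigest()
    off_chain_store[digest] = data
    return digest

def anchor_on_chain(digest: str, metadata: dict):
    # The ledger stays light: a 64-character hash plus metadata, not the payload.
    chain.append({"data_hash": digest, **metadata})

def fetch_and_verify(digest: str) -> bytes:
    data = off_chain_store[digest]
    # The hash anchored on-chain proves the off-chain data is untampered.
    assert hashlib.sha256(data).hexdigest() == digest, "tampered data"
    return data

payload = json.dumps({"sensor": "temp-01", "readings": [21.4, 21.7]}).encode()
h = store_off_chain(payload)
anchor_on_chain(h, {"owner": "device-123"})

print(fetch_and_verify(h) == payload)  # True: integrity guaranteed by the hash
```

However large the dataset grows, the ledger only ever carries fixed-size hashes, which is what keeps the chain downloadable by “anyone” while the data itself scales off-chain.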

But what we must avoid is a world in which value is interoperable, but the underlying data is not; leading to the same monopolistic market dynamic as we have today. Projects like Haja Networks are vital in enabling data sharing throughout the ecosystem. We need protocols that permit data to be shared seamlessly across both centralised and decentralised databases. Innovations in cryptography such as zero-knowledge proofs, differential privacy, Fully Homomorphic Encryption (FHE), and secure Multi-party Computation (MPC) will enable data to remain private and secure but still move through public networks. Without data interoperability, the Convergence Ecosystem does not work.

Only when both value and data can be shared securely, can marketplaces be built that will drive the Convergence Ecosystem. Tune in next week for more on the importance of data marketplaces to the future Web 3.0 vision.


A Real Use Case for Blockchains: A Global Data Commons

This week I am continuing our series of excerpts from our Convergence Ecosystem vision paper, which can be read here. The previous instalments can be found here:

Today, I want to talk about a use case that only blockchain technology can deliver. Such use cases are few and far between, and they usually revolve around applications that demand censorship resistance. A public data commons demands just that.

The Convergence Ecosystem will lead to a global data commons. It’s not inevitable, but blockchains and distributed ledgers are disruptive technologies that change the structure of the data value chain. Yes, I know that word is overused, but in this case it is true. The point of value capture in the value chain will change. Instead of web companies capturing value and profit by controlling data, data could be stored on decentralised data file systems and blockchains making it accessible to all, not just a select few platforms that collected it.

In the Convergence Ecosystem, a few different technologies are interconnected, forming an authentication, validation and security layer: blockchains and distributed ledgers, decentralised consensus protocols, self-sovereign identity & reputation, and decentralised storage and data integrity. We believe developments in these four areas are contributing to the creation of a data commons. These decentralised technologies are critical to the creation of a true global public utility. A public utility must be citizen-owned and must not be controllable by any single entity, whether government or corporation. The transparency, usage provenance, tamper-evidence and censorship resistance of blockchain technology are perfect for a global public utility.

Public blockchains are the ideal foundation for a data commons

Public blockchains are in many ways worse than existing databases. They are slower, have less storage, use more energy, and are less private. Sure, sharding, proof-of-stake and other proof-of-x schemes, and privacy-protecting tools like multi-party computation and zk-SNARKs are attempting to address some of these issues. But the key thing to remember is that the original Bitcoin blockchain was designed specifically as a peer-to-peer digital cash system; it is perfectly designed for that use case. The design choices were made to maximise one feature: censorship resistance. Public blockchains aren’t owned or managed by one government or company that can choose who views or uses them. This is what crypto people mean when they say blockchains cut out the middleman (although the so-called middleman will almost certainly integrate at another point in the value chain, so it is more accurate to say blockchains will change where the middlemen make their money). Governments have traditionally had the ability to censor information and communication; today, Silicon Valley tech monopolies do so on a global scale. Twitter, Facebook and Google have all come under fire recently because of their decisions to limit freedom of speech. If you control a network, you pick and choose who uses it. This is too much power for a single entity.

We now have the tools to ensure no single entity controls data. With all communications, money, and health becoming digital; data infrastructure will be too valuable to be controlled by one nation or company. In fact, for individuals and society more broadly, global data infrastructure, just like the Internet, should be a public good. Never has so much data been available for collection and analysis. And everyone wants it. As sensors are embedded in everyday objects, and we move to a world of ubiquitous computing, everybody is fighting for who ‘owns’ the data. This is yesterday’s war. Public blockchains offer an open-source, decentralised, shared database that anyone can view and interact with based on programmable rules.

We are seeing the emergence of this new data infrastructure. We aren’t there yet: we still need to process more transactions at faster speeds while using less energy. Data needs to be private, stored in an accessible way, and shared across different blockchain flavours. We also need a way for individuals, organisations and machines to buy and sell data in a marketplace. The storage of and access to data are important, but it will be data marketplaces that finally provide a business model for data creators. There will finally be a way for people and machines to make the most of the data they collect. A marketplace provides an economic incentive for the more efficient allocation of data: individuals can sell it instead of giving it away for free; organisations can monetise it instead of letting it sit idle in databases; and machines can buy and sell data automatically to increase their utility. In my view, a peer-to-peer marketplace for data is the second most important idea to come from the blockchain industry, after peer-to-peer electronic cash.

A data commons gives control back to users and limits monopoly control of the most valuable resource in the digital economy

2018 will see the beginnings of this global data sharing and monetisation network. Data creators will begin to earn money from uploads, likes and retweets. This is a far more profound change than it may seem. Disruption has typically come from startups offering seemingly inferior products that serve a niche underserved by the incumbent. Blockchain-based networks won’t just disrupt particular companies; they go much further: they disrupt a digital norm, the existing assumption that we should give away personal data for free. Digital monopolies including Facebook, Google and Amazon get data from users for free. Every like, search and purchase feeds the learning system to further improve the algorithms, in turn bringing more customers and engagement. In value chain terms, data is supply, and AI algorithms are demand. Digital monopolies are searching everywhere for more and more data to feed their algorithms: Facebook buying WhatsApp and Instagram; Google with self-driving cars and Google Home; and Amazon with Alexa Echos and Dots.

Blockchains and decentralised public infrastructure change the game. They reduce the value of hoards of private data, making proprietary datasets much less valuable, because as more and more machines, individuals and organisations use a public data infrastructure, the global data commons becomes more attractive to data sellers. As this data commons grows with more datasets, it will attract more data buyers, creating powerful network effects. In other words, data becomes more of a commodity; it is no longer a source of value in and of itself. Firms that control supply — data — no longer dominate markets. The point of value capture in the value chain will shift from data to brand and trust.

As data becomes less valuable, the customer relationship becomes ever more important. Startups and incumbents alike will compete for customers’ data based on trust. The global data commons will mean individuals will choose where their data is sold or rented. This global data commons will at first attract individuals that care about privacy and self-sovereign data. Machines will soon follow as machine operators and owners look for new revenue streams. Some organisations, especially the public sector, will be attracted by the non-corporate controlled nature of the decentralised infrastructure as well as the cost and liability reductions in not storing consumer data. Smaller organisations and startups will sign-up to access standardised data that would otherwise take too long or cost too much to acquire. Today, data is siloed with no business model for creators to monetise it. Blockchain technology and other decentralised infrastructure are emerging as a new data infrastructure to support machines, individuals and organisations to get paid for the data they generate. Blockchain-based data infrastructure, including data exchanges, will commoditise data and help realise the vision of a data commons and the first real global public utility.


Building a New Data Infrastructure with Blockchains and the Internet of Things

The Convergence Ecosystem is open-source, distributed, decentralised, automated and tokenised and we believe it is nothing less than an economic paradigm shift.

We are excited to introduce the Outlier Ventures vision of the future and our investment thesis: the Convergence Ecosystem. The new data value ecosystem sees data captured by the Internet of Things, managed by blockchains, automated by artificial intelligence, and all incentivised using crypto-tokens. For a summary of the thesis, take a look at the introductory blog, and for a deeper look into Blockchains, Community, & Crypto Governance, have a read of my last post here. Today, though, I want to talk specifically about the convergence of blockchains and the Internet of Things.

Complexity versus simplicity

As the graphic above shows, data enters the ecosystem at the ‘data collection’ layer through either hardware (the Internet of Things) or software (web, VR, AR, etc.). In fact, early feedback on the paper has suggested that what we are really talking about here is a new data value chain, and I agree with that to some extent. But of course, this is just a snapshot, a simplification of the emerging data value chain.

If your first thought upon reading the paper or looking at the graphic was “buzzword salad” or “this is too abstract, what are the actual products and protocols that need to be built?” well you are not alone. Indeed, thinking through the Convergence Ecosystem was a constant tension between complexity and simplification.

I felt it was more important that non-technical people understand that all these seemingly disparate technologies are connected than that I go into detail about the technical differences between Cosmos and Polkadot in addressing blockchain interoperability. This simplification can be seen at the data collection layer. I note the Internet of Things and software as the two entry points for data. This is purposefully broad; I had another attempt which separated hardware into types of devices — mobile, wearables, IoT devices, learning robots — but ultimately the ecosystem became too complex and overwhelming for the layperson to understand. With that in mind, I decided that any sensor measuring the external environment should be bundled together under the umbrella term the 'Internet of Things'; this includes all the sensors in smartphones and wearables, such as gyroscopes, accelerometers, and proximity sensors, as well as hundreds of other sensors measuring our world. As for software, this is broad enough to include any data created from the digital environment regardless of application — augmented reality and virtual reality worlds, our digital exhaust from online activity, and gaming are just a few examples.

The key exercise isn't to define exactly where data will come from. The key message is that the amount of data created annually will reach 180 zettabytes (one zettabyte is equal to one trillion gigabytes) by 2025, up from 4.4 zettabytes in 2013, and that the average person anywhere will interact with connected devices every 18 seconds (nearly 4,800 times a day).
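That interaction figure follows from simple arithmetic: one interaction every 18 seconds, around the clock, works out as:

```python
SECONDS_PER_DAY = 24 * 60 * 60      # 86,400 seconds in a day
INTERACTION_INTERVAL = 18           # one device interaction every 18 seconds

interactions_per_day = SECONDS_PER_DAY / INTERACTION_INTERVAL
print(interactions_per_day)         # 4800.0 -- "nearly 4,800 times a day"
```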

The so-called Internet of 'Things'

If you thought that the blockchain industry lacked a clear definition, the Internet of so-called 'Things' is even worse. The industry lacks a standard definition of the IoT, and in its broadest sense it will come to include every physical object that has a sensor, microcontroller and Internet connection. Today that mainly means connected home devices like the Amazon Echo, wearables like the Apple Watch, industrial and agricultural connected sensors, and smart meters measuring home energy usage. But the range of applications is growing, and it has been estimated that by 2024 the automotive industry will account for almost a third of all IoT connections, followed by consumer electronics and FMCG (fast-moving consumer goods) and the utility sector. Other sectors, including smart cities, supply chain, manufacturing and healthcare, will make up a relatively small proportion of the connections. The IoT market intersects with the robotics market in the sense that a robot has the same features as an IoT device, but with the addition of actuators and the means to move and respond to the environment. We would consider connected vehicles, service robots and other types of robotics to be data-collecting machines.

The IoT market is often measured in the number of connections — roughly 30 billion by the end of the decade — or in economic impact — 11 trillion dollars over the next decade, says McKinsey. A less-asked question is: what happens to all the data? The same McKinsey study found we may be using as little as 1% of the data being generated. As well as under-utilising data, how data is being used is unclear. In a survey by the Ponemon Institute, 82% of respondents said IoT manufacturers had not provided any details about how their personal information is handled.

The emergence of distributed systems like IPFS, Filecoin, and other blockchains offers a potential new model for data storage and utilisation. It has long been expected that this data would be fought over by device makers, software providers, cloud providers and data analytics companies. In fact, the reluctance of car makers to put Android Auto or Apple CarPlay into their cars reflects an awareness that they would lose control of valuable data.

So the key value proposition for distributed and decentralised systems in many cases isn't actually 'censorship resistance' or 'unstoppable payments'; it is a shared (but private) dataset of industry data, both transactional and otherwise. We are still early in the development of the blockchain industry: we still need to prove and scale privacy technologies like zero-knowledge proofs, secure multi-party computation, and differential privacy, as well as increase the throughput of blockchains and link them robustly with off-chain databases to handle the volumes of data we expect the IoT to generate.

Very broadly speaking, decentralised technologies can provide shared data infrastructure whereby data use isn't a zero-sum game. It is no longer a case of generating data under a use-it-or-lose-it model. The stack of technologies, including blockchain-based marketplaces, enables IoT data creators — machine-owned or human-owned — to buy and sell data.

Software is eating the world; and throwing off valuable data

On top of the 50 billion IoT connections, we also need to count digital media and human-generated digital data. We are on our way to quantifying and digitising our external world, and we are even further along in gathering data on our digital lives. We use the term 'software' as a producer of data broadly, to capture all personal and business data produced through interaction with databases, operating systems, applications and APIs. These interactions build up digital dossiers including cookie and web browsing data as well as active traces like social media and messaging.

On the business side, as we continue to digitise and bring online our offline interactions and documents like electronic health records and learning records, key sectors will have an overwhelming amount of data to handle, which they do not have the capabilities to utilise. On the consumer side, digitally-created and digitally-augmented environments with augmented reality (AR) or virtual reality (VR) will lead the growth in available personal information.

Half the world’s population — 3.82 billion — will have an Internet connection by the end of 2018 and by 2020 it will be 4.17 billion. Mobile data traffic will grow to 49 exabytes per month by 2021, a sevenfold increase over 2016 according to Cisco. We are creating unfathomable amounts of data, and the growth shows no sign of abating. Adoption of AR and VR will further drive the amount and granular detail of data that we can collect, enabling deeper insights into individual and collective behaviours. Whether it’s from the IoT or software, we have a massive data problem.

IoT needs blockchains

We are creating and collecting more data than ever, but we are storing it in insecure private databases with no incentives to share the data. Data breaches and hacks are commonplace, and the data can be censored or tampered with. Software-generated data is lost, hoarded or latent. There is no reason for consumers to do anything other than to give data away for free and for corporations to hoard it.

Open-source, distributed, decentralised, automated and tokenised infrastructure offers a solution.


For more on how communities and tokens will integrate with the Internet of Things and Artificial Intelligence, read the full paper here.

The End of Scale: Blockchains, Community & Crypto Governance

We are excited to introduce the Outlier Ventures vision of the future and our investment thesis: The Convergence Ecosystem. The new data value ecosystem sees data captured by the Internet of Things, managed by blockchains, automated by artificial intelligence, and all incentivised using crypto-tokens. For a summary of the thesis take a look at the introductory blog; today I want to take a deeper look at the important role communities, governance and politics will play in this new era.

The industry is rapidly experimenting with new (and old) consensus mechanisms and decision-making techniques to coordinate and govern the emerging token economy. This experimentation began with Bitcoin and spawned thousands of tokens each with different rules to encourage or discourage behaviours within the network and allocate resources. Tokens and automated decision-making tools allow for the mass decentralisation of entire industries through a distributed coordination network. These networks are birthing new types of resource allocation structures such as decentralised and autonomous organisations, pushing forward our conception of what an organisation should be.

Crypto Tokens

Cryptographically secure and digitally scarce tokens are the key innovation that makes a group of technologies into a living, breathing ecosystem.

Tokens are a native digital coordination mechanism for the Convergence Ecosystem. Until now we have been retrofitting a financial infrastructure designed for cash and cheques to the digital, software-defined era. Ever since the emergence of Bitcoin, it has been clear that distributed ledgers with automated consensus held the potential for new forms of asset and value exchange. It was not until the ERC20 token standard on Ethereum that experimentation around digital and programmable money began at significant scale. There is now a mechanism to fund open-source protocols that would previously have struggled to raise financing because open-source lacked a business model. As Albert Wenger has noted: "Now, however, we have a new way of providing incentives for the creation of protocols and for governing their evolution." In early 2018, we are still at the very beginning of this evolution.

Over the next year or so, we expect to see a much clearer delineation between two types of tokens: crypto-assets and cryptocurrencies.

Cryptocurrencies will be designed to be a medium of exchange, and crypto-assets will be designed to be a store of value and offer utility in a digital economy. Although dominant token ecosystems have an element of both, design challenges abound when attempting to incentivise usage with digital scarcity. It is unclear if single-token systems like Bitcoin and Ethereum can provide a sustainable balance; instead, it is likely we will see multi-token systems as a more effective mechanism.

Experimentation is happening at a rapid pace on both the supply and demand side.

We have tokens with a deflationary economy, scheduled inflation, and others that let the community vote on how and when new tokens are minted and/or burned. That is just programmable money supply; we are also experimenting with demand-side economics: variable transaction fees, demurrage charges, interoperability, and different consensus rules. Non-fungible tokens such as CryptoKitties and the new Ethereum ERC 721 NFTs will also impact demand by incorporating historical ownership, creating a subclass of crypto-assets called crypto-collectables. In addition, a currently underutilised token model is the crypto-consumable, a token that is programmed to reduce in value over time using a decay or burn function. This could be a continuous decline in value, like a used car, or a step decline, like a ticket to a live event. This sort of token design would not be a store-of-value but would be a powerful way to increase network token velocity. Our head of cryptoeconomics Eden Dhaliwal is working with Imperial College London and our portfolio companies to experiment with and design sustainable token economies.
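The two decay shapes just described can be sketched as simple value functions. This is a minimal illustration with hypothetical names, not any live token's actual mechanism:

```python
def continuous_decay(initial_value: float, rate: float, periods: int) -> float:
    """Continuous decline, like a used car: lose a fixed fraction each period."""
    return initial_value * (1 - rate) ** periods

def step_decay(initial_value: float, expiry: int, period: int) -> float:
    """Step decline, like a ticket to a live event: full value until expiry, then zero."""
    return initial_value if period < expiry else 0.0

print(continuous_decay(100.0, 0.10, 5))        # ~59.05 after five 10% decay periods
print(step_decay(100.0, expiry=10, period=9))  # 100.0 -- event not yet held
print(step_decay(100.0, expiry=10, period=10)) # 0.0 -- token consumed
```

Either shape guarantees the token cannot be hoarded as a store-of-value, which is exactly what pushes holders to spend it and raises network token velocity.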

Today the industry is focused on the initial distribution of these tokens in generation events. But the initial distribution is just one stage of building a sustainable ecosystem.

Token distribution schedules will become more sophisticated over time to include staged releases, like traditional equity fundraises, and mechanisms such as airdrops or token faucets. Continued network engagement will separate successful networks from unsuccessful ones. 2018 and beyond will show that much of the ICO class of 2017 was prepared for initial distributions but underprepared for sustainable growth and utility. It must be remembered that prior to 2017, tokens were distributed to the network in exchange for utility; Bitcoin distributes bitcoin as a reward for the secure clearing and settling of Bitcoin transactions. By giving away the majority of tokens upfront, many 2017 ICO projects are left with few tokens to reinvigorate demand later down the line. Most projects will fail, but the open-source nature of the ecosystem means learnings and code will be available to all. We can learn and build faster than ever. Unlike economic modelling or theory, the industry is testing economic theories in real time with real money. It is the greatest experiment in socio-economics we have ever seen.

Tokens are the first native coordination mechanism for the digital and now machine economy.

We expect tokens to be issued at each layer of the stack to incentivise behaviours within each particular network and to connect with the broader ecosystem through a series of exchanges and interoperability protocols. The model would be similar to today's global economy, in which each nation issues and uses its own currency within its own borders and trades foreign currency with other countries for the products and services it needs. If Bitcoin is indeed the digital store-of-value in the same way gold is the physical store-of-value, it is likely we will see a digital hierarchy of money emerge with Bitcoin as an apex token, protocol tokens like Ethereum, NEO and Cardano below Bitcoin, and utility or application tokens below the protocol tokens. As the Convergence economy develops and core infrastructure is built out, tokens will become increasingly liquid and frictionless, leading to extraordinarily complex economic dynamics.


Communities & Governance

Tokens themselves are simply a type of value instrument; the rules under which these instruments are generated, distributed and managed are decided by community members through agreed governance rules.

These governance rules are set by community members using different forms of decision-making. For protocol tokens like Bitcoin, Ethereum et al., governance includes decision-making on changes to the network. The explosion of tokens and blockchain-based networks has led to a renaissance in thinking about governance, especially decentralised governance.

We have the Bitcoin network, with a strong libertarian value-system that prizes decentralisation above all else. There is therefore a separation of 'powers' between developers, miners and users; no one stakeholder group can 'force' a decision on the others. This results in a very slow-moving but stable network. Ethereum, while still aiming to be a decentralised network, does not have quite as strong a libertarian streak, but does have more leadership in Vitalik Buterin, who is often able to push through changes because the community follows his lead. See the summer 2016 fork to return funds lost through the DAO bug.

New projects are experimenting with automated governance in an attempt to avoid messy human decision-making.

Tezos is hoping to enable governance to be 'upgraded' through community voting. Dfinity is doing something similar but allowing retroactive changes to the ledger. These types of 'on-chain' governance, as they are known, are still technically immature and open up a whole new attack vector. Other projects like Augur and Gnosis are testing futarchy, a voting model in which the community defines a set of values and prediction markets are then used to decide which decisions will maximise those values.
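A minimal futarchy sketch makes the idea concrete. This is illustrative only, not Augur's or Gnosis's actual API: the community fixes a welfare metric, conditional prediction markets forecast that metric under each proposal, and the highest-forecast proposal wins.

```python
def futarchy_decide(forecasts):
    """Return the proposal whose prediction market forecasts the highest
    value for the community's agreed welfare metric."""
    return max(forecasts, key=forecasts.get)

# Hypothetical market forecasts of "daily active users if this proposal passes":
forecasts = {
    "raise block size": 1_200_000,
    "keep parameters as-is": 1_050_000,
    "introduce fee burn": 1_310_000,
}
print(futarchy_decide(forecasts))   # prints: introduce fee burn
```

The slogan is "vote on values, bet on beliefs": the vote fixes the metric, and markets, not ballots, pick the policy.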

We are also seeing exciting experiments with curation markets and reputation staking from projects like Colony and Ocean Protocol. This type of decentralised and automated model is extended further with decentralised autonomous organisations (DAOs). In these sorts of organisations, decision-making is offloaded to smart contracts and decisions are automated based on the rules encoded in them. One of the first examples was of course TheDAO, a DAO for venture funding, which was never able to allocate capital after a bug was exploited. Other live examples include Dash, a privacy-focused cryptocurrency; Digix, a gold payment system project; and Aragon, a platform hoping to provide the entire governance service for other token projects.

The end-point of blockchain-based automation will come through AI DAOs, as articulated by Trent McConaghy. These theoretical organisations will be managed and owned by AI algorithms, enabling AI to interact in the economy by earning and spending tokens. An AI could own a fleet of self-driving vehicles, charging fares which it then uses to pay for maintenance, tolls, insurance, and taxes.

Blockchains and tokens will be issued, distributed, governed and owned in increasingly diverse ways. Governing models will evolve, and we are likely to see an industry with multiple types of governance, each co-evolving around the belief-systems of the community it serves.

Bitcoin will remain staunchly libertarian; Ethereum has more of a central leadership, which appeals to pragmatic developers; and self-sovereign identity underpins the value-system of the Sovrin Foundation blockchain. We will soon see more projects with social democratic values that prioritise wealth redistribution through 'network' (read: State) intervention or pre-agreed taxation rules. Others will prioritise ethical and environmental values with green-friendly policies that use non-consumption-based consensus mechanisms (e.g. Chia) and focus on common ownership and resource sharing.

“The Convergence Ecosystem should support a diverse range of different governing models that support different communities. There is no optimal model of governance; only a perpetual tension to maintain alignment amongst stakeholders.”

We have millennia of literature exploring politics and governance, everything from Plato's five regimes to John Locke's libertarianism to Jeremy Bentham's utilitarianism. Philosophers and political scientists will never settle on an 'optimal' governance model because 'optimal' can only exist for individuals in limited contexts, never for society at large.

As with almost all information and communications technologies that have come before, blockchain technology was born decentralised.

Bitcoin, the first blockchain implementation, was a libertarian movement created as a direct reaction to a centralised financial system. Early adopters shared this value-system. As more and more blockchains and tokens are created, the industry attracts audiences with different belief-systems. As it continues to mature, different communities will have unique objectives and priorities that will require specific design trade-offs. The financial community requires more and faster transactions and will sacrifice decentralised consensus to achieve that, as can be seen with Ripple and its XRP token. The healthcare community must adhere to privacy regulations and so will require more privacy than public blockchains currently afford. The ecosystem will support a variety of communities using different governance models with differing levels of decentralisation and automation, depending on the values of the community and the needs of the market.

We are in the very early stages of understanding how to design token economies and the governance models that support them.

As an industry, we must be more supportive of new ideas and implementations. It is not a zero-sum game in a growing market. Some tokens, communities and governance experiments will fail. Let’s learn quickly from their failures and compound learnings.

The biggest advantage the decentralisation community has is momentum and the brightest minds from around the world are working together to solve tough problems. Communities will co-exist and thrive. Let’s be inclusive and supportive.


For more on how crypto-communities and crypto-tokens will integrate with the Internet of Things and Artificial Intelligence, read the full paper here.

Disrupting Tech Monopolies & AI Tycoons — Part 1

Blockchains & Artificial Intelligence is not just a technical innovation: it’s an economic paradigm shift 💰💰💰


>> Blockchains combined with AI will create the conditions for disruption of platform monopolies

>> As data stops being a competitive advantage, powerful token-driven network effects will lead to AI agents using blockchains to accumulate tokens

>> This will lead to profound questions about how we govern non-human entities in the economy and society. 🤔 🤔


Data is siloed…

It is almost banal to say it, but as a society we have a data problem. Most of the world's data is held on private servers. The client-server architecture of the Internet and the design of corporate IT networks have resulted in data being hosted and stored in private, centralised databases. Lucrative consulting businesses sprang up to help organisations connect systems and make it easier to share their data internally. Open Application Programming Interfaces (APIs) have gone some way towards opening up data externally, especially in the public sector, but these fixes are typically forced upon organisations. PSD2 is supposed to force banks to open up their data, but they are doing their best to wriggle out of it. See Chris Skinner's blog here.

Regulation and data privacy further limit sharing…

Even when organisations might be set up to share data and have a culture of sharing and collaboration, privacy laws and data protection legislation have had a chilling effect on data sharing. The Health Insurance Portability and Accountability Act (HIPAA) in the US and the Data Protection Act in the UK explicitly state what data can and cannot be shared. But these are complicated policies. The technical difficulty of implementing data policies, combined with the degradation of user experience those implementations produce, makes IT systems costly, inefficient, and unpleasant to use. On top of that, the security theatre that surrounds them only adds to the insecurity of participating in such systems. It's just not worth it for people to go through the risk and hassle of sharing data.

It’s also really hard to make any money from sharing data…

Even when sharing is encouraged internally, technically possible with open APIs, and legislation is favourable, current data infrastructure makes it difficult to monetise data. The best that happens is that external developers or citizens get free access to data. Lovely stuff for the user of the data, not so much for the publisher. There is no good way to get paid appropriately for published data. Some content can be published using an open-source license, sure, and attribution is great, but it doesn't pay the bills. Dual-licensing for software is possible: see Oracle's MySQL database, which is dual-licensed under a commercial proprietary license and under the GPLv2. But in most cases licensing is costly, difficult to enforce and inefficient. Licensing of personal data is practically non-existent. Despite that, companies in developing countries make money from this value. Around the world doctors use Facebook's WhatsApp to send medical reports, nurses use Gmail to provide remedial advice, and that data, while never licensed, ends up in the hands of advertisers as users unwittingly share it.

So basically there is no business model for data sharing…

The fact is today there is no rational economic incentive for individuals to do anything other than give data away for free and for corporations to hoard it. If only there was a solution…


Blockchains are terrible databases, stop going on about it…

First and foremost, despite my clickbait-y headline, I want to get this out of the way: public blockchains are in most ways worse than existing databases. They are slower, have less storage, are extremely energy-inefficient and in most cases less private (although zero-knowledge proofs 👏, self-sovereign identity 👏 and Masked Authenticated Messaging (MAM) 👏 will help with this). But these design choices are made to improve one feature: decentralisation. By decentralisation I mean the elimination of a central administrator, which leads to extreme fault tolerance and increased data integrity and tamper-evidence. The removal of centralised control and vulnerable centralised data repositories has trade-offs which make most blockchains unsuitable for a ton of the use cases that have been talked about. But in cases where security and tamper-evidence are more important than throughput, speed, capacity, and stable governance, public blockchains are well worth exploring. For more on this, Adam Ludwin of Chain explains it better than anyone in this post.

Okay, so it’s more secure, but nobody cares about security…

So the question becomes: how important is security to individuals, corporates and Governments, right? The bull case for blockchains is that it matters a lot and it will matter more in the future. If security continues to be an afterthought, well existing databases are cheaper, faster and more convenient, so why bother with blockchains at all?

Well, I tend to believe that security will become more important. Things like the Yahoo! or Equifax hacks certainly shine a light on the vulnerability of centralised data providers, but tbh I really don't think individuals are going to demand change. People are going mad for Amazon Echos and Dots, sticking them in every room, and very few are asking: what data is actually being collected? Where is it being stored? Is it encrypted? How can it be combined with other datasets? Security and data protection matter far more to business and to government, and the so-called Internet of Things will force the change.

Blockchains will actually help manage the sharing of data instead of fighting over it…

Never has so much data been available for collection and analysis. Connected cars are throwing off vast amounts of data, and the challenge is that every single stakeholder wants access for their own purposes: car makers want it to improve the driving experience; tire makers want it to see how their tires perform; city administrations want it for traffic prediction; and software makers want it to improve their self-driving software. As sensors are embedded in all sorts of everyday objects, everybody is fighting over who 'owns' the data. This is fighting yesterday's war. Blockchains can provide an open, shared data layer in which all stakeholders have access to data.

So blockchains will actually be quite useful in securing and sharing data…

Sure, not every bit of data will need a fully decentralised blockchain with proof-of-work. In most cases a simple distributed ledger with a Merkle tree will suffice (see DeepMind Health Verifiable Data Audit). Much of the data could even be stored off-chain with just links to the on-chain hash. Regardless of the blockchain flavour, cryptographically-secured distributed ledgers offer a better alternative than centralised databases. Of course, an assumption here is that blockchains don’t suffer the familiar fate of incompatibility by competing blockchains. The community does seem to be fully behind blockchain-connecting projects like Polkadot, Cosmos, Atomic Swap and AION. These services combined with zero-knowledge proofs mean data can be shared privately on public ledgers. At this point, we are close to the ideal of a globally shared database with easy and, ideally, public permissions.
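The off-chain-data, on-chain-hash pattern can be sketched with a minimal Merkle root. This is illustrative only, assuming SHA-256 and duplicate-last-leaf padding; it is not DeepMind's actual audit scheme:

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks):
    """Reduce a list of off-chain data blocks to a single 32-byte root.
    Only this root needs to be stored on-chain; the data stays off-chain."""
    level = [sha256(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                  # odd count: duplicate the last hash
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

records = [b"access-log-1", b"access-log-2", b"access-log-3"]
root = merkle_root(records)                 # commit this 32-byte hash on-chain

# Tampering with any off-chain record changes the root, so it is detectable:
assert merkle_root([b"access-log-1", b"tampered", b"access-log-3"]) != root
```

This is why a full proof-of-work chain is often overkill: a single committed root makes gigabytes of off-chain data tamper-evident.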

Add in data exchanges…

Now, the final piece. Data exchanges like the Ocean Protocol bring together data buyers and sellers (also including bots and devices). As explained, today data is either given away for free or sits underutilised because people and organisations have no way to monetise it. A blockchain-based data exchange can enforce data quality standards, ownership and usage rules, and pay sellers to rent or sell data. A data exchange provides the missing component to a shared ledger: a business model. People and organisations can easily earn money from their data.
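A minimal sketch of that exchange flow, with hypothetical names (this is not the actual Ocean Protocol API): a listing records ownership and usage rules, and a settlement swaps tokens for data access in one step.

```python
from dataclasses import dataclass

@dataclass
class Listing:
    seller: str
    dataset_hash: str     # on-chain commitment to the off-chain dataset
    price: int            # in tokens
    licence: str          # usage rules, e.g. "rent-30-days" or "outright-sale"

def settle(listing, buyer_balance):
    """Pay the seller and grant access; fails if the buyer cannot afford it."""
    if buyer_balance < listing.price:
        raise ValueError("insufficient tokens")
    seller_earnings = listing.price
    return buyer_balance - listing.price, seller_earnings

listing = Listing("sensor-owner", "0xabc...", price=10, licence="rent-30-days")
print(settle(listing, buyer_balance=25))    # (15, 10)
```

The point of the sketch is the business model itself: the seller's earnings appear as a first-class output of the exchange, which is exactly what today's data infrastructure lacks.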

Now we have the infrastructure to share and monetise data…

Sure, people won’t just stop using Google or Facebook tomorrow. The value they provide is far too great. But these new networks will change the conversation. The public will begin reading news stories about how they can be paid when people download their pictures or paid when they upload their smart watch data.

The seed of disruption has been planted. Why allow yourself to be sold for nothing when you can get paid?


If this piqued your interest and you want to know more about the implications Part 2 is available to read here.

Thanks for reading 👍. If you enjoyed it, please hit the 👏 button and share on Twitter and Linkedin.


Thanks to Aron Van Ammers, Joel John & Jason Manolopoulos of Outlier Ventures; Jeremy Barnett a Barrister at St Paul’s Chambers; Omar Rahim of Energi Mine; Toby Simpson of FETCH.ai (an Outlier portfolio company); Mark Stephen Meadows of Botanic Technologies (an Outlier portfolio company) for reviewing and providing valuable feedback. Also to Trent McConaghy for driving and pushing forward the thinking on AI DAOs.

TheDAO — The End of the Corporation?


It’s bold, risky and it could change the world. Or it could fizzle out completely. It’s called TheDAO (for now), and it is a groundbreaking attempt at using the blockchain to change what a company is in the digital age.

Let’s start at the beginning

First, what is a DAO?

DAO stands for decentralised autonomous organisation. A DAO is a self-organising entity in which software acts according to a set of rules encoded on a blockchain. The rules are immutable and transparent for everybody to see. DAOs are the zenith of digital business: a real digital business does not have a 'chief digital officer', nor is it undergoing a 'digital transformation'. A DAO is an attempt to rethink the very fundamentals of what a business can be with today's technology, unencumbered by legacy systems, legacy legal structures and, most profoundly, legacy thinking.

So, what is TheDAO?


TheDAO is an open source computer programme that works like an investment club. It exists on Ethereum, a decentralised computation platform with a built-in cryptocurrency called Ether. Ether has a real-world value in currencies like US Dollars or Pound Sterling because it can be traded on exchanges. Being an Ethereum programme, TheDAO can be the sovereign controller of an amount of Ether, and send this to others according to its hardcoded rules. TheDAO has semi-anonymous members, no board, no physical presence, no website, not even a legal entity.

TheDAO is not the first attempt at launching a DAO, nor the first attempt to launch a decentralised funding platform. It is, however, by far the most successful attempt to date. At the time of writing, it has raised $127 million, making it the highest-funded crowdfunding project of all time.

One theory to explain how one initiative could suddenly raise such a large sum of funds is that holders of Ether saw their crypto-assets sharply appreciate in value since the Ethereum crowdsale, but didn't have many options to further invest their wealth. Founder and COO of Slock.it and prominent contributor to TheDAO, Stephan Tual, states that giving focus to "sleeping ether" is indeed one possible use of TheDAO.

How does it work?

Investment by TheDAO is structured through Proposals. Real-world businesses or persons called Contractors present an investment Proposal to TheDAO in the form of an Ethereum programme and a description in plain English. A Proposal can describe anything from a lemonade store to a space tourism operator. Anything goes as long as there’s a projected return on investment for TheDAO. Backers of TheDAO vote on Proposals through the Ethereum network, and votes are fully traceable and transparent.

Now, if a Proposal, say for a lemonade store, is accepted by vote, the requested amount of Ether is released. Ideally, the Contractor builds the lemonade store, sells a lot of tasty lemonade, makes a healthy profit and returns the agreed percentage of profit to TheDAO. In a less successful outcome, the project might run at a loss and no profits are returned to TheDAO.
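That lifecycle can be sketched in a few lines. This is an illustrative Python model of the flow described above, not TheDAO's actual Solidity contract:

```python
class Proposal:
    def __init__(self, contractor: str, amount_requested: float):
        self.contractor = contractor
        self.amount_requested = amount_requested   # Ether requested
        self.votes_for = 0.0
        self.votes_against = 0.0

    def vote(self, stake: float, approve: bool) -> None:
        # Votes are weighted by the backer's token stake and fully traceable.
        if approve:
            self.votes_for += stake
        else:
            self.votes_against += stake

    def execute(self, treasury: float) -> float:
        """If the vote passes and funds suffice, release the requested Ether."""
        if self.votes_for > self.votes_against and treasury >= self.amount_requested:
            return treasury - self.amount_requested
        return treasury

p = Proposal("lemonade-store", amount_requested=50.0)
p.vote(stake=30.0, approve=True)
p.vote(stake=10.0, approve=False)
print(p.execute(treasury=127.0))   # 77.0 -- the 50 Ether is released
```

Everything after the vote is mechanical: no board sign-off, no bank transfer, just the encoded rule deciding whether the treasury balance changes.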

What can be funded?


Considering the background of the project and the size of the fund, Proposals will likely not be lemonade stores but decentralised applications. From its creation, TheDAO has had enough credibility to become a serious alternative to other forms of early-stage startup funding. Nothing prevents any backer of TheDAO from submitting a Proposal, having it voted upon and possibly funded. Confusingly, TheDAO does have Curators, but their job is only to validate the identity of Contractors; they do not provide curation per se, and certainly do not pass judgment on a Proposal. Gavin Wood, one of the original Curators, resigned from the role, emphasising that he felt the communication around it had been misleading. Curation and due diligence are not modelled as a core part of TheDAO itself; instead, TheDAO is supposed to hire experts to provide these services with its funds. All of this means there are few safeguards for investors.

The lack of limitations raises questions of illegality. What happens if a crime is committed using funds from a DAO? TheDAO is not a legal entity. Its funders are legal entities, but they are semi-anonymous, not directly connected to the decision, and possibly in many different jurisdictions. Should they all be prosecuted? The legal industry needs to start thinking about these questions.

Why does it matter?


It matters because TheDAO, and DAOs more broadly, represent the first serious attempt at rethinking what a company should look like in a digital, globalised world. Existing governance and business processes were designed for employees working together locally, in factories and offices. Board meetings, financing and investment, certificates of equity, and legal contracts are all processes built on the assumption that governance and management happen face to face, with a handshake and a signature. Sure, we now have video conferencing and digital document signing, but these are tools designed with the old assumptions in mind.

TheDAO makes a new assumption: locality doesn't matter. Curators, Contractors and Token Owners can be anywhere and still engage in economic activity together. This is a paradigm shift, and it forces us to rethink all of our old assumptions. Why do we need offices? What is the role of middle management in an era of smart contracts? How do nation states legislate, audit and protect the consumer if TheDAO does not operate in any one state?

DAOs could outcompete traditional corporate structures. Just as the networked-working trend is challenging the traditional model of labour, with individuals working a single job for a long period, DAOs could disrupt the corporation itself by offering a more fluid funding model with networked shareholders. The whole decision-making process is potentially more efficient, scalable and diverse. DAOs provide a framework to realise the core promise of the blockchain: removing intermediaries from marketplaces. Who would have thought that would include the company itself?

What are the risks?

It’s easy to say that locality doesn’t matter, but the fact is, in today’s world, where a business is based does matter. There are three important risk areas that DAOs face: economic, technical and legal.

Economic

The holdings of TheDAO are, and always will be, held in Ether, a highly volatile crypto-asset. For example, the total fund (around 11 million Ether at the time of writing) has been worth anywhere from $90M to $154M in just the past two months. To a majority of TheDAO’s backers this is likely a non-issue, as they were already invested in Ether. To a prospective investor holding capital in fiat currency, however, it adds a layer of risk on top of the financial success or failure of the individual projects TheDAO invests in.

On the funded side, Contractors receive their funding in Ether and pay their profits back to TheDAO in Ether. As Contractors are real-world businesses, they likely have fixed costs denominated in fiat currency, and may receive their revenue in fiat currency as well. Trading across multiple currencies has always carried risk, even for conventional businesses, but the currency risk of transacting in Ether is on another level.
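The currency risk can be made concrete with back-of-the-envelope arithmetic, using the fund-size range quoted above (11 million Ether worth $90M to $154M). The hypothetical Contractor's grant and fixed-cost figures are invented purely for illustration:

```python
FUND_ETH = 11_000_000                  # approximate fund size quoted above

# Implied Ether price at the fund's recent low and high valuations
low_price  = 90_000_000 / FUND_ETH     # roughly $8.18 per ETH
high_price = 154_000_000 / FUND_ETH    # $14.00 per ETH

# A hypothetical Contractor funded with 10,000 ETH against
# $100,000 of fiat-denominated fixed costs
grant_eth  = 10_000
fiat_costs = 100_000

value_at_high = grant_eth * high_price  # $140,000: costs comfortably covered
value_at_low  = grant_eth * low_price   # about $81,818: a shortfall

shortfall = fiat_costs - value_at_low   # about $18,182 short at the low
print(f"Grant value swings from ${value_at_low:,.0f} to ${value_at_high:,.0f}")
```

In other words, the same grant can either cover the Contractor's fiat costs with room to spare or leave a five-figure hole, purely on exchange-rate movement, before the project itself has succeeded or failed at anything.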

Technical

Any investment in technology carries technical risk: administration systems might contain errors, bank transfers might fail, the products developed might be faulty. Here, though, the impact of a technical error is greater than in traditional investment vehicles, because TheDAO exists only as technology. If a technical error in the underlying Ethereum network or in TheDAO itself causes, for example, a loss of funds, those funds are irrevocably lost. There is no bank to call. The developers of TheDAO aim to avoid this through careful development, testing and thorough auditing, but there are no guarantees.

A first small bug in TheDAO’s code has already been found in practice. Its impact was low: the price increase of tokens in the crowdfunding sale started one day later than planned. But it makes clear that even well-tested code can produce unexpected results, and in the case of TheDAO those results are irrevocable.

Legal

The wide suite of international laws relating to commerce and equity raising is simply not ready for TheDAO:

First, investors and companies are increasingly regulated in how investment opportunities are marketed and executed. These regulations differ considerably from country to country, but none of them fully contemplates a structure such as TheDAO. That means increased risk for founders raising money this way, as it will be unclear whether the fundraising is legally permitted, especially since owners of DAO tokens are not guaranteed to be qualified to make sophisticated investments.

Second, due diligence and the suite of investor protections that typically sit in (non-smart) shareholder agreements will need revisiting. These are not necessarily legal requirements, but investing is all about “buyer beware”: investors need to do the thinking up front. Not just about the business opportunity at hand, but also about the management team’s engagement, and about ensuring that the IP and goodwill of any particular venture are created in an entity that can be sold or floated to crystallise value for the investors. Traditional due diligence, and contractual remedies for when things go wrong, all need revisiting in a DAO scenario.

Finally, that point around value creation is fundamental. How will the investors make their return? The best tech companies are floated or sold to a strategic buyer. Can a DAO be floated? Not yet. Can a DAO be sold? Technically yes, but is that likely? Perhaps not right now but it’s not impossible. It will be quite mind-expanding for almost all strategic buyers currently, though.

DAOs as a disruptive innovation

The Dutch East India Company, the first multinational company in the world

We are at the beginning of a bold experiment into what a company should and can be in the digital age. Almost all businesses today are aware of disruptive innovation and worry about new competitors undermining their business models with new technology. But they may have all been looking in the wrong place. It might not be the business model that’s undermined; it could be the whole corporate structure itself.

The crowdsale of tokens runs until the 29th of May 2016.

This post was originally published on the Outlier Ventures blog.

Disclosure: The authors have no affiliation with TheDAO and are not invested in its tokens.