What Does Artificial Intelligence Have To Do With Blockchains?

What does artificial intelligence have to do with blockchains? Well, it’s actually helpful to remove the buzzwords and talk about a new decentralised data value ecosystem in which data is produced, distributed and consumed. Using that framing, the Internet of Things produces data; blockchains and other authentication technologies distribute it; and then the data needs to be processed, analysed and automated. This is where so-called smart contracts, decentralised compute and decentralised machine learning can be applied to data held in decentralised databases, document stores and blockchains.

It is here where innovations in the blockchain and artificial intelligence communities blur and it becomes clear they are intertwined and interconnected. Both smart contracts and machine learning offer differing levels of automation and decentralisation depending on the type of input data and level of trust the use case demands.

Distributed Compute

Distributed compute refers to computing in which a complex problem is broken down into simpler tasks. These simple tasks are distributed out to a network of trusted computers to be solved in parallel, and the solutions are then combined in such a way as to solve the main problem at hand. This is similar to how processors (CPUs and GPUs) developed from single-core to multi-core on the same circuit, with multiple cores solving a problem more quickly than one core by itself. Although a simple premise, the other computers need to be trusted for the system to work. Conversely, blockchains and ledgers may be used to create networks of computers through a ‘trust framework’ and to incentivise these nodes to work together, rewarding those who solve the simple tasks with tokens that have a financial value, no matter how small. Blockchain projects including Golem and iExec are actively solving this problem. Other projects like Truebit are working towards off-chain computation in a trustless way using a prover-verifier game. Verifiable and non-verifiable distributed processing will both be needed, depending on the level of trust between participants in the network. Interestingly, we could finally see the realisation of the National Science Foundation Network (NSFNET) project from the 1980s: a supercomputer on-demand for any computing task. Other distributed computing projects like Nyriad are looking to achieve hyper-scale storage processing, but without tokens, using a concept called ‘liquid data’.
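To make the divide-and-combine idea concrete, here’s a minimal Python sketch (purely illustrative) that splits one job across four local worker processes; in a token-incentivised network the workers would be remote, rewarded nodes rather than trusted local cores:

```python
from multiprocessing import Pool

def partial_sum(chunk):
    # Each 'node' solves a simple sub-problem in parallel.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]  # split the problem four ways
    with Pool(4) as pool:
        partials = pool.map(partial_sum, chunks)
    # Combine the sub-solutions to solve the original problem.
    print(sum(partials))
```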

Quantum computing is different to distributed computing in that it looks to solve problems that cannot be solved by existing computers (read: Turing Machines). By using quantum particles, the nascent technology has the potential to test many potential solutions to a problem at once in a single machine, rather than across a network of machines. These machines pose a potential threat to blockchain technology because blockchains rely on public key cryptography (also commonly used in banking for credit card security), which is made secure by the difficulty of finding the prime factors of huge numbers. These problems would typically take many hundreds or even several thousands of years to solve, but with quantum computers this timeframe could be reduced to hours or minutes. Companies like IBM, Rigetti and D-Wave are driving progress in the field.
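The factoring claim is easy to demonstrate. Here’s a toy Python sketch (naive algorithm, illustrative numbers) showing that classical factoring cost grows roughly with the square root of the number; real RSA moduli are hundreds of digits long, which is exactly what Shor’s algorithm on a quantum computer would undermine:

```python
import math

def trial_division(n):
    """Naive classical factoring: cost grows roughly with sqrt(n)."""
    for p in range(2, math.isqrt(n) + 1):
        if n % p == 0:
            return p, n // p
    return n, 1  # n is prime

# A small product of two primes factors in moments; a 2048-bit RSA
# modulus is utterly out of reach for this approach.
print(trial_division(104729 * 1299709))  # -> (104729, 1299709)
```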

Parallelisation is the thread that ties together distributed computing and quantum computing. On the one hand, distributed computing involves networks of computers that look to solve a problem by solving smaller problems in parallel, while in quantum computing one computer is solving many complex problems simultaneously. In both cases, we can start to rely on networks of incentivised machines to solve computational challenges, rather than servers owned by centralised entities. From an incentivisation perspective, blockchains enable these networks to work efficiently and ‘trustlessly’ with a token powering a marketplace of nodes with computing power. Quantum computers could also form part of these networks, solving the specific problems that the classical computers could not.

Smart Contracts

There are currently a handful of smart contract blockchain platforms that have successfully captured the market. According to Etherscan, there are 93,039 ERC20 token contracts. Waves, NEO and Stellar are all developing their own standards in an attempt to challenge Ethereum’s dominance. In a nutshell, smart contracts are programmable “if this, then that” conditions attached to transactions on a blockchain. If situation ‘A’ occurs, the contract is coded to execute an automated response ‘B’. This idea isn’t new, and we can find examples all around us, such as in vending machines: if button ‘A’ is pressed, then amount ‘X’ is required; if amount ‘X’ is paid, then snack ‘B’ is dispensed. By adding this simple concept to blockchains, contracts cannot be forged, changed or destroyed without an audit trail, because the ledger distributes identical copies of the contract across a network of nodes, for verification by anyone at any time. When transparency can be guaranteed, these contracts become possible in industries which would previously have deemed them too risky.
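To show just how simple the core idea is, here’s a toy Python sketch (not a real smart contract language) of the vending machine logic with an append-only audit trail:

```python
class VendingContract:
    """Toy 'if this, then that' contract: pay amount X, receive snack B."""
    PRICE = 150  # token units (illustrative)

    def __init__(self):
        self.ledger = []  # append-only audit trail, like a blockchain

    def purchase(self, buyer, amount, snack):
        if amount < self.PRICE:                     # condition not met
            self.ledger.append((buyer, amount, "refund"))
            return "refund"
        self.ledger.append((buyer, amount, snack))  # automated response 'B'
        return snack

contract = VendingContract()
print(contract.purchase("alice", 150, "snack_B"))  # -> 'snack_B'
```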

With embedded legal frameworks, smart contracts have the potential to replace and automate many existing paper contracts; Mattereum is working on such legally-enforceable smart contracts. The process of buying a house could become more efficient, with no banks, lawyers or estate agents: countless hours, expenses and middlemen condensed into a few dozen lines of code and an automated product. This automation principle applies to any industry which requires trusted third parties to oversee agreements. Contracts are only as good as their enforcement, so decentralised dispute resolution services are necessary to make smart contracts useful; early efforts in this direction, such as Kleros, are utilising prediction markets and reputation staking tools.

With the rapid development and convergence of AI and decentralised networks, we will begin to see more complex smart contracts develop, such as contracts connected to expansive neural networks. The development of these systems could see inconsistencies found in legal frameworks, resulting in a more robust legal system. Smart contracts would be built upon those legal models, with which AI must comply. It is still early in the development cycle of smart contracts, and progress will require collaboration from the legal industry as well as lawmakers in government; smart contracts should be seen as the legal rails for the digital world. If tokens are the beginnings of digitally-native money and financial assets, smart contracts are the beginning of a digitally-native legal system. Smart contracts, along with distributed computation and decentralised machine learning, will automate data in the Convergence Ecosystem, creating unprecedented levels of automation within auditable parameters.

Decentralised Machine Learning

Machine learning is a field within artificial intelligence that focuses on enabling computers to learn rather than be explicitly programmed. More traditional AI approaches based on rules and symbols are not capable of capturing the complex statistical patterns present in natural environments such as visual and auditory scenes, and our everyday modes of interaction such as movement and language. A relatively recent breakthrough in machine learning called deep learning is currently driving progress in the field (though for how much longer is up for debate). Deep learning techniques are ‘deep’ because they use multiple layers of information processing stages to identify patterns in data. The different layers train the system to understand structures within data. In fact, deep learning as a technique is not new, but combined with big data, more computing power and parallel computing, it has become increasingly accurate in previously challenging tasks such as computer vision and natural language processing. The most recent breakthroughs in transfer learning and strategic play come from the combination of deep learning and reinforcement learning, as with DeepMind’s AlphaGo.
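The ‘multiple layers of information processing stages’ point is easiest to see in code. Here’s a minimal forward pass (random weights, no training, purely illustrative) in which each layer transforms the previous layer’s output:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w, b):
    # One information-processing stage: linear map plus non-linearity.
    return np.maximum(0.0, x @ w + b)  # ReLU activation

x = rng.normal(size=(1, 8))                  # raw input features
w1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
w2, b2 = rng.normal(size=(16, 16)), np.zeros(16)
w3, b3 = rng.normal(size=(16, 4)), np.zeros(4)

h1 = layer(x, w1, b1)    # early layers capture low-level structure
h2 = layer(h1, w2, b2)   # deeper layers build higher-level abstractions
scores = h2 @ w3 + b3    # final output, e.g. scores over 4 classes
print(scores.shape)      # (1, 4)
```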

Machine and deep learning techniques can transform raw data into actionable knowledge: converting voice input into text output in voice-to-text programs, or turning LIDAR input into a driving decision. In diverse fields including image and speech recognition, medical diagnosis and fraud detection, machine learning is equipping us with the ability to learn from large amounts of data. In the current machine learning paradigm, solutions are delivered as cloud-based APIs by a few leading companies. But it is becoming increasingly apparent that this paradigm is not sustainable.

“Data and services are costly to use and can’t sell themselves. It’s staggering to consider all that gets lost without its value ever being realised — especially when it comes to intelligence constructed about markets and data. We simply can’t let all that value be captured by a select few. Fetch has a mission to build an open, decentralised, tokenised network that self-organises and learns how to connect those with value to those who need it, or indeed may need it; creating a more equitable future for all.“ Toby Simpson, Co-founder, Fetch

As per the theme of the Convergence paper in general, centralised systems suffer from a few fundamental problems: inability to coordinate globally, limits on collaboration and interoperability, and the tendency toward market monopoly and censorship behaviours. With machine learning becoming integral to our lives, centralised machine learning is a threat to both economic competition and freedom of speech.

The Convergence Ecosystem, if realised, provides global data sharing and marketplace infrastructure enabling AIs to collaborate and coordinate processing in a decentralised way. It removes centralised bottlenecks for heavy computational workloads and helps address latency issues, reducing the time needed to train models. On-device training like Google’s Federated Learning model is a technical improvement, but it lacks the ability for mass coordination using marketplaces and tokens.
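For intuition, here’s a minimal federated-averaging-style sketch (toy linear model, synthetic data, all names illustrative): each device computes an update on its own data, and only the updates are averaged, never the raw data:

```python
import numpy as np

def local_update(w, data, lr=0.1):
    # Each device trains on its own data; raw data never leaves it.
    X, y = data
    grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
    return w - lr * grad

def federated_round(global_w, devices):
    # The server averages model updates, not data (FedAvg-style).
    return np.mean([local_update(global_w.copy(), d) for d in devices], axis=0)

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(5):
    X = rng.normal(size=(20, 2))
    devices.append((X, X @ true_w + 0.01 * rng.normal(size=20)))

w = np.zeros(2)
for _ in range(200):
    w = federated_round(w, devices)
print(w)  # converges towards [2, -1] without pooling any raw data
```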

Decentralised machine learning not only provides a coordination mechanism for the more efficient allocation of resources; it also increases access to machine learning capabilities by allowing anyone to submit models and algorithms and get paid based on quality and utility. SingularityNET, doc.ai and Fetch (a portfolio company) are examples of companies already building this type of decentralised artificial intelligence. Decentralised machine learning will be the result, but it would not be possible without the development of distributed ledgers, consensus, identity, reputation, interoperability protocols and data marketplaces.

We must avoid the “disconnected and dysfunctional ‘villages’ of specialization”, as Alexander von Humboldt put it, and instead aim for a holistic view that sees the connectedness of seemingly disparate technological innovations.

Read the full Convergence paper here, or go back and read the rest of the abridged Convergence articles:

VC for The Decentralised Future: Introducing the Convergence Ecosystem

Today we are introducing the Outlier Ventures vision of the future and refined thesis: The Convergence Ecosystem.

The Ecosystem sees data captured by the Internet of Things, managed by blockchains, automated by artificial intelligence, and all incentivised using crypto-tokens. The Convergence Ecosystem is open-source, distributed, decentralised, automated and tokenised and we believe it is nothing less than an economic paradigm shift.

How We Got Here: The Outlier Journey

From Blockchain-enabled Convergence

In late 2016, we published a paper titled: ‘Blockchain-enabled Convergence’ outlining our investment strategy. The paper was the result of over three years’ experience researching, investing and building blockchain-based businesses. Our insight was that blockchains are not just a secure ledger for cryptocurrencies and other digital assets, but that they represented something more transformative: a decentralised data infrastructure. Infrastructure that could solve technical and market problems across a variety of emerging technologies like artificial intelligence, autonomous robotics, the Internet of Things, 3D printing and augmented and virtual reality.

In 2017, crypto-tokens proved they are the first digitally-native mass coordination mechanism

2017 saw a vast change in the cryptocurrency and blockchain markets, reaching arguably the peak of inflated expectations as per the Gartner Hype Cycle. The ERC20 smart contract industrialised the token sale crowdfunding model, raising over $4 billion in funding. Despite misplaced energy and too much focus on token prices, it is now clear, in a way that wasn’t in late 2016, that crypto-tokens are a critical missing component in decentralised networks — the first digitally-native mass coordination mechanism for humans, bots and machines. Recognising the underlying importance of crypto-tokens to create an ecosystem of converging technologies, we started investing.

From IOTA, Botanic & SEED, Evernym & Sovrin, to Fetch and Ocean

Over the last year we have partnered with and invested in: IOTA, a foundation building Internet of Things infrastructure with a new type of decentralised data structure; Botanic and the SEED Vault foundation it founded, creating a platform for developers to publish trusted software bots; Evernym, a company using the Sovrin Network and Protocol to establish self-sovereign identity; Fetch, a startup building an emergent intelligence protocol combining distributed ledgers with machine learning; and most recently Ocean Protocol, which is developing a decentralised data exchange protocol to unlock data for AI. Each of these investments has been strategically chosen because it is a complementary piece of the decentralised infrastructure required to create the Convergence Ecosystem.

Why We Need The Convergence Ecosystem

Centralised Web 2.0 has failed…

Centralised Web 2.0 digital infrastructure has failed. Too many hacks and data leaks. No individual privacy. Monopoly control over global information and communication networks. The Internet of Things is creating an unmanageable data environment, and artificial intelligence is giving those who control the most data more power than any company in history. As Tim Berners-Lee, the creator of the Web, recently wrote:

“What’s more, the fact that power is concentrated among so few companies has made it possible to weaponise the web at scale. In recent years, we’ve seen conspiracy theories trend on social media platforms, fake Twitter and Facebook accounts stoke social tensions, external actors interfere in elections, and criminals steal troves of personal data.”

Something must change.

We are 10 years into the decentralisation revolution

It has been ten years since the publication of Satoshi’s seminal paper and the introduction of the first viable decentralised solution to the problem of double-spend in digital networks. Bitcoin sparked interest and innovation in other cryptographic and decentralised technologies including blockchains and crypto-tokens. We are in a rapid period of experimentation around decentralised technologies including consensus mechanisms, identity, data structures, crypto-economic designs and smart contracts. Taken together, we see the foundations of a new data infrastructure.

Our Vision: The Convergence Ecosystem

Introducing the Convergence Ecosystem

We believe that future decentralised data infrastructure will come from the convergence of the Internet of Things (data production), blockchains (data distribution), and artificial intelligence (data consumption). The integration of these technologies will see markets become increasingly open-source, distributed, decentralised, automated, and tokenised.

The Convergence Ecosystem consists of four parts: governance, production, distribution and consumption. Each of these is explored in the paper, and we will be publishing further analysis into each part throughout the year.

Governance — How are protocols and communities governed and incentivised?

  • Data flow through the ecosystem is coordinated and incentivised using crypto-assets, crypto-currencies and crypto-consumables designed to incentivise behaviours of people, machines, devices and agents to the benefit of the overall ecosystem. These new types of assets will see continued rapid experimentation with supply and demand policy, including fungibility mechanisms like Ethereum’s ERC-721 non-fungible tokens (NFTs).
  • Emergent governance models will have differing levels of decentralisation and automation depending on the values of the community. Some will value censorship-resistance and others self-sovereign identity. Some new decentralised projects will be guided by social democratic values that prioritise wealth redistribution through ‘network’ (read: state) intervention or pre-agreed taxation rules. Others will prioritise ethical and environmental values, with green-friendly policies that use non-consumption-based consensus mechanisms (e.g. Chia) and focus on common ownership and resource sharing. Communities will continue to experiment with traditional governance models like corporations and newer structures like decentralised organisations or decentralised autonomous organisations (DAOs).

Production — How is data produced?

  • Data is brought into the ecosystem by either hardware connected to the Internet of Things or software such as digital, virtual or augmented spaces.
  • We are creating and collecting more data than ever, but we are storing it in insecure private databases with no incentives to share the data. Data breaches and hacks are commonplace, and the data can be censored or tampered with. Software-generated data is lost, hoarded or latent. There is no reason for consumers to do anything other than to give data away for free and for corporations to hoard it. Decentralised infrastructure offers a solution.

Distribution — How is data authenticated, validated, secured and stored? How is it transported across databases and blockchains, and how is it exchanged?

  • Once data is in the ecosystem it needs to be authenticated, validated and secured. This is where blockchains or, more specifically, distributed ledgers, consensus mechanisms, self-sovereign identity and reputation, and decentralised storage and data integrity solutions are valuable tools (a toy data-integrity sketch follows this list).
  • Using new data distribution protocols (transport and messaging, state communication, and value and data interoperability), data can be efficiently moved from storage across networks and protocols to marketplaces.
  • Marketplaces are already developing, going beyond just cryptocurrencies to support the buying and selling of all sorts of other data types, including Internet of Things data, artificial intelligence data, personal data, and a range of newly emerging digital assets including, but not limited to, CryptoKitties.
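As a flavour of the data integrity piece mentioned above, here’s a toy hash-chained log in Python (real systems would add public-key signatures and consensus on top):

```python
import hashlib, json

def record_hash(record, prev_hash):
    payload = json.dumps(record, sort_keys=True).encode()
    return hashlib.sha256(prev_hash + payload).hexdigest()

# An append-only, hash-linked log: each entry commits to all before it.
log, prev = [], b"genesis"
for record in [{"device": "sensor-1", "temp": 21.4},
               {"device": "sensor-1", "temp": 21.9}]:
    digest = record_hash(record, prev)
    log.append((record, digest))
    prev = digest.encode()

# Tampering with an early record changes every later hash, so any node
# can detect it by recomputing the chain.
print(log[-1][1])
```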

Consumption — How is data turned into insight?

  • Finally, data is processed, analysed and automated using a range of technologies including distributed computation, decentralised machine learning and smart contracts.
  • This is where data is transformed into actions and insight using traditional and distributed computing techniques, as well as newer types of computing such as quantum computing. It is at this layer where blockchains and artificial intelligence blur and it becomes clear they are intertwined and interconnected. Both smart contracts and machine learning offer differing levels of automation and decentralisation depending on the type of input data and level of trust the use case demands.

Winners will differentiate on values and trust

The open-source nature of the technology, the ease of forking, near-zero costs of digital distribution, and interoperability protocols all mean projects will struggle to differentiate on technology in the long term. Successful projects will differentiate through political values such as libertarianism, self-sovereignty and egalitarianism, as well as through trust. This makes the Convergence Ecosystem structurally different from other markets, in which value capture happens at friction points. With very few friction points and little lock-in, we are unlikely to see the same market consolidation dynamic that has dominated previous digital markets. When technology and data are open and free, lock-in will come from brand and values. There will be as many protocols as there are value-systems and personal priorities.

There will not be one chain to rule them all. In a world of scarcity, competition is the optimal strategy. In a world of abundance, we must change our mental models. The Convergence Ecosystem drives collaboration rather than competition.

Outlier Ventures: VC for The Decentralised Future

The Convergence Ecosystem is our vision of the future. We expect the Ecosystem to support hundreds of communities that will over time outcompete their Web 2.0 competitors for developers and users using tokenised business models. This shift will not occur overnight. People will continue to focus on the price of crypto-assets and worry about the regulatory implications of public token sales.

But behind the scenes, a decentralised infrastructure is being built.

Network by network.

Protocol by protocol.

We want to invest and partner with tokenised communities to build decentralised economies. Join us to help build the decentralised future!

Download the full paper here

Also a big thanks to all the Outlier Ventures team including Joel John, Harry McLaverty and Shaquile Noir for their work on putting this together. Also to Jamie Burke, Aron van Ammers, Eden Dhaliwal, Anesu Machoko, and Geoff Le Fevre for their contributions and feedback.

Also a massive thank you to all of the people outside of Outlier that contributed to the paper:

Dele Atanda — CEO, metâme x (Dele Atanda)

Chris Burniske — Partner, Placeholder & Author of Cryptoassets (Chris Burniske)

Dr Rose Chan — Founder, Ladyboss.world & Former Head of Blockchain Working Group, World Bank (@I_am_rose)

Professor David Lee Kuo Chuen — Professor, FinTech & Blockchain, Singapore University of Social Sciences (@DavidKChuenLEE)

Matt Chwierut — Director of Research, Smith + Crown (@Skryptical)

Dr Anne Hsu — Assistant Professor, Department of Computer Science, Queen Mary University

Dr Stylianos Kampakis — Research Fellow, UCL Centre for Blockchain Technology (@s_kampakis)

Samuel Klein — Fellow, Berkman Centre for Internet & Society at Harvard University (Samuel Jay Klein ❦)

Professor William Knottenbelt — Director, Imperial College Centre for Cryptocurrency Research and Engineering (@will_wjk)

Dr Robert M. Learney — Lead Technologist Blockchain & DLT, Digital Catapult (Robert Learney)

Trent McConaghy — Co-founder, BigchainDB & @Oceanprotocol (Trent McConaghy)

Mark Stephen Meadows — CEO & Founder, Botanic.io & SEED Token (Mark Stephen Meadows)

Teemu Paivinen — Founder, Equilibrium Labs & Author of Thin Protocols (Teemu Paivinen)

Samuli Poyhtari — Founder, OrbitDB (@haadcode)

Drummond Reed — Chief Trust Officer, Evernym (Drummond Reed)

Toby Simpson — Co-founder, Fetch.AI

Dr Phillip J. Windley — Chairman, Sovrin Foundation (Phil Windley)

Disrupting Tech Monopolies & AI Tycoons — Part 2

Blockchains combined with artificial intelligence is more than just a technical innovation: it’s an economic paradigm shift 💰💰💰

This is Part 2 of Disrupting Tech Monopolies & AI Tycoons — Part 1 outlines how we will get here.


>>Blockchains combined with AI will create the conditions for disruption of platform monopolies>>

>> As data stops being a competitive advantage, powerful token-driven network effects will lead to AI agents using blockchains to accumulate tokens>>

>>This will lead to profound questions about how we govern non-human entities in the economy and society. 🤔 🤔


End of AI Platform Monopolies

So, now we have a global data sharing and monetization network

Right, so where were we? Oh yeah, we now have a global network of interconnected blockchains and DLTs that share value seamlessly, with easy-to-use data exchanges (see Part 1). Hopefully, as the industry begins to focus on usability and user design, we will be in a world in which anybody can publish data with the press of a button or a voice command. Payments in Bitcoin or other tokens are seamless and automated, based on rules coded into smart contracts. For the average user, all they have done is agree to conditionally share data, as they do today with Facebook and other systems, and the next thing they know they have tokens to spend however they want. They can convert to a national currency or merrily purchase their preferred goods and services.

All that matters today for AI platforms is data…

How does this lead to the end of AI platform monopolies? Well, in 2017 the only thing that matters is data. Platforms like Google, Facebook and Baidu collect data to feed their AI algorithms (specifically their deep learning algorithms), improving their products. More data improves the products, which in turn brings more customers and engagement, which in turn generates more data.

When AI is the driver of product improvements, proprietary data is the most valuable asset a platform has. In fact, access to proprietary data sets is one of the most important assets for an AI startup. The way to think about it: data is the supply, and AI algorithms are the demand. And deep learning models are hungry.

But hold on a moment, blockchains aggregate and commoditize data. That means…

Here is the knockout: blockchains aggregate the supply side for free (almost) for all. Of course, there will be some transaction fees and other friction points, but compared to existing data infrastructure an open, shared data layer essentially commoditizes data. Or at the very least makes proprietary datasets much less valuable.

But that means control of data is no longer the leverage point in value chains…

Firms that control supply — data — no longer dominate markets. Data stops being a moat and a competitive advantage. Now the demand side becomes the most valuable place in the ecosystem. This is where the customer relationship is won: with trust. Well, trust, and a simple, easy-to-use interface, maybe a conversational or voice UX. The only thing that matters in 2020: the customer relationship. (Side note: the EU’s General Data Protection Regulation, or GDPR, will reinforce this.)


Beginning of Blockchain-enabled AI

So we have this global shared data layer right…

A second, longer-term implication of a global shared data layer is blockchain-enabled AI. Agents (not even particularly intelligent agents) can use blockchains as a ‘substrate’, as Fred Ehrsam has put it in the past.

Deploying and using agents on blockchains rather than using proprietary tools and platforms like Facebook Messenger will be more attractive to developers, users and regulators.

Well developers are going to love it…

For developers: first, they have access to a vast amount of free (on public chains, anyway) structured data, data they would never be able to buy or generate themselves at first. Second, the data is structured and high-quality (right now, just transaction data, but increasingly all sorts of value store and exchange). Third, native automation tools in smart contracts, and hopefully very soon production sidechains, make it easier to build reasonable agents that can perform reasonably complex actions. Finally, developers and companies that deploy agents have a native payment channel and can be paid almost immediately, based on all sorts of conditions like usage or user utility. Business models with tokens and smart contracts are not limited to up-front payment or a paywall; all sorts of new business models will be available to experimental developers.
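As a sketch of what a pay-per-use business model could look like (all names and numbers illustrative, not a real smart contract):

```python
class UsageMeteredAgent:
    """Toy sketch: an agent paid per call from a token balance."""
    def __init__(self, fee=2):
        self.fee = fee
        self.balances = {"agent": 0}

    def call(self, user, query):
        # Contract condition: the user must fund the call before it runs.
        if self.balances.get(user, 0) < self.fee:
            raise ValueError("insufficient tokens")
        self.balances[user] -= self.fee      # pay-per-use, not a paywall
        self.balances["agent"] += self.fee   # developer is paid immediately
        return f"answer to {query!r}"

agent = UsageMeteredAgent()
agent.balances["alice"] = 10
print(agent.call("alice", "route to station"))
print(agent.balances)  # {'agent': 2, 'alice': 8}
```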

Users will love it, too. They can get paid just to use agents…

Users benefit because, unlike in any other environment, they will have direct access to token capital, investment and a real interest in the system. When users use a Facebook Messenger bot, they get some utility. When they use an agent on a blockchain, they can be rewarded or paid with tokens. Depending on the token economics, a user can ‘own’ a stake in the agent or the company behind the agent. The more the user uses or evangelises the product, the stronger the product and the underlying blockchain get. Network effects, with a direct monetary reward thrown in. In a sense, a user is no longer a passive consumer of a service; they are a stakeholder. This model begins to look more and more like a digital cooperative. (Something we are actively exploring at Outlier Ventures with the tokenization of Botanic Technologies’ bot platform: a project named SEED that allows the fair exchange of information between AI and people.)

Regulators will love it the most, they might even force data and AI agents to use blockchains…

The last stakeholder, and potentially the deciding factor, will be regulators and governments that demand some element of control or access to AI algorithms. The public and political tide is turning against technology companies. Certainly many governments around the world are waking up to the power amassed by large US-based tech firms through their exploitation of data. Without overselling it, it seems to me that an open-source, auditable data structure would be an ideal technical solution for regulators that want a window into AI decision-making and the data used to train models. This would at the very least allow scrutiny of training data to check for bias, as well as potentially providing an audit trail to explore if an agent makes a bad decision. It’s not a leap to imagine regulators actually mandating the use of a public blockchain, or demanding a node in private networks, for auditability of AI.
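One way to picture that audit trail: fingerprint the training set and anchor the fingerprint on-chain. A toy sketch (illustrative only, not any regulator’s actual scheme):

```python
import hashlib

def dataset_fingerprint(rows):
    # Hash the training set so the exact data behind a model can be
    # checked later (e.g. for bias) without republishing the data.
    digest = hashlib.sha256()
    for row in sorted(rows):
        digest.update(row.encode())
    return digest.hexdigest()

training_rows = ["age=34,outcome=1", "age=51,outcome=0"]
# In this sketch the fingerprint is written to a public chain at training
# time; an auditor later recomputes it from the disclosed data and
# compares, giving an audit trail for the model's inputs.
print(dataset_fingerprint(training_rows))
```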

So now we have the perfect environment for autonomous agents…

If this scenario plays out, you have more developers, more users and happy regulators. There are many different names for these new types of decentralised AI; I like Autonomous Economic Agents (AEAs). They are the logical next step when autonomous agents start using blockchains (something FETCH.ai, an Outlier portfolio company, is working on). The level of human involvement with the agents will vary: some AIs will be managed by traditional organisations, others by decentralised autonomous organisations (DAOs). Regardless of the human involvement, the fact is AIs will be accumulating tokens (seen another way, wealth). For example, an autonomous vehicle can be paid in tokens for rides and can pay for re-charging and servicing with tokens. Or an AI DAO could manage a neighbourhood distributed energy grid in which energy is exchanged using smart contracts based on real-time supply and demand.
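The autonomous vehicle example reduces to a surprisingly small loop. Here’s a toy sketch (illustrative names and amounts) of an agent earning and spending tokens, with every transfer logged:

```python
class AutonomousVehicleAgent:
    """Toy autonomous economic agent: earns tokens for rides and
    spends them on charging, logging every transfer."""
    def __init__(self):
        self.tokens = 0
        self.ledger = []

    def earn(self, amount, memo):
        self.tokens += amount
        self.ledger.append(("+", amount, memo))

    def spend(self, amount, memo):
        if amount > self.tokens:
            return False  # the agent must stay solvent to keep operating
        self.tokens -= amount
        self.ledger.append(("-", amount, memo))
        return True

car = AutonomousVehicleAgent()
car.earn(12, "ride: downtown")
car.spend(5, "recharge: station-7")
print(car.tokens, car.ledger)
```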

Cool yeah, this sounds like a pretty big deal…


I don’t think many people have truly thought through the implications of this. A non-human entity, not controlled by humans, will have the ability to acquire resources and wealth. When people talk about exponential growth, this is exactly what they are talking about. Society and politics are simply not ready to even begin a discussion about these sorts of issues. Can an autonomous agent generate wealth? What is the optimal level of taxation that doesn’t act as a disincentive to activity? We already have enough trouble collecting taxes as it is; how, and from whom, will we collect taxes from an AI DAO?

Blockchain-enabled AI might seem pie in the sky. But unlike, say, artificial general intelligence (AGI), we know exactly which problems need to be solved to bring this vision to reality. There are already rudimentary versions of these agents available today. For more on AI DAOs, you must read Trent McConaghy’s AI DAOs, and Three Paths to Get There.

Yeah, it’s a big deal alright, possibly an economic paradigm shift…

Blockchains combined with artificial intelligence is more than just a technical innovation: it’s an economic paradigm shift. The political philosophy written in the next 10 years will be as important as that of the socialist and labour movements of the late 20th century.

Thanks for reading 👍. If you enjoyed it, please hit the 👏 button and share on Twitter and Linkedin. Honestly though, Ev Williams, surely tokenizing claps is the perfect business model for Medium?


This is a working thesis and a high-level description of the work we are doing at Outlier Ventures. I am looking for feedback, so please tweet me Lawrence Lundy. The thesis can certainly be improved upon. I am particularly keen to explore the potential impact of improved unsupervised learning algorithms and reinforcement learning on the need for large data sets. If large data sets are no longer required, data would become less valuable, which would still erode the platforms’ data advantage but would not, by itself, enable blockchain-enabled AI.

Thanks to Aron Van Ammers, Joel John & Jason Manolopoulos of Outlier Ventures; Jeremy Barnett a Barrister at St Paul’s Chambers; Omar Rahim of Energi Mine; Toby Simpson of FETCH.ai (an Outlier portfolio company); Mark Stephen Meadows of Botanic Technologies (an Outlier portfolio company) for reviewing and providing valuable feedback. Also to Trent McConaghy for driving and pushing forward the thinking on AI DAOs.

Disrupting Tech Monopolies & AI Tycoons — Part 1

Blockchains & Artificial Intelligence is not just a technical innovation: it’s an economic paradigm shift 💰💰💰


>>Blockchains combined with AI will create the conditions for disruption of platform monopolies>>

>> As data stops being a competitive advantage, powerful token-driven network effects will lead to AI agents using blockchains to accumulate tokens>>

>>This will lead to profound questions about how we govern non-human entities in the economy and society. 🤔 🤔


Data is siloed…

It is almost banal to say it, but as a society we have a data problem. Most of the world’s data is held on private servers. The client-server architecture of the Internet and the design of corporate IT networks have resulted in data being hosted and stored in private, centralised databases. Lucrative consulting businesses sprang up to help organisations connect systems and make it easier to share data internally. Open Application Programming Interfaces (APIs) have gone some way towards opening up data externally, especially in the public sector, but nevertheless these fixes are typically forced upon organisations. PSD2 is supposed to force banks to open up their data, but they are doing their best to wriggle out of it; see Chris Skinner’s blog here.

Regulation and data privacy further limit sharing…

Even when organisations are set up to share data and have a culture of sharing and collaboration, privacy laws and data protection legislation have had a chilling effect on data sharing. The Health Insurance Portability and Accountability Act (HIPAA) in the US and the Data Protection Act in the UK explicitly state what data can and cannot be shared. But these are complicated policies. The technical difficulty of implementing data policies, combined with the degradation of user experience those implementations produce, makes IT systems costly, inefficient and unpleasant to use. On top of which, the security theatre that surrounds them only adds to the insecurity of participating in such systems. It’s just not worth it for people to go through the risk and hassle of sharing data.

It’s also really hard to make any money from sharing data…

Even when sharing is encouraged internally, technically possible with open APIs, and legislation is favourable, current data infrastructure makes it difficult to monetise data. The best that happens is that external developers or citizens get free access to data. Lovely stuff for the user of the data, not so much for the publisher. There is no good way to get paid appropriately for published data. Some content can be published using an open-source license, sure, and attribution is great, but it doesn’t pay the bills. Dual-licensing for software is possible: see Oracle’s MySQL database, which is dual-licensed under a commercial proprietary license and under the GPLv2. But in most cases licensing is costly, difficult to enforce and inefficient. Licensing of personal data is officially non-existent, yet companies still make money from this value, particularly in developing countries. Around the world, doctors use Facebook’s WhatsApp to send medical reports and nurses use Gmail to provide remedial advice, and that data, while not licensed, is exposed to advertisers as users unthinkingly share it.

So basically there is no business model for data sharing…

The fact is today there is no rational economic incentive for individuals to do anything other than give data away for free and for corporations to hoard it. If only there was a solution…


Blockchains are terrible databases, stop going on about it…

First and foremost, despite my clickbait-y headline, I want to get this out of the way: public blockchains are in most ways worse than existing databases. They are slower, have less storage, are extremely energy-inefficient, and are in most cases less private (although zero-knowledge proofs 👏, self-sovereign identity 👏 and Masked Authenticated Messaging (MAM) 👏 will help with this). But these design choices are made to improve one feature: decentralisation. By decentralisation I mean the elimination of a central administrator, which leads to extreme fault tolerance, increased data integrity and tamper-evidence. The removal of centralised control and vulnerable centralised data repositories has trade-offs which make most blockchains unsuitable for a ton of the use cases that have been spoken about. But in cases where security and tamper-evidence are more important than throughput, speed, capacity and stable governance, public blockchains are well worth exploring. For more on this, Adam Ludwin of Chain explains it better than anyone in this post.

Okay, so it’s more secure, but nobody cares about security…

So the question becomes: how important is security to individuals, corporates and Governments, right? The bull case for blockchains is that it matters a lot and it will matter more in the future. If security continues to be an afterthought, well existing databases are cheaper, faster and more convenient, so why bother with blockchains at all?

Well, I tend to believe that security will become more important. Things like the Yahoo! and Equifax hacks certainly shine a light on the vulnerability of centralised data providers, but tbh I really don’t think individuals are going to demand change. People are going mad for Amazon Echos and Dots and sticking them in every room, while very few are asking: what data is actually being collected? Where is it being stored? Is it encrypted? How can it be combined with other datasets? Security and data protection matter far more to business and to government, and the so-called Internet of Things will force the change.

Blockchains will actually help manage the sharing of data instead of fighting over it…

Never has so much data been available for collection and analysis. Connected cars are throwing off vast amounts of data, and the challenge is that every single stakeholder wants access for their own purposes: car makers want it to improve the driving experience; tire makers want it to see how their tires perform; city administrations want it for traffic prediction; and software makers want it to improve their self-driving software. As sensors are embedded in all sorts of everyday objects, everybody is fighting over who ‘owns’ the data. This is fighting yesterday’s war. Blockchains can provide an open, shared data layer to which all stakeholders have access.

So blockchains will actually be quite useful in securing and sharing data…

Sure, not every bit of data will need a fully decentralised blockchain with proof-of-work. In most cases a simple distributed ledger with a Merkle tree will suffice (see DeepMind Health’s Verifiable Data Audit). Much of the data could even be stored off-chain, with just links to the on-chain hash. Regardless of the blockchain flavour, cryptographically-secured distributed ledgers offer a better alternative to centralised databases. Of course, an assumption here is that blockchains don’t suffer the familiar fate of incompatibility between competing chains; the community does seem to be fully behind blockchain-connecting projects like Polkadot, Cosmos, Atomic Swap and AION. These services, combined with zero-knowledge proofs, mean data can be shared privately on public ledgers. At this point, we are close to the ideal of a globally shared database with easy and, ideally, public permissions.
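Since the Merkle tree is doing the heavy lifting in that ‘simple distributed ledger’ claim, here’s a minimal Python sketch of one (illustrative, unoptimised):

```python
import hashlib

def sha(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Fold leaf hashes pairwise up to a single root hash."""
    level = [sha(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                # duplicate the last node if odd
            level.append(level[-1])
        level = [sha(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

records = [b"record-1", b"record-2", b"record-3"]
# Store the records off-chain and only the root on-chain: changing any
# record changes the root, so tampering is evident.
print(merkle_root(records).hex())
```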

Add in data exchanges…

Now, the final piece. Data exchanges like the Ocean Protocol bring together data buyers and sellers (including bots and devices). As explained, today data is either given away for free or sits underutilised because people and organisations have no way to monetise it. A blockchain-based data exchange can enforce data quality standards, ownership and usage rules, and pay sellers who rent or sell data. A data exchange provides the missing component of a shared ledger: a business model. People and organisations can easily earn money from their data.
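A data exchange at its simplest is a listing plus a payment transfer. A toy sketch (illustrative only; not how Ocean Protocol is actually implemented):

```python
class DataExchange:
    """Toy marketplace: list datasets, transfer tokens on purchase."""
    def __init__(self):
        self.listings = {}   # dataset_id -> (seller, price)
        self.balances = {}

    def list_dataset(self, seller, dataset_id, price):
        self.listings[dataset_id] = (seller, price)

    def buy(self, buyer, dataset_id):
        seller, price = self.listings[dataset_id]
        if self.balances.get(buyer, 0) < price:
            raise ValueError("insufficient funds")
        # In a real protocol a smart contract would release payment only
        # once access to the data is proven.
        self.balances[buyer] -= price
        self.balances[seller] = self.balances.get(seller, 0) + price
        return f"access-key:{dataset_id}"

ex = DataExchange()
ex.balances["buyer1"] = 100
ex.list_dataset("sensor_co", "traffic-2018", 40)
print(ex.buy("buyer1", "traffic-2018"))  # -> 'access-key:traffic-2018'
print(ex.balances)
```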

Now we have the infrastructure to share and monetise data…

Sure, people won’t just stop using Google or Facebook tomorrow. The value they provide is far too great. But these new networks will change the conversation. The public will begin reading news stories about how they can be paid when people download their pictures or paid when they upload their smart watch data.

The seed of disruption has been planted. Why allow yourself to be sold for nothing when you can get paid?


If this piqued your interest and you want to know more about the implications Part 2 is available to read here.

Thanks for reading 👍. If you enjoyed it, please hit the 👏 button and share on Twitter and Linkedin.


Thanks to Aron Van Ammers, Joel John & Jason Manolopoulos of Outlier Ventures; Jeremy Barnett a Barrister at St Paul’s Chambers; Omar Rahim of Energi Mine; Toby Simpson of FETCH.ai (an Outlier portfolio company); Mark Stephen Meadows of Botanic Technologies (an Outlier portfolio company) for reviewing and providing valuable feedback. Also to Trent McConaghy for driving and pushing forward the thinking on AI DAOs.

Blockchain-Enabled Convergence

First published at www.outlierventures.io

Humans often use the past as a guide for the future; this mistake often makes it impossible to adapt to our rapidly changing world. Technological progress is not linear but exponential in nature, making it much harder to grasp. This means we constantly underestimate the pace of change, and as software eats more industries, improvements compound as traditionally human-centric industries like healthcare, logistics and agriculture digitise. As these industries come online and capture, process and automate data, ownership of this data will define the state, market and nation over the next half a century.

Blockchains are therefore one of the most significant technological innovations since The Internet and fundamental to Web 3.0. Blockchains, distributed ledgers, smart contracts and other decentralisation innovations provide the foundation for a scalable and secure data and asset management layer for the new Web 3.0. They act as a platform to support individual rights while benefiting from the aggregation of vast amounts of data from the Internet of Things. They also ensure the benefits of artificial intelligence are shared broadly across society and do not accrue to a few AI owners, or the 0.00001% of the population. The Internet of Things, artificial intelligence, autonomous robotics, 3D printing, and virtual & augmented reality are all converging in new and exciting ways. Blockchains will become the decentralised data and asset management layer that links the data and value from these technologies, ushering in the era of blockchain-enabled convergence.

Convergence is not a process that will happen immediately, nor will it be a simple and linear progression. Trends will combine at different speeds based on technical limitations, political and social barriers, as well as commercial considerations. Market dynamics will vary, with industrial manufacturers and telecommunications providers leading the charge in the Internet of Things, while consumer Internet companies like Google and Facebook innovate in artificial intelligence. It is important to grasp the nuances of each market, but in doing so, it’s easy to miss broader macro-trends. The development of blockchains is a good example, as exceptionally talented developers push the boundaries of cryptography with zero-knowledge proofs and smart contracts but fail to see the implications for broader governance structures and political philosophies. These are the kinds of things we have been trying to figure out since the dawn of civilisation. It is just as important in technological progress to study Plato and Hume as it is to study von Neumann and Shannon.

As the rate of change increases, it is critical to understand these technology trends as part of a wider collective rather than as separate developments. Blockchain-enabled convergence is our attempt to capture this wider collective. The first part of the white paper explores blockchains, artificial intelligence, the Internet of Things, autonomous robotics, 3D printing, and virtual & augmented reality to understand the drivers of and barriers to adoption. Part two investigates how blockchain-enabled convergence changes the trade value chain, from manufacturing and design through logistics and distribution to retail and commerce, and, even more profoundly, how it changes the very governance structure of the organisation.

5 Key Themes

The Outlier Ventures white paper explores an extremely broad range of technologies and markets, yet despite this breadth, we found 5 key themes that kept coming up time and time again. These themes are not technological in nature but rather trends that will reshape markets, society and excitingly, the relationship between humans and machines.

1. Web 3.0 — The Global Trust Network

Web 3.0, underpinned by blockchains and decentralized technologies, provides global trust. The core design of The Internet was to enable the sharing of information. The core design of Bitcoin, and other open permissionless blockchains, is a network of trust for exchanging value and asset ownership. Web 3.0 provides trust and chain-of-ownership: the missing link in existing Internet infrastructure.

2. The ‘Real’ Sharing Economy

New digital intermediaries have sprung up so individuals can ‘share’ unproductive assets; spare rooms on Airbnb, spare seats on Uber and spare time on TaskRabbit. These ‘sharing economy’ companies are nothing more than a new middleman sitting between a buyer and seller capturing outsized value. Blockchain-enabled convergence allows seamless peer-to-peer exchange of assets and value reducing the need for trust brokers in the middle of a market extracting economic rent.

3. The Killer Business Model: The Decentralized Data Marketplace

A blockchain-based data marketplace helps solve two major problems in artificial intelligence today, the access to data for those that need it, and monetizing unused data for those that have it. A decentralized data marketplace creates an economic mechanism for individuals and organisations to buy and sell data, reducing the incentive to hoard valuable unused data and remunerating the creators of data not just the processors.

4. The Commoditization of Logistics & Production

Blockchain-enabled convergence transforms the trade value chain. Autonomous robotics, AI, IoT and blockchains will digitise logistics and distribution reducing its importance and therefore ability for companies at this point in the value chain to capture profit. Producers can capture more of the value they create and consumers can pay less. In the long-term, technical deflation will hit the knee of the exponential curve as much of production gets commoditised by 3D printing, and virtual and augmented reality make it cheap to design and print products at home.

5. The Rise of the Decentralised Organisation

The global multinational corporation that developed to coordinate global trade is under threat as the dominant form of governing structure. New decentralised processes for business financing with Initial Coin Offerings (ICOs), incorporation, voting, payments, and talent and project coordination are enabling start-ups to choose processes suited to smaller, more agile organisations rather than an expensive corporate structure designed for large companies.

Special Thanks to Anish Mohammed, Trent McConaghy, @edcafenet, Vijay Michalik, Creative Barcode, @API_economics, David J Klein, and Ethan Gilmore of VarCrypt for conversations and contributions.

Why I am dedicating my career to Blockchains

Blockchains are not just the next growth market or investment opportunity; they represent the most important democratising force since the Internet. With this in mind, I am joining Outlier Ventures to head up research and partnerships. I will split my time between providing research to our portfolio companies and exploring new investment opportunities and venture partnerships on behalf of our fund, Outlier Capital LLP.

I will work with our existing portfolio companies to help them build sustainable and profitable businesses. MoneyCircles.com is building a social banking platform on the blockchain, Buyco.io is exploring decentralised collective procurement, BlockStars.io offers blockchain consultancy, and AssetCha.in is securing valuables on the blockchain.

I leave Frost & Sullivan, a global research and consulting company, after three years as a strategy consultant in the digital transformation team. I helped companies like Amadeus, Shell, Huawei, and Intel to assess the impact of emerging technologies, validate opportunities in current and adjacent markets and create marketing strategies.

After working across many industries, it became increasingly clear to me that the macro decentralisation trend is reshaping every company and industry. Decentralisation is the framework I use to explore seemingly disparate trends such as the Internet of Things (IoT), 3D printing, drones, deep learning, reinforcement learning and, most fundamentally, blockchains. I published numerous reports at Frost & Sullivan investigating all of these technologies and their impact on industries including agriculture, healthcare and financial services.

My thesis is that blockchains will be the data management layer in the technology stack that secures and enables IoT to scale, and provides the platform for artificial intelligence to be accessible and to benefit society broadly. Without blockchains, IoT will be insecure and data will be owned, controlled and ultimately locked down by the companies that own the platforms. With tech platforms moving into connected cars, healthcare and education, this isn’t just a case of having all your emails with one provider: a company will know your genomic data, health records, sleeping habits and educational attainment. We must make sure we as a society don’t allow all of that data to be owned and controlled by a single company.

Even more dangerously, the benefits of artificial intelligence will accrue to those Internet-scale companies such as Google, Facebook and Baidu that have the most data to feed their deep learning models. As data network effects kick in, these companies will collect, retain and leverage proprietary data into ever greater economic and social power through their AIs. Blockchains, as an open-source shared data management tool, have the potential to provide an incredibly large, free-to-access global data source for everybody in the world to build AI.

Without blockchains, the era of connected devices and artificial intelligence will further erode privacy and exacerbate societal and economic inequality. I believe blockchains are the single most significant technological development since the Internet, and together with artificial intelligence, will shape our society and economy over the next 25 years.

Apple and Artificial Intelligence: The Odd Couple

Apple Builds Premium Computers as the World Moves toward Commoditized Invisible Computing

Apple is doomed. I said it. Sell your stock now, get out while you can.

Well, that isn’t quite true, but I do think that as we move into the next computing paradigm, the Cognitive Era, Apple’s business model and culture will need to change. The Cognitive Era describes the next phase in technology development, in which computing fades invisibly into the world around us. The future success of Apple will not be determined by the iPhone, iPad, Watch, or even a car. For Apple to continue its dominance of the technology market and expand into healthcare, transportation, and fashion, it needs to move away from being a computer company. This may sound like a ridiculous statement considering Apple’s position at the moment. But as the Internet of Things gathers pace, with every device connecting to the Internet, the value is in making these devices do smart things. Similar to electricity today, computing power will be on tap whenever and wherever we need it. The next computing paradigm, the Cognitive Era, with ubiquitous and invisible computing power, poses an existential threat to Apple.

Apple is a company that makes money from people who are happy to pay extra for a computer (phone, tablet, or watch). The long-term danger to Apple’s business is that AI solves problems such as natural language understanding, computer vision, and behaviour prediction.

Artificial intelligence becomes the only product differentiator

Right now, Siri, Google Now, and Cortana are similar enough that the user might be unable to tell the difference in quality. Continued progress in combining deep learning with other machine learning techniques such as genetic algorithms and Bayesian inference, together with data network effects, means the quality of applications will begin to diverge. Smartphones, wearables, and cars will be bought based on what they can do rather than how they look. Sure, design will always matter and there will always be a high end of the market; it’s just that this will matter less for the products Apple sells. Apple needs to become an AI company.

Apple’s only true competitor in the smartphone space is Google, and at its core, Google is an AI company. Google’s guiding strategy is to do whatever is necessary to collect data to feed into its AI engine. Google Fiber, Loon, and Chrome are all products designed to get more people using the Internet and to leave behind more data. Nest, Waze, and Dropcam are services that generate vast amounts of data. Moreover, Google has the best AI experts on the payroll. Geoffrey Hinton and his team are pioneers of deep learning and have invented many of the most widely used tools. DeepMind, a 2014 acquisition, was recently on the cover of Nature magazine and is leading the progress in general AI. When it comes to data and talent, Google is miles ahead of Apple.

Apple is Waking up to the Threat of AI

In the last 6 months, Apple has attempted to address the AI threat by acquiring 3 leading machine learning companies: VocalIQ, Emotient, and Perceptio. VocalIQ is a UK-based machine learning company that builds voice user interfaces. Perceptio is a US-based deep learning company focused on smartphone-based computer vision solutions. Perceptio’s solution is unique in that, unlike most computer vision products, it allows smartphones to identify images independently, without requiring access to cloud-based data libraries. The most recent acquisition was Emotient, which is attempting to automate facial recognition and analysis; its vision is to build emotionally aware technologies.

The technology behind VocalIQ will be useful for the Siri and Apple TV teams. Emotient and Perceptio will help bring local intelligence to FaceTime, photos, and Apple TV. The local angle is important because Apple’s core business is in devices, not the cloud like Google’s. The more processing that can be done on the device, the better for Apple. Regardless of how the acquisitions fit existing products, the fact that 3 AI companies with deep learning expertise have been acquired in such a short space of time shows Apple’s commitment to bolstering its AI capabilities.

Apple’s Secrecy and Closed Approach Will Limit Its Ability to Succeed in AI

What made Apple so successful in the mobile era was its fully integrated approach to building computers. By vertically integrating the supply chain and building the hardware and software, Apple has offered an unparalleled user experience that enabled the company to maintain premium prices and high margins in an exceptionally competitive market. This vertical integration required tight control across the value chain to ensure the products and user experience are market leading. This focus on providing the best user experience created a relatively closed and secretive company culture.

This culture means Apple has not embraced the open-source community in the same way as Google, Facebook, and other leading AI companies. For example, the 2015 Neural Information Processing Systems (NIPS) conference was the largest annual gathering of deep learning experts. In addition to Google and Facebook, Baidu and Microsoft were present; however, Apple was not. Over the last year, as detailed in The Only Thing That Matters in Machine Learning is Data, Google, Microsoft, IBM, and Facebook have released their machine learning frameworks for free to the open-source community, because in the machine learning field only data and developers matter. This strategy collects more data and gets more developers and PhDs using their tools, making it easier to recruit them in the future.

Apple has yet to publish a research paper on AI, despite using the technology and despite Siri being a leading natural language processor. This secrecy runs counter to the prevailing approach within the AI community and, specifically, the deep learning community. The small talent pool is attracted to the open approach of Google and Facebook, and even the new venture OpenAI, which plans to release all research to the community. In fact, the speed of progress in the field is due in large part to the collaborative approach of researchers who have worked together in small laboratories and have taken their academic approach to the companies who hired them. Apple’s secrecy and closed approach directly impacts its ability to attract the best deep learning talent.

Can Apple Adapt to the AI Era?

The integrated approach that made Apple successful in the mobile era could be the very thing that makes it unsuccessful in the cognitive era. Apple is one of the few companies to have successfully navigated a paradigm shift in computing, from PC to mobile. That shift, however, favoured Apple’s focus on user experience and, therefore, a closed and controlled culture. In the PC era, the main buyers of PCs were heads of IT departments who decided what to buy based on quantifiable indicators such as processor speeds, RAM, and compatible software; Apple’s focus on the user interface was less tangible, so its proposition was weaker. In the mobile era, however, consumers are the buyers of devices that are used pretty much all day, every day. User experience is far more important, and Apple has the right company culture to build consumer devices.

The shift from the mobile era to the AI era does not suit a closed company culture. In the field of AI, an open and collaborative culture is needed. Without changing the culture, Apple’s products will be weaker than its competitors.

AI is the biggest threat to Apple’s position as the most valuable company in the world.