CoinEx Research Institute: Research Report on Ocean Protocol

CoinEx
Jan 22, 2021

I. Project Overview

Ocean Protocol is a decentralized data exchange protocol for AI, designed to unlock data while preserving privacy and to deliver fairer outcomes to data providers. It allows data providers to interact with data consumers through a decentralized data marketplace while ensuring control, auditability, transparency, and compliance for all participants.

At its core, Ocean Protocol provides distributed orchestration: a decentralized service protocol and decentralized access control, executed on decentralized virtual machines. This allows any data service to be connected, monetized, and managed. On this basis, the Ocean Token (OCEAN) serves as the medium of value exchange on the Ocean network and incentivizes network nodes. Users who provide data to the marketplace can earn OCEAN, which in turn can be spent on network services and data management.

II. Business Model

2.1 Business model

Ocean Protocol is a protocol and network on which data marketplaces can be built to unlock data for sharing and sale. Most other projects in this space are themselves decentralized or centralized data marketplaces; Ocean Protocol instead complements both kinds, providing an additional channel through which marketplaces can publish their services.

2.2 Blockchain necessity

Data is usually held by countless different organizations and needs to flow from one place to another, whether within a single organization or across several. Data comes in many forms and formats, and data assets seldom exist in the final form needed to solve a problem. In addition, there are many different data platforms and technologies, each with its own interfaces and APIs.

Economic specialization often occurs in high-value or technology-driven activities, with each organization playing the role that suits it best. Extracting value from data therefore involves multiple stakeholders, and regulations and trust levels vary across organizations and regions. Given the decentralized nature of data, centralized approaches are unlikely to succeed at scale.

Against this backdrop, Ocean Protocol adopts a decentralized approach to the challenges of data sharing and value creation. Ocean's main role is to develop a set of open data-exchange standards, combining blockchain and AI, that enable decentralized data supply chains to operate. On that basis, data tokens are used to build an incentive mechanism for providing AI data and services.

2.3 Token economy

1. Token information

Token name: Ocean Token (OCEAN)

Token type: ERC-20

Contract address:

0x985dd3D42De1e256d09e1c10F112bCCB8015AD41

2. Token function: OCEAN is the main token of the network, denoted “Ọ” or OCEAN. It is mainly used in three scenarios:

(1) OCEAN serves as the unit of exchange for data buying and selling services. A marketplace may price data services in OCEAN or in any other currency chosen by the marketplace provider (such as USD, EUR, ETH, or DAI). In the latter case, the marketplace instantly converts the chosen currency into OCEAN through a cryptocurrency exchange.

(2) OCEAN is used for staking. Staking on a given dataset service introduces OCEAN derivative tokens, with “Ọ” denoting an OCEAN derivative token. Each dataset has its own derivative token, and the exchange rate between different dataset services is determined by an interest-rate curve.

(3) OCEAN tokens will be distributed as network rewards according to OCEAN's inflation schedule (shown in the figure below).

The percentage of network reward tokens that will be released in the next three years. Source: Ocean Technical White Paper
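The staking mechanism above can be sketched in code. This is a minimal illustrative model only: the whitepaper does not specify the exact curve here, so the linear form, the parameters (`slope`, `base`), and the function names are assumptions, not Ocean's actual implementation.

```python
# Illustrative sketch of per-dataset derivative-token pricing on a
# simple linear curve. The linear form and all parameters below are
# assumptions for illustration; Ocean's actual curve may differ.

def derivative_price(staked_supply: float, slope: float = 0.001, base: float = 1.0) -> float:
    """Price (in OCEAN) of a dataset's derivative token at a given staked supply."""
    return base + slope * staked_supply

def exchange_rate(supply_a: float, supply_b: float) -> float:
    """Exchange rate between two dataset services' derivative tokens,
    derived from their respective positions on the curve."""
    return derivative_price(supply_a) / derivative_price(supply_b)

# With more OCEAN staked on dataset A than on dataset B, A's
# derivative token prices higher on the curve:
rate = exchange_rate(5_000.0, 1_000.0)
```

The key design point this illustrates is that each dataset's derivative token is priced independently on its own curve, so relative stake levels directly determine cross-dataset exchange rates.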

III. About the Team

3.1 About the core members

The Ocean Protocol team comprises dozens of developers with deep know-how and rich experience in big data, blockchain, artificial intelligence, and data exchange.

The main team members are as follows:

1. Bruce Pon, co-founder, graduated from the Massachusetts Institute of Technology. A serial entrepreneur with extensive blockchain and database experience, Bruce is also the founder and CEO of BigchainDB, a blockchain database company whose partners include well-known companies such as Microsoft, Polkadot, MongoDB, and Toyota.

2. Trent McConaghy, co-founder, is a successful serial entrepreneur. As co-founder and CTO of BigchainDB, Trent has extensive experience in machine learning and data mining. He took up hacking in the early 1980s and AI in the late 1990s.

3. Masha (Maria) McConaghy, chief marketing officer, is a professional curator and researcher who holds a doctorate in fine arts from University of Paris 1 Pantheon-Sorbonne and a doctorate from Ecole du Louvre — Sorbonne University in Paris, France. Masha has extensive experience in exhibition organization, both as an independent curator and as an assistant to the curator of a famous museum in Paris, as well as professional expertise in marketing and research.

3.2 Overview of investors

The project is backed by strong investor resources. Among them, Digital Currency Group, the parent company of Coindesk, has a clear advantage in market PR, while BlockVC, a first-tier investment institution in the Chinese blockchain industry, is well positioned to explore the Chinese market.

IV. Supporting Exchanges

Currently, trading platforms that support OCEAN token trading include CoinEx, Bittrex, Kucoin, gate.io, MXC, and Hotbit.

V. Governance Model

The Ocean network is co-governed by the foundation and the community. The Ocean Foundation takes the lead in building a vibrant and healthy economic ecosystem. Given that many organizations currently isolate and lock away their data, the Foundation believes a long-term, resource-intensive effort is needed to help organizations unlock it. All resources reserved by the Ocean Foundation are managed by it under the control of its board of directors, and the Foundation regularly reports progress and resource-deployment transparency to the community. It also cooperates with a leading global data management company and an auditing company on tax planning.

VI. About the Techniques

6.1 Technical progress

Compared with its early days, the technology has been deployed and applied to a much larger extent. In March 2020, Ocean's decentralized, non-custodial data marketplace entered testing, along with customizable data-provider marketplaces; in May 2020, Ocean began unlocking private data while preserving privacy by bringing computation to the data.

6.2 Technical architecture

1. Ocean Protocol Network Framework: The Ocean Protocol Network Framework mainly consists of five core components, namely:

(1) Frontend: Located in the third (application) layer, the frontend comprises applications built with HTML + JavaScript + CSS and running on the client (the user's browser).

(2) Data science tools: Also located in the third (application) layer, data science tools are applications run by data scientists, typically to access Ocean data and execute algorithms on it.

(3) Aquarius: Located in the second (protocol) layer, Aquarius is a back-end application that provides metadata storage and management services, usually run by a marketplace.

(4) Brizo: Located in the second (protocol) layer, Brizo is a back-end application that provides access-control services — that is, it mediates consumers' access to a publisher's data or computing services — and is usually run by the publisher.

(5) Keeper Contracts: Located in the first (decentralized virtual machine) layer, Keeper Contracts is a set of Solidity smart contracts running on the decentralized Ethereum Virtual Machine (EVM).
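The five components and three layers above can be restated compactly as a lookup table. This is purely a reference sketch of the framework as described; the dictionary and helper function are illustrative, not part of Ocean's codebase.

```python
# Compact restatement of Ocean's five-component, three-layer framework.
# The mapping mirrors the description in the text; the helper is illustrative.

COMPONENT_LAYERS = {
    "Frontend": 3,            # application layer: HTML + JS + CSS in the browser
    "Data science tools": 3,  # application layer: run by data scientists
    "Aquarius": 2,            # protocol layer: metadata storage/management
    "Brizo": 2,               # protocol layer: access control for publishers
    "Keeper Contracts": 1,    # decentralized VM layer: Solidity on the EVM
}

def components_in_layer(layer: int) -> list[str]:
    """Return, sorted by name, all components living in the given layer."""
    return sorted(name for name, lyr in COMPONENT_LAYERS.items() if lyr == layer)
```

For example, `components_in_layer(2)` returns the two protocol-layer services, Aquarius and Brizo.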

2. Ocean data service process architecture

The following figure shows the mechanism of Ocean's orchestration. As each step of a DAG computation is processed, the SEA contracts compose into a higher-level SEA for the entire DAG. Because all SEAs run on the blockchain-based Ocean network, their execution is guaranteed: once deployed, they run autonomously, and no single entity can interfere with or stop them (except as defined within the SEA itself). This decentralized setting also lets SEAs unlock new capabilities such as privacy-preserving compute: any form of computation can be brought to the data, so value can be extracted from private data (the most valuable kind) without it ever leaving its premises.
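The step-by-step DAG processing described above implies a dependency-respecting execution order. A minimal sketch of such sequencing, using Python's standard-library topological sorter: the step names and pipeline shape are invented for illustration and are not Ocean's implementation.

```python
# Sketch of how an orchestrator could sequence the steps of a compute
# DAG (one SEA per step, composing into a DAG-level SEA). The step
# names below are hypothetical; only the ordering logic is shown.
from graphlib import TopologicalSorter

# Each key maps to the set of steps it depends on.
pipeline = {
    "ingest": set(),
    "clean": {"ingest"},
    "train": {"clean"},
    "evaluate": {"train"},
}

# static_order() yields steps so that every step appears only after
# all of its dependencies.
order = list(TopologicalSorter(pipeline).static_order())
```

Each step runs only once its predecessors have completed, which is the property the per-step SEA contracts enforce on-chain.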

3. Ocean inter-service network architecture: Ocean itself does not provide these services; it only connects them. That makes it an inter-service network dedicated to computing DAGs over big-data AI (as shown in the figure below).

6.3 Code overview

According to its official website and its GitHub open-source code, the Ocean Protocol network architecture has a relatively complete set of core-component code, divided into 8 parts: Core Components, Libraries, Tools, OceanDB Drivers, Osmosis Drivers, Compute-to-Data, Parity Secret Store, and Project Management.

GitHub open-source code. Source: GitHub

The core-component code base contains 5 components. The last commit to the Keeper-Contracts repository was 13 days ago, and the other 4 core components were last updated in early June. The core-component repositories are not updated frequently and are thus relatively stable.

Core component composition and tool code. Source: Ocean’s official website

VII. Competition Analysis

Ocean Protocol is a decentralized data service protocol on which data marketplaces can be built to unlock data for sharing and sale. Most other projects on the market are themselves distributed or centralized data marketplaces; Ocean Protocol complements both kinds by providing an additional channel through which marketplaces can publish their services. First movers such as Enigma and Datum, for example, are more partners than competitors to Ocean. A core difficulty in AI development is the separation of algorithms from data; through decentralization, data can be shared in a fair, transparent, and secure way, substantially unlocking the AI market.
