...by Daniel Szego

Sunday, November 18, 2018

How to create native external oracle with Nakamoto consensus





Similarly to the previous blog post, designing a decentralized native external oracle can be done in much the same way as a native random oracle. The principle is basically the same: on the one hand, the miners or validators measure the external data source and put the results into blocks, or at least into temporal blocks. On the other hand, the imported data should be kept secret, because otherwise later miners could influence or even hack the algorithm itself.

The full multi-block native external oracle algorithm can be described as follows:

1. An initiator having a {Priv, Pub} private and public key pair creates an external oracle request that includes the public key as well:

request_extern (Pub, N, ext)

, where Pub is the public key of the requestor, ext is the external data source and N is the number of rounds during which the external data source has to be measured.
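
As a minimal illustration, the following Python sketch shows what such a request could look like. The key-pair helper, the dictionary field names and the EUR/USD data source are assumptions made only for this example, not part of any concrete platform:

import os

def generate_keypair():
    # placeholder for a real asymmetric key pair (e.g. RSA or elliptic curve)
    priv = os.urandom(32).hex()
    pub = "pub-of-" + priv[:8]
    return priv, pub

priv_key, pub_key = generate_keypair()

# request_extern(Pub, N, ext): ask for the source to be measured over N rounds
request = {"pub": pub_key, "n": 10, "ext": "EUR/USD"}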

2. At mining or validation, a miner or validator measures the value of the external data source, encrypts it with the public key of the requestor and puts it into the request itself. So after the first validation, the request will look like:

request_extern (Val1)(Pub, N-1, ext) 

where Val1 is the measured external data encrypted with the public key. To ensure security, the Val1 value should be put into the blockchain as well, not necessarily forever but at least during the N rounds of execution of the native external oracle. In other words, the request is split into two parts:

request_extern (Val1) 

will be written into the blockchain

request_extern (Pub, N-1,ext)

can remain available as a new transaction request that is propagated throughout the network in the transaction pools, waiting to be mined or validated. Similarly, after k < N rounds, there will be k encrypted external values in the blockchain:

request_extern (Val1)
request_extern (Val2)
...
request_extern (Valk) 

and a new request as a transaction which is 

request_extern (Pub, N-k, ext)
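
Continuing the example above, one possible shape of a single validation round is sketched below. Here measure_source and encrypt_with_pub are placeholders for a real data feed and a real asymmetric cipher, and the on-chain part is modelled as a simple list:

def measure_source(ext: str) -> bytes:
    # placeholder: this node's observation of the external data source
    return b"1.1432"

def encrypt_with_pub(pub: str, plaintext: bytes) -> bytes:
    # placeholder for a real public-key encryption such as RSA-OAEP or ECIES
    return plaintext[::-1]

def validation_round(request: dict, chain: list) -> dict:
    val = encrypt_with_pub(request["pub"], measure_source(request["ext"]))
    chain.append({"request_extern": val})  # request_extern(Val_k) is written into the blockchain
    # request_extern(Pub, N-1, ext) stays in the transaction pool for the next round
    return {"pub": request["pub"], "n": request["n"] - 1, "ext": request["ext"]}

chain = []
request = {"pub": "PUB...", "n": 10, "ext": "EUR/USD"}
while request["n"] > 0:
    request = validation_round(request, chain)
# after the loop the chain holds the encrypted values Val_1 .. Val_N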

3. After N rounds, the requestor can decrypt the external values with the Priv private key and aggregate them.

Ext1 = Decrypt_Priv(Val1)
Ext2 = Decrypt_Priv(Val2)
...
ExtN = Decrypt_Priv(ValN)

The individual external values should be aggregated in a way that they provide a correct average value even if some of the nodes try to hack or game the system. The algorithm should contain an incentive mechanism as well, rewarding the nodes that provided correct values and in this way motivating nodes to produce correct data, providing a Schelling point as the decision-making mechanism. Supposing that around one third of the nodes and measurements can be faulty, we can use the following algorithm, sketched in code after the list:

a. filter out the 33% most extreme values
b. take the average of the remaining values, providing the real external value
c. reward the nodes whose values were not filtered out, based on their distance from the average.
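
A sketch of step 3 together with this aggregation rule follows. The decryption is the placeholder inverse of the toy cipher above, and the reward weighting is only one possible, purely illustrative choice:

def decrypt_with_priv(priv: str, ciphertext: bytes) -> float:
    # placeholder inverse of the toy encrypt_with_pub above
    return float(ciphertext[::-1].decode())

def aggregate(priv: str, committed: list, reward_budget: float = 1.0):
    # committed: list of (node_id, Val_k) pairs collected from the blockchain
    values = [(node, decrypt_with_priv(priv, val)) for node, val in committed]

    # a. filter out the 33% most extreme values, measured from the median
    values.sort(key=lambda nv: nv[1])
    median = values[len(values) // 2][1]
    values.sort(key=lambda nv: abs(nv[1] - median))
    kept = values[: max(1, (2 * len(values)) // 3)]

    # b. the average of the remaining values is taken as the real external value
    ext_value = sum(v for _, v in kept) / len(kept)

    # c. reward the surviving nodes, weighting values closer to the average more heavily
    weights = {node: 1.0 / (1.0 + abs(v - ext_value)) for node, v in kept}
    total = sum(weights.values())
    rewards = {node: reward_budget * w / total for node, w in weights.items()}
    return ext_value, rewards

value, rewards = aggregate("PRIV...", [("node1", b"2341.1"), ("node2", b"0441.1"), ("node3", b"8341.1")])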

The algorithm unfortunately has some performance issues: it takes N rounds to find out a real decentralized external value. This can be pretty long and pretty costly depending on the given blockchain platform. Considering a Nakamoto consensus, the algorithm is pretty difficult to speed up, as the basic idea is that the external values are coming from N individual sources, which with a Nakamoto consensus actually means N different blocks. This also implies that the data source should keep its value for a long enough time, like preserving the same value for hours. The algorithm cannot really be used with fast-changing data sources.

A further question is how exactly the data can be used within a smart contract. As private keys should not be stored in the blockchain, there should be an additional round with the wallet software to decrypt and aggregate the information, which might introduce elements of unnecessary centralization and is difficult to build directly into a decentralized smart contract. It is important to note, however, that this private key is not necessarily the same as the private key of the account, so it can actually be revealed as soon as all the N nodes have committed an encrypted measurement. As a consequence, a second transaction can be used to calculate the real external value, like:

evaluate_extern (Priv)

At this moment the private key can be published to the blockchain and the evaluation algorithm can be implemented in a smart contract in a decentralized way.

From a practical point of view, a smart contract with a built-in native external oracle would look like:

function with_ext (input params) return (output params) {
   ...
   variable = External(<external_datasource>);
   ...
}

For evaluating such a smart contract, two transactions should be used (a short code sketch of the whole flow follows the two steps):

- the first one would call the function with the public key of the external oracle, initializing the external data measurement:

transaction init with_ext pub_key_ext

- the second transaction would publish the private key and perform the whole evaluation and the execution of the rest of the business logic, like:

transaction exec with_ext priv_key_ext
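
The two-transaction flow could be sketched as below. The transaction dictionaries and field names are invented for the example; only the split into an initializing transaction (publishing the public key) and an executing transaction (publishing the private key) comes from the description above:

def tx_init_with_ext(pub_key_ext: str) -> dict:
    # first transaction: publishes the public key and starts the N measurement rounds
    return {"type": "init", "contract": "with_ext", "pub_key_ext": pub_key_ext}

def tx_exec_with_ext(priv_key_ext: str) -> dict:
    # second transaction: reveals the private key so the contract can decrypt,
    # aggregate and run the rest of its business logic on-chain
    return {"type": "exec", "contract": "with_ext", "priv_key_ext": priv_key_ext}

transactions = [
    tx_init_with_ext("PUB..."),    # submitted first, before the N rounds
    tx_exec_with_ext("PRIV..."),   # submitted after the N rounds have finished
]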

The proposed system unfortunately has one denial-of-service attack possibility. The one who initiated the external oracle has the private key and can calculate the result in advance. If this result is not beneficial for him, he can choose not to reveal the private key and not to execute the second transaction.