...by Daniel Szego
"On a long enough timeline we will all become Satoshi Nakamoto.."
Daniel Szego

Sunday, December 31, 2017

Notes on Quorum private contracts and sharding


Quorum has a feature called private contracts, in which a contract is executed only on specific nodes: the data of the transaction is encrypted and can be decrypted only by the nodes defined by the contract. The state model is split into two major categories:
- public state is validated by every node and kept consistent across the whole network
- private state is validated only on specific nodes: only on the nodes that are party to the contract.

From a practical point of view this is a special kind of sharding. As private contracts cannot call public ones, the public state always stays consistent. Considering the private contracts, however, there might be some unexpected results: if similar logic is deployed with different sets of parties and they happen to call the same private contract, it is quite questionable how consensus is reached.

Solidity and Truffle Tips and Tricks - Migration hangs


Sometimes when you migrate with Truffle to a consortium blockchain network, especially if the network does not use Proof of Work, as with Quorum, the migration hangs at the first transaction. Yet if you check manually on the network with geth, the first contract seems to be deployed without errors; somehow Truffle just does not get the feedback about it. Surprisingly, if you use the --verbose-rpc parameter, the migration continues without error. Do not ask why :)

truffle migrate --network quorum --verbose-rpc



Notes on decentralized artificial intelligence algorithms and platforms


There are many initiatives to create decentralized artificial intelligence algorithms; most of them work by weakly integrating AI solutions with different blockchain platforms, usually via interfaces and a common token that represents some kind of exchange of services. Projects of this kind that have successfully run an ICO are, for example, Neureal, SingularityNET, or Neuromation. However, it would be much more interesting to integrate a distributed ledger technology with a machine learning algorithm at the algorithmic level itself. That would provide a fully integrated algorithm for decentralized artificial intelligence. Certainly, it is an open question which AI algorithm can be efficiently integrated with which decentralized technology.

Friday, December 29, 2017

Azure Blockchain walkthrough - Setting up a Quorum Blockchain Demo

The Azure Blockchain platform has some pretty cool solutions for creating consortium blockchain networks. In this walkthrough we are going to demonstrate a step-by-step guideline for setting up a Quorum demo environment, configuring it, and deploying the first applications on top. Quorum is an extension / fork of Ethereum for consortium scenarios, with several useful extensions:
- increased performance with a Proof of Authority algorithm
- private and confidential transactions
- node privacy

1. Choosing the necessary Azure Blockchain template: Azure Blockchain has many different templates. Two of them are relevant for Quorum consortium blockchains: with the help of "Quorum Single Member Blockchain Network" you can create a minimal infrastructure; with "Quorum Demo" a preconfigured demo environment is delivered.

Figure 1. Quorum Azure templates

2. Delivering the infrastructure: In this tutorial we deliver the "Quorum Demo" template. It requires the configuration of a standard Azure image with the usual virtual machine sizing parameters. All blockchain specific installation steps will be handled in the post configuration.

Figure 2. Setting up Quorum demo infrastructure

3. Post configuration of the infrastructure: 

 ssh <user>@<ip> -- logging in to the environment
 git clone https://github.com/jpmorganchase/quorum-examples.git -- cloning the repo
 cd quorum-examples/examples/7nodes -- the 7 nodes demo
 sudo su -- changing to root
 ./raft-init.sh -- initialising the environment (you probably have to use sudo)
 ./raft-start.sh -- starting the environment (you probably have to use sudo)

If you see the following screen, the Quorum demo started successfully.


Figure 3. Quorum demo started successfully

Optionally, before starting the nodes it might be a good idea to pre-allocate some ether. The 7nodes demo uses the genesis.json genesis file. You can pre-allocate ether to an address with the following entry in the genesis file:

"alloc": 
 {"0xed9d02e382b34818e88b88a309c7fe71e65f419d": {"balance":
    "111111111"}}

4. To test whether things are working, you can attach to the nodes with the help of the Geth console and make further configuration if necessary. (Make sure that you are in the 7nodes demo folder; if you get a permission denied error message, use sudo at the beginning; if the ipc file is not found, then there was probably an error in the previous step when setting up the network.)

 geth attach ipc:qdata/dd1/geth.ipc

Optionally, after attachment it might be a good idea to distribute some of the preallocated ether, with the following command:

eth.sendTransaction({from:"0xed9d02e382b34818e88b88a309c7fe71e65f419d",to:"<to_address>", value: 100000000})

5. Testing a pre-deployed private contract: in the 7nodes demo scenario there is a private contract, called SimpleStorage, that was already deployed when the network was set up. You can test it by starting two windows and attaching to node 1 and node 4:

geth attach ipc:qdata/dd1/geth.ipc
geth attach ipc:qdata/dd4/geth.ipc

then configure the contract reference:

var address = "0x1932c48b2bf8102ba33b4a6b545c32236e342f34";


var abi = [{"constant":true,"inputs":[],"name":"storedData","outputs":[{"name":"","type":"uint256"}],"payable":false,"type":"function"},{"constant":false,"inputs":[{"name":"x","type":"uint256"}],"name":"set","outputs":[],"payable":false,"type":"function"},{"constant":true,"inputs":[],"name":"get","outputs":[{"name":"retVal","type":"uint256"}],"payable":false,"type":"function"},{"inputs":[{"name":"initVal","type":"uint256"}],"type":"constructor"}];

var private = eth.contract(abi).at(address)

then calling the get function should return 42 on node 1 and 0 on node 4:

private.get()

6. Configuring Truffle: Truffle must be configured for the environment with a custom network configuration that can be set in truffle.js. It is important to set the host to the public address of the virtual machine and to open port 22000 (the Quorum RPC port):


Figure 4. Live (Quorum Azure) network configuration
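A minimal truffle.js sketch for such a "live" network entry could look like the following (the host IP is a placeholder for your own virtual machine, and gasPrice 0 is an assumption reflecting that Quorum transactions typically carry no gas price):

module.exports = {
  networks: {
    live: {
      host: "<vm-public-ip>", // public IP of the Azure VM
      port: 22000,            // Quorum RPC port, must be open in Azure
      network_id: "*",        // match any network id
      gasPrice: 0             // assumption: gas is free on this Quorum setup
    }
  }
};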

You can use the following command for migration:

truffle migrate --network live --verbose-rpc

If you configured everything correctly, you should be able to start a migration with Truffle. There might be some issues, however. For example, the Truffle deployment can remain hanging in some scenarios. The reason is that the deployment JavaScript expects the transaction to be mined; as there is no Proof of Work mining in Quorum, the process might stay hanging. One workaround is to use only one JavaScript deployment script and, based on the transaction hashes, check explicitly whether a given transaction was correctly validated. Another workaround that sometimes works is to start the Truffle console explicitly on a given node and execute the migration from there. And last but not least, do not forget to open ports 22000 - 22008 in the Azure environment.
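As a rough sketch of the first workaround, you can poll for the receipt of a deployment transaction yourself; txHash here stands for the hash returned by the deployment, and the web3 calls are the standard ones:

function waitForReceipt(txHash, callback) {
  var receipt = web3.eth.getTransactionReceipt(txHash);
  if (receipt) {
    callback(receipt);                  // the transaction is in a block
  } else {
    setTimeout(function () {
      waitForReceipt(txHash, callback); // try again in a second
    }, 1000);
  }
}

waitForReceipt(txHash, function (receipt) {
  console.log("contract deployed at:", receipt.contractAddress);
});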

7. Configuring Metamask with Quorum: configuring Metamask with Quorum is pretty similar to configuring any other network via the Custom RPC option of Metamask. Simply use the IP address and the port with http:


Figure 5. Metamask configuration for Quorum















The rise of the new technology elite


With the current trends and dynamics of changing technologies and business environments, old-fashioned company leadership and management techniques simply stand no chance. High-tech companies can only be led with a real hard-core, high-tech and geek mindset, with first-hand experience in the field. Any other strategy will simply not be able to keep track of the changes and define a strategy even a couple of years ahead.

This holds even for very simple examples: a software development methodology cannot be efficiently led by people who never wrote a piece of code. Similarly, infrastructure management can only be efficiently managed by people who have already configured live systems and perhaps caused outages as well. Perhaps the best example, however, is general software strategy: if you want to predict the next generation of IT services, you simply must have first-hand, in-depth experience with a couple of technology changes. The situation is perfectly demonstrated by the fact that old-fashioned management schools offer technology-oriented courses, like Artificial Intelligence in sales or Big Data in Human Resources.

Notes on the increasing need of enterprise IT support

Our IT systems are getting more and more complicated every day. That is actually not surprising; what is more surprising, however, is that the possibility of getting support does not seem to keep up with the technology changes. As an example, about 15 years ago installing an enterprise software meant having something like a stable release that was tested and ran more or less stably, with an install guide and an installer that sometimes checked the prerequisites as well. Nowadays the situation seems to be more complicated, probably due to the lean and agile methodologies.

There is no longer such a thing as a stable release: check the releases on git and try to install the version that you like. There is no real documentation about the prerequisites; there is usually a list of software that is surely needed, but which version is compatible with the software that you want to install is pretty questionable. If there is something like an installer, it is usually a script that runs sometimes but not in other situations, and if it does not run, it is up to you to edit and modify it. And on top of that you usually find zero documentation, at most some Readme.md files or some hints in the repository or in social forums.

Solidity and Truffle Tips and Tricks - setting up Truffle development environment


As I usually install a brand new environment with a fresh VMware image, it is time to summarize the necessary steps instead of googling them again and again.

1. Install brand new Ubuntu, I usually use the 16.04 version.

2. Installing open vmware tools:

 sudo apt-get install open-vm-tools
 sudo apt-get install open-vm-tools-desktop

3. Install packages:

 sudo apt-get update && sudo apt-get -y upgrade
 sudo apt-get -y install curl git vim build-essential

4. Install NodeJS:

 curl -sL https://deb.nodesource.com/setup_6.x | sudo -E bash -
 sudo apt-get install -y nodejs
 sudo npm install -g express

5. Install Truffle

 sudo npm install -g truffle

6. Install TestRPC

 sudo npm install -g ethereumjs-testrpc

7. Test: you can test the whole installation, for instance by initializing a new project with Truffle:

 mkdir test
 cd test
 truffle init
 truffle develop

8. Install Visual Studio Code; you can install it entirely from the command prompt with the help of the following commands:

 curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
 sudo mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg
 sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/vscode stable main" > /etc/apt/sources.list.d/vscode.list'
 sudo apt-get update
 sudo apt-get install code

9. Last but not least, install the Solidity extension under Visual Studio Code extensions.

 









Azure Blockchain walkthrough - Setting up Ethereum Consortium Blockchain

The Azure Blockchain platform has some pretty cool solutions for creating consortium blockchain networks. In this walkthrough we are going to demonstrate a step-by-step guideline for creating and configuring a consortium network and for developing and deploying the first applications on top.

1. Choosing the necessary Azure Blockchain template: Azure Blockchain has many different templates. Two of them are relevant for Ethereum consortium blockchains: with the help of "Ethereum Consortium Blockchain" you can create a consortium infrastructure of up to 12 nodes in one configuration step. With "Consortium Leader" or "Consortium Member" there is the possibility to do the same configuration step by step, in other words dynamically, as new members join.

Figure 1. Azure Ethereum templates

2. Configuring the template: After the necessary template has been chosen, it has to be configured. Apart from the usual Azure parameters like resource group and administrator accounts, the structure of the network has to be set, like the number and size of the mining and transaction nodes, or the number of consortium members. The last tab provides the possibility to configure the Ethereum specific settings, like the network ID or a custom genesis block.


Figure 2. Configuring the Ethereum Consortium Blockchain

3. Creation finished: Once the network creation has finished, the most important parameters can be found on the result page:
- Admin-site: a general page about the status of the network, including a faucet to distribute some pre-allocated ether.
- RPC-Endpoint: an important parameter for communicating with the consortium blockchain, for example from Truffle or Metamask.
- SSH Info: important for logging in to the environment and configuring parameters, most typically for unlocking the coinbase account.

Figure 3. Important parameters for accessing the consortium blockchain

4. Unlocking the coinbase account: A locked coinbase account can cause difficulties later, so it is not a bad idea to unlock it explicitly. With the help of ssh and the ssh command from the previous configuration window, you can attach to the first node of the Ethereum consortium network (note that the first node is usually the transaction node; if your setup happens to be different, you can log in to further nodes using the -p 3001, 3002, ... parameters).

If you logged in, you can use: 

  geth attach  -- to attach to a running geth instance

  personal.unlockAccount(eth.coinbase) -- unlocking the coinbase account, the default unlock time is 5 minutes.

 eth.coinbase -- getting the address of the coinbase account

 personal.unlockAccount('address', 'passphrase', duration)  -- unlocking the account for a longer time period. If you use 0 as the duration, the account stays unlocked until geth exits.

5. Configuring Metamask: Based on the Ethereum RPC-Endpoint, Metamask can be configured by changing from the Main Network to a custom RPC. With the help of the admin page faucet and newly generated accounts, the pre-allocated ether can be distributed further and the basic network functionality can be tested.


Figure 4. Metamask configuration

6. Deploying contracts: If Metamask is set up, you can deploy contracts directly or indirectly with the help of Remix by selecting "Injected Web3" or "Web3 Provider" as the environment. "Injected Web3" deploys the contract with the help of the configured Metamask account.


Figure 5. Remix configuration


7. Configuring Truffle for the consortium network: if you want to deploy to the new Azure consortium network, make sure that you configure the new network in the truffle.js configuration file. You can get the host name from the Ethereum RPC-Endpoint in the output window, and the network_id from the initial configuration. You can then deploy to your consortium network, for instance with the truffle migrate --network azureNetwork command.


Figure 6. Truffle configuration


Thursday, December 28, 2017

Notes on Nakamoto Consensus



From a theoretical perspective, Nakamoto consensus in a blockchain protocol works like a lottery game: the one who wins the lottery gets the possibility to create the next block. If the next block is faulty or contains a cheat, then there is a high probability that the new block will not be propagated successfully to the network, so the winning node does not get a reward. The winning lottery combination or "ticket" is actually a nonce value of the block that is consistent both with the hash of the previous block and with the difficulty of the system.
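As a minimal, purely illustrative sketch of this lottery (real Bitcoin mining double-hashes a full block header; the function names and the easy target below are made up for the demo), the Node.js snippet searches for a nonce whose hash falls below a difficulty target:

var crypto = require('crypto');

// find a nonce such that sha256(prevHash + nonce) is below the target
function mine(prevHash, target) {
  var nonce = 0;
  while (true) {
    var hash = crypto.createHash('sha256')
      .update(prevHash + nonce)
      .digest('hex');
    if (BigInt('0x' + hash) < target) {
      return { nonce: nonce, hash: hash }; // the winning "ticket"
    }
    nonce++;
  }
}

// a higher difficulty means a lower target, so fewer winning tickets
var easyTarget = BigInt('0x' + 'f'.repeat(60));
console.log(mine('hash-of-the-previous-block', easyTarget));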

The analogy is actually not 100% accurate: in a blockchain-style lottery the winning ticket can be different or individual for each miner, because the coinbase transaction is different for each of them. In the long run there should always be exactly one winning ticket; in the short run, however, it is possible that winning tickets compete to become the final one, which is what is commonly called forking.

Solidity and Truffle Tips and Tricks - gas limit, gas exceptions


If you hit gas limits during the development and testing of your contract and get exceptions like out of gas or gas limit exceeded, you can do the following things.
- configure the given gas explicitly at the contract creation or at the function execution, like:  

testContract.functionCall(_functionParameter, {gas:4700000});

- or configure the gas limit in the Truffle environment for your TestRPC by modifying the truffle.js configuration file and adding an explicit gas limit, like:

  networks: {
    develop: {
      host: "localhost",
      port: 9545,
      network_id: "*",
      gas: 4700000
    }
  }

Note however that the block gas limit on the live network is around 4,700,000 at the moment, so Truffle in its current release does not let you go above this value.


Monday, December 25, 2017

Solidity and Truffle Tips and Tricks - updatable smart contract


Ethereum smart contracts are immutable by design, meaning that once you have deployed them, there is no way to update them. However, with some tricks there is the possibility to make updates, at least to a certain level.

- client side: one way might be to handle everything on the client side, like deploying a new contract and simply forgetting the old one in a way that all client references are set to the new one. Certainly, it is not a very reliable or professional way of doing things.

- selfdestruct: another way is to implement selfdestruct in the contract, destroy the contract and deploy a new one (a minimal sketch is shown after the code examples below). Certainly, it is a highly centralized way of doing things, hence you should probably provide a provable way to migrate the data of the old contract to the new one.

- data dependent execution: another way might be to explicitly implement several execution logics in the smart contract and make the execution dependent on an internal variable. Certainly, the disadvantage in this scenario is that you have to prepare for every possible execution path:

contract DataDependentExec{

  uint _execPath = 0;

  function execute() public {
    if (_execPath == 0) {
       // ... execution path one
    } else {
       // ... execution path two
    }
  }

  // .. some functionality to set _execPath 
}

- multiple smart contracts: a further solution might be to organize the contracts in a master - slave style and realize cooperation via a certain interface. If the slave contract is updated, the master calls the new functionality. Certainly, the interface has to be designed very carefully, in a way that the signatures of the functions do not change, and of course security has to be taken pretty seriously as well.

contract Master{

  address _slaveAddress;

  function changeSlave (address _newAddress) public {
    _slaveAddress = _newAddress;
  }

  function callSlave () public {
     Slave slave = Slave (_slaveAddress);
     slave.callSlaveFunction();
  }
}

contract Slave{

  function callSlaveFunction() public {
    // ... the updatable functionality
  }
}
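For the selfdestruct option mentioned above, a minimal sketch could look like the following (the owner check and the contract name are made up for illustration):

contract Killable {

  address owner = msg.sender;      // the deployer becomes the owner

  function kill() public {
    require(msg.sender == owner);  // only the owner may destroy the contract
    selfdestruct(owner);           // remaining ether goes to the owner
  }
}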

Solidity and Truffle Tips and Tricks - estimate gas usage


Estimating the gas usage is a critical part of every Ethereum smart contract. If you use Truffle and TestRPC, the easiest way is to write a unit test for each critical function as a transaction. If a function is separated into several subcalls, it is a good idea to cover each sub-function separately with an individual unit test as an individual transaction. Having the individual tests, you can start a new console window with truffle develop --log, in which you can see the gas consumption of each transaction. As gas consumption might depend on the input data, it can be a good idea to measure each transaction with several possible inputs.
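A sketch of such a test could look like the following (MyContract and criticalFunction are placeholder names); the Truffle transaction result also carries the gas used in its receipt, so you can log it directly from the test:

var MyContract = artifacts.require("MyContract");

contract("MyContract gas usage", function (accounts) {
  it("measures the gas of criticalFunction", function () {
    return MyContract.deployed().then(function (instance) {
      return instance.criticalFunction(42, { from: accounts[0] });
    }).then(function (result) {
      // every Truffle transaction result contains the receipt with gasUsed
      console.log("gas used:", result.receipt.gasUsed);
    });
  });
});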

Proof of Useful Work with Proof of Stake


Classical blockchain systems with Proof of Work have been suffering from the fact that the work for resolving the cryptographic puzzle is actually not a useful one. There are many attempts to overcome this problem; however, the real problem is that finding a useful computational problem that can be scaled both in difficulty and in timing for a blockchain algorithm is extremely difficult:
- Such an algorithm has to have a difficulty level that can be adjusted based on the miners or validators.
- The algorithm has to produce a result in a certain time frame.
There is only one attempt that has managed to find such an algorithm successfully, and that is Primecoin.

There is, however, a fundamentally different approach to Proof of Useful Work, and that is via Proof of Stake. In Proof of Stake something has to be deposited as a stake to provide an incentive to stabilize the network. This deposit can be produced with the help of a scarce resource, which might be a computationally intensive algorithm that is doing something useful. In this way several problems of a classical Proof of Work system are eliminated: the useful algorithm does not necessarily have to be solved in a certain timeframe, the difficulty does not necessarily have to be varied, and, most importantly, the miners do not have to compete for the resources; everyone who puts work in gets rewarded.

Certainly, there are some open questions as well:
- How can it be guaranteed and measured that a certain amount of work has really been carried out?
- How can it be made sure that the resulting algorithm and consensus really provide a Nash equilibrium that maintains the consistency of the network?

Friday, December 22, 2017

Solidity and Truffle Tips and Tricks - important Truffle console commands



web3.eth.accounts[0] - getting the address of the first account

SmartContract.address - getting the smart contract address

SmartContract.deployed().then(inst=>{SmartContractReference=inst}) - getting a real SmartContract reference if it is deployed

SmartContractReference = SmartContract.at(tokenAddress) - getting a real reference if the tokenAddress is previously known

SmartContractReference.callFunction().then(res => res.toNumber())
- calling smart contract function with converting the result value

SmartContractReference.sendTransaction({ from: account1, value: web3.toWei(5, "ether")})
- sending ether to smart contract








Thursday, December 21, 2017

Solidity and Truffle Tips and Tricks - msg.sender vs tx.origin


Ethereum interactions are initiated by a transaction, and if there are multiple contracts involved, each contract call is manifested by a message. So a transaction initiated by a user that calls the contract chain A -> B -> C -> D consists of one transaction and several internal messages: between the initiating user and A, between A and B, between B and C, and between C and D.

tx.origin is always the user who initiated the transaction itself; msg.sender is the sender of the current message. In the case of the end user and contract A, the two are the same, but considering contract D for example, msg.sender is C while tx.origin is the user who initiated the transaction.

Please note that using tx.origin for authorization is regarded as insecure and should be avoided.
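A minimal Solidity sketch of the difference (Inner and Outer are made-up names): call Outer.callInner from an externally owned account and then read the two recorded addresses.

contract Inner {

  address public lastTxOrigin;
  address public lastMsgSender;

  function record() public {
    lastTxOrigin = tx.origin;   // the user who signed the transaction
    lastMsgSender = msg.sender; // the immediate caller, here the Outer contract
  }
}

contract Outer {

  function callInner(address _inner) public {
    Inner(_inner).record();
  }
}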

Wednesday, December 20, 2017

A Securities Law Framework for Blockchain Tokens


The most important frameworks for an ICO to evaluate whether a token is a security or not:

https://www.coinbase.com/legal/securities-law-framework.pdf

Howey test:
https://docs.google.com/spreadsheets/d/1QxOV2dgxO3C_TyVE0-41ZwLlzPmB-EE1NNshJGuedCU/edit#gid=0

Solidity and Truffle Tips and Tricks - public, internal and private


The general visibilities in Solidity, both for variables and functions, mean the following:

- public: it can be called from anywhere: from outside the smart contract, from inside it, or from a descendant contract. With variables it is a little special: for state variables, public means that a getter function is automatically generated, which makes it possible to read the value of the variable from outside. Pay attention, however: because this is an automatically generated function, in some scenarios, like from another contract or from web3.js, it must be called as a function and not as a property.

- internal: it can be used / called only from the same contract, or from descendant contracts via inheritance.

- private: can be called only from the contract itself. Not even a descendant contract is able to access the property or the function.

For functions there is one more option to take into consideration: public versus external. See the previous post about distinguishing the two modifiers.
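A short sketch of the three visibilities (the contract and variable names are made up):

contract Base {

  uint public counter;   // a getter function counter() is generated automatically
  uint internal shared;  // visible here and in descendant contracts
  uint private secret;   // visible only inside Base

  function bump() public {
    counter++;
    shared++;
    secret++;
  }
}

contract Child is Base {

  function touch() public {
    shared = 1;    // allowed: internal members are inherited
    // secret = 1; // would not compile: private to Base
  }
}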

Solidity and Truffle Tips and Tricks - external vs public


Function visibility can be explicitly set to external or to public; the default is public. The difference between the two manifests itself when the function is called internally: if public functions are called externally, it always means an extra call (via a message or transaction), but if they are called internally, there is no extra call. An external function can always be called from another smart contract, implying a transaction or a message within a transaction. If an external function needs to be called internally, an explicit external call has to be simulated with the this keyword:

contract Calls{

  function externalF() external {
    // ...
  }

  function callInternally() public {
    this.externalF(); // the external function is called from inside the contract
  }
}

Using the this keyword to call an external function internally is usually not best practice and costs more gas. Instead, it is recommended to declare a function as public if it can be called both internally and externally. There is one exception: if the input of a function contains an array, forcing the external call can be more gas efficient.


Monday, December 18, 2017

On the fiscal and monetary policy of a cryptocurrency


Most cryptocurrencies at the moment have something like a simple, algorithmically specified token supply, which is pretty far from the classical tools of a nation's currency. So let us examine whether something similar to classical monetary and fiscal policy can be realized with a cryptocurrency.

Classical monetary policy.
- Increasing the monetary supply: this simply means increasing the amount of tokens available in circulation. That can happen in an algorithmic way, as is usual in most cryptocurrencies, but it can also occur as the result of a centralized or semi-centralized explicit action.
- Decreasing the monetary supply: well, this is not so easy. One option might be to somehow burn coins, but the major question is where the coins should be burned. If they are burned directly in the wallets of the customers, then the trust in the currency will probably not be huge. Another idea might be to have a standard inflation rate with a standard token issuance rate, so that this issuance rate can be increased or decreased, possibly to zero.

Direct counterparty involvement.
If the major focus is to influence or stabilize the exchange rate on the market, one possibility is to influence it directly with the help of a counterparty. The counterparty has a pool of funds to move the exchange rate of the currency and can act consciously in a certain market situation by selling or buying. Certainly, this setup involves an explicit counterparty risk.

Policy via token multiplication.
If we consider not just the cryptocurrencies but also the tokens financing projects or companies on top of them, we get a system that can be better fine-tuned. A token issuance can act something like money multiplication, especially if we consider the tokens as part of the monetary base. If we can motivate or demotivate the issuance of new tokens, we get a system that indirectly influences both the monetary supply and the market's overall readiness to create new projects or companies.

Fiscal policy.
Real fiscal policy is pretty difficult in a cryptocurrency context, probably for two reasons: on the one hand, cryptocurrencies provide just currencies and not necessarily a full or partial economy; on the other hand, even if some economy exists behind them, it is far from being a closed one. It is questionable whether pure fiscal policies can ever be interpreted in the context of cryptocurrencies.


Solidity and Truffle Tips and Tricks - BigNumber in javascript


Using the web3 JavaScript library from the user interface or from a Truffle JavaScript unit test, you can sometimes get an object that looks like this:

{[String: '5'] s: 1, e: 0, c: [5]}

It means the result has the BigNumber type; web3.js converts every integer that comes from Solidity into a BigNumber, because Solidity integers have a bigger range than JavaScript numbers. Simply use the toNumber() call to convert it, like:

result.toNumber();


Solidity and Truffle Tips and Tricks - Unit testing


In Truffle there are two unit testing frameworks: unit testing in Solidity with special test contracts, and JavaScript unit testing with a framework called Mocha. You can actually use a mixture of both. However, Solidity unit testing is still pretty much in a beta phase, and the language was not really designed for writing unit tests: you have to define one test contract per file, paying special attention to the file name, it is difficult to produce console logs while the tests are running, and so on. As a consequence, Mocha is recommended whenever possible, with perhaps one big exception: when the contracts should be unit-tested not from the end-user perspective but explicitly from another contract.
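As a minimal sketch of such a Solidity test contract (assuming a SimpleStorage contract with set and get, like the one in the Quorum example above, lives under contracts/, and keeping in mind that the file would have to be named something like TestSimpleStorage.sol):

pragma solidity ^0.4.17;

import "truffle/Assert.sol";
import "truffle/DeployedAddresses.sol";
import "../contracts/SimpleStorage.sol";

contract TestSimpleStorage {

  function testSetAndGet() public {
    SimpleStorage simple = SimpleStorage(DeployedAddresses.SimpleStorage());
    simple.set(42);
    Assert.equal(simple.get(), 42, "the stored value should be 42");
  }
}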

Notes on modular AI infrastructure and Apps


AI algorithms and platform solutions are coming for sure. The only question is how an AI application should be designed from an architectural point of view. In classical computer systems there is usually something like an operating system that provides a low-level software architecture with certain basic services, and on top of it apps or applications that cover specific needs. The question is whether an AI application can be separated the same way, considering some low-level primitives as infrastructure and building specific apps or applications on top.

Proof of Useful Work in Hashgraph


Hashgraph algorithms actually do not really need a Proof of Work algorithm; the voting itself can be realized by the graph with relatively minimal cost. The only issue that remains is how the system should avoid Sybil attacks. One idea might be to use some kind of tokens for voting, where the token distribution can be controlled in some way. One option for the voting token distribution is simply a consortium network scenario; another is something similar to Proof of Stake.

However, the voting token distribution can also be controlled with the help of a Proof of Useful Work algorithm. Everyone who has done some amount of useful work gets tokens in the system to vote with. Such a mechanism provides several advantages compared to the classical Proof of Work of a blockchain system. In a Hashgraph setting we do not need rounds, so the work itself does not have to be scaled to blocks; it may vary and take widely different amounts of time. On the other hand, the proof of work does not have to be competitive as in Bitcoin, where the fastest wins and the rest of the effort is lost. Here, practically every piece of work is rewarded with tokens, so all of it is considered valid.

Certainly, some open questions remain. On the one hand, how can it be proved that the amount of work was really done? On the other hand, it must be analysed from a game-theoretical perspective whether maintaining the system is a Nash equilibrium.

Sunday, December 17, 2017

Solidity and Truffle Tips and Tricks - Computing contract data in a timed fashion


So, let us brainstorm a little about data calculation and data consistency in Solidity in general. Suppose we have the following contract:

contract DataSync{
   uint public syncVar;

  function computeVarSync() public {
    syncVar = <complicated computation>;
  }
}

In our example we have a syncVar variable that is public, so we can read out its value anytime we want. For setting the value, however, we have a function that contains a complicated computation that might require a lot of gas to execute. In such a case we might consider calling the setter function as rarely as possible; as an extreme example, a timer job might be implemented that calls the complex function once a week. Certainly, on the one hand, the use case must accept that the syncVar variable contains only the value computed the previous week. On the other hand, the timer job concept is not really decentralized, so we must make sure that the computeVarSync method can be called by the community and that calling the function twice in a row does not cause any unexpected effects.
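One hedged way to sketch this (the weekly interval and the block.number placeholder computation are made up for illustration) is to let anyone trigger the recomputation, but at most once per week:

contract DataSyncTimed {

  uint public syncVar;
  uint public lastComputed;

  function computeVarSync() public {
    require(now >= lastComputed + 1 weeks); // a second call in a row simply reverts
    lastComputed = now;
    syncVar = block.number;                 // placeholder for the complicated computation
  }
}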


Solidity and Truffle Tips and Tricks - Design principles and gas consumption


If you design Solidity code, it is not really practical to use the same design principles as in classical object-oriented design. The reason is that a good object-oriented design with multiple objects, references and perhaps indirect event delegation will cause your gas consumption to explode. Instead, first model which critical transactions should be carried out atomically, which transactions can be executed independently of each other, and which functionality or business logic does not necessarily have to run as a transaction at all. Based on this model, code with much more efficient gas consumption can be implemented.

Solidity and Truffle Tips and Tricks - Constant Function


Solidity functions can be marked with the constant modifier, which theoretically means that the execution does not cost gas, because the function is run only by a local node. However, after a couple of simulations and measurements, constant functions turn out to work a little funny at the moment.

- If the constant function is called directly from outside: it does not cost gas. 
- If the constant function is called by another non-constant function but from the same contract: it does not cost gas.
- If the constant function is called by another non-constant function which is located in a different contract: it does cost gas. 
- If the constant function is called by another constant function which is located in a different contract: it does not cost gas.

Certainly it is an interesting question how this will work in the future, as the constant modifier will be replaced by two new ones: view and pure. As a general conclusion, however, even with such modifiers there is not always a gas saving.


Solidity and Truffle Tips and Tricks - Reading public variable with console


When developing with Truffle and Solidity, it is sometimes necessary to test the contract explicitly from the Truffle console and read out public properties. Well, this is not as easy as it seems. The logical way:

Contract._publicVariable;

does not really work. Instead, an additional reference to the deployed contract has to be obtained, and the public variable has to be called as a function, since it is actually a getter function.

var myContract;

MyContract.deployed().then(function (a) {myContract = a;});

myContract._publicVariable();



Saturday, December 16, 2017

Solidity and Truffle Tips and Tricks - Debugging a unit test error


Well, Truffle and Solidity will surely be a great thing one day, but today it is still a little bit difficult to debug things. Suppose you have a smart contract, you have written some unit tests for it in Solidity as well, and you get an error message, something like:

 Error: VM Exception while processing transaction: invalid opcode

What you can try is to open a second window with the truffle develop --log command, rerun the unit tests and get the last transaction id from the log window. Having the transaction id, you can try to debug your code in the truffle window with:

 debug <transaction_id>


Solidity and Truffle Tips and Tricks - object Promise


Working with Solidity, Truffle and the web3 JavaScript console, you might get the following message if you want to write out an address to the console:

console.log(contractReference.addressField());

[object Promise]

On the one hand, this is the right way to get data from a public field of a contract, because public fields are automatically converted into getter functions. On the other hand, this funny result appears because Truffle resolves the call to a promise, and console.log prints only the compacted version of it. If you want to get the field value, use instead:


contractReference.addressField().then(
               function(result){console.log(result);});


  




Tuesday, December 12, 2017

The last bastion of global enterprises


The last bastion of the global enterprises seems to vanish. As an individual or a small team it is already not too difficult to work remotely, build sales channels globally or recruit teams internationally. The last critical resource that used to be available only to global enterprises is capital. However, that is not the case anymore. With the ICO mechanism and token economics, every idea, every project and every small team has the same chance to get global funding as practically any big enterprise. This was pretty much the last benefit, the last bastion of the global enterprises.

Sunday, December 10, 2017

The end of Blockchain .... long live the Hashgraph ?



Without any Proof of "Waste" algorithm, without incentives, without transaction fees, provably Byzantine, with implicit decentralized voting, limited network usage and maximal scalability...
Actually, I still have not read the proofs, but if only half of what they claim is true, this is going to be a big one.

Solidity and Truffle Tips and Tricks - private variable visibility


There is a possibility to explicitly set a variable to private, like:

uint private _privateVariable;

It means that the variable is not accessible directly from the outside world, nor is it accessible from inherited contracts. This is in contrast to the default visibility (without any modifier), which is also not reachable from the outside world but is accessible from inherited contracts. However, it is important to note that "private" only means that the variable is not visible to other contracts; the blockchain nevertheless contains the information publicly! So it is not the best idea to store really private information in such variables, even if they are marked as "private".
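As an illustration (contractAddress is a placeholder), anyone can read the raw storage of a deployed contract with web3, including "private" variables:

// slot 0 holds the first declared state variable, here _privateVariable
web3.eth.getStorageAt(contractAddress, 0, function (err, value) {
  console.log("raw storage slot 0:", value);
});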

Saturday, December 9, 2017

Notes on the economics of Ethereum gas consumption


Surprisingly, the fact that code on the Ethereum blockchain is priced by gas consumption and the price of gas is set practically by a market mechanism implies that using Ethereum Classic can be a much more logical choice than Ethereum itself. The reason is that ether in Ethereum is not only used as a way of paying for performance on the system; it also acts as a vehicle for investment and speculation. This implies that the perceived value of the platform directly drives the price of a transaction. As a consequence, if the platform becomes mainstream, the transaction cost will skyrocket in the short run, because computing power does not grow as fast as demand. In the long run this effect could be stabilized by decreasing transaction costs; the surprising experience, however, is that this does not happen. As an example, comparing the average transaction fee of Ethereum and Ethereum Classic, we see a 15 - 20x difference, which is actually the same as the price ratio between ether and ether classic. Certainly, the higher transaction price is compensated by higher security, but not all applications running on the blockchain need this higher security.

Thursday, December 7, 2017

Decentralized Gamified Organisation


The employment models, the corporate structures, the organisations and actually the whole work culture are in a heavy phase of change throughout the world. The reason is that most of the structures and philosophy come from the industrial age, where the basic model was the conveyor belt, which was more or less adapted to white collar jobs, including job descriptions, job hierarchies, performance based measurement and, of course, the most hyped word of our days, business efficiency.

In the last fifty or sixty years this seemed to be pretty normal: spending eight hours in a workplace, doing well-defined intellectual work, like filling in Excel tables, and being measured by that efficiency. However, this model no longer seems maintainable. One reason is the increasing automation of white collar jobs; the other is the increasing presence of artificial intelligence. This implies, on the one hand, a changed skill set on the employee side, like more flexibility, more creativity, social skills and even skills for working with artificial intelligence. On the other hand, it implies radically different organisations and organisation structures, and I do not only mean that the corporation is located on several continents and that personal meetings are replaced by online meetings, but an actually radically different organisation structure.

Certainly no one knows exactly how such an organisation structure would look, but some elements can be identified:
- The new organisation structure should not merely be online; it must be born decentralized and online.
- It should look less like a classical organisation of today and more like a present-day community.
- It should not be based on processes but rather on some general internal rules that might even be changed by the community.
- The whole internal working structure should be based on tokens, tradable tokens, with maximum transparency.
- Every activity that can be automated should be automated or supported by automation such as artificial intelligence.
- Wherever people work, the work itself should be maximally gamified; human performance should be achieved through playing games.
- There should actually be the possibility to play different styles of games to do the same corporate activity.
- The border of the organisation should not be taken too seriously; the corporate "game" must be played together with vendors, suppliers, customers....

If this seems idealistic, just think of the Linux Foundation and the whole open source community; they work pretty much this way, and as Linux has practically beaten Microsoft Windows in the operating system competition, I would say they have been pretty successful.

Let us call this new organisation structure a Decentralized Gamified Organisation.