...by Daniel Szego
"Simplicity is the ultimate sophistication."
Leonardo da Vinci

Friday, May 21, 2021

Intermediate CA with Explorer or Gateway on Hyperledger Fabric

 


If you use Hyperledger Fabric with hierarchical CAs and want to use Hyperledger Explorer or a gateway, you may run into the situation that the tlsCACerts property for the TLS communication must be configured. The catch is that neither the intermediate TLS cert nor the root TLS cert alone will work here: you should create a file that contains the whole certificate chain (simply concatenate into one file first the root cert, then the direct intermediate cert, then the next intermediate cert, and so on). 
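The chain file can be assembled with a simple concatenation. A minimal sketch, with stand-in file names (root-tls-ca.pem, intermediate-tls-ca.pem and tls-ca-chain.pem are placeholders, substitute your own certificates):

```shell
# Stand-ins for real PEM files, only so the sketch runs anywhere;
# in practice these are your CA's actual TLS certificates.
printf -- '-----ROOT TLS CA-----\n' > root-tls-ca.pem
printf -- '-----INTERMEDIATE TLS CA-----\n' > intermediate-tls-ca.pem

# Order matters: root first, then each intermediate down the chain.
cat root-tls-ca.pem intermediate-tls-ca.pem > tls-ca-chain.pem
```

Then point the tlsCACerts path in the connection profile at the resulting tls-ca-chain.pem.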

Friday, December 25, 2020

Viewing channel transaction and genesis block in Hyperledger Fabric


Channel transaction and genesis block in Hyperledger Fabric are stored in an encoded way. However, you can take a look at the content of the files with either the configtxgen or the configtxlator tool, with the help of the following commands: 

Viewing the content of the genesis block with configtxgen:

  configtxgen -inspectBlock genesis.block

With configtxlator, you have to start the tool first:

  configtxlator start

Viewing the content of the genesis block with configtxlator:

  curl -X POST --data-binary @genesis.block http://127.0.0.1:7059/protolator/decode/common.Block > genesis.json

Viewing the content of the channel.tx with configtxlator:

  curl -X POST --data-binary @mychannel.tx http://127.0.0.1:7059/protolator/decode/common.Envelope > mychannel.json

Viewing the content of the anchor transaction Org1MSPanchors.tx with configtxlator:

  curl -X POST --data-binary @Org1MSPanchors.tx http://127.0.0.1:7059/protolator/decode/common.Envelope > Org1MSPanchors.json



 


Hyperledger Fabric CLI commands summarized

 


Some of the most important Fabric CLI commands are the following.

Create channel:

  peer channel create

Fetching block and channel information; this is required when a new peer joins the network:

  peer channel fetch

Joining the channel:  

  peer channel join

Listing channels:

  peer channel list

Updating a channel (e.g., when sending an anchor peer update):

  peer channel update

Package chaincode:

  peer lifecycle chaincode package

Installing chaincode (must be executed at each peer):

  peer lifecycle chaincode install

Query installed chaincode:

  peer lifecycle chaincode queryinstalled

Approve chaincode (must be executed by each organisation):

  peer lifecycle chaincode approveformyorg

Check if the chaincode is ready to be committed:

  peer lifecycle chaincode checkcommitreadiness

Committing chaincode:

  peer lifecycle chaincode commit

Check if the chaincode is committed:

  peer lifecycle chaincode querycommitted

Invoking chaincode:

  peer chaincode invoke

Query chaincode:

  peer chaincode query






 


Friday, November 27, 2020

On decentralized artificial intelligence markets

At a conference, I was recently asked what I think is the most disruptive technology in the world at the moment. I chose without hesitation, and perhaps surprisingly, I didn’t choose quantum computing, IoT, or even the blockchain, but decentralized artificial intelligence markets.

The topic began to unfold during the 2017 ICO hype, when several competing and similar platforms were launched, such as Neureal or SingularityNET. The basic technical and economic observation is that at the moment, much of the mainstream artificial intelligence and machine learning market is concentrated in the hands of a small number of players who use the results largely in specialized areas. Some of the actors, for example, are cloud providers or social media platforms that use the collected data to make their own product portfolio more marketable, such as through more targeted and positioned ads. Other major players are government and / or military users, which also focus only on specific, well-defined areas. On the one hand, the social benefits of these uses are fundamentally doubtful, and on the other hand, smaller firms do not have much chance of putting together quality AI competencies and algorithms. Decentralized artificial intelligence / machine learning platforms are trying to help with this by effectively democratizing AI.

The basic idea is that many different research universities in the world, or even AI hobby groups, are working on improving various algorithms for each subfield of artificial intelligence. Most of these initiatives can be found, for example, on GitHub as open source, for anyone to use. However, these initiatives very often do not reach concrete commercialization, even when the idea would be commercially viable, due to marketing difficulties: in practice, the AI researcher is not a professional businessman, and bringing such technology to market is often even more challenging than usual.

Decentralized AI markets work similarly to, for example, a mobile app store: each developer or researcher can create a running service from their own algorithm, which can be sold through the market and purchased by specific users. For example, if I have a very good algorithm that translates from Chinese to Hungarian, I can implement this idea as a service that I register on a decentralized AI market. After that, anyone who needs this service can use it through the market for a fee. This fee goes to the developer, the creator of the algorithm. One of the great advantages of such market platforms is that not only can end users use the results of individual AI services, but individual services can also build on each other.

For example, the Chinese-Hungarian translation algorithm is not necessarily bought by end users, but may be built into another AI algorithm that subtitles movies automatically. On the other hand, our algorithm may itself need other AI algorithms, such as Chinese or Hungarian dictionary programs, to work. These can also be found and incorporated into our own algorithm through the decentralized AI market (Figure 1). In this way, each AI algorithm forms an elementary and reusable "lego" block from which complex applications for end users are built.


The best-known decentralized AI platform is SingularityNET, backed by Hanson Robotics, famous for the Sophia robot, and AI legend Ben Goertzel. The platform has been implemented on the Ethereum blockchain and the IPFS (InterPlanetary File System) platforms; however, the architecture is prepared to use other blockchain technologies in the long run. The system's components:

- A decentralized database (registry) for registering, discovering, and using AI services.
- Market mechanisms for selling and buying AI services, a token system, and related components such as a crypto wallet.
- Components that facilitate the use of AI services, such as off-chain channels that implement fast, low-fee token transfers.
- Development tools and libraries that facilitate the implementation of services.


However, apart from the fact that decentralized AI platforms are trying to fill an actual market niche in the short term, several initiatives have much more ambitious goals. Most successful AI applications on the market try to provide a dedicated solution for a relatively narrow area, such as recognizing special patterns and shapes in images. These so-called narrow AI solutions can sometimes work more effectively than human problem solving, but only on a specific, narrow topic. Some initiatives, such as SingularityNET, ultimately aim to create artificial intelligence systems that deliver performance comparable in all respects to human thinking and problem solving (so-called AGI, Artificial General Intelligence). There are several theories that such general artificial intelligence can be achieved by some kind of network-based interconnection of narrow AI systems:

-    On the one hand, in artificial intelligence theory, so-called agent-based systems attempt to achieve much greater problem-solving capacity than their separate components by combining independent AI components (so-called autonomous agents) (e.g., Marvin Minsky's Society of Mind).
-    On the other hand, some neurobiological research also suggests that the human brain and intelligence are based on the cooperation of several highly specialized subsystems located in different places.

Of course, it is an interesting question whether this kind of general intelligence can be created this way at all, and whether it would actually reach the human level in problem solving. This will probably only become clear when one of these platforms reaches a critical mass: a large number of implemented AI services that interact strongly with each other. Until then, however, the area provides a solution for a valuable market niche: quality AI services can be provided and consumed efficiently in a decentralized manner.




Wednesday, October 28, 2020

CA backup and recovery in Hyperledger Fabric

 


The Certificate Authority (CA) plays a critical role in production Hyperledger Fabric networks, although this role is not always visible at first sight. Some of the important characteristics:

- The CA does not need to run continuously in the Hyperledger Fabric network.

- If the CA is down, no new certificate can be registered or enrolled, but the rest of the network keeps working without error. 

- If the CA database is faulty or lost, no new certificate enrollment can be done for already registered users.

- If the CA database user information is compromised, attackers might enroll new certificates for existing logins. 
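Given these risks, regular backups of the CA state are advisable. A minimal sketch, assuming a standalone fabric-ca-server deployment with its default SQLite database; the file names below are the defaults and stand-ins are created only so the sketch runs anywhere:

```shell
# Stand-ins so the sketch is runnable; in a real deployment these live
# under $FABRIC_CA_SERVER_HOME:
mkdir -p msp/keystore
printf 'registrations' > fabric-ca-server.db   # identities + enrollment records
printf 'signing key'   > msp/keystore/key.pem  # the CA's signing material

# Copy the database and the MSP folder to a backup location.
mkdir -p ca-backup
cp fabric-ca-server.db ca-backup/
cp -r msp ca-backup/
```

Restoring is the reverse copy while the CA server is stopped; with a production database backend (e.g. PostgreSQL), use that database's own backup tooling instead.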

   

Wednesday, October 7, 2020

Off-chain computation via Oracle


On-chain computation is sometimes too expensive for critical data. One possible solution is to outsource the computation off-chain, for example with the help of an external oracle system. There are two common approaches to such an off-chain computation: 
- with the help of a trusted external oracle: the problem, of course, is that the off-chain actor has to be trusted. 
- with the help of a decentralized oracle system: here several independent off-chain actors compute the result and are incentivized, for example through game-theoretic token mechanisms, to produce the correct result. 
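The decentralized variant can be illustrated with a toy sketch (purely illustrative, not any specific oracle protocol): several independent reporters each submit a value, and an outlier-robust aggregate such as the median is taken, so a single faulty reporter cannot skew the result.

```shell
# Five hypothetical reporters; one of them (250) is faulty or malicious.
reports="101 99 100 250 100"

# The median (3rd of 5 sorted values) ignores the outlier.
median=$(printf '%s\n' $reports | sort -n | sed -n '3p')
echo "agreed result: $median"   # prints "agreed result: 100"
```

Real systems add an incentive layer on top: reporters stake tokens and lose their stake if their report deviates from the agreed result.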


Friday, October 2, 2020

Ethereum solidity security tools summarized

 


Security in Solidity and Ethereum has always been one of the most important topics. Some of the currently most important tools are the following:

SWC Registry - Smart Contract Weakness Classification and Test Cases

https://swcregistry.io/

Ethereum Smart Contract Security Best Practices 

https://consensys.github.io/smart-contract-best-practices/

MythX - a cool tool for getting information on Solidity smart contract vulnerabilities 

https://mythx.io/ 

EthLint - an open source tool for analyzing Ethereum smart contracts. 

https://github.com/duaraghav8/Ethlint

Slither - for static code analysis of Solidity contracts

https://github.com/crytic/slither

Hydra - framework for security and bug bounties

https://github.com/IC3Hydra/Hydra



How to get test DAI on Kovan

 



Getting test tokens on the test nets is not always simple. As an example, on Kovan you can get test DAI for ether using the following repo: https://github.com/Daniel-Szego/DAIFaucet

The process is simple: 

Getting DAI test tokens on Kovan

Simple interface for changing ETH to DAI with the help of Uniswap

Kovan deployment: 0x786e3c83cd270414649079A758Ad92f961EDdA0A

Usage (Kovan only): 

Send ether to the DAIFaucet smart contract: 0x786e3c83cd270414649079A758Ad92f961EDdA0A
Be sure that the gas limit is high enough, e.g. 300,000, because it is a contract call.

The exchanged DAI tokens will be available on your address. We use the DAI token with the following address (on Kovan): 0x4F96Fe3b7A6Cf9725f59d353F723c1bDb64CA6Aa

The exchange rate depends on Uniswap; it can be far from the mainnet exchange rates.

Kovan DAI test tokens only - do not use in production!

 



Sunday, July 26, 2020

CBDC and the Blockchain

In recent years, news reports have appeared one after another about various CBDC (Central Bank Digital Currency) studies and possible technological implementations. Most of these studies can be linked to various central banks (such as the Bank of England, the European Central Bank, the Central Bank of Sweden ...), where there is usually no specific proposal for the implementation of a specific technology. In this sense, it is questionable whether a blockchain platform would be the right technology to implement a CBDC use case. On the other hand, specific technology companies, usually using some distributed ledger technology, develop and publish CBDC implementation architectures. Examples of such developed technology architectures are the CBDC proposals based on the Consensys Ethereum, R3 Corda and Algorand platforms. In this article, we take a closer look at Consensys' Ethereum-based proposal (source: https://consensys.net/solutions/payments-and-money/cbdc/ ).

For the successful implementation of a CBDC, it is worth summarizing the requirements of the CBDC at both technology and use case level in general:

1. Type of CBDC: The very first question is whether the CBDC should execute retail, interbank, or both types of transactions.

2. Token-based or account-based.

3. Monetary policy and token allocation: an important planning question is what the monetary policy for the digital money is, how much of it is in circulation on an annual basis, and who controls issuance. Examples of such issues include determining the interest rate and / or token inflation.

4. System change and decision mechanisms (governance): similar to the previous point, another question is who can change the various parameters of the system and how.

5. Privacy versus transparency: it is generally technically difficult to satisfy both properties and, in addition, to comply with all applicable AML / KYC regulations. Here, it is worth setting the priorities differently for each use case.

6. Performance, robustness, stable operation, and scalability: While stable operation, robustness, and high availability are a matter of course for most blockchain protocols, speeds such as the number of transactions per second cannot in many cases be met trivially.

7. Legal requirements and applicable legal environment.

8. Risk Analysis: One of the biggest challenges for CBDC is perhaps its risk-free implementation without compromising the existing financial ecosystem.

Consensys proposes a multi-tier architecture in general to implement a CBDC.

- The core of the system consists of a consortium blockchain network, run by the central bank and similarly significant players, that regulates the issuance and administration of the base money token (layer 1).

- The second layer connects this basic system with other major financial actors through so-called state channels, which allow private and fast asset / token transfer (layer 2).

- The third layer implements the quasi-retail CBDC with channels subordinated to the main system, so-called side channels (layer 3).

 - The last layer of the system would provide services to end users, either directly or through additional technology and / or fintech providers (layer 4).

Although the system outlined is not a plain blockchain protocol, but a combination of several decentralized systems, state and side channels, it is expected to meet the requirements for a CBDC:

- Token issuance is fully controlled by the Central Bank.

- Quasi real-time token transfer.

- High number of transactions per second.

- Large number of participants with different roles.

- Well controllable and configurable transaction visibility, compliance with various KYC / AML regulatory requirements.

- An acceptable amount of energy required to maintain the system (Proof of Stake).

To the best of our knowledge, an Ethereum-based CBDC implementation does not yet exist. Thus, of course, an interesting question to be answered in the future is exactly how well the architecture outlined by Consensys fits in with a particular CBDC implementation.