...by Daniel Szego
"On a long enough timeline we will all become Satoshi Nakamoto.."
Daniel Szego

Sunday, December 18, 2016

Towards unmanned software development and delivery

In the good old days custom software projects were simple: there was some kind of initial interaction with the customer called specification or analysis, then development and architecting of a solution, and at the end some kind of delivery. Independently of whether the project was run in a classical waterfall style or in a fancier agile style, software projects had the following roles:
- Business Analyst / Industry Business Consultant: to help the customer create / update / finalize the specification.
- Software Architect / Developer: to implement the software itself.
- Infrastructure Expert: to deal with special infrastructure-oriented questions.
- Tester: to guarantee software quality.
- Project Manager / Scrum Master: to coordinate the whole process.
- Sales: to sell the software.
In the meantime, complex development frameworks have appeared as well, making custom software development simpler from the end-customer point of view, usually reducing development effort and enabling much faster, more agile and less buggy software delivery. Typical examples of such frameworks:
- Rapid application development frameworks: these are practically development-oriented frameworks that make coding much faster, implying fewer coding resources. An example is the Enterprise Library for the .NET Framework.
- Complex business solutions: there are a lot of initiatives to provide a framework with many use cases out of the box, ranging from the most common user and identity services to complex business applications or apps, again reducing the need for resources in the classical coding and testing fields. An example is Odoo Business Apps.
- Self-service solutions: last but not least, there are many solutions that work in a self-service manner, giving end-customers without prior development or even IT knowledge the possibility to build software applications themselves. Examples are elements of the Azure cloud, like PowerApps or Power BI.

Certainly there is a catch to using these frameworks. Even if they are theoretically self-service or provide a lot of functionality out of the box, in real life they reach a highly increased technical complexity. Configuring these frameworks usually requires a lot of know-how; as a consequence, delivering solutions with such frameworks requires highly increased IT consulting (Figure 1).



Figure 1. Roles and resources in different IT projects.

So the question is in which direction we can imagine custom software development and delivery evolving further. I think the following major factors should be considered:

- Sharing economy: as software is not necessarily developed from zero but rather configured together from half-ready parts, sharing parts of the software will play an ever increasing role. Something similar is already happening on the cloud backend side: instead of individual software, everyone tries to concentrate on creating and sharing (or selling) micro-services. On the other hand, certain software configuration frameworks like Azure PowerApps already contain built-in features to share part of or the whole application. It is important to note that sharing does not only make an application available, but the collected domain and industry know-how as well. As an example, supposing that I spent 20 years in the logistics field and I create and publish my custom logistics application, then indirectly part of my collected domain know-how will be shared as well.

- Electronic markets: electronic markets are similar to the sharing economy, making it possible that the collected knowledge is used by someone else. Direct applications of such a market are selling and reselling building blocks of a software or whole ready-to-go solutions.

- Machine intelligence: although machine intelligence is pretty much in an over-hype cycle at the moment, it can be used for analyzing or refining the specification of a software. Creating the first requirement analysis with machine intelligence is certainly difficult, although there are some attempts for that, like in process mining. However, once a version of a software is set up, it is simple to monitor the usage of the software itself, analyze the data with machine learning and automatically propose a new, better version of the software (see the sketch after this list). An example of such a framework is The Grid, fully automated web pages based on machine learning of web traffic.

- Robo-advisors: to decrease the complexity of a configuration framework from an end-customer perspective and eliminate the need for IT consultancy, robo-advisors can be used. Although such advisors are more typical in the fintech field at the moment, it is not illusory to imagine that similar technology can be implemented in the software development branch.
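As a minimal illustration of the usage-driven adaptation idea mentioned above, here is a small Python sketch (all names and data are hypothetical) that counts feature usage in a click log and proposes a reordered menu for the next version; a real system like The Grid obviously works very differently.

    from collections import Counter

    def propose_menu_order(click_log, current_menu):
        """Propose a new menu order based on observed feature usage.

        click_log    -- iterable of feature names the users actually clicked
        current_menu -- the menu order shipped in the current software version
        """
        usage = Counter(click_log)
        # Most-used features first; unused features keep their relative order at the end.
        return sorted(current_menu, key=lambda f: (-usage[f], current_menu.index(f)))

    # Hypothetical usage log collected from the running application.
    log = ["invoice", "invoice", "report", "invoice", "settings", "report"]
    menu = ["settings", "report", "invoice", "help"]
    print(propose_menu_order(log, menu))  # ['invoice', 'report', 'settings', 'help']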

So, as a conclusion, let us imagine, perhaps in the not so far future, a classical software development example: an end-customer needs some kind of custom software to support his business processes. Let us investigate which human resources and roles we need for that:

- Do we need Business Analysts - Software Architects - Business Consultants for creating and delivering a custom software solution? Well, not necessarily. Firstly, ready-to-go software solutions can be bought via online markets or simply used via sharing platforms. If the end-customer does not find a ready solution, custom software configurations can be used in a self-service manner, with the help of robo-advisors and machine learning, to create the necessary solution.

- Do we need Developers - Software Architects or Testers for creating and delivering a custom software solution? Well, not necessarily. If we work with configurable software solution frameworks, most software is produced in a self-service manner. Certainly such frameworks have to be implemented first in a classical way, but once they exist, concrete software solutions can be realized without developers, software architects or testers.

- Do we need Infrastructure Experts for creating and delivering a custom software solution? Well, not necessarily: if everything is hosted in the cloud, then only minimal infrastructure expertise is required.

- Do we need sales people for selling a custom software solution? Well, not necessarily: if everything is sold via online markets and application stores, then further sales support can be avoided.

- Do we need a Project Manager for creating and delivering a custom software solution? Well, not necessarily: if we consider the previous points, the only human in the software delivery process is the customer, so project management is not required.



Notes on the drawbacks of business productivity


Most business productivity software solutions and company management philosophies expect that people work as machines, with maximum efficiency and productivity, implementing some kind of usually meaningless company process in a way that can be monitored and presented to the management team as reports. However, it is not clear why companies are surprised if the people then make decisions like machines as well, simply based on profit maximization, like changing company for 5% extra salary and having zero innovation or passion at work.

Tuesday, December 13, 2016

Notes on Software Project Management


Having spent a decade with different software projects and development methodologies, I would conclude that they are somehow not really optimal in many real-life scenarios, independently of whether we speak about classical waterfall-style software development or about a rather modern agile or Scrum method. In the following I try to summarise a couple of points that I see as important based on my experience.
1. Specification: most software project methodologies suppose that there is some kind of specification for the software; it is relatively well specified in waterfall and much more flexible in agile. In real life, however, the first question of the customer is whether you already have something ready. The second question is whether you have at least something half-ready for prototyping. The reason is that creating even a relatively good specification, agile or not, takes time and money. Certainly it can be supported by the business analyst role, but then it takes even more money and time. As a consequence, custom software development is usually the last choice in any corporate purchasing process. Customers usually prefer to get something ready, or at least half-ready, and make modifications to that.
2. Architecture: there are a lot of general frameworks that deliver many use cases out of the box; other frameworks provide self-service use-case development even for people without prior development knowledge. In this sense, one of the most important steps of every software development project is to choose the right software architecture.
3. Agility: agility can be interpreted only within a certain software architecture. Each architecture provides several use cases out of the box or easy to set up, while others are very difficult to implement. One of the biggest problems of any project is when the basic architecture has to be redesigned during the project.
4. Lean software development: lean software development can be interpreted as a way of delivering usable software to the end-customer as fast and as efficiently as possible. In this sense it is worth evaluating whether part of the specification, or its revision for a new release, and the related development can be supported by machine learning algorithms. Certainly the architecture has to support such an improvement as well; however, it is not difficult to imagine that, for instance, the user interface structure is automatically adapted just by analysing the usage of the software itself. There are initiatives for that, for example The Grid.
5. Reuse - Resell: most software companies do not want to sell a project only once but several times, even if it was not meant to be developed as a product in the first run. In this sense, one of the most important questions for each software project is reuse and resale: how can the software project be sold to several customers, and how can an implemented business case be used in several projects or even on several platforms?
    

Monday, December 12, 2016

Competitive Analysis for Decentralised Business Process Management


The competitor landscape has been analysed with the help of two frameworks. Figure 1 uses the classical ambition matrix, positioning Decentralised Business Process Management, or simply DBPM, as a rather further-out than closer-in innovation. We positioned multiple versions of DBPM on the market-segment axis, because creating a cross-company business process between two companies should not necessarily be regarded as a new market, perhaps not even an emerging one. However, creating business processes between companies, individuals and IoT devices surely manifests as a new market segment.
Figure 1. Industry and competitive analysis with ambition matrix
Figure 2 summarises the competitors in a rather architectural way, concentrating on two dimensions:
1. On the high-trust architecture dimension, the question is whether the technology is based on Blockchain, providing a scalable high-trust architecture by design, or on a classical client-server framework (or perhaps an old-fashioned mainframe).
2. The diversity of actors analyses how many different parties should be considered for a business process, ranging from a one-company process to the processes of multiple companies, individuals or intelligent gadgets.

Figure 2. Solution segmentation
Based on the two orientation diagrams, we would argue that enterprise-ready classical business process solutions should be regarded as direct competitors of DBPM. They do not necessarily focus directly on a cross-actor market, apart from some examples like Kofax, but enterprise-ready basically means that they can be extended with some further development to a cross-company or even cross-intelligent-gadget scale. For example, some of the ‘big players’, like Oracle, IBM or SAP, implement cross-company functionalities as well. However, on the one hand they are not based on Blockchain solutions, and on the other hand they require that the same solution is installed and licensed in all of the trusted companies, like SAP processes between subsidiaries of a company. Enterprise business process solutions can clearly compete on markets where processes between a few previously trusted companies should be implemented, but they will have a weakness as soon as processes between a larger number of untrusted actors should be set up.
As indirect competitors we regard the whole smart contract market, including general platforms like Ethereum, Counterparty (a Bitcoin-based smart contract system) and systems like Hyperledger. These platforms clearly provide the possibility to set up trust between different untrusted actors based on smart contracts and Blockchain; however, they have a rather general focus and not a BPM-specific one. As an example, a specific decentralised business process can surely be implemented as a set of hard-coded smart contracts, but hard-coding one such company cooperation is completely different from providing a whole platform on which such processes can practically be clicked together. Nevertheless, the risk that someone develops a competing solution on top of these frameworks is surely bigger than zero.
Companies that do something in the direction of multi-party trust systems or business rules can be regarded as potential competitors. The range here is pretty big, starting from business rule engines like Decision, through digital business logic platforms like Drools, to general IoT solutions like Ritc. They seem to be pretty far from DBPM at first sight; however, they have a similar and strong technology basis that could be developed further in the direction of decentralised business process management without too much effort.
Based on the previous analysis, our competitive strategy is based on two major building blocks:
1. Decentralised processes among a large number of possibly different kinds of untrusted parties, including companies, individuals and intelligent gadgets.
2. A low-code solution in cooperation with local consulting companies of different industrial fields, making it possible that decentralised processes are set up directly by the local experts of a certain field, or even by end-customers in a self-service way.
We see that the market is pretty much in the emerging phase; hence the strategy is based on segments, like IoT communication, that are emerging markets themselves. As a consequence we see the market and our competitive strategy as a blue ocean one.

Financial Framework for Decentralised Business Process Management


Setting up a financial framework for analysing DBPM (Decentralised Business Process Management), we are going to use the following considerations:
From a unit-economics point of view, the most important element that directly carries the cost and indirectly generates revenue is an atomic transaction t. A process p is set up by executing and validating a sequence of transactions t.

In the business model we distinguish two levels of the business. The core platform is responsible for executing decentralised transactions and processes. On top of the core platform, industry-specific solutions can be delivered with the help of partner companies. In the long run, industry-specific solutions could be implemented directly by partner companies; however, this is probably not realistic in the short run. As a consequence, the initial financial models consider both the development of the core platform and some of the industry-specific solutions.
The cost structure is based on the following elements [Figure 1]:

1. Variable costs:

a. CoGS is mostly based on executing and validating the transactions on a decentralised consensus. It can easily be scaled up or down depending on customer needs. As opposed to a general Turing-complete Blockchain system, where executing a transaction might be pretty expensive due to the possibility of infinite loops, in DBPM the runtime of the atomic transactions is always bounded from above, implying O(|T|) execution cost for a set of transactions T. The exact cost depends very much on the consensus mechanism, implying different numbers for proof of work, proof of stake or majority voting (see the sketch after this cost structure).

b. CoGS is also influenced by storing the states of the validated transactions of a process p. This manifests as a general storage cost that increases as the process itself is used. It requires further consideration whether the whole state between two transactions is directly and fully stored in the Blockchain itself, or whether an off-chain solution is used and only a hash of the state is stored in the Blockchain.

c. Supporting the system for partner companies or end-users manifests as an additional variable cost.

2. Fixed cost factors are mainly the SG&A expenses for running the company itself. As the primary focus is on the partner business, the most important element is professional business development as CAC.

3. Investment: for a successful start, both the core framework and the first two or three industry-specific solutions must be ready. These require a certain amount of development and initial investment.
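To make the unit economics above a bit more tangible, here is a purely illustrative Python sketch of the variable cost of running a process as a sequence of transactions; all per-transaction prices are made-up assumptions, not DBPM figures.

    # Illustrative per-transaction validation cost by consensus mechanism (made-up numbers).
    CONSENSUS_COST = {
        "proof_of_work": 0.5,
        "proof_of_stake": 0.01,
        "majority_voting": 0.002,
    }

    STORAGE_COST_PER_STATE = 0.005  # assumed cost of persisting one validated state on-chain

    def process_variable_cost(num_transactions, consensus, store_full_state=True):
        """Variable cost of a process p: O(|T|) validation cost plus per-state storage.

        If store_full_state is False, only a hash of each state is kept on-chain,
        which we assume here to be roughly ten times cheaper.
        """
        validation = num_transactions * CONSENSUS_COST[consensus]
        storage_unit = STORAGE_COST_PER_STATE if store_full_state else STORAGE_COST_PER_STATE / 10
        return validation + num_transactions * storage_unit

    # Example: a 1000-transaction process validated by proof of stake, storing only state hashes.
    print(process_variable_cost(1000, "proof_of_stake", store_full_state=False))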

The revenue structure has the following factors [Figure 1]:

1. Core services: customers get the value of using a certain business process itself. As executing a business process practically means executing a set of state-changing transactions, it makes sense to charge money per transaction.

2. Extended services might be possible as well, such as providing training or consulting. It is, however, also a possibility that the company concentrates only on core functionalities and every extended service is provided only by partner companies.


Figure 1. General cash-flow schema.

Business Model for Decentralised Business Process Management


Decentralised Business Process Management, or simply DBPM, is intended to be a strongly infrastructure-oriented solution; the business model is based on an intensive partner network. The platform is not planned to be sold directly to the end-customer; instead, partner companies having the industry-specific know-how set up solutions on top of DBPM [Figure 1]. A more detailed description of the ten types of innovation is covered in the plan chapter as the general operation model.
Figure 1. Business Model with Ten types of innovation
As DBPM is based on a Blockchain protocol, both cost drivers and profit should be based on transactions. It is certainly a question which digital consensus is applied: in a Proof of Work model, each transaction or block validation directly costs a certain amount of energy and money, called mining. In Proof of Stake, or in a general consensus mechanism like voting at Ripple, rather a general hosting and participation cost manifests. In any case, the profit model should be based on charging the customers per usage, for individual transactions or batches of transactions.
From the network perspective, the DBPM solution would not be offered directly to end-customers; instead, the service would be sold with the help of strong B2B cooperation via partner companies. Examples can be:
-   consulting companies for setting up cross-company contracts and processes;
-   IoT solution providers for setting up decentralised processes between hardware devices, for example for smart homes.
The structure and processes of the DBPM company would be pretty similar to those of classical enterprise software vendors. The core solution is continuously developed; based on the core solution, industry-specific features are developed. Instead of internal development, a possible way is to work with a strong open source community and develop the framework either with freelancers or with partner companies, just as at the Ethereum Foundation. Another aspect that must be taken into consideration is the network of hosters or miners who actually run the whole system. Depending on the Blockchain solution, miners and hosters are paid either via an internal cryptocurrency system or directly based on revenue.
At the centre of the product performance the core platform can be found: a decentralised P2P process management platform, in which each workflow state transition is validated by a consensus of the whole network. Apart from the core, the product system will contain industry- and branch-specific solutions and features.
One of the most important features from a customer point of view is that DBPM is a no-code solution. It means that business processes can be ‘clicked’ together; in most cases no software developer is required. This makes a strong impact both on the service and on the customer engagement side. As a consequence of no-code development, distributed business processes can be directly built and modified by the knowledge workers of a specific industry. In this way rapid solution development and continuous process improvement can be realised, instead of long-running development life cycles.
As a consequence of no-code development, sales channels are best realised in cooperation with local consulting companies. As an example, one of the Big Four or Big Three companies could be an excellent candidate to use DBPM with local consultants, indirectly realising common sales channels as well.
Branding must suggest the core business idea and must convey strength and stability. From a technology point of view, the name of the core technology can remain DBPM; however, it is adequate only for a strongly technical audience.

Thursday, December 8, 2016

Notes on hybrid Blockchain solutions


Blockchain solutions provide a couple of nice characteristics based on the public ledger: high availability, high trust and immutability. Unfortunately, they sometimes lack the industrial scale that would provide a cheap and easy way to work with the Blockchain, or simply the necessary stability or throughput to deal with a huge number of transactions. There is, however, a way to integrate Blockchain into existing systems by creating hybrid Blockchain solutions.
Let us take a classical IT system with a couple of standard transactions. Let us say that there are a couple of transactions or system states that are required to be saved and validated in a high-trust and immutable way. Well, the natural way is to save them into the Blockchain. An example might be archiving a document into some kind of storage with the guarantee that the data cannot be changed or tampered with at a later point in time. The natural way is to calculate a hash (a fingerprint) for these documents and save it to the Blockchain. In this way the registered fingerprint of the document cannot be hacked or modified anymore, and in the future it can always be checked whether a certain version of the document matches the saved fingerprint.
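A minimal Python sketch of this fingerprinting idea; the save_to_blockchain parameter is a hypothetical placeholder for whatever anchoring mechanism (an OP_RETURN transaction, a smart contract call, etc.) would actually be used.

    import hashlib

    def fingerprint(document_bytes):
        """Calculate the SHA-256 fingerprint of a document."""
        return hashlib.sha256(document_bytes).hexdigest()

    def archive(document_bytes, save_to_blockchain):
        """Store only the fingerprint immutably; the document itself stays in normal storage."""
        digest = fingerprint(document_bytes)
        save_to_blockchain(digest)  # hypothetical anchoring call
        return digest

    def verify(document_bytes, stored_digest):
        """Later on: check that the archived document has not been modified."""
        return fingerprint(document_bytes) == stored_digest

    # Example with an in-memory "ledger" standing in for the real Blockchain.
    ledger = []
    digest = archive(b"contract v1", ledger.append)
    print(verify(b"contract v1", digest), verify(b"contract v2", digest))  # True False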

Trust as a Service, Immutability as a Service


Blockchain and Bitcoin technologies do not only provide a way to build a high-availability and high-trust system; several hybrid applications are possible as well. The Bitcoin Blockchain has several characteristics that could be delegated to other, non-Blockchain-based systems; such features are, for instance, the immutability and trustworthiness of the central ledger. For a non-Blockchain application that wants to save certain events in a way that they are highly trusted and immune to future changes, the Bitcoin Blockchain provides an excellent vehicle.

Wednesday, December 7, 2016

Notes on Botcoin

 
Considering the current trends of the Blockchain evolution, more and more coins are appearing that can be programmed in a special way. Given the possibility of programming and the basic economic rules, the question can be raised whether it is possible to implement intelligence in the coin itself. Let us call this fantasy coin Botcoin: on the one hand it works as a standard cryptocurrency, on the other hand it works as an intelligent agent that is responsible both for its own survival and for reproducing.

Tuesday, December 6, 2016

Decentralised Business Process Management


Slowly reaching the second machine age, cooperation and trust between different actors will be an ever-increasing problem. We already had difficulties when processes had to be set up involving several companies, but the situation will be much more challenging when several million more or less intelligent IoT devices start to communicate with each other.
Decentralised Business Process Management (or briefly DBPM) is a customisable process management platform to set up workflows among different untrusted parties, independently of whether these parties are companies, end-users, end-customers or intelligent gadgets of the IoT revolution. The processes are set up on top of a Blockchain; as a consequence, trust is evaluated by the decentralised consensus itself, making it possible to implement complex trust scenarios among many fundamentally different actors. DBPM provides a no-code or low-code way of configuring processes, meaning that most workflows can be clicked together with the help of a web-based tool. This implies, on the one hand, that end-customers without prior development or deep IT knowledge are able to customise processes on their own. On the other hand, consulting companies of different industrial fields are able to set up industry-specific process solutions as well, again without the need for coding or deep IT know-how.

Considering the current achievements and technological platforms of the Blockchain revolution, setting up a DBPM framework and the full business around it could be feasible with a couple of million dollars of financing.

Friday, December 2, 2016

Notes on the next generation of User Interface Design


It is interesting to see the development of user interface, or simply user experience, technology. While a couple of years ago it was dominated by the display technology itself, there seems to be a shift in the direction of artificial intelligence and machine learning. Honestly, the purpose of the user interface is not necessarily to provide a very fancy or complicated design, but to provide an easy and efficient way to communicate with computer services. In this sense it is not science fiction to expect that the next wave of user experience technologies will be 100% based on AI, for example on robo-advisors.

Monday, October 10, 2016

Notes on Business Productivity and DAO


Business productivity means somehow working more efficiently, in other words producing as much output in a given time as possible. Considered from a pure financial perspective, however, human resource is basically just a resource, and one that is actually pretty expensive and risky. We have seen the tendency for decades that human labour is being actively replaced by software and hardware components, and no one has ever complained from the investor side.
However, if there is really a tendency and a need to produce as much output for as little input as possible, then we might as well ask the question from the other side, similarly to how the question was asked by Ethereum and the DAO.
- Do we actually need humans as a work force for value creation?
- Which activities and organisation structures can be 100% replaced by algorithms?
- Which companies and company structures can be fully automated?
- How is it possible to invest in a fully automated company?

Monday, October 3, 2016

Notes on crowd buying

While crowdfunding is an interesting area that is getting more and more popular, a similar direction gets less attention: crowd-buying or social buying. Crowd buying practically means aggregating several buying requests into one thread, giving a much better position to negotiate and probably much better price and delivery conditions.

Tuesday, September 20, 2016

The next generation of Business Productivity == Machine Assisted Human Intelligence


Artificial intelligence research has been producing a couple of surprising results lately. As the target of most artificial intelligence research is to somehow copy or reproduce human thinking, the question is exciting, but much less studied, from the other direction: what are the limits of human thinking; where are the limits of human cognitive capabilities; what are the limits of human communication or collaboration; and how can these limits be overcome with the help of computers, algorithms or IT technology in general?



Figure 1. Positioning of machine assisted human intelligence.

I think the field is to be found in the intersection of three different areas:
1. Cognitive science, or generally psychology and sociology, should provide input regarding the limits of human communication and thinking.
2. Artificial intelligence probably provides the best toolkit for algorithmic and IT support of the whole area.
3. Last but not least, business productivity is the market: the area in which the results of the field should be positioned, like thinking faster, making better and faster business decisions, communicating better, or just generally achieving more with less effort.

Considering current trends in cloud computing and the fact that most providers, like Amazon or Microsoft, offer artificial intelligence as cloud solutions, it is not impossible that in a couple of years productive work will mean that you plug into a cloud service of a machine-assisted human intelligence platform.


Sunday, August 21, 2016

Notes on social as a service


Current social platforms, like Facebook, LinkedIn and WhatsApp, try to extend the original human social communication in some way. However, they seem pretty ad-hoc at first sight. What is somehow missing is the systematic analysis: these are the drawbacks and awkward points of human communication and social interaction, these are the points that can be done better with different IT systems, and as a result these are the systems that are available.

As at the end of the day all of these systems are extensions of human interaction, we might as well call the whole field Social as a Service.

Brainware Plugins or Brainware as a Service


Artificial intelligence has a long history of trying to achieve a computer program that can match humans in thinking, like passing the Turing test or beating humans in games like chess or Go. While this is certainly an ambitious research direction, there is another direction that is much more practical and probably more business-oriented as well. As opposed to creating thinking machines, it would be similarly exciting to systematically analyse the limits of human thinking and to focus on extending it with different kinds of IT support, like brain-computer interfaces, extended memory or additional external knowledge for unknown domains. Although there are already some achievements in this area, they seem to be rather island solutions; there does not seem to be a general platform for them. It would be more exciting to somehow mimic the mobile app platforms: provide a basic brainware-computer interface and the possibility to write custom brainware apps or custom brainware plugins on top. Considering that the platform would probably be supported by the cloud, we might as well call it Brainware as a Service.

Tuesday, August 16, 2016

Artificial Intelligence as a Service - Cognitive Science as a Service



Recent trends in cloud computing show the direction of integrating several artificial intelligence and machine learning tools into a cloud platform. From the Microsoft side, tools like Cognitive Services, Azure Machine Learning or Cortana Analytics provide machine learning and artificial intelligence in the cloud. Similar tools can be found at AWS, like Amazon Web Services Machine Learning. In this sense it makes sense to identify the whole area as Artificial Intelligence as a Service, or rather Cognitive Science as a Service, or just simply Machine Learning as a Service.

On the other hand, applications can be found as well that use intelligent cloud services to achieve certain domain-specific tasks, like intelligent threat analytics from Microsoft.

Wednesday, August 10, 2016

How not to invest in Cryptocurrency

Well, investing in cryptocurrency is pretty risky, so it is not a bad idea to know at least something about cryptocurrency before you invest. For that topic I would propose the following, perhaps rather funny, flowchart.


Tuesday, July 26, 2016

Notes on Business Productivity


From a purely economic point of view, business productivity is simple: human resource is simply too expensive and too risky. So business productivity software solutions aim to increase the speed of carrying out a task or to increase the general availability of a human resource. Examples are efficient collaboration and project systems, offline and home office availability, automated business processes and so on.

However, if we stay with the economic point of view, the cheaper and more effective solution is to fully replace the human resource with artificial intelligence software. In this sense the major question of business productivity is not how a certain activity can be delivered with more efficient business productivity software, but whether a certain activity still needs humans at all or whether it can be fully replaced by artificial intelligence.

It may sound shocking at first sight; however, the same thing has been happening to human physical labour over the last hundred years: everything that could be automated was automated.

Welcome to the second machine age.   

Sunday, July 17, 2016

Blockchain and the technology limits



Considering the Blockchain hype that will evolve over the next couple of years, one of the most important questions is where the limits of the Blockchain technology are. In other words, for which business scenarios does the technology make sense, and what are the typical examples where traditional client-server models should rather be used?

- Decentralised database: the Blockchain is actually a database that is stored, with all past changes, on all of the full nodes of the network. In this sense it is critical that the size of the data actually stored in the Blockchain is limited. Perhaps in the future there will be efficient mixed Blockchain - off-chain storage possibilities; until then, however, the Blockchain should be regarded as an extremely expensive storage, in which only a limited amount of highly sensitive data should be stored.

- Transactions: the state of the database is modified, or even read out, by several transactions from different actors.

- Trust: the central use case of the Blockchain is to guarantee trust between several different agents that normally would not trust each other. From the system perspective, both the trust in the state of the central database and the validity and order of the transactions should be guaranteed.

As a conclusion: in cases where a lot of data has to be stored, where there are not many cooperating actors, where there is trust outside the system as well, or where transactions do not really play a relevant role, rather classical client-server models should be evaluated.

Wednesday, July 6, 2016

Blockchain 2 Business, Blockchain 2 Customers


Considering Blockchain applications, it should be noted that the classical market segmentation looks different. We cannot really speak about B2B (Business to Business) or B2C (Business to Customer) solutions, as the Blockchain itself is usually not a company and is not run or operated by a company. Instead, these are rather community solutions, developed by a community, operated by individuals and usable practically by everyone. As a consequence, the old terms should be reinterpreted:

- B2C (Blockchain to Customer): Blockchain services for end-users.
- B2B (Blockchain to Business): Blockchain services for other companies.    

Notes on Blockchain privacy and private Blockchains


Blockchain applications like Bitcoin or Ethereum are highly secure by design, yet some of the data stored in the Blockchain is actually far from being private. Even in the Bitcoin Blockchain, practically every piece of transaction and account data is visible even with a simple browser, like the Bitcoin Block Explorer. That is certainly not always a desired property; professional Blockchain-to-Customer or Blockchain-to-Business services would require extended information privacy. Let us summarize some of the possibilities.

- Decentralised public ledger: this is the most basic model; all accounts and all transactions are publicly available, and all mining and validating nodes of the network run publicly as well.

- Private identity: all transactions and accounts are visible in the Blockchain; however, the identity behind an account is practically impossible to determine. There are attempts at such a mechanism in the Bitcoin protocol with the constantly newly generated addresses.

- Private transactions: this is a pretty difficult question. On the one hand, all transactions have to be validated by all nodes; on the other hand, it is a normal customer requirement that certain transactions should be fully analysable only by their owner and not by other third parties. Whether the two requirements are technically satisfiable at the same time is questionable.

- Private network nodes: processing and mining of information is not available to everyone; only a certain group has the privilege to do it.

- Private blockchain: the whole blockchain is not available to everyone and does not contain every transaction; only transactions of a certain application and a party of member companies are recorded, and certainly the information is visible only to this group.

The major question is certainly whether it makes sense to create such private blockchain applications, or whether it is better to use classical client-server models for such scenarios.



Tuesday, July 5, 2016

Cloud Storage versus Decentralised P2P Storage


Current trends in decentralized software development make new applications and platforms appear every day. One exciting direction is to build decentralized or P2P storage systems on top of existing Blockchain technologies, like SWARM on top of Ethereum. Certainly these technologies are pretty much in the experimental phase; still, it is interesting to evaluate which advantages or disadvantages a P2P storage system can have, for example compared with a classical cloud storage.

- Zero downtime: cloud systems surely have the high-availability characteristic. The same property can, however, be found in a P2P storage as well. Copies of a document or data are generally stored on a lot of nodes: if an adequate distribution algorithm is implemented to store copies based on the availability and geographical location of the nodes, then high availability can be guaranteed.

- Fault tolerance: the same is true for fault tolerance. Adequate distribution of the copies of different pieces of information on different nodes can realize a highly fault-tolerant system, similarly to a cloud storage.

- Scale up - scale down: from the point of view of scaling up or down, the two systems have more or less the same characteristics. One can always get some more storage with a couple of clicks, and one can free some storage just as easily.

- DDoS resistance: the cloud is more or less centralized, or at least based on several huge centralized data centres; as a consequence it is not as immune to a DDoS attack as a fully decentralized P2P network.

- Censorship resistance: possible censorship is a major characteristic even of a cloud storage system. As the cloud service itself is operated by a large company like Microsoft or Amazon, there is always an easy possibility for censorship. In contrast, in a P2P system a successful censorship requires at least 51%, or perhaps 100%, of the resources, which is economically pretty expensive.

- Security: security is a big issue for a P2P storage system. As our data is stored all over the world, the only way to provide a professional service is that the data itself is so strongly encrypted that even the hoster of a node cannot decrypt it. While it is theoretically possible to realize such a strong encryption mechanism, the question is whether accessing the data remains performant enough.

- Price: the second big question is the price of such a system, which can be influenced by two factors. On the one hand, a P2P storage can be more expensive than a cloud storage, as most of the data is stored in multiple copies. On the other hand, a P2P storage can be cheaper as well, as most of the resources are extremely cheap, meaning that the data is stored on resources that otherwise would not be utilized.

As a conclusion, I would say the major risk is the security - performance characteristic. If these two properties can be realized in a way that is comparable with a cloud storage, or at least acceptable to the market, then there is a chance that in a year or two decentralized P2P storage services will appear.


Friday, May 27, 2016

Decentralized Business Process Management?



The current boom of the Blockchain technology raises the interesting question whether certain IT technologies can be extended to a decentralized model. As an example, the question arises right away whether it makes sense to extend BPM (Business Process Management) to run on top of a Blockchain. The two points that need to be analysed are whether it is possible from a technological point of view and whether it makes sense from a business perspective.

Technical point of view: from a technical point of view, BPM can be regarded as an extended workflow, containing states of the workflow and transitions between the states that require more or less human interaction. States of the running workflow instances are usually serialized somewhere, for instance into a database. As a decentralized ledger can actually be regarded as a database, a decentralized BPM solution can be imagined as saving the workflow instance states into the ledger and cryptographically validating both the states and the transitions between the states, implying a decentralized BPM system (see the sketch after the following two points). The question is whether the scenario is usable in real life; at least two points should be considered:
a. The amount of data to be saved: most decentralized ledgers are not really prepared to store a huge amount of data. Certainly a workflow state is usually far from Big Data, but it is still questionable whether we would face difficulties.
b. Reaction time: most workflow systems do not require real-time reaction, but waiting for a hundred-percent verified answer, as for example in the Bitcoin protocol, might be too slow.
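As a minimal illustration of recording workflow state transitions in a ledger and validating them cryptographically, here is a small Python sketch; the in-memory list stands in for whatever Blockchain would actually be used, and all names are purely illustrative.

    import hashlib, json

    ledger = []  # stand-in for the decentralized ledger

    def record_transition(process_id, old_state, new_state):
        """Serialize a workflow state transition and append its fingerprint to the ledger."""
        entry = {
            "process": process_id,
            "from": old_state,
            "to": new_state,
            "prev": ledger[-1]["hash"] if ledger else None,  # chain to the previous entry
        }
        entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
        ledger.append(entry)
        return entry["hash"]

    def verify_chain():
        """Check that no recorded transition has been tampered with."""
        prev = None
        for entry in ledger:
            body = {k: v for k, v in entry.items() if k != "hash"}
            expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if entry["prev"] != prev or entry["hash"] != expected:
                return False
            prev = entry["hash"]
        return True

    # Example: two transitions of a purchase-order process.
    record_transition("po-42", "created", "approved")
    record_transition("po-42", "approved", "shipped")
    print(verify_chain())  # True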
Business point of view: so let us imagine that we have the technology, a ready-to-go decentralised BPM solution. The question is in which situations we can use it. Enterprise customers probably have their own business process technology, based on the good old client-server model. Small companies can have cheap internal BPM solutions, typically hosted in the cloud. The situation gets interesting if we consider cooperation processes of several companies, perhaps together with end-customers or even with non-human elements like IoT devices. In such a situation several independent "agents" work together to reach a common consensus on a workflow state or workflow action that is implemented as a "shared" business process. For such a shared process decentralised trust is actually essential, and the best way to achieve it is via the Blockchain technology.

Friday, May 20, 2016

Testing versus Monitoring? - any difference with managed cloud services?



"Software testing is an investigation conducted to provide stakeholders with information about the quality of the product or service under test" 

"Website monitoring is the process of testing and verifying that end-users can interact with a website or web application as expected." 

Software testing means checking the requirements while the software is being developed, whereas monitoring rather means checking whether the already developed software is still working. Although the two activities usually mean two totally different things, they seem to be pretty similar if we consider cloud service integration. As we practically use ready-to-go building blocks, the checks that we use to verify that we developed a correct software can also be used to monitor whether the ready application is still working. In this sense the two processes should not be two separate things, but actually one common platform that guarantees software quality both during development and while operating the application.

An idea for such a common platform might be the following:

Let CTEST = {C1, C2, ..., CN} be the set of checks that are used during software testing.
Let CMON = {CI, CI+1, ..., CM} ⊂ CTEST be the subset of checks that will also be used for monitoring purposes in the future.

Certainly the major question is how we can define checks and a common framework that can be used both for testing and for monitoring.
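A minimal Python sketch of such a common check framework, following the CTEST / CMON notation above; the service client object and the concrete checks are purely illustrative assumptions.

    # CHECKS plays the role of C_TEST: the full list of checks run during testing;
    # the "monitor" flag marks the C_MON subset that keeps running in production.
    def check_homepage(service):
        return service.get("/").status == 200

    def check_report(service):
        return "total" in service.get("/report").body

    CHECKS = [
        {"name": "homepage reachable", "run": check_homepage, "monitor": True},
        {"name": "report contains totals", "run": check_report, "monitor": False},
    ]

    def run_checks(service, monitoring=False):
        """Run every check while testing, but only the C_MON subset while monitoring."""
        selected = [c for c in CHECKS if not monitoring or c["monitor"]]
        return {c["name"]: bool(c["run"](service)) for c in selected}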



   

Robots -> Softbots -> Corpbots.... Notes on automated cash-flow systems




ROBOT - "A robot is a mechanical or virtual artificial agent, usually an electro-mechanical machine that is guided by a computer program or electronic circuitry."

SOFTBOT - "In computer science, a software agent is a computer program that acts for a user or other program"

Well, the question is what the next step of the evolution is. Combining the current trends of Blockchain and artificial intelligence technology, there is a possibility to create automatic cash-flow systems that practically run 100% without people, having both the value generation and the administrative tasks fully automated. So let us call such an automated cash-flow system, or rather fully automated corporation, a Corpbot.

CORPBOT - A Corpbot is a fully automated virtual corporation in which both the value-adding processes and the management and administrative tasks are executed by computer algorithms.

Well, it does not exist yet; however, it is expected to appear pretty fast....

22.05.2016 - BREAKING: it exists and the name is DAO. - Forbes


Tuesday, May 17, 2016

Software as a Service in the cloud à la Azure



Well, the Azure software-as-a-service portfolio seems to be booming; one finds new services and solutions every day. Examples are, for instance, PowerApps for mobile application development, Flow as a rule engine, Microsoft Forms, Power BI for Big Data, GigJam for data visualization, Sway for visualization as well, and who knows what comes next week.

To analyse this chaos, we try to categorise these services along two dimensions:

1. Targeted customer segment: sometimes it is pretty difficult to identify at first which customer segment is targeted by an Azure tool. Is it a tool that is planned for end-customers, is it rather a tool for small businesses, or is the customer segment really the enterprise? As an example, Sway seems to target a real end-customer segment: create your content, share it with your friends, be happy. On the other hand, tools like PowerApps or Power BI are planned rather for a business segment.

2. Possibility for customization or integration: the second dimension that should be considered is the possibility of customization or integration. Some tools, like Sway, are rather out-of-the-box products, without the possibility to customize too many things or to integrate with other SaaS solutions. Other services, like Azure Machine Learning, provide quite a lot of possibilities to customize, with or without scripts, or to create integrations via REST API with other Azure services. In this sense they share many characteristics with Platform-as-a-Service solutions; perhaps Framework as a Service would be an adequate name.

Certainly, the whole area is changing pretty fast, so not only the current positioning of a service is important, but the future direction as well. As an example, the mobile app development framework Project Siena was focusing on the end-customer segment; it has since been developed further for the enterprise segment and renamed PowerApps.

We try to summarize some of the SaaS products in the following picture. Certainly, the exact characterization of a certain Azure service is based only on our quick, subjective analysis.


Figure 1. Azure SaaS Portfolio Analysis.


Monday, May 9, 2016

Blockchain for everyone: is it time to have no-code application development platforms on blockchain?


Well, on the one hand, the blockchain technology seems to be entering a hype phase, implying new start-ups with new ideas every month and a lot of turbulence in the media and press. As the technology is, however, disruptive, meaning that it creates new markets and destroys old ones, it is pretty difficult to predict what kind of business models or ideas can be successfully implemented with decentralized ledgers and which do not have a chance. I guess it requires an extensive experimental phase in the next couple of years, implying a lot of small, rather proof-of-concept and prototype projects.

On the other hand, the whole area looks like the wild west, so finding a professional blockchain developer for each experimental project will not be so easy. First of all, the field is pretty huge from a technological point of view: bitcoin, altcoins, Ethereum, Hyperledger... and who knows what comes next week. Secondly, developers who really have experience in one field are probably already a hundred percent involved in one or more hot start-ups. Thirdly, there is nothing like an official developer training or certification for blockchain developers. As a consequence, it can be predicted that there is going to be a relatively big demand for blockchain developers in the near future that cannot really be satisfied.

A possible solution can be low-code or no-code development platforms on top of the blockchain. No-code or low-code platforms generally provide the possibility for IT power users or business users to build applications pretty fast without deep coding knowledge. These platforms are usually not Turing complete, meaning not general enough to build every possible application, but they provide a way to build a set of typical ones fast and efficiently. So, typical proof-of-concept or experimental blockchain applications could be implemented directly by power or business users.

Certainly, the major question is where the focus should be set. One way is to take a general blockchain development platform, like Ethereum, and build a no-code or low-code framework on top. Such a solution would provide strong integration with the blockchain technology; however, integration with other technologies, like mobile apps, storage or data, might be difficult. The other way might be to take a classical low-code development platform, like K2, and extend it for a specific blockchain field, e.g. by integrating the services of the bitcoin API. That would provide all the advantages of the existing low-code platform, but it would rather be an interface integration without the decentralized philosophy.

Friday, May 6, 2016

Job Interview versus Pitching

Well, actually there is only one huge difference between a job interview and a business pitch: the sum that you communicate at the end....

Wednesday, May 4, 2016

The future of SharePoint == Cloud Microservice Integration and Development for Business Productivity

The future of SharePoint == Cloud Microservice Integration and Development for Business Productivity
...
The future of SharePoint == Cloud Microservice Integration and Development for Business Productivity
...
The future of SharePoint == Cloud Microservice Integration and Development for Business Productivity
...
The future of SharePoint == Cloud Microservice Integration and Development for Business Productivity
...

Monday, May 2, 2016

Low-code application delivery in the cloud à la Microsoft


Current trends in Microsoft cloud services show a pretty strong tendency towards low-code application delivery solutions. As most of these solutions are pretty much in the beta phase, it is difficult to predict exactly how they will look in a year or two, but it can easily be imagined that a complex low-code application delivery ecosystem will be realized.

Parts of the ecosystem might be the following:
 - PowerApps as a mobile application building framework.
 - Flow as a rule engine: https://flow.microsoft.com/en-us/
 - Office 365 for web publishing and corporate collaboration
 - Office Forms as a form engine: https://forms.office.com/
 - Azure Machine learning for AI and data mining:
    https://azure.microsoft.com/en-us/services/machine-learning/
 -  ...?

Of course the question is what the requirements are for a real enterprise-ready low-code platform. I think there are at least two:
 1. Integration: there has to be a very good integration possibility both between the above-mentioned solutions and with other parts of the Azure cloud infrastructure as well. As an example, the same Flow rule should be usable both with a mobile application and with web publishing; similarly, general infrastructure elements of Azure, like connecting corporate data with the cloud, should also be available (AppFabric, VPN...).
 2. Extensions: if a solution reaches its architectural limits, there has to be a way to extend it with hard-core coding elements, like Visual Studio and Xamarin for mobile apps, and create professional solutions.

Application delivery for such an ecosystem can be realized in two steps:
  a. Low-code application delivery: this phase provides the possibility for power users to build up an environment on their own, or provides an opportunity for partner companies for consulting and training. The first step of a whole application delivery, like a proof of concept or prototyping, can be realized here as well.
  b. Hard-core development: real development is needed if the framework does not provide enough possibilities for certain requirements, so further use cases have to be realized by hard-core software development and project management.

Building up consulting and development services based on the technology might contain the following phases:
 i. Beta technology: while the technology is in the beta phase, it is not very realistic to make business on the market. However, there is the possibility to capitalize on the first-mover advantage, positioning on the market with strong marketing, for example writing blogs, articles, presentations and case studies, or even doing indirect partner marketing with the provider (Microsoft).
 ii. Early phase: in the early phase of the technology it is expected that everything changes very fast and some integration and extension methods are not carefully designed, so the whole platform is not very stable. As a consequence, rather consulting and training business, or smaller development projects, are expected.
 iii. Performing phase: as the platform becomes more stable, less innovative and slower-changing, full-scale extension and development projects are expected as well, like classical development projects with off-shoring, project management and so on.

Certainly the major question is whether the ecosystem is capable of reaching an enterprise-ready and strongly performing state, or whether it remains just a couple of innovative island solutions.




Sunday, April 24, 2016

Notes on offline availability measures...


Considering the wider spread of cloud solutions nowadays and the sometimes still limited possibility of full-scale online availability, it is important to define whether a certain solution or software is available offline as well. Therefore we define the following terminology.

Let s be a certain cloud solution or software. Let Fall(s) be the number of all functionalities or services provided by the cloud solution s. Let Foffline(s) be the number of functionalities or services that are still available even if the cloud solution is offline. So we can define the offline capability of a cloud service as:

Coffline(s) = Foffline(s) / Fall(s)

that is, the number of services that are available offline divided by the total number of services.

Similarly, we can define the amount of time for which the cloud service can remain offline but keep working, as the offline availability of the cloud service:

Aoffline(s) = the amount of time for which the cloud service can continue to operate while offline...
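A trivial Python sketch of the Coffline metric under the definitions above; the feature lists are made-up examples.

    def offline_capability(all_features, offline_features):
        """C_offline(s) = F_offline(s) / F_all(s)."""
        return len(offline_features) / len(all_features)

    # Made-up example: 2 of 4 features still work without a connection.
    features = ["edit document", "share link", "full-text search", "view cached copy"]
    offline = ["edit document", "view cached copy"]
    print(offline_capability(features, offline))  # 0.5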