The Grid provides, or at least at the moment promises, a design and user-interface web technology that changes based on the average usage pattern of a site. Usage is measured and analysed by various artificial-intelligence and data-mining algorithms, and a new design and a new site structure are proposed automatically.
At first the idea seems like science fiction, but it is actually not so wild. Basically, every search-based user interface and application follows the same pattern. Certainly, in those cases it is not the whole user interface that is restructured, only a part of it, and the change mostly concerns content rather than structure. As an example, consider a news portal that automatically shows information selected from different sources by various criteria. Although that field is usually regarded as search rather than data mining, the pattern is much the same:
1. collect data
2. analyse data
3. show something different on the user interface.
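To make the pattern concrete, here is a minimal sketch of the three steps for the news-portal example. The `PageView` type and the function names are invented for illustration; a real system would plug in its own analytics store and rendering layer.

```typescript
// Minimal sketch of the collect -> analyse -> adapt pattern.
// All names here (PageView, mostVisitedSections, renderFrontPage)
// are hypothetical placeholders.

interface PageView {
  section: string;   // e.g. "sports", "politics"
  timestamp: number; // Unix epoch milliseconds
}

// 1. collect data: in practice this would come from an analytics log.
const views: PageView[] = [
  { section: "sports", timestamp: 1700000000000 },
  { section: "politics", timestamp: 1700000001000 },
  { section: "sports", timestamp: 1700000002000 },
];

// 2. analyse data: count visits per section and rank them.
function mostVisitedSections(log: PageView[]): string[] {
  const counts = new Map<string, number>();
  for (const v of log) {
    counts.set(v.section, (counts.get(v.section) ?? 0) + 1);
  }
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([section]) => section);
}

// 3. show something different on the user interface:
// here, simply reorder the front page by popularity.
function renderFrontPage(sections: string[]): void {
  console.log("Front page order:", sections.join(" > "));
}

renderFrontPage(mostVisitedSections(views));
```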
The two differences are:
a. The field is regarded as search rather than data mining or artificial intelligence: this does not seem to be a huge difference. Most search and indexing algorithms show some similarity to data-mining ones; in fact, the two fields largely overlapped only a couple of years ago.
b. In search-based applications, only a small area of the user interface is changed dynamically, such as a search result or a filter result. This is not necessarily a big difference either: technically, the whole user interface could be reconstructed. Whether that makes sense from an end-user point of view is certainly a more difficult question.
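The contrast in (b) can be sketched in code as well. The `Layout` type and the `pickLayout` heuristic below are invented for illustration only: the first function updates a single region, the way a search page does, while the second regenerates the whole layout from usage data, the way a Grid-style system might.

```typescript
// Hedged sketch: updating one region of a page versus
// restructuring the whole layout from usage statistics.

type Widget = { id: string; html: string };
type Layout = Widget[];

// Search-style update: only the results widget changes.
function updateSearchResults(layout: Layout, resultsHtml: string): Layout {
  return layout.map(w =>
    w.id === "results" ? { ...w, html: resultsHtml } : w
  );
}

// Full restructuring: sort widgets so the most used ones come first.
// clicksPerWidget would come from the same analytics pipeline
// as in the previous sketch.
function pickLayout(layout: Layout, clicksPerWidget: Map<string, number>): Layout {
  return [...layout].sort(
    (a, b) => (clicksPerWidget.get(b.id) ?? 0) - (clicksPerWidget.get(a.id) ?? 0)
  );
}

const page: Layout = [
  { id: "nav", html: "<nav>...</nav>" },
  { id: "results", html: "<ul></ul>" },
  { id: "ads", html: "<aside>...</aside>" },
];

console.log(updateSearchResults(page, "<ul><li>hit</li></ul>"));
console.log(pickLayout(page, new Map([["results", 42], ["ads", 3]])));
```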