Background

Most current transactions and interactions, in business as well as in leisure, are mediated by computers and computer networks. From email to virtual worlds, the way people work and enjoy their free time has changed dramatically in less than a generation. This change has led IT research and development to focus on aspects such as new human-computer interfaces and enhanced routing and network management tools. The biggest impact, however, has been on the way applications are conceived and developed. These applications require components to which increasingly complex tasks can be delegated, components that exhibit higher levels of intelligence, and components capable of sophisticated forms of interaction, as they are massively distributed and sometimes embedded in all sorts of appliances and sensors. These autonomous components are usually termed ‘agents’ to stress their capability of representing human interests while being autonomous and socially aware.
These trends in software development have motivated a number of research initiatives in Europe and the USA in recent years. The one most closely related to our goals was the Global Computing initiative (GCI), launched in 2001 within the IST FET Programme. The vision of that call, also present in the current Global Computing II (GCII) initiative, was to focus research on large-scale open distributed systems: a timely vision given the exponential growth of the Internet, the attention generated in the media and in scientific fora by international initiatives such as the Semantic Web and IBM's autonomic computing concepts, and the peak of Napster usage in 2001 with more than 25 million users. Most projects were highly interdisciplinary in nature, with a large number of groups from theoretical computer science, agents, networks, and databases working together fruitfully.
The focus of GCI was on three main topics: analysis of systems and security, languages and programming environments, and foundations of networks and large distributed systems. Along these lines, GCI projects dealt with formal techniques, mobility, distribution, security, trust, algorithms, and dynamics. The scope was ambitious and foundational, taking an abstract view of computation at a global level, with the computational Grid and the telephone network as particular examples. Both functional and non-functional (e.g. Quality of Service) properties had to be studied. The focus of GCII shifted towards issues that would help in the actual deployment of such large applications, namely security, resource management, scalability, and distribution transparency.
Other initiatives for large distributed systems (although not necessarily open in our sense) include P2P systems, where nodes in a graph act as both clients and servers and share a common ontology that permits easy bootstrapping and scalability, and Grid applications, where the nodes in a graph share and exchange resources to complete a complex task. The Semantic Web initiative, which has received substantial funding in the EU and the USA, is generating standards for ontology definition and tools for the automatic annotation of web resources with meta-data. The Semantic Web is growing at a fast pace (10 million documents with meta-data by the end of 2006). Furthermore, the availability of applications as web services has enabled an approach to building complex systems by combining already available services; annotating these services with standards such as WSDL or BPEL permits the automatic orchestration of solutions to complex tasks. Efforts to combine Semantic Web and Web services standards (SA-WSDL, the SEE TC) are currently underway within standardization bodies such as the W3C and OASIS. Finally, a strongly social approach to developing new web applications is at the heart of the Web 2.0 movement (wikis, Flickr, blogs).