
Jürgen Renn, Towards a Web of Culture and Science,

a contribution to the World Summit on the Information Society
in Geneva 2003


Excellencies, ladies and gentlemen,
it is a great honor to address you, and I would like to thank the organizers for the opportunity to present to you an open-access initiative by the German Max Planck Society, the French CNRS, and other major international science and education organizations.
In the following, I will describe the crisis which culture and science are facing in what is still the beginning of the Internet Age.
I will present a vision of how to overcome this crisis, and I will indicate steps towards its implementation, in particular the Berlin declaration on open access to science and culture, signed by many international organizations in Berlin.
Let me begin by describing the crisis, turning first to the crisis of culture on the Web. It has two dimensions.
The medium of today and tomorrow, the Internet, might leave behind a culture which is the heritage of our past but is urgently needed to meet the challenges of the future. This cultural heritage is presently in danger of being left behind, of missing the boat of the rapid technological developments carrying us into a new information age.
The bulk of the information which forms the core of cultural heritage is largely excluded from the information system constituting the backbone of an ever more knowledge-based world.
The great works of art and literature, the multitude of languages of this world, traditions sometimes reaching back over millennia, the treasures of scientific, scholarly, and philosophical writings going back to the dawn of our civilization are not being transferred to the new medium as substantially as is necessary for their preservation, in view of wars and dwindling public funds menacing them with rapid degradation.
The deficit in the extent to which cultural information is available on the Web is accompanied by the underdevelopment of cultural techniques adequate to the new technologies.
Reading, writing, and calculation, the traditional cultural techniques, have to be complemented by new cultural techniques allowing every single individual to optimally exploit the Internet as a representation of the collective human knowledge.
The question is, of course, how to overcome this crisis which menaces the link between our past and our future.
Before coming to answers, let me turn to the second major crisis of this transitional period, the crisis of science.
It is most visible in the rising journal prices which effectively make science ever more inaccessible, to developing countries, for instance, but more generally to all those who have produced scientific knowledge, mostly with public funds. Scientific organizations are in fact forced to repurchase the information they produced in the first place.
But effectively, the so-called journals crisis amounts to a complete breakdown of the traditional division of labor in the information circuit.
According to this traditional model, research results are produced by scientists. This is and will remain the most cost-intensive element of the information circuit.
The results of research are disseminated by publishers and archived by libraries.
Information is filtered by a process of evaluation performed by scientists (peers) and organized by publishers.
Only that which survives this filtering process is being disseminated.
This well-established traditional system is now endangered by technological changes with radical consequences.
Even within the system of printed information, these technological changes are felt in the rising prices charged by publishers for dissemination, which scientific organizations are no longer able to cover and which dramatically increase the divide between industrialized and developing countries with regard to the availability of scientific information.
The information revolution has radically changed the technical and economic basis for maintaining the scientific information flow. This radical change is evident from the as yet unexploited potential of the Web for scientific communication.
Dissemination is no longer a cost-intensive component. It can, in principle, be handled by scientists without the services of the publishers.
In the electronic medium, evaluation follows and does not precede dissemination. It no longer has to amount to a simple in/out decision about publication.
There is no longer any reason to preclude access to the information hinterland, to observational and experimental data, software tools, or historical sources, which presently only serve as a logistic background for published research results. Making such additional information available will help to ensure the reliability of scientific information, to broaden the scope of available resources, and to avoid the duplication of effort. Moreover, the Web offers completely new forms of scholarly publication, ranging from digital libraries of cuneiform tablets to entries in biological databases.
The new medium could facilitate and improve the quality of the selection process. The immediacy and, in principle, unrestricted scope of electronic dissemination increase the likelihood of rapid responses, distinguishing valuable from non-valuable contributions.
Against the background of this impressive potential of the Web for scientific communication, it becomes particularly evident what is wrong with the present system of scientific dissemination, dominated by publishers with a quasi-monopolistic status:
As I have mentioned, there are, first of all, the increasing costs of scholarly journals, which absorb capital urgently needed to build up a more adequate and efficient infrastructure for scientific dissemination, and which thus amount to a waste of public money.
Then there are the commercial barriers to the connectivity of knowledge, enforcing a fragmented landscape of information islands rather than fostering the development of a global representation of human knowledge, constituted of interoperable contributions by all players.
It is also important not to forget that publishers do not offer a guarantee for the long-term archiving of information, again a challenge that remains with public institutions.
As I have mentioned before, we also lack an adequate access and retrieval infrastructure corresponding to the needs of scientists and educators.
And finally, mapping the traditional commercial publication system into the new medium perpetuates the digital divide in science. In fact, simply creating a mirror image on the Web of the traditional system amounts to erecting an artificial boundary cutting off developing countries from the scientific information flow.
Let us now turn to approaches towards a solution of the double crisis of culture and science on the Web.
The two standard solutions are the big player solution and the scout solution. Both have failed to create an adequate infrastructure fostering the much-needed dynamic of transferring scientific and cultural content from the old medium to the new.
The big player solution is most familiar from the present debates on electronic journals where a few publishers use their near monopoly to erect new barriers of accessibility.
But for the digital availability of cultural heritage, the situation is perhaps even more problematic, resembling a gold rush where everybody tries to stake out claims.
The big players have in fact long since begun to secure exclusive rights on the reproduction of cultural artefacts, be they manuscripts of Leonardo da Vinci or representations of traditional cultures.
But in spite of their eagerness to control large domains of cultural heritage, the big players have so far failed to create an infrastructure that guarantees a steady and reliable flow of content from the old media into the new, an infrastructure offering equitable access to all nations and people, often deprived of their heritage by the pitfalls of history.
The scout solution, on the other hand, is based on the assumption that the transfer of cultural and scientific content to the new medium can essentially be achieved by pilot ventures.
It amounts to the realization that bringing culture and science to the Internet means settling a new continent, rather than just exploiting its resources in a gold rush.
But it also amounts to the problematic assumption that this can be done by merely sending out a few scouts to survey the new territory.
As a matter of fact, the scout solution, too, has largely failed to launch a self-sustaining dynamic of culture and science on the Web.
The right solution to the double crisis can only be found if we have a vision.
The vision I would like to sketch here is that of a Web of Culture and Science.
This is a vision concerning both the enrichment of the Web with content and its future technological development, hopefully turning it into a global and accessible representation of human knowledge.
To create a self-sustaining dynamic enriching the Web with meaningful content represented in adequate structures, we need a support program for open access aimed at building up a technical and social infrastructure. Such a support program is the core of what we have called the Agora solution, in analogy to the institution of ancient Greece where the common good emerged from the contributions of all citizens.
To create the tools which make it possible to adequately exploit this content for science and education, we need the development of a semantic Web allowing future users to truly interact with the content they find.
Let me first turn to the Agora solution, which aims at building up an infrastructure turning the consumers of the Web ever more into producers.
In fact, we urgently need support for creating an open-access infrastructure that makes resources freely available online with little effort and in a way that guarantees interoperability with other content and tools, thus creating an added value for every user.
I have no time to discuss why and how we have to go beyond the vision of the semantic Web as it is presently discussed. Let me say only this much:
The future transformation of the Web will be driven not only by technical issues of speed and bandwidth but by innovative usage scenarios, just as was the case when the Web itself was invented here in Geneva, at CERN.
Making the Web more democratic, for instance, will also create a technological drive from the client-server asymmetry to peer-to-peer interactions, from browsers used by essentially passive clients to knowledge weavers used by active citizens.
Let me conclude with some words on the implementation of this vision, coming back to the Berlin declaration.
The Berlin declaration was signed in October 2003 by major national and international governmental, scientific, cultural, and educational organizations.
They consider their mission only half complete if the information they produce is not made freely available to society.
Otherwise, in their view, science is simply unable to reveal its full impact, so that investments in science fail to yield the returns they could in principle attain.
Let me quote from the Berlin declaration:
In order to realize the vision of a global and accessible representation of knowledge, the future Web has to be sustainable, interactive, and transparent. Content and software tools must be openly accessible and compatible.
Our organizations are interested in the further promotion of the new open-access paradigm to gain the most benefit for science and society.
The Berlin declaration also recommends specific measures for implementing the open-access paradigm.
Scientists are being encouraged to publish their work according to the principles of the open-access paradigm.
The holders of cultural heritage are encouraged to support open access by providing their resources on the Internet.
As for the humanities, my own field, the Berlin declaration has been a great encouragement for many archives, museums, libraries, and research institutions to make their contents freely available.
It was, however, also crucial that we have been able, within the context of the ECHO Initiative, to offer such institutions an open-access infrastructure in the spirit of the Agora solution, helping these institutions to overcome the competence and technology thresholds separating them from the Web. The infrastructure built up by the ECHO (European Cultural Heritage Online) Initiative allows for the web-based collaboration on images and texts and automatically creates, for instance, links from any text embedded in the infrastructure to dictionaries for a variety of languages ranging from Ancient Greek to Chinese.
In the context of implementing the Berlin declaration, the Max Planck Society has in fact followed a double strategy: fostering access to electronic information in the traditional journal format, on the one hand, and developing, with the support of a newly founded innovation center, innovative models of open-access electronic dissemination, on the other.
Let me briefly comment on the present signatories of the Berlin declaration. On the German side, it has been signed not only by the Max Planck Society but also by all major research agencies associated with the Max Planck Society in the so-called Alliance, such as the German Research Foundation, the Fraunhofer Society, the Leibniz and Helmholtz Associations, the German Science Council, and the Association of Universities. Taken together, they organize and fund the lion's share of German basic and applied research.
The Berlin declaration has also been signed by the Berlin-Brandenburg Academy, one of the national galleries, and the German Library Association. All of these institutions have been pressed by the increasingly scarce funds for science and culture to use their resources as effectively as possible and to regain control over the knowledge they produce.
On the international level, the Berlin declaration has been signed by the French CNRS, about which we will hear more shortly, by the National Hellenic Research Foundation, as well as by other major research funding and governmental organizations from Belgium, Spain, Austria, Norway, Italy, Hungary, and likely soon Croatia. It has furthermore been signed by transnational organizations such as the Academia Europaea.
The core text of the Berlin declaration has been closely coordinated with the American Bethesda group, representing major research organizations in the US such as the Howard Hughes Medical Institute, the National Institutes of Health, and the University of California.
An international follow-up conference in 2004 will be staged together with the Bethesda initiative to achieve a closer coordination between all players involved.
This brings me to my conclusion, the next steps. What are the next steps? Very easy: you can join the Berlin process and thus help pave the way to the science of the future, which will have to be based on the open-access paradigm if we want to exploit our scientific and cultural resources as effectively as possible to meet the global challenges of humankind.
Governments, universities, research institutions, funding agencies, foundations, libraries, museums, archives, learned societies and professional associations are invited to join the present signatories.
If you wish to do so, please contact the President of the Max Planck Society, Prof. Gruss, who has offered to coordinate the process.
For further information, also about contact addresses, either consult the website of the Max Planck Society or the brochure we have brought with us.
Thank you for your attention.