Thesis - Open Technologies for an Open World
Open Standards, Open Source, Open Mind

 

The document's objectives, structure and an avant-goût of the basic concepts.

Introduction


Foreword

· Motivations

"Sometimes dreams are all that separate us from the machines" - Dan Simmons in The Fall of Hyperion

"Sometimes (…) the shortest route to courage is absolute ignorance" - Dan Simmons in Endymion


Recently we have heard a lot about Linux. Several discussions have been held in professional circles, and the media are carrying the message to the general public. Linux can be considered the flagship open source product, and the one responsible for this public awareness. However, the concepts behind open source must also be explained, and - beyond this - the existence of open standards and protocols must be understood. The comparison between "open" and proprietary software should not be limited to cost, nor be influenced by the impression that open means free. What is the importance of this understanding for multimedia project managers?

There is a widespread assumption that the computer infrastructure can remain transparent during Internet and intranet implementations, in phases like design, project management and decision-making. Throughout this document, I aim to demonstrate the importance of a general understanding of the hardware, software and communication infrastructure, as a way to improve the overall quality of the project and of the decision-making process. The first battle between open and proprietary solutions will take place on the technological field. And this is only the beginning of the war.

The knowledge acquired when studying the impact of standards and open source on the Internet infrastructure may be used to understand the consequences at other levels, like the design, the development and the exchange of structured information. Intensive research has been performed in the last decade, and the resulting solutions are now ready to be deployed. It is time to participate in the dialogues, discussions and analyses that will shape the standards of the future. It is essential to understand the social and economic rationales behind open and proprietary solutions, and how the political world can help to build a technological future.

This is an ambitious thought, I agree. Nevertheless, the ongoing social movements built on the peer-to-peer concept are showing the power of tightly coupled, motivated people. Linux is there to prove it.

· Sources

I have tried to be eclectic, using books and syllabi as the basic references for the technological, economic and social analysis; my two-year collection of Datanews magazines to analyse the current evolution of the Belgian and European market and political environment; and seminars and the Internet to complete the evaluation from a global and up-to-date standpoint. Sources are always fully referenced in the footnotes, with complete details given in Appendix D.


· Method

Many different technological aspects are discussed in this document. At first sight, it may look like a collection of superficial investigations of unrelated topics. The aim, however, is to discuss open solutions in a very broad scope. I have therefore studied every information technology aspect that is both important to the development of multimedia applications and currently subject to interesting questions about the choice between open and proprietary alternatives. A brief explanation of the technology itself is given at the beginning of each chapter and section, so that this document can be read without prerequisites. The focus of the discussion is always on open source and open standards. To allow the document to be read in modules, or browsed non-sequentially, conclusions are usually drawn at the end of each chapter and section.

· The light side of the moon

There are two ways of reading this thesis. The shorter and more entertaining way is to go directly to the online conclusion (http://www.k-binder.be/INCA/), play around with the suggested architecture for the future, and then come back to the document and read only the topics that really interest and attract you. Then disagree with my standpoint and tell me why: by using the website to exchange opinions, readers can hold an open discussion around the topics covered in this document, which can serve as a basis for the effective design of the new architecture.

The second way is to continue reading this document.

· Acknowledgements

Many thanks to my wife Joyce for her strong support during the long research evenings, nights and weekends. Big thanks to my sister Mariana, who helped me stay focused enough to finish this work. The open spirit of my daughter and of my family, from the cradle onwards, was also fundamental to my motivation.

Special thanks to Professor Jean-Luc Vidick for helping to structure this document in its final form and to complete it with important topics about extreme programming; to Professor Michel Bauwens for the interesting discussions that helped me to find the target subjects and to compose the bibliography; to Professor Attila Darabos for the final revision; and to Professor Pierre Rummens for his remarks about the research methods and the document's formats and structures.

I hope you enjoy the reading, and if you feel motivated by my ideas, please feel free to help me improve this document.

1. Introduction

1.1. Structure
"Let's say there are four steps. Four stages. Four levels. The first Is learning the language of the dead, the second is to learn the language of the living, the third is hearing the music of the spheres. The fourth step is learning to take the first step." - Dan Simmons in The Fall of Hyperion

Standardization is an important point to consider in the evolution of the different technologies, and so is their openness. Openness gives developers the possibility to understand the infrastructure requirements, to create compatible products, to participate in their evolution, and to innovate. It is best represented in the source code, the "alma mater" of every computer program. We need to understand "Open Source" as the concept created by computer programmers united under the hacker ethic. However, we should not be limited by it: we shall go one abstraction level further to define the "Open" concept, by also analysing Open Systems, Open Standards and Open Platforms. The first part of the analysis is divided into three tiers, which together compose the realm of IT and the Internet today: the infrastructure, the development and the information exchange (see figure below).


Figure 1 - The three tiers under the scope of the first part of this analysis

Let us start with the bottom tier and discuss, in the first chapter, the hardware and the operating systems used to build the Internet infrastructure. Different alternatives are investigated, trying to understand their original goals, their evolution, their current situation and the tendencies. The analysis always focuses on the applicability of the platforms as servers taking part in the Internet infrastructure. And what is the base of the Internet? Standards. They have participated in the spread of the Net from the very beginning, and so they do in this document. The goal is to understand what the different hardware platforms and software environments are, their concepts, history and targets, and how they became standards. We will also analyse what benefits we may expect from the open environments, together with the level of openness available from the common proprietary alternatives. We conclude with the role of standards in the creation of the network protocols used worldwide, which were essential to the success of the Internet.

In the second tier - development - we consider the ongoing battle for a common web platform and the role of web services. This makes an excellent case study, as we have three different warriors: an open source solution, an open platform, and a proprietary solution. Additionally, we will analyse the usage of common rules to exchange information about the design of applications and data, with the goal of technology independence: MDA and UML. We also briefly discuss the extreme programming concepts and their relation to open source products and projects.

The analysis ends with the third tier - information exchange - by exploring the benefits of a common language for exchanging information and data definitions, coupled and structured. This is done with XML and its derivatives, as the sketch below illustrates.
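To give a concrete flavour of this coupling, consider the minimal sketch below: an XML fragment in which the data travels together with the labels that describe it, read back by a few lines of Python. The sketch is purely illustrative, and the element names are invented for the example.

    # Illustrative sketch only: the XML fragment carries both the values and
    # the structure that describes them; the element names are invented.
    import xml.etree.ElementTree as ET

    fragment = """
    <catalogue>
      <product code="INCA-001">
        <name>Open architecture handbook</name>
        <price currency="EUR">25.00</price>
      </product>
    </catalogue>
    """

    root = ET.fromstring(fragment)
    for product in root.findall("product"):
        print(product.get("code"),
              product.findtext("name"),
              product.findtext("price"),
              product.find("price").get("currency"))

Because the rules of the format are public, the receiving program needs no private agreement with the sender in order to interpret the fragment: any XML-aware tool can do it.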

The aim of the second part is to analyse these technologies from a broader standpoint. We should then be able to draw some conclusions and imagine future trends and proposals. Using the cases detailed in the first part, we will analyse some of the basic concepts of the network society and the new economy, trying to understand the impact of standards on the usage of technological power, the strategies currently used by enterprises, and the comparative elements between open and proprietary solutions. Finally, we will discuss the role of politics in enforcing or stimulating the usage of open technology.

This study contains background material that introduces some important topics for readers who are not familiar with them. References are provided for those who want a more complete understanding. As an avant-goût, here are the basic concepts used throughout the rest of the document.

1.2. Open

To define the general concept of Open, under the scope of this study, let us consider two definitions, with the help of the dictionary:

Open
1: having no enclosing or confining barrier: accessible on all or nearly all sides
5: not restricted to a particular group or category of participants

Based on this, we can consider as "Open" any product, concept, idea or standard:

§ That is freely available to be researched, investigated, analysed and used, with respect for intellectual property;

§ That can be adapted, completed and updated by any person or company interested in improving its quality or functionality, possibly coordinated by a person or organization.

Let us now analyse the open model, used for Open Source development and the establishment of open standards.

1.3. The Open Model

1.3.1. Open Source

· Definition

Historically, the makers of proprietary software have generally not made source code available. Open source software - defined as "any program whose source code is made available for use or modification as users or other developers see fit" - is usually developed as a public collaboration and made freely available.

Open Source is a certification mark owned by the Open Source Initiative (OSI). Developers of open software (software intended to be freely shared, potentially improved and redistributed by others) can use the Open Source trademark if their distribution terms conform to the OSI's Open Source Definition. All the terms below must apply together, to the product and its license, in all cases:

§ Free Redistribution - The software may be redistributed to anyone, without any restriction.

§ Source Code - The source code must be made available (so that the receiving party will be able to improve or modify it).

§ Derived works - The license must allow modifications and derived works, and must allow them to be distributed under the same terms as the license of the original software.

§ Integrity of the Author's Source Code - The license must explicitly permit distribution of software built from modified source code. The license may require derived works to carry a different name or version number from the original software.

§ No discrimination against persons or groups

§ No discrimination against fields of endeavour - The license may not restrict the program from being used in a business, or from being used for a specific type of research.

§ Distribution of license - The rights attached to the program must apply to everyone to whom the program is redistributed, without the need for execution of an additional license by those parties. In other words, the license must be automatic, no signature required.

§ License must not be specific to a product - The rights attached to the program must not depend on the program being part of a particular software distribution. A product identified as Open Source cannot, for example, be free only when used with a particular brand of Linux distribution.

§ License must not contaminate other software - The license must not place restrictions on other software that is distributed along with the licensed software.

As Perens summarizes, from the programmer's standpoint, these are the rights granted when using Open Source programs:

§ The right to make copies of the program, and distribute those copies.

§ The right to have access to the software's source code, a necessary preliminary before you can change it.

§ The right to make improvements to the program.


· Licenses

There are many different types of licenses used by Open Source products. The most common are:

§ Public Domain - A public-domain program is one upon which the author has deliberately surrendered his copyright. It cannot really be said to come with a license. A public-domain program can even be re-licensed, with its version removed from the public domain and the author's name replaced by any other name.

§ GPL (GNU General Public License) - The GPL is a political manifesto as well as a software license, and much of its text is concerned with explaining the rationale behind the license. The GPL satisfies the Open Source Definition. However, it does not guarantee the integrity of the author's source code, it forces modifications to be distributed under the GPL, and it does not allow the incorporation of a GPL program into a proprietary program.

§ LGPL (GNU Library General Public License) - The LGPL is a derivative of the GPL that was designed for software libraries. Unlike a GPL program, an LGPL program can be incorporated into a proprietary program.

§ X, BSD and Apache - These licenses let you do nearly anything with the software licensed under them, because they were originally designed to cover software funded by monetary grants of the US government.

§ NPL (Netscape Public License) - This license has been prepared to give Netscape the privilege of re-licensing modifications made to their software. They can take those modifications private, improve them, and refuse to give the result to anybody (including the authors of the modifications).

§ MPL (Mozilla Public License) - The MPL is similar to the NPL, but does not contain the clause that allows Netscape to re-license the modifications.


· Open Source software engineering

The traditional software engineering process generally consists of marketing requirements, system-level design, detailed design, implementation, integration, field-testing, documentation and support. An Open Source project can include every single one of these elements:

§ The marketing requirements are normally discussed on a mailing list or newsgroup, where the needs of one member of the community are reviewed and complemented by the peers. Failure to obtain consensus results in "code splits", where other developers start releasing their own versions.

§ There is usually no system-level design for a hacker-initiated Open Source development. A basic design allows the first release of the code to be built, and revisions are then made by the community. After some versions, the system design is implicitly defined, and sometimes it is written down in the documentation.

§ Detailed design is normally absent from pure open source initiatives, mostly because most of the community is able to read the code directly, interpret the routines, functions and parameters, and modify them as required. The absence of a detailed design, however, makes further development more difficult and time-consuming.

§ Implementation is the primary motivation for almost all Open Source software development effort ever expended. It is how most programmers experiment with new styles, ideas and techniques.

§ Integration usually involves organizing the programs, libraries and instructions in such a way that users on other systems and equipment can effectively use the software.

§ Field-testing is one of the major strengths of Open Source development. When the marketing phase has been effective, many potential users are waiting for the first versions to be available, willing to install them, test them, make suggestions and correct bugs.

§ The documentation is usually written in an informal, free-style and often funny way. Websites are frequently created to allow the documentation to be provided and completed by the user community, becoming a potential source of examples, tips and tricks. One of the brightest examples is the online documentation for PHP.

§ The support is normally provided via FAQs and discussion lists, or even by the developers themselves by e-mail, on a "best effort" basis, depending on their availability, willingness and ability to answer the question or correct the problem. The lack of official support can keep some users (and many companies) away from Open Source programs, but it also creates opportunities for consultants and software distributors to sell support contracts and/or enhanced commercial versions.

The commercial versions of Open Source software (like BSD, BIND and Sendmail) often use the original Open Source code, developed using the hackers' model and later refined through most of the phases described above.


· Open-source cycle

Many Open Source programs start with an idea, discussed via the Internet, developed by the community, implemented as first draft versions, and consolidated into final versions after a lot of debugging. When the final software starts to be used globally and to interest companies, the authors may decide to start receiving some financial compensation for their hard work.

An organization may then be created, or some existing company may start to distribute the product, alone or bundled with other similar pieces of software. Sometimes, after reasonable funding is raised and more development is done (now remunerated), a commercial version of the product may be released, often with more functionality than the free version, and sometimes without the source code being generally distributed. This is attributed to the difficulty small companies have in remaining profitable by distributing only open source software.


· Open Source Science

As argued by DiBona, Ockman and Stone, "Science is ultimately an Open Source enterprise. The scientific method rests on a process of discovery, and a process of justification. For scientific results to be justified, they must be replicable. Replication is not possible unless the source is shared: the hypothesis, the test conditions, and the results. The process of discovery can follow many paths, and at times scientific discoveries do occur in isolation. But ultimately the process of discovery must be served by sharing information: enabling other scientists to go forward where one cannot; pollinating the ideas of others so that something new may grow that otherwise would not have been born."

Ultimately, the Open Source movement is an extension of the scientific method, because at the heart of the computer industry lies computer science. Computer science differs from all other sciences in having a single means of enabling peers to replicate results: sharing the source code. To demonstrate the validity of a program, the means to compile and run the program must be provided.

Himanen considers that scientists have developed this method "not only for ethical reasons but also because it has proved to be the most successful way of creating scientific knowledge. All of our understanding of nature is based on this academic or scientific model. The reason why the original hackers' Open Source model works so effectively seems to be - in addition to the facts that they are realizing their passions and are motivated by peer recognition, as scientists are also - that to a great degree it conforms to the ideal open academic model, which is historically the best adapted for information creation".

1.3.2. The Open Standards

One of the important factors in the success of the Internet comes from the "governance mechanisms" (rather than regulation) that guide its use and evolution - in particular, the direct focus on interconnection and interoperability among the various constituent networks.

· Standards

Standards are fundamental to the network economy. They allow companies to be connected by establishing clear communication rules and protocols. For manufacturing networks - composed of the company owning the product design, the suppliers of the different components and the assembly lines - standards guarantee the compatibility of all the parts in the production process. In information and technology networks, standards guarantee the compatibility of the infrastructure components analysed in the first chapter (hardware, operating systems and application software) and the interoperability of companies via the Internet, using clear network protocols.

Standardization is by definition a political, economic and technological process aiming to establish a set of rules: documented agreements containing technical specifications or other precise criteria to be used as rules, directions or definitions. Equipment, products, processes and services based on the same set of rules are thus fully compatible with each other.

· De facto or De jure

Standards are normally classified according to their nature. De facto standards are products and protocols that conquer the market by establishing a network of interconnected and compatible products, and by gaining recognition from consumers. An example is the set of de facto standards built around the IBM PC specification.


Figure 2 - Examples of de facto standards and their connectivity

De facto standards are normally preferred by companies because they do not need to follow a complex and lengthy validation process. Often - as in the case of Windows - they are accepted by the market even without being elaborated through a scientific process, and without strong R&D investments. These are also the reasons why they are not easily accepted by normalisation organisations, by scientific institutions and by academic circles.

In opposition, de jure standards are often based on a new technology created by a company that agrees to publish its concepts and definitions for review by the academic and scientific communities, and to have them officially approved by a normalisation institution. A good example is the OSI standard and the protocols compatible with its different tiers (see Figure 14 on page 42).

Some of the main motivations for this are the ability to compete with existing dominant standards, to more easily build a cooperation network of software and hardware suppliers, and to obtain credibility in the academic world, consequently gathering cooperation from the student community.

A good practice of the truly independent and non-profit normalisation organisations is to recognise only open standards. This is necessary to avoid the formation of monopolies and unfair commercial practices by royalty owners. This practice has recently been contested during the analysis of standards for web services.

· Mare liberum or Mare clausum

A standard is known as proprietary when it has been developed by a company that remains the owner of royalties, limiting the usage of the standard's specifications to those who pay licence fees.

Open Standards specifications are open to the public and can be freely implemented by any developer. Open standards are usually developed and maintained by formal bodies and/or communities of interested parties, such as the Free Software/Open Source community. Open standards exist in opposition to proprietary standards and work to ensure that the widest possible group of contemporary readers can access a publication. In a world of multiple hardware and software platforms, it is virtually impossible to guarantee that a given electronic publication will retain its intended look and feel for all viewers, but open standards at least increase the likelihood that a publication can be opened in some form.

From a business perspective, open standards help to ensure that product development and debugging occur quickly, cheaply and effectively, by dispersing these tasks among wide groups of users. Open standards also work to promote customer loyalty, because the use of open standards suggests that a company trusts its clients and is willing to engage in honest conversations with them. Criteria for open standard products include: absence of specificity to a particular vendor, wide distribution of the standards, and easy, free or low-cost accessibility.


· Protocols

A protocol is a special set of rules used by the different network layers, allowing them to exchange information and work cohesively. In a point-to-point connection, there are protocols between each of the several layers and the corresponding layer at the other end of the communication.

The TCP/IP protocols (TCP, IP, HTTP…) are now quite old and probably much less efficient than newer approaches to high-speed data networking (e.g. Frame Relay or SMDS). Their success stems from the fact that the Internet is today often the only outlet that offers a standardized and stable interface, along with a deliberate focus on openness and interconnection. This makes the Internet extremely attractive to very different groups of users, ranging from corporations to academic institutions. In contrast with traditional telecommunications networks, the rules governing the Internet focus on interconnection and interoperability, rather than attempting to closely define the types of applications that are allowed or the rate of return permitted to its constituents.
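As a small illustration of this layering, the sketch below (in Python, purely illustrative; www.example.com stands for any public web server) asks the operating system for a TCP connection, which itself rides on IP, and then speaks HTTP on top of it:

    # Illustrative sketch: HTTP (application layer) spoken over TCP (transport
    # layer), which the operating system in turn carries over IP (network layer).
    import socket

    host = "www.example.com"  # any public web server would do

    with socket.create_connection((host, 80)) as conn:  # TCP connection over IP
        conn.sendall((
            "GET / HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            "Connection: close\r\n"
            "\r\n"
        ).encode("ascii"))                              # HTTP rides on top of TCP
        response = b""
        while True:
            chunk = conn.recv(4096)
            if not chunk:                               # server closed the connection
                break
            response += chunk

    print(response.split(b"\r\n", 1)[0].decode())       # e.g. "HTTP/1.1 200 OK"

Each layer relies only on the contract of the layer below it, which is precisely what makes such a stack so easy to interconnect.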

· Architectures or Platforms

Architecture is a term applied to both the process and the outcome of thinking out and specifying the overall structure, the logical components, and the logical interrelationships of a computer, its operating system, a network, or another conception. Architectures can be specific (e.g. IBM 360, Intel Pentium) or reference models (e.g. OSI). Computer architectures can be divided into five fundamental components (or subsystems): processing, control, storage, input/output, and communication. Each of them may have a design that has to fit into the overall architecture, and may sometimes constitute an independent architecture.

As will be explored in chapter 2.2.1 (z/VM and z/OS), the usage of open architectures - even when maintained by commercial companies - can bring many technical advantages to the suppliers of the building blocks (software, hardware and network components) and mainly to the user community, which can profit from a large network of compatible components, ensuring independence and favouring competition, creativity and innovation. The economic potential will be analysed in detail in chapter 6.3 (Feedback).

The term platform may be used as a synonym for architecture, and sometimes to designate its practical implementations.

· Certification

To certify is to validate that products, materials, services, systems or persons comply with the standard's specifications. One chapter of every standard's definition is dedicated to the certification steps to be performed and the specifications to be verified.

The certification process may be performed by any entity. However, it is normally performed by independent organizations, as it may be expensive and requires specialized technological knowledge.

