Thesis - Open Technologies for an Open World
Open Standards, Open Source, Open Mind

 


6. The New Economy

"The concept of profit has always been the noble version of a deeper, more fundamental human instinct: greed" - Manuel Castells


A new economy - networked, global and informational - emerged in the last quarter of the twentieth century on a worldwide scale. Its epicentre has been the information technology industries and the financial institutions of the 1990s. The first aspect of this new economy - the network - was discussed in the previous section. The second aspect is the institutional, organizational and technological capacity to work as a unit in real time, or in chosen time, on a planetary scale, enabling companies to survive and thrive in this global economy; this discussion is out of the scope of this document. Let us now consider the fundamental elements of the third aspect - the informational economy - always from the open source and open standards point of view.

The quest for possible answers to three important questions will drive this chapter:

§ What strategies have been used by IT companies to dominate the hardware and software market?

§ What is the impact of open platforms and standards on those strategies?

§ What are the chances for the Open Source community to increase its market share?

To increase profits when looking for short-term results, there are four main ways: to reduce production costs (as seen in the last two years with labour cost reductions - mainly in the ICT market - and the falling price of electronic components), to accelerate capital turnover, to broaden the market, and to increase productivity. This last has been the main advantage of open source products over the competition. Productivity, when the hacker ethical principles are applied, is considerably higher than in other firms. Its technology has proved to be of higher quality and more innovative. Productivity and technological innovation are important means, but certainly not the only ones.

6.1. Standard wars

When a new technology standard appears, it may complement (and sometimes merge with) or compete with existing ones. When UNIX became a standard for open operating systems and started to dominate the market, IBM decided to create its own version of UNIX and to make its mainframe platforms UNIX compliant. It reacted so fast that the OpenEdition layer of the MVS operating system was one of the first to comply with the POSIX standard. This was motivated by two interdependent factors. The first was the risk that the established base of mainframe customers would migrate from the old systems to the more attractive and flexible UNIX environments, which appeared to be less expensive and to have more skilled people coming fresh from the universities. The second was the usage of UNIX-compliant software on mainframe platforms: it proved less expensive for IBM to develop a UNIX layer than to redevelop a whole subset of software - mostly related to Internet, Web, user interface and database technologies - to compete with the UNIX versions. It proved to be the right decision - even if many customers eventually migrated to open systems platforms - with many mainframe customers today using the UNIX layer to run network and Internet services.

With Microsoft, it was the opposite. At first, in the desktop segment, when its cooperation with IBM and Apple was terminated, it decided to profit from its own installed MS-DOS base - a highly valued network - to implement a new graphical environment. Quality proved to be a less important element than the number of users (see the section Network externalities below), and Microsoft quickly became the preferred operating system provider for the manufacturers of PC-compatible machines - including IBM. In the server segment, Microsoft always regarded mainframes and UNIX as legacy technologies, and started by dominating the low-end market, providing small servers fully compatible with the installed base of desktop PCs. With the increased capacity of Intel-based machines, Microsoft servers started to perform functions for which a mainframe had been needed before. Linux then appeared, playing in the same market segment as Microsoft Windows servers. As the ideals of Microsoft and the Open Source community were completely different (money, monopoly and marketing against gratuity, freedom and quality), merger or cooperation was not possible.

6.2. Network externalities

As Metcalfe helped us to demonstrate in section 5.1, the value of a network increases with the number of its nodes. All new networks start with a value of zero, and the network externalities of competing networks make their spread more difficult. When a new technology appears that is incompatible with the existing ones dominating the market, a huge effort must be made to create a large network, for its value to be higher than the changing costs. This concept is not new, and has been observed in postal services, railways, airlines and telephone networks.
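
To make this reasoning concrete, the short Python sketch below applies the quadratic value rule attributed to Metcalfe and compares a large incumbent network with a small, incompatible newcomer; every figure (installed bases, value per contact, changing cost) is invented purely for illustration and is not taken from any of the studies cited here.

# Illustrative sketch of network externalities (all figures hypothetical).
# Under Metcalfe's rule, the total value of a network with n nodes grows with
# the number of possible links, n * (n - 1) / 2, while the value perceived by
# one user grows with the n - 1 partners he or she can reach.

def total_network_value(nodes: int, value_per_link: float = 1.0) -> float:
    return value_per_link * nodes * (nodes - 1) / 2

def value_for_one_user(nodes: int, value_per_contact: float = 1.0) -> float:
    return value_per_contact * (nodes - 1)

incumbent = 90_000_000   # hypothetical installed base of the dominant system
newcomer = 1_000_000     # hypothetical installed base of the new, incompatible system
changing_cost = 500.0    # hypothetical cost for one user to switch networks

print(f"Total value, incumbent: {total_network_value(incumbent):.3e}")
print(f"Total value, newcomer : {total_network_value(newcomer):.3e}")

# A rational user switches only if the value gained in the new network exceeds
# the value lost in the old one plus the individual changing cost.
gain = value_for_one_user(newcomer) - value_for_one_user(incumbent)
if gain > changing_cost:
    print("Switching pays off for the individual user")
else:
    print("Switching does not pay: the incumbent's externality dominates")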

Currently, in the operating systems market for desktops, the value of the network formed by Windows users is far higher than that of the competition, as more than 90% of users are connected to it. This high value forces most companies and individuals willing to use Macintosh or Linux desktops to keep at least one PC compatible with Windows systems. It also motivates the creation of Linux products that can interface with Windows systems - by recognizing the Microsoft proprietary formats, by being able to execute Windows-compatible programs in a simulated environment, and by recognizing the Microsoft proprietary protocols in client-server and network connections. This causes negative reactions from Microsoft, which keeps changing its standards and protocols to hinder connections and data exchange with other systems and products, aiming to keep users locked into its network and forcing them to pay high costs to get out.

When a company has a complete monopoly in the operating systems or hardware market, it is extremely easy to trigger the network externalities in favour of its own related products and destroy the competition. This was always IBM's strategy in the mainframe market. A first example is the security layer, tightly related to the operating system to improve the enforcement of access authorization. IBM has its own product - RACF, later renamed Security Server - which comes embedded in the MVS installation material, even if not ordered. To install the products of IBM's competitors, one must follow a procedure to remove the IBM product. A second example was in mainframe networking. A company called Interlink developed a software product that implemented TCP/IP on IBM mainframes; the performance was better, the price lower, the flexibility higher, and the door to the Internet was finally open. IBM quickly offered its own TCP/IP stack "free" together with the operating system, and started to build tight links between its own product and its databases. After complaints from customers, IBM started to provide a refund for companies that did not want IBM TCP/IP. As the impact was only felt by big companies, the affair is almost forgotten. A similar case happened with Microsoft against Netscape in the browser segment - this time with more publicity, and with the difference that Microsoft continues to distribute Internet Explorer free of charge, which practically eliminated Netscape from the browser market. The only solution found by Netscape was to "escape" to the open source community, creating Mozilla, one of the best browsers compatible with Linux.

6.3. Feedback

Feedback is an important factor in the competition for market share; it already existed in the industrial age, but it attained a fundamental position in the information economy. Due to the virtualisation of the networks explained in section 5.1, the feedback effect tends to create monopolies in each domain of the software and information realms. Feedback may be classified, according to its impact on the creation of monopolies, as positive or negative. Please note that the feedback impact for the economy as a whole might be the opposite of what its name suggests.

6.3.1. Positive

Feedback is tightly related to network externalities. In cases of positive feedback, the quality of a product (or sometimes the success of a brand or its marketing strategies) expands the network of users, which in turn increases its network value, motivating more users to join it. This may happen successively, in a positive spiral, until the stability brought by the saturation of the market segment. The earlier a product starts its positive feedback, the greater its chances of becoming a monopoly. However, as technology evolves in cycles, products dominating the market at one moment can be quickly dominated in the next evolutionary step if they are unable to evolve as quickly as their opponents, or if they cannot react to new products bringing superior features.


Figure 25 - Two cycles of positive feedback

To illustrate this, let us consider Figure 25 above and the computer market from the 1960s until the 1990s, discussed in section 2.2.1. Initially there were several computer suppliers, competitively sharing the market. Then IBM created the System/360 architecture, with superior technology, standards and aggressive marketing campaigns. It quickly dominated the market, eliminating part of the competition and forcing the remainder to survive on a small market share. The next paradigm was the evolution of UNIX systems, which could provide a good and inexpensive solution, as demanded by the potential customers. The market share of the IBM machines - by then known as mainframes - started to shrink, forcing IBM to evolve an old platform - the S/36, transformed into the AS/400 -, to create its own UNIX-compliant machines and systems - AIX - and to create a UNIX compatibility layer in the mainframe systems. Mainframe environments have been marked for death ever since, but high migration costs and unbeatable technical superiority still keep them alive with a stable market share. IBM is trying to stimulate a new positive feedback in favour of its mainframes - now called high-end servers - with a strategy called server consolidation, which includes compatibility with the Linux operating system and the advantage of integrating the processing capacity of multiple low-end servers into a single box.

6.3.2. Negative

Shapiro and Varian would describe this later reaction from IBM as negative feedback, where strength breeds weakness (the UNIX competition conquers market share from IBM mainframes) and weakness breeds strength (IBM redesigns the mainframe systems to be compatible with UNIX and Linux). If IBM succeeds in this move, the result may be the stabilization of the market, until the next paradigm, as shown in Figure 26 below:


Figure 26 - Negative feedback

De facto standards often evolve in the marketplace, favoured by positive feedback cycles. As seen, they stimulate monopolies to the detriment of healthy competition. The absence of competition normally has negative impacts on technology (by reducing the pace of innovation), on social welfare (by increasing unemployment rates through the elimination of competitors) and on the economy (by directing the flow of profits to the dominating company).

6.3.3. The role of the standards

De jure standards can promote competition by creating a negative feedback: the standard protocols and definitions are published, several compatible products are implemented, and innovation is stimulated by the different implementations of the standard recommendations. This is one of the reasons why standards are increasingly limited to general specifications, leaving the details for the companies and the scientific community to decide. The other reason is to reduce the time needed to develop a standard, allowing the negative feedback effect to happen before one company decides to implement its own product, outside the standard specifications, trying to quickly ignite a positive feedback in its favour. This is the preferred practice of companies like Microsoft, as defined by Bill Gates in his book The Road Ahead: "Because de facto standards are supported by the marketplace rather than by law, they are chosen for the right reasons".

Microsoft is also accused of practising "vapourware": to prevent competitors from conquering its market with better-quality products, creating a technological paradigm and originating a positive feedback, a company may announce a new version of its own product, containing the same facilities as the competition, even before it has been designed.

6.4. Cost Analysis

6.4.1. Production Costs

The most important part of the production cost of software development comes from research and development (R&D). Other important costs are marketing, promotion and documentation.

Those costs are important even for open source software, despite the mistaken impression that it is "free". Studies based on the COCOMO model estimate that "It would cost over $1 billion (…) to develop [the GNU] Linux distribution by conventional proprietary means in the U.S.". Another study points out: "If Debian [one of the Linux distributions] had been developed using traditional proprietary methods, the COCOMO model estimates that its cost would be close to $1.9 billion USD to develop Debian 2.2". IBM announced investments of $1 billion USD in R&D for Linux, and many other hardware and software suppliers - including HP, Intel, Sun and SAP - participate in Linux research laboratories.
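
As an illustration of how such figures are produced, the Python sketch below applies the basic COCOMO effort equation (effort in person-months = a · KLOC^b, with the textbook "organic mode" coefficients a = 2.4 and b = 1.05). The code size and the fully loaded yearly cost are assumptions chosen only to show that the billion-dollar order of magnitude quoted above is plausible; they do not reproduce the exact parameters of the cited studies.

# Basic COCOMO sketch, organic mode: effort = 2.4 * KLOC ** 1.05 person-months.
# The code size and the salary below are assumptions used only for illustration.

def cocomo_effort_person_months(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    return a * kloc ** b

sloc = 30_000_000                  # assumed size of a complete distribution, in source lines
kloc = sloc / 1_000
salary_per_person_year = 100_000   # assumed fully loaded cost per developer, in USD

effort_pm = cocomo_effort_person_months(kloc)
effort_py = effort_pm / 12
cost = effort_py * salary_per_person_year

print(f"Estimated effort: {effort_pm:,.0f} person-months ({effort_py:,.0f} person-years)")
print(f"Estimated cost  : ${cost:,.0f}")   # on the order of one billion dollars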


6.4.2. Reproduction Costs

As with other products of the digital information economy, reproduction costs are extremely low. They normally amount to duplicating a CD-ROM, and may include reproducing the documentation.

The reproduction costs of open source software, if we consider the main Linux distributions, are even higher than those of proprietary software. The Linux kernel is normally supplied with many complementary software products, on several CDs, together with good documentation for the installation and customisation procedures.

6.4.3. Distribution Costs

Open source software may be distributed in two ways: as a free download from the web (with indirect network costs for distributors and users) or as a CD containing the operating system, tools, applications and an installation manual. In the second case, it has the same costs as other proprietary products.

6.4.4. Transaction Costs

The transaction costs are measured by the cost for the customer and the distributor per copy of the product.

Probably the main difference between Linux and the other operating systems, and between open source software distributed under the GPL license and other commercial software, is the reduction of transaction costs for companies installing them on several computers. Commercial software is normally licensed according to the number of computers or users, whereas open source software is often distributed without limits on its usage, copying or installation. This is extremely important for big companies, and in moments of financial crisis it might be a determinant factor for the adoption of open source software. In parallel, the new licensing practised by Microsoft is helping to push customers away from Windows software, mainly in its server family. This was anticipated by Bill Gates in 1995: "Customers express to me their worry that Microsoft, because it is, by definition, the only source for Microsoft operating-system software, could raise prices and slow down or even stop its innovation. Even if we did, we wouldn't be able to sell our new versions. Existing users would not upgrade and we wouldn't get any new users. Our revenue would fall and many more companies would compete to take our place." This may reveal an even more surprising approach: Microsoft may have raised its license prices to measure the real interest of companies in open source software. If it starts losing market share, its prices may be reduced again, to the previous level, in parallel with a strong marketing campaign.
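
A minimal sketch, with invented per-seat prices, shows why this matters as the installed base grows: per-computer licensing makes the transaction cost scale linearly with the number of seats, whereas a GPL-style licence adds nothing per additional copy.

# Hypothetical comparison of licence-related transaction costs as a company grows.
# The prices are invented; only the scaling behaviour matters.

def proprietary_licence_cost(seats: int, price_per_seat: float = 300.0) -> float:
    # One licence per computer or user.
    return seats * price_per_seat

def gpl_licence_cost(seats: int) -> float:
    # Copying and installation are unrestricted: no per-seat licence fee.
    return 0.0

for seats in (10, 100, 1_000, 10_000):
    print(f"{seats:>6} seats: proprietary ${proprietary_licence_cost(seats):>9,.0f}"
          f" | GPL ${gpl_licence_cost(seats):,.0f}")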

6.4.5. Changing costs

The changing costs are paid by companies and individuals willing to adopt a new technology, changing from an existing network of users to another. The total changing costs are calculated by adding the investment in new hardware, software and training to the cost of maintaining compatibility with suppliers and customers. For example, companies wishing to move from Windows to Linux benefit on the first two elements (Linux is compatible with all Windows hardware, with better performance, and the license and maintenance fees of open source products are usually lower). They may have training expenses, higher for the people responsible for installation and support than for the end-users. These subjects are detailed in the TCO section below.

When analysing the implementation of a new technology inside a company or an intranet application, only the internal costs are considered. When the implementation implies changes for external users, the collective changing costs must be taken into account. An example can be taken from traditional client-server applications, where part of the application runs on the client side and another part on the server. If a new release of the server component - requiring a new version of the operating system - is to be installed, the changing costs are limited by the reduced number of servers. If a new release of the client component needs a new version of the desktop operating system, the cost is multiplied by the number of users. In the case of applications available outside the company - like clients for home banking applications - the situation is even more complicated, because the customers need to be convinced to upgrade their own operating systems. Internet applications helped to eliminate this problem, with server applications exchanging information with clients through standard formats and protocols - like TCP/IP and HTML. If a new version of the application is released, only the application servers are impacted. The customer is even free to choose the most convenient operating system.
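
The asymmetry between a server-side upgrade and an upgrade that touches every client can be sketched as follows; the per-machine figures are assumptions, and only the multiplication by the number of machines is the point.

# Collective changing cost: an upgrade confined to servers is paid a few times,
# while an upgrade that touches every desktop is paid once per user.
# All per-machine costs below are hypothetical.

servers = 20
desktops = 5_000
cost_per_server_upgrade = 2_000.0   # assumed: new OS release, testing, redeployment
cost_per_desktop_upgrade = 150.0    # assumed: new OS licence, installation, user downtime

server_side_change = servers * cost_per_server_upgrade
client_side_change = desktops * cost_per_desktop_upgrade

print(f"New release impacting only the servers : ${server_side_change:,.0f}")
print(f"New release impacting every desktop    : ${client_side_change:,.0f}")
# With standard formats and protocols (TCP/IP, HTML) only the server figure applies.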

This is also a good reason for the organisations controlling Internet names (like URLs) to remain non-profit. The collective cost of changing the URL of a web site implies publicity, electronic and paper communications, and changes to cross-links from other web pages, and may often imply the loss of customers coming from old links. If a single - and commercial - institution keeps control of the URLs, the risk of abusive prices and practices is extremely high.
High changing costs are normally considered a lock-in situation. Shapiro and Varian identified seven types of lock-in situations that are frequent in the information economy: contract obligations, durable goods, specific training, information and databases, specialized suppliers, research costs, and fidelity programs. The most important for our study are:

§ Durable goods - The usage of open platforms may reduce the risk of technological lock-ins (where, if the equipment is changed, the software and applications must follow). Standards are often implemented by several hardware and software companies, allowing the replacement of particular products with lower changing costs than proprietary alternatives. Commercial lock-ins (where compatibility only exists between complementary products from the same supplier or from a closed network of "authorized vendors") may be eliminated with the usage of open standards. The implementation of an open source solution may also eliminate the need for new hardware, as discussed below.

§ Information and databases - The usage of strictly open formats - like HTML and XML - may ensure compatibility with many software components and eliminate the need for conversion tools when the application is replaced. When proprietary databases are implemented - like Oracle and DB2 - the existence of tools to convert the information into open formats must be verified, and the work needed for data extraction, conversion and insertion into the new database should be taken into account.

§ Specialized suppliers - Open solutions are normally supported by many different companies, which have free access to the standards, protocols, documentation and sometimes even the source code. This is essential to ensure good problem solving techniques and a total independence from the services provided by the supplier.

The lock-in situation also has an impact on suppliers, which are forced to adapt to new strategies from preferred customers or partners. This has recently been the case for software companies developing applications using DOS interfaces, after Microsoft decided to disable a subset of DOS functionalities in the Windows XP family.

Unification strategies like the single UNIX specification discussed in chapter 2.2.2, and the ongoing United Linux initiative (chapter 2.2.3), aim to reduce the lock-ins provoked by different (and incompatible) implementations, and to minimize the impact on application suppliers when new versions of the operating systems are released. They also reduce the cost for application and hardware suppliers, which have a single version against which to test and validate their products.

The implementation of the Internet, based on open standards and protocols, is considered a key factor in its development, with the consequent perfect communication among different hardware and software suppliers, and the absence of lock-in situations.

6.4.6. TCO

A difficult task when comparing alternatives - in our case proprietary against open source - is the need to estimate all the costs involved in the implementation (changing costs) as well as the recurring periodic costs. This is called the "Total Cost of Ownership". Elements like operating system licenses and hardware equipment are obviously included, but the costs of salaries, consultancy services, training and the licenses for complementary - although mandatory - software should also be obtained in advance. The scope of this document does not allow an extensive study of all the costs implied in the installation and maintenance of servers and desktops; several comparisons have already been made, and a brief analysis may be summarized by the TCO items below (a simple cost-model sketch follows the list):

§ Hardware - Solutions built around environments like Linux and Windows can use the same hardware equipment, and even if the performance obtained by Linux is commonly accepted to be better, this can be ignored for a fair TCO comparison. The comparison with proprietary hardware solutions, like Sun, Apple, IBM mainframes and AS/400, generally gives a larger difference, in favour of the open source alternatives. Hardware maintenance fees, insurance and floor space may also be considered to obtain a more complete analysis.

§ Software - Open Source solutions are always cheaper than proprietary ones for small-to-medium environments. The license costs for large environments should generally consider the usage of proprietary databases and applications, which are generally less expensive - or equivalently priced - when installed under open source environments. An important factor to consider is the need to pay yearly fees - known as maintenance and support - for the right to use the software.

§ Human resources - It is extremely difficult to estimate the manageability - the simplicity of installation, configuration, support and usage - of each solution. Commonly accepted arguments for solutions based on Windows are the easy installation (it is normally pre-installed on new hardware) and cloning - duplication of servers and desktops - but also the absence of structured problem-solving analysis, and consequently the need for more support and help-desk people and for operations like "system restarts" and recovery of lost information. For Linux and other UNIX systems, the arguments are the need for well-trained (and more expensive) people and time-consuming installation and configuration tasks, but greater ease of maintenance afterwards. IBM mainframe environments are far more difficult to install and configure, often requiring specialized consulting services, but easier to maintain once stability is attained. Smaller systems tend to require more servers to provide the same capacity as large systems, and consequently need more people to support them.

§ Training - The clear advantage is for well-known products like Windows, which are still largely used in schools and at home. The Linux interfaces - like KDE - and office suites - like OpenOffice and StarOffice - are quickly reducing this advantage. For the support staff, the time needed to learn how to maintain an open or a proprietary environment is not significantly different, except for fully proprietary environments, like IBM mainframes and AS/400, where the complexity is higher, and so is the time for the technical staff to become fully productive. For any solution, the ideal situation is often to have experienced consultants help in the initial steps and to complement self-study and classes with ad-hoc training.

§ Services - Related to the training costs, the consultancy services needed to help in the implementation of any solution depend more on the complexity of the environment than on the openness of the source or platform. There are always differences in the consultancy prices for each platform, but due to their volatility and dependence on cultural differences, these should be analysed at the moment and in the country of the implementation.
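
To tie the items above together, the sketch below adds the initial (changing) costs to the recurring yearly costs over a 3-to-5-year horizon, the comparison window suggested in the next section. Every figure is a placeholder to be replaced by values gathered for the specific environment; the sketch only shows the structure of the calculation, not a real comparison.

# Skeleton of a TCO comparison over a multi-year horizon.
# Every number is a placeholder: real values must be collected per environment.

def tco(initial_costs: dict, yearly_costs: dict, years: int) -> float:
    return sum(initial_costs.values()) + years * sum(yearly_costs.values())

proprietary = {
    "initial": {"hardware": 50_000, "licences": 40_000, "migration": 10_000, "training": 5_000},
    "yearly":  {"licence renewal": 15_000, "support staff": 60_000, "consultancy": 5_000},
}
open_source = {
    "initial": {"hardware": 50_000, "licences": 0, "migration": 25_000, "training": 15_000},
    "yearly":  {"subscriptions": 5_000, "support staff": 65_000, "consultancy": 5_000},
}

for years in (3, 5):
    p = tco(proprietary["initial"], proprietary["yearly"], years)
    o = tco(open_source["initial"], open_source["yearly"], years)
    print(f"{years}-year horizon: proprietary ${p:,.0f} | open source ${o:,.0f}")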

6.4.7. Cost comparison

The overall conclusion - based on the previous paragraphs and on the bibliography they refer to - comprises the following points:

§ For each item described above, the initial costs, the changing costs and the recurring costs should be identified.

§ Considering the installation of new hardware and software in a company or educational establishment without previous knowledge of any solution - open or proprietary - the TCO for implementing proprietary software is higher than for any open source solution, with the difference being directly proportional to the size of the computing environment.

§ As most companies already have an installed base of computers, the changing costs might be more important than the other elements, and must be analysed carefully. Conversions from proprietary environments like mainframes may have a large cost for the redeployment of the applications and the tasks needed to redevelop the interfaces with other platforms. Migrations from Windows-based environments to open systems will have a higher transition cost for the desktops than for the servers - where the impact on end users is reduced - so the size of the network is an important factor to consider.

§ To compensate for the changing costs, the recurring costs after the initial investments are generally lower for open environments; thus, considering the cost impact over a period of 3 to 5 years after the implementation of the new system can give different results from comparisons that take into account only the implementation costs.

§ All environments must be continually upgraded to keep operating system versions current and to profit from technological evolutions, like improved security, increased processing capacity and reduced machine size. The cost of upgrading mainframe environments is known to be high in terms of human resources - but decreasing - and normally requires hardware changes every 10 years. Windows-based systems, while usually bringing minor technological advances, imply hardware changes - or upgrades of memory and processor resources - every 2.5 years. In open systems, the capacity is directly related to the amount of service needed and not to the version of the operating system; the average time between hardware upgrades is estimated at 5 years, to benefit from the technological advances.

§ When the changing costs are higher than the cost savings, and the technological benefits are not compensated by attaining important business objectives, there is a lock-in situation. The decision is complicated by the recurrence of the consequences: the lock-in increases with time, due to the increasing usage of the technology - by new people, in new functions and by new applications - and the consequently higher changing costs. This is one of the best arguments in favour of open standards: freedom of choice. Once the migration is done, the changing costs may be amortized in the long term, and the company is free from lock-ins. If the supplier increases the price, does not keep up with the pace of evolution or does not give good support, the changing costs for a new alternative are lower: the open formats and standards are compatible with several products from the competition, so the migration tasks and training become easier. This creates a virtuous circle: as the supplier is aware of how easy it is to change, it is more motivated to ask a fair price and to invest continuously in innovation. Both supplier and customer benefit from this process.

6.4.8. ROI - The conclusion is beyond the costs

Despite the obvious reduction of software costs, and considering the high changing costs and the nightmare of calculating a real TCO, the Open Source community tries to concentrate its marketing efforts on comparing open source against proprietary software (the example is often Windows versus Linux) by measuring factors like manageability, control, stability, scalability and freedom of choice. This may be represented in a business case by estimating the Return on Investment. Besides the cost savings obtained, which in our hypothesis are already part of the TCO, all the other elements favour open source software, and the analysis should concentrate on the importance of each benefit for the business objectives.
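
A minimal sketch of such a business case, again with invented figures, relates the total benefit obtained over the period (cost savings plus an estimated monetary value for the indirect benefits such as stability, control and freedom of choice) to the initial investment.

# Simple ROI sketch for a migration business case (all figures hypothetical).

initial_investment = 120_000.0       # assumed changing costs of the migration
yearly_cost_savings = 40_000.0       # assumed recurring savings versus the old solution
yearly_indirect_benefits = 15_000.0  # assumed value of stability, control, flexibility
years = 5

total_benefit = years * (yearly_cost_savings + yearly_indirect_benefits)
roi = (total_benefit - initial_investment) / initial_investment
payback_years = initial_investment / (yearly_cost_savings + yearly_indirect_benefits)

print(f"Total benefit over {years} years: ${total_benefit:,.0f}")
print(f"ROI: {roi:.0%}")
print(f"Payback period: {payback_years:.1f} years")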

As an example of the indirect benefits, let us consider the opinion of Bruce Perens: "Control means being able to get a different service provider if you don't like the service you're getting on your software. Control means not having to convince the software's producer that your needs fit in their marketing plan. Control means not living in fear that the BSA (Business Software Alliance) will bring federal marshals to raid your business. Control means not having a domineering software company". More benefits can be identified with the help of the explanations and comparisons made in chapter 2.

A careful functional analysis should always precede the cost analysis, to verify whether the open source alternatives can be implemented. A good example is graphical environments: currently there are no good open alternatives to compete with proprietary software - which largely uses proprietary formats - like Macromedia Flash®, 3D Studio Max™ and Adobe Photoshop®, which only run under proprietary platforms (Mac or Windows).

Despite the high importance given by consulting reports to ROI evaluation, a survey conducted by DataNews showed that only 30% of Belgian companies measure ROI to verify the economic benefits of technological projects.

6.4.9. Case studies - cost reduction

The Linux strategy is a major part of Unilever's drive to cut its IT bill, part of an organisation-wide plan that already reduced the IT budget from €600m (£398m) in 2000 to €500m (£332m) in 2003, and aims at a further reduction of €100m by 2006. The main savings will come from hardware, which currently accounts for 40 per cent of the company's infrastructure costs. The Linux strategy will contribute through server consolidation and by reducing unit costs.

The implementation of Linux is also behind cost savings at Morgan Stanley. The numbers have not been announced, but the company said that the goal of the project, underway since mid-2001, is to reduce the cost of computing by adding flexibility to its computing architecture. Morgan Stanley's institutional securities division has opted for a new architecture that moves the company's data, applications and operating systems off specific machines and onto network servers throughout its global computing infrastructure.

This infrastructure is made up of around 6,000 servers, 25,000 desktop computers and thousands of applications. The aim is also to increase flexibility: "We want to be able to run any application on any box at any time," said Jeffrey Birnbaum, managing director and global head of enterprise computing at Morgan Stanley's institutional securities division. "Now, around 35 per cent of our servers are running Linux."

By 2005, two years ahead of schedule, the division plans to be running 80 per cent of its systems on commodity hardware, most of them using Open Source and open protocols for hardware independence. "When you run everything including the operating systems off the network adding computing power is simple," explained Birnbaum.

A study published by the Swiss consultancy Soreon Research GmbH concluded that "Companies with a €1 million ($1.1 million) budget for office software can reduce their costs as much as 20% by using OpenOffice software instead of Microsoft's Office product. (…). And those running the open source Linux operating system instead of Microsoft's Windows on their servers can save as much as 30%". An interesting conclusion of this study is that small and midsize enterprises may benefit only marginally by using open source software. "A company with 10 computers, for instance, can reduce its costs around 2% by using open source software. A larger company with 100 computers can save 6% on office software and 7% on server software."

Another survey, conducted by TheOpenEnterprise.com, points out that 74% of the corporate managers interviewed consider lower costs the main benefit of using Open Source and open standards-based software (see Figure 27 below), and 66% of the same group of managers consider the cost of Open Source software to be between 25% and 75% lower than that of proprietary alternatives (see Figure 28 below).


Figure 27 - Benefits of Open Source / Standards-based software


Figure 28 - Comparison between Open Source and Proprietary software costs

In opposition to the above cases, studies conducted by IDC show that in some cases proprietary solutions like Windows may appear cheaper - in the long run - than Open Source alternatives. This study shows that Linux is certainly best for web site hosting, but that for other server applications Windows may be cheaper, due to reduced training costs. The study should be considered cautiously, as it was sponsored by Microsoft and based on interviews with 104 American IT managers concerning the implementation of print, security and file servers. One of the most important cost differences was for a security server, which would cost USD 91,000 under Linux compared to USD 70,000 for Windows (over 5 years). The main finding of the study is the low share of license costs in the total cost of implementing a system (around 5%).

6.5. Evolution and Control: Two flavours, four strategies

Considering the elements discussed in the previous sections - and based on the conclusions of Shapiro and Varian - four different strategies may be adopted by information suppliers (like software and hardware vendors and Internet service providers) when creating a technological paradigm. One criterion is the adoption of open or closed standards (control); the second is the option to create a new, revolutionary technology or to continue the evolution of former technologies. These factors should be recognised when implementing a technological change, to better know the changing costs in advance and to avoid lock-in situations. This section concentrates on the analysis of the suppliers' viewpoint.

6.5.1. Openness or Control

Proprietary architectures may give enormous benefits to the network enterprise built around the companies controlling them, once positive feedback is generated. The controlling companies are the copyright owners and the producers of the core technology. Control is kept when those companies believe their products will be widely accepted without the help of certification and standards organisations, and start to build a supplier and distribution network around their technologies. Normally commercial coalitions are formed more quickly than standards are approved, so their chance of triggering a feedback effect is higher if users are convinced of their technological superiority. This has been the reasoning publicly defended by Microsoft and Intel.

The major benefit for the controlling companies comes when the network value becomes higher than that of the competition. The feedback and lock-in effects are created, and the tendency of the network is only to grow, until a new paradigm arrives or a complete monopoly is created. Due to the feedback, new users are attracted to profit from the high network value; due to the lock-in, existing users find it difficult to change technology.

6.5.2. Performance or Compatibility

When implementing a new technology to replace the current generation, two options exist: to ensure backward compatibility - evolution - or to create new functionalities and a performance level that can convince users to abandon their current product in favour of the new one - revolution.

A compromise may sometimes be found, and it is by far the best option. Guaranteeing compatibility while providing more performance and functionality reduces the changing costs - a precious argument for attracting users of the previous technology - while remaining attractive to new users.

This is the ideal situation, and it does not happen often in the information economy. What may help users migrate to the new technology is the existence of conversion tools - to easily convert information from the old format - or the availability of bridges, whereby the new product can read information created by the previous one. When the standards and formats are open, this is simply a question of development. However, closed standards may give the controlling company the right to legally prevent these features from existing, or to demand the payment of high license fees. This is another reason to implement products based on open standards and formats: even if customers do not face high costs to terminate a lock-in situation, the companies creating new technologies may simply be inhibited from innovating.

6.5.3. The strategies

To summarize the concepts of this section, let us briefly describe the four strategies:

§ Closed Migration - This is normally the release of new versions of a proprietary product, and should not present a risk (for the supplier or the user) if backward compatibility is fully guaranteed and the migration tasks are not so complicated as to produce a high changing cost. Two examples of closed migration are the successful replacement of DOS by Windows, and the limited usage of Mac OS.

§ Superiority by performance - This is the introduction of a new and proprietary technology, incompatible with the dominant market solution. The changing costs should be calculated based on the existence of bridges and conversion tools - tested beforehand to ensure complete compatibility for all of the user's information. Attention should be paid to the credibility of any announcements made by the company owning the current market solution, to prevent "vapourware" from destroying innovation. A market analysis should also be done to verify the existence of an open alternative with similar changing costs but with more control for the user. A success story is the implementation of the UNIX operating systems, in opposition to the failure of PS/2 and OS/2 to conquer the PC market.

§ Open Migration - This is the preferred choice for customers, with the solution being offered by different hardware, software and service suppliers, reduced changing costs, and a reduced risk of lock-in by the new product. For the supplier (without the ambition to create a monopoly, and accepting the challenge represented by the existence of innovative competitors) this is also a comfortable situation, as it may benefit from a wider network - even with the help of the competition - that can bring advantages when launching future initiatives or products. The UNIX wars show that the success of this strategy is not always guaranteed, and the creation of the single UNIX specification is a good example of the perseverance needed to attain such a goal.

§ Discontinuity - This happens when a new technology appears and is implemented by many suppliers. Its characteristics are similar to those of the previous strategy, with the exception of the existence of changing costs. It is also favourable to the most competitive suppliers, those able to provide an added-value network of services and compatible hardware and software products. The triumphant implementation of the OSI standard and the Internet protocols, and the flourishing Linux server market, are good examples, in opposition to the difficulty Linux has had in conquering desktop users, where compatibility bridges (like the Wine software) and migration tools (like OpenOffice, which accepts documents created by Microsoft Office) still need to improve.

6.6. The shift of power

At the end of the 1960s - the beginning of the information age - computer manufacturers were focused on the production of machines, which came coupled with their operating systems. As discussed at the beginning of this chapter, in the 1980s the rise of UNIX systems and TCP/IP-based computer networks allowed companies to form strategic alliances, becoming network enterprises. With the consequent need for flexible, interactive manipulation of computers, software became the most dynamic segment of the industry.

The irony, it seems, is that once again an operating system from the UNIX family is helping to accelerate a paradigm shift: from software to services. This time the network was also important: computer networks served as the medium, but the social peer networks were the real catalysts.


Figure 29 - UNIX and Linux catalysing the shift of informational power

This time IBM was one of the first big companies to react. Instead of considering Linux as another competitor to its five operating systems (z/VM, z/OS, VSE, AS/400 and AIX), IBM became part of the open source network, sponsoring projects, creating laboratories around the world dedicated to opening the door of every hardware platform, and investing in marketing. Other major companies interested in promoting Linux are Intel, HP, SAP, SAS and Oracle. There are two (non-exclusive) possible reasons for this: the first is pure marketing, the second a potential service market around Linux. According to TheOpenEnterprise.com (see Figure 30 below), 43% of the companies surveyed would consider implementing enterprise applications on open source solutions.


Figure 30 - Potential applications to be hosted in Open Source solutions

6.6.1. Behind the marketing scenes

HP, IBM and Intel are big players in the hardware arena. Today they work with both proprietary and open systems. If the Linux growth forecasts prove true and it finally dominates the server market in the near future, their interest is to be part of the new and powerful Linux network as hardware suppliers, keeping their market share.

These companies will keep their investments in the current operating systems, and will continue to participate actively in the Windows network. If the paradigm shift does not happen - due to high changing costs in a period of slim IT budgets - and Linux keeps a small share of the operating systems market, they will not have lost their investment. They will go down in history as Linux Maecenases, enjoy a good image in the open source community, and remain active in their market segment.

6.6.2. The experience economy

Let us consider one statement from Castells' analysis of the classical theory of post-industrialism: "Economic activity would shift from goods production to services delivery. The demise of agricultural employment would be followed by the irreversible decline of manufacturing jobs, to the benefit of service jobs that would ultimately form the overwhelming proportion of employment. The more advanced an economy, the more its employment and its production would be focused on services". An extrapolation of this sentence would produce a complementary one: the more advanced a company, the more its research and development would be focused on the production of services. This is the strategy of IBM, followed by the other companies.

The spread of the services realm through the replacement of internal labour forces did not happen as predicted by analysts like Rifkin. The usage of highly specialized consultancy services increased during the years preceding the Y2K issue, the introduction of the Euro currency and the hype of Internet start-ups; these were long-term and expensive contracts. The crisis originating in the burst of the Internet bubble, and accentuated by the rise of terrorism and the threat of another Gulf war, changed the figures and projections. The current tendency is to keep knowledge inside the enterprise, dedicated to satisfying the real business needs. Services are still required, on a smaller scale, for short-term assignments, and concentrated on the infrastructure. Stated another way, if knowledge is power, enterprises prefer to keep it internally, pushing the core business to increase profit, and to use external forces - specialized in the volatile technologies - to decrease costs.

The installation of new hardware, software or network components is surrounded by several complementary activities - some already discussed in this document - like capacity planning, installation, migration from former systems, training, and consolidation of smaller servers into a large one or into a cluster. The main advantages of Linux and open standards for service providers are:

§ The sharing of R&D costs with the other members of the open source network;
§ The availability of the source and protocols for problem investigation and a better understanding of the logic behind the software;
§ The easier communication across different hardware and applications;
§ The pervasiveness of Linux - able to run on any existing platform, and quickly adaptable to new ones - giving the service provider the freedom to recommend hardware from its best partners and to change these partners without compromising its recommendations or requiring much retraining.

This motivates companies to participate in the standards committees, partly to be aware of technological changes in advance, but mainly to create lobbies to influence decisions, favouring technologies implemented by their coalitions and blocking standards containing innovations from the competition.

6.6.3. Branding

Open source software is a commodity market. In any commodity market, customers value a brand they can trust. In open source, brand building is largely based on supporting the community. The marketing strategies of the big companies joining the open source movement (among them IBM, Intel and Oracle) consist of announcing their investment in open source, creating laboratories specialized in testing new versions of open source software with hardware components, and helping to port open source operating systems to new hardware developments. Another common tactic is to sponsor developers to produce open source code.

On the other hand, Microsoft has hard work ahead to change the bad image associated with its brands. David Stutz - the man formerly responsible for Microsoft's anti-open source strategy - said: "Recovering from current external perceptions of Microsoft as a paranoid, untrustworthy, greedy, petty and politically inept organisation will take years."

