June 2001

From Mainframe to Minicomputer

In the original mainframe model of doing business, the customer bought everything from a single vendor. The systems vendor supplied the hardware, and then--because the hardware needed an operating system and applications--the vendor supplied those as well. Because the hardware was unique, and because its software would run on no other system, the customer was a captive of the vendor and would have to rebuild an entire computer system from scratch in order to change vendors.

Customization was done by vendors, their dealers, and by the in-house experts the customer needed to run a mainframe system. Source code was often made available to these experts to ensure round-the-clock availability and reliability.

When the smaller and cheaper minicomputers came to market, the vendors managed to keep customers captive, even though most of them used UNIX, an operating system designed to be easily modified so that its applications would run on a wide variety of computers. The vendors simply rewrote parts of UNIX to prevent customers from running other vendors' software on their systems.

The PC Upsets the Old Model

The original microcomputer market was as fragmented as the minicomputer market; although many of the vendors used the Z-80 chip and the CP/M operating system, software could rarely be moved from one system to another. Apple entered the market with its own operating system and a Motorola chip, and IBM hurried in with its own microcomputer, the IBM PC, built on an Intel chip and running a CP/M-like operating system called PC-DOS (Microsoft's MS-DOS). So far the business practices duplicated those of the minicomputer market: hit your competition from below with cheaper systems, and keep the customers locked in.

If IBM had not been in such haste to enter the microcomputer market, things might have stayed the same. But IBM's haste kept it from developing proprietary hardware; the only part of the IBM PC that could not be bought off the shelf was the BIOS. Once other companies figured out how to clone the BIOS without copying it directly, they rushed into the suddenly exploding market. IBM had accidentally demonstrated the power of open standards in hardware. This power pushed Intel-based microcomputers running MS-DOS into market supremacy, while condemning Apple (which fought every attempt to clone its hardware) to niche status. The microcomputer companies based on other standards died.

The Current Workstation, Server, and PC World

The opening of the hardware standard for PCs (some of which eventually became servers) brought an explosion in hardware vendors and--for a while--a flowering of third-party applications software vendors. Customers now had the power to shop among vendors and buy the software best suited to their purposes. What the new market left unchanged from the minicomputer world was the need to customize vendor products for individual customers.

Large customers might persuade vendors to make changes in products; small customers could only hope. The vendors' desire to have unique software products led them to keep source code secret. Customers depended on internal resources or on integrators to make software systems perform with the chosen components.

The component model, or modularity, is very important to users. The smaller the units of software, the more opportunities for system customization, even if the customer does not have the source code. But it takes time to understand how to work with software applications and operating systems whose insides are hidden, and the luxury of taking that time is rapidly disappearing: business practices are evolving with new technologies that must be deployed ever faster to keep pace with competitors. And software vendors are fighting the best-of-breed shopping model by extending the capabilities of their software: operating systems and office suites absorb the functions of the third-party applications that formerly surrounded them; mail programs evolve into groupware, and groupware evolves into workflow systems. Customers bring their increasingly complicated system problems to a dwindling number of vendors offering a more limited range of applications.

As this vendor lock-in increases, customers struggle for more freedom and pin their hopes on various solutions: re-usable objects, cross-platform applications, modularity within applications, and middleware to integrate their chosen applications with other applications and systems from third parties. All of these solutions would be made easier with open standards.

The Internet as a Model

There is one place we can look for an example of open standards leading to rapid technological progress and equally rapid adoption by users--the Internet and the World Wide Web. Because these projects were international, government-sponsored, and research-oriented, their bias was towards open standards and software. Like the IBM PC, they demonstrate the power of open standards to roll over closed, proprietary products. Electronic mail is an example: the Internet's open SMTP standard rolled over both the proprietary commercial mail systems and the rival X.400 standard, and vendors had to build gateways so that their systems could exchange mail with the Internet.

Is Open Source Inevitable?

Open Source software is already here, making rapid progress in the server market. The only question is how far and how fast it will spread, both in that market and beyond it. According to the market-research firm IDC, over half the servers on the Web run Open Source software, and of the 6.1 million servers sold in 2000, 27% were Linux servers (up from 25% in 1999). The Windows share of the server market was 41% in 2000, up from 38% in 1999.

The advantages of Open Source software for your business (as given in last month's piece) provide the driving force in this market. The big reason is reliability, the result of wide use and bug testing as well as the ability to call on development manpower and specialized skills that cannot be found in any one company. The other two reasons are low cost and customization, and all these reasons are intertwined in ways you might not expect.

The ease of customization is not only the result of having the source code available with the software; it is also a result of the development process, which requires that the software be modular so that the work can be farmed out among numerous developers. The low-to-zero cost of the software itself is the other component of customization: if selling 27% of the server operating system market nets the vendors only 1% of the money spent in that market, who gets the other 26%?

You do. If you budgeted for another operating system but put in Linux instead, you now have about 95% of that budget available for something else. If you are like many businesses, you will put some or all of that leftover money into system customization, to make the system run the way your business really wants it to run. You can use your own people or hire outside expertise to do the work. It is easy to see why Linux's server market share is growing so rapidly (and the IDC figures count only the sale of new systems with Linux installed, not the conversion of existing systems).
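As a back-of-the-envelope sketch of the budget arithmetic above (the per-server prices below are illustrative assumptions, not figures from this article or from IDC):

```python
# Back-of-the-envelope sketch of the "who keeps the money" argument.
# Both prices are assumed for illustration, not real market data.

proprietary_license = 1000.00  # assumed per-server cost of a proprietary OS
linux_distribution = 50.00     # assumed price of a boxed Linux distribution

savings = proprietary_license - linux_distribution
fraction_retained = savings / proprietary_license  # share of OS budget kept

print(f"Budget retained per server: {fraction_retained:.0%}")
# prints: Budget retained per server: 95%
```

On these assumed prices the customer keeps about 95% of the operating-system budget, which is the figure used above; different assumed prices shift the percentage, but the argument is the same.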

So far Linux is a server phenomenon. In the desktop world Microsoft has a 92% share, Mac OS 4%, and Linux 1%. But you have seen the powerful forces that drive the adoption of Open Source software and open standards. Some people believe that eventually all software will be Open Source because customers will be so accustomed to its advantages that they will buy nothing else. I believe that proprietary software does have a future, but that in order to appeal to a growing Open Source world it will have to adapt to Open Source characteristics of modularity, use of open standards, and even provision for some sort of access to source code.

Copyright © 2001 by Donald K. Rosenberg, Stromian Technologies.

