Charles Kafoure is a technology consultant based in Indianapolis. He has been in the business of providing equipment, software, and services for twenty-five years. He has managed the establishment of large, turnkey computer projects in Korea, China, Singapore, Japan, the U.S., other Asian and European countries, and Australia. He currently focuses on technology services for legal, real estate, and other professional businesses. He will soon publish an article entitled “Litigation Management: Organize Using Project Management Methodology.”
(Archived September 15, 1998)
Our lives, and those of our children, are going to be profoundly affected by new implementations of technology. With its prominent (some say monopolist) role in the world of technology, Microsoft presents significant challenges in the coming years. Because of the speed at which technology changes, we may not be able to catch up in time; therefore, we must not only anticipate, but also act.
During the 1970s, AT&T and IBM were two powerful names in two entirely different kinds of business. AT&T was still the “phone company”, providing all the long distance service, and most of the local service, to homes. Its mission was almost exclusively to carry voice around the country. IBM, on the other hand, was the computing giant of the decade. Its mainframes dominated the landscape, first with the end of the 360 line and then with the 370. Its lead was unchallenged, with Burroughs, GE, Honeywell, Univac, Sperry, and a few other fringe companies splitting the remainder of the marketplace. Its duty (or, more properly, that of its systems) was processing. Its mainframe systems received data, processed it, stored it, and returned processed data to the users. At the end of the ’70s, however, IBM began to find value in telecommunications, and AT&T found value in the movement and storage of data. Soon after, AT&T was broken up, and its successor companies continued its march toward data traffic.
Pundits began saying that AT&T and IBM would be indistinguishable by the year 2000. They turned out to be right, and not just because both companies lost their way and are rediscovering themselves in one way or another. The pundits were referring to the product/service sets of the two businesses. Their prediction has come true because the definition of computing now includes the source, processing, storage, access, and movement of data, and because the location of data has become irrelevant. Most specifically, data and processing have become inseparable; in some cases, the tools to process data have become a part of the data itself.
Necessity is the mother of invention: anthropological factors in computing
Technology is rarely driven by itself; human wants and needs drive development. From the 80s through the present day, human beings have been on a steady course to regain their individuality, and to stand apart from the mega-corporations with which they were accustomed to being associated. There is irony here, in that the trend is toward larger, not smaller, corporations, but individuality within those organizations continues to grow. The revolution surrounding the individual, originally championed by Steve Jobs and others, continues to this day, giving individuals more input and more responsibility in their own lives and in those of their enterprises.
What has really changed
When applied to computing, this means that the king of the IT world is now the individual. The individual’s responsibility is not only to produce data and interpret reports, but to have complete control over the process, from source, to storage, to access, all the way to processing and interface with the world. No longer is the individual chained to a desk, taking orders from a tyrannical boss who dictates every move. “Bosses” of this era, at least the good ones, are empowerers: they believe that they and the enterprise will grow if the individual expresses his will. Technological growth has responded to this desire by providing the individual with the tools to perform all data processing functions.
Along with that empowerment comes responsibility; those responsible employees carry the weight of the organization on their shoulders. They are responsible for what was formerly their DP department’s work. If “publish or perish” is true for academics, then “manage data or perish” is true for the “rest of us.”
The rulers of the computing world are now the users. In an unprecedented shift of power, users, both corporate and individual, are the kings and queens of the information universe. They are no longer constrained, as they were in the Central Era, to whatever the DP Department imposed on them. They are no longer subject to the restrictions of whatever information and power happened to be at their desktop or on their LAN, as in the Personal Era. The world is theirs. They (we) are empowered to provide all the information and processing needed to assure their continued operation. In fact, the quality and quantity of the data they process sometimes define that operation. They have access to their own data, as well as that of any other enterprise which chooses to grant them access, and all with minimal support. Therefore, the contents of the public network, and the tools to access it, have suddenly become our defining element. Content and access must be protected. The computing dream described above, that of free and unfettered access to data on the network, will be a nightmare without this protection.
In the Central Era, the mainframe and OS needed protection from the omnipotent IBM, which controlled most business computing at the time. The market afforded us that protection, leading to the Personal Era. In that era, the desktop needed protection from Microsoft, which dominated it. We didn’t take care of that very well, partly because the Personal Era occupied a fairly small slice of our history. The marketplace quickly migrated to the Public Era, closing the Personal chapter forever, thereby voiding any “monopoly” Microsoft had on the marketplace. Emphasis now shifts to the public network, and away from the desktop. The desktop has suddenly become less important. The network is in. Content is in. The desktop is out.
Knowing that the paradigm of computing has changed, and that control of data means control of the enterprise, whatever its definition may be, how we approach data control is most important. If one company controls the data itself, along with the tools to manipulate that data, we will be screwed. Let’s have a look at the ideal.
As we have discussed, there is a complex interaction among information, access to information, and processing of information. It all now falls under the realm of computing; there is no longer any separation. In order to be certain that party A (who can now be any user on the network) can use data from source B (ditto), we need standards we can live with:
- We need to be able to read data that belongs to someone who wants us to read it.
- We need to be able to move data on the network at will.
- We must have a common set of tools with which to process data: middleware with embedded processing tools, a common display code, like HTML, and a common method of running programs locally, like a virtual machine.
One company dominating an industry such as banking must not shackle us, because if it controls computing as well, one party will control crucial elements of our society.
Artificial standards, however desperately we might want them, produce Least Common Denominator (LCD) software. Further, standards organizations would unwittingly promote one platform or another, since the “standard” software would run best on one platform or another.
Without writing a treatise on how to develop and market software: software wins because it is the most useful software for a good price. It is no more complicated than that. If Microsoft writes the software that most people use, then Microsoft controls the standard. The marketplace will guide that development, and will turn elsewhere when unsatisfied. Remember WebTV? The only reasonable demand in this area is for Microsoft to construct true Chinese walls. We are a society that believes in fair play, and not having such walls between application and OS (Internet and desktop) development is not fair. Sooner or later, we the people will rebel against this. Recommended solutions follow in Part III.
So what is left? Content.
Since we have determined that content and tools are both necessary for 21st century computing, and that regulating the tools would produce undesirable effects, we must regulate Microsoft out of the content business. It is the only solution. Part III will examine how to do this.
“I got by with a little help from my friends…”
Thanks to Duncan Kinder, Jerry Lawson, Professor Dan Gifford, University of Minnesota, salespeople (oops, I mean PR) from Sun and Novell, the same friends as last time, and…
…and a bit from Groucho Marx
The funniest guy of the 20th century once said, “I wouldn’t want to join a club that would have me for a member.” I wonder why Microsoft would want to belong to the Software Publishers Association. The SPA published a “white paper” entitled “Competition in the Network Marketplace: The Microsoft Challenge.” There were a couple of compelling technical arguments in that paper, certainly more than the DOJ had to offer, but it was a bit disturbing from several perspectives:
- It offered no conclusion or recommendation
- It contained arguments that could be understood only by very technical people
- It was unendorsed by the SPA’s membership
The only person whose name was associated with the paper, Lauren Hall, Chief Technologist of the SPA, was very forthcoming and willing to discuss technical issues in detail. Her rhetoric was clearly anti-Microsoft, yet, in seeming incongruity, she is studying for NT certification.
Sun joined the SPA in early 1998, about the time the report was being written.