Thursday, February 14, 2008

Apple Computer, Inc.: A History

In many ways, the computer has been the formative invention of the second half of the twentieth century, much as the automobile was for the first half. As rapid technological change has enabled computers to become ever more powerful and at the same time smaller and less expensive, they have moved into every aspect of life. When the first electronic computer, ENIAC, was built in the 1940s, it was a monstrosity of wires and vacuum tubes occupying an entire room and so hungry for power that legend tells of it dimming the lights of all Philadelphia when it was first turned on. Within a matter of years the transistor made it possible to build electronic devices several orders of magnitude smaller than previously imagined, and the subsequent development of integrated circuits further fueled the move to miniaturization. With VLSI (Very Large Scale Integration), it became possible to put the power of the earliest computers on a single piece of silicon, the microchip.

Into this atmosphere of tremendous change came two young men, Steve Jobs and Stephen Wozniak. Each arrived at his interest in computers by a different route, but when they met, their ideas about the role of computing meshed and they began to pursue a common vision. Jobs was a mystic who had come to believe that science and technology, not religious contemplation, were the key to human betterment, yet he continued to pursue his goals with a mystical fervor. Wozniak combined a more practical approach to hardware with a cheerful belief in the innate goodness of humanity. While still a high-school student experimenting with electronics, he came to recognize the important principle that a design must work with the human element to be truly successful.1 The two worked together for a while in a partnership making "blue boxes," illegal devices that enabled people to manipulate telephone systems. After a few brushes with danger and the law, both young men decided to pursue a more legitimate line of business. However, their fondness for tinkering with electronics stayed with them.

In 1976, while working on a video teletype terminal for a friend's computer company, Wozniak began to combine that design with an earlier design for a computer, replacing the panels of blinking lights that had served as a display with a video screen. He then tried to market his design to Hewlett-Packard, but the company saw no future in the idea of a personal computer and turned him down. Steve Jobs then convinced Wozniak that the two of them could start their own business to make this computer. Jobs called the computer an Apple, recalling his days of seeking enlightenment in the orchards of a Hare Krishna commune.2

In August of 1976 Wozniak saw a demonstration of a crude color display called the Dazzler and became obsessed with incorporating a color display into the Apple. At the same time, he wanted to simplify the computer's memory architecture. This led to an innovation in which he made the processor share memory with the raster (video) display. Wozniak realized that a cathode ray tube (CRT) monitor spent a good fraction of its time whipping the electron gun back to start the next line. The processor was allowed to use the memory during this time, while the display had it the rest of the time.3
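The time-multiplexing idea can be sketched abstractly. The following toy model (in Python, with invented slot counts, not the Apple II's actual circuit timings) illustrates the principle: the video circuitry owns the memory bus while a scanline is being drawn, and the processor gets its turn during the retrace intervals.

```python
# Toy model of sharing one memory bus between a CPU and the video
# scanout circuitry. All timing numbers here are illustrative,
# not the Apple II's real cycle counts.

VISIBLE_SLOTS = 40   # bus slots per scanline claimed by video scanout
RETRACE_SLOTS = 25   # slots free while the electron gun flies back

def bus_schedule(scanlines):
    """Yield the owner ('video' or 'cpu') of each memory-bus slot."""
    for _ in range(scanlines):
        for _ in range(VISIBLE_SLOTS):
            yield "video"   # scanout is reading the frame buffer
        for _ in range(RETRACE_SLOTS):
            yield "cpu"     # the CPU accesses memory during retrace

owners = list(bus_schedule(scanlines=3))
```

In a model like this, neither device ever has to wait for an arbiter: each knows from the fixed schedule when the bus is its own, which is what let Wozniak drop separate video memory and its support chips.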

The original Apple computer was made at Jobs' parents' home (where Jobs was living at the time), and sold as a partially assembled kit for computer hobbyists. Software was loaded from cassette tape. When the original interface for loading BASIC (a programming language) failed to work, Wozniak built his own. They also settled on the slogan of "Byte into Apple" for selling their computer. However, they quickly ran into trouble. Hobbyists were more interested in the Altair and other early kit computers, while the computer's $666.66 price drew fire from Fundamentalist Christians alarmed that this was the machine of the Beast. Jobs found a suitably mystical pat answer for the second, but the first problem proved more difficult to solve.4

In 1976, Jobs decided it was time to expand Apple's horizons, and sought contacts with venture capitalists. After an initial rebuff, he made a connection with Armas Clifford Markkula, who agreed to underwrite a bank loan of $250,000 to start a company that would build the Apple II. Thus was born Apple Computer, Inc. Formal papers were filed on January 3, 1977, and within three months the new corporation bought out the former partnership of Jobs and Wozniak which had built the original Apple.5

The newly formed company then moved out of Jobs's garage to new quarters in Cupertino. Now came the challenge of making the company work. Somehow they had to create a market for personal computers beyond the small core of hobbyists who had previously been buying kit computers. They ultimately settled upon a strategy of starting with the hobbyist market and then expanding from that base to pull in professionals with small operations. For this they emphasized the Apple's ability to automate various processes such as controlling appliances.6

Jobs pioneered the Apple II's attractive plastic case, believing that an ugly computer would turn away the very people they were trying to attract. He was determined to avoid the rough home-brew look of many hobbyist computers, and went through several designers before finding one who could create the look he wanted.7

In 1978 Apple introduced the enhanced Apple II (the Apple II+), which was the first personal computer to have a floppy disk drive. Previously size had restricted disk drives to mainframes, while users of personal computers had to make do with cassette tapes, which were slow and frequently tangled or broke.8

Apple also innovated in its first advertising campaign. Instead of restricting itself to the technical periodicals, as had been the practice with previous computers, Apple also bought advertising space in popular press magazines such as Playboy. Its advertisements emphasized the practical utility of a home computer for such things as household finances and income taxes.9

In its early days Apple had a "loan to own" program which enabled each employee to obtain an Apple computer after a year on the job.10 Although contemporary writers seem to think of this as a revolutionary, even New Age, innovation that proves Apple's new humanism, it actually hearkens back to the days of Henry Ford, who paid all his workers at his factory enough that they could actually have a reasonable hope of owning a Model T of their own within a few years.

In 1981 Jobs started toying with the idea of creating a "book-sized" Apple, to be known as the IIb. At first it went nowhere, because he feared that it would compete with the Macintosh. However, after being shown a picture of an early Toshiba portable, he decided to go it one better with a design that would include a built-in disk drive. The technical challenge of fitting all the components into a package little bigger than a three-ring notebook excited him.11

However, not all was well within Apple. Some of the problems were caused by the company growing faster than its organizational structures could handle. Apple stock was a major source of friction in the early company. Some employees received stock options while others did not. Salaried employees had far better opportunities to buy stock than did hourly employees. Sometimes it was easier for well-connected outsiders to buy stock than it was for employees. When the board decided to make Apple a public company in 1980, the friction became even more intense, exacerbated by the fact that Jobs feared losing control to the shareholders.12

Apple actually went public on December 12, 1980. Within that single day, the price of a share of Apple stock went from twenty-two dollars to twenty-nine. Apple was the most successful initial public offering since Ford Motor Company went public in 1956. Apple hit the Fortune 500 faster than any company had previously done, and soon was worth more than many major corporations.13

However, the management crisis within Apple could no longer be ignored. Apple's rapid growth had outstripped its leaders' ability to handle the situation, until it became absurdly bureaucratized, with memos on every conceivable thing, including how to write a proper memo. Jobs was alienating everyone with his intrusive, antagonistic management style. The board finally determined that something had to change, and the best way to do this was to bring in a new chief executive officer. The man they chose for this role was John Sculley, a college-educated PepsiCo executive.14

By 1983 Apple Computer was in serious trouble. The promised Apple III had been a failure, and the Lisa seemed ready to follow it. The entrance of IBM into the personal computer market seemed the death knell for Apple. IBM was putting a port on its personal computers that enabled those machines to "talk" with IBM mainframes; by contrast, Jobs adamantly insisted that each Apple computer be totally self-contained. And an even worse problem was coming -- the Macintosh was aimed at the same market as the Lisa, but instead of pooling resources, the two divisions saw each other as enemies.16

Jef Raskin came up with the idea for the Macintosh in 1979. The central component of this concept was the mouse, which would enable the computer to run on a graphical user interface instead of a command line. When first presented with the idea, Jobs adamantly opposed it. However, he soon came to see its merits. When he did, he took over the project entirely, alienating Raskin, who subsequently left.17

In 1979 Xerox had permitted Steve Jobs and several other leading people from Apple to visit their Palo Alto Research Center (PARC) and witness demonstrations of the Xerox Alto. This was a computer that actually ran a primitive but functioning system based upon the graphical user interface and the mouse. At the time Xerox regarded Apple as a partner rather than a competitor, but once Apple developed the concept in its own way and successfully marketed it in the Macintosh, recriminations quickly followed. These culminated in a lawsuit, which was subsequently thrown out of court on the grounds that the "look and feel" of the interface was too vague to be defined and Apple had clearly not stolen any Xerox code to create the Macintosh operating system.18

The Macintosh was an extraordinarily visionary computer when it was first developed. The Macintosh could be called the first virtual reality machine: its interface was a consistent virtual reality, in which the icons -- tiny pictures on the screen -- represented real things. These symbols -- documents, folders, trash can, etc. -- were borrowed from the familiar reality of the office. In the years since its introduction it has become familiar, but when it was first introduced, it was derided as a toy, unworthy of real computer users. Subsequently its fundamental concept has been incorporated everywhere, not only in other computer operating systems but also in run-time applications such as the control panels of high-end photocopy machines. Although the Macintosh will eventually pass into the electronic graveyard, its legacy will continue.19 Shortly before the introduction of the Macintosh, Microsoft chairman Bill Gates stated that it was the only microcomputer besides the IBM PC that was worth writing software for.20 In retrospect that may seem ironic, since Microsoft Windows and the Mac OS have become fierce rivals. But one must remember that in 1984 there was a wide variety of computers and operating systems, all mutually incompatible, most of which have since been driven from the market by selection pressures, leaving only Windows and the Macintosh operating systems as contenders in the personal computing arena.

Difficulties with hardware design also led to some choices that would prove visionary, particularly in floppy disks. The Mac became the first major computer to use the 3.5-inch microfloppy disk because of a problem with the original "Twiggy" high-performance 5.25-inch drive. The latter was one of Apple's own innovations, but various technical problems made it almost impossible to manufacture reliably. Almost all the drives coming off the assembly line were defective in one way or another, and the few usable ones were woefully inadequate for the projected demand of the Mac. By contrast, a Japanese company known as Alps was prepared to produce a new drive which Sony had developed. This used a smaller disk enclosed in a rigid plastic case with a metal shutter, more robust than the paper-sleeve floppies, which could lose data simply by being bent in carrying.21 That too has since become the norm, so much so that many people have forgotten how innovative the microfloppy was at the time, and how pervasive the 5.25-inch disk once was.

The public got its first inkling of the Macintosh on January 23, 1984, when Apple ran an enigmatic Orwellian advertisement during the Super Bowl. This video clip featured a young woman rebel running into a theater full of staring human drones and smashing a giant telescreen on which Big Brother yammered away. The following Monday, Steve Jobs unveiled the Macintosh to the world with a demonstration of a program which allowed the computer to synthesize digital speech.22

The early acceptance of the Macintosh was hampered by the lack of application software, a deficiency compounded by Apple's strict rules on the distribution of materials for software developers. Because Apple was refusing to allow mail-order companies to sell Macintoshes, it also forbade mail-order distribution of Inside Macintosh, the guidebook to the Macintosh architecture which was the programmer's bible for the system. Apple also put very tight restrictions on the Apple Certified Developer Program, by which software developers were able to obtain Macs at a discount and receive informational materials. A few daring employees worked around these restrictions to get development materials into the hands of every software programmer possible, on the theory that each Mac out in the field would help sell additional Macs simply through its presence.23

The Mac posed some difficult challenges for programmers accustomed to computers that were restricted to displaying standard ASCII characters in a grid of twenty-four lines by eighty columns. Because of the Mac's graphical orientation, programmers now had 160,000 individual pixels to deal with, and had to manage them in a memory space so limited that they kept running out of it, leading to freezes and crashes. Jobs' insistence on a uniform look and feel across the platform meant that many features were burned into the ROM rather than being open for programmers to tweak. As compensation, five hundred basic routines were stored in ROM, but programmers had to learn how to invoke this genie properly in order to use it well.

Initial sales of the Macintosh lagged far behind expectations. Hoping to improve the figures, the Macintosh division came up with a revolutionary marketing promotion in the fall of 1984. Called "Test Drive a Macintosh," it allowed people to borrow Macs from their computer dealers to try out. The idea was to get potential customers hooked on the Mac interface so that they would buy the machine rather than return it. Unfortunately the promotion coincided with the Christmas rush, and dealers simply wanted no part of the additional hassle involved in administering loaner units. In retrospect, Mac evangelist Guy Kawasaki called it "Right thing, wrong way."25 The "Test Drive a Macintosh" program was also problematic in that it was aimed primarily at consumers instead of businesses, and it left potential business customers feeling that Apple lacked discipline and focus.

