
Everything here is my opinion. I do not speak for your employer.

2019-02-07

Quotes from 1992

Someone recently recommended the book Accidental Empires by Robert X. Cringely, first published in 1992 (or was it 1991?), apparently no longer in print, and never released as an e-book. To my surprise, it turns out archive.org has a solution for this: an epically user-unfriendly "virtual library card" (still worth it if you need to read a book). They apparently receive one physical copy of a book, scan it, and lend it out digitally, one person at a time, using an aggressively user-hostile DRM abomination called Adobe Digital Editions.

(I'm not kidding about the user hostility. The iOS version, for no reason, requires you to double-tap to open a book from the list. But, like double-clicking with a mouse, the second tap doesn't count if you're more than a few pixels off. I hope you have really precise fingers! I suspect the people who wrote it never ran it on real iPhone hardware, and therefore tested it with a mouse.)

ANYWAY

It's a pretty fascinating book, snapshotting the microcomputer industry (and its history) as things sat back in 1991 (or was it 1992?). It surprised me how many things are just like they were back then, 27 years ago (yikes!). The names have changed, but human organizations remain the same.

(He also serialized and updated the book on his blog back in 2012-2013, but it's really long and I didn't have time to read it, and apparently that version also didn't come out in ebook format. I would have happily paid for it to read in a more convenient format. Oh well.)

Here are some quotes I painstakingly retyped. Take that, DRM garbage.

On avoiding adulthood:

They weren't rebels; they resented their parents and society very little. Their only alienation was the usual hassle of the adolescent - a feeling of being prodded into adulthood on somebody else's terms. [...] And turning this culture into a business? That was just a happy accident that allowed these boys to put off forever the horror age - that dividing line to adulthood that they would otherwise have been forced to cross after college.

On big things failing to scale down:

How did we get from big computers that lived in the basement of office buildings to the little computers that live on our desks today? We didn't. Personal computers have almost nothing to do with big computers. They never have. [...] Big computers and little computers are completely different beasts created by radically different groups of people. It's logical, I know, to assume that the personal computer came from shrinking a mainframe, but that's not the way it happened.

On amateurs as the carriers of progress, predicting the success of open source:

It takes new ideas a long time to catch on - time that is mainly devoted to evolving the idea into something useful. This fact alone dumps most of the responsibility for early technical innovation in the laps of amateurs, who can afford to take the time. Only those who aren't trying to make money can afford to advance a technology that doesn't pay.

On the surprising existence of limited ambition:

Let's say for a minute that Eubanks was correct, and Gary Kildall didn't give a shit about the business. Who said that he had to? CP/M was his invention; Digital Research was his company. The fact that it succeeded beyond anyone's expectations did not make those earlier expectations invalid. Gary Kildall's ambition was limited, something that is not supposed to be a factor in American business. If you hope for a thousand and get a million, you are still expected to want more, but he didn't.

On the Chief Scientist job title:

In a personal computer hardware or software company, being named chief scientist means that the boss doesn't know what to do with you. Chief scientists don't generally have to do anything; they're just smart people whom the company doesn't want to lose to a competitor.

"Research subjects" at PARC, the original doesn't-count-as-headcount TVCs:

Money wasn't a problem, but manpower was; it was almost impossible to hire additional people at the Computer Science Laboratory because of the arduous hiring gauntlet and Taylor's reluctance to manage extra heads. [...] Simonyi came up with a scam. He proposed a research project to study programmer productivity and how to increase it. In the course of the study, test subjects would be paid to write software under Simonyi's supervision. [...] By calling them research subjects rather than programmers, he was able to bring some worker bees into PARC.

On indoctrination of recent graduates:

Through the architects and program managers, Gates was able to control the work of every programmer at Microsoft, but to do so reliably required cheap and obedient labor. Gates set a policy that consciously avoided hiring experienced programmers, specializing, instead, in recent computer science graduates.

On the dangers of only hiring geniuses:

Charles Simonyi accepts Microsoft mediocrity as an inevitable price paid to create a large organization. "The risk of genius is that the products that result from genius often don't have much to do with each other," he explained.

On the value of limiting yourself to standards:

...which was why the idea of 100 percent IBM compatibility took so long to be accepted. "Why be compatible when you could be better?" the smart guys asked on their way to bankruptcy court.

The social effects of frequent reorgs:

The rest of the company was as confused as its leadership. Somehow, early on, reorganizations - "reorgs" - became part of the Apple culture. They happen every three to six months and come from Apple's basic lack of understanding that people need stability in order to be able to work together. [...] Make a bad decision? Who cares! By the time the bad news arrives, you'll be gone and someone else will have to handle the problems.

On survival in a large organization, which is very different from entrepreneurship:

I learned an important lesson that day: Success in a large organization, whether it's a university or IBM, is generally based on appearance, not reality. It's understanding the system and then working within it that really counts, not bowling scores or body bags.

An interesting organizational structure, where Bill Gates was the chairman but hired a president who would be responsible for everything except software development:

This idea of nurturing the original purpose of the company while expanding the business organization is something that most software and hardware companies lose sight of as they grow. They managed it at Microsoft by having the programmers continue to report to Bill Gates while everyone on the business side reported to Shirley.

On how Flight Simulator got approved:

Then there was Flight Simulator, the only computer game published by Microsoft. There was no business plan that included a role for computer games in Microsoft's future. Bill Gates just liked to play Flight Simulator, so Microsoft published it.

On Novell's very strange history, and why it was so unusually good for its era (it really was, too. That thing was great):

The early versions of most software are so bad that good programmers usually want to throw them away but can't because ship dates have to be met. But Novell wasn't shipping anything in 1982-1983, so early versions of its network software were thrown away and started over again. Novell was able to take the time needed to come up with the correct architecture, a rare luxury for a start-up, and subsequently the company's greatest advantage.

A more specific version of the "Microsoft takes three tries to get anything right" theory:

Microsoft's entry into most new technologies follows this same plan, with the first effort being a preemptive strike, the second effort being market research to see what customers really want in a product, and the third try is the real product.

On industry analysts giving you the numbers you want to hear:

...the question, which was: When will unit sales of OS/2 exceed those of DOS? The assumption (and the flaw) built into this exercise is that OS/2, because it was being pushed by IBM, was destined to overtake DOS, which it hasn't. But given that the paying customers wanted OS/2 to succeed and that the research question itself suggested that OS/2 would succeed, market research companies like Dataquest, InfoCorp, and International Data Corporation dutifully crazy-glued their usual demand curves on a chart and predicted that OS/2 would be a big hit. There were no dissenting voices. Not a single market research report that I read or read about at that time predicted that OS/2 would be a failure.

On Bill Gates's annual reading weeks:

Annual reading weeks, when Gates stays home and reads technical reports for seven days straight and then emerges to reposition the company, are a tradition at Microsoft. Nothing is allowed to get in the way of planned reading for Chairman Bill.

A partially-failed prediction about Steve Jobs:

(When InfoWorld's Peggy Watt asked Gates if Microsoft would develop applications for the NeXT computer, he said, "Develop for it? I'll piss on it.") Alas, I'm not giving very good odds that Steve Jobs will be the leader of the next generation of personal computing.

A little-known partnership between IBM and Apple to try to make a new OS (Pink) that would finally beat DOS:

IBM has 33,000 programmers on its payroll but is so far from leading the software business (and knows it) that it is betting the company on the work of 100 Apple programmers wearing T-shirts in Mountain View, California.

A perception of industry fatigue in 1991-1992, which was around the time almost everyone gave up competing with DOS+Windows. Interestingly, this is also when Linux arrived (produced by "amateurs", see above) and may have rejuvenated things:

But today, everyone who wants to be in the PC business is already in it. Except for a new batch of kids who appear out of school each year, the only new blood in this business is due to immigration. And the old blood is getting tired - tired of failing in some cases or just tired of working so hard and now ready to enjoy life. The business is slowing down, and this loss of energy is the greatest threat to our computing future as a nation. Forget about the Japanese; their threat is nothing compared to this loss of intellectual vigor.

A rather bad-seeming idea for "software studios," which maybe seemed like a good idea at the time, but we've kinda tried it since then and it has a lot of unexpected downsides:

[Comparing to Hollywood studio structure] In the computer business, too, we've held to the idea that every product is going to live forever. We should be like the movies and only do sequels of hits. And you don't have to keep the original team together to do a sequel. All you have to do is make sure that the new version can read all the old product files and that it feels familiar.

An interesting perspective on the (then still in progress) takeover of computer hardware manufacturing by Asian countries. China wasn't yet even on the radar. (His claim was that it didn't matter because software was so valuable and stayed in America. 27 years later, that prediction has held up okay, although Asia seems to be doing fine with that hardware stuff):

The hardware business is dying. Let it. The Japanese and Koreans are so eager to take over the PC hardware business that they are literally trying to buy the future. But they're only buying the past.

(All above quotes by Robert X. Cringely)
