Computing At Chaos Manor:
June 30, 2008

The User's Column, June, 2008
Column 335, Part 2
Jerry Pournelle jerryp@jerrypournelle.com
www.jerrypournelle.com
Copyright 2008 Jerry E. Pournelle, Ph.D.


A note on scheduling: I am just climbing out of the general malaise caused by 50,000 rad of hard x-rays. The good news is that the treatment seems to have worked. The Big Mess in my head is much diminished and perhaps dead — they'll do more MRI to find that out — but my voice is back, my energy is up, and I'm getting some work done.

The schedule for Chaos Manor Reviews will remain: A major column divided into two segments each month, and the mail bag as warranted by the quality of the mail.

I owe an apology to Intel, and others: I have all the parts for a top-notch, enterprise-level machine based on the Intel Extreme Quad CPU and the Intel DX48BT2 Desktop Board. I have a WD VelociRaptor (link) drive, 4 GB of Kingston memory, an Antec case and 550-watt power supply, and everything else to build a screaming machine suitable for enterprise developers. (I don't have a video board to do this Extreme system justice, but that's a matter of time. Those mostly interested in the kinds of games that need that sort of video board will have other sources anyway.)

Games are not the only reason one would want an Extreme system. The CPU-intensive activities at Chaos Manor include running inBoxer (an adaptive Bayesian learning spam filter I highly recommend) and dozens of Outlook rules; when a hundred emails pour in at once (and that happens) this can bring most systems to a dead halt. It can even slow the Quad 6600, although not to a stopping point. I have reports from others with similar workloads but using Intel Extreme systems with Vista Ultimate, and they are all positive.

Microsoft Outlook is a wonderful system for organizing your work, and inBoxer learns new spamming tricks faster than anything else I have tried, but Outlook plus inBoxer eat CPU cycles like nobody's business. The latest Intel systems can handle what I have found to be one of the most difficult non-graphics workloads any of us will face. My Quad 6600 will continue compiling, let me work on a document, and listen to TWIT even when there is a mass of incoming mail. Sometimes the TWIT broadcast will slow, but the system doesn't crash. With the Extreme I doubt I'll ever know anything is happening.
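
Since inBoxer's internals are not published, here is a minimal sketch of the sort of adaptive Bayesian token scoring such filters do. This is not inBoxer's code; the class, the tokenizer, and the training sentences are my own illustration, but it shows why re-scoring every token of every message, on top of dozens of Outlook rules, adds up fast when a hundred emails land at once.

    # Toy adaptive Bayesian (naive Bayes) spam filter: an illustration only,
    # not inBoxer's actual algorithm.
    import math
    import re
    from collections import Counter

    class ToySpamFilter:
        def __init__(self):
            self.spam_tokens = Counter()   # token counts seen in spam
            self.ham_tokens = Counter()    # token counts seen in legitimate mail
            self.spam_msgs = 0
            self.ham_msgs = 0

        @staticmethod
        def tokenize(text):
            return re.findall(r"[a-z0-9$']+", text.lower())

        def train(self, text, is_spam):
            """The adaptive part: update token counts from a classified message."""
            tokens = self.tokenize(text)
            if is_spam:
                self.spam_tokens.update(tokens)
                self.spam_msgs += 1
            else:
                self.ham_tokens.update(tokens)
                self.ham_msgs += 1

        def spam_score(self, text):
            """Score a message: sum log-likelihood ratios over every token."""
            log_odds = math.log((self.spam_msgs + 1) / (self.ham_msgs + 1))
            spam_total = sum(self.spam_tokens.values()) + 1
            ham_total = sum(self.ham_tokens.values()) + 1
            for tok in self.tokenize(text):
                p_spam = (self.spam_tokens[tok] + 1) / spam_total
                p_ham = (self.ham_tokens[tok] + 1) / ham_total
                log_odds += math.log(p_spam / p_ham)
            return 1 / (1 + math.exp(-log_odds))   # squash to a 0..1 score

    filt = ToySpamFilter()
    filt.train("cheap pills buy now limited offer", is_spam=True)
    filt.train("the column is late again, see you at the coffee shop", is_spam=False)
    print(filt.spam_score("buy cheap pills now"))          # high score
    print(filt.spam_score("see you at the coffee shop"))   # low score

Multiply that per-token arithmetic by a vocabulary of tens of thousands of learned tokens and a burst of a hundred incoming messages, and it is easy to see where the CPU cycles go.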

I realize I have made scheduling promises that I have only barely kept, and I am way behind on building up the Extreme system; but since my use of the Extreme will be in more enterprise-like tasks, and the gamers already have their review sites, I don't feel that I have to commit seppuku with a Pentium III chip...

Column Begins Here

There are two fascinating rumors about Vista: first, that Bill Gates, who is formally retiring from Microsoft management, really disliked the early shipping versions of XP (and quite possibly Vista); and second, that Intel has made the decision not to install Vista on its internal systems, but doesn't want to make a formal announcement because of its relationship with Microsoft.

I thought this would be an item for the mailbag rather than the column, but on reflection there are some important implications here.

Begin with a fairly sensational article in the New York Times:

Et Tu, Intel? Chip Giant Won't Embrace Microsoft's Windows Vista, by Steve Lohr

http://bits.blogs.nytimes.com/2008/06/25/et-tu-intel/index.html

Intel, the giant chip maker and longtime partner of Microsoft, has decided against upgrading the computers of its own 80,000 employees to Microsoft's Vista operating system, a person with direct knowledge of the company's plans said.

This was quickly followed by a number of articles pointing out that it took Intel at least this long to move to XP; that most large companies, with a great number of seats, take eighteen months to two years to do any significant operating system migration; and that even then it is done department by department, not company-wide. One example comes from the Seattle Post-Intelligencer (link).

Changing operating systems is a Big Deal, and changing to Vista appears to be even more so. Chaos Manor Associate Eric Pobirs puts it very well:

It would be far more newsworthy if Intel were making any major deployment of an OS so shortly after its release. I'm currently working in a contract position for a project deploying new hardware and updating the image on still-under-warranty machines at Wells Fargo branches. With close to 6,000 retail locations, this represents a lot of machines: well over 100,000, plus whatever number are in the corporate offices. Part of this project is the elimination of Windows 2000 via an image based around XP.

So this massive bank will only have shifted from a nine year old OS to a seven year old OS by the scheduled conclusion of the project.

Many article writers make reference to businesses skipping Vista in favor of Windows 7, as it is currently called. Since Microsoft has already stated that Windows 7 will draw heavily on Vista rather than being all new, as some previously misreported, this will be no hardship for Redmond if Win7 makes its announced release date. For companies large enough to deploy a unified image, it would be remarkable if they hadn't skipped over several Windows releases in the past, simply because the consumer sector moves so much faster than corporate IT.

Considering the current shift to XP, an operation like Wells Fargo won't be bypassing Vista in favor of Windows 7. The new hardware we're installing would run Vista just fine, but that's not because Wells Fargo overspent on the machines. It's just what an entry-level machine delivers for cheap these days. Rather, Vista, and possibly Windows 7, will have come and gone in the consumer market before the bank next has a deployment that includes a new OS in the image.

A lot of the misperception comes from the longevity of XP. If a company running Mac OS or Linux stuck with the same image for a few years, consider how many releases they'd be skipping. It wouldn't be a comment on those OS versions that came in between. It would just be the normal course of business.

Even so, there may be more to the Intel story than meets the eye. One major factor is that Apple went over to Intel. There is now a real choice of operating systems for Intel chip users. Peter Glaskowsky notes that this isn't true for enterprise systems. I'm not so sure; we'll discuss that later.

More: all versions of Windows will run nicely on Apple systems, and not just by dual boot. I have Windows XP as an application under VMware Fusion on my iMac 20. I have many reports of the stability of XP under VMware. One is from an enterprise outfit that writes FPGA tool-chains for hardware and software design on XP under VMware on both 4- and 8-core Mac Pros. This is enormously compute-intensive.

True, the real work is done on a Big Mac — a quad-core Mac Pro — but so what? That system is available today. Indeed, I have enough reports from both business and enterprise houses that I have considerable confidence in using XP as a Mac application under VMware. If it's this good now, it can only get better.

At the moment I only have VMware and XP on the iMac 20, but I will soon have it running on the MacBook Pro (at which time the MacBook Pro becomes my main travel laptop, for obvious reasons).

With the MacBook Pro I will be able to do, under XP in VMware, everything the ThinkPad T42p does in XP. That means Outlook, Word 2004, and just about anything else I carry to work on. My major writing machine when I am not at my desk remains the MacBook Air; it is still the computer I am likely to have with me in a coffee shop or medical waiting room. It's great for that, but it's not a full-service computer: when I go on the road, I need considerably more computing power. That used to be the ThinkPad T42p, but in future it will be the MacBook Pro.

At the moment, applications generally run on a particular operating system. Clearly that will change. As the hardware improves, the operating system becomes less important. Intel knows this.

Travel and the Border Patrol

As an aside, my travels aren't likely to include crossing borders, since the US Government now apparently has the right to confiscate one's laptop so that its incredibly skilled and careful technicians can copy and examine all its contents in the relentless fight to protect Americans from terrorism, child pornography, and political incorrectness. This is done randomly and without probable cause, and given the desperate measures to avoid the appearance of profiling, it shouldn't be long before retired generals and aged grandparents receive their attention. So far TSA hasn't asserted any such right over domestic travelers, but we haven't heard from the outlying precincts yet. Of course I could encrypt everything and somehow forget the password, but that would result in having the computer confiscated. Is this an argument for keeping all your files on remote sites? They'd be encrypted there. Do remote files show up on your laptop? Clearly you'd be better off in that respect if they did not. I doubt that the TSA will search your paper logbook.

Advisor Rick Hellewell says that this is coming to the attention of Congress:

Two U.S. senators called on U.S. Customs and Border Protection (CBP) to back off its assertion that it can search laptops and other electronic devices owned by U.S. citizens returning to the country without the need for reasonable suspicion of a crime or probable cause.

More of this article at this link.

....Rick...

We can hope so.

My Advice on Vista

First, with few exceptions, you should not upgrade an existing system running XP. Vista needs resources, and some tweaking, and while I know of some successful upgrades, I know even more horror stories. One problem is that there's no way to test the upgrade in advance: your upgraded system can bite you in unexpected ways.

That said, there is no good reason not to buy a new mid-level machine (Core 2 Duo, 2 GB of memory) with Vista installed if what you want is a Windows machine. My own preference would be to get a high-end Mac and VMware, but then I do a lot of strange things. The fact is that many people and businesses are quite happy with the current Vista — provided that they have adequate hardware and it's properly tuned.

Of course you will have the very real problem of which Vista to choose. You can get a first-cut picture from Microsoft here... but it won't really help. More technical details are available in this ExtremeTech article, but that's the problem: the details may be overwhelming.

In my case I use Vista Ultimate and have done with it. I do have a Lenovo TabletPC, which runs a lesser grade of Vista, and I do seem to have more networking glitches with that than I do with Ultimate systems. In the old days at BYTE we would have had all versions running and compared them, but those days are gone. My tentative advice is that since you're going to be using this OS for several years, get Ultimate and be done with it. It costs $200 more, but getting Ultimate gives you one less damned thing to worry about. Fair warning: I get Ultimate free, so my preference for Ultimate is not based on cost-effectiveness.

The usual trend in these matters is for Microsoft to add features as time goes on. It is my belief this is a lot easier to do if you start with Ultimate than with a deliberately crippled version.

And having said all that, I think the myriad of Vista versions is a shameful thing, and Microsoft ought to go back to being a code house that sells the best code it can generate, not crippled versions.

Bill Gates Retires

It's the end of an era: Bill Gates will no longer take a close interest in the affairs of Microsoft. He remains Chairman of the Board, which means he could return to demand an accounting at any time; the question is, will he?

We had a long discussion on this on TWIT (link) 149, "Bill Gates is not evil." My colleague John Dvorak says it's all a sham, and we'll soon see more of Gates than ever. I disagree. I think Bill Gates has fulfilled most of his Microsoft ambitions. He always wanted to build a big and powerful company, but it was more than that: unlike anyone else in the early '80s, Gates had a vision that wasn't confined to ambition. He wanted to see a computer on every desk, and in every home, and in every classroom.

No one else had that dream. IBM wanted to see 100,000 computers throughout the world, all of them running software leased from IBM. There were various other visions, but Gates was the only one who saw just how universal computers would be. Now it has happened. Changing the world by getting more people to appreciate computers is no longer the first thing he wants to do, because he has already done it.

In the middle of all that came the CDROM, a way to distribute enormous amounts of information. Gates saw its potential, and started a series of conferences that made CDROM ubiquitous. Now, true: technology advanced faster than anyone, including Gates, foresaw. The Internet has made the CDROM obsolete, and it took Microsoft a while to realize that. Still, the CDROM made it clear that information could be universal, not confined to libraries but distributed everywhere. Anyone could have access to almost any information, including primary sources — copies of the Dead Sea Scrolls, as an example. That changed the world, and Gates can take much of the credit.

He has been the richest man in the world (and very nearly still is). He has seen his dream of universal computer power go a long way to fulfillment. What's left?

Well, a Nobel Peace Prize, to begin with. His humanitarian efforts are likely to achieve that. That shouldn't take much longer. He will get his trip to Norway. And then what?

A New Dream

Back when I was Secretary of the L-5 Society, I used to say that the statesman who leads mankind permanently to space will be remembered when Columbus and Isabella the Great are long forgotten. That's still true. Arthur Clarke observed that if mankind is to endure, then for most of our history the word ship will mean "space ship."

I suggest a $10 billion X Prize to be paid to the first Lunar Colony. I define colony as an establishment of at least 31 people alive and well after three years on the Moon. Gates could establish that prize in an instant; and it is very likely to be successful. There are private rocket companies that can accomplish the mission, and a prize that big will attract investors. Now clearly this resembles the policy I have tried to get the US government to adopt for the past ten years. It differs in two ways: there is no way Gates could make his prize tax free; but on the other hand, he could establish it at will.

I don't suppose it's likely, but it's certainly possible, and it would change mankind's history.

Whither Microsoft

I've said this before: what Microsoft is good at is selling code. The Microsoft marketing philosophy was simple enough: if it works at all, ship it. Moore's Law will ensure that the hardware will soon be good enough to make it work well. This contrasts with the IBM philosophy of shipping nothing until it works well — which often meant not shipping it at all.

Gates, I think, understood this. Microsoft made some decent profits in hardware, and still does. They make good mice and keyboards — the Microsoft Comfort Curve keyboard is my current standard choice. Xboxes are quite profitable. But the main product of Microsoft has been code. Will that change now? Steve Ballmer has said that a quarter of Microsoft's revenue will be from advertising. That's a lot of revenue, especially for a company that has yet to show much expertise in the field. Precisely how Microsoft will compete with Google is not at all clear.

Even less clear is why they want to do that.

For the first few decades, the major problem for computer programmers was the hardware. The machines were not very fast, they had very little memory, there was a shortage of mass storage, and communication between machines was slow. The best programmers were those who could figure out tricks to make complex programs run on the existing hardware. Programming cleverness — the ability to "do a good hack" — made and broke reputations. It is probably significant that Bill Gates was one of the very best early programmers, and his head-to-head programming contests are legendary.

Those days are gone. Today's hardware has enormously more power than any program demands. There are some games that challenge existing systems, and there are some graphics manipulations that can bring nearly any system to its knees, but with those exceptions few programs use anything like the computer power available to anyone with a couple of thousand dollars.

And that's now. It won't be long before our present generation of computers will seem pretty tame. Within four years we'll have 64-core machines, and what amounts to infinite memory and mass storage.

Peter Glaskowsky comments:

You've pointed out, many times over the years, that omnipotent hardware is further away than most people think. Please don't succumb to the temptation of believing we're finally about to get it.

Numerically, client systems are just now moving to 4 cores. It takes about two years for this number to double. That means 64-core machines are at least 8-10 years away for most people. In practice, they're further away because the software loses its grip in the 8-core/16-core generations, and chip makers will stop pushing for such a rapid doubling pace.

-- png --
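
Peter's arithmetic is easy to check. Here is a back-of-the-envelope sketch, assuming with him that client machines start at four cores in 2008 and that core counts double every two years; the numbers are illustrative, not a roadmap from any chip maker.

    # From 4 cores, one doubling every two years: how long until 64?
    cores, year = 4, 2008
    while cores < 64:      # 4 -> 8 -> 16 -> 32 -> 64 is four doublings
        cores *= 2
        year += 2
    print(cores, year)     # 64 cores around 2016, eight years out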

I don't agree. I believe that competition — not merely financial competition but the intellectual competition among engineers and advisors — will continue to drive Moore's Law. Peter is right about the time it takes for new technology to become ubiquitous, but my observation is that the lag is far shorter now than it used to be. The current economic crisis may slow some of the growth; on the other hand, rising fuel prices are already raising the demand for teleconferencing and telecomputing.

So while I do not think we are at the edge of "omnipotent hardware," everything I have seen says that we are on the threshold of what we all would have considered infinite only a few years ago — and you ain't seen nothing yet.

Wasted Power?

Someone should be able to take advantage of all that power. There are many things we would like computers to do for us, only no one knows how to teach the machines to do it. It takes so long to learn programming that one hasn't had a chance to be skilled at much else. Either you can do stuff, or you can teach computers, but it's very rare to be able to do both.

It wasn't always that way. I recall that at Aerospace Corporation I needed models of US/USSR thermonuclear exchanges. We had variables that included the probability that an ICBM would be alert and ready to launch; the probability that it would launch; weather probabilities that affected accuracies; highly complex curves showing vulnerability; and a number of other factors. I needed a model that would let us look at the effect of changing some of those variables: would we be better off simply having more crews to increase alert readiness, or fooling with the hardware? And so forth.

That was in 1964. I had some familiarity with computers — IBM taught me how to program the IBM 650 — but it was all at a system level. I had no idea how to write the program I needed, and certainly not in the time we had. Fortunately Aerospace had good Fortran programmers, and by working with them we were able to develop the model. I could come up with the equations, and they could teach the machines to run both expected value and Monte Carlo solutions. My point here is that the programmers didn't understand missiles, and I didn't understand computers, but we were able between us to do that job. Today there are some cooperative programmer/scientist/engineer/operations analysis teams, but there aren't many, and learning how to take part in that kind of activity can take years — and also, it requires teams, which are expensive, and that freezes out many sources of ingenuity. Computer scientists spend all their time learning how to program, with emphasis on efficiency and speed.
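
For readers who have never built such a model, here is a toy sketch of the shape of the thing in a modern language. Every name and probability in it is a made-up illustration, not the 1964 Aerospace model; the point is only to show an expected-value answer next to a Monte Carlo answer, and how easy it becomes to ask "more alert crews, or better hardware?" once the model exists.

    # Toy strategic-exchange model: all numbers are hypothetical illustrations.
    import random

    N_MISSILES = 1000    # missiles in the force
    P_ALERT    = 0.85    # probability a missile is alert and ready to launch
    P_LAUNCH   = 0.90    # probability an alert missile launches successfully
    P_KILL     = 0.60    # probability a launched missile destroys its target
                         # (stands in for weather, accuracy, and hardness curves)

    def expected_value():
        """Closed-form expectation: multiply the probabilities through."""
        return N_MISSILES * P_ALERT * P_LAUNCH * P_KILL

    def monte_carlo(trials=1000, seed=1964):
        """Simulate the exchange many times and average the outcomes."""
        rng = random.Random(seed)
        total = 0
        for _ in range(trials):
            kills = sum(
                1 for _ in range(N_MISSILES)
                if rng.random() < P_ALERT
                and rng.random() < P_LAUNCH
                and rng.random() < P_KILL
            )
            total += kills
        return total / trials

    print(f"expected value: {expected_value():.1f} targets destroyed")
    print(f"monte carlo   : {monte_carlo():.1f} targets destroyed")
    # Raise P_ALERT (more crews) or P_LAUNCH (better hardware) and rerun to
    # see which change buys more, which is the question the model was built
    # to answer.

Even a toy like that requires knowing both the problem and the programming; building the real thing still takes the kind of team I just described, and that is the bottleneck.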

That should change. With all the computing power available, we don't need efficiency and speed. What we need is ease of programming, computer assisted programming, languages that are easy for humans to learn and which the computers can interpret. There's no real reason why that can't be done, and at one time there was a lot of work on natural language programming. I recall a Texas Instruments team that made considerable progress. Alas, the computers weren't good enough: it took human ingenuity to write code that would fit in existing memory, or run at the speeds then available. Now we are in an era of hardware plenty, but many of those efforts, abandoned when the hardware wouldn't do the job, have not been resumed.

The very concept of operating systems is obsolete. At one time what we call an operating system was called a "monitor", and it connected a number of routines that knew how to save and retrieve memory, and read and write to disks, and write words and draw figures on a screen. All this had to be done efficiently and with great speed, while not eating up all the available computer power. There were a number of operating systems that died of the bloat: the hardware just couldn't do them justice. That too has changed. The next generation of computers can devote as many resources to the operating system as may be wanted — and still have far more resources to devote to applications; and one of those applications can be any other operating system. I don't think the era of the exclusive operating system has many years to go.

In the future, "which operating system does it use?" may be an irrelevant question when choosing a computer — whether at the home, enterprise, or big-business level. The hardware will be that good. And I don't think that future is all that far away. I've heard too many rumors of what's going on now in closed laboratories. The Big Brains are coming.

Someone is going to take advantage of all this computing power. At one time I would have thought that would be Microsoft. Now I don't know. The new masters in Redmond seem to think they can learn the advertising game and that will provide the growth revenue. I doubt that; and I suspect that when Bill Gates steps back and takes a hard look at where his company is going, he may want to come back after all.

This is not a time for management and marketing. This is a time for vision; and with Gates out of there, there is precious little vision on the Microsoft Campus.

We live in interesting times.