Dr. Jerry Pournelle

Computing At Chaos Manor

Mailbag for August 28, 2006
Jerry Pournelle jerryp@jerrypournelle.com
Copyright 2006 Jerry E. Pournelle, Ph.D.

August 28, 2006

This was a very busy week. For explanation see the current VIEW over at www.jerrypournelle.com.

Subject: multiple cores and memory access

Hi, Jerry -- regarding multiple cores and memory access: I agree with Mr. Montgomery that we are likely to see larger caches, and potentially more memory on-chip. However, before we see that, I think it is likely that we will see dedicated banks of external memory, with each bank serving one or more of the cores on chip. This will allow manufacturers to preserve the existing physical memory chip architecture, as well as allowing more flexibility in machine configuration and servicing.

Of course, this means a massive increase in the number of data and address lines leading into the chip sockets. However, I believe that problem can be more easily remedied than trying to implement a dramatic increase in core memory speed.

I would be willing to predict that, 10 years from now, it will be common for CPUs to have eight, 16, or more cores; and many banks of external memory to service them. Massively parallel computing is on its way.
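Charlie's many-core prediction is easy to exercise even on today's hardware: here is a minimal Python sketch (standard library only; the function names are mine, not Charlie's) that asks the operating system how many cores it has and splits a core-bound job across one worker process per core.

```python
import multiprocessing as mp
import os

def partial_sum(bounds):
    """Sum the integers in [lo, hi) -- a stand-in for any core-bound workload."""
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=None):
    """Split summing 0..n-1 across one worker per available core."""
    workers = workers or os.cpu_count() or 1
    step = n // workers
    # Give each worker a contiguous chunk; the last chunk absorbs the remainder.
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with mp.Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(os.cpu_count(), "cores reported by the OS")
    print(parallel_sum(1_000_000))
```

Whether the cores number eight or sixty-four, the program shape is the same; only `os.cpu_count()` changes.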

Regards, Charlie

My guess is that you are, like most of us, far too conservative. I am confident that in ten years, 64-core systems will be the norm for desktop systems, and 32-core for laptops. I don't know what we will do with all that computing power. It is sufficient that we can all, if we choose, use the Apple OS as the primary OS, thus making UNIX usable by the rest of us without having a wizard on retainer, and then run any other OS we like - Windows, Windows Tablet PC Edition, Windows Media Center Edition, all flavors of Linux, the gaming OS that is likely to be developed by then, and stuff we haven't conceived of, all under the benign governance of the Apple version of UNIX.

We are very much in an era of computing plenty, and so far the developers haven't done much to take advantage of it. Surely they will do so in future.

Subject: Re: Closing Cheyenne Mountain

Maybe someone finally got around to reading "The Moon Is a Harsh Mistress" and decided the place was vulnerable.

On another subject, there have been several reports recently on Southern California TV blaming "Global Warming" from automotive and manufacturing CO2 emissions for making the environment too humid for the Joshua trees to survive. I have been under the impression that the rise in humidity in the southwest deserts was caused by redirected surface currents, from seismic activity and coral reef growth south of Baja, bringing a spring/summer monsoon flow to this region instead of sending it as storms out to Hawaii as they once did. Since when does a monsoon flow of storms up the Gulf of California become pollution-caused global warming? I do realize that water vapor is probably the number one greenhouse gas, but only that from swimming pools and watered lawns can be considered man-made.

-- James Early Long Beach, CA

There was a WorldCon panel on Global Warming, but I was unaware of it and my friends did me the favor of not telling me until it was over. Apparently it was attended by the religiously converted who gave short shrift to anyone who was not a True Believer.

The issue has become religious, and the peer-review process now assures that no paper questioning the hypothesis that "The Earth is Warming and Mankind is Responsible" can be funded, or if conducted without grant funding, then published. The danger in this should be obvious: The very process by which science operates has been subverted. I got Greg Benford to agree with me on this but only off stage; on stage he appeared as a True Believer like the others.

The subversion of science by the peer review process, so that the "consensus position" can never be questioned, nor an experimentum crucis testing the central hypothesis be funded no matter the qualifications of the tester, is a potential disaster for all, and is far more important than the question of Global Warming itself.

Clarke's Law is that "Any sufficiently advanced technology is indistinguishable from magic." It is certainly the case that for most of mankind, high technology might as well be magic. The average citizen has no more understanding of the technology of, say, television, than his cat does of the dairy industry which produced the milk poured into its bowl. We now seem to be moving into an era in which the scientific community itself must accept the ritualistic explanations of various phenomena or be anathematized for heresy. This cannot be a good development.

I remain unconvinced of the global warming hypothesis, and have even less faith that reducing man-made emissions will do any good even if the Earth is warming. The Earth probably is warming; it most certainly has warmed since the cannon of Ticonderoga were dragged across the frozen Hudson to General Washington at Harlem Heights, and the brackish canals in Holland froze hard enough to bear ice skating in Fall (read Hans Brinker or the Silver Skates). Since most of that warming took place before the year 1900 it seems unlikely that it was due to human activity; something else is going on. How much warming there is, and what it's due to, and what feedback mechanisms it triggers (high-altitude clouds change the albedo something wonderful, but don't figure in most if any climate models) we simply do not know. Alas, studies designed to determine such factors are generally unfundable due to the peer review process and the "consensus position," which is maintained with Torquemada-like fervor.


First, congratulations on your award! Though I would tell them firmly that you are by no means finished, so lifetime achievement is premature!

I wrote once before about the time I wasted trying to get dual head running under Linux with my Nvidia 6600 based graphics card. The thing works under Windows XP but just won't with any incantation of drivers under Linux that I tried. I'm pretty competent, but the required time to get things working is too great.

However, Linux reared its head again a few days ago. I need the big display area, I need Linux to do some builds, and I need Windows XP to do the rest. The solution I have found is VMware Workstation 5.5. Running on my 3.8GHz Xeon workstation, I get very respectable performance out of Fedora Core 4, and thanks to the dual-processor Xeon system I still have plenty of MIPS left over for the Windows side. This is a big improvement over two years ago. I can move or resize the VMware desktop pretty much as I please, and it does not care whether it's stretched across two displays or not.

Now that sub-ring-0 (hardware-assisted) virtualization is coming on line, this type of solution should get much better.
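That sub-ring-0 support (Intel VT-x, AMD-V) advertises itself through CPU feature flags, and on Linux it shows up in /proc/cpuinfo as "vmx" or "svm". A small sketch of checking for it; the helper name and sample text are my own illustration, not from the letter:

```python
def virtualization_flags(cpuinfo_text):
    """Return hardware-virtualization flags found in /proc/cpuinfo-style text:
    'vmx' means Intel VT-x, 'svm' means AMD-V."""
    found = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # The flags line looks like: "flags : fpu vme de pse ... vmx ..."
            tokens = line.split(":", 1)[1].split()
            found.update(f for f in ("vmx", "svm") if f in tokens)
    return sorted(found)

if __name__ == "__main__":
    # On a real Linux box you would read the file itself:
    #   with open("/proc/cpuinfo") as f: print(virtualization_flags(f.read()))
    sample = "flags\t\t: fpu vme de pse tsc msr vmx lahf_lm\n"
    print(virtualization_flags(sample))
```

An empty result means the CPU (or the BIOS setting) lacks hardware virtualization, and a hypervisor must fall back to binary translation as VMware did before VT-x.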

Should have Conroe and Woodcrest systems running shortly. Should be fun.


Thanks. Bob Thompson assures me that standard Linux is now reliable enough for Aunt Minnie and can safely be installed without having to worry that you'll spend the rest of your life maintaining her system. As time goes on, more and more Linux solutions will be developed.

On the other hand, as the power of our systems increases, the use of virtual machines becomes easier and faster, and that is very much to the good.

Thanks for the kind words.