Dr. Jerry Pournelle


Computing At Chaos Manor:
The Mailbag

Jerry Pournelle jerryp@jerrypournelle.com
Copyright 2008 Jerry E. Pournelle, Ph.D.

July 24, 2008

A comment on last week's mailbag:

July 11th's story: "powerful enough hardware"

Dear Jerry,

Your conclusion about the consequences of "powerful enough hardware" is exactly the same conclusion that computer pioneer Don Lancaster has come to. So many formerly difficult math problems are easy to solve these days. Or as Don put it in his July 2 entry to his blog:

"The available fast and cheap math tools and techniques these days utterly boggle the mind. It is utterly trivial to throw another hundred million calculations at a problem." - http://www.tinaja.com/whtnu08.asp

He demonstrates this with his short PDF article, "Fun with Fields." Solving for the electromagnetic field used to involve horribly intricate equations--but now it's just easier to do it numerically. And since he writes Postscript code himself, you can see a solution build right before your eyes in the Acrobat file: http://www.tinaja.com/glib/funfield.pdf

John D.

Well, I am glad that I am not alone in my views...
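Lancaster's point can be sketched in a few lines of Python. This is not his code, just a minimal illustration of the "throw calculations at it" approach: solving Laplace's equation for the potential inside a square region by Jacobi relaxation, with every name and grid parameter chosen for illustration.

```python
# A minimal sketch (not Lancaster's own code) of solving for a field
# numerically: Jacobi relaxation of Laplace's equation, repeatedly
# replacing each interior point with the average of its four neighbors.

def relax_laplace(n=20, top=1.0, sweeps=2000):
    """Potential on an n x n grid: 'top' volts along the top edge,
    0 volts on the other three edges."""
    grid = [[0.0] * n for _ in range(n)]
    grid[0] = [top] * n                      # fixed boundary condition
    for _ in range(sweeps):
        new = [row[:] for row in grid]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j] +
                                    grid[i][j - 1] + grid[i][j + 1])
        grid = new
    return grid

field = relax_laplace()
# The potential falls off smoothly with distance from the energized edge,
# with no "horribly intricate equations" anywhere in sight.
```

A hundred thousand or so arithmetic operations, and the field simply appears; exactly the kind of brute-force trade Lancaster describes.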

Directory copying on the Mac

Hi Jerry,

I noticed in your column that you are looking for a way to copy whole directory trees. If you want to do it from the command line, rsync is your friend (anything you can do in Windows from the command line, you can do better in Unix).

rsync -a src dest

Remote volumes will be under /Volumes

If you have copies that you want to define and run again, I'll give you a shameless plug for Prosoft's Data Backup (iGeek, Inc. is the original manufacturer of the product). You can define a set to copy from one directory to another using the "clone" option, and it will check dates and copy only the most recent files over.

Dave Smith

I am still trying to master the Mac conventions. I am very accustomed to assigning drive letters to drives; it makes it a lot easier to deal with them in command lines. With Mac and UNIX you must get everything just right, and alas, if you get something wrong, it will often try to execute the command anyway, sometimes with disastrous results. Ungood. Bob Thompson points out that this is equally true with DOS and Windows command lines. My experience has been that Windows command lines are fussier, and if you get things wrong in general not much happens; but of course he's right, and you can write Windows command line instructions that will produce a disaster.

Subj: Modula-3 origins: interview: Luca Cardelli


>>[T]he idea to design type-safe operating systems ... started at Xerox with Cedar/Mesa, and continued at DEC with the Taos operating system. You might say it is still continuing with Microsoft's .NET, and we are not quite there yet.<<

>>Modula-3 was the stepping stone from Cedar/Mesa to Java...<<

Rod Montgomery==monty@starfief.com

Interesting. I didn't know that C# had continued to evolve to full lambda expressions. It might one day be close to Python! I'll have to go read ...

Robert Knight

Python is in fact a very useful language, and I do use it to write utilities and conversion programs. It's free, well maintained, and easily learned. http://www.python.org/

Moreover there are a number of good books on Python and Python programming, most of them from O'Reilly.

Regarding Modula's origins, the first I heard of it was as a continuation of the work of Niklaus Wirth, who wrote Pascal as a teaching language and then Modula-2 as an actual programming language. Wirth studied in the US, and did a sabbatical at Xerox PARC. I have not seen him for years, but we were once close enough friends that he was a guest at Chaos Manor back when I wrote for BYTE, and I visited him in Zurich, both for BYTE and as a friend. When I was learning about small computers, Wirth and Dijkstra were the computer scientists I read most closely and paid most attention to. At one time I had a Lilith system which ran Modula-2 as its operating system. It was a bit-slice machine, enormously powerful for systems of the time. (It also ran a 5 megabyte Honeywell Bull hard drive that was the size of a 2-drawer file cabinet, and dimmed the lights in the house when turned on.)

I continue to believe that strongly typed and highly structured languages which do range and type checking in the compiler are preferable to the more flexible (and certainly faster) languages that allow you to do almost anything you like including changing types on the fly. In my judgment the best computer programming language is one that allows you to follow the logic of the program even if you did not write it, and by insisting on proper syntax makes debugging less necessary — sometimes not necessary at all. A properly written language does what you expected or it won't compile at all. That's the ideal Wirth was after. Dijkstra's specialty was developing theorems to prove programs; something else we seem to have neglected.

I have said all this before but it bears repeating: the days when computer resources required us to use assembler-type languages because we needed both the compilation and execution speeds are gone. Now what we need is better programs, and better programs won't happen without better languages.
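The hazard of "changing types on the fly" can be sketched in a few lines of Python, itself a dynamically typed language. The names here are mine, purely for illustration: a value of the wrong type slips into a list without complaint, and the failure surfaces far from its cause, which is precisely what a Wirth-style compiler refuses to let happen.

```python
# A sketch of the failure mode a strongly typed compiler forbids: a
# variable silently changes type mid-program, and the error surfaces
# two steps away from the bug. All names here are illustrative.

def average(values):
    return sum(values) / len(values)

readings = [10.0, 12.5, 11.0]
readings.append("13.0")         # a string slips in; no objection here

try:
    result = average(readings)  # the failure finally surfaces here
except TypeError:
    result = None               # sum() refuses to add float and str
```

In a language with mandatory declarations, the append itself would have been rejected before the program ever ran.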

I ran this letter over on the Chaos Manor web site, but it belongs to this thread, too:

Free Pascal

Hi, Jerry.

Knowing your interest in Pascal, I thought you might be interested in this: Free Pascal at http://www.freepascal.org.

I found out about it on Steve Gibson's Security Now podcast last week.

According to their web site: "Free Pascal (aka FPK Pascal) is a 32 and 64 bit professional Pascal compiler. It is available for different processors: Intel x86, Amd64/x86_64, PowerPC, PowerPC64, Sparc, ARM. The discontinued 1.0 version also supports the Motorola 680x0. The following operating systems are supported: Linux, FreeBSD, Mac OS X/Darwin, Mac OS classic, DOS, Win32, Win64, WinCE, OS/2, Netware (libc and classic) and MorphOS.

The language syntax has excellent compatibility with TP 7.0 as well as with most versions of Delphi (classes, rtti, exceptions, ansistrings, widestrings, interfaces). A Mac Pascal compatibility mode is also provided to assist Apple users. Furthermore Free Pascal supports function overloading, operator overloading, global properties and other such features."

I have not tried it - I'm not a programmer, but after 20 years of documenting poorly coded software, I agree completely with your arguments in favour of strongly typed languages.

Regards Keith

Pascal was designed as a teaching language, and in my judgment is still the best introductory programming language for those who want to learn to program.

And more on programming:


Dear Mr. Pournelle,

First off, I'm glad to hear that you are recovering, and glad to hear that signs so far are positive.

I have been programming for 25 years now, and all of my primary programming has been in various versions of BASIC. I have also used Pascal, Java, VB and other structured languages as well. I read with interest Nathan Okun's letter in the July 14th Mailbag, and wanted to make some comments on what he wrote.

Mr. Okun thinks that "a more English-like BASIC" would allow programming to be done by virtually anyone who knows what the end result should look like. "... the ability for an average person to take his problem and turn it into a working program would be minimal." The problem is not that the language is difficult to learn. The problem is that people in general do not understand the logic required to get the result they want.

Fifth-generation languages were designed to do exactly what Mr. Okun describes: take a problem and get the computer to figure out how to "program" it. They have all been dismal failures because as soon as a program has any complexity, the computer simply has too many options to figure out efficiently how to do what you want. They are generally only used in AI research nowadays.

"You do not need to become a programmer: If you know English and can write in coherent and unambiguous sentences (not easy, but this has to be done in any scientific or engineering writing, anyway, I would think), you can have those sentences turned into English-style BASIC-type code with little problem. The resulting program might be long and wander about a lot (depending on how the writer wrote his original descriptive spec document), but as long as it "sticks to the yellow brick road", who cares how long it takes to get there when computers today run so fast and have so much memory and storage capacity?"

Yes, today's computers are orders of magnitude faster than they were in the past, but coding efficiently is still important. Let's take a real-world example: not long ago a customer wanted to update their BASIC system so that they could do wildcard searches in their contact database. There were two ways to code it, and both would produce the correct results. One way is to read in each contact and perform the search; if there is no match, you go on to the next record. The other is to send an SQL statement to the server to find the matching records. The first method took roughly 5 minutes to search their 60,000 contacts; the second took one second. No, that is not a typo. Imagine the difference if there were 600,000 contacts instead of 60,000.
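The two approaches can be sketched with Python's built-in SQLite as a stand-in for the contact database. The table and names are hypothetical; the point is that both methods return the same rows, but one round-trips every record through application code while the other lets the database engine do the scanning.

```python
# Sketch of the two wildcard-search strategies, using an in-memory
# SQLite table as a hypothetical stand-in for the contact database.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE contacts (name TEXT)")
db.executemany("INSERT INTO contacts VALUES (?)",
               [("Alice Smith",), ("Bob Jones",), ("Carol Smithers",)])

# Method 1: fetch every record and filter in application code.
slow = [row[0] for row in db.execute("SELECT name FROM contacts")
        if "Smith" in row[0]]

# Method 2: let the database engine do the wildcard search.
fast = [row[0] for row in
        db.execute("SELECT name FROM contacts WHERE name LIKE ?",
                   ("%Smith%",))]
```

Same answer either way; at 60,000 records instead of three, the difference between the two is the five minutes versus one second described above.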

He suggests that making all variables and constants global would solve problems. Well, first off, I think the average person doesn't understand what a variable is, never mind global vs. local. And anyone who has programmed a large system knows that localized variables are critical to code-reuse.
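Why all-global variables defeat code reuse can be shown in a short sketch. The names and data are illustrative: two routines share a single global loop index, so the inner call clobbers the outer loop's state, while the local version has nothing to clobber.

```python
# A sketch of the all-globals trap: both routines share the one global
# index 'i', so the inner call silently resets the outer loop's state.

i = 0  # one global index shared by everything

def count_items(items):
    global i
    n = 0
    i = 0
    while i < len(items):
        n += 1
        i += 1
    return n

def count_all_global(lists):
    global i
    total = 0
    i = 0
    while i < len(lists):
        total += count_items(lists[i])  # leaves 'i' at len(items)
        i += 1
    return total

def count_all_local(lists):
    total = 0
    for lst in lists:                   # local loop state; safe to reuse
        total += count_items(lst)
    return total

data = [[1, 2], [3, 4]]
right = count_all_local(data)    # 4, as expected
wrong = count_all_global(data)   # 2: the second list is never counted
```

Both routines look plausible; only the one with local loop state can safely call other code.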

The problem with "truly informative and useful warning and error messages" is that you are expecting the compiler to understand what the program is trying to do, when it has no clue. Most compilers (or syntax checkers/interpreters, as the case may be) do their best to produce a useful error message. But you have to understand what the code is doing in order to deduce what is causing the error. In his example, "VARIABLE X is being used as the result of a SINE routine, so it cannot have a value of greater than 1 or less than -1", the issue really is determining WHY X is not in the correct range. A compiler cannot help you there. The other kind of error is where the program runs without errors but does not produce the expected results. Again, that often goes back to the logic of the program.
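Okun's SINE example can be turned into a runnable sketch by going in the arcsine direction, whose argument carries the same [-1, 1] constraint. The names are illustrative; the point is that the check can report THAT a value is out of range, while WHY (here, the wrong variable passed in) is visible only to someone reading the logic.

```python
# A runnable version of the range check Okun wants. It catches the
# symptom; no checker can name the cause. Names are illustrative.
import math

def safe_arcsine(x):
    if not -1.0 <= x <= 1.0:
        raise ValueError(f"arcsine argument {x} is outside [-1, 1]")
    return math.asin(x)

angle_deg = 30.0   # an angle, in degrees
ratio = 0.5        # the opposite/hypotenuse ratio we meant to pass

try:
    result = safe_arcsine(angle_deg)   # bug: wrong variable passed
except ValueError:
    result = None                      # the symptom is caught here...
# ...but the cause (angle_deg where ratio was meant) is invisible to it.
```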

I think that a more "English-like BASIC" has been replaced by the scripting languages you have in programs like Word and Excel. Macros are essentially the layman's programming language, and just like natural-language programming languages, they have their limitations. Of course, as time has gone on, they have been getting more and more powerful.

For the regular programmer, the visual tools have replaced much of the hand coding that was required in the past, and so arguably programming is getting less and less about code and more about applying programming logic. Unfortunately, not everyone can grasp that logic, just as not everyone can write good fiction. I could attempt to write a work of fiction, but I can tell you (and my writer wife would agree) that no one in their right mind would want to read it.

Glenn Hunt

Well, yes; but I am not sure there is much astonishingly new there. Of course the compiler doesn't understand what you want, and some intelligence is needed; but the compiler should understand that you didn't really want data used as instructions, or floating point data treated as integers, or local variables suddenly promoted to be global, and vice versa. Proper language syntax can prevent much of that, and good syntax makes it possible for someone to read a program and see what it does by following the logic. It has been the general experience that programs in C are often unmaintainable once the original programmer leaves (and sometimes after the programmer has been away from his own code for too long).

The value of strongly typed and highly structured languages is that they forbid you to do many things that may be convenient but are highly unwise and often lead to bad results.

It is my view that over time we will come up with pre-parsing programs that will take something akin to "natural language" and translate it into a structured language like Pascal or Modula-2. That will take considerable time; but if the intermediate form is a properly structured program, the results will be much better than if the attempt is to go directly from natural language to, say, C.

Observations on the next hardware/software platforms


I have been an enthusiastic fan of yours for more years than I care to count. :) I've read your Chaos Manor columns off and on since some time in the early '80s as well as a pretty broad selection of your fiction. While I don't always agree with your conclusions or your world view, I always come away with something to think about. I've appreciated your insights for years.

With that in mind, I'd like to respond to a couple of points and a question you raised in your June 30th column:

"The very concept of operating systems is obsolete... The next generation of computers can devote as many resources to the operating system as may be wanted — and still have far more resources to devote to applications; and one of those applications can be any other operating system. I don't think the era of the exclusive operating system has many years to go."

"In future, "which operating system does it use?" may be an irrelevant factor in choosing a computer — whether at home or enterprise or big business level. The hardware will be that good..."

"Someone is going to take advantage of all this computing power. At one time I would have thought that would be Microsoft. Now I don't know."

I would argue that your take on OSes is a little off key. People in general have never cared about OSes. In large part, they didn't really care about applications, either. All they cared about was getting stuff done. The /only/ reason anyone ever asked what OS a PC ran was because they wanted to run a particular program to complete a task.

As I think you would agree, Bill Gates' real genius was not in engineering, it was in marketing. For nearly two decades he convinced people that the only way to get stuff done on a PC was on one running a Microsoft OS. Even better when he could convince them to use Microsoft applications. :)

Therefore, it's not really right to say that OSes are obsolete. The choice of OS for a particular platform will always limit your choice of applications. (Yes, virtualization techniques such as VMware do minimize this issue to some degree. But have you tried to run a DX9 game in a VMware image yet?)

I contend that it's more accurate to say instead that with the consistent rise in popularity of OS X-based laptops at the high end and the unexpected popularity of netbooks like the Asus Eee running Linux at the low end, people are now becoming aware that there are alternatives out there. Microsoft to date hasn't figured out a good response to this challenge.

As to your comment that the hardware is moving toward more available cores very quickly, I find myself in tentative agreement. There is a countertrend toward cheaper hardware that I think will slow that adoption down, though. Remember when a good laptop was $3,000 or more? Now you can get a decent one capable of doing whatever you want for about $500.

Still, the trend toward multi-core machines with 16 cores or more isn't that far away. So, the interesting question becomes, which OS has demonstrated the best long term performance when given that much hardware? I would suggest a quick visit to the TOP500 site, which has been tracking the top 500 supercomputers since June of 1993, might be in order.

I think its series of operating-system graphs in particular is quite telling: compare the charts for June 2008, June 2003, and June 1998.

Jim Smilanich

I don't recall saying that the OS is obsolete; I do say that as our hardware gets more powerful, Operating Systems become less important. With multiple core systems we can devote an entire core with memory to any OS we like, and move back and forth among them at will. I can run XP as an application under Mac OS X, and do so. I haven't as much experience at that as I'd like, but that's (I hope) a temporary situation involving energy levels and recovery from radiation treatment. So far, though, my experience has been that given powerful enough hardware you simply don't notice what the "primary" operating system is. I know I run Windows 95 programs as applications under Windows XP. I admit I have yet to do that with XP running as an application under OS X, but that's a matter of time; my iMac 20 may not be powerful enough, but within a year I intend to have a big Mac Pro that almost certainly will be...

Your observations on price are certainly accurate. I expect by the time I really need the Big Mac Pro I'll be able to afford it...

Continuing that discussion:

Many-Processor Computers

Mr. Pournelle,

When you think about many-processor computers -- using entire processors in places where mere switches or support devices (memory caches, etc.) are used now to support the few CPUs -- you are going into a whole new world of complexity-based algorithms, which include neural networks, quantum computers, layered LAN/WAN interlocked systems, the Internet, and so forth.

In our brains, the average neuron has direct connections using synapses to circa 9,000 other neurons, most nearby, but many from other sections of the brain about as far away as possible or in-between these two extremes. If each neuron is a separate processor that can emulate how a neuron processes these inputs and adjusts its functions as they influence it (reprogram itself on the fly as it "learns" its place in the system), you will get true intelligent machines.

Combine this network system with quantum computers at each node instead of conventional machines and we are approaching David Gerrold's HARLIE (Arthur C. Clarke's HAL is a wind-up toy in comparison). Such computer entities are not programmed. How can you program a network like that? How can you debug such a network, since there is no way to understand the threads of interlocking processes (trillions, quadrillions, more?) simultaneously running in such a machine?

Such machines will transcend what we call programming (that is, go beyond the use of mathematics and Boolean logic, interspersed with special functions like interrupts and so forth, we now use to control computers).

What "logic" causes self-awareness -- "sentience" -- in our brains? This is NOT part of physics or chemistry or any other physical process system that we now use. The generic term "cybernetics" relates computer software and hardware to the techniques in a computer that emulate "thought" and "feelings", but most of it, to my knowledge, only addresses directly the small-scale systems that give immediate feedback.

A true "mind" made from the HARLIE-level systems of systems that form such a computer is not directly addressable by any single or group of humans trying to use it to solve some sort of problem. You have to teach the machine to do what you want by having "conversations" with it.

It seems to me that the application programs on these machines will not be hard to do, in that the computer and you can work together on the problem to reach a satisfactory design (if possible). The hard part will be developing the baseline algorithms that connect the computer's neurons together -- its "instincts" and other built-in hardware/software/firmware processes that form its "mind" in the first place. This may have to be done in increments, using each machine to design the next, more-complicated machine, until HARLIE creates his GOD, for which HARLIE is the Prophet who talks with GOD.

What I am getting at here is that multi-CPU machines as described in that ARS TECHNICA article on INTEL are no longer computers as we know them.

They will fall under sentient living organisms and require the use of complexity theory and other such esoteric concepts (schools of thought, since I do not know under what heading to put them that is descriptive enough). Treating them more like a person is going to be more useful than treating them like a logic device. The Turing Test finally comes true.

Already, game computer chips are being used as the building blocks for super-computer network machines; games that tie together many players simultaneously in entire worlds are going to be the initial hardware devices on which this Brave New World is founded, though they are many orders of magnitude too small for the true neural networks I am discussing here.

Therefore, telling programmers to try to program with such super-duper computers is a waste of time. They cannot do it. These machines will only be able to program themselves using "learning", "intelligence", and "wisdom" (I put them in quotes, since these may not be what we think they are when applied to such machines), where we tell (humbly request of?) them what we would like to do and they figure out the details on how to do it, suggesting improvements and explaining limitations, based on what we explain to them about our problem.

The very concept of "programming" will disappear, just like the concept of "driving" a car will disappear when it can drive itself and you merely tell it where you want to go (all cars become self-driving taxis). You end up more like riding a horse than driving a car -- digital devices that get so smart that they end up emulating analog devices, including living things!

There is simply no way a human or even a group of humans can figure out all the interactions (or even the small portion of these interactions involved with your specific problem) of 100,000,000,000-CPU neural network "brain emulators" (especially if each CPU is a quantum computer).

Such machines can solve problems and in some (most?) cases not EVER be able to tell you how they did it -- the computer had a "hunch" or stated the result "just smells right" or some such explanation (no matter what fancy language it uses to cover this situation up). How would you disagree unless you can run a huge number of tests and find a mistake somewhere?

The big problem here is that since you do not know how it is working correctly, you will not know if one of these computers goes "nuts" (HAL, anyone?) and starts doing things you do not want, either evil things like HAL or just crazy things like somebody with schizophrenia. Unless the machine does some obvious thing (kills all the astronauts in a spaceship), how would you find out except too late (build a bridge and then find out its stress/strain computations were faulty when it collapses)?

Brave New World, indeed!

Nathan Okun

First, most applications we run are perfectly happy with a single processor; the problem is apps like Outlook that want to eat everyone else's resources. We don't need to rewrite anything to make use of multiple processors.

That's true of operating systems as well. "Emulation" used to be a nasty word, but it's pretty tame now. OS X runs Linux nicely; with enough processors, you can have multiple instances of Linux, each happy enough running one or two applications.

It's multitasking that profits from multiple processors.

For a few processors, true. But how many separate programs will you be running in a computer with thousands, millions, or billions of processors?

You are not thinking far enough ahead. We will not be running the software we use now on such machines. What will things be like when creating an entire CPU is no harder than creating a single transistor, using nano-technology (pico-technology? femto-technology?)? The machines will be as like ours as ours is like an abacus. The operating systems and programs used now will be so quaint that most people will be amazed how we survived with such primitive things. We are talking STAR TREK's Data, not a Mac!!

I am talking about 50 years or more in the future (though I would not bet that this might not occur faster than that!). Similar to that "singularity" thing people talk about, which I think is dumb. People are people, so changing the WAY they do things will not change WHAT they do to each other much -- modern technology sure hasn't, has it? Cavemen and people with STAR TREK Q-like powers are still going to be people, unless we modify ourselves and I do not have much optimism about how people would do that kind of thing, given human history.

Nathan Okun

Well really, I don't recall saying that we will still be running the same programs when we have such superior resources, and I doubt that anyone who has given much thought to the trends would think so.

As John McCarthy famously observed many years ago, massively multi-processor computers tend to be immune to programming; something that Dick Feynman found frustratingly and stubbornly true at Thinking Machines, Inc.

But it is simply not true that changing the way we do things will not change what we do. I used to write essays about what I called "real" rights and freedoms. One of my examples was taken from Kipling's Captains Courageous. There is a scene in which the wealthiest man in North America needs to get from California to New England as quickly as possible, and the entire railroading resources of the United States were devoted to the task. It's an exciting scene, and it shows just how fast that could be done, and the enormous costs.

Today anyone can manage it for a few hundred dollars. When I was young, the kind of tumor they found in my head would have been invariably fatal, and I'd be dead now; but here I am, a bit under the weather from radiation poisoning, but still able to write and work.

Closer to home, I was part of the very earliest ARPA Net, with accounts on Tops 20 and MC at MIT, and connections through the IMP at RAND Corporation. I worked with Minsky at the time; and I guarantee you that we did not do things then the way they are done now. Email was painful and collaboration by computer even more so.

Old Zeke, my old computer, is on display in the Smithsonian because it was the first computer known to have been used to write a published novel. I guarantee you that few writers today did what we did before Zeke, with multiple drafts of 500 pp. mss., hand corrections in different colors of ink, retyping the ms. at least once and generally twice, etc.

Whether these new capabilities will make much in the way of change in human nature isn't anywhere near so clear. We do manage to save and allow to grow to sexual maturity people who would have been dead before age 2 or so; will that have an effect on human nature? For that matter, keeping an old duffer like me alive for another decade may have an effect.

Feynman tried to arrange for me to teach a senior seminar at Cal Tech on Technology and Civilization. It's very clear to me that technology has had an enormous effect on what we call civilization — hardly astonishing. And as our technologies get better, we darned well will do things we never thought of doing. Including going to the planets and from there to the stars.

Robert Bruce Thompson on my description of writing without computers:

That's always been my impression as well, so I'm always surprised when I run into writers who still use a typewriter or even write their manuscripts longhand. Emma Thompson and W. E. B. Griffin are two of the better-known authors who do it the old-fashioned way, but they have plenty of company.

I suppose it's just personal preference. I can kind of understand that. I'm very productive sitting at a desktop computer with a big screen, full keyboard and normal mouse, but I can't get a thing done on a notebook. My solution is just to put desktop computers everywhere that I might need one.


I borrowed $12,000 to buy Zeke back in 1979; the increase in productivity made that back in a year. I know some very good writers who still use fountain pens and ink, but I don't think I could do that.

Peter Glaskowsky comments:

Nathan Okun wrote, about neural networks of quantum computers:

"Such machines will transcend what we call programming..."

Alas, they also transcend design verification and debugging.


In the real world we have to be able to be sure that a machine is doing what we meant for it to do, but the kinds of systems he describes are too complicated for that.

If God gave us such a machine and assured us it's working correctly, we could proceed to teach it and use it, but we'd still have no assurance we're doing _our_ part of the work correctly or usefully.

(And to the extent that God already gave each of us a neural-network computer, He gave us no assurances about its correct operation, and in fact it has more failure modes than we've been able to document to date.)

Even fairly small neural-network systems are essentially immune to design verification and debugging. I've never heard of a basis for believing that large ones can ever be verified or debugged.

He goes on to say:

"What I am getting at here is that multi-CPU machines as described in that ARS TECHNICA article on INTEL are no longer computers as we know them."

But no matter how far you extend current multiprocessor design trends, you're still not creating a neural network. Even with thousands or millions of CPUs. They're still comprehensible, verifiable, and debuggable, even though all of these tasks get significantly more difficult as the CPU count increases.

. png

And Eric adds:

Besides, we already have items like that. They're called children.

Eric Pobirs

Subj: Should we program in BASIC?

I must confess that, as I age, I become increasingly skeptical of all arguments -- especially about programming! -- of the form, "All You Have To Do Is Just ...".

To Nathan Okun (Mailbag of 14 July 2008) I juxtapose Edsger W. Dijkstra, vintage 1984:

>>[T]he teaching of BASIC should be rated as a criminal offence: it mutilates the mind beyond recovery.<<


One point Dijkstra elaborates in that essay is expressed more succinctly in Flon's Axiom:

>>There does not now, nor will there ever, exist a programming language in which it is the least bit hard to write bad programs.<<

(Flon, Lawrence. "On Research in Structured Programming", SIGPLAN Notices, October 1975, Vol. 10 No. 10.)

To say this is, however, not quite the same as saying that it does not matter what programming language one uses: there are programming languages in which it is frustratingly difficult to write *good* code -- and BASIC is one of them, or at least it was for me.

But Okun's essay is not without redeeming social value: I had a good laugh when I read his, "[S]imply make ALL variables and constants universal in their scope throughout the program unless specifically determined otherwise".

In any case, for programming, All You Have To Do Is Just ... use Python! 8-)

Rod Montgomery==monty@starfief.com

I do point out that many versions of BASIC had strong typing and mandatory declarations, either required or as options, and that over time BASIC acquired just about all the elements of a highly structured language.


I once wrote in defense of the GOTO statement, but I received enough very well reasoned replies that I changed my opinion and said so.

Marvin Minsky once remarked that my obsession with strong typing, mandatory declarations, and formal structured languages was a bit like insisting on donning a straitjacket before I sat down to work. But then Marvin could strain like a gearbox and write a three line program in APL that could do astonishing things. Of course he had no idea of how it worked, nor did he care, because he could always do it again. For people of his caliber I have no advice on programming languages; but it remains my view that we'd all be better off if systems programmers used languages from the Wirth/Dijkstra school.

Moore's law continuation


On your comment, "Moore's Law is inexorable. We are coming into the era of computing plenty."

Actually, Moore's Law as we've known it for three decades is over; it ended two years ago.

We've made a transition from faster and faster CPUs to a multi-core world. If you run lots of programs on lots of cores, they all get faster, because they no longer slow each other down taking turns on a single CPU.

However, no single program gets faster without major changes to its code, which are sometimes impossible and often impractical. In fact, a strictly single-core process is probably slower now on most computers than it was on the fastest Pentium 4 machines of a few years ago.

See this Ars Technica link.

If---and it is a big if in many cases---the code can be adapted or rewritten to scale across many cores, then Moore's Law lives on. Otherwise, not. Finding ways to write such parallel code in less programmer-intensive ways is the biggest current challenge in computer science. Adapting old code, which represents programmer-millennia of work, is perhaps an even bigger challenge.

It's a challenge that can be solved, but it is not a silicon process-engineering challenge; it is a computer science challenge. That's a different beast, and the jury is still out on the solution.


The basis of Moore's Law — that we can get more transistors on a chip — may be fading, but the effect remains. Computing power seems to double every couple of years, and exponentials inexorably lead to steeper and steeper curves.

We already had programming languages that took multiple processors into account: support for concurrent processes was built into Modula-2 and Oberon. The bit-slice techniques used in Wirth's Lilith machine are also applicable.
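Modula-2 exposed its concurrency as coroutine primitives (NEWPROCESS and TRANSFER in the SYSTEM module): tasks hand control to one another explicitly. As a loose analogy only, Python generators can sketch the same cooperative hand-off with a toy round-robin scheduler:

```python
def worker(name, steps, log):
    """A cooperative task that yields control after each step."""
    for i in range(steps):
        log.append(f"{name}:{i}")
        yield  # hand control back, roughly like Modula-2's TRANSFER

log = []
tasks = [worker("A", 2, log), worker("B", 2, log)]

# A trivial round-robin scheduler: run each task one step at a time.
while tasks:
    task = tasks.pop(0)
    try:
        next(task)
        tasks.append(task)  # not finished; back of the queue
    except StopIteration:
        pass                # task ran to completion

# The two tasks interleave: A:0, B:0, A:1, B:1
```

This is cooperative multitasking on one processor, not true parallelism; the point is that the language, not the operating system, provided the scheduling vocabulary.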

But my point was that the hardware we have is far better than our programs, and it will not be long before we can run all our programs at once without slowing any of them down. Now running multiple applications is not the same as massively parallel computing; but it's one approach, and I think we will see a lot of that while some of the really smart people step back and look at other ways to make use of multiple processors. Breaking an enormous task into a series of separate tasks has long been one approach: it was how we got to the Moon in 1969.
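The divide-and-combine approach mentioned above can be shown in a few lines (a toy sketch, assuming the pieces really are independent): split the work into chunks, compute each partial result separately, then combine.

```python
data = list(range(1_000_000))

# Split the work into four independent chunks; each one could,
# in principle, run on its own core or even its own machine.
n_chunks = 4
size = len(data) // n_chunks
chunks = [data[i * size:(i + 1) * size] for i in range(n_chunks)]

partials = [sum(chunk) for chunk in chunks]  # independent subtasks
total = sum(partials)                        # combine the results

assert total == sum(data)  # decomposition changes nothing but the schedule
```

Whether the subtasks are chunks of a sum or stages of a lunar mission, the payoff is the same: each piece can proceed without waiting on the others.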

Me, I have great expectations, but then I'm an optimist, as you can see from my book A Step Farther Out.

re: If Vista were a model of Car, Highway Accidents would probably rise by 10^?

Dr. Pournelle,

I would take exception to Eric Pobirs' comparison of Vista Ultimate to the Full Optional New Car.

All those options, besides needing someone to think of them, design them, and integrate them into the car, also need someone else to build, install, and configure them. But in an OS, all the "extras" are merely disabled (isn't that a synonym for crippled?). In the car example, the options are not installed, because each addition requires physical parts and physical labor. In the OS, if they built the base system without any bells and whistles ("Ultimate"), they would have... Windows XP. With an OS, the economics differ from car options: bytes are easy to mass-produce once they are well thought out. And if the extras are well integrated (exactly the point of being an integral part of the OS, and not "just" external applications), then they are designed once, installed once, and disabled or left out half the time those bytes are reproduced.

You could actually make quite a case (which you have already made in part; for one, the Wang example is perfect) that mass-producing different variations costs more than mass-producing one model and then showing the user only the pieces that are correctly licensed. The pieces are all there, but if you don't pay me more, I won't show them to you. Perhaps the correct analogy for multiple Vista versions is not the manufacture of cars but a much older profession: doesn't "Pay me more and you get to see more of me, and I'll let you do more with me" ring true of another service industry (of considerably earlier origins than PCs and software)?

As for a completely unrelated comparison: if Vista were required to meet the same standards as the auto industry, can you imagine the field day the lawyers would have? Safety recalls every other day, I would think, and class actions out the wazoo.

Good luck with your continued recovery,

James Siddall jr

I think I have little to add to what I have already said.