Dr. Jerry Pournelle


Computing At Chaos Manor:
The Mailbag

Jerry Pournelle jerryp@jerrypournelle.com
Copyright 2008 Jerry E. Pournelle, Ph.D.

October 13, 2008

We open with a recommendation from Robert Bruce Thompson, author of The Illustrated Guide to Home Chemistry Experiments (O'Reilly) (reviewed here).

I'm not sure if I've mentioned this before, but 35mm film cans are extremely useful around the lab. Get them while you can. Digital cameras are killing 35mm. The woman behind the photo counter at our local Walgreens said they used to get a hundred or more a day. Nowadays, they get only a few a day. Most pharmacies have a box or bag of old film cans behind the counter and are happy to give them away if you ask for them.

These cans are made of high-density polyethylene (HDPE), which is perfect for storing most chemicals. They hold about 30 mL. Most solid chemicals have densities between about 1.5 g/mL and 3.5 g/mL, so these cans will hold anything from about 45 g to more than 100 g of a solid chemical. The lids are airtight, so they make excellent storage containers for solid chemicals and solutions. They're also good for storing many other materials, including samples for testing.
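Thompson's capacity figures check out; here is a quick sketch of the arithmetic, using the 30 mL volume and the density range from his letter:

```python
# Sanity-check the film-can arithmetic: a 30 mL can filled with solids
# whose densities run from about 1.5 to 3.5 g/mL.
CAN_VOLUME_ML = 30.0

def mass_capacity_g(density_g_per_ml, volume_ml=CAN_VOLUME_ML):
    """Mass of solid that fills the can at the given density."""
    return density_g_per_ml * volume_ml

print(f"{mass_capacity_g(1.5):.0f} g to {mass_capacity_g(3.5):.0f} g")
# prints "45 g to 105 g"
```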

I was surprised to find how Kodak's market share had shrunk relative to Fuji, with Fuji cans outnumbering Kodak cans two or three to one in the bag of cans my local Walgreens gave me. The Kodak cans, black with gray lids, are good for storing any chemical that's light sensitive. The clear Fuji cans have a ridged lid that's a bit easier to use.

Either type of can has a lid with a central depression, which makes it easy to punch a tiny hole (or make an x-cut with a hobby knife) that allows you to use a Beral pipette to withdraw solution from the can without taking off the lid, while preventing much evaporation.

I've already gotten a hundred or so of these film cans from Walgreens, but I plan to keep accumulating them until I have a garbage bag full. They'll come in handy over the coming years. For example, one of our neighbor kids is only 5 right now, but he's very interested in science. Once he gets a bit older, I plan (with his parents' approval) to give him a real chemistry set, which means I'll need fifty or a hundred of these cans to hold the chemicals. Multiply that by the number of other kids I plan to help over the coming years, and having a thousand or more of these cans on hand isn't excessive.

Best regards.


Robert Bruce Thompson

Back when I did film photography I collected a lot of those, both the plastic ones and the older metal film cans with waterproofed screwtop lids. They have come in handy for everything from shop use to lightweight backpacking containers for spices. I suspect Bob needs more of these than you and I, but they don't take a lot of room — and they aren't going to be free much longer.

When I went from XP to Vista I discovered an odd problem: connections to WS_FTP have a nine-second lag. I use WS_FTP to move material from here to my web site, and I do that many times a day. The throughput is fine, but making the first connection took nine seconds. After that, connections might be instantaneous for a while; then the lag would come back. I mentioned this in my daybook at www.jerrypournelle.com and got a lot of mail on the subject. This letter is representative of a lot of them.

ftp lag

Dr. Pournelle -

I recently discovered your site. I enjoyed your old Byte columns and your books, although my favorites were the ones you did with Mr. Niven.

Congratulations on winning your battle against brain cancer. My sister died a couple of years ago from a brain stem astrocytoma, and it is a nasty disease.

I haven't followed the history of the ftp issue you mentioned, but intermittent delays in startup with no impact on transfer performance could be a problem with a DNS server, possibly one that is not working, paired with a second one that does. The first request goes out, gets no response, and after a time-out a second request is sent to the next server. Later requests are satisfied locally from cache.

I apologize for wasting your time if the above has already been suggested, or if I misunderstand the problem altogether.

Very best wishes,

Fritz Speck

I also got more technical mail, including this:

I thought I might email you and provide a potential cause. I'm a network engineer, formerly with a certain 3-letter computer company, now working for a different 3-letter (plus an ampersand!) telecommunications company. Any time I've seen the results you are describing, it's almost always due to reverse DNS issues. It's pretty common with FTP servers.

In a typical FTP session the client computer connects to the ftp server. The ftp server finishes the three-way TCP handshake establishing the session; then, in order to provide the hostname for its server logs rather than just an IP address, it does a reverse DNS lookup of your IP. ONLY once it gets that result (or times out) will it provide the login banner information. Depending on the number of DNS servers the FTP server is configured to query, that timeout process could take many seconds, due to the way DNS failover is architected.

For that reason, I usually recommend that ftp servers run with that reverse lookup setting turned off, especially in environments where the hostname is inconsequential (as with connections from end-user PCs at ISPs, or where firewall rules only allow certain IPs to connect).
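The delay described above is easy to reproduce outside an FTP server. This Python sketch times a reverse (PTR) lookup the way a server would experience it; 127.0.0.1 is used only so the example works without network access, and a real server would look up the connecting client's address:

```python
# Time a reverse (PTR) DNS lookup the way an FTP server experiences it
# before printing its login banner. 127.0.0.1 is illustrative; a server
# would look up the connecting client's IP.
import socket
import time

def reverse_lookup(ip, timeout_s=5.0):
    """Try to resolve ip to a hostname; return (name_or_None, seconds)."""
    socket.setdefaulttimeout(timeout_s)
    start = time.monotonic()
    try:
        name, _aliases, _addrs = socket.gethostbyaddr(ip)
    except OSError:  # no PTR record, or every DNS server timed out
        name = None
    return name, time.monotonic() - start

name, elapsed = reverse_lookup("127.0.0.1")
print(name, f"({elapsed:.2f} s)")
```

When a lookup has to fail over from a dead DNS server to a live one, the elapsed time jumps from milliseconds to several seconds, which is exactly the stall Jerry saw before the login banner.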


The problem did indeed turn out to be DNS lookup. I haven't entirely solved it — it's pretty clear I need a new DNS lookup service — but I have taken care of it for WS_FTP by the simple expedient of putting the numeric IP address into the WS_FTP connection scripts. That makes the connections nearly instantaneous.

The way I got the numeric address was to open a command window and first ping the site name, then do tracert sitename. That gave me the IP address of each hop on the way to my web host.
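ping works, but a one-line resolver does the same job. A minimal sketch, with "localhost" standing in for the web host's name:

```python
# Resolve a hostname to its numeric IPv4 address directly, rather than
# reading it off ping or tracert output. "localhost" stands in for the
# web host's name.
import socket

def numeric_address(hostname):
    """Return the dotted-quad IPv4 address for hostname."""
    return socket.gethostbyname(hostname)

print(numeric_address("localhost"))  # typically 127.0.0.1
```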

As to why the connection would sometimes be instant and sometimes take 9 seconds, apparently Vista and XP cache DNS results for different intervals. In any event, the problem clearly is related to DNS lookup, and can be solved in scripts by substituting the numeric IP address for the site name.

My son Alexander Pournelle does a lot of work in Microsoft Office, including managing very complex documents with multiple authors, as well as complex databases. He recently commented about Google Docs:

Stupid Google Docs, or stupid user?

So I have a shared PDF on Google Docs. I want to upload a newly updated version of the PDF, replacing the old one, but with the same collaborators list. I'm lazy, and don't want to individually select the collaborators and share the document with them one at a time, for every version I upload. (The actual document is in Visio, so I want to share the PDF, if you were wondering.)

It appears that the only way to do this is to upload a new version as a COMPLETELY NEW document, then laboriously select each of the potential collaborators again and share the document with each of them, then delete the old document. Google Docs is too smart to allow you to simply use the same name again; it considers each uploaded document to be separate even if you use exactly the same name, and since there's no way to edit PDFs (in fact, they're only shown as graphical images, presumably RIPped by the server) you can't cut'n'paste in a replacement.

Worse, it appears you must select each collaborator individually (from the collaborator list on the left-hand side of Google Docs), then drop them on the document, and click through the requester.

The only alternative I can find is to create a text list of the e-mails for all the people with whom you want to share, and then paste that into the "collaborators" requester in Google Docs. How advanced!

My literature search doesn't find anyone else who's ever considered this possibility, which just makes me shake my head.

Yes, I know Google Docs is betaware. Attempting to use the "Tell us how to make Google Docs Better!" button takes you through a focus-group selection website where they decide if you're worthy to be called upon to participate in a future survey. No option to actually, um, help make Google Docs better.

Stuff I like: Google Docs does automatically keep versions of everything, though it appears to iterate versions every time you even view the document, and there's no way I can see to highlight or track changes. Needs work.

Any suggestions on fooling Google Docs?


P.S. And if anyone claims web-based word processing is a current threat to Word again, I'll just laugh, shake my head and point. Not for anyone who actually uses Word for serious work, sorry.

My own prejudices are against "network computing" entirely. That may be a function of previous experiences when hardware wasn't as good as it is now; but mostly it's just that I don't want all my data to be out in "the cloud." I prefer to have it right here on my disk — and given the enormous capacities of both spinning metal and silicon chip mass storage, and the falling prices, I see even less reason not to keep my data in local storage — actually in multiple copies, and key data goes on a thumb drive.

Alex's observations prompted a discussion among my advisors. Here are some excerpts:

Robert Bruce Thompson said

Yeah, I really don't get this whole so-called cloud computing thing. Why on earth would I want my data out in the fog where I may not be able to access it? I think we need to rename this concept nacht-und-nebel computing.


Alex replied

Don't get used to it, but I think RBT and JEP and ACP completely agree on something. We didn't have this whole PC revolution in order to go BACK to timeshare, whatever new noun they stick on it. I want my own stuff on my own computer which I control my own self. While I understand why business might want it to live in the magic cloud place, and would even support such a decision, it's not for me.


Which got me thinking. Eventually I said "It may be time to emphasize Pournelle's First Law of Computing," which says "One user, at least one CPU." That was my first answer to time sharing, and I still adhere to it.

Peter Glaskowsky added

To me, the key question is: what if they're not MY docs? What if they belong to a group of people, and I'm just one person in the group? And what if you want continuous collaborative editing, so a version-control system isn't good enough?

That's where Groove came from. Groove is why Ray Ozzie is the new Chief Software Architect at the world's largest software company.

That's where wikis come from, too, and (cranks aside) there's no question that wikis are a better way to collect the knowledge of multiple people than anything else we've seen so far.

Much of the work I've been doing for the last 12 years has been collaborative in nature, yet I've been doing it the old-fashioned way, and the loss of productivity has occasionally been very painful.

So I think we need local storage for personal documents, and cloud storage for shared documents. We also need ways to bridge the differences: ways to share documents managed by individuals (like Alex's PDF of a Visio file), and ways to work on shared documents when an individual is disconnected from the cloud.

Eventually we'll have all of these things, they'll all work very well, and we'll wonder how we ever got along without them all.

. png

Peter has more experience than I do at publishing proprietary collaborative research; I know that at his former company they used a commercial wiki server to create many of their documents.

I'll close this with Alex's final comment:

If only Groove worked on OS X, I might be able to try it...

I don't think Google Docs is a bad idea; I just think it's terribly behind the times, much like Writely (its built-in word processor). MS Office is terrible except when compared to its competition. Yes, it's "Google Docs Beta," and it's getting better Real Soon Now; I get that.

Online/offline collaboration: I agree that a balance needs to be struck between cloud-centric and PC-centric, yet every time I see a brave new Web 2.0 play for document collaboration, it's as if they were decanted from a test tube yesterday. They don't know what a requester is, or why you should intercept the open/close command from a desktop app, or how previous groupware apps worked, or what was good about them. They expect everyone to live in the airy-fairy web-centric approach that's all they've ever known, and since they start there, they immediately close themselves off from much of their market. Add to that a we'll-make-it-up-in-volume approach to a business plan and it's 1999 all over again.

And the tools from companies which appear to Get It are too general (I'm looking at YOU, Intuit QuickBase) or hard to configure (YOU, HyperOffice) or just baffling (stand up, Basecamp) to the average user. Even specific, well-supported collaborative tools for individual industries--I'm picturing Autodesk's Buzzsaw for architects, now--have not covered themselves in sales-volume glory.

I've never been able to get past that initial ick factor so I could look at commercial wikis and understand why I should give a rip, so I can't really comment on their usefulness/usability/general purposefulness.

Meanwhile I have sucked it up and manually added the collaborators to the Google Docs document in question. Dan Spisak and I have both attempted to search for some appropriate assistance, but our Google-Fu was defeated by the vagueness of the terms. "Share google docs collaborators" might as well be "Marklar Marklar" for all the specificity of the hits we got. Pfui.


My own experience has been with a TabletPC and Microsoft OneNote, which can be set so that shared documents are editable by anyone who is part of the data net; but it's also the case that I do little computer-based collaboration, even though Niven and I are among the most successful collaboration teams in literature. Larry is not a computer geek. When we work on the same document we do it separately; then I use Word's compare-and-merge tools to make a new master document, which I send back to him. That has worked very well for a number of novels, some of which have been best-sellers.

Peter does make an important point: eventually we will have resources we barely conceive of now, and we'll wonder how we lived without them.

I have a MacBook Pro; due to the radiation therapy I haven't actually set it up yet, but that's coming fast now. My intention is to have both Mac OS X and Vista running on it, so that it becomes my only laptop. Alex has already converted to a MacBook Pro running XP and OS X as his main machine. Many readers have made suggestions about my journey to MacLand.

MacBook Pro

Hi Jerry,

If you're going to run Windows inside VMware (or Parallels, but I prefer VMware as having better foundational technology), you can use an external FW800 drive to host the virtual machine if you don't have enough space on the main drive. My own benchmarks show it to be good enough for almost anything. Make sure you set the virtual machine hard drives fairly large (they expand as needed), as adding space later is rather finicky. I've never found a need for Boot Camp.

There are a number of FW800 drives out there, but the most cost-effective seem to be LaCie. I know some folks who swear by them, and others who swear at them, so I do recommend a good backup.

There is a 7200 RPM model of their rugged drive, which improves performance significantly.


My recommendation is to start with the virtual machine on the existing internal hard disk and wait until space becomes an issue. You can use either XP or Vista to create the virtual machine; I suggest using the auto-install wizard in VMware, as it makes the whole process painless. It will offer to install McAfee (if you're using 2.0) as a trial license. I also suggest turning off Unity mode, but that's my preference: I want Windows in a robust sandbox. If you want seamless Windows/Mac integration, you may feel differently.

Good luck!


P.S. The non-user-replaceable hard disk is a huge negative for the MacBook Pro; the regular MacBook's disk is user-replaceable. For details on the upcoming refresh, the best place to look is www.macrumors.com.

My experience with LaCie is confined to their photon20vision II monitor, which I've had for several years; it's probably sufficient to say that it remains the monitor for my main communication machine. It's great.

Peter Glaskowsky says

LaCie is hardly ever the best deal on any storage device. The company positions itself at the high end of the market. The best deals almost always come from the drive makers themselves. WD and Maxtor both sell external drives, and they usually have lower prices than LaCie for any capacity.

On the other hand, I don't think any of the drive makers offer ruggedized hard disks, so if you need that sort of thing, you need to take it where you can find it.

. png

Captain Morse adds

But the LaCie external drives come in those attractive, stackable cases that won't have the spousal unit up in revolt when you tell them, "here, use this to store all your stupid dog pictures."

Ron Morse

I should note that I got my LaCie monitor on Ron Morse's advice, and I'm sure glad I did that.

I have Seagate drives in many of my machines and Western Digital in others, and several different external drives from both manufacturers. They have all worked very well for me; and do note that I don't so much do "reviews" as use stuff, hard. The WD 500 GB My Book for the Mac has been backing up my iMac 20 for 8 months now. Western Digital makes a full-terabyte version of this drive. (See a Macworld review here.)

I have a WD 500 GB Scorpio Blue which I intend to install in my MacBook Pro, but alas, I have discovered that the Apple Store Genius Bar won't install third-party hardware. What I will probably do is install it myself, while keeping the original disk to reinstall in case I need warranty service. Back when I was still a bit shaky from 50,000 rad of hard X-rays the installation seemed beyond me, but I now suspect I can manage it in half an hour.

But I will almost certainly begin by setting up everything on the original disk, and clone that. Reports on this will be in the column.

I often get helpful suggestions from readers.

Multiple computers on the same desk

I came across a new tool that you might find useful.

A friend of mine just added a second system under his desk. He wanted to keep both screens, but use one keyboard and mouse between them. I got him a KVM switch and just didn't hook up the video.

That worked, but not well. This particular KVM switch was essentially a USB hub that switched back and forth. The problem is, on the old machine running XP, it took anywhere from two to five seconds for the machine to recognize the peripherals. Just long enough to cause a pretty severe distraction.

My friend stumbled across an article in Maximum PC that described a program on SourceForge called Synergy that lets the user treat multiple computers almost as if they were multiple monitors on the same computer. That's a really cool idea, especially since the program is cross-platform: Windows, Linux, and Mac OS.

I looked over the project pages. There appears to be an active following. There are many recent posts on their forum. But, while it does appear that there is active development going on, the last official release was in early 2006. To get it to run on Vista requires a workaround.

While looking that over, I saw a mention of a Windows-only solution called Input Director. It has been releasing updates every few months.

I've been using Input Director for a couple of weeks now. It appears to be a very well-behaved program. (I did have to turn off the water-ripple effect. It looked cool and I wanted to keep it, but it caused a pretty noticeable hitching of the mouse cursor for a few seconds upon switching over. It's on the Global Prefs tab.) I'm finding that I like having two machines that behave like a multi-monitor system. One big advantage it has over the KVM switch is copy-and-paste between the systems.

I did not take time to install Synergy. You might find that one more interesting, since you're running diverse OSes.

Drake Christensen

Thanks. At the moment I have multiple computers, but each has its own keyboard and monitor. There will come a time when I will have multiple monitors but perhaps only one keyboard. I haven't come to a final decision on just what the final setup at Chaos Manor will be; but I have been impressed with Apple's very big monitor used in connection with "Spaces" to make a desktop for each operating system.

A tip from Chaos Manor Associate Eric Pobirs:

Memory Stick

In the column you mention looking for a good deal on the Memory Sticks used by your Sony camera. The best deal is not to get a Memory Stick, but instead a Memory Stick Pro Duo (MSPD). This is the smaller version used in Sony's PSP handheld game system, and consequently it is far more active in the market. A passive adapter allows an MSPD to be used pretty much anywhere a Memory Stick would go.

Fry's frequently has an 8 GB Memory Stick Pro Duo for $40. (The current ad has a Lexar-brand unit in this role.) Make sure your camera can handle that capacity; many Memory Stick devices topped out at 4 GB. The PSP was able to access 8 GB units after a firmware update, and a similar update may be needed for the camera if it is more than about 1.5 years old.

Side note: Letting the PSP use 8 GB MSPDs had its drawbacks for Sony. It enhances the value of the PSP but it also makes it easier for those inclined to pirate their games to carry a small library with them. One amusing accessory that appeared a while back was a simple item to allow a pair of MSPDs to be stored in the PSP's optical media slot. (The PSP uses a very small optical disc format called UMD, which stands for Universal Media Disc even though nothing beyond the PSP uses it.)

It should be noted that this isn't enabling piracy in and of itself; it just makes piracy more attractive. As Sony's range of games and videos sold as downloads grows, the bulk and power demands of the UMD drive become more questionable. A new model with multiple MSPD slots instead of the UMD drive could be lower priced yet more attractive to consumers, but the retail channel would not be happy about it: they don't want to carry the hardware, with its near non-existent profit margin, if they aren't getting a piece of the software revenue. The solution to that could be to have PSP games sold on high-capacity Memory Sticks. The game would occupy only a fraction of the MSPD, leaving room for online purchases to be added.



As to piracy and copyright, I dither: sometimes I think that copyright is doomed and artists and writers will have to find ways to support themselves through patronage http://www.jerrypournelle.com/paying.html and/or performances. On the other hand, I have bought a number of books for my Kindle, and I note that despite the availability of pirate editions of most of my books for free, they continue to sell — including electronic copies. That makes me feel a lot better about the future.

Regarding my statistical section in last week's column:

Statistical reasoning

Dear Dr. Pournelle:

Savage (Leonard J. Savage, The Foundations of Statistics) is too difficult for anybody but specialists. For the rest of us I strongly recommend David S. Moore's "Statistics: Concepts and Controversies". IMO this is the best introduction to statistical reasoning ever written. First published in 1968, it's still in print -- the seventh edition is due out this month. You should take a look at this book. If you do, I think you'll find it good enough to recommend to the readers of Chaos Manor Reviews.

I must admit I like the second edition. It's become a classic -- used copies in fine condition are selling for more than $100 today.

Regards, Morton

While I do not agree that Savage is beyond the reach of many of my readers, I will agree that Moore's book covers much of the field, and it is the textbook used in many courses. It also costs more than $50; the cheapest copy I have found is a used 1979 edition for $20. Most recent editions, new or used, are over $75.

I have ordered the book (well, a used copy) and I will review it when I get the chance — there are enough books to review to warrant a special review article sometime next month. Given that it is used as a text in many places I make no doubt that it is excellent for learning techniques; I'll have to read it before I can comment on the main reason I recommended Savage, namely that most people who use statistics don't understand the underlying assumptions from which they make inferences.

Incidentally, 1968 is several years after I left graduate school, where I encountered Savage.

My daughter Jennifer comments on last week's column, and particularly the languages section:

October Column

Hi Dad--

Just read this with some interest. Having just come out of an SDK development company that boasts everyone from Microsoft, HP, and Canon to tens of thousands of mom-and-pop shops as its clients, I thought you might be interested to know about the sea change that has happened in the last couple of years.

Now, 70+% of Windows developer clientele program in the .NET environment, meaning primarily C# compiled with Visual Studio. All of the "script kiddies" just entering the workforce now come in with Java as their starter language.

After that, in terms of production and support demand, comes the C DLL API, C++ Class Library (both diminishing rapidly in favor of .NET), Visual Basic .NET (the other "taught myself" starter language), and a smattering of Delphi for legacy systems.

The newest, hottest "teach yourself" starter language is XAML, which you can teach yourself using Microsoft Expression Blend's free download. Even I, who (for the same reason I never learned to make coffee) developed an early allergy to programming, compiled sellable XAML programs within the first day of use. Granted, I already had more than passing familiarity with XHTML, but most people do. Its architecture is designed so that anything you design, basically WYSIWYG, on the front end is already coded, and can be handed off as a compiled project for a developer to hook functionality into on the back end.

Thus, if you have a really good idea of what you want, you can do most of the front-end "programming" yourself, and only hand off to a "real" developer at the point where your own expertise ends.

Dr. Jennifer Pournelle

Which may show how far I am out of date. Alas, while I have the best intentions, the fact is that the only programs I am likely to write in the next couple of years are filters and tiny utilities to make some of my work easier, and those are best done (by me, anyway) in Python.
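For what it's worth, a filter of the kind Jerry means fits in a dozen lines of Python. This sketch (the keyword matching is just an example of such a filter's logic) keeps only the lines of interest from a batch of text:

```python
# A minimal text filter of the sort described: keep only the lines
# containing a keyword. The keyword logic is merely illustrative; the
# point is how little code such a utility needs.
def filter_lines(lines, keyword):
    """Return only the lines that contain keyword."""
    return [line for line in lines if keyword in line]

sample = ["From: Niven\n", "Subject: Mote\n", "From: Possony\n"]
print(filter_lines(sample, "From:"))
# prints ['From: Niven\n', 'From: Possony\n']
```

Wire the same function to sys.stdin and sys.stdout and it becomes a command-line filter in the Unix tradition.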

Precisely which languages one ought to learn is a matter for considerable discussion. Jenny's comment "if you have a really good idea of what you want," is the key here: if you can reduce your task to a series of logical steps, it's very likely you can do much of the front end programming yourself. You may be able to do all of it. It's practically certain that if you do it yourself it won't look elegant and "real programmers" may laugh at your results (heaven knows I have professional programmer friends who find the accounting system I wrote in 1982 in Commercial Compiled Basic highly amusing) but you may also have something good enough. For example, I have been using my ancient accounting program for a long time, and back when I had a best seller I was audited by the IRS at least three times with trivial results. The program was designed to produce Journal and Ledger books that look like those taught in Intermediate Accounting (in 1983; I went to the UCLA bookstore and bought the textbooks). I still use it, but I wouldn't recommend it to anyone else because the interface is arcane.

My point (and Jenny's) is that if you have a skill that may be turned into a computer program, you can do a lot on your own with no investment other than your time.

Subject: Quad cores and gigs of ram.

I am not sure that massive quad-core CPUs are the wave of the future. I work at a software development house. There isn't a single quad-core in the place; well, except our server. We just don't need them, and it is a pain for us to swap out a PC. It takes a while to get them dialed in just so. We do have a single Vista machine that we use for testing. The developers all stick to XP because of the speed and the fact that Vista gives us nothing of value.

I think smaller is going to be the way to go. Think about it: most people don't need to do much more than surf the web, play video, manage digital pictures, run Quicken, and do their taxes. You don't need massive quad-core CPUs and massive amounts of RAM for that. The new flood of netbooks is a good example of the kind of power most people really need these days. Look at some of the new developments from Intel: the Atom is designed for these new small, efficient machines. Microsoft is already getting Vista's replacement ready. I for one hope they strip it down, speed it up, and drop the price.

Oh, and why don't developers support 64-bit more? Well, you are right that a big part of that is money. The other part is that there really isn't a need to: most software just doesn't need a 64-bit memory space. You do get one other benefit going to 64-bit on x86 besides a bigger memory space, and that is more general-purpose registers. The x86 CPU has been register-starved for years compared to more modern CPUs like the PowerPC. The extra registers will give you more speed, but the larger pointer size takes more RAM, so it really isn't worth it for most programs. And you will have to double your testing and ship two versions of the software, all for a tiny increase in performance for a small segment of your customer base.
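The pointer-size cost the letter mentions is easy to see from Python, which can report the size of a C pointer on the running build:

```python
# Show the pointer-size cost of going 64-bit: struct.calcsize("P") is
# the size of a C pointer in the running interpreter, 8 bytes on a
# 64-bit build versus 4 on a 32-bit one. Every pointer in a
# pointer-heavy data structure doubles in size on the 64-bit build.
import struct

pointer_bytes = struct.calcsize("P")
print(f"{pointer_bytes} bytes per pointer ({pointer_bytes * 8}-bit build)")
```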


Well, of course you may be right. In particular there are a number of pocket computers. Most need some kind of web connection to be useful, of course.

Meanwhile, quads are proving popular, first of course with gamers, but now with developers and in enterprise systems. I built my Core 2 Quad Q6600 system for under $1000, and it does everything I need for my communications system: running Outlook, 70 open Firefox tabs, a dozen open Internet Explorer tabs, three PDF documents, two instances of Word, and FrontPage (that's what I can see now), without glitches or hesitations under Vista 64. I consider it a splendid investment.

Of course my game machine is an Intel Quad Extreme, and it can run multiple instances of the kinds of games I play without noticing that it's doing it, and still compile programs and do enterprise work. Just now the Extreme is a pretty expensive chip, but it won't be this time next year.