Dr. Jerry Pournelle


Computing At Chaos Manor:
The Mailbag

Jerry Pournelle jerryp@jerrypournelle.com
Copyright 2008 Jerry E. Pournelle, Ph.D.

November 24, 2008

Most of this mailbag will not make sense if you have not read the November column. I generally discuss the columns with my advisors before they are posted, but there comes a time when it's got to be posted. That doesn't always end the discussions. Alex had some thoughts on the November column:


Most cloud computing initiatives appear to have local+remote options, where there's some amount of local computing (at least for basic formatting, a la your half-duplex typing example). This makes a big difference for responsiveness. This tends to blur the line of "all cloud" computing. I don't really know if this discounts the OS agnostic aspects of cloud computing. Certainly if it runs in Java (e.g.) or another browser-hosted sandbox (Silverlight?) then it can be pretty thoroughly cross-platform. This completely blurs the line between local and networked computing; I think this is the only way cloud computing is going to be successful for the smaller customers.

Per your comment about my use of Boot Camp and the Mac: Actually, the only time I ever use Boot Camp is to play games. Otherwise, I use VMware Fusion exclusively.

Your example of Office 2003 and 2007 not coexisting is exactly why virtualized OSes prospered. You could (and most smart people would) run the old and new versions of your OS, major apps, etc., in parallel, until you were certain the newest version was OK. IBM's MVS was enormously useful for this. VMware's enterprise-level software does this for the big guys today; Microsoft is not standing still, and Server 2008 R2 (teased somewhat at WinHEC) will have more features to compete with VMware.

Virtualized systems are ubiquitous, even down to the desktop. Control of them is still a bit confusing. On my Mac, I can use Unity mode to run Windows apps side by side with Mac OS apps, and it's supposed to be seamless. So far it's been a bit more problematic than I'd like.

Phones and garbage collection: I have these symptoms on my Blackberry 8703e too. I can get several megs back by rebooting (or pulling the battery); on a system with less than 64M of storage that's significant.


Eric Pobirs, commenting on cloud computing, added:

Another example of this is the support for running VHD files directly in Windows 7. This will simplify a lot of compatibility mitigation and general deployment. It would have been extremely useful on that big MWD deployment.

Another notable local+remote aspect of cloud stuff is the evolving standards for having web apps live locally on your PC, allowing work to be done without a connection, although certain apps are crippled in that situation. But something like Google Docs shouldn't need a connection much of the time.


We have more mail on cloud computing:


Two quick comments. I predict Cloud Computing will fail. The high priesthood of the mainframe has been replaced by the high priesthood of IA. They will kill any system that leads to government/corporate data stored outside of their control. So, large corporations won't use it and mom/pop don't need it. I don't think there will be enough in between these to make it a success.

Statistics - When I was working on my Doctorate in Business Administration, I had to take a statistics course. It had nothing to do with quant. or qual. Rather it dealt with selecting the proper tools to get the result you wanted. Nudge-nudge-wink-wink say no more.


Well, there are some conveniences to using Azure or Apple's MobileMe. It makes it a lot easier to get at what you need independent of your location, and you don't have to have all the complex software installed locally. It's no big deal to have Office on your laptop, but there are much larger programs you wouldn't want to carry around with you. Clouds may not be with us to stay, but they will certainly be here a while.

Of course Apple's MobileMe isn't exactly cloud computing; it's their scheme for using cloud storage as a means of synchronizing two or more computers. I believe this is Apple's opening move toward something like Azure, but I could be overstating their ambitions.

Peter Glaskowsky adds

MobileMe still isn't a cloud computing service, just a web service. There's data stored on web servers, and a web interface to the data, but very little computing is happening on Apple's servers. Certainly nothing that wasn't being done ten years ago.

Cloud computing is really about running enterprise IT applications on Internet servers, generally to supplement in-house IT operations.

Azure is a cloud computing platform.

Of course, sometimes people use "cloud computing" or "cloud services" or "cloud applications" to refer to things that used to be done with nothing more than Apache servers and Netscape browsers, but I think that's inappropriate.


Dear Jerry,

In your latest November 10, 2008 installment under "Pournelle's Law Revisited" you mention that "if the cloud server farm goes down, or there is some interruption of your access to the Internet, you're dead."

In Venezuela the Internet is run by the phone and cable TV companies, although some big companies have private uplinks. Yesterday we had a massive interruption of service at most ISPs, which lasted the whole day. Thus one could not even go to a cyber cafe, since they were down too. No explanation whatsoever. Was it a freak occurrence, or a practice run for shutting down the mobile phone and computer networks for the coming elections? (During a referendum which was lost by the government, SMS and computer networks were used by students to keep in touch and report on the different polling stations while looking over the shoulders of election officials, and so limit voter fraud.) If there were any argument against cloud computing, it would be control of access by a government with totalitarian tendencies.

Best, Willem van Doorn

PS: Good to know you're recovering

Certainly one of the downsides of the Cloud.

Hi Jerry,

I found your comments today on the cloud bang on.

On the one hand I am more and more dependent on the cloud for both ease of use and line of business reasons, being part of a small engineering consulting group. We depend on tools like google apps frequently for various projects. Personally I have depended on gmail for years as I am spread over different machines. Last but not least, I use JungleDisk to keep a back up of key files up on Amazon S3 for my offsite backup.

On the other hand, having key functionality off site makes me nervous. Not only are technical and business glitches possible (e.g., I am seeing more and more gmail glitches and downtime), the growing political activism of what should be purely neutral services is worrisome. That is, how long will it be until google's growing willingness to censor political speech on YouTube translates into the same on gmail or google apps? Even worse, how long until some ever so PC geek up in the googleplex or yahoo decides to start blowing away folks' email accounts (or worse?) because they don't like your politics? I don't want to be paranoid, but we are entering an interesting part of the historical cycle.

One response I have to all this is to cut back on the use of cloud apps like google docs and keep back up copies of key info. Another is to run the OS X mail app once in a while on both Macs and suck down a copy of everything from google via IMAP. From there it can go a number of places to include DevonThink. (If our discussion of DevonThink got lost in the radiation, I still strongly recommend it.)

Since I brought up Apple, I have also been playing with MobileMe. It is a love-hate relationship right now. I don't use the mail features yet, as it appears (I could be wrong, as things seem to be a bit of a moving target) that you can only sync .me mail between different Macs. I have the Calendar app syncing to google calendar, but can't get it to show up on the .me web page yet. More investigation needed there, I think. The syncing of address books between Macs and google has started to just work. That is important to me, given that I will be getting an iPhone soon and loathe out-of-sync address books.

iDisk is working fine, but is limited in speed and disk capacity for what I would like to use it for, which is to keep one set of files for my big research databases.

My bottom line on .mac? Apple is on a good path and once they reach the point that it keeps multiple Macs in sync down to the level of your files it will be a killer app and a good compromise with the cloud.

To close, I don't see the cloud as the be-all and end-all the PC industry is painting it to be. Is it the solution for Aunt Minnie? It is already there, I think. An ASUS Eee and the googleplex is a killer combo for a lot of users, and something I will push my mother towards next time she buys something. I think Microsoft knows this too, or they wouldn't be pushing XP into that space so hard. (BTW, something I think is a bad move, given that linux is already working there. I have spent way too many hours fixing XP on the machines of the Aunt Minnies in the family.)

All the best,

Richard Kullberg

Apple's MobileMe has had some horrible birth pains, but I think they will get their act together pretty soon. I do wish they hadn't forced everyone to convert from .mac to .me without giving us a choice. Oh. Well.

Security in the Clouds

I am very happy to hear you are doing better. Long term illness is not fun, and your experience of virtual amnesia last year brings to my mind, so to speak, my experiences after being hit by a car while undergoing treatment for a bad gallbladder. Losing almost a year of your life can be frightening but useful - the perfect excuse for not remembering names, birthdates, etc.

SecurityFocus.com and Mark Rasch, among others, have had some interesting articles about cloud computing and the law. It's not entirely clear that property rights are the same for information stored in the clouds, and case law is already starting to firm up that puts privacy of information stored on "cloud networks" squarely at risk. There was also an interesting discussion of the legal ramifications, as well as practical issues, of what happens when your data in the cloud disappears. Recently, several backup companies have failed, taking away information companies needed. Long-term archival requires substantial due diligence. Now, to be completely frightening: is the government required to notify you if they serve a subpoena to access your email if it's "in the cloud"? Perhaps not; Mark Rasch has an excellent article on that specific issue as well.

Most small to midsize business owners want to ignore legal, security, and regulatory issues - I don't blame them; following those issues requires a substantial investment in time and resources that can distract from actually running their companies and performing the sin of actually generating a profit. However, these are issues that can change your life, since it's never good to be on the bleeding edge of case law. I'm not an attorney - I'm a security professional - but these days the two are inextricably intertwined. I see a future of unrelenting regulatory issues, ever dwindling privacy, and increasing technological complexity. As you wrote, some trends are inexorable - and cloud computing can be a very good thing. However, just like you'd want to weight-test a bridge before driving an M1A1 Abrams over it, it's a good idea to understand the limitations and risks of cloud computing before jumping into the clouds.

If you get a chance, I think you'd find the articles about legal and privacy issues in the clouds interesting. I would certainly be interested in your take - your perspective is always something to take note of.

Thanks for a very thought provoking article and best wishes for continuing recovery,

Bill Kennon

The legal system was not designed with the computer revolution in mind. In particular, the whole notion of intellectual property - what it is, how it can be protected - is up in the air.

On the other hand, we have increasingly more powerful machines, making powerful encryption much simpler. Cloud computing will spur even more developments in that field.

It looks to me as if the future is going to be a lot like I envisioned in The Mote in God's Eye. Everyone carries a pocket computer that serves as PDA, telephone, notebook, book reader, TV viewer, etc. This connects to their home system (or the ship's library in the case of a naval vessel) as well as to the cloud. That way users can own their data as well as have it available through the cloud. (Of course I didn't call it a cloud, but I did have all the ships able to draw on planetary libraries.) With local copies you are not cut off when the cloud goes down.

An Apple tip. I had some problems with Apple's MobileMe, which were not fixed until I did some obscure manipulations in System Preferences. It turns out that what I did was irrelevant; what fixed it was that the Mac somehow refreshed itself. A reader says


Or you could do it the fun way if you know it's DNS--

Steps to flush your DNS cache in Mac OS X Tiger:

1. Open up a terminal window (Located in /Applications/Utilities).
2. Flush your DNS cache with the following command: lookupd -flushcache
3. Type logout and press the 'Enter' key to close the window.

In Leopard, flush your DNS cache via the terminal:

dscacheutil -flushcache

I have no idea why they changed it. I suspect there were other terminal commands that changed as well.


You might want to copy this and hang onto it.

Subj: Statistics: Cooking vs Gastronomy

Your mention, in the Nov 2008 Column, of "two aspects to statistics, one mechanical and the other more abstract" -- the one being concerned with how to do calculations, the other with what calculations it makes sense (or doesn't make sense) to do -- reminded me of John Tukey's comments about statistical education.

Tukey used to say that we teach mostly cooking, but what most students need to learn is more gastronomy.

Rod Montgomery==monty@starfief.com

statistical reasoning

Jerry: I came across this website and thought you might find it of interest, since you've mentioned statistical texts.


For my part, I maintain my (minimal) understanding of statistics by using R. Do note the extensive book list. Besides statistics, R is excellent for graphing and data visualization exercises. R is free (GPL) and cross-platform.



I met Tukey many years ago at a conference on self-replicating systems in space. I was much impressed and sought out some of his publications afterward. His remark is a very good summary of what I have been trying to say. (Or perhaps what I have been trying to say comes from my having read Tukey many years ago...)

Most statistics classes teach techniques, and how to do inferences, but very little about the appropriateness of the models used. Most statistics manipulations assume the model is correct, and look to see how much your actual data varies from what you'd expect by random selection from the model. Alas, the world doesn't conform to models, witness the current financial mess.

I have no real remedy to all this, other than that I would require everyone to read The Black Swan as part of one's first statistics course.

We have recent news reports of viruses in the Pentagon, and severe restrictions on use of USB devices there. This prompted a reader to ask


Regarding the Fox News piece about malware hiding on external media, does your team of experts have advice on how individuals can verify that their media are free of the suspect code? Please respond via site.



Several of my advisors answered. Security expert Rick Hellewell says

Dr. Pournelle:

In regards to protection against the Autoplay functions and infected USB devices: Most AV programs should be able to 'catch' the attempt at an installation of malware from an infected USB drive, since many of those infections are 'known' by a current AV program. You can also do a manual anti-virus scan of USB-attached drives.

Readers should be aware that there are more devices than just USB "thumb drives" that might be a risk. There were many reports this year about infected devices such as photo frames that attach to your computer via USB. Those photo frames are a popular gift item during the holidays.

Passing around infected USB thumb drives is a great way to penetrate business systems. Some penetration testers have done that as part of their 'war games' against a business, by dropping some infected USB thumb drives in the business parking lot or entrance area. (Of course, those war games were done with the permission of the business. You'd want to be careful about doing that yourself.) That's a great social engineering way to get into a system. Most people will plug the USB drive into their system out of curiosity.

There is a setting on your computer to disable auto-run via your Local Security Policy.

On a Windows XP system: Use Start, Run, GPEDIT.msc. Then click down to "Computer Configuration", then "Administrative Templates", then "System". In the right panel, double-click "Turn off AutoPlay". Click on the "Enabled" button, and use the dropdown in "Turn off Autoplay on" to set it to "All drives".

On a Vista system: use the "Start" button (the round Window icon), then type in "autoplay" and press Enter. That should get you to the "Control Panel, AutoPlay" dialog (which is another way to get there). In that screen, set "Software and Games" to "Take no action". Also set the "Mixed content" choice to "Take no action". (You could set "Take no action" on the other choices as well if you want to be very conservative.)

Note that in a corporate/managed system, your network administrators may have already set this for you. If they haven't, strongly encourage them to do so. This will cause CDs to not auto-play, but that is a small price to pay.

Regards, Rick Hellewell
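For those who would rather script the change than click through GPEDIT, the Group Policy setting Rick describes maps to a registry policy value. The fragment below is my sketch of the commonly documented equivalent (the NoDriveTypeAutoRun bitmask, with 0xFF disabling AutoRun on all drive types); verify it against your own policy documentation before merging.

```
Windows Registry Editor Version 5.00

; Disable AutoRun on all drive types (bitmask 0xFF = every drive type).
; Save as a .reg file and merge it, then log off and back on.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoDriveTypeAutoRun"=dword:000000ff
```

On a domain-managed machine the Group Policy setting will win over a locally merged value, which is one more reason to ask your administrators to set it centrally.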

Eric Pobirs adds

This threat is pretty low for current stuff. Since SP2 on XP and more recent releases, the original autoplay function has been replaced with a popup menu presenting a set of choices. The user would need to agree to run the malware, which they could just as easily do without it autorunning.

And as mentioned, most AV apps have settings to automatically scan removable media, and have had them for a very long time. I can remember a community college computer lab that required the user to wait for floppy disks to be scanned before they were accessible.

- - Eric

Computer security practices don't always help, as Rick Hellewell notes:

There is a good chance that many users have done the "go ahead and auto-play, and don't ask me again" on that popup. But it's probable that those people are the ones who would go ahead and install the malware when the pop-up asks. There are a lot of people who have paid for the phony anti-virus programs ("Anti-Virus 2009", etc.) when they pop up with phony virus infection claims.


The bottom line on security was from Eric:

There are limits to how much you can protect people from themselves and still expect them to be of any use as employees bound to computers. (Although a nervous system connection that would cause them to feel pain when they do something bad to their computer could be a real breakthrough in training...) Extreme measures go as far as eliminating the ports, as RBT related, and others go in the opposite direction of having the users' boot sequence include an entire image installation. No matter what they do to the machine, it'll be exactly the same as it started upon the next boot. That makes for a lengthy boot, but Wake-on-LAN can be used to wake up all of the machines a few minutes before the workers are due to arrive. For all these efforts, the users remain the most difficult security problem. It has frequently been suggested that the users should simply be taken out of the equation, but nobody has produced a practical implementation as yet.

- - Eric

Of course we have been through this scare before. Robert Bruce Thompson recalls

Back a few years ago, when USB flash drives were starting to get common, one of my confidential correspondents told me that the NSA had a policy of filling all USB ports on desktop machines with epoxy. I don't know if that's true, but it's certainly credible. Of course, at the time they were less concerned about malware and more concerned about their data walking out the door on a flash drive.


Indeed. Some years ago friends at Langley told me that there was only one machine in the building that could accept a USB connection. If you needed to read a thumb drive you took it there. They would then find other means to transfer your data to your own machine. I don't know if that's still the practice.

When we were at the Professional Developers Conference in late October, and they were rolling out Windows Azure, I sent this note to Peter Glaskowsky (who was seated next to me in the front row):

> I guess I'm a bit more certain that cloud computing is with us to stay,
> and Microsoft will embrace and extend. I could be wrong.

His reply was:

I don't think you're wrong.

Cloud computing isn't going away. It's the right solution for many business needs.

Nor is it going to remain a small part of the computing industry. Most of its potential is yet to be realized. As I said before, in 20 years, we're going to wonder how we got along without it.

The bigger questions in my mind are:

1) What will be the mix between cloud computing as an outsourced service and cloud computing operated by IT departments directly? Outsourcing will be the better answer for smaller and more agile firms, those that are less security-conscious, and those with more variable demands. Microsoft can play in both areas, and is already thinking about how to deploy Azure services in future server operating systems.

2) Is Microsoft the right kind of company to play a major role in cloud computing? Microsoft's ability to facilitate cloud development is a strong argument in favor. Microsoft's tendency to lock its customers in to proprietary platforms will be seen by some customers as an argument against Azure, but I don't think this will be all that big a deal.

Personally I suspect Azure was envisioned as the platform-- and a marketable line of business-- for Microsoft's Yahoo subsidiary. When Yahoo refused to be acquired, Microsoft decided to integrate Azure more closely. As that effort proceeded, Microsoft realized this was a better strategy anyway, explaining why the company is no longer interested in Yahoo.

Of course, if Yahoo gets cheap enough, Microsoft may buy it anyway.

. png

Since then, Yahoo has had ample reason to regret refusing the $31/share offer Microsoft made. The speculation was that they had counted on Google to keep them going with advertising revenue and other sweetheart deals; then that went away, and Yahoo is in major trouble.

I thought at the time that Yahoo's insistence on more was a good thing for Microsoft, not because I saw the crash coming, but because I don't think Microsoft knows how to compete in the search engine/advertising market.

Azure is Microsoft's effort to embrace and extend fee-based services and get a steady income from subscribers rather than from selling - licensing - packages of software. I can sympathize with that: subscriptions keep me going between major book projects.


Dr. Pournelle:

Thoughts and prayers for continuing recovery. I'm not recovering, and so far I feel far less productive than you're proving to be.

When you once again have time and energy to do silly things so the rest of us don't have to, here's one thought: Microsoft Word does a fair job of checking grammar and spelling, but one thing that it doesn't appear to do is to check that for every "open parenthesis" there is a corresponding "close parenthesis." Nor does any version I've seen check for matching quotation marks, brackets, or braces. This seems to be an odd oversight. I've looked through "Format" and used (yeah, right) Help, and nothing so far.

I'm really gifted at overlooking the obvious, so I wouldn't be surprised to be wrong, but I sure can't find any such checking mechanism. It seems that this would be a handy proofing tool for writers.


I haven't found any such utility, and I'd sure like one. Of course one can write a macro to do that, but it does seem to be a natural feature for the grammar checker.
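Lacking such a feature, one can get most of the benefit from a few lines of script run over a plain-text export of the document. This is a hypothetical sketch of mine, not anything built into Word; it checks parentheses, brackets, and braces, and deliberately ignores quotation marks, since straight quotes can't be paired reliably without knowing which are opening and which are closing.

```python
def find_unbalanced(text):
    """Return a list of (line, column, char) for every unmatched
    parenthesis, bracket, or brace in the given text."""
    pairs = {')': '(', ']': '[', '}': '{'}   # closer -> matching opener
    openers = set(pairs.values())
    stack = []       # open delimiters awaiting a match: (char, line, col)
    problems = []    # unmatched delimiters found so far
    line, col = 1, 0
    for ch in text:
        col += 1
        if ch == '\n':
            line += 1
            col = 0
        elif ch in openers:
            stack.append((ch, line, col))
        elif ch in pairs:
            if stack and stack[-1][0] == pairs[ch]:
                stack.pop()              # properly matched pair
            else:
                problems.append((line, col, ch))  # closer with no opener
    # Anything still on the stack was opened but never closed.
    problems.extend((l, c, ch) for ch, l, c in stack)
    return problems
```

The same stack-based walk would work equally well as a Word macro over the document's text; the script form just makes it easy to test.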

The Great Vista Bootup Flap: I have been using Vista for some time, so when Microsoft at PDC and WinHEC devoted some time to telling us how much quicker Windows 7 would boot up than Vista, I was cheered but not all that amazed. Sometimes one of my Vista machines will take several minutes to boot up, but this is almost always after a big update has been installed.

Then we got this from Robert Bruce Thompson:

Does Vista really take 15 minutes to boot and another 15 minutes to shut down? I mean, I know Vista is a pig, but that seems excessive even for Microsoft.


"Windows Vista is in more legal hot water and this time the ones getting wet are the companies who've rolled out the operating system, not Microsoft.

A series of lawsuits have been brought against major US companies by staff claiming unpaid overtime based on the time it takes Windows Vista to start up and shut down.

Mark Thierman, a solo legal practitioner based in Reno, Nevada, told The Reg employees are losing up to two hours of pay a week thanks to Windows Vista.

Thierman calculated damages could run into millions of dollars over a three-year period. He's representing employees of Cigna health insurance, with cases also pending against AT&T Inc and UnitedHealth Group covering thousands of employees.

Thierman said the Windows Vista problem particularly affects workers paid by the hour, in places like call centers or in retail.

The crux of the issue is the fact that some companies have connected time-keeping systems to their PCs.

These systems are not activated until the user logs in, which is taking up to 15 minutes after the machine running Windows Vista has been turned on thanks to the long boot cycle. This means staff are in the office or shop but not officially working until they've logged in.

And when it comes to shutting down, people are logging off but hanging about without pay as the PC goes through the equally long shut-down cycle."

-- Robert Bruce Thompson

Several of us commented that we had not noticed any such problem. Eric Pobirs, who does on-site troubleshooting with all sorts and conditions of machines, said

Any system can be misconfigured. The companies responsible for those machines screwed up and this is a tempest in a teapot. Are they going to sue their own IT department? I've worked on Vista systems configured for heavily locked down domains with substantial login scripts. The time to a usable desktop was three or four minutes at worst.

Then David Em said

One of my Vista 64 systems starts up in a couple of minutes; the other takes ten minutes (it did even when no software had been installed). No idea why.

-- David

Bob Thompson asked gleefully why, if this wasn't a general problem, David would be having this difficulty.

Peter Glaskowsky replied

Whatever David's problem is, there's no reason why business systems should be taking 15 minutes to start. Such a long delay indicates a misconfiguration problem, as Eric said.

There are a couple of other issues here, though.

1) The time from cold boot to the desktop can be much shorter than the time from cold boot to the first running application.

2) These companies could set up these systems to sleep instead of shut down. If they want to save more power, they can be set to hibernate instead. Resuming from hibernation is still a lot faster than booting.

Anyway, I think these employees have a point. If they're getting paid only for the exact number of minutes shown by the timeclock, they are getting cheated. It probably amounts to ten minutes a week, not two hours, but that's not the point.

. png

Eric suggested that it had to be a bad driver causing David's problems. David Em replied

Anything's possible, of course. On one of these systems Alex and I installed the OS from scratch; the other came preconfigured from HP. It's interesting to me that there are many anomalies in how both systems manifest odd individual little OS problems -- things like one forgetting to load my preferred cursor icons, or the other having trouble finding the network now and then; then, magically, it appears. These things don't shut down the operation, but they do wind up taking my time to troubleshoot and correct. Of all the many XP systems I've run (somewhere between one and two dozen), I only rarely had to deal with these sorts of niggly OS-related but time-consuming issues. Overall, my experience with XP was very consistent across many different machines, both desktops and portables -- and once something required fixing, it tended to stay fixed.

-- David

I have some sympathy with David here: my Vista systems do mysterious things, like failing to find the net, or seeing a computer but being unable to connect to it until I reset both systems -- enough so that I've concluded there's something wrong with my network. But the odd thing is that Vista does recover, often spontaneously, so perhaps it's not my net at all.

Eric Pobirs summed up the situation thusly:

Companies often shoot themselves in the foot on IT matters.

About ten years ago I spent a few months working for a company called Nexus Integrated System. Their main business was telephony installations. It was a pretty simple Novell shop with a lot of Win95 boxes. The big problem spot was the dispatch department. This was a half-dozen workstations that each took a good ten minutes minimum to be usable, if they didn't crash in the startup process. To avoid the time and expense of training the women working there (typical IT failure #1), they set up all of their apps to run from the Startup folder. This included Outlook 97 and at least three other apps, to my recollection. On HP Vectras, mostly Pentium 90s, with 8 to 16 MB of RAM.

The fix was obvious. Add more RAM. These women were stuck for at least an hour of their work week waiting for these machines to boot and, with any luck, be usable on the first try. My immediate boss was terrified of doing anything that meant asking to spend money. He had only recently been promoted into his position after his predecessor had left for a job with the Getty Center. What I didn't know was that Nexus was about to be part of massive conglomeration of similar companies under the new name Exp@nets. (Yes, I know. It was supposed to be clever shorthand for 'expert at networks.') So every department head was trying to avoid expenditures for appearance's sake.

I assembled some calculations for how much this was costing the company vs. the cost of upgrading the workstations. The price of bringing the Vectras up to 48 MB (assuming none of the existing modules could be retained) was about the same as the cost of the idle dispatchers for two weeks. The IT department had a few SIMMs lying about and I took it upon myself to upgrade a single station to 32 MB. (This is where Win95 became really usable.) The difference was so dramatic the user was convinced I'd given her a new computer.

There was now, as intended, rebellion fomenting in the dispatchers' room. They'd all seen what was possible and been informed of the price. I was putting together an ROI paper to give to my boss in hopes he would grow a pair and take it to his boss, but then the axe came down. Due to the merger, the new parent company thought they had too many IT people. In fact, we were the only operation of the companies acquired in California that had any dedicated IT at all. So they were actually terribly undermanned, but last hired, first fired was the order of the day.

That was an early direct experience. The only thing I've seen change since is the scale.

- - Eric
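Eric's back-of-the-envelope reasoning is worth making explicit, since it is exactly the kind of arithmetic that wins (or ought to win) these arguments. Here is a sketch of the calculation; every number in it is a placeholder of mine, not a figure from his story.

```python
# ROI sketch for the dispatcher RAM upgrade. All figures are
# hypothetical placeholders chosen for illustration only.
HOURLY_WAGE = 15.00            # assumed dispatcher wage, USD/hour
DISPATCHERS = 6                # "a half-dozen workstations"
IDLE_HOURS_PER_WEEK = 1.0      # "at least an hour of their work week"
RAM_COST_PER_MACHINE = 90.00   # assumed cost of the memory upgrade

# Wages paid while dispatchers wait for their machines each week.
weekly_waste = HOURLY_WAGE * DISPATCHERS * IDLE_HOURS_PER_WEEK
# One-time cost of upgrading every workstation.
upgrade_cost = RAM_COST_PER_MACHINE * DISPATCHERS
# How long before the upgrade pays for itself.
payback_weeks = upgrade_cost / weekly_waste

print(f"Wasted wages per week: ${weekly_waste:.2f}")
print(f"One-time upgrade cost: ${upgrade_cost:.2f}")
print(f"Payback period: {payback_weeks:.1f} weeks")
```

Even with deliberately modest placeholders, the payback period comes out in weeks, which is the whole point: the money was being lost whether anyone wrote it down or not.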