
Computing At Chaos Manor:
The Mailbag

Mailbag for March 12, 2007
Jerry Pournelle jerryp@jerrypournelle.com
Copyright 2007 Jerry E. Pournelle, Ph.D.


We get a lot of mail. I can't use all of it, but please keep it coming!

Subject: Computing at Chaos Manor, March 6: Virtualization


I just read the March 6th Computing at Chaos Manor.

I have been interested in virtualization for quite some time, but have been hesitant to dive in due to certain limitations of the technology. Much has changed since then, and I am now planning to build a new PC that will support the resource requirements of running virtual machines.

Windows-wise, both VMWare and Microsoft have come a long way in virtualization, although from what I have heard and read, VMWare is still far ahead of Microsoft. That said, I have also read that each product addresses certain needs and has certain deficiencies, so a potential user has to weigh his or her needs in order to select the best product.

One of the limitations that I see is that a client VM is geared more toward testing and running applications that don't require more than a generic environment. My research shows that people often ask, "Why doesn't my device work in my VM?" The most common question regarding devices pertains to the USB port. The VM does not usually support every kind of device, even though the guest OS does. This limits the functionality of the VM for that specific user. The VM sees only a specific set of hardware resources, regardless of what really exists.

My ideal VM is an environment where I can get work done. For example, I may want to set up one VM with software development tools, another with P2P software, another for beta testing, and yet another as a test bed for the developed software. The latter is perhaps the biggest push for a VM, or at least initially it was. Apparently, a VM is ideal for software testing, unless of course the software being tested needs to access a specific hardware device (for example, an application that takes readings from a USB-connected digital thermometer). If that hardware device is not supported within the VM, configuring a test bed in this environment holds no value. In some cases, my need for a VM is to isolate applications and their potential effects on the operating system as a whole. If I have only one piece of hardware, I don't want to use it both as a test bed (in some cases I can't) and as a platform for beta software. I want my VM to be disposable. If I or some malware should muck up the VM, I want to simply trash that image file and replace it with a backup.

On the other hand, server virtualization is heaven! Given one piece of hardware, various server environments can be virtualized. I can have various servers running different server applications (database, Internet server, etc.). In contrast to a client computer, for the most part server software works well in a generic hardware environment.

I see virtualization as the way computing technology will go, but I want a (virtual) sanctuary that will not only protect me from the outside world, but will also allow me to fully exploit my hardware's resources.

I can't close this message without first thanking you and saying that I appreciate all the silly stuff you do so we won't have to :-)

Salvador Garcia, BSCS

Thanks. I believe that within a few years, virtually all computing will be done in virtual machines. There are many advantages for just getting work done, as you observe; and the era of computing plenty is upon us, so it ought to be technically pretty simple.
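Salvador's notion of a disposable VM is easy to automate, by the way. Here is a minimal sketch in Python — the image and backup paths are invented for illustration — that trashes a suspect virtual disk image and restores a known-good backup copy:

```python
import shutil
from pathlib import Path

def restore_vm_image(image: Path, backup: Path) -> None:
    """Throw away a (possibly compromised) VM disk image and
    replace it with a known-good backup copy."""
    if not backup.exists():
        raise FileNotFoundError(f"no backup at {backup}")
    if image.exists():
        image.unlink()             # trash the suspect image file
    shutil.copy2(backup, image)    # restore the pristine copy

# Example (hypothetical paths):
# restore_vm_image(Path("C:/VMs/testbed.vhd"),
#                  Path("D:/Backups/testbed-clean.vhd"))
```

The VM product then simply boots the restored image; the point is that the working copy is never precious.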

On Virtualization and licensing:

Subject: Keeping costs down with Parallels


Your Mac users probably already know this, but I discovered that Microsoft will allow you to install XP in Boot Camp and in Parallels with only one license required. It takes a call to authorization customer service to do the second one, but once I explained that it was going into a different partition on the same machine, I got a number in minutes, and both versions are authorized and working.

Geep Howell

Thanks! I think we had mail on this before, but it does no harm to say it again. On the other hand:

Subject: VirtualPC vs VMWare - license requirements

I've got a client where I think using virtual machines is going to buy him some time until he can switch to new software in about 9-12 months.

Right now he will need the ability to have 4-5 virtual PCs that will run an Access-based program as well as IE.

I'm leaning toward MS Virtual PC simply because I know it's a free download. VMWare has some free downloads also, but I'm really not sure if I need one or two of them.

What about the license requirements for Windows itself? Microsoft is only talking about Vista and it looks like enterprise customers get a break but the common folks don't. If he's running XP, does he need a separate license for each VM? Am I right in reading that the host OS has to be at least XP Pro and that XP Home is not an option?

As much as I like Microsoft products, I hate the fact that it takes hours or days to figure out how to stay legit with them without paying an arm and a leg. Heck, I've got another client that I sold a three-year Open Value license to. We're nearing the one-year anniversary, and I can't figure out what SKU to order for them.

Universities could probably start offering four-year courses in Microsoft licensing.



Apparently there are people in Microsoft PR who haven't taken that course, because as of press time I don't have an answer. It looks as if it would be worth doing the research and doing a column segment on licensing requirements for using an OS in Virtual Machines. Do note that Windows 2000 does not require activation, and is often plenty good enough whatever the status of Microsoft support. And one more reader says,

Legally, I think you do need a separate license for each VM. It's not too hard to get around that (activate, then clone your machine and keep all the hardware settings the same), but not if you want things kept on the up and up.

I use VMWare Workstation and buy a copy, even though I've always gotten Virtual PC for free since Microsoft acquired it. For me, the snapshot feature alone in VMWare is worth the price of admission: when writing or testing software, snapshotting makes it easy to move back and forth through configurations you've made. Built-in screenshot and movie-making features are also quite nice.

But if your client just needs to virtualize some machines, the free Virtual PC 2007 may be the way to go.

- Walter

Thanks. We have found Virtual PC to work well with the Mac PowerBook; alas, I don't yet have a MacBook Pro.

Subject: Things have changed a bit....


In light of recent developments, it seemed worth forwarding this back to you. The gist of the exchange below, 2.5 years ago, was my remark:

> So, I wonder what is it that can be done on a PC that can't be done on a Mac?

And your reply: Play Medieval Total War.

Of course, this was pre-Intel Macs.

And, of course, Parallels, now in a release version, handles essentially all of the Windows compatibility issues. I use it and it works extremely well. The breaking news is the release of the 2nd beta of the competing VMWare Fusion product, which supports DirectX 8.1 games! I'm not sure if this immediately answers your earlier objection, but I'd bet a month's pay that one of these two products will give you a very playable experience with nearly any PC game you would care to name. The competition between these two products is clearly going to be a delight for users.

I know you're busy with Inferno 2 (what is the working title?), but after that, get a MacBook Pro (with PLENTY of memory) and you'll have a portable solution that runs everything.


Well, Parallels still doesn't support DirectX and the various features of advanced video boards. We can only hope.

Subject: Pipes in Windows Command Prompt

Dear Dr. Pournelle,

In your Mailbag from March 5, 2007, you wrote the following:

Over the years people kept sending me UNIX shells that would make it usable, and they did help. UNIX has some interesting features, particularly the whole concept of pipes in which the output of one application is fed as input to a new application; but DOS and WINDOWS programs weren't written to take advantage of that.

Starting with Windows 2000 (I'm not sure about Windows NT) and continuing with Windows XP, the command prompt "cmd.exe" allows pipes and redirection, albeit in a limited fashion; not all commands support pipes and/or redirection. I do agree with you that it was one of the features I found most useful in UNIX when I started using it in the mid-80s.

One note of interest is that with both UNIX and the Windows command prompt, data flows through the pipe as raw text. This has certain limitations, and requires that programs providing input to the next command in the pipe format their output in a specific way. Programs such as Osh (link) introduced us to objects flowing through the pipe instead of raw text. Osh, though, is a program, not a true command shell. Microsoft's Windows PowerShell (link) allows the data flowing through the pipe to be object-oriented. So your directory listing, instead of having a bunch of columns that the next command needs to parse, is passed as an object, and the next program doesn't care about the format anymore. I haven't played with it myself yet, but it is certainly an interesting development for those of us who like using the command prompt. A good introduction to PowerShell is here: Windows IT Pro article link.

Glenn Hunt

Thanks. I was vaguely familiar with pipes in Windows 2000 and I actually did use them with some Python text filter programs I wrote, but I haven't had to do any massive text manipulations in the last couple of years.
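For the curious, such a text filter is nothing more than a program that reads standard input and writes standard output; cmd.exe will happily pipe into it, as in `dir | python keep.py .log`. A minimal sketch — the script name and pattern are of course arbitrary:

```python
import sys

def keep_matching(lines, pattern):
    """Return only the lines containing the given substring --
    the classic grep-style pipeline filter stage."""
    return [line for line in lines if pattern in line]

if __name__ == "__main__":
    # Pattern comes from the command line; input arrives on stdin
    # from whatever command is upstream in the pipe.
    pattern = sys.argv[1] if len(sys.argv) > 1 else ""
    for line in keep_matching(sys.stdin, pattern):
        sys.stdout.write(line)
```

Because it touches only stdin and stdout, the same script works unchanged under cmd.exe, PowerShell, or a UNIX shell.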

PowerShell does sound interesting, and I should look into it. Thanks.
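From what I have read, the PowerShell idea — objects rather than raw text flowing between pipeline stages — can be mimicked in ordinary Python with generators passing dictionaries, which makes the difference easy to see. A rough sketch; the record fields here are invented for illustration:

```python
def list_dir(entries):
    """First stage: emit structured records rather than text lines."""
    for name, size in entries:
        yield {"name": name, "size": size}

def where(records, predicate):
    """Filter stage: works on whole records, no column parsing needed."""
    return (r for r in records if predicate(r))

def select(records, field):
    """Projection stage: pull one field out of each record."""
    return (r[field] for r in records)

# Pipe the stages together, PowerShell-style:
entries = [("notes.txt", 1200), ("video.avi", 700_000_000), ("log.txt", 50)]
big = select(where(list_dir(entries), lambda r: r["size"] > 1000), "name")
print(list(big))  # -> ['notes.txt', 'video.avi']
```

No stage ever parses columns out of text; each works on whole records, which is precisely Glenn's point about object pipes.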

A couple of weeks ago we had a discussion of vulnerabilities and operating systems. Chaos Manor Associate Eric Pobirs wrote a comment that arrived after press deadline.

The OS that has the overwhelming majority of desktops is going to be by far the most attractive target. Automation of infection is highly desirable, whether it is from a central source or self-propagating.

2. I have no idea what percentage of botnet clients got that way because they were infected by worms versus direct infection by malware, but the consensus among security experts that I trust is that malware is now by far the bigger risk. I certainly get a huge amount of spam that either contains executable attachments or includes links to malicious URLs. I haven't heard about many widespread worms recently, but perhaps I've missed something.

I would expect the malware makers to use every tool at their disposal they believe will deliver worthwhile results for the effort. There are more safeguards against self-propagating items today but the opportunity cost is far too low to leave it out of the payload. Especially when the growth of broadband makes it so easy to include in a single transfer. Back when most were still on dial-up, there was a lot of constraint on how big the payload could be, not only in attracting suspicion but also in having the transfer fail before completing.

This is why it was so valuable to gain control of an email app. The bad guys got remote control of a big chunk of functionality with just a little code from their end. Outlook Express became a favored target because it was present on virtually every machine running the most common OS by far. What would it matter if another email app could be easily subverted if it could only be found on a small subset of systems? Would it even be worth the investment of effort to find such potential targets?

As it got harder to find good ways to take over OE, broadband made a new approach convenient. A few years ago the payloads got a lot bigger, into the dozens and hundreds of kilobytes. The bulk of this was most often a self-contained email engine. Now it didn't matter if there was an unpatched hole in IE; they just needed to get on the machine and running. When users were frequently visiting web sites with pages of many hundreds of kilobytes, slipping in a payload that would once have been considered massive was no longer a problem.

Once it became easy to install their own software it was no longer worth the trouble of trying to find a new attack vector for OE. It wasn't that new ones didn't turn up but the reduction in major worm outbreaks is owed to more than security improvements. I believe it has much to do with the bad guys moving on to better opportunities.

3. Most of the really nasty stuff comes out of organized criminal groups in Russia, Eastern Europe, and Asia. Do you seriously think that buying a few Macs for their malware programmers would stop them?

The talent is going to go with what they know. Unless Apple gains huge amounts of marketshare, why would the bosses bother buying machines and demanding their coders learn them? A criminal enterprise is a business like any other enterprise and they'll play the numbers they find favorable.

4. As you say, Linux is readily available to anyone. Where, then, are the exploits against Linux? There are a lot of high-value targets running Linux, and yet the exploits against Linux are few and far between. There are tens of millions of systems running Linux and OS X, surely enough to make it worth someone's while to attempt to exploit them. Where are all those exploits?

There have been reported exploits against Linux. Many, in fact; some were shocked to see reports that the yearly tally ran roughly equal to Windows'. The difference was that nearly all of the efforts, as I've already stated, were against servers. That is where the big money is for serious professional criminals. Banks don't call the press when their systems have been hacked; they keep it quiet if at all possible for fear of a PR disaster. A blackmail scheme in which the mobsters present evidence of their remote access and demand payment is the hot ticket. It is in both the criminal's and the victim's favor to keep such incidents quiet: the predator doesn't want the rest of the potential prey alerted to its tactics, and the prey doesn't want its weakness made public.

There are tens of millions of people who visit Las Vegas and Atlantic City every year who are not attracted to the typical slot machine. But that hardly matters to the casino operators when there are hundreds of millions more visitors who are drawn to those machines. When the success rate is extremely high with the most numerous demographic, there has to be a huge payoff for it to be worth going after the remainder.

Another factor against targeting the smaller group is whether they are more cognitively resistant. If the smaller group not only requires investment in a different attack or sales approach but is also more likely to be suspicious, the value of the investment is further decreased. Linux needs 100 million Aunt Minnies to become attractive to the malware makers, and it's nowhere remotely near that installed base.

Another thing left unconsidered is that a major portion of the successful malware out there doesn't rely on security faults to get onto PCs, just cooperation from the user. In those cases, the nearly exclusive targeting of Windows systems is plainly driven by their majority status.


Good sense as usual. Thanks.

Net neutrality continues to generate mail. First a question from a reader:

Subject: Service levels?


In regard to the statement from this week's mailbag:

"There is no way AT&T agreed to that. ISPs have a longstanding division between the service levels given to businesses vs. consumers. Consumer lines are 'best effort' but lacking in any guarantees compared to business-class lines that carry more assurance for a greater cost."

My experience is that ISPs provide the same service for a 6-meg business account as for a 6-meg residential account. I get the same runarounds from tech support, and my throughput is plagued with the same bottlenecks, so what am I getting for the premium price? There is no difference between the infrastructure of a business connection and the residential connection a few blocks away.

The ISPs that I have talked to do not give any guarantees as to throughput unless you go for a high-end account, and even then they leave lots of wiggle room. Last week I called one and asked about expected speed if I signed up for a 6-meg connection, and they would not commit to any speed guarantee at all. It turned out that they are a reseller of the national ISP I am currently using, so how would I be further ahead, I asked myself.

Let's toss this out there for discussion. I am interested in your readers' responses because I am thinking of switching to a residential account and saving on my monthly bill.

Long.............. time reader,

Jim Potter

If they can't guarantee bandwidth for a premium, it wouldn't seem to make sense to pay for it, but clearly I am misunderstanding your question. It does seem odd that a reseller would offer more than the original bandwidth provider. I use Time Warner Cable, and although I get odd slowdowns once in a while, I only really notice them when I am trying to FTP a picture up to the web site host. Generally it works well for me, including when I use Skype to connect to TWIT.

As I understand it, what you get for the extra you pay for a business account is a fixed IP address. I once thought that would be useful for setting up secure (VPN) connections to Niven and others, but it turns out I don't need that, so I never got one.

RE: Net Neutrality Issues

Dr. Pournelle,

I think I am missing something, or misunderstood it all, but I thought that Net Neutrality dealt with the fact that you have a "black hose" (the bandwidth equivalent of a black box) running to your house/computer/network from the Internet, and what you ran back and forth through it couldn't be your ISP's worry. Note that this in no way means that you could or should expect a bigger "hose" for the same price as before just because you would like to play fireman.

In other words, as a more practical example, even if your ISP sells voice traffic as one of their main businesses, they can't discriminate by reducing your contracted bandwidth for the portion that is used for VOIP, even if it were in competition with their analog voice (or add-on VOIP) services. Likewise, a cable-based ISP can't discriminate against video-over-the-Internet traffic by blocking or throttling certain types of traffic tied to the video which is competition for their cable channels.

Now if you want to be guaranteed that your streaming HD video is perfect, you need to pay to have the firehose, and the pressure to go with it, hooked up to your house; otherwise you get the "standard" service.

Which is really only logical, it seems to me.

Best regards,

James Siddall jr

Seems logical to me, too. Thanks.