Dr. Jerry Pournelle


Computing At Chaos Manor:
November 10, 2008

The User's Column, November, 2008
Column 340, Part 1
Jerry Pournelle jerryp@jerrypournelle.com
Copyright 2008 Jerry E. Pournelle, Ph.D.

Microsoft held their Professional Developers Conference (PDC) and Windows Hardware Engineering Conference (WinHEC) back to back in the Los Angeles Convention Center during the last week of October and first week of November. PDC was just before the election, and WinHEC was just after, and that did have an influence. During the exhibitor party on election night they set up big-screen TVs and showed the marvelous CNN "holograms".

Of course both conferences were planned and scheduled well before anyone suspected the full extent of the financial meltdown. (For that matter, we may not yet know, but that's another story; I'll have some comments later.) Both these conferences are traditionally important — indeed key — events for those trying to understand the future of the computer revolution.

WinHEC, the Windows Hardware Engineering Conference, is annual, but PDC, the Professional Developers Conference, is held only when Microsoft thinks there is enough new to make it worthwhile. In past years the two conferences were about the same size; this year PDC was sold out, but WinHEC was the smallest I have seen in the dozen years I have been going to it. A lot of the conference presentations were recorded and are available online, and those who want the gory details can go read about them. In both cases the opening keynote is worth your time if you have any interest in these matters; the other events can become very technical indeed.

The big announcement at PDC was Windows Azure. That was the theme of the opening keynote by Microsoft Chief Software Architect Ray Ozzie; indeed, Azure was the major theme of the entire conference. There were references to Windows 7, but most of that was reserved for WinHEC. We'll get to Windows 7 in a bit.

Windows Azure

There's not a lot new in Azure: most of it has been available for developers in one form or another as part of Microsoft's .NET and NetStudio. What's new here is emphasis and integration. By making Azure the theme of PDC, Microsoft is unambiguously stating that the company has at least one of its heads in the cloud, and Azure will be a big deal even though there won't be any consumer implications for the next couple of years — but developers should take note, now, when they are deciding what to work on for the future. Microsoft apparently hopes to get not only developers but all enterprise customers involved in Azure, and then to move on to general business customers, then to Windows power users, and finally to everyone including Aunt Minnie. No one now knows how important the web computing market is going to be, but Microsoft is staking out a large claim.

To begin, Microsoft is opening a big server farm to support Azure. At the moment, Azure is pre-beta and available only to developers, particularly those who went to PDC, but that changes soon. Most readers won't find Azure interesting for a year or so; but in a few years it's going to be hard to avoid. Azure is Microsoft's horse in the race to what is now called "cloud computing". Cloud computing is supposed to be different from Microsoft's 2001 cloud system Hailstorm, and decidedly different from the Network Computing (NC) craze of the 1990s.

Of course NC was an attack on the very existence of Microsoft: it was going to obviate the need for an operating system. All you'd need was a thin client connected to the Internet. All your data would be out on the web — "in the cloud" — and all your applications would be run out there. Of course this was in a time when high speed Internet access was common on academic campuses. Off campus such access was chancy: long time readers will recall my lengthy battle to get affordable high speed access at a time when neither ISDN nor DSL nor cable modem was available at Chaos Manor. The spotty nature of high speed net access doomed Network Computing at birth.

By the time Hailstorm came along, most of the potential customers Microsoft was wooing had high speed Internet access, but it still didn't catch on for various reasons, and Microsoft retreated: Hailstorm became a part of .NET and the name faded away. Microsoft fell back to regroup; but it hadn't given up the cloud.

Cloud Computing

Most of us have a vague notion of what cloud computing is, but are a bit fuzzy on the details, largely because the details remain fuzzy. For those who want a systematic presentation, Cloud Computing, by Michael Miller (not the same Michael Miller who is my friend and former BYTE colleague and later EIC of InfoWorld), from Que Books, is a good introduction; there's a Kindle edition on Amazon. Miller discusses the concept, various problems and solutions, and the enabling technology for what Microsoft thinks is the next big thing in computing.

In theory Network Computing, Hailstorm, and all the other avatars of cloud computing are a great idea. You can carry a small, thin laptop — my MacBook Air would be far more than you'd need — and instantly have available everything you'd have on your desktop. Everything would be synchronized: calendar and meeting schedules, email, all versions of your documents and spreadsheets, as well as all your other work. Everything would automatically be backed up at all times. There is no point failure source: any machine you have, indeed all the machines you have, can be destroyed utterly and you will have lost neither data nor applications. You need only access to the web, and it won't matter what machine you access it with. You'll have your data and your apps.

Collaborations would be much simpler. Changes in documents would be automatically inserted in everyone's copies even as you are working. A number of people could be working on the same thing at the same time. Someone would have to be designated as the project master with final say on what changes would be incorporated, but that's already a solved problem. Alerts would propagate automatically.

Many organizations already work this way, but they have to maintain large internal networks to do it, and the cost of keeping server farms with sufficient redundancy can be high: not only in hardware and software, but IT personnel. With Cloud Computing all this happens automatically with no local equipment investment or maintenance and no need for local IT gurus: small companies and consultants can be added to the network at any time and from any place.

It's great for commerce, too. Small outfits could have many of the services and features available now to Amazon and other big boys. The goal of Hailstorm (which became .NET My Services) was to allow you to have one logon identity that would then give you access to dozens of commercial sites. They'd all know your commercial information: that is, the cloud would know it, and could make it available to the various shops you would find in the virtual Internet Mall. No need to maintain many separate accounts on many different Internet stores.
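The one-logon idea is simple enough to sketch: a single identity service signs your commercial profile once, and each participating store verifies the signature rather than keeping its own account database. A minimal illustration in Python follows — the names and the shared key are hypothetical, and a real system of this kind would use public-key certificates rather than one shared secret:

```python
import base64
import hashlib
import hmac
import json

# Shared secret held by the (hypothetical) identity service.
SECRET = b"identity-service-signing-key"

def issue_token(user, profile):
    """The identity service signs the user's profile once."""
    payload = json.dumps({"user": user, "profile": profile}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def verify_token(token):
    """Any participating store checks the signature instead of keeping
    its own account; returns the claims, or None if the token was altered."""
    blob, sig = token.rsplit(".", 1)
    payload = base64.urlsafe_b64decode(blob.encode())
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return json.loads(payload) if hmac.compare_digest(sig, expected) else None
```

One token, many stores: the customer signs on once, and every shop in the virtual mall can trust the same credentials.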

And finally, Cloud Computing at least in theory doesn't care what operating system you use to access it. Obviously there will have to be different versions of applications, but since most of the work is being done by the computer farm out in the cloud, that's not very difficult. There will be some minimum requirements for what's accessing the cloud, but most modern pocket telephones have more than enough power. Use a powerful laptop like a ThinkPad or MacBook Pro; a more limited laptop like Khaos, my MacBook Air; a small sub notebook; an iPhone or other modern phone; the entire power of the Cloud will be available, and your experience is limited only by the inherent limits of your interface. In theory there is no reason you can't evaluate the Einstein Tensor through your iPhone and plug the result into the proper place in the 20 megabyte document you're working on.

That has always been the goal. Azure will, Microsoft hopes, make it real. It may take time, but this is the start.


If Cloud Computing is such a great idea, why hasn't it worked before?

There are a number of reasons, most related to hardware. Obviously Larry Ellison's Network Computing dream that was going to put Microsoft out of business was doomed from the beginning. Not only didn't we have high speed Internet access, but the display capabilities of our local machines were highly limited. Those were the days when you needed a "Windows Accelerator" video board before you could use Microsoft Word in Windows; the days when a "Page Down" command was a good excuse to go get a cup of coffee. Word processing by accessing a text editor resident on the web was even slower. Even those who hated Microsoft soon gave that up.

It's not so clear why Hailstorm — at least as transmogrified into .NET Services — didn't catch on. Part of the problem was confidence and faith. There were both hardware and software limits, and any glitch undermined confidence in the whole system. Meanwhile, the cost of both local computing power and local storage was plummeting. Everyone has misgivings about trusting vital work to systems not under their control. The result was that many concluded it was both cheaper and safer just to get a better PC and a "box of drives" for local networked storage. Enough enterprise developers found .NET worthwhile to keep it going, but it never took off as Microsoft had hoped Hailstorm would.

Pournelle's Law Revisited

In the early days of computing everything was done in batches. There was a department called Central Computing or some variant thereof, and it had custody of a big mainframe machine. Those who needed computing services took their problems to a programmer, who formed it into a FORTRAN (or other) program and had all this keypunched. You got a box of IBM cards, which you took to Central Computing, generally to a Dutch door where you handed your box to a technician. Depending on how busy things were you either waited, or came back the next morning, to get your box back along with a printout of the results. If you were lucky the program had compiled and ran properly and you had your matrix of correlation coefficients or whatever it was you needed. This got very old very quickly.

Then John McCarthy wrote time sharing software, and if you were one of the favored you had access to a terminal and ran your programs yourself, or, more likely, you found whoever was designated to do that for you. It was a lot better and faster than the old batch system, but it was still slow.

Microcomputers changed all that. Now users had their own machines. Sometime in the early 80's several developers began to write time sharing software so that you could have multiple users on a single PC. I thought this the very antithesis of distributed computing and a violation of the principles of the computer revolution, and I devised one of Pournelle's Laws: One user, one computer. I later revised that to "One user, at least one computer," and later still to "One user, at least one CPU." I pointed out that no one ever wants to share a computer with someone else.

Does Azure violate that principle? It certainly makes one dependent on someone else: if the cloud server farm goes down, or there is some interruption of your access to the Internet, you're dead. The Cloud means it when it says all your data belongs to us. The same is true of your applications. While no single machine is a point failure source, Internet access certainly is.

Some of this can be overcome through suitable architecture. Storage costs are low and falling: it's not impossibly complicated to be sure you have a local copy of your data and many of your applications as well as having copies in the cloud. It's not trivial, though. As an example, think of writing an essay using Azure. You are writing on your local keyboard, but the keystrokes are sent up to the cloud. They may also be sent directly to your screen — what we used to call half duplex back in the early days — or they may go to the cloud, which then sends back what it thinks it received, and that is what is displayed on your screen. This used to be called "full duplex", and in the days of 300-baud modem connections it could be annoying: you typed a message, and it took a while — anywhere from a noticeable fraction of a second to many seconds — for it to display on your screen. I know people who wrote books that way, but not many. Today's connections are faster, but if you're connected through a satellite you may find the delays impossible. Still, we have very fast hardware, and there are technological remedies.
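The keep-a-local-copy architecture is easy to sketch. In the toy Python below (the cloud push is a stand-in function, not any real service's API), every keystroke is echoed to local disk immediately, and the cloud is treated as a queue that can fail without losing anything:

```python
class Draft:
    """Local-first editing: every change lands on local disk before any
    cloud push, so losing the connection never loses keystrokes."""

    def __init__(self, path):
        self.path = path
        self.text = ""
        self.pending = []          # changes not yet confirmed by the cloud

    def type(self, chars):
        self.text += chars         # local echo: the screen never waits on the net
        self.pending.append(chars)
        with open(self.path, "w") as f:
            f.write(self.text)     # the local copy is always current

    def sync(self, push_to_cloud):
        """push_to_cloud is a stand-in for whatever the service offers;
        it returns True on success, False when the connection is down."""
        while self.pending:
            if not push_to_cloud(self.pending[0]):
                return False       # cloud unreachable: keep working locally
            self.pending.pop(0)
        return True
```

If the satellite link drops, `sync` simply fails and the pending queue waits; the essay itself is never hostage to the connection.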

Much the same can be said about security: modern hardware will let you do everything through encryption, and it will all be invisible.
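The encrypt-everything point can be made concrete: encrypt on the local machine, and the cloud only ever stores ciphertext. Here is a toy sketch in Python using only the standard library — the SHA-256 counter-mode keystream is for illustration only, since a real deployment would use a vetted cipher such as AES, which the standard library does not provide:

```python
import hashlib

def keystream(key, nonce, length):
    """Toy counter-mode keystream built from SHA-256. Illustration only —
    use a vetted cipher (e.g., AES-GCM) in any real system."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key, nonce, data):
    """Encrypting and decrypting are the same XOR against the keystream."""
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))
```

The document is encrypted before it leaves the machine and decrypted after it comes back; what waves in the air is ciphertext.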

There are many other reasons to dislike cloud computing, but in just about every case, there are technological fixes. I am temperamentally inclined against having my data out there waving in the air, but I also have to admit that part of my dislike may be irrational, assuming that civilization doesn't collapse — and if it does, I'll have much larger problems. I won't be using cloud computing without some guarantee that I'll also have a local copy, but we've already covered that.

Apple's Path

Apple has implemented something called MobileMe which isn't really Azure light, but it does have some of the advantages and disadvantages of cloud computing. Some of the Mac MobileMe features work well, including photo sharing. Others don't work well at all.

For example, the other day my MacBook Air began complaining that my .Mac account was rejecting my password. At the time I was trying to make some notes in Word 2008 for the Mac, and had no need whatever to be connected to my .Mac account. Indeed, I shouldn't even have been hearing about .Mac: on the iMac 20, Apple | System Preferences showed a cloud-shaped icon labeled MobileMe, but on Khaos the MacBook Air it was still a world icon called .Mac. But .Mac or MobileMe, Apple was making it impossible for me to work. I kept getting the message that .Mac was rejecting my password; I'd click OK, the message would go away, and within a minute it would be back. Dismissing it with the red close bubble made it vanish for only a few seconds. It was, in a word, impossible to get any work done, because the .Mac message was always on top of everything else.

Well, OK, let's see if I had given .Mac the proper password. The problem is, my memory is pretty bad. I got out my log book, found my .Mac user name and password, and typed them in. No change. I did other things to check my password and found that wasn't the problem; in fact the iMac 20 was working properly. Clearly the MobileMe server was having a problem, possibly because it didn't realize that .Mac was meant for it. That's just a guess. For more on this, there are notes from my day book appended at the end of this column.

Eventually I figured out that if I logged out of .Mac, the system would stop trying to log me in. That worked, and I got on with my document. I'm not as familiar with the Mac system as I ought to be — much of what I learned last spring has gone away, victim of the 50,000-rad treatment — so I asked Mac users what was going on. I was told that this was a MobileMe problem, and that in any case I ought to be seeing MobileMe and not .Mac in the Finder; I needed to update the OS on Khaos the MacBook Air. To do that I had to log back on to .Mac, and there was considerably more monkey motion, but eventually the system let me log in, and Khaos got updated. It now shows MobileMe and the cloud icon, not a world and .Mac.

So far so good; but then the MacBook Air didn't see any other machines. It had in the past: I used to be able to connect to both the iMac and various Vista machines. Moreover, Imogene the iMac has always seen Khaos the MacBook Air just fine, and I can connect to Khaos from Imogene; it's just that nothing I could think of would make Khaos see Imogene or any of the Vista machines. There is no command like Vista's "Network" button, or the various network-places commands in XP, that will make the MacBook Air poll the net and see what it can find. I expect someone will tell me there is one, but for the moment I have no idea.

On the other hand, MobileMe now offers to synchronize my machines, and there's an iDisk cloud icon on both the iMac 20 and the MacBook Air. Eventually I'll figure out how this all works.

Then, suddenly, there were places in the Shared area, and Khaos saw the rest of the net, both Mac and Vista. Apparently it takes a while — like half an hour — for the system to get working properly after updating to the revision that took me from .MAC to MobileMe. I can hope that's a one-time thing, not something I'll have to go through on each boot up; half a moment and I'll go find out. Meanwhile, I can also hope that my local network doesn't have to access MobileMe before I can do local file transfers.

I am pleased to say that Khaos is back in form: I reset her, and she immediately came up able to see the whole network. Apparently it really does take about half an hour for the system to recover from installing the update that brings in MobileMe and banishes .Mac; and while that recovery is taking place there are no warnings and no error messages, and you have to be logged on to MobileMe while it's happening. If you can't log on to MobileMe the recovery won't take place.

This very much illustrates my point that if you are going to make your users rely on a cloud, it's terribly important that you be sure to make cloud access utterly reliable. Apple hasn't got that yet with MobileMe. I haven't any idea why, but my guess would be insufficient resources devoted to it. I do have confidence that Apple will get it right. Real Soon Now.

And as I write this, Microsoft is using its early adopter developers to test a pre-beta of Azure. Azure is, I am told, a great deal more ambitious than MobileMe. I wouldn't bet a lot on that. Apple understands the value of offering net services for a fee. They've been doing that for years with .Mac, and I doubt that changing .Mac to MobileMe was lightly done.

The Wave of the Future

An industry observer says:

Cloud computing is an irresistible tsunami; it is the future of enterprise computing. It brings together the strengths of centralization (i.e., the mainframe) with the innovations of new technologies (distributed computing and interaction devices, ubiquitous networking, virtualization).

Amazon are the current leaders in cloud computing with their EC2 and S3 services, while Rackspace/Mosso/Cloud Sites, Terremark, IBM, Sun, HP, Dell, 3Tera, Salesforce.com, VMware, and now Microsoft, et al., are getting into the act. Today a startup would be crazy to run its own infrastructure when these options are available, and large financial and pharmaceutical companies are actually Amazon's largest cloud computing customers, with business groups doing end-runs around their Soviet-style lethargic, inefficient, overpriced internal IT departments in order to get things done quickly and more cheaply.

Some people will never be reconciled to cloud computing: they'd rather have their own local cloud. Given the falling prices of storage, and the way hardware capability has leaped ahead of software capability, this is likely to be a viable option.

Prediction is difficult, especially if it's about the future, but it's pretty certain there's a cloud in the future, and at some point we will all be using it some of the time. Most of us will also keep our own capabilities. Personal computers aren't going to be reabsorbed into the Borg.


As noted above, WinHEC was the smallest I have ever seen it, considerably less than half the size of PDC. I am sure that having PDC the week before had something to do with that; the rumor is that the economy had more to do with it, in that attendance was much lower than registration.

I took Khaos, the MacBook Air, to WinHEC. I didn't see anyone else with an Air, but there were a number of MacBook and MacBook Pro machines, not only in the press room where one expects to see Macs, but also in the highly technical sessions about Windows. I even asked one engineer, who told me he was running Vista on his MacBook Pro. My son Alex does that, too: he uses Boot Camp and boots up in either Mac OS or Windows on his 17" Mac which is both his laptop and his desktop. Both operating systems work perfectly, and I have many reports that a MacBook Pro is one of the best laptops you can find for running Vista.

I'll know more about that in a week: I have a new 15" MacBook Pro. I plan to set her up next week, with a view to letting her take over from the ThinkPad as my main portable system. Unlike Alex I don't intend to install Boot Camp: the plan is to boot in OS X, then use VMware to run Vista. That way I should be able to run Office 2008 on the Mac and Office 2007 on the virtual Vista machine. I may go farther. I have found a serious bug in Outlook 2007's contact management. The fix is to export outlook.pst to a machine running Outlook 2003, do the contact directory manipulation there, then import that back to Bette, my 64-bit Vista Core 2 Quad 6600 that functions as the main communications machine here. You cannot have Office 2003 and Office 2007 on the same machine; but virtual machines aren't aware of each other, and I see no reason why I can't have two virtual Windows machines sharing data files. It's not critical because I don't need to do the bugged operation very often, but it will be an interesting experiment.

In any event, I took the MacBook Air to WinHEC. She'd come down to PDC the week before. When I opened her in the WinHEC press room, I found I was instantly logged on to the wireless press network: they had kept the same user name and password from the week before.

I had some problems with Mail and .Mac, but since this was before the update to MobileMe I don't expect it to happen again. Otherwise Khaos the MacBook Air performed splendidly, confirming my impression that this is the right machine to carry to conferences. With the Wireless turned off it will run Word all day — or did for me. It's cool, meaning that I can hold it on my lap without cooking anything. The screen lighting adjusts to the room lighting, as does the keyboard backlighting. Any problems Khaos has with MobileMe and the cloud connections would be shared by a MacBook or MacBook Pro. If I had only one Mac portable I suppose I'd choose the Pro; but I would really miss the MacBook Air. It's a limited but truly professional machine for people who have to get some work done on the go. In future I'll take the MacBook Pro on trips, but leave it in the hotel room while I carry Khaos to meetings and conferences.

Pocket Computers

There was a recent Wall Street Journal article about people who no longer carry laptops at all: they use their telephones for everything. Clearly that isn't going to work for writers, although my daughter Dr. Jennifer Pournelle found the Compaq iPaq with a folding detachable keyboard more than enough when she was in Iraq. If the iPhone had such a gadget I might consider that; but I don't think anyone has developed such a system yet.

I am sure something like that is coming. I have had a telephone that functioned very well as a pocket computer for a couple of years now. It had both touch screen and a usable keyboard, and I'd have been glad to carry it, except that it was a lousy telephone. The iPhone, on the other hand, is a good phone, and works very well as a pocket computer for most purposes. I have learned to "type" well enough for short messages and even notes, and there's an application called YouNote that makes taking and keeping notes much easier. The map function works well, and I find myself using the iPhone for quick excursions to the Internet when I need to look something up. My iPhone is the old one with standard AT&T telephone connection, so I don't use it much for web browsing when the only connection is through that, but it Googles quite well if connected to any Wi-Fi network. I am continually surprised by how many such networks there are.

What the iPhone doesn't have is a decent keyboard, and apparently Apple doesn't plan to add one. I wish they would. There are times when I am tempted to get out my old iPaq and its folding keyboard. The camera you have with you is the one you take pictures with, and for a writer, the computer and keyboard you have with you may be the one you'll use to get your page or two done for the day.

I'm pretty sure that at some point everyone will have a pocket computer much as I described in The Mote in God's Eye back in 1974. It will be both computer and telephone. The only thing in question is the form factor: will it look more like a phone or more like a computer? If it will look more like a computer, then men's fashions will change and we'll all be carrying a handbag. That happened back a few decades ago, but the trend died out before it was universal.

Whether we're carrying bags or belt phones I'm sure we'll all have pocket computers; and that's going to have some profound impacts on education. It will still be worth learning the addition and multiplication tables, but a lot of what we learn by rote now will be irrelevant because you can find it in seconds with your pocket computer.

Given what's happening to my memory, it can't be too soon.

Rebooting the iPhone

When I first got the iPhone it was fun and I used it a lot; but as time went on, it simply would not connect to local Wi-Fi networks, and without that connection it wasn't very useful as a pocket computer.

There were other problems, and I began to tire of the iPhone. Someone suggested that I do a hard reboot. For reasons I don't recall I thought that a good idea, but I didn't do it. A few days later I was in the Apple Store in Fashion Square in Sherman Oaks. I didn't have an appointment, so it wasn't possible to talk to anyone about the iPhone; but as I was about to leave I thought about the advice I'd been given, and tried the hard boot.

To do that you hold down the front button — the only button on the iPhone that you generally use — and the little black button bar on the top of the iPhone. Hold both down as the machine shuts down. Ignore the "Slide to shut down" message and continue to hold both buttons until the iPhone cycles and the Apple logo appears on screen. Let go of both buttons and wait until the phone comes up.

In my case when the machine came back up there were two differences: it had more bars of telephone signal strength, and it immediately connected to the local network. Moreover, everything seemed to work not only properly, but better. More to the point, everything has worked better ever since.

I am told that the iPhone needs that hard reboot every couple of weeks. I have also been given a number of theories on why, including the obvious one that it gets filled with software goop and has bad garbage collection. I wouldn't know. What I can say is that it does no harm, doesn't take long, and in my experience makes the iPhone work much better: I get more bars, applications are crisper, and the Wi-Fi connects more strongly. I now do a hard reboot every week. You might want to try that.

Roland Dobbins adds:

The likely culprit on the iPhone is memory leakage; OSX on the Mac leaks like a sieve, although certain RAM-hungry programs such as Adobe LightRoom can be run occasionally purely as garbage collectors (they seem to do something in terms of memory allocation which ends up freeing a lot of RAM). This is probably what's happening on the iPhone; someone who has jailbroken his iPhone could probably confirm this using the various OSX tools available to monitor system status.

That, I suspect, explains why I have to reboot the iMac 20 every now and then; if I don't, it gets sluggish.

Windows 7

There were demonstrations of Windows 7 at both PDC and WinHEC, and some of the technically astute reporters were able to get Windows 7 running on their laptops. The version of Windows 7 they distributed at both conferences was a pre-beta, and I have decided I am not going to try it; so all my reports are either from watching others including the conference demonstrations, or from what others have reported.

It was emphasized throughout the conference that Windows 7 is incremental: it's not the radical change that you got going from XP to Vista.

That said, I have seen more than enough to conclude that Windows 7 is the Windows you have been waiting for. It has all the good features of Vista. It's cool and sophisticated. It's also leaner and meaner and faster.

One of the reasons it's leaner is that they've left some of Vista out. I am told that removed features, like video editing, will be on the Windows 7 distribution disk, but as applications, not as part of the operating system. That sounds like a good idea to me. I don't much care for stuffing an operating system with large features you don't use too often.

Microsoft also made it very clear that any application that runs on Vista should run just fine on Windows 7: application developers will not be wasting their time getting their programs running on Vista. They said that in nearly every session. It was also made clear that Windows Server 2008 R2 (coming next year) has many of the features we expect in Windows 7, and many in the press began calling it "Windows 7 Server," although I am not sure I heard any of the Microsoft troops do so.

After seeing WinHEC and Windows 7, I see no reason to change my previous conclusions regarding Vista:

  1. There is no good reason to "upgrade" an existing machine from Windows XP to Vista. It ain't broke, so don't fix it.
  2. If you need a new machine, there's no reason to avoid Vista if it comes with the system. Vista had many problems, but they've pretty well fixed them. Vista works well, it's pretty, and you'll like its features once you get used to it.
  3. Memory and disk storage are cheap and prices are falling. If your XP system doesn't have 2 GB of memory and half a terabyte of disk storage, it won't cost as much to fix that as it would to "upgrade" to Vista.

Memory Sticks

The camera you have with you is the camera you'll take pictures with. I carry the Sony SuperSteadyShot DSC-T100. It fits in my shirt pocket, it's light, and it's easy to have with me.

The only problem is that not all USB memory card reader boxes support Memory Stick Pro Duo, which is what the camera uses; and for some reason Sony did not include a standard USB port on the camera. There's an output port, but it's part of a cradle mechanism that's tied to the Sony photo filing and handling system, and while I suppose there's nothing wrong with that, I don't use it.

This hasn't previously been a problem because I have a Kingston Media Reader, and it's not a big problem to take the memory card out of the camera and use the reader — at least it's not a big problem until I put the Kingston Media Reader somewhere and forget where I put it. Of course I tried other media reader boxes, and none of them would handle Memory Stick Pro Duo.

I have been mislaying things for years, but lately it's worse. Mostly I lose inexpensive things, and the solution seems to be that when something goes missing, I buy another. Eventually a sort of saturation point is reached, and after that I can always find a red marking pen, or whatever the missing object was. I realize this isn't an optimum solution.

In any event I figured it was time to get another media reader that understood Sony Memory Stick Pro Duo cards, so I Googled that. The first recommendation was Buy.com, and its first item was the Kingston Media Reader. I knew that worked, so I ordered one. The next day, of course, I found mine in the briefcase I carried to the DC/X reunion in August. Now I have two, so I put one back in the travel case.

Interestingly, I took a lot of pictures at the DC/X reunion, and I've taken a lot more since. I hadn't peeled any off this camera for one reason or another, so there were nearly a thousand. That probably doesn't impress most of you, but I grew up with box cameras that had 12 shots on a roll of film, and I'm gobswoggled.

Winding Down

It was a full month: we didn't get to the movies, and I had almost no time for games. Of course the game I would be playing if I didn't have so much work to do is Fallout 3; some of my friends made the mistake of installing it and haven't been seen for weeks. That will undoubtedly be the game of the year. So far I have been able to resist, but I don't know how long I can hold out.

Book Reviews

The election is over, but the financial crisis remains. At the moment things are not as bad as they were during the Great Depression, but there's no guarantee that will continue: historically, employers have been among the last to jump overboard when the economic ship is going down. Most employers, particularly small business owners, really hate to lay people off. However, as the market collapses and people stop buying things, they eventually have no choice but partial liquidation of the company. (Note that government is almost never faced with such alternatives: government doesn't have to convince customers they ought to spend to buy its services.)

There will be many efforts by the new Administration to fix the economic problems; indeed, given that the election was relatively close, it's pretty clear that economic issues were a, and possibly the, key issue. In trying to fix economic problems it helps to know what has been tried.

The Forgotten Man: A New History of the Great Depression, by Amity Shlaes, a columnist for the Financial Times, does just that. Ms. Shlaes — actually she seems to prefer Miss Shlaes — has written a detailed and accurate history, much of it compiled from news stories of the time. Her style reminds me a lot of Frederick Lewis Allen's Only Yesterday. She isn't neutral on political questions, but she's neither heavily partisan nor vindictive.

Readers will probably be surprised at how little coordination there was among the Roosevelt policies, and at just what Hoover did and didn't do during those critical years after the 1929 crash but before the 1932 election. Roosevelt had a Brain Trust of advisors on economic and labor policies; they didn't agree with each other. Sometimes one, sometimes another would get the President's ear, and there would be policy changes. Roosevelt considered himself a pragmatist willing to try anything that worked; but sometimes a policy had no chance to work because it was replaced before it could have any effect. This thrashing about did nothing to speed recovery.

I would strongly recommend that anyone trying to come up with a way out of the present financial situation read this book. It won't take very long. My copy is on my Kindle; there is also a hardbound edition.

The End of Prosperity: How Higher Taxes Will Doom the Economy — If We Let It Happen presents the case for supply side economic policies. The principal author is Arthur Laffer, of Laffer Curve fame. Laffer has always claimed to be non-partisan, and perhaps this is literally true; he certainly favored John Kennedy's tax policies. Of course he is best known as a Reagan advisor.

Laffer is passionate in his economic views. He writes clearly and marshals a great deal of evidence, much of it surprising because forgotten. Whether one accepts or rejects supply side economics, John Kennedy and Ronald Reagan both applied many of the same principles, and presided over favorable economic trends; it's worth understanding the theory before rejecting it. The book carries this statement by Robert Reich, one candidate for Secretary of the Treasury: "Frankly, I think supply-side economics is snake oil. But you should know how three of its smartest proponents try to defend it in this influential and important book." Despite some reservations I don't think supply-side is snake oil; otherwise Reich is entirely correct.

In a previous column I recommended Savage's The Foundations of Statistics as difficult but the best guide to understanding the limits of statistical inference. Several readers protested that Savage was too difficult and said I should instead recommend David S. Moore's Statistics: Concepts and Controversies. That book turned out to be rather difficult to obtain: it's in print, but it commands textbook rather than commercial prices. Eventually I found a used 1979 copy that had been surplused by a community college library.

I agree that this is a very good textbook for those who have to learn about statistics and statistical inference. Alas, it has the same problems as most such books: there is insufficient discussion of the hidden assumptions one makes when using statistics for prediction. There are two aspects to statistics, one mechanical and the other more abstract. The mechanics of calculating means and standard deviations, and of determining the probability that a certain event would happen by chance, are complex, although made much easier by modern computers. This is what most students learn, and what most scientists believe is the heart of statistics.

The abstract aspect is less studied, but can be far more important: it's the problem of choosing a model, and determining how much confidence one should have in one's inferences from that model. Note that this is quite different from what most statisticians mean by "confidence limits". Those are mechanically calculated from the model itself. They say little about whether the model is appropriate.
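The distinction is easy to show concretely. Here is a small Python sketch, using made-up sample numbers of my own: the mean, standard deviation, and a 95% confidence interval all fall out mechanically, but the interval is computed from the model, and says nothing about whether that model (independent draws from one stable, roughly normal population) describes how the data actually arose.

```python
import math

# Hypothetical sample data; the mechanics below are the same for any numbers.
sample = [12.1, 9.8, 11.4, 10.2, 13.0, 9.5, 10.9, 11.7]

n = len(sample)
mean = sum(sample) / n
# Sample standard deviation (n - 1 in the denominator)
std = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))

# A mechanical 95% confidence interval for the mean, using the normal
# approximation (z = 1.96). This number falls straight out of the model;
# it is silent on whether the model fits the way the data were gathered.
half_width = 1.96 * std / math.sqrt(n)
interval = (mean - half_width, mean + half_width)

print(f"mean = {mean:.2f}, std = {std:.2f}")
print(f"95% CI = ({interval[0]:.2f}, {interval[1]:.2f})")
```

If the eight numbers were, say, eight measurements of the same machine on eight different days while the machine was drifting, the arithmetic would be identical and the interval would be just as precise looking, and just as meaningless.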

I've discussed this at length in other columns and reviews and I won't repeat all that here. Moore's work is a well written standard textbook. It's very good, but I wouldn't think it worth the high prices it commands as a textbook. In fairness I have to say that I find few textbooks worth what students are forced to pay for them. High textbook prices are usually justified by pleading that the market is small, and authors and publishers deserve compensation. While that may be true in specialized subjects, statistics is taught nearly everywhere. If you have to buy a textbook, you have no choice; but if you're trying to learn the subject on your own, I'd recommend the O'Reilly book reviewed last month.

As the economy deteriorates many computer users will have to find new sources of income. Some will try freelance writing and programming. Those who do will need Intellectual Property and Open Source: A Practical Guide to Protecting Code, by Van Lindberg (O'Reilly). It won't be easy reading, but it's thorough.

Another specialized but important book is A Digital Photographer's Guide to Model Releases by Dan Heller (Wiley). The law on privacy, copyright, and trademark is complicated, and it's very easy for a photographer to get into trouble. For example, Hearst Castle is trademarked, and when you buy a ticket you actually accept a contract not to take photographs for commercial purposes. Heller found this out when a magazine offered him a fairly large fee to photograph certain Hearst Castle features; the offer seemed too good to be true, and it turned out that it was.

If you take photographs to sell, you need this book. It's well written and exhaustive.

Augmented Reality: A Practical Guide, by Stephen Cawood and Mark Fiala (Pragmatic Bookshelf, distributed by O'Reilly), is a brief and mostly technical introduction to modern techniques for creating augmented reality. Technically, Augmented Reality blends real-world objects with computer-generated objects in such a way that you can't tell which is which. The illusions can be done badly or well. Done well, the effect can be startling.

This book gets technical fast; it helps to have at least some familiarity with C++ and C#, although the examples are fairly complete and there are pointers to a full SDK. I'd take the book's claim that "all you need is a computer, printer, and a webcam" with a bit of salt, but it's true enough that given determination you can manage to produce some remarkable effects.

As the paperback book market collapses and eBooks become more important, the ability to augment an electronic book with effects not available in printed works will probably matter: it won't be enough just to think up and tell a story. Authors may need to do a good bit more, just as itinerant storytellers in the old days found it worthwhile to learn to play the lute or some other instrument. The ability to include Augmented Reality presentations may be important for future sales.

I am no Ubuntu expert. I'd like to be: Linux looks like fun, and many of my advisors have long since abandoned Windows and even Mac OS to go over to one or another Linux distribution. The distribution I hear recommended most is Ubuntu.

If you do get Ubuntu running, you will probably want Ubuntu Kung Fu, by Keir Thomas (Pragmatic Bookshelf, distributed by O'Reilly). It's a book of tricks and hacks, some to make Ubuntu more usable, others just for fun. Many of the tricks appear to be well known among Linux users. Others are less well known; and of course if you're just getting started and are not part of an active and informed users group, you may not know any of the arcana and lore. As an example, there's a way to have a fish named Wanda swim across your screen. More practical is the technique for creating an encrypted file store accessible from any operating system.
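I won't reproduce the book's recipe here, but the general idea of a portable encrypted store is easy to illustrate. This sketch uses the openssl command-line tool, which is available on Linux, the Mac, and (with a little effort) Windows; it bundles a directory into an archive, encrypts it with a passphrase, and decrypts it again to prove the round trip works. The file names and passphrase are my own invention for the example, and a real store should never hard-code its passphrase.

```shell
#!/bin/sh
# Build a tiny "file store": a directory packed into a tar archive.
mkdir -p store
echo "column notes" > store/notes.txt
tar cf store.tar store

# Encrypt the archive with symmetric AES-256. The passphrase is inline
# here only so the example runs unattended; in practice openssl will
# prompt for it if you omit the -pass option.
openssl enc -aes-256-cbc -salt -pass pass:example \
    -in store.tar -out store.tar.enc

# Decrypt it again and verify the round trip byte for byte.
openssl enc -d -aes-256-cbc -pass pass:example \
    -in store.tar.enc -out roundtrip.tar
cmp store.tar roundtrip.tar && echo "round trip OK"
```

The encrypted store.tar.enc is an ordinary file, so it can ride on a USB stick or a Memory Stick and be opened on any machine that has openssl installed.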

There's a lot more about Ubuntu, and if/when I get time I want to learn it. This is another book I'm keeping.


Some notes from my daybook; they expand on my report of difficulties with MobileMe.

This was not the first problem I have had with MobileMe, and I have had a number of reports of problems from readers as well. Apple hopes to use the cloud to synchronize several Macs, a noble goal, perhaps, though with gigabytes of stuff it may be harder than you think. In my case, I turned on Khaos, and she whammed me with requests to log on and synchronize; but then Apple rejected my password. That sent me looking up just what I used as a password.

For those who don't know, the only real memories I have of events from last January to last August are those recorded in log books. If it wasn't logged or in my daybook, I probably am not aware of it, and I may or may not recall it if reminded. During that time they used 50,000 rad to burn out a growth in my head. That worked, and the tumor is gone, but so are many of the memories. I have a vague recollection of some very pleasant people at Kaiser Sunset and the X-ray treatment center, and of using Khaos to write while waiting in their waiting room, but not a lot.

Since both iMogene the iMac 20 and Khaos the MacBook Air were set up in that time period, I wasn't at all sure I knew what password I had used, but I was sure I had written them in my log, and sure enough, there were both the system passwords and the .Mac password.

This is just as well, because at one point I told the truth about the secret questions, and Peter Glaskowsky was able to access my account by guessing the answers. Much of the data was, after all, public. So I now lie; only in this case I have no idea what lies I told about where I was born and where I lived in my youth and suchlike. I do know the password and can log on; now I need to go change those questions. I'll have done that before I publish this in any case. The procedure is to go to me.com and fix it from there. Of course Khaos insists that it's part of .Mac and not MobileMe, but that doesn't seem to matter. Apparently some updates that were routine for the MacBook, MacBook Pro, and iMac — they all think they are part of MobileMe, as shown in Apple | System Preferences — have not yet taken effect on Khaos the MacBook Air. So it goes.