
Computing At Chaos Manor: August 21, 2006

The User's Column, August, 2006
Column 313, part 3
Jerry Pournelle jerryp@jerrypournelle.com
www.jerrypournelle.com
Copyright 2006 Jerry E. Pournelle, Ph.D.

Continued from last week.

[Photo] Dinner before the ceremony: Lifetime Achievement Award winner Anne McCaffrey and her son and sometimes collaborator Todd McCaffrey.

[Photo] Colonel Rick Searfoss, USAF Ret., astronaut, Chief Flight Engineer and Test Pilot for XCOR, with Dr. Laura Brodian-Freas, widow of Frank Kelly Freas, a previous winner of the Lifetime Achievement Award.

[Photo] My acceptance speech for the Lifetime Achievement Award at the 2006 Writers of the Future ceremonies. That's Niven watching off to the right; he's already given his speech.

[Photo] Niven and I pose for pictures. They called us the "brightest binary star in the science fiction galaxy." I can live with that.

I have been down at the beach house all this week. Entropy runs fast near the beach: we're on Mission Bay, with the bay in front of us and the Pacific Ocean two blocks away on the other side, to windward. We had some major maintenance done on the telephone system, and major replacements in the bathroom. We also had Time Warner Roadrunner cable modem service installed, and we connect through wireless using a D-Link router. I've used this setup for a week with absolutely no problems. It's fast, it has been reliable, and it was a breeze to set up. For the gory details, see last week's column.

Writers of the Future 2006

This year's WOTF awards were held in the San Diego Balboa Park Air and Space Museum, which was a marvelous place for them. This year the officials chose to present two Lifetime Achievement Awards, to Larry Niven and Jerry Pournelle.

There will be far better photographs of the awards on the Writers of the Future web site. These were taken by Mrs. Pournelle using the Kodak 570 pocket camera.

Multiple Cores and the Future

Long-time readers will remember Rich Heimlich, who knows more about high end game sound systems than anyone. He's the author of The Official Sound Blaster Book series. Rich spent the week at the Microsoft Gamefest Conference. I won't try to summarize the whole conference, but there were a number of items of interest. Conference details haven't been posted yet but are promised: check here.

Most of the conference seemed to emphasize the XBOX 360. In particular, developers were told that all sound processing would be done by the CPU. The "sound card" is over. This isn't terribly surprising for XBOX development, but Rich began to wonder why there was almost nothing about PC games and game development. Microsoft isn't stupid enough to abandon that market.

Rich is a sound expert and spent most of his time in the audio conferences, so it took him a while to realize that they were saying the same things about video processing: special chips were no more. It will all be done in software. This is possible because the XBOX 360's CPU has three cores, each running two hardware threads - six threads in all. Proper threading can devote an entire core to video. Or two, if needed. And another to sound. And still leave computing power for the rest of the game.
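
To make that threading model concrete, here is a minimal sketch in C++ - mine, not Microsoft's, with hypothetical worker functions - in which one thread effectively owns audio, another owns video, and the main thread is left for game logic. On a multi-core machine the scheduler spreads these threads across cores; actual core pinning is platform-specific and omitted.

    #include <atomic>
    #include <chrono>
    #include <iostream>
    #include <thread>

    std::atomic<bool> running{true};

    // Stand-in for a software audio engine running on its own core.
    void audio_worker() {
        while (running) {
            // mix the next block of PCM samples here (hypothetical work)
            std::this_thread::sleep_for(std::chrono::milliseconds(10));
        }
    }

    // Stand-in for software video work running on another core.
    void video_worker() {
        while (running) {
            // render the next frame here (hypothetical work)
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
    }

    int main() {
        std::thread audio(audio_worker);  // in effect, a core for sound
        std::thread video(video_worker);  // and another for video
        // ... game logic would run here on the remaining cores ...
        std::this_thread::sleep_for(std::chrono::seconds(1));
        running = false;                  // signal the workers to stop
        audio.join();
        video.join();
        std::cout << "workers shut down cleanly\n";
    }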

Now all this makes sense for XBOX 360 games, but after a while it became obvious that Microsoft intended their PC games to work this way too. Microsoft support for video cards and sound cards is a thing of the past: in future, if you make those products, you are on your own.

It will not be long before your brand new high end video or sound card flat out will not work in Vista; not unless the card maker has done a lot of development work on its own. That explains AMD's purchase of ATI. The high end video card game is played out. For ATI and nVidia the future is mapped out nicely by what happened to Creative: a larger and larger share of a smaller and smaller market. Out in the real world, probably 65% of high end gamers have Creative sound cards, but only about 5% of the PCs in the world use them. ATI sold out while there was still something worth selling.

The future is coming fast: the 24-core system. As many cores as needed can be devoted to sound and video software, with plenty left over for everything else; and this future is not ten years away, but more like three or four.

In future every system will assume that you have a 5.1 sound system, and that it will run off CPU power. If you plug in headphones, the system will detect that and adjust itself accordingly. It will all be done in the processor, and the need for a "sound card" will vanish. At first this will eat the "just good enough" sound cards, but before long programmers will be able to do in software, using CPU power, everything that has been done in hardware - and when there's a need for a change, the only thing that changes is software. There will be no need to update your sound hardware except in the one place that really counts, the speakers and headphones; and those last forever.
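
For a sense of what "sound in software" means at the bottom, here is a toy C++ sketch of the basic operation a hardware mixer used to perform: summing two 16-bit PCM streams, with saturation so loud passages clip rather than wrap around. It illustrates the principle only; it is nobody's shipping code.

    #include <algorithm>
    #include <cstdint>
    #include <iostream>
    #include <vector>

    // Mix two 16-bit PCM streams into one, widening to 32 bits before
    // adding, then clamping so loud passages clip instead of wrapping.
    std::vector<std::int16_t> mix(const std::vector<std::int16_t>& a,
                                  const std::vector<std::int16_t>& b) {
        std::vector<std::int16_t> out(std::min(a.size(), b.size()));
        for (std::size_t i = 0; i < out.size(); ++i) {
            std::int32_t sum = std::int32_t(a[i]) + std::int32_t(b[i]);
            sum = std::max<std::int32_t>(-32768,
                  std::min<std::int32_t>(32767, sum));
            out[i] = std::int16_t(sum);
        }
        return out;
    }

    int main() {
        std::vector<std::int16_t> music{1000, -2000, 30000, 5};
        std::vector<std::int16_t> effects{500, -500, 10000, -5};
        for (std::int16_t s : mix(music, effects)) std::cout << s << ' ';
        std::cout << '\n';  // prints: 1500 -2500 32767 0
    }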

And - it will be that way in video as well. Today's high end gamer typically buys a $600 video card every eight or nine months. For less than that he can upgrade his PC to add more CPU cores and get better results. When the system can devote one or two or three CPU cores to video processing, and another to Outlook, and another to word processing, and have a dozen left for other assignments, the game for specialty hardware is over.

The whole industry benefits when it's all done in the processor. Well, all of the industry but companies like Creative, and ATI, and nVidia. ATI has seen the light and sold out while there was something left to sell. Whether nVidia sees the light isn't clear, but either way their future is likely to look like Creative's recent history. "They can become the most effective buggy whip makers in the world, and corner the buggy whip market," Heimlich says. But the future belongs to software.

That, at least, is what Microsoft sees coming; or that's what Rich Heimlich has deduced that Microsoft sees given their presentations in the game conference. He then consulted with sources who, after they saw where he was headed, told him they'd already said too much. For more details see Rich's blog at www.pcserenity.com.

It sure makes a plausible theory.

Bob Thompson says

It doesn't seem plausible to me. Intel is hiring graphics engineers, apparently with the intention of making their "extreme graphics" live up to the name, at least more than they do now. I don't disagree that standalone video will become more and more a niche market, but it's because integrated video will be Good Enough, not because video processing will shift to the general-purpose system processor. If anything, the trend may be in the opposite direction. I've been reading recently about physics coprocessors. I don't deny that a general-purpose system processor *can* do video, but it's much less efficient for the purpose than dedicated silicon. The Cell processors are a different story, at least if I understand it correctly. They're less general-purpose processors than morphable processors that can be reprogrammed to function as special-purpose processors.

I am not at all sure of this: my point is that integrated video is good enough, and it's easier to change software than hardware. Take away Microsoft hardware support for specialized chips - at least for new specialized chips - and you have to develop not just drivers, but possibly new sound engines for your new chips; all that for performance increases that few will notice. It depends on just how good "good enough" is when there's plenty of CPU power to do it all in software. It sure looks as if Microsoft is betting this way. Vista runs DirectX 9 (in emulation, actually), but that's the end of the line. Yet we're sure Microsoft won't abandon the PC games market.

Peter Glaskowsky tells me:

As a trend projection, this is all wrong. CPUs, no matter how many cores, will not soon replace GPUs in gaming systems.

Graphics processing is fundamentally different from general-purpose processing. It requires a different processor architecture and a different memory architecture.

There is a practical maximum to the size of a single chip in a PC-- around 200 mm^2 today for a CPU-type device, and slowly increasing. It's slightly larger for GPU-type devices because their circuitry is less dense and has more inherent redundancy.

For now, and into the foreseeable future, it will continue to be more efficient to dedicate one chip to general-purpose processing and another to graphics processing. This is better than two identical chips with some mix of CPUs and GPUs for several strong reasons.

Audio processing is a third distinct kind of processing, so the optimum audio engine will use a third type of processor (the DSP), but it doesn't take much of a DSP to handle all the audio processing anyone needs. Audio processing can be handled by a small part of the CPU or GPU, or by a small DSP tucked away inside some other chip, as in NVIDIA's chip sets.

Today's GPUs are perhaps 100 times faster than an adequate audio DSP, and GPUs for gaming won't stop evolving until they're at least a thousand times faster than they are today-- and I could make a good argument for saying "a million times" instead. Even at the unusually rapid pace of evolution in GPUs, that amounts to 15 to 30 years of progress. So the lessons of audio processing just don't apply to graphics processing for gaming.

Now, let me emphasize that last word a little more. GPUs have never been needed for systems without serious 3D rendering applications. Most systems sold today don't even have anything you'd call a GPU because Intel's integrated graphics processors aren't programmable and run very slowly.

So all we can say about this "trend" is that the status quo will continue; most systems will continue to not have GPUs. The fraction of GPU-less systems will increase for a while until process technology cuts the incremental cost of a simple (but real) GPU below two bucks, at which point Intel will probably stuff a GPU into everything they sell just so they can say they have one, even though most customers won't use it.

It all comes down to just how good "good enough" is. Peter is generally right about these things, and he certainly knows more about chips and chip designs than I do. Dedicated hardware is always faster than doing the same job in software; but this reminds me of the early days, when the big controversy was between dedicated word processors, such as the Wang, and general purpose systems such as the S-100 bus and CP/M. One of my early successful predictions was that the general purpose system was the right way to go. This may be a horse of a different color.

We all know that dedicated video processors will always be faster than general purpose hardware. We also know that software is more versatile, and it's easier to change software than hardware. We also know that Microsoft seems to be telling designers that the rules are changing, and much of the support for specialized hardware will no longer come from Microsoft.

Peter Glaskowsky, who, as I said, certainly knows more about chip hardware than I ever have, thinks

"... this is not a subject that requires discussion. There's no need to "discuss" an opinion that graphics processors are going to disappear from our systems during the timeframe of Windows Vista. It's just wrong."

Put that bluntly, I'd have to agree. Certainly graphics processors are not going to vanish entirely. On the other hand, CPU power has increased beyond almost anyone's projections. The question is, how much market share will specialized video boards keep, as opposed to systems that do it all in software on multiple CPU cores? Few in the industry predicted the rapid encroachment of software on dedicated sound processing, or the fate of Creative over the last few years.

We'll just have to see. Captain Morse suggests that Intel is looking two years ahead, while Microsoft is looking ahead ten.

The Era of CPU Plenty

Martin Winston says

It takes additional horsepower to drive lossless audio - not enough to call for a second chip, but enough to call for a faster/deeper chip - but that's a factor for only about 5-10% of the user base with ears attuned enough to discern the difference. The place that will need significant additional audio DSP horsepower - and some radically different algorithms - is in (separate, not to be confused) voice and speech recognition.

Then, well beyond audio, multiple cores might begin to allow inroads in optical/visual recognition (both for the user space and for imported or received images), in visual pattern recognition (find all photos with the dog and the daughter but not the cat or the son), and in improved biometrics.

And, of course, you may want to save a core for calculating trajectories when hurling stones from the moon.

VOIP: Motorola has been demonstrating (still awaiting several necessary approvals before it will be available, but that may still happen in '06) a VERY interesting little box that includes the following: cable modem, WiFi router, wired Ethernet router, 4+ hour UPS & 2 VOIP lines for POTS (plain old telephone service). If you have 2 lines at home now, you don't need special phones - disconnect them from the phone company, plug them into the RJs on the back of this box & all the phones & extensions work on VOIP. And something makes this even more interesting: unlike the latency-plagued public IP network that most VOIP signals travel, there's a private IP network the cable TV systems use with close to zero latency - audio quality goes way up with none of that crude bounce-me-off-a-satellite tonality or echoing that are typical of most VOIP calls. I've been nagging Moto forever to get me one; by the time they can, I'll probably also be able to get one from Time Warner.

We've been speculating about CPU Plenty since WinHEC 2005, and it now appears that the only thing we got wrong was the timing: the era will arrive a great deal faster than we thought. Intel is already talking about 8-core chips. A bit of digging on the Internet will turn up PDF files with graphs of efficiency and computing power versus power consumption and heat generation for multiple chip versus multiple core systems. It's pretty clear to everyone that cooling a CPU is generally more efficient than trying to cool video cards. And so forth.

Everything points to a future of CPU plenty, which means less and less need for specialty chips. Those were required when using the CPU for these jobs slowed the whole system to a near halt. The prime example here is Microsoft Flight Simulator, which until recently did the whole job itself and thus delivered far fewer frames per second than its rivals. Flight Simulator now uses DirectX hardware acceleration. (This link mentions "Improved support for 3D graphics hardware acceleration in multiple windows and across multiple monitors.") Still, imagine the old style Flight Simulator on a 24-core system.

There's another implication of CPU Plenty: Apple is back in the game. Apple makes some of the coolest hardware around. It's expensive, but not greatly so compared to what high end gamers have been spending to stay on top and keep bragging rights. Apple systems will already run Vista at quite acceptable speeds. Now imagine a 24-core Apple system, able to run the highest end PC games on Vista while simultaneously running the Apple OS, looking after your mail, listening for Google alerts, managing your phone calls, doing VOIP (a whole core devoted to nothing but VOIP when that's in use), and all this on those cool Apple monitors. The Apple side of the system handles all your iPods and whatever comes after the iPod, including the video iPod. It's all fast, it runs everything, and it's way cool.

That's one view of the future, anyway.

We'll continue looking at the coming era of CPU plenty both here and in the mail bag.

Cheers for the IBM ThinkPad

I've been using the IBM (now Lenovo) T42p ThinkPad as my only computer all week. It has performed splendidly. Once in a while piggy old Outlook can slow it to a crawl for a few seconds, but that doesn't last.

[Photo] The setup at the beach house: IBM ThinkPad, ViewSonic screen, Microsoft sculpted wireless keyboard and mouse. Lisabetta the TabletPC gets a rest, but I took her to the writers' meeting at the Carlsbad Hilton, where she was her usual sensation. The Hilton's wireless connection worked well, so I could do real time on-line research during my presentation.

My setup here includes the 17" ViewSonic flat screen monitor we've mentioned before, and the Microsoft wireless keyboard and optical mouse. I'm also using a Belkin USB 2.0 port expander with its own power supply, to avoid loading the ThinkPad's power system. That supplies plenty of power for the adapter that charges my cell phone, as well as the wireless units for the keyboard and mouse. I've become quite fond of this setup, and I use it at home in the Monk's Cell when I'm doing fiction.

InBoxer

I had never installed InBoxer on Orlando, the IBM T42p ThinkPad, because in the past I have used Lisabetta, my TabletPC, for downloading mail at the beach. Until this weekend we had been at the mercy of dialup, so Lisabetta being a bit slow didn't really matter: I was resigned to connecting up to download mail and going off to do something else while Outlook ate all my CPU cycles.

We now have Time Warner Roadrunner, and it works quite well; but I noticed that it took darned near as long to deal with mail with my fast PC and high speed Internet connection as it had with dialup and the TabletPC. The difference was spam. I get a lot of spam. I hadn't realized just how much I get, because InBoxer does such a good job of sorting it.

The bottom line is that I installed InBoxer on the ThinkPad, and all is well again. InBoxer is a Bayesian spam filter that learns: I already had a bunch of rejected mail as well as acceptable mail, and I let it train itself on that. It takes InBoxer a day or so to learn just what I want and don't want - a lot of press releases I want and need look like spam - but once it has learned, it does its job so quietly and efficiently that I don't even notice.
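
InBoxer's internals aren't published, but Bayesian filtering itself is well understood: train on mail you've already marked, then score new messages by summed per-word log odds. Here is a toy C++ sketch of the idea; the tiny corpus and the smoothing choices are mine, for illustration only, and a product like InBoxer is far more sophisticated.

    #include <cmath>
    #include <iostream>
    #include <map>
    #include <sstream>
    #include <string>
    #include <vector>

    struct BayesFilter {
        std::map<std::string, int> spam_count, ham_count;  // per-word tallies
        int spam_words = 0, ham_words = 0;                 // corpus sizes
        int spam_msgs = 0, ham_msgs = 0;

        // Crude whitespace tokenizer; a real filter does much more.
        static std::vector<std::string> tokens(const std::string& text) {
            std::istringstream in(text);
            std::vector<std::string> out;
            for (std::string w; in >> w;) out.push_back(w);
            return out;
        }

        void train(const std::string& text, bool is_spam) {
            (is_spam ? spam_msgs : ham_msgs)++;
            for (const auto& w : tokens(text)) {
                if (is_spam) { spam_count[w]++; spam_words++; }
                else         { ham_count[w]++;  ham_words++;  }
            }
        }

        // Summed log odds that a message is spam; > 0 leans spam.
        double score(const std::string& text) const {
            // Prior from message counts, add-one smoothed.
            double logodds = std::log((spam_msgs + 1.0) / (ham_msgs + 1.0));
            // Rough vocabulary size, used for Laplace smoothing below.
            double v = double(spam_count.size() + ham_count.size()) + 1.0;
            for (const auto& w : tokens(text)) {
                double s = spam_count.count(w) ? spam_count.at(w) : 0;
                double h = ham_count.count(w) ? ham_count.at(w) : 0;
                logodds += std::log((s + 1.0) / (spam_words + v))
                         - std::log((h + 1.0) / (ham_words + v));
            }
            return logodds;
        }
    };

    int main() {
        BayesFilter f;
        f.train("cheap pills buy now", true);                  // marked Reject
        f.train("press release new product launch", false);    // marked Keep
        std::cout << f.score("buy cheap pills") << "\n";       // positive
        std::cout << f.score("product launch details") << "\n"; // negative
    }

This is also why the first day or two is noisy: until the word tallies reflect your own mail, press releases and newsletters score much like spam.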

My first day, InBoxer sorted over a hundred messages into the "Reject" bin and some twenty into the "Review" box. Of the latter I found five or so that I marked "Keep" and sent the rest to Reject. The next day it was ten, of which I kept two. Today is about the same. Every now and then I do a random survey of the InBoxer Reject folder, but I have never found that it rejected something I wanted to keep. Meanwhile, Microsoft's junk mail filter rejects a pile of stuff, but I have to go through each and every message it flags. It invariably junks half a dozen wanted mails, some of them in no way resembling junk, and I have to pass and whitelist them manually.

I have used a lot of spam filters, but InBoxer is the best of the lot: easy to install, easy to use, and very efficient. Highly recommended.

Bump Keys and Locks

This was originally prepared as part of the Defcon report, but got displaced by other material. It covers a subject of some importance.

Dan Spisak reports:

One important part of Defcon was the "bump keying" talk given by Matt Fiddler and Marc Weber Tobias. You can find information on their talk and research at their site, http://www.security.org/. Bump keying is a physical security issue for the majority of tumbler locks in use in the US (Europe has a head start on addressing this).

The short version: you take a key blank for a lock and have it ground down to a "999" style key (the digits refer to how deep each cut in the key is, with 9 being the deepest). Using this kind of key and a mallet, it is possible to apply torque to the key while rapping its end with the mallet, causing the tumbler lock to turn and unlock.

This technique, done properly, can leave no evidence of breaking and entering inside the lock. No signs of forced entry can mean trouble getting insurance companies to cover theft losses. The DEFCON presentation showed how you can bump-key into a PO box at USPS offices and into UPS/Mail Boxes Etc. boxes with ease. They alerted the USPS to the issue; the Postmaster General agreed and is planning to replace *all* locks in use to address this concern. They also alerted UPS/MBE, whose security people said "This is important," but whose PR people then said "We know our customers" and decided they will not be changing their locks.

Other issues addressed in the talk were how a set of 999 style bump keys can legally be sold, bought, and transported via USPS under the current U.S. Code provisions on transport of lockpicking tools, since keys are specifically excluded in the current acts. Right now, I can go online and buy a set of bump keys that will unlock about 90-95% of all tumbler style locks for $45 (see this page on the Hawley site). For a formal treatise on how bump keying works, check out the linked PDF file below. In the meantime, if you want a lock that is resistant to this technique, you need to look at the Medeco M3 locks or Assa. It comes down to this: spend money on better locks and learn about this threat, or be vulnerable.

PDF of talk here:
http://www.security.org/bumping_040206.pdf

I would add that most of these techniques have been known to government agencies and a certain segment of the criminal community for quite a while; but the computer revolution made it inevitable that the information would become widespread. Welcome to the Information Highway. Don't be road kill.