
Computing At Chaos Manor

Mailbag for October 9, 2006
Jerry Pournelle jerryp@jerrypournelle.com
www.jerrypournelle.com
Copyright 2006 Jerry E. Pournelle, Ph.D.

October 9, 2006

Some observers have noted signs that Microsoft is preparing for on-board graphics; see this column for August, Part 3. One speculation was that the GPU chip would be replaced by software running on a dedicated CPU core. That speculation was dismissed by most of the Chaos Manor associates.

Continuing the discussion:

Developer Remedy shows off quad-core gaming goodness:

http://arstechnica.com/news.ars/post/20060929-7868.html

Note the middle paragraph where it mentions one core dedicated entirely to sound. One is also dedicated to FEEDING the GPU. So, how many more before the GPU isn't needed?

We're getting there.

Rich Heimlich

Peter Glaskowsky nails his point home:

We are NOT "getting there." Running that demo as a software simulation simply wouldn't be possible. No known algorithms would allow some of the critical portions of this work to be spread across a large number of processors. Even if algorithms could be found, the communications overhead would keep the simulation from running in real time.

I was at this keynote and I was very impressed with the demo, but GPUs are not just fast CPUs. They are fundamentally different, and there will be a need for both kinds of devices for decades to come, at least.

. png

I was already convinced, but I did say "GPU is certainly a lot more efficient; but when is a dedicated CPU core good enough?"

Eric answers:

Not any year soon, I'd think. There is a vast difference between keeping the GPU fed and doing the GPU's job. Meanwhile, the GPUs are getting more capable and taking on roles formerly reserved for CPUs. For instance:

http://www.tgdaily.com/2006/09/29/folding_at_home_to_use_gpus/

Now, this is using high-end ATI cards, but those will be pretty cheap in a couple of years. Will $100 worth of x86 cores be comparable in that time frame for performing graphics operations? I doubt it. The GPU will remain better at tasks that fall within its narrow range, and the CPU will remain supremely versatile.

The GPU is only going away if your needs are modest to the point of self-denial and you've nothing better to do with extra cores in the CPU. I really hope the industry comes up with better uses for those cores than that. There is a continuing fallacy that there is some magical border beyond which graphics need never be any better. Aesthetic standards grow in their demands as the technology and costs allow. Just because your many-cored CPU can use a slice of itself to provide what a $300 video board delivered a decade ago, that doesn't make it a worthy application just to save the cost of what is now $5 worth of video silicon.

Eric Pobirs

Which convinces me entirely: the GPU will be with us for a long time. There remains another question: Are really spiffy on-board video processors the wave of the future? Will most systems come with "good enough" on-board graphics and thus cut the need for high-end video cards? It would certainly make life easier for game designers. I asked Rich Heimlich.

I'm really wondering now..... I've seen a few more people start to ask the same questions so we'll see. Should be a very interesting couple of years.

Rich Heimlich

Which may not settle the question, but it does give us something to think about.


Last week I wrote:

Possibly one reason Vista blew up on Satine was that I did not have a Vista audio driver. I kept getting messages to that effect.

Brian Bilbrey thought about the implications of that and said:

My thoughts:

If I were installing the current Vista build, leaving aside debates over RC vs. Beta vs. Alpha, I would NOT update against a currently installed XP. You can have Vista trying to make use of drivers for which no Vista provision was made. You could have toolbars, antivirus and antispyware that try to run in the new Vista security context, only they weren't designed for that. Testing this Vista-whatever build, think clean slate and only clean slate.

Before starting, I would go to the sites for the Motherboard and Video Card at least, and look for drivers that explicitly support the build of Vista that you're experimenting with. Get those, unpack and put on CD, or whatever the site instructs, by way of preparation.

Pull out the XP disk from the system.

Put in a clean (or wipeable) disk. Remember, clean slate.

If you've got an Asus motherboard, you can go into the BIOS and disable all the nvidia special RAID crap. The drivers for that aren't going to be stable, I bet. I don't have YOUR bios in front of me, but the last two Asus/nVidia/AMD motherboards I configured allowed me to have the motherboard either use nVidia RAID, or use the disks in some "legacy" mode, if I recall correctly. Do that.

Install Vista, allow it to own the disk.

That's my best advice, that's what I'd do.

I wouldn't expect a Vista overlay on XP to work properly until actual GOLD release, if then. Possibly not until SP1, when they've updated against enough real-world customer systems and gotten the feedback about what broke.

If this doesn't work again... well, for next week, you can put Vista aside and give Linux its 4 weeks in the sun.

best,

.brian

Install fresh: good advice. Leo Laporte told me the same thing. Of course this is Chaos Manor, where we do lots of silly things so you don't have to, but I think that question is settled: don't try upgrading XP to Vista. If you get the new Vista RC 2, install it on a clean disk.

And it is about time to give Linux a chance.


Continuing the discussion on what we will do with all the new computing power:

You have mentioned a few times recently what use the general user may have for all the power that is now becoming available in the PC world.

I work as a software developer for a stock-broking firm, and am using Visual Studio 2005 and .NET. I find that my build & run cycle is still far from instantaneous; it's far better than the bad old days, but I can still wait a good 90 seconds to do a full rebuild of our main application plus another 30 seconds or so to launch it in the debugger. An incremental build is perhaps only 5 seconds or so in general, but that is still a noticeable wait to launch the application for debugging. So I could really still use all the extra power I can find. (My workstation is a fairly recent dual-core Pentium.)

In my private life I am a keen amateur photographer; and that means using one of the modern digital SLR cameras, shooting in RAW mode and doing post-processing on the PC. A lot of the raw conversion and image processing software can be batch driven. Some of the software I use can take a good 30-45 seconds per image to process. A batch of 100 can take more than an hour. So I could really use some extra power there too.

After conversion, an 8 megapixel RAW file (and most cameras are now weighing in at 10MP - my next upgrade will be to a 12.8MP camera) generates a 24 megabyte TIFF file. Processing that in Photoshop with a few layers, some interpolation, and so on, one can easily end up with a 250MB file, topping out at 750MB of RAM used for the history buffer.

So for Photoshop work I shall want my next PC to have at least 2GB of RAM, preferably more, with all of it available to the applications rather than capped by XP's 2GB per-process limit.

Amazing as it seems I still spend an awful lot of my time waiting for my PC to finish what it's doing.

Thanks for keeping up what is still the most interesting site on the web.

Regards, Craig Arnold

Thanks for the kind words. And on the same subject:

I was meaning to write to you about two things I really want my computer to do but cannot do yet.

1. I use some big programs, for instance Logos Bible software, which currently contains about 6.5 GB of books (I add more from time to time). Complex searches take some time; even the less complex ones can take 2-3 minutes (3.0 GHz Pentium D, 2GB DDR, 150GB hard drive). I am waiting for a 64-bit computer where I can load the whole application and library into memory. I think I should have 16GB to 32GB of memory to load multiple apps in memory. Wasn't that the promise of 64-bit? What do you think, 2008, 2009?

2. A flash drive for speed and security. I know they are already working on this, but I want more functionality. I want to be able to load my whole operating system into a flash ROM, which would be added to all PCs.

But at the end of the ROM code I want some kind of pointer to a single folder on my hard drive, so that any additional updates would go into that folder and be loaded from the hard drive. So if, for instance, MS needed to give me a patch or update, it would go into the folder; I would reboot my computer, the flash would load, and then the folder on the hard drive would load. That way I could make sure the patches worked, and if not, take them out of the folder.

All rootkits and viruses that manipulate my OS would have to go into the folder instead of being flashed into the ROM. I could eliminate them all, or check to see whether those files should be there (in the folder). Of course this would slow things down by loading things from the hard drive, but it would be well worth it.

Plus I would want the ability to add the files from the folder into my flash (if I was really sure), and also to dump the contents of my flash to a different location on my hard drive (as a backup, before I add the files in that folder to my flash permanently). I think this would eliminate rootkits, viruses, bad patches, and junk you don't want.

Just two things I think I would like.

Michael Scoggins

I already have Migo and various Kingston flash drive programs that allow me to have my own tools and files on a computer I am visiting, and leave no footprints when I leave. I can carry a great deal of my work in my pocket.

When 64-bit systems are common, and memory is cheaper - both inevitable - the need to go to spinning metal during searches will be just about nil. I expect that disk drives will be used for non-volatile storage, but when we start up a system, the files, indices, application programs, and everything else we use will be loaded into memory, and stay there until it's time to log off; at which time all the changes will be written back to disk. As to when that will happen, it will be sooner than you think. Silicon really is cheaper than iron...
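
As for Michael's second wish: what he describes is essentially an overlay: a read-only base image in flash, with an ordered folder of patches layered on top at boot, each of which can be inspected, removed, or eventually committed back to the flash. Here is a toy sketch of that layering in Haskell; the file names, contents, and the applyPatches function are invented purely for illustration, and no real boot loader works quite this way.

    import qualified Data.Map as Map

    -- Model the OS as a mapping from file paths to contents.
    type Image = Map.Map FilePath String

    -- The read-only base image, as it would sit in flash ROM.
    baseImage :: Image
    baseImage = Map.fromList
      [ ("/windows/kernel",  "kernel, v1")
      , ("/windows/net.sys", "network driver, v1") ]

    -- Patches live in an ordered folder on the hard drive.
    -- Applying them is a left fold; Map.union is left-biased,
    -- so each patch overrides the files it replaces.
    applyPatches :: Image -> [Image] -> Image
    applyPatches = foldl (\img patch -> Map.union patch img)

    main :: IO ()
    main = do
      let patch1 = Map.fromList [("/windows/net.sys", "network driver, v2")]
      -- Boot with the patch applied...
      print (applyPatches baseImage [patch1])
      -- ...or leave the suspect patch out of the folder, and the
      -- base image boots exactly as it came from flash.
      print (applyPatches baseImage [])

The essential property is that the base image is never modified in place; anything suspect lives in the patch folder and can simply be dropped, which is most of what makes the scheme resistant to rootkits and bad patches.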


Subject: Many-core computers and new programming techniques

Hello Dr. Pournelle,

I was struck by Peter Glaskowsky's comments (in your October 2 mail column on Chaos Manor reviews) on the usefulness of computers with more than two CPU cores. This is a topic that's been on my mind recently as well, but from a different perspective. I've been studying the Haskell programming language, which, because of various aspects of language design, is trivial to parallelize. One user recently got the parallel version of the language running on a 32-processor Sun server. Because parallelizability comes pretty much for free as a consequence of the language design, any Haskell program could use as many processors as are on the system, provided the particular computation being done can be parallelized.

Now, Haskell in particular may or may not be the wave of the future -- it's still largely a research language, although there was a job posting recently for Haskell programmers to write security software -- but programming in a language that lends itself to parallel processing is one way to take advantage of many cores for more than just specialized media software.

(Haskell actually wasn't designed from the ground up as a parallel language; it was designed as a pure functional language, where code had no side effects. It's just that if you have a guarantee that a function isn't going to do anything besides what its type signature says it does, it becomes very straightforward to parallelize. It also makes functional programs easier to understand and eliminates a whole class of bugs.)

-- Grady Lemoine

I am not familiar with Haskell. Niklaus Wirth's Modula-2 and his later Oberon programming languages were designed with concurrent processing in mind. I have always regretted that Wirth's languages did not win out over C and its derivatives; I think we would have far fewer bugs and security problems if we had highly structured languages with strong type and range checking. Wirth's languages produced programs that were harder to compile because the compiler caught errors - but once compiled they tended to do what you had intended for them to do. Unlike C, which in its pure form will compile nonsense. Ah, well.
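
For readers who want a concrete picture of what Grady describes, here is a minimal sketch of parallel Haskell using the par and pseq combinators from the Control.Parallel module (shipped with GHC's parallel library). The naive Fibonacci function is only a stand-in for real work, and the threshold of 25 is arbitrary; because pfib has no side effects, sparking one branch onto another core cannot change the answer, which is the guarantee Grady refers to.

    import Control.Parallel (par, pseq)

    -- Naive Fibonacci, used here only as a stand-in for real work.
    fib :: Int -> Integer
    fib n
      | n < 2     = fromIntegral n
      | otherwise = fib (n - 1) + fib (n - 2)

    -- Parallel version: par sparks the left branch for another core
    -- while pseq forces the right branch on this one. Below the
    -- (arbitrary) threshold the overhead of sparking isn't worth it.
    pfib :: Int -> Integer
    pfib n
      | n < 25    = fib n
      | otherwise = x `par` (y `pseq` (x + y))
      where
        x = pfib (n - 1)
        y = pfib (n - 2)

    main :: IO ()
    main = print (pfib 35)

Compiled with ghc -threaded and run with +RTS -N2 (for two cores), the runtime spreads the sparks across however many cores it is given; the program itself never names a core count.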


Subject: Grisoft

In this week's Review, you bemoaned AVG Antivirus' interface and overhead. The TweakGuides Tweaking Companion from www.TweakGuides.com, while mostly of interest to enthusiasts and gamers, is a very thorough treatment of Windows configuration; it recommends, and has a section on, AVG Antivirus. The author recommends not having any of AVG's background protection options enabled, and describes how to disable those. You may want to see if this configuration works better for you.

.mg

I will look into it at some point, but so far I have found Microsoft's OneCare more than adequate.


Subject: 64-bit drivers.

Jerry,

I recently assembled a new computer (AMD dual processor, ASUS motherboard, etc.) from newegg.com. It's running 64-bit XP Pro, and there's a definite scarcity of drivers there too. For example, my HP700 all-in-1 didn't have drivers on HP's web site and wouldn't work with what was in the OS, so I bought a new HP5610 all-in-one. HP at least had beta drivers for 64-bit, and these seem to work fine.

A more serious problem is that neither Norton nor McAfee had any personal level antivirus program for 64-bit. Norton did have one at the corporate level, for 3 or 4 times the price. I looked around and found some decent reviews for avast! antivirus software, so I bought their home edition ( www.avast.com ). It seems to work as well as Norton ever did; however, I try not to go places that will stress it.

Bottom line is that unless you are a business, there seems to be some evidence that you aren't expected to run 64-bit. YMMV.

Thanks,

Steve Nelson

Agreed. I think 64-bit is the wave of the future, but that wave is still fairly far offshore, and will stay there until the driver magicians begin to work their magic. As usual, there's a chicken-and-egg situation for the moment.


And finally, instant messaging is very much in the news. Some IM programs save everything by default. Others require you to cut and paste into notepad or some other text processor. I don't use Instant Messaging, so I have to rely on others to tell me about it:

JERRY: I work these days mostly on electronic design automation (EDA) software - i.e., software used for chip design - currently at Synopsys, Inc., and before that at Cadence Design Systems, both in Silicon Valley.

At both companies IM is used on a daily basis, albeit not necessarily with the blessing of the IT department.

At Cadence it was MSN Messenger. At Synopsys it's Skype, with colleagues in Yerevan, Armenia, in Shanghai, and working at various sites in the US.

Meanwhile many friends have used Yahoo Messenger for years.

IM text chat has the immediacy of a phone call with some advantages: (1) you don't disturb the people around you with the conversation and (2) you have a record of what happened - just as with email.

IM also provides (unless you work to defeat it) "presence" - knowledge that the other person is present and online. It lets me know when a friend in London is back at his flat in the evening or that one of the programmers in Armenia is in the office.

One interesting side effect of having used IM at several different companies is that I can easily keep in touch with former colleagues in spite of changing email addresses.

By the way, in every IM system I know you can disable message logging and you can defeat the presence aspect by making yourself invisible either globally or to selected contacts. You can also control who is allowed to invite you to be a contact and whether people who are not contacts can send you messages.

Just as with the telephone, it requires discipline to know when to interrupt people and when to let oneself be interrupted, but I find IM pretty indispensable.

I've been in the computer business almost as long as you - I'm 58 and first got paid to write software for the Space Science department at Rice University in 1967 and 1968.

I've seen a lot of things come and go but IM really is a new "paradigm" that works very well for a lot of people.

My $.02

Just don't get me going about cell phone text messages (SMS), the lifeblood of the teen and twenties set!

;-)

Cheers,

LAURENCE C. BREVARD

Thanks!