Computing At Chaos Manor:
Catchup Mailbag - Fall 2008

Jerry Pournelle jerryp@jerrypournelle.com
www.jerrypournelle.com
Copyright 2008 Jerry E. Pournelle, Ph.D.

November 27, 2008

2008 Fall Mail Catchup

I get a lot of mail, much of it interesting. I select some of it for Chaos Manor Reviews. Other mail goes into Chaos Manor Mail over on my View From Chaos Manor web site but that tends to be about matters other than computer technology.

Every time I do a Chaos Manor Reviews mailbag I select mostly from the latest mail, or the most interesting at the moment, but there is often a lot of really good mail left over. I've decided to try four catchup mailbags every year. This one is 2008 Fall; there will be a 2009 Winter, and then a 2009 Spring.


Responding to a point in your Oct. 16th mailbag

> I'll close this with Alex's final comment: "If only Groove worked on OS X, I might be able to try it..."

http://www.collaber.com from Hyderabad has a Mac version as well as Linux and Windows versions.

I've been "using it" during their free development period for more than a year. It currently works through a server in Hyderabad, but the intent may be to have the nodes run P2P.

There is also www.collanos.com from Germany.

Regards and appreciations,

Joe O'Laughlin

Haven't tried that; I'll see once I get some fiction finished. Thanks.


Mac Office 2008 and keyboard shortcuts

Dear Jerry,

I'm very pleased that your symptoms have mostly subsided, and look forward to hearing that you are regaining some energy.

I noted with concern your remark in the June 3 column about reverting from the latest versions of Word. While I won't comment on PC Office 2007, as I switched to a Mac in December 2007 without ever trying the new version of PC Office, I would strongly encourage you not to abandon Word 2008 for the Mac. In the latest version of Mac Office, Microsoft has fixed more than 98% of the issues that afflicted Word in Mac Office 2004, about which I wrote to you in February. Most importantly, graphics conversion is now essentially flawless, and of course everything runs smoothly since it is no longer working under Rosetta. There are a few residual annoyances in PowerPoint files converted from PC versions, but these are minor. Whereas I was previously working in Office 2003 within Parallels to cope with file conversion between PC and Mac (and you are apparently considering this approach as a fallback), I find that I hardly ever need to run Parallels any more. (What's more, the fix that will finally allow EndNote to work properly with Word 2008 is due this month.)

The saga described in the linked column at this site by the person who wanted to make Command-G execute a "repeat find" may well illustrate a lack of will on the part of Microsoft to comply with Mac conventions. However, the difficulties he experienced while attempting to fix the problem were, in my opinion, simply a consequence of not having the right tool for the job. I am now an enthusiastic user of Keyboard Maestro, a low-cost keyboard automation utility from Stairways Software at http://www.keyboardmaestro.com/ which you can try before you buy. Setting up a Command-G "repeat find" using this was trivial -- it took me a little over 30 seconds and works perfectly, as the macros in KM take precedence over any other key bindings.

In fact this excellent piece of software has allowed me to solve what I consider to be the greatest annoyance when writing on a Mac -- the fundamentally silly layout of the Mac keyboard (this assertion will doubtless annoy long-time Macfolk). While it is true that one can train one's fingers to cope with almost anything, I believe it is ergonomically unsound to type Command key combinations -- the Control key is a much more natural stretch. Swapping the Command and Control keys on a Mac is of course no big deal, but there's more to this: many navigation shortcuts use the Option key, so one has to work with two modifier keys, whereas PCs almost always use Control key combinations for both command and navigation shortcuts. For folks switching from a PC, or moving back and forth between a PC and a Mac, consistency would help. I have written a set of macros for Keyboard Maestro which replicates all the useful shortcuts as Control key combinations (a comprehensive set for the Office applications plus a generic set that works across Mac applications). This makes my life much easier and I have provided the full set to the folks at Stairways Software -- it is available for download from their site (NB I have no conflicting financial or other interest -- am simply a paid-up, very satisfied customer).

Best wishes, Rakesh

I have made my peace with Office 2007 for Vista and Office 2008 for the Mac. I am waiting for them to get the macros working properly for the Mac version; until then I use my network to send stuff from Mac to Vista where I use the macros on it, then send it back. That's a bit silly, but it's what I'm doing just now.

Word 2007 and Word 2008 take getting used to, but once one uses them a while, they're easy.


I periodically have problems with long delays in Internet access from the Mac. These turn out to be due to my Windows based network. Over time we've been fixing all that, and in another month or so I'll have completely replaced my old Windows 2000 Server Active Directory network. Having said that, I'll probably miss it; when it works it works well, but it doesn't play nice with Macs.

RE: DNS Troubleshooting (May column pt.II)

Dr. Pournelle,

It is wonderful to hear of your continued, if slower than you might have hoped, full recovery.

Reading your travails regarding Mac Internet and DNS, I have found a shortcut that can often save much time in the diagnostic phase: NOT using DNS to test. I found by chance that Google sometimes uses numeric IP calls, and the address http://209.85.135.147/ consistently serves a normal Google search page without needing any phonebook/DNS server to match "www.google.com" to "209.85.135.147".

I don't know if it is a relatively permanent situation (Google, like anyone on the Internet, can change its "phone numbers" relatively easily when it moves or expands servers or services), but so far it has worked well for months. I troubleshoot often for friends, relatives, and less technically inclined clients, and have just about memorized the address; but all you really need to do is note it inside the front or back cover of your log book with the other "important emergency phone numbers".

There are obviously many other sites reachable by IP address as well as by the more commonly used domain names (I also use http://66.161.12.81/ - dictionary.com - which I have bookmarked, but which upon submission reverts to the domain name). Many others, though, such as ChaosManorReviews and JerryPournelle.com, share one outward-facing IP address with other sites, and so DO REQUIRE DNS resolution in order to be visible. Google is just convenient so far, since it gives an immediate way to verify, without DNS, that the result isn't cached: do a "nonsense search" on any oddly typed combination of characters.

Hopefully this can save you a little bit of extra nonsense, though that is of course often the best part of reading you: nothing is lost as superfluous.

Best regards and wishes for your continued recovery,

James Siddall jr

I use the IP address of the machine that hosts Chaos Manor Reviews and The View from Chaos Manor, located in a data center near Houston, for the same purpose. That works quite well.
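
The logic of James's test is easy to script. Here is a minimal sketch in Python, using the Google address he gives (which, as he notes, may change); it distinguishes a DNS failure from a general connectivity failure:

    import socket
    import urllib.request

    def diagnose(host="www.google.com", ip="209.85.135.147"):
        try:
            addr = socket.gethostbyname(host)   # the DNS lookup
            print(f"DNS OK: {host} -> {addr}")
        except socket.gaierror:
            print("DNS lookup failed; testing raw connectivity...")
            try:
                urllib.request.urlopen(f"http://{ip}/", timeout=5)
                print("Fetch by IP worked: the link is up, DNS is the problem")
            except OSError:
                print("Fetch by IP failed too: the connection itself is down")

    diagnose()

If the fetch by IP succeeds while the name lookup fails, you know to look at your DNS settings rather than your modem or router.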


Mac DNS tricks

Dr. Pournelle,

Speaking as a former Windows user and network admin, I agree completely that the OS X networking preference panes are NOT a sterling example of ease-of-use. And I think they were actually *better* in the last version of the OS (Tiger, 10.4) than they are now (Leopard, 10.5).

Go to System Preferences, then choose Network.

SHORTCUT: from the "Apple menu" in the upper left corner of the screen, select Location, and then choose Network Preferences from the 2nd level menu - this will put you directly into the Network Preference Pane.

Choose the appropriate Location from the dropdown list at the top of the pane, then choose the desired interface (usually Airport or Built-in Ethernet) from the list at the left.

With "Airport" selected at the left you get very little info in the dialog - everything is hidden behind the "Advanced" button. Click that, then choose the TCP/IP tab at the top of the new dialog, and you are in business. If at "Configure IPv4" you have selected "Using DHCP" then you get a big button next to your IP address labeled "Renew DHCP lease."

DNS gets its own tab at the top. It's easy to see what addresses your machine is currently using for DNS, and to add or remove them. Addresses that are manually configured appear in dark type and are selectable (for deletion); addresses pushed down via DHCP are in lighter gray type and are not selectable.

I recently came across the terminal command for flushing the DNS cache (the command is new in Leopard, as Apple changed the background process that handles directory lookups in this version of the OS). In the event that you need to flush the cache -

Open a Terminal, and type:

dscacheutil -flushcache

Note that it is "ds" at the beginning (for Directory Service) and NOT "dns" (for Domain Name Service). Easy to get confused.

Hope this was helpful to you!

--Matt Knecht
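
(On Tiger and earlier the equivalent command was lookupd -flushcache.) After flushing, a quick way to confirm that name resolution is healthy again is to time a lookup; a minimal sketch in Python, and any well-known hostnames will do:

    import socket, time

    for name in ("www.google.com", "www.jerrypournelle.com"):
        t0 = time.time()
        try:
            addr = socket.gethostbyname(name)
            print(f"{name} -> {addr} in {time.time() - t0:.3f}s")
        except socket.gaierror as err:
            print(f"{name}: lookup failed ({err})")

A lookup that takes several seconds, or fails outright, points back at the DNS servers listed in the Network preference pane.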


A new Era in digital photography?

EOS 5D Mark II, 1080P video and its implications

Jerry,

I've written to you about the Nikon D3 and its prosumer version, the D700, and how they have cleaned Canon's clock in the professional world. As expected, The Empire Strikes Back: Canon, not to be outdone, has announced the EOS 5D Mark II. This is priced directly against the prosumer D700 (a D3 in a smaller, less expensive body, but with almost identical processing power - hence prosumer). In fact, the 5D is priced $300.00 lower, at $2,699.00. It will be released in November.

Canon apparently added newer-generation processing chips, a high-resolution rear panel display, and other improvements to compete against Nikon. Oh yes, and they increased the full frame sensor to 21 megapixels, the same as the pro-level, $8,000.00 EOS 1Ds Mark III.

Everything except the sensor resolution was pretty much expected. Here comes the neat part.

Canon added 1080P video recording capability. That's 1920x1080 non-interlaced at 30 frames per second. (For reference, true 1080P is 60 frames per second.) With a 16GB Compact Flash card, one can record up to about 15 minutes at a time. Canon uses H.264 compression, which is very high quality and variable rate based on content. Neat; but what is really important is that you get to use Canon's L series lenses. You have to use them to understand just how good they are. Unlike the hi-def camcorder you may have purchased, this new camera and pro-level lenses give you the same format, the same control of depth of field, and similar optical quality to what the studios use! So, for about $6,000.00, one can have an EOS 5D Mark II body plus the three standard L series lenses, and make studio-quality video.

I'm not saying that this first camera from Canon will cause all of those $250,000.00 cinema cameras to hit eBay, but this is the beginning of a new era in video production. The EOS 5D Mark II is the first in what will be a whole slew of pro-level cameras at a fraction of the cost of current hardware. And of course, they are also world class still cameras. Couple that with things like the upcoming Mac Pros with the Nehalem processors that should hit the streets in the next few months, and that million dollar editing studio is obsolete. Moore's Law in action indeed. What happened to audio production is about to happen to video.

We were talking about movie distribution, Hollywood, and the money men. The other thing that is happening to change things is our friend the Internet. Apple has already put in place the infrastructure to allow anyone with Mac hardware to purchase, download, and play at will high-def video. Amazon is doing standard def on your PC and on TiVos. The model is changing fast. While movie theaters are nice, my 46" LCD, Denon sound system, and a Blu-ray player are so good that I really don't want to sit in a dark room with a bunch of strangers. Why not sit in my own dark room with my wife and kids? And by the way, the aforementioned home theater is by no means top of the line, just what I've cobbled together over the years. Maybe $5,000.00 worth of hardware. For $3,000.00 today you can get most of the effect; $2,000.00, a little smaller, but good; and so on. In a couple of years, $1,000.00, say?

The only real bitch I have is the silly copy protection built into Blu-ray. I fully expect the general public to get really pissed at Hollywood over that. But with all of the new hardware becoming available, and the Internet to distribute content, maybe Hollywood doesn't matter all that much in the long run.

Here are some links to pro's previewing the EOS 5D Mark II:

http://www.usa.canon.com/dlc/controller?act=GetArticleAct&articleID=2086

http://www.luminous-landscape.com/reviews/cameras/5dmkiipre.shtml

http://www.robgalbraith.com/bins/multi_page.asp?cid=7-9316-9607

Phil Tharp


A Freeware source:

useful freeware

Jerry, I discovered a vein of useful, free software that I've been using for a few months now and thought your readers might like to know about. (I maintain a 3-station wireless LAN all running XP/SP3.)

1. Revo Uninstaller: http://www.revouninstaller.com/index.html

2. Real Alternative: http://www.free-codecs.com/download/Real_Alternative.htm (includes Media Player Classic -- a tidy replacement for Windows Media Player)

3. QuickTime Alternative: http://www.free-codecs.com/download/QT_Lite.htm

4. CD/DVD burner: http://www.osalt.com/infrarecorder

5. Software Inspector: http://secunia.com/vulnerability_scanning/personal/

Here's another I haven't yet had a use for, but may also be worthwhile -- especially for new laptop buyers.

6. PC Decrapifier: http://pcdecrapifier.com/

-- Cheers, Alan Messer

Understand, I have not had a chance to check these out. Comments from those who have will be appreciated.


This discussion took place during and just after the Apple WWDC in June. I didn't manage to get it posted when it was most relevant, but today, going over the old mailbag candidate messages, I found it of sufficient interest to Apple power users that I'm putting it here.

iPhone 3G

I've got a big beef here. This was WWDC, not the Worldwide Phone Developers Conference.

Where the hell was talk about the other two "pillars" of Apple that Steve mentioned in that one slide?

Where are the numbers about OS X 10.5 adoption?

Where was the information about OS X 10.6?

The WWDC keynote was a two-hour-long iPhone 3G commercial!

My beefs with the iPhone 3G as it currently stands are these:

1. The camera is still 2MP. This is a joke. Up it to at least 3.2MP (ditto previous revision).
2. Steve showed power numbers but didn't mention how GPS use affected them (my guess: bad).
3. No mention of cut and paste support (ditto previous revision).
4. No mention of video support (ditto previous revision).
5. The cheaper iPhone 3G price is because a *new* two-year contract with AT&T is *required*.
6. The unlimited data plan for the iPhone 3G is now $30/mo instead of $20/mo.
7. No mention of data tethering support (ditto previous revision).

I can appreciate what Apple is doing here. Apple is trying to make the iPhone a general purpose mobile computing platform. It's a market where Apple can make the hardware and the software, have a market penetration far beyond 6%, and make lots of money by executing well. The only other company in a similar position of making both the hardware and the software is RIM, and to date they haven't shown that they give a rip about making software development for BlackBerrys easy.

Now if I decide I want to develop for the iPhone, a cheap way to do it would be to pick up an iPod Touch, since it will pick up all of the features the iPhone gets with the 2.0 firmware. But it is somewhat annoying that to buy an iPhone 3G you *have* to get a two-year contract now, instead of having the choice of a subsidized version or a non-subsidized version.

-Dan S.

Peter Glaskowsky didn't quite agree:

WWDC is a multi-day, multi-track event, and the vast majority of it relates to Mac OS X. It happens that Jobs chose to focus the attention of his keynote onto the iPhone because that's where the most progress is being made this summer. He mentioned Mac OS X "Snow Leopard", but it's too early to say much about it in a public session.

Dan's comments about the iPhone 3G features that are unimproved from the original iPhone are relevant enough, but that list could be infinitely long to no particular effect.

I certainly don't believe that Apple is trying to turn the iPhone into a "general purpose mobile computing platform." It can never be that. But it can be more widely useful than it is, and the process of getting it there will take time and effort.

. png

And Dan answers

Peter, I'd disagree with you on the general purpose mobile computing platform. After all, Apple is spending what seems to be considerable engineering and developer time creating an SDK that is well documented and easy for third-party developers to use to create applications for the iPhone and iPod Touch. This is markedly different from what I see the other players in the phone space doing, except for maybe Microsoft. But Microsoft doesn't also control the hardware design; that is the key. People spent time in the various application demos at WWDC talking about the power of the iPhone's hardware to do a variety of things. Granted, there is still no officially supported multi-tasking on the phone, but it's conceivable this is a sacrifice necessary to fit within the power/performance envelope that Apple thinks is key to the end user experience.

Snow Leopard so far has some interesting aspects going for it, but when you think about the general purpose of Snow Leopard - to optimize the OS X codebase for size and performance - realize that Apple has stated the iPhone uses the exact same OS X kernel as a Mac. Snow Leopard to me sounds like an excuse for Apple to spend time optimizing OS X to run even better on the iPhone, while also giving them the opportunity to toss out the excuse that they wanted to do this "because desktop computers are going multi-core".

As to the hardware deficiencies of the iPhone and iPhone 3G, I realize some of my talking points only apply to power user types. But these are the same people who are often early adopters and tech influencers for the less technical when it comes time to buy. Among the people I know, about as many want to buy the new iPhone as don't; those that don't cite many of the reasons I listed earlier, while the ones who are buying fit into a somewhat less demanding target group, or have less specific requirements of the device. Myself, I would only switch to an iPhone once I knew it was a fully capable all-in-one device replacement. While I like the touchscreen aspect of the iPhone's UI, it alone cannot make up for the other shortcomings I listed.

So I will wait on the sidelines with my Nokia N95 (which, Steve Jobs says, the iPhone 3G renders web pages 36% faster than) until Apple makes an iPhone that addresses my needs, or the third-party developers address the shortcomings through add-on software.

-Dan S.

We now have some experience with iPhone 3G - well, some of us do. I don't have the 3G yet, and have been making do with the original iPhone - it will have to last me until I get some fiction work done and a new big book contract - but most of the mail I have from iPhone users is positive: they prefer the 3G to the older model. On the other hand, many of my friends have stuck to phones other than iPhones. Jobs and Apple have done us all a favor by making everyone scramble to make phones into small computers.

I am convinced that pocket computers that will also serve as telephones will be what we all carry in the future. What that form factor will be is still being settled; it really depends on data input capabilities. I see that BlackBerry is now trying touch-screen input keyboards. So far they haven't caught on. In my case I can manage with the iPhone input keyboard, but just barely: I sure would hate to be forced to use it for very much.

An iPhone app for reading documents

Hi Jerry,

I was just catching up on your daybook and I saw that you were having issues with iPhone book applications. Yeah, they stink.

I don't know if the app I'm about to suggest is any better at long documents, to be truthful, but it does wonders with shorter ones. I've been writing a book in MS Word 2008 for Mac and have been storing the chapters on my iPhone with a program called "AirShare".

It's not free, but it's useful. If you have WiFi turned on, it turns your iPhone into a WebDAV server, and you can use the iPhone's flash memory to store documents. I was naively using it just for storing a backup of my book when I discovered that AirShare has built-in readers for a number of common document formats, including MS Word 2008's .docx format.

You can set up a public folder if you want to exchange files with others, and a private username and password for your own stuff. I don't know if you ever have occasion to copy files onto a storage device and give them to others, but it's damned convenient to be able to just log in to their WiFi network, turn on AirShare, and tell them what URL to use to connect to your iPhone so they can grab the files. (It displays this information at the bottom of the screen.) No wires, no thumb drives, no physical connections required.

My chapters aren't much longer than around 7,000 - 10,000 words, but they load pretty quickly and paging through them is very quick. The program's even tilt-sensitive, and reformats the words so they fill the screen properly when you rotate the iPhone. You can scroll up and down by dragging your finger, or by pages with up and down arrow icons. It supports zooming by pinching, but you can't change the font size of the letters, which could pose a problem. AirShare isn't perfect, but I think it's a really useful program. Maybe you'll find it so as well.

All the best,

Jeff Kirk

Note that AirShare for the iPhone is now called "Air Sharing". Peter Glaskowsky uses it. Apparently there were too many products called AirShare.
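
Since Air Sharing speaks standard WebDAV, anything that can issue an HTTP PUT can drop files onto the phone, with no client software needed. A minimal sketch in Python; the address, port, credentials, and filename here are hypothetical placeholders for whatever the app displays at the bottom of its screen:

    import base64
    import http.client

    user, password = "me", "secret"                 # hypothetical credentials
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()

    # hypothetical iPhone address and port, as shown by the app
    conn = http.client.HTTPConnection("192.168.1.50", 8080)
    with open("chapter07.docx", "rb") as f:
        conn.request("PUT", "/Documents/chapter07.docx", f.read(),
                     {"Authorization": f"Basic {auth}"})
    print(conn.getresponse().status)                # expect 201 Created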


Linux and Aunt Minnie --

Dr. Pournelle,

I was reading your "Computing at Chaos Manor, May 8, 2008" column about the Microsoft/Yahoo debacle and your speculation that Linux may not be ready for the Aunt Minnie users out there. I might have agreed with you a month ago, but a new piece of hardware changed my mind.

Several years ago, upon your praise of the NEC MobilePro, I purchased one and found it very useful when I didn't want to be weighed down by a full-sized laptop. The scaled-down version of Word and the need to sync between the palmtop and the desktop were the price for transportability.

In my quest for a better solution, I recently purchased an Asus Eee subnotebook. Like the MobilePro, it is solid state, with a "hard drive" of 4 GB. I added an 8 GB SD card for data storage. Total retail cost [minus taxes] was $458. Most of the unit's "hard drive" is taken up by Linux and OpenOffice. The user interface is intuitive enough that Windows users can navigate around and perform basic functions without the fear and loathing the mention of Linux might bring to the naive user.

OpenOffice, while not as powerful as Word, has just as much usability once one is used to the interface, and it can store docs in RTF format, so migration between OpenOffice and Word is almost transparent.

The Eee [bad name, BTW] connects via Wi-Fi to my home network, so there's no need to sync the unit as with the MobilePro. It is compact and light -- about the size of a 1"-thick trade paperback -- and weighs about 2.2 lb. Along with OpenOffice, Firefox and Thunderbird come pre-installed.

The keyboard is compact -- about the size of the MobilePro's -- but I carry an older Fellowes folding keyboard and a wireless optical mouse. The display is only 800 x 480, so there is some scrolling around. But the keyboard and display are trade-offs I am willing to make to carry a much smaller note-taking PC with me, rather than my standard laptop.

If the Eee's Linux OS and user interface were migrated to a desktop PC, I don't think Aunt Minnie would have any problems using it.

In fact, I just read a release in PC World, dated May 28, saying that Asus will be marketing an EBox desktop version PC. Maybe that's all Aunt Minnie would need for her word processing and e-mail needs.

Pete Nofel, editor
pnofel@nelsonpub.com


I think the future is the Atom. You're right: for the vast majority of users, PCs have reached fast enough. Most users don't need 32 cores or 10 GHz CPUs. Unless you are running Vista, the average single-core PC will do what Aunt Tilly and the average office worker need just fine. I am betting the future is smaller, lighter, and quicker.

Think of airliners. A 707 isn't really any slower than a 777 or a 787, but a modern plane uses a lot less fuel and crew, and spends a lot less time in the shop. I think the PC of the future will use less power, produce less heat, and will not require a lot of fiddling to keep working. Think of the Asus Eee PC as an example. And by quicker I mean quicker to turn on and to start up the application I need, not faster to ray-trace an image of the world.

I am so grateful that you are doing better. Best of wishes and take care of yourself.

LTWATCDR

As I have said elsewhere, I don't think the desktop is dead, but the pocket computer is certainly the wave of the future. In The Mote in God's Eye I described pocket computers connected by Wi-Fi to the ship's computer or home computer, and those connected to what amounted to the Internet - a universal database. Of course the naval warships had to be self-contained and connect to the Internet at intervals. That was the model I saw in the 1970's, and I am not sure I have many revisions to make. We all will want to own our data; it may be out in the cloud, but we want something we can get at even when the Internet fails.

As to multiple cores, the point is that with enough local computing power the operating system becomes irrelevant: it's just another application to be run simultaneously with others. Of course multiple operating systems may be just a phase, but if so, I don't know when it ends.


Dear Dr. Pournelle,

Have you run across anything like this -

"Another glitch Heiker continues to confront is a real doozy: with no explanation in sight, his 64-bit Vista PC has accumulated some 23 million Registry entries. No, that's not a typo -- 23 million.

"I brought this to Microsoft's attention and there's no solution to it," he said. "Apparently, a Registry entry is made each time a 32-bit application tries to update the Vista-64 Registry ... duplicating Registry entries a huge number of times."

This is from the "Windows Secrets" newsletter - http://www.windowssecrets.com/.

I bought a Gateway GT5692 computer two weeks ago which had Vista 64 factory installed. Running "RegClean" reported an enormous number of registry entries (over 2 million). Thinking the Vista install was "broke," I reinstalled it with the operating-system reinstall disk that came with the computer, but with the same results. A nice thing is that Gateway includes Vista 32 on the disk (you have your choice - 64 or 32 bit Vista), so I installed Vista 32 and the registry seems to be normal now.

Also, on your slow connecting to the Internet with Vista - I've had the same thing happening, both with Vista Basic on another computer and with the new Gateway. I assume it's a Vista Internet security check "problem".

Tom Slater

I put this to the advisors, and Eric Pobirs answered:

Most of the source article is grossly misleading and much of it dubious.

In one testimonial the person has not actually used Vista 64! Then there is the griping about the small number of 64-bit native apps. In nearly all cases the current 32-bit version runs perfectly well and would get no major benefit from a 64-bit version. The companies whose products would benefit are well aware of it and doing the work. Adobe, for instance, is making a big 64-bit commitment because their apps will benefit greatly. The article makes it sound as though 64-bit users were cut off from the majority of 32-bit apps, yet we haven't had anything fail to work correctly to date that wasn't also incompatible with Vista 32. In both cases, if there is a vital bit of legacy software that must be run on a Vista machine, that would be a use for an older Windows version under Virtual PC, if one machine must do it all.

A 64-bit Flash would be nice but mainly because it would force Adobe to plug some security holes, as noted a while back.

I think your correspondent panicked over nothing when he chose to go back to 32-bit, especially if he has more than 3 GB of RAM and will lose access to it. Was there any indication the registry was impacting system performance?

A quick check of machines here indicates that nothing seems remarkable about the Vista 64 Registry. My brother's system downstairs is an Intel quad-core system with 8 GB of RAM. It has been in use for about seven months and has a large number of items installed. Its registry is 340 MB. Large, but nothing extraordinary, as my Vista 32 system with far fewer apps installed still has a 205 MB registry. (The Vista 64 box is Andrew's main desktop, while my Vista system is frequently kept in reserve for times when the hardware performance is needed, such as video editing and rendering.) My laptop with Vista Home Premium has more stuff on it and I'd expect it to have a larger registry, but it's just 204 MB. (This may be because I've run registry cleanup tools on it when I suspected a certain problem that unaccountably went away on its own.) By comparison, the adjacent Windows XP machine's registry (which got a recent Registry Mechanic session) comes in at a little under 100 MB.

So, Vista appears to have a much bigger registry than XP but the 64-bit registry issue does not appear to be endemic.

None of the other stuff Heiker mentions has manifested on any of the Vista 64 installations I've used. I'm inclined to suspect he has done something to screw up his machine that he is not acknowledging. For instance, native 64-bit mouse drivers are part of Vista 64. Why did 'old drivers' exist on his system in the first place? I suppose this may be the result of an upgrade install but I've never seen Vista 64 behave thus, especially considering how demanding it is in regard to drivers. As a retired IT guy, why didn't he just go in and delete the now useless 32-bit driver store once this was regarded as a source of trouble?

Eric

I have no such experience with my 64-bit Vista installation.
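
For readers who want to sanity-check claims like Heiker's on their own machines, here is a rough sketch using Python's standard winreg module that counts the keys under a hive. It is slow on a big registry, and a key count is not the same thing as the registry's size on disk, but a count in the tens of millions would certainly stand out:

    import winreg

    def count_keys(root, path=""):
        """Recursively count registry keys under root\\path."""
        total = 1
        try:
            with winreg.OpenKey(root, path) as key:
                i = 0
                while True:
                    try:
                        sub = winreg.EnumKey(key, i)
                    except OSError:          # no more subkeys
                        break
                    total += count_keys(root, path + "\\" + sub if path else sub)
                    i += 1
        except OSError:
            pass                             # no permission to open this key
        return total

    print(count_keys(winreg.HKEY_LOCAL_MACHINE, "SOFTWARE"))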


Dr. Pournelle,

You wrote: "Many programs become nearly useless within a year or so of the departure of the programmer who wrote them"

My first job out of college, lo these many years ago, I was maintaining a program used in a motion control system. 400k of program and data on a DOS box. So it had to use every trick in the book to squeeze operating system, program, and data (that was in use) into the 640k available. Obviously the program was written in C, with some assembler. A bit of machine code to talk to attached equipment. Lots of bit flipping.

So, looking through the code to track down a bug, I came across the following comment from the original programmer: //Why did I do this? -- followed by two pages (when printed out) of various nested loops, the pow() function, and other oddness. With, of course, no other comments. It took all day tracing through the code to figure out what he was doing, and why, and where the bug was.

Since that day I have been fanatical about commenting. Not just so that whoever comes after me can read my code, but so that I will never, ever have to put //Why did I do this? in a comment a year after I've written the code.

Sincerely, Kit Case
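
The difference Kit describes is easy to show with a small made-up example. Here is a classic bit-twiddling trick, sketched in Python: first the way his predecessor would have left it, then the way that spares the next maintainer a day of tracing.

    # The uncommented original: it works, but invites "Why did I do this?"
    #
    #     while n: n &= n - 1; c += 1
    #
    # The same trick, written so the next reader doesn't need a day to decode it:

    def count_set_bits(n):
        """Count the 1-bits in n. Kernighan's trick: n & (n - 1) clears
        the lowest set bit, so the loop runs once per set bit."""
        count = 0
        while n:
            n &= n - 1    # drop the lowest set bit
            count += 1
        return count

    print(count_set_bits(0b10110))   # prints 3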


Language Wars

Hi Jerry,

I expect your mailbag this week will be full of outraged letters from angry C programmers, so I thought I'd lend you a bit of support. Mark Allums actually had it right in his second paragraph: "The goal of a programming language is to get something done." That does mean that different languages may be better for different applications, but when it comes down to it, if the program is full of bugs, it is not doing the job. So a programming language that doesn't try to help programmers to avoid creating bugs is not generally fit for purpose.

Best wishes,

Cheryl Morgan


Purpose of a programming language

Mr. Pournelle,

The primary purpose of a programming language is to get something done that requires (relatively) complicated logic to decide among options (otherwise a purely mechanical system could do it, such as a temperature-sensing oven on-off burner switch).

However, there is "doing something" and "doing it RELIABLY". This is where programming languages like C fail miserably. You have to force yourself to ADD things to the code (notes, special buffer-overflow checks, etc.) and not "go over to the Dark Side" (use shortcut tricks that cut code but make the program almost impossible to understand, or to modify later to add or correct things). A programming language should DEFAULT to the "best coding practice" mode, so that you only need to violate this in (rare) very special circumstances. While speed of execution of the final operational code in the target computer is important, it is no longer much of a problem, given optimizing compilers, lots of memory, and high-speed, multi-CPU machines, once the programmer has some experience in allowing for such things in his programming technique.

The best reasons to use "natural language" type programming languages (Basic being about the best) are these:

(1) You may have to go back yourself later to fix or change the code and cannot remember why you did something. While NOTES/COMMENTS help, sometimes the code itself is part of the "trick" that makes an operation work (especially true with C) and, for the life of you, you cannot remember why you did something in the seemingly odd-ball way it was coded -- until you take it out and the whole thing crashes, after which you say, "NOW I remember!" and have wasted your time on an exercise you would not have had to put yourself through if you hadn't used that trick in the first place (avoiding it was usually possible, though it would have taken more time back then -- to some people, "there is never enough time to do something right the first time, but always enough time to do it over!"). If the programming language is easy to understand, this kind of thing will be rare. You can also get lost in the code and "lose your train of thought" when coding a hard process over several days, which would require you to start over from scratch -- at least in going back over your existing code in detail -- if you do not have an easily understandable coding language needing few tricks to implement the logic. Time is money.

(I once spent 3.5 years writing an assembly-language program by myself, never able to test it until the first draft was completely done. I essentially wrote the COMMENTS as if the program were written in a kind of Basic next to the actual code -- comments on EVERY LINE -- so that when a bug occurred, 99% of the time the COMMENTS showed what I was trying to do and the error in the code itself was obvious. I was able to assemble and debug the 5-inch-thick listing in one month. If I had been using Basic, I would not have needed to write such extensive comments, just details of what the values and functions used meant wherever there was any confusion between possible choices, saving a lot of time; the effort involved is one of the major reasons many programmers forgo COMMENTS. I also ALWAYS put in all parentheses, brackets, and braces in mathematical formulae to make sure there was no way to misinterpret the order of doing the math -- add these first, then multiply these, then take this to that power, then... Better safe than sorry later.)

(2) Somebody else may have to take over from the original programmer to update the program when bugs are found or requirements change. In this case NOTES/COMMENTS can help, but it is amazing how often, even when you (as the original programmer) write things out in what you think is "infinite detail only a monkey could not understand," he or she gets it wrong. The problem here is that the replacement person may never have seen your program before. Hell, he or she may never even have heard of the entire project it is part of, so he has no "feel" for what the program has to do to get the desired results -- assuming he or she even knows the exact desired results, from a perhaps poorly written spec. Programming is still an art, and a program is written much like a book. Some people are dyslexic in reading, and some programmers suffer from the same thing in coding (they get the program to work, more or less, but just try to figure out why they did something the way they did, if you can even figure out what is doing what to whom). It is essential that the code itself be easy to follow, so that you can understand the logic the original programmer used, right or wrong, in doing that particular process (you have to understand the hardware and external constraints thoroughly, of course, to have something to start your analysis from). How many programs had to be scrapped and completely redone because nobody could figure out the old code, even with the original source code, spec docs, and extensive notes? This is one of the SOP situations with many C programs. However, once away from the project, the original programmer can say "no skin off MY nose" or "not my responsibility anymore" as he laughs all the way to the bank, leaving you, as the replacement programmer this time, as the guy who has to clean up the mess ("Your brochure said that this program was 'plug compatible' with any new printer! Why can't we use the 100 new printers we just bought!?") or update to the new configuration ("We want it yesterday!"), with your employer holding the bag. This is obviously not a good idea.

(3) In many cases (most, actually), you are coding the program for somebody else. If he asks you a pertinent question about how your program handles a particular function, most languages, including C, make it IMPOSSIBLE for you to actually show him what you are doing (even YOU can get confused without detailed NOTES/COMMENTS while you are working on the original code itself, due to the rather esoteric forms of the STANDARD LIBRARY ROUTINES to do things). If he cannot be sure that you are doing it the right way, even a small, inconsequential bug (which is inevitable) may be blown up in his mind as a symptom of your incompetence (how many strikes will he give you before you are out of the game?). If he has been bitten before by bad software in previous projects (very likely), you do not have much slack before you get dumped. If, however, he sees that your step-by-step logic is correctly attempting to implement his ideas, he will be much more tolerant of the occasional bug ("We all make mistakes!") and listen to you if you find some major problem in his ideas that require a change in strategy to accomplish. Thus, using a programming language that is easy for EVERYONE who is involved in the design process to follow is a major benefit. (I myself got so good at Basic that I could eyeball scan somebody else's program when asked to and find errors EVEN WHEN I DIDN'T KNOW WHAT THE PROGRAM WAS DOING -- the bad code was kind of "outlined in red" when my eye hit it and I could fix it immediately and show the programmer the problem and solution, before continuing my scan. Just try doing that with C!! Highly unlikely!)

Executable code is for the computer. Source code is for the programmer and anyone else who has a hand in the software/firmware/hardware design and implementation process. They are totally separate things if you have the correct compiler to change the second into the first.

Nathan Okun

I have not done serious programming in years, but I found that I could get a lot done quickly in compiled structured BASIC so long as I stuck to decent structural principles; but for a larger project, Wirth's Modula-2 was easy to understand and took less debugging than any other language. That was many years ago, and I am sure things have changed since then.
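
Nathan's point about writing out every parenthesis is easy to illustrate; a tiny sketch (Python here, but the principle holds in any language):

    a, b, c = 2.0, 3.0, 4.0
    y1 = a + b * c ** 2      # correct, but relies on the reader knowing precedence
    y2 = a + (b * (c ** 2))  # identical result; the intended order is explicit
    print(y1, y2)            # both print 50.0

The two lines compile to the same thing; the second is the one you want to find when you come back to the code a year later.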


Subj: Programming Languages: From Smalltalk to Squeak and beyond - TWIT interviews Dan Ingalls

Ingalls was one of the original implementers of Smalltalk at Xerox PARC.

More recently, he has worked on Squeak, which is an open-source reimplementation of Smalltalk.

http://twit.tv/floss29

Most of the show is interesting history, but also, towards the end, there is interesting material about what's going on now out on the frontiers of languages and programming systems:

http://www.squeak.org/

Sun's "Lively Kernel" for web programming http://research.sun.com/projects/lively/

The "E" programming language and environment for distributed capability-based programming http://www.erights.org/

Rod Montgomery==monty@starfief.com

Beginner programming languages

I learned Pascal as my first language, so if for nothing else I have to give it a thumbs up. Python also has its pluses.

But I would vote for Java. It is very popular, powerful, well documented, and you can get everything you need for free.
http://java.sun.com/docs/books/tutorial/
http://netbeans.org
http://eclipse.org

It is what most colleges use as their first programming language.

I would also have to suggest Squeak. It is not very well known, which I think is a shame, but it is a free Smalltalk-based language. http://www.squeak.org/

LTWATCDR

I would certainly recommend that beginning programmers gain some familiarity with Java, if only because so many know it. There was a time when the most widely used programming language was dBase II scripting. That went away when George Tate, co-founder of the language's publisher (Ashton-Tate; Ashton was either made up or Tate's dog, depending on when you asked), died at his desk and his successors couldn't get their act together. I often wonder what would have happened had Tate lived.


HD TV Formats

Jerry

You missed one significant HD format: 720p. This is a wide-screen progressive scan format. Of the major networks, I believe that at least ABC uses this. Native resolution is 1280 by 720, and being progressive scan, it's very good with fast action, such as sporting events. I know that NBC uses 1080i, 1920 by 1080 interlaced, which is higher resolution but being interlaced is not as good for fast action.

1080p is the best of both: the higher resolution, with progressive scan. As you said, no broadcast network uses this because of the bandwidth requirements. The only 1080p sources I've heard of are high-definition discs, such as Blu-ray.

When you are looking at connecting your cable box to the TV set, check to see if both have HDMI ports. This will give you the best picture and sound, and in fact is the only interconnect that supports 1080p.

One other thing to consider, if you have a stereo in the same room with the TV, is to connect the sound outputs from the TV to the stereo. While this isn't the same as 5.1 or 7.1 Dolby surround sound, it is noticeably better than the tinny speakers in the TV set.

Just a few hints from someone who has been using her HD set for a couple of years now, and enjoying it tremendously.

Oh, by the way, I get about 30 true HD channels from my cable system, including, in addition to the major movie channels, such stalwarts as History, Discovery, National Geographic, and A&E. And the Discovery program "When We Left Earth", on the NASA space missions, was just spectacular. NASA apparently shot a lot of this stuff on high resolution color film, and it really shines here.

Karen

Karen Parker
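
A quick back-of-the-envelope comparison of the two broadcast formats Karen mentions shows why they need comparable bandwidth (1080i's 60 interlaced fields per second amount to 30 full frames):

    # Raw pixel throughput; actual bandwidth depends on compression.
    p720 = 1280 * 720 * 60     # 720p: 60 full progressive frames per second
    i1080 = 1920 * 1080 * 30   # 1080i: 60 fields = 30 full frames per second
    print(f"720p:  {p720 / 1e6:.1f} Mpixels/s")   # ~55.3
    print(f"1080i: {i1080 / 1e6:.1f} Mpixels/s")  # ~62.2

720p spends its pixels on temporal resolution, which is why it handles fast action well; 1080i spends them on spatial resolution.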


HD Cables

"I have been told that using a DVI cable would improve the picture; RCA cables don't have great impedance matching."

Well, you were told misleading information. For 99% of consumer applications any difference is immaterial.

For professional purposes it is probably 99.9%. I have yet to see a professional installation that does not use RCA or BNC component cables (yes, BNC is better, but the point is that they still don't use DVI).

Gene Horr


One user - at least one CPU

To which I'll add

One document, at least two storage locations.

Save early, save often. Save on different disks. And, after watching a hurricane bear down on my house: save in multiple geographical locations.

The "cloud" is a great place for storing one copy of a document. But I'd always have a local backup - which is then backed up along with my other important files to a location somewhere at least 100 miles away.

My personal strategy is to email important documents from my yahoo account to my gmail account. As long as I can access either yahoo or gmail I can get a copy of the document.

Jim Coffey
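
Jim's ritual is also easy to automate with Python's standard library. A minimal sketch; the addresses, server, and password here are hypothetical placeholders:

    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["From"] = "me@example.com"              # hypothetical addresses
    msg["To"] = "me.backup@example.com"
    msg["Subject"] = "Backup: chapter07.docx"
    with open("chapter07.docx", "rb") as f:
        msg.add_attachment(f.read(), maintype="application",
                           subtype="octet-stream", filename="chapter07.docx")

    with smtplib.SMTP_SSL("smtp.example.com") as server:   # hypothetical server
        server.login("me@example.com", "app-password")
        server.send_message(msg)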


The right way to look geeky

You write in your latest column that wearing a Bluetooth headset doesn't so much look geeky as like geek theater. It occurs to me that somebody could make a fortune selling Bluetooth sets that look like the little communicator Lt. Uhura wore on Classic Trek.

-- Joe Zeff
If you can't play with words, what good are they?