
Computing At Chaos Manor:
The Mailbag

Jerry Pournelle jerryp@jerrypournelle.com
www.jerrypournelle.com
Copyright 2008 Jerry E. Pournelle, Ph.D.

July 14, 2008

Eric Pobirs comments on the June (2) column:

It's a misnomer to refer to the versions of Vista other than Ultimate as 'crippled.' This is like saying a new car is crippled if you didn't buy all of the available options. You could tell someone with an XP Home system that it was crippled, but since the most significant difference between Home and Professional was the ability to join domain-based networks, it would be a difficult loss to assess for a typical home PC or even small business user.

(Tech/Knowledge has a client that was sold a Small Business Server 2003 setup for each of its facilities and it was a complete waste of money as the servers aren't doing anything beyond what could be had with a simple XP install on a workgroup's designated server.)

Although there are five versions to consider, only three really matter to shoppers outside of businesses large enough to justify a domain-controlled network. Vista Basic is for the bargain-basement shoppers. My brother picked up an inexpensive laptop for our mother that came with Vista Basic. It has all of the stuff that made her like Vista over XP, such as the prettied-up version of FreeCell. I'd have to direct her attention to the eye candy on a strong system to get her to see the difference, because it just doesn't get her attention otherwise. It gets the job done, but your readers are unlikely to buy something so low-end unless they have a similar 'Aunt Minnie' function to fill. So Vista Basic scarcely warrants any consideration most of the time.

Home Premium probably accounts for at least 75% of the licenses out there to date. Its closest XP equivalent would be the Media Center Edition. XP wasn't short of versions either, what with the Media Center and Tablet PC editions. Since all but Vista Basic include Tablet PC support, and Media Center is standard in Home Premium and Ultimate, things have been simplified in those regards since XP.

Then there is Ultimate. Google Shopping has the OEM version running from $150 to $250, while Home Premium gets as low as $95. So someone who builds their own systems would only be looking at a jump in cost of $60-80. Not bad if bells and whistles are a big draw, but if you need it factory installed on a new system the cost is far more dear: a difference of $150 on Dell systems. Not so trivial for functions you might never use, with free or low-cost alternatives readily available. For instance, the lack of Remote Desktop support in XP Home was a big issue for me, but these days the free service from LogMeIn (www.logmein.com) fills the gap nicely.

The Complete PC Backup and Restore feature is worth some money, but that money is just as well spent on Acronis True Image. Since True Image includes some very good drive-imaging utilities, it could be regarded as a better value than the equivalent portion of Ultimate's added cost over Home Premium.

The Fax and Scan app gets a lot of use, but an equivalent app is bundled with any decent scanner or MFP. The value here comes when you have a perfectly good scanner that is old enough that its maker refuses to produce Vista versions of its app set, or if you find a particular company's apps to be ill-behaved. (Lookin' at you, HP!) In general, if you've got a new-ish scanner/MFP to go with your new PC, you'll have software on Home Premium that provides everything Ultimate has built in, and more.

Even the cute Ultimate Extras like DreamScene have been hacked to run on other Vista versions. So if budget is an issue and you don't need domain login capability, you should be fine with Home Premium.

Eric

Eric as always makes good sense; but I still believe that Microsoft ought to sell one and only one version of Vista, and it ought to be the best they can do. It is not as if it costs them more to sell Ultimate as opposed to silly featureless Home. And if they concentrated on one version, they could afford to devote more attention to drivers for legacy hardware.

Peter Glaskowsky disagrees:

This would be bad business. Pick one price and you'll lose some sales entirely while leaving money on the table from other customers.

Providing multiple products to support customers with varying budgets produces more revenue. That's just a simple fact, and it's true in every industry.

Apply your reasoning to Eric's analogy and you'd have the auto industry selling exactly one pre-configured version of each automobile.

You may say this isn't a good analogy because the cost of the DVD is the same for all versions of Vista, but the cost of development certainly is not. Some Vista features, such as the Media Center functionality, simply wouldn't make the cut if there could be only one version.

The total revenue from Vista sales would be lower, so Microsoft wouldn't have the R&D budget to develop those features, so it would fail to profit from selling them.

"the bewildering variety of versions of Vista"

For most purposes, there are just four, and it's pretty easy to figure out which one is most appropriate for a given customer.

Microsoft Vista Compare Editions link

--png

I remain unconvinced. I don't recall that there were four versions of DOS, or of the early versions of Windows, when Microsoft was making Gates the richest man in the world. Perhaps it's an early influence: when I bought Zeke, my first S-100 system, I set up some games in BASIC. Barry Longyear used a Wang dedicated word processor which he rented; he read my description of the Star Trek game I had written, and called Wang to see if his computer could run BASIC and thus play the game.

No problem, they said, and told him what it would cost, some extra fee per month – he was renting the computer. So a customer engineer came out, opened the Wang – and removed a jumper. "All done."

Barry was incensed. So were most readers who heard the story. The system had the capability, but it was disabled.

If BMW built capabilities into its cars that were not enabled unless the customer paid extra, I suspect there would be considerable unhappiness. It may just be a matter of impressions.

But I would still rather Microsoft put its efforts into making the best product it can make, rather than playing marketing games with many varieties of Vista, all of which seem to have quirks we'd be better off without. I suppose that's an irrational desire on my part, but it goes with my world view that companies don't have to grow every damned year in order to be successful. Selling a good product for a decent price used to be the formula for success. In today's world that doesn't seem to be so: better to build bubbles, raise expectations of future income, and "grow".

I don't suppose it matters. Microsoft continues to market many versions of Vista, just as some of its management continues to lust after Yahoo and the advertising revenue game.

For the record, Peter Glaskowsky continues to defend Wang's practice:

Should Wang have sold its system at the same price without charging for BASIC? Then it wouldn't have been profitable to develop BASIC, and there simply wouldn't be any BASIC ROMs in there anyway.

Should Wang have sold the system for the higher price with BASIC included? Then it wouldn't have sold as many.

Wang was just engaging in good basic revenue management.

--png

Which may be true, but I don't see many Wang computers now. They chose to market specialized systems and charge for every feature; other companies chose a more flexible and more general strategy.

I suppose I am saying that building customer loyalty by giving the users more than they expected is very good business, particularly when you're selling operating systems. I have never "held back" anything in my novels; if I dream up a good scene it goes in. I don't save stuff for sequels.

I am hoping to build enough loyalty among readers that electronic piracy has minimal effect, because the readers think they ought to pay; that they get a fair deal, the best I can offer when I release a book. I think at one time Microsoft – and particularly Bill Gates – thought that way. There may have been problems with what they shipped, but they tried to fix those that better hardware didn't make moot.

I suppose I would rather see Microsoft sell one version of the operating system, and offer features and enhancements as an extra for more money, than have features in the system that are not implemented because not paid for. Perhaps I am wrong, but I don't think unused code is particularly good for something as critical as an OS.

Perhaps the world has changed and what we used to think of as "sharp practices" are not only usual, but effective. They make profits, and thus not much can be done. In any event, I think the positions are clear enough.


The operating system becomes irrelevant

Jerry,

All other things being equal your statement, "given powerful enough hardware the operating system becomes irrelevant," is true. However, in the case of Windows vs OS X vs Linux all other things are not equal.

In the case of OS X and Linux almost everything in the OS works as expected. Unfortunately, this is not true for any version of Windows that I have used and I have used them all since version 1.0.

Bob Holmes

Well, while your statement is true enough, I do not share your very low opinion of Microsoft Windows. On the other hand, I am moving toward a setup in which Windows only runs as an application on another OS, so that it can be scrubbed and reinstalled without much trouble.

And I will repeat, I sure wish Microsoft would concentrate on being the best code house it can be, and stop with the bewildering variety of versions of Vista.


Norton Save and Restore

I must write you about this trash. I bought it on your recommendation a while ago and only now got to install it. It gives me errors saying my license has expired.

Backup software must never expire to be useful!

PS: glad to read that you're feeling better

Wow. I agree that backup software must NEVER expire. I have used Ghost and Save and Restore and like them, and I guess they may have sent me non-expiring versions? Ye gods. It's one reason I really prefer that publishers send me off the shelf copies...

Peter Glaskowsky adds:

I did try to find out what's up with Norton Save & Restore. It doesn't seem to be a subscription-based product, but other Norton products are. Perhaps your correspondent got the Save & Restore functionality as part of a subscription-based product. Even that wouldn't necessarily mean that the Save & Restore features are subject to expiration.

Symantec does offer subscriptions to "Content Updates." That concept doesn't seem to apply here, but again, perhaps your correspondent was invited to renew such a subscription and thought the subscription was required to keep the application operating.

Really, only Symantec can clear this up. I suggest adding a line to that section along the lines of "Perhaps Symantec would care to explain?"

. png

I have put in three calls to Symantec, and this mailbag is overdue; I'll report what happens when I make contact. For the record, I have many copies of both Ghost and Save and Restore, and I have never been asked for validation other than on first installation.


Since purchasing my Kindle the day it first went on sale, I've purchased some 57 titles from Amazon, and 5-6 unencrypted Mobipocket ebooks from Baen, FWIW.

There's no way I would've purchased that many physical books in 8 months; even with all the things which the Kindle system lacks as a 1.0 product, the portability and convenience of the device have made it practical to read more books, and more quickly (I've always been a fast reader, but I seem to be reading about 20%-25% faster on the Kindle, interestingly enough) than I would've otherwise done.

Roland Dobbins

I can't match your record for books read, but I do find that I use the Kindle a lot. I have read at least a dozen books as well as some magazine articles. Some of the books were free. I bought others, and in one case a writer sent me a copy of his book in Kindle format. Kindle can be improved, but it remains pretty well Good Enough. I do not know what the sales figures are. I hear rumors of high sales, but I also hear rumors that not all that many have been sold. Don't know; perhaps when some of my books sell on Kindle I can get a better idea. But I do note that Amazon is selling a fair number of books in Kindle format.


About writer's rights

Jerry,

I've read over 50 pieces of writing (short stories and novels) in the last four months or so, every one of them either legally purchased or legally free (i.e., given away as free samples, specifically by Tor books). And yet, I haven't purchased a single physical book during that entire period--every one was an e-book. And, I can't think of a single book that I'll want to buy in the near future that won't be an electronic version.

I think that says something. I'm certainly not "typical," in that today I read most things on my Sony Reader with its better-than-paper readability, and what isn't (yet) available for that device I read on my Tablet PC as PDFs. However, it's only a matter of time before I'll be in the mainstream, particularly given continued advances in eInk and other forms of electronic paper. I've been reading with some interest about a device (the name of which escapes me) with a 5" screen that rolls up inside a cell-phone-sized device (which might even _be_ a cell phone).

I think that you and other professional writers have something to fear. It's a shame that, as you mentioned in your most recent column, the SFWA no longer sees fit to fight for your rights to control your property. Because, once technology makes only a few more strides, the "give away the e-book for free to sell the paper copy" theory will hold even less water than it does today.

I refer you to the following quote (link: http://wiki.creativecommons.org/Cory_Doctorow) for Cory Doctorow's very peculiar view on copyright and intellectual property, and one should remember that Mr. Doctorow is one of the leading proponents of the aforementioned "theory" and is an important influence in the EFF and Creative Commons.

"I believe that we live in an era where anything that can be expressed as bits will be. I believe that bits exist to be copied. Therefore, I believe that any business-model [sic] that depends on your bits not being copied is just dumb, and that lawmakers who try to prop these up are like governments that sink fortunes into protecting people who insist on living on the sides of active volcanoes."

With logic like "bits exist to be copied," you're doomed, I hate to say. And again, it needs to be noted: Mr. Doctorow is a leader in the anti-copyright movement...

Mark Coppock

That is a large and important topic, far too large to be dealt with by mail. While Doctorow is the most visible champion of the "information wants to be free" and "bits were made to be copied" school, there are others, like Eric Flint (senior editor at Baen Books), who continue to say that piracy is not only no threat but is usually good for sales.

Now Flint has a publishing model in mind: for one of their major authors, sales of some 25,000 copies of a book in hardbound at, say, $25 a copy. That means the author is getting about $2.50 a copy, or around $60,000 for the book. That's considerable money for science fiction, a lot more than writers were getting in the Golden Age, and a good bit more than most SF writers will ever see from any book.
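The arithmetic behind those figures, assuming the conventional 10% hardcover royalty (the rate is my inference; the letter supplies only the $2.50-a-copy result):

```python
copies = 25_000          # hardbound sales for a major author
cover_price = 25.00      # dollars per copy
royalty_rate = 0.10      # assumed standard hardcover royalty

per_copy = cover_price * royalty_rate
total = copies * per_copy
print(per_copy, total)   # 2.5 62500.0 -- "around $60,000" once rounded
```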

Flint's marketing model is to sell early eBook copies of the work at a premium price. They are not copy protected, and if purchasers give them to friends, that's all to the good. With enough buzz, hardbound sales of the book can go from under 10,000 to the 25,000 mentioned above, doubling the author's income and making good money for the publisher as well.

The flaw in that reasoning sounds selfish: but I point out that Lucifer's Hammer sold about 100,000 copies in hardbound and millions in paperback; and most of the income Niven and I got from that book over the past 30 years was from the mass market (paperback) rights. The book keeps coming back in print, almost always in paper. In my judgment eBooks compete with paperbacks; and up to now eBook sales have never been very high for any book. And as I said in the column, I do not believe that pirated electronic copies of a book help Ebook sales. I am sure there are some who think "Well, I liked reading that, so in fairness I'll buy a copy or perhaps send the author some money," and it has happened to me a few times over the years, but the amounts involved are insignificant.

The real impact of piracy has yet to be seen. As to the remedy, I don't know. Many people are honest, and would prefer buying books to stealing them. We can hope there are enough.


Programming

Mr. Pournelle,

You mention that programming takes time to do. Actually, with an English-type language, it is just like writing a book. I believe that we should program in BASIC (perhaps with added terminology to make it somewhat more English-grammar-like) and have "canned" algorithms to do the "math stuff", which seems to be where most people have their problems. I know people who did not ever learn programming who, with minimal help with a "BASIC terms dictionary" to explain a few less obvious commands (COMMON and SHARED and so forth required at the top to set up subroutines in other modules, etc.), could read a BASIC program and even debug it.

The canned modules would automatically handle such things as quadrants in trig routines and so forth that the built-in BASIC trig functions do not handle without added logic; that is, just create somewhat more elaborate trig subroutines with standard inputs and outputs (as in the C Library) so that you can simply say "using angles between 180 and 270 degrees, we get..." without having to know the sign conventions and all that detailed rubbish that gets in the way of doing the work when using the standard BASIC (or most other language) trig commands.
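The "canned module" idea can be illustrated with a short sketch (mine, not Mr. Okun's code, and in Python rather than BASIC): the library function atan2 is exactly this sort of pre-packaged routine, absorbing the quadrant sign conventions that a bare arctangent forces the programmer to handle.

```python
import math

def bearing_degrees(dx, dy):
    """Compass-style angle in [0, 360) for a direction vector.

    math.atan2 is the 'canned' module here: it resolves the quadrant
    from the signs of dx and dy, so the caller never touches the sign
    conventions a bare arctangent would require.
    """
    return math.degrees(math.atan2(dy, dx)) % 360.0

# A vector into the third quadrant (dx < 0, dy < 0) comes out near
# 225 degrees, with no quadrant bookkeeping in the calling code.
print(bearing_degrees(-1.0, -1.0))
```

The same principle is what the letter proposes for a richer set of trig helpers: standard inputs and outputs, with the "detailed rubbish" hidden inside.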

In most ways, a more English-like BASIC would allow direct conversion of a written document explaining what you want to do (the spec) into the program itself (the object) with minimal work. Those people who say that BASIC allows "spaghetti code" haven't seen C-language programs. BASIC is a flat language that allows, at worst, 2D spaghetti (using nested GOTOs and RETURNs that can get somewhat confusing), while C allows N-dimensional loops and twists and wormholes, since "it knows where the code lives" and can use pointers and so forth to completely scramble EVERYTHING (and, as I am sure you know, many C programmers even pride themselves on creating such "string theory" code)! It seems to me that the argument against BASIC is merely that it lets the hoi polloi do programming, which should be reserved for the "experts" (always the person making the argument, it seems to me). As a programmer of 35 years' experience with BASIC and FORTRAN and C and several assembly languages, I MUCH prefer to program in BASIC, since this is how I think (anyone who thinks in C would be a Borg, I would imagine!).

Most BASIC programming in my experience that causes problems has been the detail debugging work to handle the various variables and constants that have to be used in different modules (it is easy to forget to add them to the SHARED and COMMON lists, causing the program to fail in subtle ways, for example). If this kind of detail were automated -- simply make ALL variables and constants universal in their scope throughout the program unless specifically declared otherwise -- the difficulty for an average person of taking his problem and turning it into a working program would be minimal. Such a person would still need somebody else to review his work to catch errors and so forth, but this must be done for his regular writing, too, so this is not much different.

Only if the person cannot "get his act together" and sort out how to do the job he wants would he need to have some other programmer ("ghost writer" equivalent) do it for him. Even a good automated BASIC compiler with truly informative and useful warning and error messages (such as, "VARIABLE X is being used as the result of a SINE routine, so it cannot have a value of greater than 1 or less than -1. In STEP N, you are expecting X to have an out-of-range value.") that would handle most of the busy work (shared variables, etc., etc.) would solve 90% of any problems a non-programmer has with using BASIC-type languages. Computers are far and away powerful enough to handle such a detailed "programmer's assistant" super-compiler. THIS, not more elaborate versions of languages like C (C# or C++, for example), is the way to go in programming in the future. Let the computer figure how to sort out the details -- what do you care about this stuff, anyway?
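A toy version of the range-aware warning Mr. Okun imagines might look like this (my sketch; compiler writers call this interval analysis, and nothing here is drawn from any real BASIC compiler):

```python
# Each entry records the possible output range of a built-in routine.
KNOWN_RANGES = {
    "SIN": (-1.0, 1.0),
    "COS": (-1.0, 1.0),
}

def check_comparison(variable, source_routine, constant):
    """Return a warning string if `variable`, produced by `source_routine`,
    is compared against a constant that routine can never produce."""
    lo, hi = KNOWN_RANGES[source_routine]
    if constant < lo or constant > hi:
        return (f"Warning: {variable} is the result of a {source_routine} "
                f"routine, so it lies in [{lo}, {hi}]; comparing it to "
                f"{constant} can never succeed.")
    return None

print(check_comparison("X", "SIN", 2.0))   # warns: SIN never exceeds 1
print(check_comparison("X", "SIN", 0.5))   # None: nothing to complain about
```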

Even if the BASIC language source created a final machine code generated internally that was based on a C-language-type design philosophy, what difference would it make to the programmer? How many people have to go into the computer to see what machine code it generated when writing in any high-level language (other than when building a compiler or real-time system with interrupts, which would require much more detailed understanding of the hardware, but that is not what most people want a computer for)?

This is no different than any other kind of writing scheme, to my knowledge. You have to be able to write well to put down your ideas so others can understand them before you can do ANYTHING with them, to say nothing about programming with them! If you can't get this far, not being able to program is the least of your worries here!

You do not need to become a programmer: If you know English and can write in coherent and unambiguous sentences (not easy, but this has to be done in any scientific or engineering writing, anyway, I would think), you can have those sentences turned into English-style BASIC-type code with little problem. The resulting program might be long and wander about a lot (depending on how the writer wrote his original descriptive spec document), but as long as it "sticks to the yellow brick road", who cares how long it takes to get there when computers today run so fast and have so much memory and storage capacity? The object is to make sure that the program does what the writer wants (and ONLY what he/she wants, which is sometimes even more difficult to accomplish!), not how "efficient" it is in doing that -- we no longer have to stick our code into a COMMODORE 64. This may seem to demean the profession of programming, but originally there were people who specialized in writing letters and documents for the illiterate masses -- "scribes" -- and the introduction of universal schooling must have seemed the same way to them, too! Too bad! Time marches on!

Computer languages are for people -- computers use machine language, no matter what language was used to produce the original source code!!

Nathan Okun

I said in the 1980's that the real computer revolution will come when people who know how to do stuff can teach small computers to do it without having to become programmers. In those days Wirth and Dijkstra worked on algorithms for "proving" programs, and Wirth worked on languages that were closer to natural language in which the compiler would catch logic errors.

I still believe this is the right approach to living with small computers.


Multi-Core Processing

Dr. Pournelle,

Apropos of one of your points on TWIT this week, Intel's take on software for multi-core processors

[[Ars Technica link]]

It was great to hear you on TWIT. Keep healthy.

Richard York

Moore's Law is inexorable. We are coming into the era of computing plenty.


Subject: Computer operating systems

Dear Dr. Pournelle,

The June 30, 2008 Chaos Manor Reviews states that in the future, the operating system used will be unimportant or irrelevant. I do not see this myself. The money to be made by differentiating software with a new, improved, and bloated operating system is important for too many people.

I am stuck with an earlier argument. The job of the operating system is to provide connection to input, output, and data storage for the CPU (central processing unit).

Two primary schools of thought exist on this. One prefers the monolithic kernel, with as much work as possible done in the kernel: whenever possible, any routine is rewritten to be a kernel module, to make the routine more efficient. Microsoft, IBM, Apple, and Linux have prospered with this model, and do not wish to change.

The other school prefers the microkernel, with an absolute minimum done inside the kernel. Virtually all work is done in programs that have a well-defined interface to a kernel devoted mainly to directing message traffic among the various computer functions. QNX, Hurd, and Minix have tried hard to promote this simpler model. QNX sold commercially. RMS (Richard M. Stallman), a brilliant man, has pushed Hurd for a long time, but has not attracted the following of Linus Torvalds' monolithic Linux. Dr. Tanenbaum has a similar problem with Minix, another fine idea with a limited following.
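The split Mr. Jones describes can be caricatured in a few lines (a sketch of the idea only; it borrows nothing from QNX, Hurd, or Minix source). In a microkernel, the kernel proper does little besides route messages to user-space servers that do the real work:

```python
class MicroKernel:
    """Toy microkernel: its only job is directing message traffic."""

    def __init__(self):
        self.servers = {}

    def register(self, service, handler):
        # A user-space server announces that it handles a service.
        self.servers[service] = handler

    def send(self, service, message):
        # The kernel routes the message; the server does the work.
        return self.servers[service](message)

kernel = MicroKernel()
kernel.register("fs", lambda path: f"read blocks for {path}")
kernel.register("net", lambda host: f"opened connection to {host}")

print(kernel.send("fs", "/etc/motd"))
```

In the monolithic model, by contrast, the file system and network stack live inside the kernel and are called directly, which is faster but makes the kernel far larger.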

I prefer the microkernel. The moneymen, and most of the buyers, prefer the monolith. It is lonely having a drumstick without a drum.

regards,

William L. Jones

wljonespe [at] verizon [dot] net

What I meant about the irrelevance of the OS is that soon enough we will have the resources to run any operating system as an application under another operating system.


Hi, Dr. Pournelle,

I find that due to what is perhaps a unique perspective I don't quite agree with your contention that we're at the stage where things are "good enough" re speed and application efficiency.

My company makes software that is designed to let users design buildings and manipulate them in a 3D environment. Spreadsheet software exists to sorta/kinda estimate costs, which is probably quick enough for some, while CAD and the like are used to get stampable drawings. Neither of these product types is good at talking to the other; they don't do 3D well, and labour (among other factors) is typically done with yet another application. And none of these is necessarily good at updating a builder's inventory system. And so on. Suffice to say that just getting a usable, reliable quote into the hands of a buyer typically takes a week or more, and a reliable quote is only a small part of the overall problem. Our company bundles all of the aforementioned categories into a single application. Hopefully this describes the niche.

In our niche the holy grail is to let a field salesperson create a virtual building with all prices factored, all drawings created etc and all things considered and do so under 5 minutes. We can do this now, and what we are working on is creating far more complex building styles/architectures and still keep this in the "few minutes" range. The goal is to be able to let a user create anything at all -- whether it's a barn, residence or shopping mall -- and quickly. The reason for this ought to be obvious; the underlying idea is that this translates into a sort of architectural spreadsheet (after a fashion) for the design of buildings and sites.

("What if we do this instead of that? Oh, wait, I know, let's put part of the parking underneath this section. How much will that run?" And so on.)

This software is highly efficient by definition. Imagine designing the Empire State Building and doing it in 5 minutes and getting the engineering correct and the costs calculated down to the last penny and factoring how many trucks will be used to deliver the materials, how many pounds of glass are used (i.e. logistics factoring), and so on. That's the goal, and you can't get there using languages etc that are "good enough." Just to do what we do now requires every clock we can steal.

My observation, then, is simple enough -- based on what I know of just in our niche I figure that there is an entire classification of applications out there waiting to be created that will still require high efficiency, and for perfectly valid reasons. Certainly there will come a time where hardware is so blazingly fast that it doesn't matter if the software itself isn't efficient. But we're not there yet; in fact, we're still quite far from it.

A second observation, deriving from the first, is that simpler user applications (email filtering, photo editing, etc.) are at the stage you mention and could be considered first-generation applications. Second-generation applications that require visualisation are not at that stage. For example: your sidewalk is broken, and the concrete guy shows up and snaps a photo of your house. Inside two minutes he's showing you the first of many possibilities for how you can update the curb appeal of the house for roughly the same price range as mere replacement.

The concrete guy doesn't have to know spit about design. It's all part of the basic software "know how." Being able to do this and get it right with a $400 laptop and a left-side-of-the-bell-curve concrete guy requires a great deal of efficiency. The upshot is that as time passes, "good enough" will depend on which generation of application you're referring to.

Thanks for the time, and Godspeed on your recovery.

Gary Alston

I think we are closer to computer plenty than you do. Fast enough machines take all the sting out of Vista. I know there are applications that will bring our biggest systems to a choking point, but the hardware continues to improve even as we exchange letters.


Another VMware use

Jerry,

I'm going to use another tool chain for a new design. The tool chain only runs under Linux.

I was about to do the usual tedious setup of both a Linux installation and the configuration of a complex tool chain. Then I realized that I had VMware on the Mac Pro. So instead of the usual day or two of setup, I just asked the tool chain company to send me a pre-configured VMware virtual machine. All I do once it gets here is copy the virtual machine file to my Mac Pro's hard disk and run it. The virtual machine's networking will just work, and the virtual machine will easily have access to my Mac Pro's hard disk, so getting files on and off is no problem.

The new way to deliver major applications software - just send me the OS too.

Phil Tharp

At least as long as it's Linux, or you both have the proper licenses. Carry your virtual machine on a thumb drive...


Digital Television and the February 17, 2009 Transition

Jerry,

Here is some information that may be of interest to all of your readers that get their TV via an antenna rather than cable or satellite.

Almost everyone who watches television should know by now that something is going to happen to over-the-air TV broadcasting on February 17, 2009: analog TV broadcasts will stop, and a digital tuner will be required to receive over-the-air TV after that date. What they may not know is which digital channels are assigned to which stations, and how that too will change.

At transition, the bandwidth currently assigned to channels 2 through 6 will no longer be assigned for broadcast TV. Stations that currently have analog channels have been assigned an additional channel for use as a digital broadcast channel. (These channels are in the UHF band. Because of the nature of the digital broadcast signal it is possible to assign adjacent channels; this was not done with analog broadcasting, to avoid inter-channel interference.)

Now, on to what will happen on 02/17/2009 that might cause some hair pulling for those who already have a digital tuner or a TV with a built-in digital tuner.

Most stations currently assigned channels 7 - 13 for analog broadcasts will be moving their digital broadcasts to those channels at transition. In Los Angeles this means that KABC-DT, currently broadcasting on channel 53, will move to channel 7. KCAL-DT, currently broadcasting on channel 43, will move to channel 9. In fact, almost every station that is currently assigned a digital channel higher than its analog channel will be moving its digital broadcast to the analog channel at transition. (The lower the channel number, the lower the frequency; the lower the frequency, the better the signal propagation. This is particularly germane for channels 7 through 13, since channel 13 ends at 216 MHz while channel 14 does not begin until 470 MHz.)
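The channel-to-frequency relationship is simple arithmetic within each band. Here is a rough sketch in Python; the band edges follow the standard US broadcast channel plan, and the figures are illustrative, not taken from the letter:

```python
# Rough sketch: US broadcast TV channel number -> lower band edge in MHz.
# Each channel is 6 MHz wide; the bands themselves are not contiguous.
def channel_lower_edge_mhz(ch):
    if 2 <= ch <= 4:      # VHF-low: 54-72 MHz
        return 54 + (ch - 2) * 6
    if 5 <= ch <= 6:      # VHF-low resumes at 76 MHz after a gap
        return 76 + (ch - 5) * 6
    if 7 <= ch <= 13:     # VHF-high: 174-216 MHz
        return 174 + (ch - 7) * 6
    if 14 <= ch <= 69:    # UHF: begins at 470 MHz
        return 470 + (ch - 14) * 6
    raise ValueError(f"not a broadcast TV channel: {ch}")

# Channel 13 tops out at 216 MHz; channel 14 starts at 470 MHz.
gap_mhz = channel_lower_edge_mhz(14) - (channel_lower_edge_mhz(13) + 6)
```

Lower frequencies propagate better, which is why stations preferred to land their digital signals on channels 7-13 when they could.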

How do you cope?

If on the morning of February 17, 2009 you go to the menu of your digital tuner or digital TV and rescan the channels you should be good to go. You might also want to do this every few weeks between now and the transition date to pick up any new digital TV stations. (Not all analog stations are also broadcasting digitally at this point.)

As an aside, it might be interesting to know how you can punch in 7.1 on your digital TV or tuner remote and get KABC-DT, which is actually broadcasting on channel 53. Part of the information broadcast with the digital signal is the station's call letters and a channel ID. The channel ID does not have to be the "real" channel on which the signal is broadcast. All of this information is tracked by the digital tuner. (It is, after all, a rather sophisticated single-purpose computer.)
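A toy sketch of the kind of table a tuner builds during a channel scan may make this clearer. The station data mirrors the Los Angeles examples above; the structure is illustrative, not the actual ATSC/PSIP wire format:

```python
# Toy model of a tuner's channel table: the viewer keys in the
# "virtual" major.minor channel, and the tuner looks up which
# physical RF channel actually carries that program stream.
channel_table = {
    "7.1": {"call_sign": "KABC-DT", "rf_channel": 53},  # pre-transition
    "9.1": {"call_sign": "KCAL-DT", "rf_channel": 43},  # pre-transition
}

def tune(virtual_channel):
    """Return (call sign, physical RF channel) for a keyed-in channel."""
    entry = channel_table[virtual_channel]
    return entry["call_sign"], entry["rf_channel"]
```

A rescan simply rebuilds this table from what is currently on the air, which is why rescanning on transition day picks up the moved channels.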

Once you try digital, even with a digital tuner attached to an analog TV set, you will probably never want to go back. If you can get the signal you will have no snow, ghosts, or other analog artifacts. If the signal strength is marginal you may get short periods of pixelation when the signal drops below a certain threshold. Another benefit of digital transmission is that the signal is compressed, making it possible for a broadcaster to transmit multiple channels in the allocated 6 MHz of bandwidth.
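Rough numbers show why multiple channels fit: an ATSC broadcast carries roughly 19.4 Mbps of payload in that 6 MHz, while a standard-definition MPEG-2 stream needs only a few Mbps. The per-stream figures below are assumed typical values, not from the letter:

```python
# Back-of-the-envelope subchannel math for one 6 MHz ATSC allocation.
ATSC_PAYLOAD_MBPS = 19.39   # approximate total MPEG-2 transport payload
HD_STREAM_MBPS = 12.0       # assumed bitrate for the main HD program
SD_STREAM_MBPS = 3.0        # assumed bitrate for an SD subchannel

remaining = ATSC_PAYLOAD_MBPS - HD_STREAM_MBPS
sd_subchannels = int(remaining // SD_STREAM_MBPS)
# With these figures, a couple of SD subchannels ride alongside the HD feed.
```

The exact split is up to the broadcaster, which is why some stations carry one HD channel and others carry several SD channels instead.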

All of the above is about Digital Television (DT), with not a word about High Definition Television (HDTV). For over-the-air transmission, DT does not necessarily mean HDTV; however, HDTV over the air requires DT.

Bob Holmes

Thanks. I have been experimenting with the Hauppauge USB TV tuner and Vista. That seems to work. More as I learn more. And we have a new High Definition TV, and as soon as I can get an Ethernet cable into the room where the TV resides I'll connect up Apple TV.

Bits is bits, and digital certainly beats analog...

Thanks for the explanations.


Eric on the June Column Part 2

One area where I'd disagree is on the capability of modern consumer PCs.

By my casual estimate, the consumer PC trails by about ten years what is being done in the high-end workstation realm. This may not seem to matter, at first glance, to the interests of consumers, but it really does. Consumers generally don't know they want something until it is right in front of them.

Consider your first system. It got you started on word processing but this was several years ahead of the curve. It was a good decade from the appearance of the first dedicated word processing stations to when a non-technically inclined writer would be well advised to buy a computer and WordStar. Likewise, it took as long for WYSIWYG word processing to reach a wide spectrum of consumers.

The machines we have today are making accessible the applications that ran on super-expensive workstations ten years ago. You can do pretty good editing of 1080p video on an affordable system these days, so long as you don't mind waiting for it to render. You can spend some serious money on a fast system and still impatiently wait for the output to complete. Now that HD cameras are getting cheap, a lot more consumers are going to become aware of this, creating more demand for better encoding throughput from the CPU and/or GPU in PCs.

The transitions won't always be obvious, because the way high-end systems are applied can mutate into something else on the way to the consumer market, while still applying the knowledge acquired in creating the $25,000 version.

Ten years from now the new machines will be enabling things we can only dream of now, while others complain how horribly slow their particular app is running.

-- Eric Pobirs

I don't have a strong disagreement, but I do think things will move faster than you seem to expect. As to when to buy a system: I contend that getting Niven, who is not a geek, to buy an S-100 computer (actually he bought two: one was his, the other Marilyn's, but it also served as an instant source of spare parts) was good for both of us. It really ran up our productivity. I note that Greg Benford got a system identical to mine and churned out at least one novel on it, which more than paid for it. Precisely who would be well advised to keep up with technology will depend in part on productivity.

Of course we now have systems that are pretty well good enough for writers who work with text; will the next step be to add "enhancements" and illustrations? And will that need more computing power? I really don't know; but I do try to keep up.

It takes less time now for systems to go from high-end early-adopter gear to commodities. See the next letter:

Comment on "With sufficiently powerful hardware, the OS becomes irrelevant"

Jerry--

You wrote:

>I am rapidly concluding that given powerful enough hardware the operating system becomes irrelevant; the decision factor is the application you prefer.

Best Buy is selling a 64-bit, 4 core, 6 GB RAM, 640 GB disk system with the "Vista" operating system for (drum roll please) $900.

Nine hundred dollars.

For $900, you get about as much computing power as existed in the entire world when I was in college. (But where do I plug in the card punch and line printer?)

I'm agog.

-- David Schachter

Indeed. Under a thousand for more computing power than any government had two decades ago. The mind boggles.