Dr. Jerry Pournelle


Computing At Chaos Manor:
The Mailbag

Jerry Pournelle jerryp@jerrypournelle.com
www.jerrypournelle.com
Copyright 2008 Jerry E. Pournelle, Ph.D.

April 21, 2008

Continuing the discussion of Microsoft's latest antics:

Subject: Microsoft and Vista

Jerry,

It would appear that Microsoft is hoist on its own petard.

Rather than follow basic software design principles and keep Windows neat and clean, Microsoft, in an attempt to monopolize certain functions such as Web browsing, has laced application functionality and support throughout Windows like a giant bowl of spaghetti. Now it is hard to add necessary functionality such as a modern file system (NTFS seems to fit the definition of maximizing file fragmentation rather than allocating new files in the acres of empty space usually available on the average user's hard disk), or to find and fill the security holes. How hard would it be to go through the code, find all of the buffers, and add checking for buffer overflow?

Microsoft may hire very bright people. What they really need are some people who understand the basics of software architecture. There are quite a few available right here in the good old USA. However, many are over 55 and currently unemployed. They might not be up to speed on all of the latest middleware whizbangs or SQL Server or Exchange Server or whatever else is the latest and greatest, but they know the basic rules of software architecture and how to apply them to development projects. They are not afraid to use techniques that they did not invent and are willing to do research to find out if a better way has already been discovered.

In short, it is time for Microsoft to wake up or end up on the ash heap of failed enterprises.

Bob Holmes

Agreed. One of the features in User Friendly is the elderly programmer who understands.

What Microsoft ought to do is bite the bullet and reprogram the OS in a strongly typed language with range checking. Of course they won't do that, but hiring some programmers who know why that would be a good idea couldn't hurt.

It takes a long time for a dinosaur to die; indeed parts of it may be dead before the animal realizes it is mortally wounded. Microsoft is huge and rich and out trying to buy diversity. It can last a long time. And it can take a lot of those who stay the course with it.


On a related matter:

O/S's and language

Jerry,

We have both been around long enough to know where multi-tasking O/S's came from: the mainframe. Hardware was big, expensive, and precious. Multi-tasking O/S's were invented to make each user think they had their own computer. Software was relatively cheap. Portable O/S's such as Unix were developed later so that they could easily be made to run on the latest cool hardware; again, big and expensive.

Now the reverse is true. Hardware is small, very fast, and very cheap. The hardware was standardized, i.e., the PC, so that software would keep running as new hardware came out. It was found with NT that recompiling a code base to run on a new processor, such as the Alpha, was a small part of the job; regression testing, tech support, etc., were much more expensive. Hence the collapse of the hardware environment to the PC. You spend as much money as you need to get the PC that will do the job, be it a laptop or a server farm.

The O/S is still of the same form as when it was invented. We need to re-think it. It needs to be much more responsive to the outside world. If we have 1 gigabit Ethernet, then we should get close to 100 megabytes/sec of I/O. The same type of rationale applies to disk storage and so on.

However, new ideas are starting to emerge. Intel's massive multi-core mesh chips (in research only) are going to provide hundreds of cores hooked up in some kind of switching mesh. Currently they are looking at core allocation outside of Windows or other traditional O/S's: user-level allocation of resources.

We may be approaching the one cpu core one execution thread type of environment.

On languages: the Ph.D. CS types in Intel research are enamored of Java. Guess why? Garbage collection. In other words, no memory leaks. You can program as sloppily as you want and the run-time environment will save you. Now there is a good use of Moore's Law.

On the many-core chips I mentioned above: the CS types want to write the same old code they do today and have the compiler magically apportion the code out to multiple CPUs. No parallelism extensions required; just write linear code. This is from a senior Berkeley professor. And they are all stuck on C and its descendants. Remember, Java is tamed, pcode-interpreted C++.

I'm just irritated to see Apple go down the Microsoft road.

Phil Tharp

ptharp@vreelin.com

There was a time when we needed assembly languages and C which is a variety of assembly language, because our hardware wasn't up to anything better. Structured languages like Pascal (which was a teaching language not really meant for writing real programs) and Modula-2, and later Ada, took far too long to compile, and their code generators did not write optimum code when they compiled themselves. It took hand optimization to make programs written in those languages run fast.

That's all changed now. Our hardware is far better than our software, and one reason is that we are still using assembly languages. The theory of a structured language is that the compiler catches errors: it's harder to get the program to compile, but once it does, it generally does what you expect it to. It doesn't take longer to debug the program than it did to write it.

It's time we went back to structured languages, with mandatory declarations of variables, strong type checking, and range checking. I confess I am no longer current in languages; my last experiment with such was Modula-2, with a brief excursion into Oberon (another Niklaus Wirth language similar to Modula-2). I wrote a couple of small programs in an early version of Ada, but I have not followed it since.

But it is time to rethink the situation. The hardware will now support strongly typed structured languages, and we are no longer stuck with assembly-like languages unless we want to be.

Managing Editor Brian Bilbrey observes:

"There was a time when we needed assembly languages and C which is a variety of assembly language, because our hardware wasn't up to anything better."

I'd take exception to that. Assembly language is tied to a specific processor target. While C has primitives that can seem close to assembly language because of the amount of control both given to and required from the programmer, it is NOT assembly language. C compilers are required to make executable code for each processor target. While SOME inline assembly (which isn't C; it's a compiler extension) can be found in written C code for specific optimizations, modern compilers are usually better optimizers than the programmers these days. C without inlined assembler is usually portable between processor targets. That makes it NOT assembly.

True enough. I should have said that C has many of the characteristics of an assembly language. Among them is the ability to compile almost anything, including type changes that are nowhere near obvious. In some ways C is worse than an assembly language.

It is fast, though, and that was its major advantage back when our hardware was the limiting factor.


AMD vs Intel

Jerry,

While you are right about Intel over AMD at the moment, this may not be the case for desktop machines by the end of the year. AMD has had significant problems moving downward in trace width and the currently available Phenoms pay the price in slower than desirable clock speeds. If AMD is making progress with their fabs, and I have no reason to believe that they are not, we may see major clock speed increases in the near future.

AMD still has an edge with on processor memory controller. Intel is working on new designs that incorporate this.

I have always wondered how the Intel Core 2 Duo would compare with AMD X2 chips on an OS other than Windows. Windows XP, and most likely Vista, does not keep track of which processor a thread last ran on, and can schedule the next execution of a thread on a different processor from the previous one. On processors whose cores do not share the L2 cache (the AMD X2), that means reloading the L2 cache and a significant decrease in performance; on those that do share it (the Intel Core 2 Duo), it does not.

When we ask ourselves why Microsoft is so lax in important OS design areas, the only answer would seem to be that if Microsoft didn't invent it or buy it, it can't be any good. Of course, eliminating the accumulated knowledge of more than 50 years of OS development does not yield a robust OS with good performance!

Bob Holmes

I would be extremely pleased to see AMD become a serious competitor to Intel. Competition is always good for the rest of us...

And on that note

Intel's new atom processor

Hi, Jerry - so you just bought your new Q6600 processor, and you haven't even put the machine together yet. And already, that chip is obsolete.

Over the last few weeks, Intel has announced its new Atom (Nehalem) processor; and more details are now emerging.

Originally suggested as a power efficient processor for ultramobile PCs, it is now clear that Intel is intending to implement the new processor design over its entire line, from UMPC applications right on up to servers. There are a number of features that make the chip interesting. Here they are, in no particular order:

45 nm core design. In the past, smaller fabrication processes have implied speedier, more complex cores, and that seems to be a really safe bet for Atom as well. Intel recently announced an Atom 4 core server part with 781 million transistors. Golly.

Up to 8 cores per chip. Current designs are based on the original Pentium blueprints, which described a single-core processor that could be scaled to higher and higher clock speeds. The new Atom processor is designed to take advantage of increasing levels of parallel processing, as opposed to clock speed. As a result, a number of design tweaks and changes have been made to facilitate massive parallelism. (It remains my fond but fading hope that, as time progresses, Microsoft will find ways to take better advantage of parallel processing. They are doing a few things - Office 2007 takes better advantage of multiple cores than do earlier versions of Office - but I want to see improved OS performance. Of course, before I want to see improved OS performance I want to see improved OS stability, but that's another matter entirely.)

A massive boost in bandwidth, resulting from a serious redesign of the frontside bus. Current Intel processors try to mitigate the lack of bandwidth with very large predictive caches. It sorta works - the quad core Intel processors are spectacular performers - but ramping up the bandwidth to give the processors real time access to memory is a far better solution.

As good a design as the Atom appears to be, in the machines that I'm building I'm beginning to recognize that we already have more CPU cycles than the rest of the machine can adequately use. Even current CPUs are more powerful than we need; what we really need are improvements in memory speed, and in northbridge and southbridge chip speed. Those bottlenecks squeeze us more tightly with every passing day. Accordingly, I'm finding that the path to a fast machine consists of piling gobs of money into hugely fast RAM (and a motherboard that can support that RAM), and less money into the CPU. You wind up with a better cost/performance ratio.

Still, Intel continues to innovate, and they continue to produce some truly excellent designs. I'm really excited by what I've read about Atom. It will be interesting to see how this design strategy is implemented; if it is as spectacular as it appears to be on paper, it could well sound the death knell for AMD. That company just laid off 10% of its workforce, and its share price hovers around the $6 range. Not that long ago, it was up in the vicinity of $20. AMD desperately needs to become competitive; and they need to do it now.

Regards, Charlie Worton

Fast RAM rather than faster CPU. Good advice. On the other hand, while my 6600 Quad may already be obsolete, for the moment it remains Good Enough; and good enough works for me.


Outlook to Entourage

Jerry, I remember you mentioning the other day the thought of migrating your Outlook to the Mac, possibly to Entourage. That might be a worthwhile experiment, but it would definitely be stretching Entourage to its very limits and perhaps beyond. Knowing what you've written about your inbox...well, let's just say that I think your inbox is way outside the size and complexity limitations imagined by the Office Mac team.

Entourage has an overlapping, but not identical, target market and feature set with Outlook. Entourage does not descend from any Outlook code base.

That said, the MVPs for Mac Office are a resourceful group, and here is a page with a number of links to different tools and methods for migrating from Outlook to Entourage.

http://www.entourage.mvps.org/cross_platform/win_mac.html

Good luck!

Steve Setzer

I haven't tried that yet. Frankly, I find Outlook 2003 and Outlook XP to be Good Enough, and my experiments with Outlook 2007 have not been encouraging; I suspect that Outlook 2003 on a good Core 2 Quad system running XP may be about as good as anything I can have for my needs.

In my case what I need is a great deal of spam filtering coupled with the ability to process hundreds to thousands of emails a day, and enough speed to deal with a hundred or so rules that sort the incoming mail that gets past the spam filters. Whatever I use needs to be able to display previews in plain text, but to expand to HTML on demand.

I also need multiple address books, the ability to mail to each of those address books, and the ability to enter new contacts and edit details of the old ones without danger of losing anything.

Outlook 2003 with a Core 2 Duo system is almost good enough for all of that.

By almost I mean that it does all those tasks, but periodically it stops responding. Left to itself it will recover, but sometimes that recovery can take a full minute, and that's frustrating.

As to Outlook 2007, it does odd things sometimes, and I'm still trying to understand it.

If Entourage won't handle all my mail, I need to keep looking for a Mac OS application that will.

On that score:

Windows 7 not upwardly compatible with previous versions

Yay!! (Of course, the folks at MS have promised it before, but never quite made the leap.)

I don't understand why they don't just bundle in a VM (using an approach like Parallels' Coherence mode or VMware Fusion's Unity mode) that would run legacy apps.

They have their own (sort of) bare-metal VM now with Hyper-V that could be used.

If they did that, they could free themselves from the (inordinately) large mound of baggage that they are carrying around in the name of upward compatibility.

Regards,

John Harlow

John Harlow, President BravePoint jrh@bravepoint.com

Fortunately, Office 2007 and Office 2008 (for Mac OS) can still save in both .doc and .rtf formats, and of course can open those document formats. Having tried both Office 2007 and Office 2008, I am writing this on Imogene the iMac in Office 2008.


Re: April Column Part 2

Hi Jerry.

Another very useful/fun thing from iPhoto is the ability to create a photo album and get a very nicely bound printed book containing these pictures.

I did it for my son's Eagle Scout ceremony.

If you took pictures of your grandchild's birth, that might be a nice present to give.

Thanks for the column as always!

Gene Wright

I have been experimenting with iPhoto, and I have already posted an album based around my granddaughter's birth; alas, at the request of the parents, access is restricted to family, but I intend to make a more publicly available album when I get some time.

I find iPhoto a lot of fun, and a great time sink!


OneNote Functionality in Word 2008 for Mac

Recently you wondered about a One Note equivalent for the Mac. Maybe you already have it! From this link:

Hector Gomez, a GBM reader, has come up with a cool way to use Word 2008 for Mac like OneNote.

1. Open a new Word document, go to View, and choose Notebook Layout; you get a notebook style in the view. You can also choose whether you want the rings to show by clicking on Appearance in the icon above the notebook.

2. The layout gives you a view of a college-ruled page, complete with colored page tabs that you can also rename. You can enter text notes, images, digital handwriting from a stylus, and audio from your Mac's microphone.

3. As you enter notes, you can also rearrange them by clicking and dragging to place at different areas of your notebook, or move them between tabbed sheets. When you reach the bottom of a notebook page, Word instantly lengthens the page for you.

4. To record a voice note, click on the Audio icon on the toolbar, then simply hit Record.

5. One thing, though: the audio note doesn't show unless you move your mouse to the left side of the notebook, where a speaker icon pops out. This tells you that there is a voice note there, and you can choose to play the audio.

6. If you look very carefully you'll notice a break in the line of the notebook, which is where the audio note is. You can also add a quick text label to identify your recording on the page, and you can move it around the page. So far I don't see a highlighter to mark notes or text, but I'm looking to see if there is one.

Great work, Hector!


Marty Winston observes:

Regarding the April Column Part 2

Interestingly, I've recently discovered the same kind of escalation of productive usability on my BlackBerry that you're reporting on your iPhone. Since I've been reviewing alternatives as I explore new categories, I have (unnecessary for most people) multiple competitors in some categories. Among other things, I now have:

* Several mapping applications
* Several speaking turn-by-turn navigation applications
* A viewer for e-mail attachments in PDF or Office formats
* A Sudoku game
* TV Guide (configured to watch for new episodes of my favorites)
* 2 "push" weather applications
* Speech response Web lookup (Yahoo! oneSearch) > AskMeNow
* A Bluetooth printer driver (for a battery printer that's not quite out yet)

I'm also bullying several industry sources into providing (with luck, later this year):

* Push warnings for storms, Amber Alerts, etc.
* Improved interactivity (mostly via sync) between on-Web and in-phone resources (starting with navigation and mapping).
* More cooperation (mostly sharing hooks) so it's easier to extend embedded speech command applications, for example, to control more third-party applications

These pursuits also led to some discoveries:

* While my battery used to last days, cranking hard on data services & Bluetooth can drop that to hours; I carry multiple rechargers when I travel and plug into the dashboard when I drive, but I'm also about to investigate beefier batteries.

* Third-party developers are complaining about the processing horsepower in current models; I suspect that means we'll see some bigger handset horses by late next year - with luck, maybe enough to do some limited multitasking

* I think we'll see batteries going to lithium polymer for energy density but at a higher cost

* More in-phone cameras are heading for the 2-3 Megapixel zone & by later this year, more of the new phones will have externally accessible (thereby swappable) Micro SD card slots - current runs of which top out around 8GB, but I think we'll see 16GB this year and 32GB next year. I have some classic black & white half-hour TV episodes that I rip from DVD to watch on my phone that take about 100K each, so that's some serious storage even if nobody ever gets around to grooming it.

* I just saw a demo that lets any cell phone with a still camera & a data plan send broadcast-compatible live video with 2-way audio to a TV station or network - consumer, business and social networking applications are also in the works for it

* The latest cell industry buzz term is femtocell, which places a small piece of hardware at a site where phone coverage is needed & ties it back to the carrier over an IP connection - the hardware becomes a tiny new tower on the carrier's network for as long as it's running

And, if I may speculate:

* I think we'll see clever new applications that tie things together, like using the camera to capture business cards into the address book or OCR a URL within a photo to allow an immediate link to it, or using the camera with fairly primitive facial recognition as a biometric unlocking facility

* I think we'll be able to get a lot more done while driving by being able to command the phone by voice & have it respond to us by voice.

* I think the phone will get to be more and more like Della Street - change your plans and it will change your flight and hotel bookings, forget a document and it will fetch it from your desktop, dictate to it and it will draft a document

Of course that's BlackBerry stuff - you can't get that to happen on an iPhone without touching the screen - and you never saw Perry Mason touching Della.

Marty Winston

For all that, I find my iPhone a lot of fun. I do suspect that my model will be obsolete by next fall. Alas.


Linux v Windows

Dear Dr. Pournelle,

I wanted to share my experience with Ubuntu Linux 8.04 beta in contrast to Windows.

Originally it was installed on an ASUS K8N-E motherboard with an AMD Athlon 64 processor and ATI Radeon 9200 video card.

I had to move the system to a different one that uses an ECS K7VTA3 motherboard (also with an ATI 9200 video card) but an AMD Sempron 32-bit processor.

All I did was move the hard drive from one system to the other. And it worked right from the first boot up.

Now there is no way a Windows system would let me do that, at least without making a phone call to Redmond. And even then, would it allow me to load the new drivers that Windows would require?

I'm impressed with Linux!

Hoping your health continues to improve,

Tom Slater


Regarding Braxton Cook's Linux rant in April 1 Mail Bag

First off, I wish you well with your health issues. I'm close to your age, and have a few of those myself. What doesn't kill us makes us stronger. (I don't buy it either, but if it's true just once ... I hope it is true for you.)

Mr. Cook made a fundamental mistake in his approach to solving his problems with his Linux install.

1. When his network did not work, he assumed that the driver needed to be updated. This is Windows-think at work. If he is installing a current distribution with a recent kernel, and is using a network card that was recognized by the install routine, the correct module is installed in the kernel, and he is looking at a configuration problem. Further, attempting to grab later code, compile it and patch it into the kernel probably won't fix the problem, especially if it is a configuration problem. In fact, there is a high probability that the new code he is fetching is identical to the code he is replacing. Unless he knows what he is doing, there is a good chance he will make the situation worse, by an order of magnitude.

Generally, you want to grab updates from the distribution's repositories, and install them with the utilities provided. It's been 4 years since I have had to compile a kernel.

2. Pick a distribution that meets the needs of your application, and stick with it until you learn it. SuSE is different from Ubuntu, which is different from Red Hat, and so on. At the core they are all the same, but they use different installation routines, different management utilities, and different package management from each other. They tend to stick files in different locations. I prefer the Debian derivatives, like Ubuntu, but that's just me.

3. The video problems with his laptop stem from the proprietary nature of the video adapter used in that particular HP model. Since he was trying to configure it with SaX2, he was using SuSE, which does not configure ATI proprietary drivers automatically. I haven't used SuSE for a while, so I can't comment on how they handle this, but both ATI and Nvidia provide binary blobs with instructions for installation, when all else fails. They really do need to open up.

The main thing to remember is that Linux is not Windows, and to use the community support resources. The Ubuntu and CentOS community forums are awesome.

Rick Spencer


VMware Fusion - Unity mode

VMware Fusion has a mode called "Unity" which is similar to the Coherence mode in Parallels, mentioned a couple of weeks ago by "Tim of Angle" in Chaos Manor Reviews.

"Unity" essentially hides the Windows desktop and interleaves Windows applications with Mac ones more or less seamlessly.

I personally find it distracting--I like to see a Windows desktop behind my Windows applications, and a Mac desktop behind my Mac apps. But many people like the Unity mode.

Steve Setzer

Thanks for the warning.

MacBook/Parallels/VMware Fusion

Hi, Jerry -

To recap, I have been using Windows XP Pro under Parallels to HotSync my Palm Tungsten T3 using its USB cradle. I have a lot of data in a suite of Palm applications called SplashWallet, published by SplashData. Their Mac desktop software is a version behind the Windows version, so I need Windows for now.

At first, I thought the current build 5584 of Parallels would do the job, but it has been lacking in speed and stability. The HotSync process often quit and had to be started again. But recently, more than once, Parallels itself quit during a HotSync. I didn't lose any data from the Splash applications, but I did lose a bit of data from Palm's own Note and Memo applications.

So, I've switched to VMware Fusion. After three weeks of daily use, Fusion has proven to be trouble-free. In Fusion, Windows is much faster overall, and the combination has been rock solid.

My MacBook is 15 months old and maxes out at 2 GB of RAM. I'm running Leopard and Windows XP Pro, both fully updated. When Parallels is running, the swap file grows to several hundred MB and it is used a lot. Fusion seems to be much more efficient in its use of RAM, and has been content with a swap file of a bit over 3 MB, or none at all if no other Mac programs but Activity Monitor are running. Also, Parallels tech support told me they knew their USB access had some issues but that it would be fixed in build 5584. Maybe they didn't fix it as well as they thought.

Perhaps Parallels shines on systems with more RAM, but for me, VMware Fusion is the hands-down choice.

Bill

I continue to experiment with VMware Fusion. I'll have a lot more to say on that after some more experience.


We close with a pair of complaints:

"Friendly" or dangerous operating system features

Dear Dr. Pournelle:

On April 13th, you wrote (in View, www.jerrypournelle.com) about Macs: "Why do they think they are smarter than I am and know better what I want?" That really hit home. It was exactly why I whacked that one-button mouse through a wall one day in 1998, and swore off Macs forever.

Unfortunately, since my finding the joys of Windows NT, Microsoft has tried more and more to imitate the Mac's style of operation. I think that here lies the heart of all my problems with Vista. Like the Mac, it knows better than I do how to run my life. Like the Mac, Vista sets defaults, and hides the methods for changing them, to suit the way it believes I should want to work. Unlike the Mac, and just like earlier versions of Windows, it commonly gets things wrong, and usually fails to find a courteous way of telling me what it wants me to do. On the other hand, from your comments about OS X, it seems that the Mac is speeding toward convergence, with similarly incomplete or uncommunicative messages.

All of this brings me to a much more serious concern: virtual data organization. Vista has added virtual folders to its list of tricks for hiding reality from the user. Beyond music playlists and libraries and virtual picture albums, it is now possible for a user to assemble entire virtual structures of folders and sub-folders full of files of any type, seen and used on the desktop just like any other, "real" files and folders. These files become conveniently available in one place to a desktop user without necessarily being actually in place anywhere on their computer.

Why should this worry me? It does because increasingly ignorant users are using an increasing number of removable media types: CD-R, DVD-R, external USB drives, flash drives, and so on, not to mention networked and online storage and file sharing sites. At the same time, Windows and OS X are working harder to hide the real, physical nature of the disk operating system's actions from the user. I really wonder how many real-world users know where (on which drive, at which location) the data they are accessing actually resides. Perhaps this most often applies to non-technical users and non-critical data right now, but for how long? How soon before important, valuable data is lost or stolen because the user simply didn't understand how his computer, its operating system, network, and the Internet work?

Machines and machine-makers that think they are smarter than you are, and know better what you want, are not just insulting to the knowledgeable. They are putting power into the hands of the ignorant, and so into the hands of those who seek to control them and misuse that power!

Alun Whittaker


And Bo Leuf adds

PC-MAC network madness

Although I would in principle tend to agree with Robert's view that the problem you had with broken networking was caused by WINS and the master browser, it is not as simple as he says; it is not a purely Windows issue. You are running Samba on the Unix side to interface with the Windows-based LAN, and my experience is that the interactions between Samba and Windows networks can get very messy for obscure reasons. And problems go away again for equally obscure ones. Differing timing tolerances of signals are one known problem, as is confusion among participating systems about which is the master browser.

Regardless, a full restart of all machines seems under the circumstances to have been a sensible thing, in order to let Samba and the master browsers fight it out all over again for precedence. My guess is that the Mac's Samba and at least one of the Windows systems are both configured to be the master, perhaps mismatched in other parameters as well, and you will get periodic confusion and disconnects as a result.

I run a mixed Windows and Linux LAN with on the order of ten machines at any given time, plus a couple of VMware virtual systems as well. It frequently happens that my son, who is mostly in Windows, has problems accessing or even seeing the other systems on the network. Unreasonable delays in finding and connecting are also common. NFS and other Linux protocols (such as LISa) by contrast just work, all the time; though of course they do not include the Windows systems.

/ Bo

My next step is likely to be simplifying my network, getting rid of as much as I can. It has been suggested that I get a Mac Mini and use that as the network server. I am considering that option, but it hardly seems a simplification...


And that should do for this week....