Computing At Chaos Manor:
The Mailbag

Mailbag for March 5, 2007
Jerry Pournelle jerryp@jerrypournelle.com
www.jerrypournelle.com
Copyright 2007 Jerry E. Pournelle, Ph.D.

March 5, 2007

First a reader inquiry:

Dear Jerry:

Would you do me a favor and ask your Linux gurus if any of the current distros would run well on an elderly IBM ThinkPad with only 128MB system RAM? No big tasks for the machine, just wondering if it could be made into a "poke around" Linux-learning box for me. I'd really appreciate it.

Thanks!

Tim

PS: Been using Vista on a brand new HP compact desktop system for the past two weeks (so I might actually know whereof I speak) and my review: it's like a lady "of a certain age" going for cosmetic surgery for nips, tucks, implants and augmentations, and general lifts and buffs: she definitely looks better when she leaves, and has a little bounce in her step because of it, but underneath it's the same tired old broad. At $29 Vista would have been a good value; allowing for MS's massive greed, $49 would even have been okay for such a face lift. At current pricing for the Home Premium edition you're almost halfway to a Mac Mini, which will not only look better but work much better as well. In my never humble opinion...

I put the question to my crackerjack advisors. Managing Editor Brian Bilbrey says

http://www.xubuntu.org/ would be my first choice in old-hardware distros.

Minimum system requirements

To run the Desktop CD (LiveCD + Install CD), you need 128 MB RAM to run or 192 MB RAM to install. The Alternate Install CD only requires you to have 64 MB RAM.

To install Xubuntu, you need 1.5 GB of free space on your hard disk.

Once installed, Xubuntu can run with 64 MB RAM, but it is strongly recommended to use at least 128 MB RAM.

No promises, of course. And it shouldn't be hard to upgrade the RAM in an old ThinkPad, buying a big boost in performance (by reducing swapping to disk) for a small investment.

.brian
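
If you want a quick sanity check before burning an ISO, a Linux live session will tell you how much memory and swap it sees. Here is a minimal Python sketch that reads /proc/meminfo and compares the result against the Xubuntu figures Brian quotes above; the thresholds come straight from that quote, and the script itself is only an illustration, not something we have run on Tim's ThinkPad.

    #!/usr/bin/env python
    # Minimal sketch: report the RAM and swap a Linux live session sees,
    # and compare against the Xubuntu figures quoted above (128 MB to run
    # the Desktop CD, 192 MB to install from it, 64 MB for the Alternate CD).
    # Illustrative only; thresholds are from the Xubuntu notes above.

    def meminfo_mb(field):
        """Return the named /proc/meminfo field in megabytes."""
        with open("/proc/meminfo") as f:
            for line in f:
                if line.startswith(field + ":"):
                    return int(line.split()[1]) // 1024  # values are in kB
        return 0

    ram = meminfo_mb("MemTotal")
    swap = meminfo_mb("SwapTotal")

    print("RAM: %d MB, swap: %d MB" % (ram, swap))
    if ram >= 192:
        print("Enough RAM to install from the Desktop (Live) CD.")
    elif ram >= 64:
        print("Use the Alternate install CD; the LiveCD installer wants 192 MB.")
    else:
        print("Below even the Alternate CD's 64 MB minimum.")

(The free -m command gives the same numbers with no scripting at all.)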

A brief exchange between Rick Hellewell and Brian Bilbrey:

I seem to recall that an Ubuntu version can be run directly from a CD (assuming your system can boot from a CD). ... and doesn't it allow access to your hard disk for storing files? A way to try out Linux without partitioning, dual boot, etc.

...Rick

* * * *

Yes, but installing from the Xubuntu LiveCD calls for 192M of RAM. Installing from the "alternate" CD only wants 64M.

.brian

Chaos Manor Reviews isn't a tech support site — we just don't have the resources — but we will try to answer questions of general interest to the readers.


Continuing to watch Vista:

Subject: Vista upgrade -

Dr. Pournelle,

I recently upgraded Windows XP Pro to Vista (an upgrade, not a clean install). You asked your readers to respond to you with their experience.

I did have problems — the install initially just would not work, and I tried it several times. After thinking about it some, I tried an install with everything unplugged except the monitor and Ethernet (ESPECIALLY all the USB peripherals), and it went smooth as silk. The key here is apparently to unplug ALL peripherals.

Hope this helps your readers,

John F. Gothard, Ph.D.


Subject: Clean Installation of Upgrade Vista

Jerry

Here's what I did:

Purchased an OEM copy of Vista Home Premium, and a new hard drive.

Installed the hard drive, and formatted it in NTFS.

Installed Vista on new hard drive dual booting with XP.

Both can see the other's hard drive. It works fine.

Mike Boyle

I keep hearing horror stories, but none were sent to me; apparently, for those with reasonable computer savvy, for the most part it all works pretty well. Alex has a report on his experiences in the regular column.

Let me emphasize again that I do not really recommend upgrading to Vista; if your XP system works well, there isn't all that much advantage to upgrading. It won't hurt you to wait a while.

Here's one more story, commenting on my reports on implementing wireless networking in Vista:

Jerry,

I got Vista to connect to our wireless network in about ten seconds. I suspect that the IBM program on the ThinkPads is taking over the wireless connect function so completely that Vista cannot do it anymore. I have learned since XP that you should use either the built-in wireless functions of Windows (which I recommend) or a third-party app, but not both. Vista is probably less tolerant of mixing them than XP was. I bet that if that IBM program were uninstalled, Vista would connect fine.

Dean Peters

I agree entirely: in XP, you are far better off forgetting that IBM/Lenovo laptops have wireless connect and management software. Use the built-in XP Windows wireless functions. They have always worked for me.

With Vista it was just the opposite: I found the Vista wireless connection software deficient, while the Lenovo wireless connection software worked just fine the first time.


Subject: Unices and frustration

Jerry,

I finally have Sun Solaris 10 up and running under Parallels Desktop on my MacBook (I wanted to use Sun's Fortran compiler; I can feel you shudder at the thought), and it has made me realize the sheer amount of effort that Apple has put into making Unix usable.

Configuring Solaris is arcane in the extreme -- even finding out how to add a non-root user involved considerable searching on the Internet and required the use of a command-line interface; adding another user in OS X is trivial by comparison (System Preferences --> Accounts). Don't even ask me how I managed to persuade it that my monitor was not some monstrous size compared to reality (the install process picked a nice 1024 x 768, but for some reason the installed version of Solaris decided that my display was much bigger than it actually was, and the GUI method of changing the resolution simply didn't work). This whole experience has left me in no doubt about why the Sun and Apple share prices are the way they are.

And as an aside, installing and configuring Windows 2000 in Parallels is child's play in comparison (and I would say a lot faster, except for the time taken to install all the updates). Of course, I also have a nice Unix-like command-line environment in Windows using Cygwin (so that I can produce Windows executables of my programs for co-workers).

Best regards,

Alun

Dr Alun J. Carr
School of Electrical, Electronic, and Mechanical Engineering
University College, Dublin

Thanks. I recall in the 1980's there were computer users including BYTE readers who just couldn't wait to get "Real UNIX" on their desktops. AT&T even sent me a neat system that was part PC and part UNIX with some minimum communications between its two personalities. It was fast and powerful, but I found that learning UNIX required more time and attention than I had to devote to it. I concluded that UNIX was the full employment act for UNIX Gurus and Wizards, and it would always remain that way.

Over the years people kept sending me UNIX shells that would make it usable, and they did help. UNIX has some interesting features, particularly the whole concept of pipes, in which the output of one application is fed as input to another application; but DOS and Windows programs weren't written to take advantage of that.
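
For readers who never met the idea, a pipe simply means that one program's standard output becomes the next program's standard input; "ps aux | grep python" is the classic shell form. The same plumbing can be set up explicitly from a scripting language. The sketch below does it with Python's subprocess module; the two commands are chosen only as an example, and any pair of filters works the same way.

    # Illustrative sketch of a UNIX pipe: feed the output of one program
    # into another, the way "ps aux | grep python" would in a shell.
    import subprocess

    producer = subprocess.Popen(["ps", "aux"], stdout=subprocess.PIPE)
    consumer = subprocess.Popen(["grep", "python"],
                                stdin=producer.stdout,
                                stdout=subprocess.PIPE)
    producer.stdout.close()  # let the producer see SIGPIPE if grep exits early

    output, _ = consumer.communicate()
    print(output.decode())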

Now that Apple has built a UNIX shell — Mac OS X — that really works, I suspect that many applications will be written to take advantage of the underlying UNIX; with luck they will keep UNIX itself invisible.

Regarding FORTRAN, I wrote several complex models (both Monte Carlo and expected value) of thermonuclear war in FORTRAN during the 1960's. FORTRAN plus the RATFOR (Rational Fortran) pre-compiler, which catches a number of common syntax and type errors, allows reasonably structured programs, and they're not impossible to understand. I rather liked FORTRAN plus RATFOR.
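
The difference between the two kinds of model is easy to show in miniature: an expected-value model multiplies the probabilities through to a single number, while a Monte Carlo model rolls the dice many times and looks at the spread. The toy Python sketch below makes the contrast; the numbers in it are invented purely for illustration and have nothing to do with the original FORTRAN models.

    # Toy contrast between an expected-value model and a Monte Carlo model.
    # Parameters are invented for illustration only.
    import random

    N_WEAPONS = 100     # weapons launched (hypothetical)
    P_KILL = 0.6        # probability any one weapon destroys its target

    # Expected-value model: a single deterministic number.
    expected_kills = N_WEAPONS * P_KILL
    print("Expected-value model: %.1f targets destroyed" % expected_kills)

    # Monte Carlo model: simulate many trials and look at the spread.
    trials = 10000
    results = []
    for _ in range(trials):
        kills = sum(1 for _ in range(N_WEAPONS) if random.random() < P_KILL)
        results.append(kills)

    mean = sum(results) / float(trials)
    low, high = min(results), max(results)
    print("Monte Carlo model: mean %.1f, range %d..%d over %d trials"
          % (mean, low, high, trials))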

I haven't looked into this in some time, but last time I checked with supercomputer users, they were writing programs of half a million lines of FORTRAN...


The discussion on Net Neutrality is nowhere near over. I make no doubt that it will be featured in several columns before the year is over. The following probably won't make sense without referring to the discussion last month, in which I said

"What this agreement apparently says is that AT&T must provide you, with your home based web server, bandwidth of the same quality as it provides a company that wants to sell movies on demand. That may or may not make economic sense. I am sure the cable companies will cheer."

Associate Editor Eric Pobirs replied

There is no way AT&T agreed to that. ISPs have a longstanding division between the service levels given to businesses vs. consumers. Consumer lines are 'best effort' but lacking in any guarantees compared to business-class lines that carry more assurance for a greater cost.

I really wish people would stop trotting out small, densely populated countries as the pinnacle of broadband aspirations. Just bringing LA County up to the same spec as Indiana-sized South Korea is non-trivial in the extreme but the fiber rollouts are progressing.

-- Eric

Eric is of course right about the longstanding divisions in service levels, but the "net neutrality" agreement that AT&T (temporarily) must abide by as a condition for acquiring the last of the Baby Bells certainly reads as if they had agreed to give those divisions up for now.

My real point is that most people who are for Net Neutrality generally have no idea of how to write a meaningful law or regulation that would implement it.


One continuing discussion is whether the well-known security problems with Microsoft are largely the result of the greater popularity of Windows — hackers go where the ducks are — or of inherent vulnerabilities that Microsoft will never fix; and whether other operating systems such as Linux are inherently safer. I've expressed my views often enough, but I have to plead lack of experience with details.

Chaos Manor Associate Dan Spisak says

There are tons of exploits against Linux! When I was working at and running an ISP, I had the displeasure of finding hackers who had broken into Linux-based systems thanks to a buffer overflow in some piece of software that had been running on the server.

Are there exploits for the Linux kernel itself? Sure, but if I scan through my Bugtraq folder it becomes abundantly clear that Linux's Achilles heel comes in the form of all of the FSF and OSS programs that tend to make up your average Linux distribution. There are so many possible attack vectors on a Linux box that it is nearly as vulnerable to attack as a Windows box. The only perceived advantage for Linux right now is that I am not aware of any malware that runs on Linux or gets installed through unsafe browsing practices, because the money is in hitting Windows users. However, various nefarious groups do like to target servers online to create hidden bases of operations for transferring files of various illegal natures.

This is a big reason why I run OpenBSD on my personal server. People may not like Theo de Raadt and may think he is socially insensitive, but the facts speak for themselves: OpenBSD has not shipped with a remote exploit in over 10 years. They backport all kinds of security patches into various parts of the OSS/FSF software they let run on OpenBSD, while also putting cutting-edge security features into the OS kernel itself to better protect against attacks.

Seriously, if you don't subscribe to Bugtraq, you should. It would truly open your eyes to just how many security-damaging bugs are out there. Yes, there are lots for Windows, but there are almost as many, if not more, for Linux and its OSS software. While Linux tends to patch its bugs faster than Windows, keeping up with all of the patches can be harder on Linux than on Windows, depending on which distribution one is running.

Dan S.

Alex Pournelle asked:

Dan:

Stupid question: Are all these add-ons items which people need to run a Linux server, or just for desktop use? This would be an important distinction to make, just as a fair comparison of Windows server means you discuss the necessary add-ins to make it useful. MS does have a security analyzer (licensed from Shavlik) to look at what you might want to be running, and (particularly in Win2K3 server) add-ins get installed OFF rather than ON by default.

In the past, the biggest server vulnerability was IIS (Internet Information Server, Microsoft's web server application), which was the biggest source of server-side risk for a long time running. From my not very informed perch, IIS has gotten a lot better, at least in security, though last I heard Apache still holds the largest share of the server market (and has, does it not, pretty good Windows implementations?) across the spectrum.

Thanks,

Alex

And got this reply:

If it ships with a distribution in a default installation, then it's a likely attack target. The fact of the matter is, on Linux or any UNIX OS, installing a program really comes down to this:

You compile various code libraries for feature/functionality support for an application

You compile the application

You install the application and its needed libraries.

Some libraries are fundamental to many or all OSS/FSF programs. The GNU C Library (aka glibc), for example, is used nearly everywhere, and it tends to get updated on a somewhat regular basis for one reason or another. If I scan through Bugtraq right now I see these headlines:

[ GLSA 200702-06 ] BIND: Denial of Service
Re: Solaris telnet vulnerability - how many on your network?
[ GLSA 200702-07 ] Sun JDK/JRE: Execution of arbitrary code
[ GLSA 200702-08 ] AMD64 x86 emulation Sun's J2SE Development Kit: Multiple vulnerabilities
mAlbum v0.3 admin by default user/pass
Firefox: about:blank is phisher's best friend
Re: Apache Multiple Injection Vulnerabilities
[SECURITY] [DSA 1261-1] New PostgreSQL packages fix several vulnerabilities
[USN-422-1] ImageMagick vulnerabilities

Now, looking at the above list of email headers from that group is educational for a number of reasons. First, it's worth noting that we have security announcements from different Linux distributions (USN is Ubuntu Security Notice, GLSA is Gentoo Linux Security Announcement), sometimes for the same vulnerability in a particular piece of code. Even worse, sometimes one distribution is not vulnerable while others are, because that distribution has modified the stock source code for a library for some reason or other. We can also see that there are still exploits being found for PostgreSQL and ImageMagick, packages that are commonly used on servers. For example, on my server, my photo gallery software, Gallery 2, uses ImageMagick to modify JPEGs and other image formats and stores various information in a MySQL database.

Obviously, your average webserver doing anything interesting is running some form of LAMP: Linux, Apache, MySQL, and PHP. Those four things present a huge surface of potential attack vectors, through either direct application bugs and insecurity or underlying support library bugs. It is possible to secure these systems, but it's not the easiest thing to do, and depending on what kind of website someone is running, they may have to think very hard about how to secure all of their potential incoming vectors from attack.

-Dan S.

All of which tells me that while some operating systems are more vulnerable than others, this is due to more than one cause. The fact is that we are none of us inherently safe no matter what we use. I do remind everyone that most worms and viruses are installed by users themselves through opening mail attachments, or saying "Yes" and allowing malware installations to run. Social engineering infects far more computers than any other exploit.


On Tax accounting

Hi Jerry,

Most of the larger accounting firms utilize tax preparation software from RIA or CCH. In both cases the software and data are stored at the vendor's site, with redundant offsite storage.

Actually, this is really a return to the way that professional computerized tax preparation started out. Years ago (the early 60's, I believe), forward-looking accountants who prepared tax returns by computer would fill out input sheets that would then be sent to a service center for input. Several days later a tax return would be delivered to the accountant's office. While the accountant retained a paper copy of his client's return in his files, the computerized data from the input sheets was stored offsite and was used to enter recurring information on the following year's input sheets.

Later, we had the ability to rent dumb terminals that were hooked up to high speed (read 4800 baud) lines so that corrections could be made directly without the resubmission of input sheets. Again, the data was stored offsite in the vendor's computers.

Thanks to the PC revolution, the software itself could be licensed from the vendor and installed on our own network. Returns could be printed in-house. All data was retained in-house for the first time since the 1960's. This lasted about 5 years, until web-based software took over, with the data again being stored on the vendor's computers. A main advantage of web-based software is the ability of the vendor to fix glitches immediately and uniformly for all customers. With the CD licensing model the accountant would have to wait for the next release to fix bugs (there were usually about 9 or 10 releases between January 1 and April 15).

My firm is in the process of going "paperless" so that even our copy of our client's return is stored on our hard drives, as is the underlying financial data that is the basis of the entries on the returns. Our clients receive their copies of their returns on CD's and they love it. They thank us since the many years of paper returns were taking up needed space.

I do not know if any of this makes you any more comfortable with having your data offsite, but I can assure you that my (and my partners') livelihood would be injured if confidential data were lost or revealed. In the many (30+) years that my firm has used offsite tax preparation software, not one file has ever been lost, nor has any client data ever been misappropriated.

Best, Mark

I believe you. Nevertheless, I continue to use TurboTax and to keep my tax records at home.


And a few cleanup items that don't need comment:

Subject: today's mailbag

Hi Jerry,

I saw Ron Morse's comments on my note of last week on VMs. What he says is very true, can't argue it, IF you want to save that precise VM. The glory of VMs for me is that I don't have to bother. Keep the data elsewhere and just blow away the broken copy and reload. A matter of a few minutes at most. (Caveat emptor, I am not building giant Windows edifices, but rather just need to run MS Office, Fog Creek Copilot for remote admin of family and friends' machines and the like.) Avast updates itself and MS Update takes care of the rest, should I even care.

Richard


Subject: help on my website

Jerry,

Thanks for the help on my website. As one of your readers suggested, I got a copy of the CSS Missing Manual.

Thanks for reminding me of topic sentences! As you say, we mostly need reminding.

Phil


On Partitions:

Jerry,

I find myself in complete agreement with your response to Armando Molina's query on partitions. (19th Feb 2007)

Of course, in the old days, partitions were necessary to overcome size limitations imposed by the operating system, and there was also some benefit in keeping partitions small enough to minimize cluster size. Not any more.

Mostly, I find that partitioned hard drives scream: "Amateur at Work". I rarely see partitioning used to good advantage.

For as long as I can remember, most of my computers have had two physical hard drives, either for RAID or for manual data backup, and I ignored the various generations of removable media until DVD burners became available at a reasonable price.

Incidentally, Windows Vista works well for backing up an image to an internal HD.

Regards,

Rob Megarrity