Computing At Chaos Manor:
The Mailbag

Mailbag for May 14, 2007
Jerry Pournelle jerryp@jerrypournelle.com
www.jerrypournelle.com
Copyright 2007 Jerry E. Pournelle, Ph.D.

May 14, 2007

As usual, there is a great deal of mail, including a lively discussion on computer languages.

We begin with a warning from Chaos Manor Advisor and security expert Rick Hellewell that came in just too late for last week's mailbag:

Dr. Pournelle:

Today's (May 4, 2007) Wall Street Journal has a front-page article on the data theft (over 45 million credit cards; perhaps as many as 200 million cards) at TJMaxx.

The recipe includes using a 'cantenna' (a wi-fi antenna made out of a Pringles can) to snoop on a WEP-encrypted wireless network at a store in St. Paul, MN (US) that carried cash register data. Sniff the wi-fi traffic, break the WEP password (easily done), then watch employees log in unencrypted. Capture those user names and passwords, and use them to get into the store's (and eventually the corporate) computer network. Install backdoors into the systems to allow remote access, then install programs that copy credit card data, putting it in encrypted files to grab at your leisure.

The result: losses of millions of credit card numbers over more than two years, and a total cost estimated at US$1 billion over five years to fix the problem.

More detailed info in today's entry at my place. And some good identity theft protection information at the California Dept. of Consumer Affairs site here.

Businesses need to follow the "Cardholder Information Security Program" (CISP) guidelines to protect their customers' credit card numbers. Start here for info: VISA link.

Regards, Rick Hellewell

Peter Glaskowsky comments:

That's awful. Anyone using WEP to protect financial data since 2001 (when WEP was shown to be insecure) should be held criminally liable under some "duty to protect" law. (Is there such a law?)

. png

I don't know of any such law, but I agree there ought to be. But it gets worse:

Not only were they using WEP, but they were passing unencrypted data (logins, credit cards, etc.) over their network.

You'd be surprised (perhaps not) at how many businesses are not encrypting credit card data, even though the credit card companies have been trying (perhaps not very hard) to enforce their Cardholder Information Security Program (CISP).

Just about all the data thefts you hear about (and probably all those you don't hear about) happen because credit card info was stored unencrypted. Some businesses even store the '2nd stripe', which has the full info needed to clone the card.

...Rick Hellewell....
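The encryption Rick is asking for is not exotic or expensive. Here is a minimal sketch in Python using the modern cryptography package; the choice of library and the sample card number are my illustrative assumptions, not anything from the CISP documents:

    # Encrypting a card number at rest with an authenticated
    # symmetric cipher (Fernet, from the "cryptography" package).
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # in practice, keep this in a key vault,
    cipher = Fernet(key)          # never alongside the data it protects

    token = cipher.encrypt(b"4111111111111111")   # ciphertext, safe to store
    print(cipher.decrypt(token))                  # b'4111111111111111'

The hard part is key management, not the cipher: a key stored next to the data it protects is no protection at all.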

Peter Glaskowsky recalls the bad old days, which may or may not be ended:

I remember being surprised back in the 1980s when I bought a damaged credit-card processing telephone at a ham radio swapmeet. The card reader and its RS-232 output were still intact, so I was able to swipe cards through and see what came out. Lo and behold, my ATM card had my PIN on it! It wasn't even obscured-- it was just sitting there in plaintext. I used to demonstrate the technique to friends, who would often deny it was possible-- sometimes adamantly-- until I read off their PINs.

This problem has been fixed in the meantime, of course, but I was just flabbergasted to think that anyone who stole my card would get the PIN automatically.

Since then, I have been fairly difficult to surprise with this sort of thing.

. png

The point being that one should never be surprised by such laxities. But using WEP in these times is nearly criminally negligent. If your router doesn't support something better than WEP for your home or small office wireless network, go buy a more modern one. They aren't expensive.

But the discussion wasn't ended. Rich Heimlich volunteered:

Hey, this got me thinking about wireless protection in general, at your home. For example, my father-in-law just set up his broadband and was completely fine and then I made the mistake of asking him what he did about his wireless setup. Oops. I was mentioning wireless because he's thinking of adding a NAS device for music serving. He said, "You mean my wireless network is on by default?" He was pretty shocked when I didn't seem worried about his having an open wireless setup. I tried to use common sense here and say, "Look, how many people do you think drive around neighborhoods looking for access points?"

I'm pretty much someone who doesn't buy into the fear and consumption paradigm that most of the US buys into. My parents have an elaborate alarm system that they paid a fortune for, and they got robbed. I've never had one and I've never been robbed. If I do get robbed, we have insurance. I'm also totally sold on the idea that 99% of people will never need anti-virus programs, and that Symantec and McAfee spend a fortune convincing everyone they do need them. In all the years I've been helping people with setups where they swore they had a virus, it turns out nearly none of them actually did. They just believed they had one, given the fear put out there. Furthermore, in most of these households I've found products from those two companies to be the prime reason the PCs they own are running poorly. I uninstall all of that, put them on something like NOD32, and they couldn't be happier.

Anyway, on my setup, I use MAC filtering as my only line of defense. I've kept an eye on my logs here and there and I've yet to see anyone try to access my wireless network from outside my home. WEP and the others just seemed like silly overkill for home protection. MAC filtering seemed much better. Yes, addresses can be spoofed, but good luck guessing which ones I've allowed.

Thoughts?

Rich Heimlich

I was about to reply, but Peter Glaskowsky did that first:

I lock my house, and I leave lights on, and I have security cameras so I can see who's coming to the door. The house doesn't even look like much from the outside. A good professional burglar would have no trouble getting past all that, but I don't have anything a good professional burglar would want. The amateurs are going to see the lights and cameras and go somewhere else.

My MacBook Pro uses FileVault (home directory encryption) and strong passwords for both accounts, and requires a password to wake up or exit the screen saver, and even uses secure virtual memory. My home wireless network uses WPA2 and MAC filtering, and I don't leave any computers on when I'm away.

So I guess I do buy into that stuff.

After all, if the network isn't encrypted, nobody has to guess; they just have to sniff the packets and record the authorized MAC addresses.

Would anyone do that? Well, probably not. But how long would it take to set up WPA encryption? Why not do that? It's another layer of security for almost no effort.

. png
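Peter's last point is worth making concrete: the "approved" MAC addresses travel in the clear in every frame, so a passive listener needs no guesswork at all. A sketch in Python with the scapy library, assuming a wireless card already in monitor mode (the interface name is my invention, and this needs root to run):

    # Print each transmitter MAC address overheard on the air --
    # the very addresses a MAC filter treats as secrets.
    from scapy.all import sniff, Dot11

    seen = set()

    def log_mac(pkt):
        if pkt.haslayer(Dot11) and pkt.addr2 and pkt.addr2 not in seen:
            seen.add(pkt.addr2)
            print(pkt.addr2)

    sniff(iface="wlan0mon", prn=log_mac, store=False)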

I grew up in an era when no one locked their doors: I don't think we even had a house key. But that was a long time ago. Now we have dead bolts and motion detectors, and I still make sure my house is occupied when I go on trips. And Dan Spisak, who generally attends DEFCON and other such events, adds:

I pretty much only use WPA-Personal or WPA2-Personal because:

a. It lets me input a *real password*, not some passkey or 13/26-digit alphanumeric string, and I don't have to wonder whether to input hex values or ASCII characters. You get the idea here.

b. WPA/WPA2 are significantly better-designed implementations of network encryption. Cracking a WPA password is, to my knowledge, technically feasible only if you happen to live for a decent percentage of the time it will take the universe to reach heat death. WPA2 is even better, since it can use AES for packet encryption.

c. While I would love to have an open AP for neighbors and other passersby to use, to get that warm fuzzy feeling, I would only do it if I had a good, easy, cheap way to packet-shape that unfiltered traffic so it never uses more than a certain percentage of the available bandwidth of my Internet connection. Since I don't have a good solution for that right now, I just run the WPA network to make sure no one is slowing my network with their unknown activity.

d. Hypothetical scenario of Doom(tm): You run WEP (or open with no firewalling). I attach to your network without you knowing (after all, you claim to check the access records, but by that point it's really too late). I then use an open source tool called ettercap-ng to fool the machines on your network into thinking my laptop is the gateway, forcing all traffic between your PC and the Internet through my laptop. I then run Wireshark and capture your login to your email account. Or maybe even your bank! Or I take the cleartext passwords I see you sending for other services (IM, email, web forums, etc.) and start trying them at your bank. Or I hack into your desktop and look for financial records or other low-hanging fruit.

Will this happen to you? Statistically speaking, no. But if it does, it really can suck for you.

This is why I use WPA.

-Dan S.
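Dan's "heat death of the universe" is hyperbole, but the arithmetic behind it is easy to check. A few lines of Python; the guess rate and passphrase length are my assumptions, not Dan's:

    # Offline brute force against a random 10-character alphanumeric
    # WPA passphrase, at a (generous) million guesses per second.
    guesses_per_second = 1_000_000
    keyspace = 62 ** 10                      # ~8.4e17 candidate passphrases

    seconds = keyspace / guesses_per_second  # time to try them all
    years = seconds / (60 * 60 * 24 * 365)
    print(f"{years:,.0f} years")             # ~26,600 years; halve it
                                             # for the average hit

Not quite the heat death of the universe, but the point stands, and a longer passphrase makes the number as absurd as you please.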

Which is probably enough on the subject. Statistically your house won't burn down - insurance companies bet that nothing will happen to you, and they get rich that way - but that doesn't mean you shouldn't have fire insurance. Proper wireless security is pretty cheap insurance.


I try to do silly things so you don't have to, but I confess I have been more involved in fiction recently. Fortunately we have David Em to continue working with high end graphics systems and software. He reports:

Subject: 64-bit Vista sucks

I installed 64-bit Vista on the HP xw9300 (dual Opteron 252s, two Nvidia Quadro 3400 cards) workstation about three weeks ago.

Bluescreens regularly, sometimes from some sort of conflict with a Bluetooth keyboard, other times due to the graphics drivers. Nvidia, which is usually pretty good on support, hasn't met the bar on this one. The latest driver creates terrible lag when moving windows and can't play back video faster than a frame every few seconds. There are some minimal controls, nothing like what they provide under XP. Rolled back to the previous driver, which has no controls at all, and things sped up. Until the next bluescreen. I've given up until they release a service pack. Impossible to get any work done. I suppose 32-bit Vista works better or there would be riots. Memory hog, too: it uses up over a gigabyte of RAM simply to get up and running, without any software turned on. I could go on with numerous fine points, but bottom line, it's not ready for prime time.

David

Which confirms my impressions of Vista, both 32- and 64-bit. I am using Vista on this system (Core 2 Duo with lots of memory) and I like it, but I haven't put Vista on my other main machine, and I don't have 64-bit Vista at all.

Robert Bruce Thompson long ago went over to Linux for just about everything he does. His comment on David's report was not astonishing:

> I suppose 32-bit Vista works better or there would be riots.

Not from what my readers tell me. Zero of the corporate users who have emailed me about it are running Vista on anything but test-bed systems, and I've heard from many readers who bought new systems with Vista on them and ended up reverting to XP. The fact that Dell reversed itself and now offers XP on new systems, including consumer models, is very telling.

Microsoft struggled mightily for more than five years and ended up dropping a gigantic blivet. I'm not convinced that SP1 (or indeed SP2 or SP3) will fix Vista.

-- Robert Bruce Thompson thompson@ttgnet.com

Aesop tells the story of the mountain in labor... (link). Dan Spisak expands on both 32 and 64-bit Vista:

Vista 32-bit is working fine here, on both a Core 2 Duo desktop with 2GB of RAM and on my MacBook Pro with 2GB of RAM. Vista 64-bit still relies heavily on driver support from vendors, who have been taking their sweet time. As for the RAM hogging, it is possible to make Vista use less on boot, depending on which version you installed, by turning off some services you're probably not using. But in general, yes, Vista uses more RAM. So in the grand scheme of things, there is responsibility on Microsoft's part and on the vendors' part.

If vendors can't write drivers for a new framework within 5 (!!!) years, then there are serious issues there. I would like to note that ATI's graphics drivers for Vista have been far better performers, in stability and regularity of updates, than NVIDIA's in this instance. The drivers for my Intel-made motherboard and its BIOS have also been excellent.

To me it sounds like the real culprits here are NVIDIA and Logitech (I'm assuming that's who made the Bluetooth keyboard) for not writing good drivers.

This will eventually smooth out as these companies remember how to write solid drivers, or get their act together with the Vista driver model. I would be surprised if this takes longer than the end of this year. However, 64-bit drivers may take longer still, since those environments are not yet completely mainstream.

-Dan S.

This prompted another observation from David Em, who follows graphics hardware and software very closely indeed:

My dismay at NVIDIA knows no bounds. Who's going to run a 64-bit OS at present except people who need reliable, high-performance workstations?

If it really takes them another half year or more to give this community -- one they've marketed to so heavily -- a driver that does more than put up a picture for a while until it crashes, that's pathetic.

-- D

Vista had the largest and longest beta test of any operating system, and one supposes that Microsoft provided driver development support. It's astonishing that the hardware vendors didn't have drivers ready. Is it Moore's Law? Riding an exponential is very difficult.

And on the subject of drivers:

Subject: Driver double-talk (aka Hot Potato Drivers) and MS Technical Support

With regard to Michael L. Votaw's comment in today's Mailbag:

My big HP commercial InkJet reboots anytime you print with the Vista driver for it and HP doesn't support the driver since it's included in Vista now.

I've had the same kind of "hot-potato driver" problem with XP as well. I had problems with a driver related to my ASUS motherboard's SATA interface; when a new driver became available on Windows Update last spring, I downloaded and installed it. Bad choice -- it rendered the box unbootable, and the System Restore points wouldn't restore. A call to Microsoft resulted in 1) being told by MS support, that's not our software, it was written by the hardware vendor (even though it was available only from Windows Update, as far as I could tell); 2) being told by ASUS support, we can't help you, that's a driver from Silicon Image; and finally 3) nothing at all -- I couldn't get any response from SI.

Working with MS Support eventually left the machine in a state where I couldn't even get into System Restore, and I ended up doing a flatten-'n-rebuild. Thank goodness for separate data drives.

I've had similar problems with multiple video cards/monitors, which also eventually resulted in an F'nR (otherwise known as nuking it from orbit; JEP), and I've been down one of my three monitors ever since (I gave up after two weeks of flashing the video, downloading new drivers, convincing them to install, and so forth).

Windows XP stable? Somewhat. I'm not switching to Vista any time soon, but I'm also not particularly pleased with XP. Over the past couple of years, I've tried using my "free" tech support calls with Microsoft perhaps three times. In each case, I ended up finding the final solution to the problem myself. In one case, between the back-and-forth requests for information and proposed "fixes", I had enough time to research and find the actual cause of the problem on my own; in the other two, the "fixes" proposed by tech support made the situation worse, eventually resulting in the F'nR.

--
Brian Pickering

The important point here is to have external backups of all data and creative work. I can replace computers, operating systems, and programs. I can't easily replace the work I have done on a novel not yet completed. Then there are accounting records and tax returns...


The announcement that Microsoft will no longer support Visual Basic 6 produced a great deal of mail, both on the announcement itself and on languages in general. I have selected a representative sample of opinions. The discussion was both informative and interesting.

Subject: ON VB 6

>developers are coping with the demise of key programming languages. - ... he opted to move his projects to C#.

So he gets hammered by Microsoft's end-of-lifing of his preferred toolset and then picks another Microsoft tool to use. Doesn't that just set him (or his successors) up for the same problem again?

Personally, I've decided my business is too important to me to build it on the uncertainty of proprietary tools. He'd have been better off in the long run investing his time in something like Python, I would have thought.

Scott Kitterman

I replied,

Why Python? Why not a truly compiled structured language?

Jerry Pournelle Chaos Manor

And Kitterman replied:

In my case Python is broadly useful for many different types of tasks. The code is coherent and I can understand my code when I come back to it months later.

I don't code every day, so picking one sensible, easy-to-retain language to focus on works for me. Byte-compiled Python code is generally fast enough for my purposes.

I recently did some benchmarking of different Sender Policy Framework (SPF) libraries, and the Python code took roughly twice as long as custom C code to complete the tasks. With today's processors, that's a price I'm quite willing to pay. (I know C isn't strongly typed, but it's the only compiled language that was in my test set.)

Your correspondent may well find some other toolset more suitable (likely will).

My major point is about proprietary tools. Unless there is no reasonable alternative, I think they should be avoided. In the open source world, things don't get hit with an arbitrary end-of-life date; they tend to go on as long as there is significant interest. Generally I see business dependence on proprietary tools as a business risk.

Scott Kitterman

I agree about the usefulness of Python, and the importance of learning to use Python in solving many computer problems. I would probably not write large applications in Python, but then I don't write large applications to begin with.
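For readers who want to run Scott's sort of comparison themselves, Python's standard timeit module does the bookkeeping. A sketch; the function being timed is a hypothetical stand-in for whatever SPF library call you are measuring, not any real API:

    import timeit

    def check_spf():
        # replace this dummy workload with the real library call
        sum(i * i for i in range(1000))

    elapsed = timeit.timeit(check_spf, number=10_000)
    print(f"{elapsed / 10_000 * 1e6:.1f} microseconds per call")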


Subject: VB6 Retired

Regardless of all the folks out there moaning about this, my take is that it is a good move and not before time.

I've earned my crust developing code in VB since the heady days of version 2, some 15 years back (when I used to read your column in Byte). In that time every single version, up to and including VB6, was badly thought through in places and treated as a second-class language in the Microsoft scheme of things. Multi-threading? Nup, you can't do that (without hacks that aren't portable between versions). Inheritance? You can have something that looks a bit like it, but you'll need to re-implement the code in each object. System-level calls? We'll make a half-hearted attempt to let you call the simpler Windows APIs, but won't provide any of the headers or documentation to do anything else.

Finally, with VB.NET, Microsoft upgraded the language to the premier league. Pretty well anything that could be done with C#, C++, J#, etc. could be done with VB.NET. The full .NET framework was open to it, and it could directly use class libraries written in, and intended for, other .NET languages. At the same time many of the guns and knives of the Windows environment were removed (or placed in a glass case with a card saying "break in case of emergency, penalty for improper use"), and the patchwork quilt of its heritage was hidden behind a consistent object model.

Having done all this work, having cleaned up the pre-existing mess that was VB, Microsoft gave developers fair warning that VB6 would be phased out and that they should look to upgrading. Five years later, folks are kicking, screaming, and looking for sympathy now that free support has ended. Time to move on, people. If you wanted a career where you didn't need to keep learning new stuff, then writing software was about the worst choice possible.

Regards,

Steve Todd


Subject: Re: Retired! From Visual FoxPro to VB6 (and Visual J#), developers are coping with the demise of key programming languages.

COBOL much?

Programming languages go obsolete, it's a fact of life.

I started programming professionally ten years ago in C and Fortran. Within a year I was developing in C++, then back to C again for an embedded project. Back to C++ with some Delphi. Some Java. A little bit of VB. Now I do the C# thing for a living. Next year, who knows?

Don't even get me started on the databases, the text editors, and the operating systems.

Then you have software methodology fads. Everything from Waterfall to Extreme Programming.

In spite of all that, the fundamental principles of programming remain fairly constant: you are still getting a bunch of transistors to do what the guys in suits want them to do.

Learning a new language really isn't all that tough, especially since each new language is intended to be easier to learn than the previous. Re-architecting an existing project for a new language is indeed a pain in the arse, but it is a cost of doing business. Eventually new hardware will force you to do that if nothing else.

Meanwhile the COBOL guys I am currently working alongside pull down more cash making their morning coffee than I do over the entire day.

When life hands you lemons, make lemonade. :)

Mike Fisk


Dr. Pournelle,

I am a long-time Visual Basic 6 developer, using it to build a CAD/CAM application that runs my company's metal cutting machines (company link). I have used nearly all the versions of Visual Basic, from 1.0 to 6.0.

The big tragedy of Microsoft's retirement of Visual Basic 6 is that it was unnecessary. When .NET was introduced in 2002, VB6 developers were told that, due to constraints of the .NET runtime, the language had to be altered: backwards compatibility was dropped, and important features of the VB6 IDE had to be dropped as well. Among these was Edit and Continue. The VB.NET language had many improvements, and the .NET framework was extensive and useful. VB.NET was a first-class language in the .NET world; it could do 99% of the things C# could, and it had a few things C# didn't (the Handles clause was the most useful).

However, despite the utility of the new features, VB.NET thoroughly broke backwards compatibility. The included wizard could only convert the most trivial of applications, and only worked well on applications that were essentially database front ends using Crystal Reports. (That was the biggest use of VB.)

I bought this, like many other VB6 developers. Then a year passed, and Microsoft put out a language called F# (link); Microsoft Research wanted to show what could be done with the .NET runtime. It was a very different language from either VB.NET or C#, and it got me thinking about how they did it. I dug into how a .NET compiler is built, and used Reflector to tear into EXEs created by other .NET compilers to see how the IL (Intermediate Language) worked (link). The realization dawned on me that IL opcodes work pretty much like assembly opcodes on a CPU. Just as compilers for nearly any language can be built to target 80x86 assembly, the same is true for IL opcodes. The implication: the changes Microsoft said were needed to make VB.NET work were not needed at all.
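An aside for readers without Reflector: Python's standard dis module gives much the same view of Python's own byte-compiled code, which makes Robert's point about stack-machine opcodes easy to see. An analogy only, and my example; this is not .NET IL:

    import dis

    def add(a, b):
        return a + b

    dis.dis(add)
    # Prints opcodes such as LOAD_FAST a, LOAD_FAST b, BINARY_ADD,
    # RETURN_VALUE -- stack-machine instructions, much like IL.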

From talking with others in the VB community, and from my own communication with members of the VB.NET development team, it appears that VB.NET was spawned off the larger C# language development team. The VB.NET team did not appreciate or respect the history and importance of the Microsoft BASIC language. In my conversations with the team, they did not understand why many of the constructs we used were important, and they are only now (2007) starting to get that backwards compatibility may matter.

This is in contrast to C++. C++ under .NET is not an all-or-nothing approach. While it took them two tries to get it right, their approach has been to extend C++ with a special series of keywords for creating applications that use the .NET runtime and framework. Older C++ programs will build with the new compiler and produce traditional EXEs and DLLs.

Some of the specific problems with VB.NET:

- The meanings of INTEGER, LONG, and other fundamental data types changed, after remaining the same since the days of QuickBASIC (an Integer goes from 16 to 32 bits, a Long from 32 to 64). This has a severe impact on binary operations.
- The loss of VB6-compatible graphics, which now require a total rewrite in VB.NET.
- The loss of VB6 printing capability, which likewise now requires a total rewrite. Printing and graphics both have an entirely different model, so it is not even a matter of substituting a new setup routine and new functions; it takes a completely different algorithm.
- The loss of GOSUB, which was used in VB for short procedures within procedures.
- The form engine has been totally revamped, and the behavior of fundamental controls has not been preserved for the same actions.

These are only some of the problems I encountered. Others can be handled by substituting the new functions, but the above are show-stoppers: they require that I rewrite my application to use .NET. Note that the transition from VB3 to VB6 did not have this problem. A full list is found here (link).

I don't think VB6 is the pinnacle of technology. VB.NET and the .NET framework may have issues, but they are a clear improvement over what came before. But the total lack of compatibility means that my company has to rewrite and retest our core application in order to use the new technology. This will cost us money with no visible improvement for our customers. Microsoft is costing my company money through its lack of foresight, lack of imagination, and/or arrogance.

I can't get into their heads and tell you why this happened. I can say that Microsoft chose to destroy the most popular programming language on the planet. Every single VB6 developer has to rewrite their applications to work in VB.NET if they want to continue to use the latest features. My thought is: why should I trust Microsoft anymore? Who is to say that in 2010 they won't just axe .NET as a bad idea and come out with something else? Already they have WPF, an alternative to the WinForms library for building Windows forms. WPF has many cool features relative to WinForms, but it is not compatible. The only reason I will consider using .NET today is that projects like Mono (http://www.go-mono.com) exist, providing an alternative to Microsoft.

The VB6 community is divided today between those who have existing projects to maintain and those who have been able to work on new projects. If you have a new project, the current version (VB.NET 2005) is little different from working with VB6, and it has a lot of nice features in the language and framework. But if you have an older project, your life is still hell. Since the release of VS 2005, Microsoft has been doing some support of VB6 conversion, releasing some compatibility objects found here (link).

There is a petition website for Microsoft to deal with this issue.

Thank you

Robert Conley

Emphasis in the above was added by me. I have no answer to your questions. I am particularly appalled at changing the very definitions of the fundamental types. As you say, Visual Basic was at one time the most popular language on the planet, and a number of very useful programs were written in VB.

What will Microsoft do in the future? Which leads to this letter:

Subject: Proprietary programming languages

The key, of course, is not to use them. Yes, this sucks for the people who have large projects built on them, but it's the lesson that 30+ years of Unix history teaches us. Ken Thompson and Dennis Ritchie had it right, and in spite of the damage some of the Linux crowd is trying to do to this long, proud tradition, the idea that your most basic tools should be open, and not under the control of any one person or company, is sane and good.

There is a reason why you had the computers in Starswarm running Unix. Open standards can always evolve and grow. With closed proprietary systems you are at the mercy of a lot of folks who simply don't care about engineering. It's sad that so many of the loudest people going on about this are shrill and irrational. I long for a return to the days when it was "shut up and code" and there was little to no politics in the process.

But getting past all that: the only way to make sure that you're in control of your fate is to make sure you have control of your tools, and the only way to make sure of that is to use open standards. In this day of fast, cheap hardware, giving up a bit of efficiency in favour of portability makes a lot of sense to me. But then again, I'm just a curmudgeonly netadmin who is grateful for the fine tools created by folks a lot smarter than he is. By sticking to open Unix standards I still have basically the same home dir I had when I started in this business over 15 years ago -- some slight changes in config files for shell changes and version upgrades of a few apps, but I'm talking about tens of lines. And that's a pretty typical story. I think it should serve as a powerful cautionary tale for anybody looking at tying themselves to something that is controlled by any one entity, be that Microsoft, Apple, Sun, or even some of the more radical departures from the Unix Way by the Linux crowd. Or even my beloved OpenBSD.

I can't think of any possible advantage that is worth giving up that control over your own fate for.

Ray Percival


Subject: In response to "Retired!"

Jerry,

Tuesday's mail included an article snippet from Michael Desmond regarding the "demise of key programming languages". I noted your comment that it deserved a longer reply than you had time for. Fortunately for me, I do have time. It will be interesting to see if our viewpoints converge on this issue.

First and foremost, I would hardly characterize Visual FoxPro and VB as "key" programming languages. These are scripting languages provided by a single vendor (Microsoft) that have reached the end of their support life cycle. Yes, there are quite a few applications out there built on these languages (I've done more than my fair share), but their disappearance will have nowhere near the impact of the removal of a non-vendor-specific language such as C, C++, or even COBOL.

Many moons ago I was part of a team faced with a decision: continue development in Visual FoxPro, or migrate to a more "modern" programming language. One of the key motivating factors for the decision was the fact that the FoxPro language was supported by only a single vendor, so our fate would be tied to theirs. We eventually settled on Java as our next programming language.

Interestingly enough, we completely missed the most significant benefit to migrating to Java as our new language. When we made the choice to move to Java we had no idea that we would be introduced to the whole "object oriented" conceptual framework. Learning the new language was relatively trivial. Learning the new approach to conceptually organizing software completely changed our level of productivity for the better.

In that same article David Lambert expresses concern that he is going to eventually have to rebuild his project, presumably in a new language. Again, I fail to see this as a real problem. Yes, there is a cost associated with re-developing an existing application, but when was the last time you saw a software application that wasn't under continuing development, either with new features, or ongoing maintenance?

I know what you're going to say. You're going to say that yes, there are ongoing support costs, but they are nowhere near what it would cost to re-develop the application from scratch. Good point -- let's see what the numbers say.

Assuming you have an application that will take 5 developers 6 months to create from scratch, you end up with 2.5 developer-years invested in the project (feel free to insert your own pay scales per developer). Let's also assume that we plan to use this application for at least 5 years. Now, we all know that when an application is delivered it isn't really done, so we will have to keep at least a couple of those developers on the project for the full term -- fixing bugs, developing new features, and so forth. Two developers on the project over 5 years equals a "support" cost of 10 developer-years: four times the original development cost.
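[Jonathan's arithmetic is easy to parameterize; a few lines of Python make the model explicit. The numbers are his, the packaging is mine. JEP]

    def support_ratio(dev_count, build_months, maintainers, service_years):
        build_cost = dev_count * build_months / 12    # developer-years
        support_cost = maintainers * service_years    # developer-years
        return build_cost, support_cost, support_cost / build_cost

    print(support_ratio(5, 6, 2, 5))   # (2.5, 10, 4.0)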

But here's the dirty little secret. Over those 5 years of life, the project has grown quite a few new features and had many bugs fixed. Do you think all of the development work that went into those new features and bug fixes followed the design of the application, or best practices? We have a term for projects in this state -- they are considered "fragile". Fixing one bug breaks two other things; adding a new feature takes 5 times as long as you planned...

The longer a software application has been around, the more likely it is to be "fragile". There are exceptions to this rule, but my experience is that the vast majority of projects enter this state within 2-3 years of the original production release.

Bottom line: the disappearance of VB and FoxPro is significant to those who still depend on those languages, but I doubt their demise will cause much wailing and gnashing of teeth in the software development community as a whole. If anything, perhaps this will serve as a wake-up call for those who base their software development efforts on the platform of a single vendor, or who spend a lot of time and money keeping those old applications limping along.

Regards,

Jonathan House
Technology Development Director, Amirsys, Inc.


I am preparing an essay on computer languages. I was converted early on by Edsger Dijkstra and Niklaus Wirth to the notion that programming languages ought to be strongly typed, structured, and readable. The theory is that if the programming language is properly designed, some programming errors are simply impossible, while others are not easily made by mistake. In Pascal and Modula-2, both by Wirth, getting the program to compile is harder than with C; but once the program compiles, it generally does what you expect, and debugging is a great deal easier.
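To make that concrete, a toy sketch of my own, with Python standing in for a permissively typed language. A Pascal or Modula-2 compiler would reject the equivalent of the second call before the program ever ran; Python discovers the problem only when the bad line executes:

    def total_cost(quantity, unit_price):
        return quantity * unit_price

    print(total_cost(3, 2.50))     # 7.5, as intended
    print(total_cost("3", 2.50))   # TypeError -- but only at run time;
                                   # a strongly typed compiler would have
                                   # refused to build this call at all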

Ada was supposed to be such a language, but it soon exploded with features and exception handling, and it does not seem to have done what it was intended to do.

Regarding the VB 6 issue, I repeat a statement from one of the letters above:

There is a petition website for Microsoft to deal with this issue.