Saturday, March 31, 2007

 

It happens outside the US first

TechRepublic mentioned a Yankee Group "survey of IT executives [that] shows that 23% [of] respondents intend to migrate off of [MSFT] Exchange to Linux-based mail servers in the next 12-18 months. Of the respondents, 65% of them currently run Exchange." The writer was skeptical.

If the survey was biased towards "Exchange shops" in the USA, I don't buy it either. Some of the respondents are just using the survey to register their displeasure with MSFT. (Was the survey conducted in English only? Was it only offered to readers of an English-only Web site or magazine?) If the survey was worldwide and conducted in many languages, it's plausible. Over that time period, MSFT customers are being told to replace their investment in the "32-bit Exchange" with a new "64-bit Exchange" that runs on a new operating system distribution that's meeting a lot of market resistance, especially outside the US.

More than half the people with Internet access live outside the US. That's where the Internet is growing the fastest, too. Across Latin America and Asia and in the European Union countries, alternatives to MSFT are being adopted much faster than they are in the US. It's part of our technological decline relative to the rest of the world.

As software technology goes, MSFT is kind of a backwater. That's a consequence of its corporate philosophy of never inventing anything. Invention is risky. The market rejects some inventions. Truly new ideas don't do anything to reinforce the monopoly. MSFT is the technological incumbent. It makes more sense for the incumbent to wait for others to prove new ideas in the marketplace. Once they're tested, MSFT can imitate them or buy them.



Friday, March 16, 2007

 

Vista, hell of a gamble, 100% MSFT's fault

CNet asked, "are the problems people are having with Vista Microsoft's fault?" As if someone else might be to blame for Vista's outrageous hardware requirements.

Anyone who designs PC hardware or software for MS-Windows has two bosses: the company that signs her paycheck, and MSFT. When I was designing Ethernet cards at 3Com, the boss came around every year, apologized, and dropped a copy of the Microsoft Hardware Design Guide on my desk. Every year it got thicker and more constraining.

If you broke any of the rules, MSFT would blackball your product. They do that by prohibiting you from using the "Designed for Windows" logo. That locks you out of the distribution channel. They can also "make a mistake" and drop your driver from their release "by accident." It hurts a big company so badly it might never recover. It's instant bankruptcy for a smaller one. To a much greater extent than you would know from the trade press, MSFT directs and controls the PC hardware business. It's tighter than Apple ever was over its "third party" hardware makers.

With that kind of control, MSFT is 100% at fault for all of the problems with Vista.

MSFT took a huge gamble with Vista, gambling that its network effects-based monopoly is so strong that its customers would tolerate being told they have to discard a generation of hardware that runs competing software platforms just fine. (If you don't know what "network effects" are, look in a good economics textbook.) That gamble reflects MSFT's confidence in the strength of the monopoly. They want to retire the working hardware because it is relatively open, and replace it with stuff that will enforce Digital Restrictions Management. The jackpot MSFT is going after is control over music and motion picture distribution in the next decade. Heck of a gamble.

 

User friendly!

Some of my PC repair customers switch ("migrate" would be a better word) from MS-Windows to GNU+X+Linux.

The ones who have the hardest time are the MS-Windows "Power Users." People who never used a computer before, or who only used "AOL" or some minicomputer twenty years ago have a much easier time.

That's because the "Power Users" know computing from the visual cues presented by Microsoft's user interface. When those visual cues change even a little, they're lost and afraid.

That's one reason you should try several GNU+X+Linux distributions. You should at least see the most popular desktops: KDE, GNOME, XFCE, and maybe Fluxbox. You should try changing the mouse policy: you might, like me, find that focus-follows-mouse with no auto-raise is a lot easier to use and makes better use of your screen area than the policy (click-to-focus with autoraise) that MS-Windows and the original Mac OS forced on you. You might really like using a different virtual desktop for each task. You might like Konqueror or Nautilus better than Firefox or Internet Explorer. You might like KOffice better than Openoffice.org. It's too bad there's no GNOME Office.

Seeing the same tasks presented several different ways will give you an intuitive understanding that the user interface isn't the computer.

The MS-power users are confused and bewildered by all those choices. But you'll be learning to use the computer, not just the user interface, so you won't have that problem.



Thursday, March 08, 2007

 

comparing expectations

Mildly interesting exchange this weekend on a CNet forum. The thread began with "which was MSFT's worst OS?" Everyone agreed it was MS-Windows Millennium Edition. That was their last distribution with the Windoze 95 kernel before they scrapped it.

Someone opined that no OS is reliable, that they all need to be reinstalled routinely, that they all get "registry corruption," and so on. I've only known MS-Windows users to say things like that. They use words like "touchy" and "iffy." People who use other computing environments expect them to work right, absent malicious/careless users or hardware failure. The only other OSes people use on PCs or Macs are unix, so two MSFT fanboys took this as yet another silly "Macs are better" thread. As far as I know VAX/VMS, AS/400, and the mainframe OSes are just as good, but that didn't come up. We're talking about the computers consumers use.

But what stood out was the MSFT Power Users' absolute conviction that no computer operating system is reliable. I mentioned that the only time I'd seen a Linux file system corrupted so badly it couldn't be recovered was when the drive had failed. I've also heard intruders can destroy a file system with a buggy rootkit. Power failures and accidental resets: recoverable. Running for years under load: no corruption, no lost files. Both of them said that must be a lie.

It seems to me that's another adverse impact of the monopoly. People don't expect computers to be reliable any more. That would be a good thing if it prompted people to do backups and make recovery plans, or be suspicious of critical systems. (Computers in hospitals, banks, power plants, airliners...) But I don't see that coming out of it. It's like spam: people just accept it as inevitable instead of asking why.

 

MSFT stumbles over daylight savings time change

Of all the things. The big story this week is how MSFT's "patches" to various products don't apply correctly. It seems the US Congress moved the start of Daylight Savings Time (DST) forward a few weeks, with less than two years' notice. Computer software needs to know the time of day, so it cares about DST.

Software should be as simple as possible. It's less buggy that way. One way to keep things simple is to put well-tested functions that lots of programs need, like getting the time of day and expressing it in local time, into shared subroutine libraries. Unix (including GNU+Linux) handles this in the standard C library (libc) that almost all user programs include. They include it at run time. That way you can update the function in one place when the suits do something silly like changing how the wall clock works.
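The run-time lookup is easy to see for yourself, assuming a glibc system with the standard tzdata files installed:

```shell
# libc reads the zone file named by TZ at run time, so every program
# picks up new DST rules as soon as the tzdata files are replaced --
# no recompiling, no per-program patches.
TZ='America/New_York' date           # wall-clock time under US Eastern rules
TZ='UTC0' date                       # same instant in UTC (POSIX spec; no zone file needed)
ls /usr/share/zoneinfo 2>/dev/null | head -n 3   # a few of the compiled zone files libc consults
```

Every program that calls localtime() through libc, from date on up, gets the new rules at once.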

Debian handled this huge complicated problem by reissuing the software package with the time zone data files that the function in libc consults, tzdata. You go apt-get update && apt-get upgrade, you're done. If you're maintaining a thousand Debian machines, you "push" that out through your update routines. Or maybe you're more cautious and go apt-get update && apt-get install tzdata so nothing else updates just then.
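One way to check that the fix actually landed, sketched with GNU date and assuming an updated tzdata is installed: under the new law, US DST began March 11, 2007, so a mid-March date that the old rules called standard time should now report EDT.

```shell
# March 12, 2007 was EST under the old rules (DST didn't start until April)
# but EDT under the new ones, so it makes a handy litmus test.
TZ='America/New_York' date -d '2007-03-12 12:00' +'%Z %z'
```

If that prints EST, your tzdata predates the fix and the apt-get upgrade above is overdue.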

But the MSFT monopoly holds a unique position in the software world. It's got an interest in making its products as complicated as possible. That way IT people have to dedicate their careers to MSFT and don't have time to master the alternatives. Apparently there's time zone code in Outlook Express and Exchange Server and a lot of other stuff. Those packages got reissued, and the upgrades aren't working real well. Not only that, but MSFT is charging $4000 for upgrades to older versions like Exchange 2000. Why are companies still running the seven year old version of that buggy thing? Because MSFT's upgrades don't work too well...

Tuesday, March 06, 2007

 

Debian molts

The most comprehensive and carefully maintained software distribution I know about is Debian GNU/Linux. Ian Murdock started it in August 1993 and named it for Debra and Ian. It's an open project by an association of tens of thousands of developers and testers worldwide. It's published at four levels of maturity:

  1. Old Stable: The previous Stable. You get security-related support for a year or two after your Stable system becomes obsolete.
  2. Stable: The software in this distribution is feature-frozen. Updates come out to fix bugs, especially security bugs. But the features and functions don't change. You can keep up to date without fear anything in your system will break.
  3. Testing: The software in Debian Testing changes along with the "upstream" software it's derived from. We expect it to work, but it might change and break some feature or function you were depending on.
  4. Unstable: This software changes all the time (that's why it's called "unstable") as the developers find bugs. On any particular date it probably works as well as any other software distribution, but there's no guarantee. Use at your own risk.

You choose which works best for you. If you don't want to participate in the development, use Stable. If you want the latest features and don't mind reporting a bug now and then, try Testing. If you want to help, join Unstable.

Every couple of years, Testing "freezes." We keep fixing bugs but don't add new features. (That's the only way to approach a bug-free system, and MSFT doesn't do it. MSFT's Service Packs and Windows Update "patches" introduce complex new features with new bugs.) After the freeze, it takes a few months to reach zero show-stopping "release-critical" (RC) bugs. When the last RC bug is squashed (or the package it's in removed), there is a NEW DEBIAN RELEASE. Stable becomes Old Stable. Testing becomes Stable. Work begins on a new Testing. Unstable just keeps changing as always.

It's about to happen. The current Stable is Debian GNU/Linux 3.1. It is about to become Old Stable. The current Testing is Debian 4.0. It will be Stable any day now. It feature-froze in November '06 and it's had security support since then. If you install Debian today, you should install Testing.

Debian GNU/Linux 4.0 (nicknamed "Etch") will be available on three "Official" DVDs or twenty-five "Official" CDs. That's the "i386 binary" you can install on an IBM-PC-Compatible computer. (The source code is available for the whole system. That's another 25 CDs. New users don't need it. Developers usually use the online archive, not CDs.) You can also use the disk set to upgrade your Debian 3.1 system. There are dozens of socially responsible vendors who will sell you a disk set by mail. I once considered getting into that business but it is just too competitive. But I'll sell you an Etch disk set when it's released, at an outrageous markup, as a fundraiser for Green Internet Society. Drop me a line to reserve a set, cls@greens.org.

Nobody needs the whole thing. Etch has about 25,000 packages. A typical home workstation uses less than a thousand. A typical Internet server uses less than five hundred. If you have fast Internet access (DSL, cable TV, or fast wireless) you can install only the packages you need, as you need them, over the Internet. That's the recommended method. The most popular packages are on the first CD. You can use the first Etch CD to install a usable home workstation. When you run the installer, it will try to find the online archives automatically, and you can install (and maintain) the rest of what you want from there. There is also a "network install" CD. It fits on a "business card" mini-CD. It's only useful if for some reason you can't deal with a whole CD.
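Pointing the system at the online archive is a couple of lines of configuration. A plausible Etch-era /etc/apt/sources.list (the mirror hostname here is just an example; the installer normally picks one for you) might read:

```
# /etc/apt/sources.list -- Etch-era example
deb http://ftp.debian.org/debian etch main
deb http://security.debian.org/ etch/updates main
```

After that, apt-get installs and maintains everything, including the security fixes for Stable, straight from the archive.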
