Open Source Software vs. Commercial Software:
Migration from Windows to Linux
An IT Professional's Testimonial

 


"I'm a PC, and Linux was my idea"

 

Preface

Over the years, I have been exposed to all sides of computers, the good and the evil. One thing I have discovered after 11 years of experience in the IT world is that the simpler the system, the better. Think of it as your car: the more bells and whistles, the more that can go wrong. Within those 11 years I have used many different machine architectures and operating systems, and I have concluded one thing: Linux is hands down far superior to everything else I have used. Why do I say this? Mainly because it has many years of hard work by thousands of people behind it. It is derived from Unix [1], which was first developed in the late 1960s (Linux [2] itself came about in the early 1990s, created by Linus Torvalds). Many aspects of early Unix are still apparent in Linux. Take the years of work behind a stable operating system, put it on the most available and least expensive hardware (the IBM-compatible architecture), and you get Linux: an affordable computer refined to the point that it is extremely stable and largely free of bugs and other defects. In the end, you have an operating system that is 100% free (no cost to you), extremely stable, can run on older hardware that would otherwise be useless, and can do just about anything you can imagine.

I will go into a detailed discussion of why I feel people are locked into the monopoly of Microsoft Windows, and why this is a losing relationship. I will also point out many disadvantages of using Windows compared to Linux. Contrary to the rumors about Linux being difficult to install and use, a basic understanding of computers is all that is really needed to set up a working Linux machine. Linux has come a long way from its state a few years ago, to the point where it is just as user friendly as Windows, and sometimes more so. I will also go into the mechanics of Linux and Windows, and why Linux has an advantage over Windows because of its modular (and very simple) design. Windows focuses on features and bells and whistles to sell the software, and as a result it has become bloated, which makes it run slower and causes far more malfunctions.

My goal is to put as much information on the table as I can, so that people can judge for themselves and hopefully realize that there is a better alternative to Windows. I am an everyday user of Windows and Linux, and I am not being paid to write this article. I am writing it out of pure frustration with Microsoft products that have a plethora of problems, not to mention high costs. After using both Windows and Linux, I can see the clear winner, which in my opinion is Linux. I have seen Linux servers run for years without a reboot; if it is that stable in a server environment, then in my opinion it is more than capable for a desktop or home environment. I try to be as fair as I can from all angles, and although some views may seem biased in favor of Linux, I can testify that they are drawn strictly from personal, factual experience. I also try not to simply repeat what is published all over the Internet in favor of one side or the other. With this I hope to clear up some of the mysteries and misconceptions about Linux for the average user as well. If you are here for instructions on migrating from Windows to Linux, click Here to jump ahead and read about making the actual migration. Otherwise, please read on.

 

People Are Habitual

[Image: A Linux desktop (X11) using the GNOME desktop environment]

Let's face it, many of us don't like change. We like to stick with the things we are familiar with. We learn a regular routine over time, and we stick with it until we are forced to change our ways. The same thing applies to using your computer. You get accustomed to the programs you use most often, and since they work and do what you want, why change? Microsoft came onto the scene when Apple held the reins of window-based operating systems, and ran away with a more stable and advanced operating system, Windows. However, over the years Microsoft has grown into a huge empire that squeezes everything it can out of consumers. This is the reason I decided to sit down and write this. There are other, better options available. Some people may not realize this, or don't want to change their computers and have to relearn their software. But for those who open their minds to other possibilities, there are many great surprises inside a Linux machine.

Rumors have circulated around the Internet for years about Linux being a complex operating system that is hard to install, and so on. Yes, this was true back in the early days of Linux, but over the past several years it has been refined into a very user-friendly system. Linux now has a completely GUI-based (graphical user interface) [3] installation. This means even a novice Linux user can boot from a CD or DVD, follow the step-by-step installation process, and have a running Linux system in a short while. Linux can also be installed on a PC alongside Windows (on a separate partition of the hard disk), allowing a slow migration process: you can learn Linux while keeping the ability to boot back into Windows on the same PC and return to familiar things. Another thing people don't realize is that Linux can run Windows applications. Yes, it's true, from basic applications to 3D games with full sound. I will go into more detail on this near the end of this document and provide some real examples.
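As a small taste of that last point, here is a minimal sketch of launching a Windows program on Linux through the Wine compatibility layer. It assumes Wine is already installed from your distribution's package manager, and the path to the .exe below is a made-up placeholder rather than any specific program I am recommending.

```python
#!/usr/bin/env python3
"""Minimal sketch: launching a Windows program on Linux through Wine.

Assumes Wine is installed (e.g. via your distribution's package manager)
and that WINDOWS_PROGRAM is replaced with the path to a real .exe file.
"""
import shutil
import subprocess
import sys

# Hypothetical path to a Windows program you want to run.
WINDOWS_PROGRAM = "/home/user/downloads/some_app/setup.exe"

def main() -> int:
    # Confirm the 'wine' command is available on this system.
    if shutil.which("wine") is None:
        print("Wine does not appear to be installed.", file=sys.stderr)
        return 1

    # Wine translates the program's Windows API calls into Linux
    # equivalents, so the .exe runs directly; no copy of Windows is needed.
    result = subprocess.run(["wine", WINDOWS_PROGRAM])
    return result.returncode

if __name__ == "__main__":
    sys.exit(main())
```

In day-to-day use you would normally just type "wine program.exe" in a terminal (or double-click the file in some desktop setups); the script above only wraps that same command.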

Linux Proving Itself - My Real World Experiences

I admit that even in 1997, when I started using Linux, I was at first a little hesitant, since it was not very user friendly. I had been a Windows user for several years and up to that point had become very accustomed to it. At the time I was in college, and I started working for a small research department on campus at MSU (Michigan State). When I started there, two servers had been set up by my predecessors, one with Windows NT 4.0 and the other with Red Hat 4.2. The Windows NT 4.0 server was a dual processor Intel Pentium II 266 MHz machine with 196 MB of RAM, and served as our primary authentication server and domain controller for Windows 95 and NT 4.0 workstations. The Linux server was a single processor Pentium Pro 133 MHz machine with 128 MB of RAM, and served as our web and email server. Eventually, the Windows server started blue screening (the infamous blue screen of death) on its own. After two late nights at the office reinstalling Windows and looking for answers, I started to recognize that I was spending my spare time fixing the Windows server while the Linux server ran by itself, untouched. Not to mention this was late at night, right through the dinner hour, when I would rather have been out doing something a little more fun. I decided it was time to get to know Linux a little better. I ended up working at that department as a student for two full years, until 1999, when I graduated from college. During my time at MSU, we continued to use the Windows NT 4.0 server, and I built a completely new Linux server on a dual processor Pentium II 300 MHz machine with a RAID5 disk array to house our critical web files and email spool files. I soon realized that once I got to know Linux, it wasn't that scary after all, and in fact was easier and more flexible than Windows in some ways. Most important, I realized that Linux was extremely stable while it sat side by side with a Windows server that had continuous problems. I also realized that Linux wasn't much more difficult or time consuming to set up, and required far less maintenance overall in the big picture.

In 2000 I moved out of state, and my next job was for a medium-sized web hosting company. When I started, we had two Windows servers providing web hosting services, one Windows server providing email hosting, and two Linux servers hosting all of the DNS records. Soon after I started, I realized that the Windows servers were in bad shape, and I found myself spending late nights yet again (including one all-night incident at the office) fixing problems with services crashing and servers needing continuous rebooting. The all-night incident, by the way, was caused by a Windows security patch installed on one of the Windows web servers; the server's NTFS filesystem became corrupted and the server would never boot back up. I was able to pinpoint the issue to a corrupt filesystem because I could re-install Windows on the same hard disk and everything worked fine again. I was soon on call 24 hours a day by myself after the other primary technician left the company. Now the infrastructure was in my hands, and I quickly realized that things needed to change. Customers were noticing the downtime of their hosted websites and email, and the company was beginning to get a bad image, not to mention that we were having to deal with calls from angry customers. I ended up working for the company for two years total. During my time there, I managed a complete migration of all 600 email accounts from the Windows server running the Post.Office mail server software to a Red Hat Linux 6.0 server running UW POP3 and Sendmail. The reason for the migration was that the Post.Office software allowed open relaying, which people were abusing to send spam through our server, and an upgrade of that software to fix the open relaying problem would have cost $1,200. It was a challenging migration, but doable, and we accomplished it. Soon after, customers saw more uptime and we were able to help restore the company's image. We actually received complimentary calls from happy customers from time to time, which was very encouraging. However, the web servers were still in bad shape. Unfortunately, the websites we hosted were built on ColdFusion technology, which was an extremely expensive package to buy for Linux and migrate websites over to from Windows. So, we split the websites among several Windows servers so that if one server went down, as few customers as possible were affected at any one time. Unfortunately, this was the best solution we could afford at the time. And what about the two Linux DNS servers? They continued to run and provide DNS services as expected, without a hitch. Near the end of my time with the company, we upgraded them to newer hardware to minimize our chances of failing hardware, even though the systems continued to run without problems.
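For anyone curious what the "open relay" problem actually looks like, the sketch below shows the kind of check you can run against a mail server: connect to it and ask it to deliver a message from one outside address to another. A properly configured server refuses; an open relay happily accepts and becomes a spammer's free mailer. The host name and addresses are placeholders (not the real ones from the story), and a test like this should only ever be pointed at a server you are responsible for.

```python
#!/usr/bin/env python3
"""Rough sketch of an open-relay check against a mail server.

The server name and email addresses below are placeholders, purely
for illustration.
"""
import smtplib

MAIL_SERVER = "mail.example.com"              # server being tested (placeholder)
OUTSIDE_SENDER = "someone@elsewhere.example"
OUTSIDE_RECIPIENT = "victim@another.example"

def is_open_relay(host: str) -> bool:
    """Return True if the server accepts mail from and to outside domains."""
    try:
        with smtplib.SMTP(host, 25, timeout=10) as smtp:
            # A properly configured server rejects this with a
            # "relaying denied" style error; an open relay accepts it.
            smtp.sendmail(
                OUTSIDE_SENDER,
                [OUTSIDE_RECIPIENT],
                "Subject: relay test\r\n\r\nThis is only a relay test.\r\n",
            )
        return True
    except (smtplib.SMTPRecipientsRefused,
            smtplib.SMTPSenderRefused,
            smtplib.SMTPDataError):
        return False

if __name__ == "__main__":
    print("Open relay!" if is_open_relay(MAIL_SERVER) else "Relaying denied.")
```

Sendmail supports restricting relaying to known hosts and domains, which is how the new Linux server avoided repeating the problem that plagued the old Post.Office setup.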

In 2002 I moved back to my home state and began working for a software development company that used Windows primarily for Active Directory. Fortunately, this company realized the true value of Linux; it had already installed a Linux mail server and was in the process of purchasing a large Linux server to house a high-end project for scanning old newspapers into digital format. We ended up purchasing a dual processor Pentium III server with a 1 terabyte RAID5 enclosure, and installed Red Hat 7.0. We also purchased another identical Pentium III server, without a storage array, for running OCR and processing software written for Windows 2000. I recall two times that I had to completely reinstall the Windows 2000 operating system on this second server because of unexplained booting issues where Windows 2000 would boot and then completely lock up, leaving us with a dead server. During my employment at this company, we used the Linux server very heavily for file storage, writing gigabytes and gigabytes of data to and from DLT tapes every day with the Unix tar utility. Meanwhile, our other Windows 2000 server, which ran Active Directory, hardly had any problems. We did have one other, practically unused, server running Windows NT 4.0 that at one point suffered complete lockups caused by its video card. I stayed with the company until 2004, when they filed for bankruptcy. During my time there we ended up using a total of three Linux servers and three Windows servers. My one and only problem with the Linux servers was that one of them overheated, causing one of its hard disks to fail.
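For the curious, the daily tape routine was nothing exotic. The sketch below recreates the idea using Python's tarfile module rather than the tar command we actually used; the directory and tape device paths are illustrative assumptions (on Linux, the first SCSI tape drive typically shows up as /dev/st0).

```python
#!/usr/bin/env python3
"""Illustrative sketch of a nightly "dump a directory tree to tape" routine,
using Python's tarfile module. The source directory and tape device paths
are assumptions, not the actual paths from the story.
"""
import tarfile

SOURCE_DIR = "/srv/newspaper-scans"   # hypothetical data directory
TAPE_DEVICE = "/dev/st0"              # typical first SCSI tape device on Linux

def backup_to_tape(source: str, device: str) -> None:
    # "w|" opens an uncompressed tar stream, which suits a sequential,
    # non-seekable device such as a tape drive.
    with tarfile.open(device, mode="w|") as archive:
        # Recursively add the whole directory tree to the archive.
        archive.add(source)

if __name__ == "__main__":
    backup_to_tape(SOURCE_DIR, TAPE_DEVICE)
```

The roughly equivalent one-liner with the real tool is simply "tar -cvf /dev/st0 /srv/newspaper-scans".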

At present, I work as part of a team at a large corporation that is primarily a Windows shop, on both the servers and the workstations. Day after day, I continue to run across issues in Windows that can never be explained, even with the latest versions of Windows Server 2003 and 2008, and Windows XP and Vista. I even encounter Microsoft products that just plain don't work right out of the box, which I will explain in more detail later. Not to mention watching the piles of money being shoveled out the door to keep things running with software upgrades and purchases. With this, and after encountering countless problems on my home computers running Windows XP, I eventually wrote this document out of pure frustration with Windows, knowing in the back of my mind that Linux does not have these problems and never has in all the years since I started using it. Even at home, the costs can quickly rise by the time you figure in the cost of Windows, Microsoft Office, and other programs needed for everyday tasks. So I decided to make the migration from Windows to Linux on my home computers, not only to leave behind all of the frustration and problems and save myself some serious cash, but also to free up the spare time I was wasting maintaining and troubleshooting Windows. This is what sparked the birth of this document.

 

A Little History

[Image: Richard Stallman (left), founder of the GNU Project, and Linus Torvalds (right), creator of the Linux kernel]

Linux is based on Unix; it is essentially a Unix-like operating system built to run on IBM-compatible hardware. Unix itself was first developed in 1969 at Bell Labs (AT&T). Many other operating systems of the time drew upon the ideas of Unix because of its great success. Even today there are still clear signs of Unix in Linux: the structure of the filesystem reflects early Unix systems, and tools such as "vi" and "ed" are still included in Linux distributions.

So what started it all? In 1989, Richard Stallman [5] published the GNU General Public License, the license at the heart of open source software, after realizing the dramatic limitations of commercial software. Stallman had already created the main operating system components and tools that started it all, calling the result the GNU operating system. In 1991, Linus Torvalds [4] began creating the Linux kernel, based on the founding principles of Unix from the 1970s and 1980s and initially modeled after an operating system called "Minix", and eventually released it under the GNU General Public License. Torvalds and other early contributors were able to use tools from Stallman's project to help with kernel development. This kernel was then combined with the operating system components and tools that Stallman had already put in place, and GNU/Linux was born. The proper name for the operating system including the contributions of both Stallman and Torvalds is "GNU/Linux", but for simplicity I will just call it "Linux". From there, others jumped on board, and the open source community began to grow and contribute to the further development of the Linux kernel and open source applications under the GNU General Public License. You can get the entire Linux story at Wikipedia's article on Linux. The X Window System (the windowing system used in Unix and Linux) was first developed at MIT in 1984.

[Image: An early-1990s style Unix desktop running the X Window System]

Shortly thereafter, X11 (version 11 of the X Window System) was released in 1987, and it is the same windowing system still used in the latest versions of Linux and Unix (and is even available as an optional add-on for Apple's OS X). These are just a few examples of the time-proven solutions of Unix/Linux that define Linux today. With decades of refinement and development by the thousands of individuals who make up the open source community, Linux has matured into a very stable, compatible, and reliable operating system.

In 2000/2001, Apple completely rewrote their Mac OS operating system and created Mac OS X, which, without much surprise, is based on Unix. Even Apple realized that Unix has the years behind it as a proven operating system and decided to jump on board with that concept, rather than reinventing the wheel, which seems to be the Microsoft way.

But let's face it, Microsoft has their hands on the PCs we use today, and they have a good grasp on millions of consumers and businesses alike. Why is this? They got into the desktop PC market early and came up with some pretty good programs and software for PCs that were cheap in their day. Some would say they stole Apple's idea of a window-based operating system, but you do have to give Microsoft credit for taking the window-based operating system to a whole new level. I think with Windows 95 they proved to the world that they could write an operating system that is user friendly, very powerful, and flexible at the same time, yet available for the cheapest hardware architecture (IBM compatible). Once Microsoft gained more share of the desktop PC market, they started ramping up their server operating system as well. By carefully integrating their desktop and server operating systems, they essentially led people into locking themselves into Microsoft products. Even more so today, Microsoft has engineered its products to work very well with each other, but has turned its head away from standards and other third-party solutions in order to keep its grasp on the users already running Microsoft products. Clever, some would say. It would probably cost a company a good chunk of time, with the learning curve, to migrate away from Microsoft products today, but after making the plunge, the savings would make up for those immediate costs and surpass them in time. The home computer environment is a very similar scenario: most of the cost is basically measured in the time it takes to make the migration and climb the learning curve, which can be challenging.

Further Reading

The history of GNU/Linux and how it evolved is a very interesting story. Two books that I would highly recommend for this topic are:

Free as in Freedom: Richard Stallman's Crusade for Free Software by Sam Williams
A semi-biography of Richard Stallman, the person who started the free software movement. It goes into depth about Richard's early years as a programmer at MIT and the problems he faced with commercial and closed-source software that initiated his movement to open up source code to all.

Free Software, Free Society: Selected Essays of Richard M. Stallman by Richard M. Stallman, Lawrence Lessig, and Joshua Gay
A collection of writings and speeches by Richard Stallman that give you a better idea of his ideas, morals, and the reasoning behind his free software movement. Very interesting reading that goes in depth into his thoughts on commercial software vs. open source software, with regard to copyright law and other "flaws" of modern software copyright.

If you want to dig even further back into the roots of GNU/Linux, there are plenty of online resources for that as well. Here are a few bits of information to get you started.

 

What's Wrong With My Computer?

[Images: First Aid 97 and Windows Uninstaller software. Products such as these have flooded the market to fix "Windows problems".]

But even with the shiny front end of Windows, there are ugly things in the background that users do not see, and many defects and things that can go wrong amid all of the bells and whistles. I recall having to reboot my Windows 95 PC a few times a day when things just plain stopped working. Since then, Microsoft has refined Windows over several releases (Windows NT, Windows 98, Windows 98 SE, Windows ME, Windows 2000, Windows XP, and Windows Vista). Starting with Windows NT 4.0, and later with Windows 2000 and XP, Microsoft brought forth its "NT technology", which was at first geared toward business use but later, with Windows XP, came into the home arena as well. Why was this? I think Microsoft finally realized that their DOS-based operating systems (Windows 95, 98, and ME) were failing miserably on the performance and reliability front. Take, for example, the huge number of products released especially during the Windows 95 days that were designed to "fix" all of the "Windows problems". The market was flooded with products like this. So why were all of these products needed? Well, one could say that a lot of companies were simply trying to capitalize on the Windows operating system. That may be true, but why would there be a demand for products like this in the first place? If Windows were completely stable and didn't have a plethora of problems, products like this would not be needed and therefore would not be in demand. So, in all honesty, my personal feeling is that these products were needed because Windows itself had a lot of faults and problems. Yes, Windows has come a long way since those days, but it is still riddled with many problems of a different nature. Now there are many websites that do the same thing, basically scanning your system and cleaning out the junk. But not only do we face the issue of Windows getting clogged up; much of this cleaning software today is riddled with malware. Some malware even claims it will clean your Windows installation while in fact doing the opposite: infecting your PC, gathering your personal information and sending it to some unknown service, deleting your data, and so on. I will get into many examples later on. Even though we have newer versions of Windows, there are still many problems and issues apparent in the latest and greatest versions. Linux has its problems too, but somebody will usually find the time to put forth the effort and fix an issue very efficiently and quickly, whereas there are problems in some Microsoft products that have been there for years and remain unfixed. I will get into these details in a little bit.

[Image: The Windows XP "blue screen of death" (BSOD)]

I ended up on Windows XP in 2001 and have been using it on one of my computers ever since. I do admit that it has been the most stable operating system from Microsoft to date (other than Vista, which still has to prove itself over time). But over the past seven years of using XP I have seen many ugly things happen, too, including complete corruption of operating system files, where the system is rendered useless and will no longer boot. A lot of us have seen this happen on our own computers, which is sad to say. Crashes like these often end in the infamous "blue screen of death", the screen Windows displays when it crashes. And it is completely true that you will never see a real "blue screen of death" in Linux; the only time you will see one is if you use the famous X Windows screensaver called "BSOD" [6], which displays a series of Windows blue screens and crash screens. Linux just doesn't crash like this, and that is why the BSOD screensaver was created for Linux, to poke fun at Windows of course!

But can it really be true that the operating system in one hand (Windows) costs $100 or more, the operating system in the other hand (Linux) is completely free, and the free one works better? You bet, and there are many reasons for this, as I will get into. This is putting aside any hardware issues that would cause a malfunction of the operating system itself. Time and time again I have seen things just malfunction in Windows in ways that cannot be explained. Every day I witness this type of behavior, and what else is there to do other than shrug my shoulders and either close the error message or reboot the computer? Everybody who uses Windows knows about this, and most of us have become so accustomed to this behavior that we aren't even caught by surprise when it happens.

Earlier I mentioned that I have used Linux for specific purposes ever since I first plunged into the IT world, and I realized that I had never seen the kinds of strange issues with it that I had in Windows. Could it just be coincidence? At first you would think so, but after 11 years of running Windows on one hand and Linux on the other, Linux wins each and every time because it does not have such faulty quirks and bugs. But is Linux really that much superior? I would vote yes every time. And how can this be? My overall theory is this: Microsoft has a finite team of developers, and yes, they do identify as many problems with their software as they can, using feedback from users as well as their own internal testing. But in the Linux world you have more than a finite number of developers. Linux is open source, meaning that anybody who wants to contribute code can do so, in any aspect of Linux! With that type of design, you open the project up to an unlimited number of developers, and I do mean the world: thousands of talented people from countries all over the globe contribute to the Linux operating system and Linux applications, and the end product you get is a well-refined one. Yes, there are bugs, as nothing is perfect, but in just about every case where a bug is identified, it is fixed almost immediately, or somebody posts a workaround while a fix is worked on. The same applies to security glitches. With Windows you have a very large number of users of the product, so in theory you have more people looking for loopholes and security issues. When one is found, Microsoft applies its finite number of developers to the issue and gets it resolved, but this process can take days or even weeks. On the Linux side, an issue is often fixed in a matter of hours because of the vast number of contributing developers.

Next Section: A Little Politics

 


Click Here to continue reading about making the actual migration.

 

References

1. Wikipedia: Unix

2. Wikipedia: Linux

3. Wikipedia: Graphical User Interface (GUI)

4. Wikipedia: Linus Torvalds

5. Wikipedia: Richard Stallman

6. Linux BSOD (Blue Screen of Death) screensaver home page