Open Source Software vs. Commercial Software:
Migration from Windows to Linux
An IT Professional's Testimonial

 

Maintenance Headache of Windows

One of the most overlooked topics in any comparison of Windows and Linux is the high maintenance overhead of Windows: weekly security updates, strange anomalies that need fixing, management of third-party drivers and applications, frequent rebooting, a lack of built-in troubleshooting utilities, and much more. Not only are these things annoying, they cost valuable time and money as you run in circles. These differences are not apparent from looking at the covers of Windows and Linux; they only show up after using both operating systems day to day.

With anything, the simpler the design, the better. This rule applies to just about everything, and it should also be applied to your computer: the more complex the computer gets, the higher the chance of something going wrong with it. Take, for example, your login profile in Windows versus Linux. In Windows, your profile holds all of your application settings, operating system settings, and some application data. The profile folders usually live on the C: drive, in the "Documents and Settings" folder on Windows XP and in the "Users" folder on Windows Vista. In Linux, profiles are stored under the "/home" directory. The basic concept is the same on both systems: all of the user's data and settings live in these folders. Dig into the two folders in more detail, however, and you will find some vast differences.

In Windows, many of the files are regularly locked and in use, so they cannot be backed up completely unless you use Microsoft's Volume Shadow Copy service to get at them. In Linux, files are not routinely locked, so they can be backed up easily and completely without any fancy mechanisms. Another large difference is that Windows embeds internal identifiers, such as the SID (security identifier), into the profile and its settings. When restoring a profile folder from a backup (say, onto a spare or replacement PC after the original computer completely blows up), Windows can reject the profile, which means manually restoring individual files and folders to get all settings and data back in order. Restoring a Linux user profile, on the other hand, is about as simple as it sounds: place the entire folder under /home, and as long as the folder name matches the account name that is logging in, Linux will pick it up and use it. The one catch is that the filesystem permissions must be correct; the user and group ownership of the profile folder and every subfolder inside it must be set to that user's user and group. The same permissions concept applies to Windows profile folders as well, but with Linux there are no mismatched internal identifiers to worry about.

So, in the end, you can back up a Linux user's profile, build a separate computer with the same software installed, copy the profile over, and all of the software will automatically find its settings. I have done this many, many times with Linux computers, even across different versions of Linux and of the installed software, with great success. Windows relies on so many unique identifiers that the same process is very painful and, in most cases, simply does not work at all.

Also, don't forget that Windows stores user settings in the registry as well, which must also be backed up for complete disaster recovery. Linux has no registry: all user settings are kept in plain text files inside the profile folder. So, by backing up a Linux profile folder, you can be sure you are backing up everything for that user.
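To make this concrete, here is a minimal sketch of the kind of backup-and-restore workflow described above. The username "jsmith", the backup location, and the use of rsync are my own assumptions for illustration; any archiving tool would work just as well.

    # Back up the entire profile; the settings all live in plain text
    # files (mostly "dotfiles") inside the home directory.
    rsync -a /home/jsmith/ /mnt/backup/jsmith/

    # Restore onto a freshly installed machine with the same account name.
    rsync -a /mnt/backup/jsmith/ /home/jsmith/

    # Make sure the ownership matches the account that will log in.
    chown -R jsmith:jsmith /home/jsmith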

This is a clear example of Windows being more complex than it needs to be, while Linux provides similar functionality through a much simpler mechanism.

Over the years, Windows has grown more bloated with each release. Microsoft keeps throwing in new features to make it more appealing and increase sales. That is great up front, but bad in the long run once it starts to bog down the computer. Think about it: why do you need to upgrade your computer? Because it is running slowly, right? That is the most common reason to replace a computer; if it were still running happily, why would you replace it (short of a major hardware failure)? Stop and think how well the computer ran when you first got it, or when you saw the demo unit in the store years ago. It was fast, right? Some of this growth cannot be avoided: there is a natural progression in which the basic features of an operating system become more complex over time, and that is a good thing, because it makes computers easier to use and able to do more. Microsoft, however, goes well beyond this and piles on extra "fluff" for market appeal, and the result is "bloatware": software whose demands outgrow the hardware it runs on. Windows also has hardware limits that stem from its own code; 32-bit desktop versions of Windows, for instance, cannot use memory over 4 GB on the Intel architecture, while a 32-bit Linux kernel (with PAE enabled) can. I will go into detail on this a little later.

As I mentioned earlier, Linux is very modular in design. You can get Linux running with the bare essentials you need and add to it over time for additional functionality; it is almost an a la carte menu of components for your operating system. Part of the reason Linux is modular goes back to its Unix roots, where everything is laid out in a modular, well-structured fashion. Linux builds on those roots and adds the extra functionality we need to work in today's ever-changing environment. With Windows, the structure is set in place, but things do not always seem logically organized. I often find myself asking, "why is this set up like this?" I rarely ask myself that question with Linux.
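As a rough illustration of that a la carte approach, on a Fedora-style system you can start from a minimal install and pull in only the pieces you want with the package manager. The package and group names below are just examples:

    # Add a web server only when you actually need one.
    yum install httpd

    # Later, add a desktop environment or a single application.
    yum groupinstall "GNOME Desktop Environment"
    yum install gimp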

On countless occasions I have also found that the Linux version of a piece of software simply works better than the Windows version, and this is typically true of open source software. Take VNC Server: in Windows you have to deal with video drivers and a whole bunch of settings and hope they all work together, while on Linux, VNC Server ties directly into X11 (the X Window System) and does everything with built-in functionality, without the added drivers needed in Windows. Likewise, getting my old Microsoft SideWinder gamepads working in Windows was a huge headache, especially in Windows XP, because the gamepads were designed for Windows 98. Time and again they would stop working entirely until I uninstalled and reinstalled them. When I tried the SideWinder gamepads in Linux for the first time, they just worked, and they have ever since.
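For what it's worth, checking whether Linux has picked up a gamepad takes only a moment. This is a small sketch, assuming the joystick utilities are installed and the pad shows up as the first joystick device:

    # The kernel exposes detected gamepads as /dev/input/js* devices.
    ls /dev/input/js*

    # Watch the raw axis and button events to confirm the pad works.
    jstest /dev/input/js0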

People are used to Windows, so when they first look at Linux it all seems foreign; it is a bit like fear of the unknown. Having administered both Windows and Linux systems for many years, I can testify that once you dig in and see how things work, the Linux model is simple, well designed, and easy to grasp. In my opinion this also makes Linux much easier to adapt to various applications, and easier to troubleshoot if and when things go wrong. It is different from Windows, though, so newcomers should keep an open mind before jumping in.

 

Windows is like a nice basic toolbox. It is shiny on the outside and looks appealing. It has a few wrenches, some sockets, and a couple of sets of pliers. But it may be missing tools you need to get the job done, so you might need a trip or two to the store for the missing pieces.
   

Linux is like a full tool chest with the tools to do everything you need and probably more. It might not look as nice and shiny on the outside, but it includes a vast set of sockets, pliers, hex wrenches, and a wide variety of tools of all sorts. Everything is included to get the job done.

 

Going Back to the Roots: The Command Line

Over time, operating systems have obviously evolved and are more complex today than ever. As the Windows example above shows, sometimes they are pushed too far and become bloated. But do they need to be this complex? That is a question I often ask myself.

The first Unix terminals were undoubtedly driven from a command line. The command line sits at the root of any operating system; it is the layer closest to the kernel, offering direct access without fancy graphical interfaces. It is still used heavily today in Linux and Unix, and it was at the core of DOS and Windows 9x. Windows NT migrated away from it, and now it is back in Vista, 7, and Server 2008 as "Windows PowerShell". Wait... "shell"? The word "shell" was borrowed from Unix, where it is simply another term for the command-line interpreter. Many people are not comfortable using a command line or shell, but it is a skill anybody using a computer should have. In the days of DOS it was required just to use the computer, and people learned it and used it. Windows has spent so many years covering up the command prompt that many users have never learned the skill.

In Linux, it is very difficult to get along without the command prompt, or shell. Yes, one can argue that Linux can be set up and run quite well without ever touching a command prompt, thanks to developers building nice graphical front-ends. But keep in mind that most graphical applications are just that: front-ends that do all of their work at the level of the command line. The command line opens up unlimited possibilities. For one, Linux uses it as a central point of entry for administering systems remotely. A service called "ssh" (secure shell) is one of the most powerful tools available: with it, users can securely administer and check on a system remotely, using very little bandwidth (the data transferred is only text, not images), and without ever interrupting the user currently logged in at the machine. Windows has no comparably secure mechanism built in.

[Screenshots: a Linux shell showing the running processes of a remote Linux system; editing a text file on a remote Linux system; viewing files and permissions on a remote Linux system.]
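The screenshots above correspond to everyday ssh sessions. Here is a minimal sketch, assuming an account "admin" on a host called "server1" (both names are hypothetical):

    # Check the running processes on the remote system.
    ssh admin@server1 'ps aux | head -20'

    # Edit a configuration file remotely (the -t flag allocates a
    # terminal so a full-screen editor works over the connection).
    ssh -t admin@server1 vi /etc/hosts

    # View files and their permissions in a remote directory.
    ssh admin@server1 'ls -l /var/log'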

 

A One Way Street

For some reason, Microsoft has been known to provide functionality that works the opposite of what one would expect, almost backwards and counterproductive. When we run into something like this, many of us have come to grips with the fact that this is just the way it is in Windows, and have trained ourselves to deal with it. Take something as simple as the "Remote Desktop" feature of Windows XP, where a remote user connects to a Windows XP machine, supposedly to help another person with a problem. Anyone who has used it knows the reality: it is worthless for one person remotely helping another, because the remote user cannot see the desktop of the person currently logged in at the console; connecting simply takes over the session. What, are you serious? Unfortunately, I am. To this day I do not see how that could ever be helpful for remote assistance, or why Microsoft implemented the feature in a way that is so unhelpful.

Luckily, there is an open source program called VNC that shares the screen of the person currently logged in, so a remote user can see the same desktop and work with that person directly. This is yet another example of an open source program coming to the rescue. In Linux, VNC comes standard with most installations and actually offers two different ways for a remote user to connect: one interacts with the currently logged-in user's session, and the other connects as a separate, independent session (much like Remote Desktop in Windows). Beyond that, the X Window System (X11) allows applications running on a remote machine to present a full desktop to a local user, similar to a thin client or terminal services. That is closer to the model of Microsoft's Remote Desktop and Remote Assistance, which is normally used in client/server scenarios rather than for remote assistance. Rather than building on a proven technology such as VNC, Microsoft chose to implement its own mechanism, and once again it was outdone by an open source product that provided more functionality.
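On Linux, both styles of connection described above take only a couple of commands. This is just a sketch, and it assumes the x11vnc and vncserver/vncviewer packages are installed (not every distribution ships all of them by default):

    # Share the desktop of the user currently logged in on display :0,
    # so a remote helper sees exactly what the local user sees.
    x11vnc -display :0

    # Or start a separate, independent desktop session (more like
    # Windows Remote Desktop), served on display :1.
    vncserver :1

    # From the remote machine, connect to either one with a viewer.
    vncviewer server1:0     # shared session
    vncviewer server1:1     # independent session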

Another example of Microsoft straying off into its own land takes us back to one of the core features of any operating system: the command prompt. Microsoft started out with a command-line operating system (otherwise known as DOS), migrated to Windows and tried to get rid of the command line, then reunited with it in Windows Server 2008. It is almost as if Microsoft forgot that the command line should be part of the operating system. The idea of moving away from something and then bringing it back is nothing new. As I have mentioned, I have been a Linux user since 1997, when the king of the Linux world was Red Hat 4.2, and I can honestly compare Red Hat 4.2 with Fedora 10 (the latest and greatest as I write this) and say that there are a lot of similarities, one of which is the very same command-line interface. Microsoft has finally rediscovered the usefulness of the command line and has recently developed "PowerShell", a Unix-like shell with powerful commands. Those of us in the Unix/Linux world have had a shell this powerful for years. After all, the earliest days of Unix were spent entirely within a command-line environment; not until the mid 1980s, when the X Window System was invented, did Unix have windows at all. Yet even today, Unix and Linux remain tightly integrated with the command line, which has proven itself over and over.

I have already touched on the web browser war between Internet Explorer and Mozilla Firefox, but there is another side of Internet Explorer, involving Microsoft's "one way street" politics, that I should mention. It is most evident in the newest release, Internet Explorer 8. At first glance, Internet Explorer 8 looks very similar to Firefox 3. Behind the scenes, however, there are serious problems: Internet Explorer 8 does not maintain backwards compatibility with pages built for earlier versions of Internet Explorer. Haven't we been through this with Microsoft products before? Yes, quite a few times. As soon as Internet Explorer 8 started reaching users, many of them began complaining to webmasters and web designers that their websites no longer worked. Naturally, each side pointed the finger at the other: the webmaster blaming the user for using Internet Explorer 8, and the user blaming the webmaster for not designing a website compatible with it.

At first glance, it seems Microsoft released the new browser hoping that the maintainers of websites would recode them to be compatible. Should it really be this way? In my opinion, no; that is putting the cart before the horse. The browser should be backwards compatible with existing websites. This follows a basic principle of software migration: create a new version, but maintain backwards compatibility with the older ones. What is even more of a shocker is that Microsoft knew this would be a problem before it released Internet Explorer 8; that is why Internet Explorer 8 ships with a backwards-compatible "Compatibility View" mode that emulates Internet Explorer 7. So one could say Microsoft covered its bases, because the newest browser includes emulation of the older version. That sounds great up front, but it gets messy: the user has to switch into the backwards-compatible mode for each and every website that needs it. So whose fault is this? I have to point the finger at Microsoft, for releasing a web browser that is not backwards compatible by default. Users of Mozilla Firefox should be quick to back up this accusation: how often has a website that worked in a previous version of Firefox stopped working in a new one? Hardly ever. Firefox is very backwards compatible with its previous versions; websites that render just fine in Firefox 2.0 still render just fine in Firefox 3.0. Part of the reason, I believe, is that Firefox is not commercial, so it has not piled on ground-breaking new features and interface changes for their own sake. Instead, each version has brought improvements over the previous one, adding useful features while maintaining compatibility. And as I already mentioned, Firefox is gaining market share over Internet Explorer; the facts speak for themselves.

On top of straying off into its own land, Microsoft leaves remnants behind for third parties to pick up. For instance, Microsoft developed its .NET Framework to work only with Windows. Other companies and individuals came along, picked up the slack, and successfully ported the .NET Framework to other platforms. The "Mono Project" was started by volunteers to bring the .NET Framework to platforms such as Mac OS X, Linux, and Unix. Most recently, the Mono Project has even tackled getting the .NET Framework onto the Apple iPhone [1], which is interesting, since the iPhone is a main competitor to Microsoft's own Windows Mobile phones. You could argue that Microsoft knows Windows and cannot build for every platform, but it could at least develop the .NET Framework for the big three: Windows, Mac OS X, and Linux, the main players in the operating system market. Even Java is developed for those platforms, because Sun knows that being as compatible as possible is a good thing. It is obvious, though, that Microsoft does not want cross-platform compatibility; it wants its users to use Windows exclusively. This is where Microsoft's politics hurt consumers. Microsoft picks and chooses which products it develops and for which platforms; its Silverlight technology, for instance, is developed for Windows and Mac OS X. So there are definite inconsistencies in which platforms Microsoft targets. My personal theory on why Microsoft does not develop the .NET Framework for other platforms is that it wants to control the market and persuade people to keep Windows. If applications are originally created on Windows using the .NET Framework, it would be a huge pain, if not impossible, for somebody to migrate to Mac OS X or Linux and keep those applications. This is part of Microsoft's vendor lock-in game.
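As a quick illustration of what the Mono Project makes possible, here is a sketch of compiling and running a trivial .NET program on Linux. It assumes the Mono toolchain is installed; "gmcs" is Mono's C# compiler for the 2.0 profile.

    # Write a tiny C# program.
    cat > Hello.cs <<'EOF'
    using System;
    class Hello {
        static void Main() {
            Console.WriteLine("Hello from the .NET Framework on Linux");
        }
    }
    EOF

    # Compile it with Mono's C# compiler and run it on the Mono runtime.
    gmcs Hello.cs
    mono Hello.exe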

In my opinion, having used Microsoft products for years, Microsoft has been experimenting with each version of Windows and carrying its customers along for the ride. Why not stick to what is known to work, and stick to the standards? Sticking to standards is not the Microsoft way, which is one more reason to migrate away and stay with industry standards. Linux is consistent, and that consistency makes upgrading from version to version a small jump, without huge hurdles, learning curves, retraining, and so on.

Upgrade Paths

Windows and Linux have very different reasons for releasing new versions of software. Microsoft likes to add bells and whistles to Windows, make it prettier, and put it on the market to sell to its customers. The whole purpose is to make it appealing so that customers will buy it; added functionality is a bonus of upgrading. Keep in mind the fundamental reason new versions of Windows are released: to make money for Microsoft. If nobody buys the software, Microsoft makes no money, so all the effort goes into making customers want to buy it.

Now, let's look at Linux. Here, nobody makes a profit when a new version of most Linux distributions is released. Take, for example, Fedora Linux, which I use; it is the free, community distribution sponsored by Red Hat, and a new release comes out every six months. That may seem frequent, so let's examine why new versions appear. All of the applications included with a distribution are compiled against the libraries that ship with that release. From time to time the libraries are upgraded, and the applications that use them must be recompiled. When the libraries change to that extent, a new release of the distribution is born, with all of the applications compiled to work with that particular version. This may sound bad, but keep in mind that any application can be compiled for any distribution, because the source code is available to anyone who wants it. The result is a much smoother transition between versions of Linux. The availability of source code also gives users the freedom to keep using an older distribution and compile their applications for it themselves. I will get into this topic in further detail later.
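You can see this library coupling for yourself. The sketch below shows how to inspect which shared libraries an application was compiled against, and how a source package can be rebuilt for whatever release you happen to run; the binary and package names are only examples.

    # List the shared libraries a binary was linked against.
    ldd /usr/bin/gedit

    # Rebuild an application from its source RPM for the current system,
    # so that it links against the libraries installed here.
    rpmbuild --rebuild some-application-1.0-1.src.rpm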

It is inevitable that software will eventually have to be upgraded. You can drag your feet as long as you like, but eventually you will need newer versions of your operating system and applications; it is not always practical to recompile every Linux application you want to install just because your distribution is old, and with Windows, Microsoft will eventually drop support and patches for older versions, forcing you to upgrade if you want to stay secure. The path from an old version to a new one is not always a cakewalk; in fact, it is often just the opposite, and keeping the upgrade process streamlined is a real challenge for software vendors. Both Windows and Linux can have bumpy rides when upgrading the operating system. However, Linux versions are released quite often, which keeps the hurdles small if you upgrade release by release as you go. Otherwise, you can find yourself in the same boat as Windows, facing huge hurdles that can make an in-place upgrade a complete failure. In that case a "migration" is needed: a fresh copy of the operating system is installed, data is copied over, and every additional piece of software on top of the operating system must be re-installed, which can lead to hours and hours of headaches. Microsoft's new operating system, Windows 7, requires a migration if you are coming from Windows XP or older. Since most of the market at the time of this writing is still on Windows XP, the majority of Microsoft's customers will have to migrate rather than do an in-place upgrade. What does this mean? Basically, hours of work to get from Windows XP to Windows 7; the two operating systems are so different that they are essentially independent of one another. Linux, by contrast, does not go through these complete rewrites, and upgrades from one version to the next are not as drastic.

A lot of Windows users have expressed extreme disappointment with the Windows XP to Windows 7 migration path, since there is no direct in-place upgrade. They have voiced their opinions to Microsoft, but Microsoft has so far ignored them; there is still no in-place upgrade path. Why Microsoft did not provide a direct path is open to question, but many speculate that it did not want to repeat the nightmares of the XP-to-Vista upgrade path, which was riddled with problems. By forcing customers to install a fresh copy of Windows 7 and migrate their data as a separate step, you get a cleaner system to start with, even though it is more work, and you avoid dragging problems from the old Windows XP installation into Windows 7. This is largely because Windows differs so greatly from version to version. Even veteran Windows bloggers such as Paul Thurrott (who runs the SuperSite for Windows blog) have posted comments and workarounds for upgrading.

Paul has also voiced his opinions on other licensing issues with the Windows XP to Windows 7 upgrade [2]. Microsoft originally advertised special pricing for Windows 7 upgrades for those willing to pre-order, touting an upgrade path that supposedly existed at the time. To me this was again putting the cart before the horse: the product was being rushed to market without further information being provided. Paul posted workarounds on his blog demonstrating how to upgrade and how to get around issues with old Windows media and license keys to get a copy of Windows 7 running, and stated:

"I'm just trying to support the millions of people that Microsoft fooled into pre-ordering Windows 7 by offering steep discounts, only to discover later that the Upgrade version they purchased unknowingly might not actually install properly.... And for the nth time, you could (and should) have clearly documented how this works months ago. Or allowed myself and others to do so. You chose to ignore this need. So this is a problem of your own making. It's that simple."

Microsoft's Eric Ligman responded to Thurrott's posts, which were helping users caught in these upgrade debacles, with:

"When these posts and write-ups state that you can install clean from an Upgrade piece of software and they fail to mention that you need to own a qualifying software license to be legal to use the Upgrade software for the installation, they give the impression that because it is technically possible, it is legal to do."

In other words, Ligman was concerned that the upgrade instructions provided by Thurrott might not be legal, even though they were helping Microsoft customers who already owned computers with Windows to install Windows 7. From this perspective it is clearly evident that Microsoft cares only about its bottom line, and not about its customers' experiences. Couldn't Microsoft forget about its bottom line for once and try helping its customers, even after the sale? Again, this is the typical sales story. The salesman makes the sale; is he around afterwards to support the customer? Sometimes not. Once the salesman has the customer's money, he slips away, never to be seen again. A salesman who wants repeat business should stick around. Microsoft certainly wants return business, but it has other ways of ensuring that, such as vendor lock-in.

On the subject of upgrading, I won't go into too much depth, since both Windows and Linux have complicated paths for accomplishing it, and too many factors affect the success rate, chiefly the software installed on the computer. I would, however, like to briefly describe the two ideologies of upgrading. Start with Windows: upgrading from one version to the next is sometimes possible, except for device drivers, which in most cases cannot be carried over. Often, newer versions of the additional software on the computer must also be obtained, sometimes requiring the user to purchase them. Linux upgrades from one release to the next a little differently. Linux distributions are released fairly often, so the jump from one to the next is not too drastic: the installer looks at all of the software packages installed and upgrades them, one at a time, automatically. This is quite different from Windows, where the user has to go back and obtain or buy updated versions, and it is possible because all of the software comes from the Linux distribution itself. As for drivers, nearly all of them are part of the Linux kernel. When upgrading, a newer kernel is installed and its own drivers are used, so in most cases there is nothing for the user to do: the newer kernel runs all of your devices just like the old one did. Only in the rare case of third-party drivers can a hitch come up, and it is rare precisely because most Linux distributions include only the drivers that ship with the kernel. One example of a third-party driver is nVidia video card support: the full-featured driver is released by nVidia itself and is not part of the kernel, so when upgrading a Linux system with an nVidia card, the user must install the nVidia driver built for the newly installed kernel.
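A rough sketch of what that looks like in practice on a Fedora-style system; the nVidia step assumes the driver comes from a third-party repository, which is one common arrangement:

    # Update every installed package, including the kernel, in one pass;
    # the package manager resolves the dependencies automatically.
    yum update

    # After a kernel upgrade, a third-party driver such as nVidia's must
    # be reinstalled or rebuilt against the new kernel, for example:
    yum install kmod-nvidia          # packaged driver from a 3rd-party repo

    # Reboot into the new kernel once everything is in place.
    reboot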

Design Flaws

So how do open source and closed source software compare, apples to apples? It varies greatly. In my experience, commercial software is almost always readily available for any task you can imagine; with open source, there has to be a need before somebody sits down and writes an application. Often an open source application starts out modeled on an existing commercial application. The key point, though, is that the open source project frequently takes the ideas and functionality of the closed source version and runs away with them, adding further enhancements, functionality, and more logical design. This comes from the vast pool of developers behind open source, versus the limited team behind a closed source application. You simply get more feedback and more input with open source software, which broadens the possibilities immensely.

A common flaw in Windows goes back to the heart of the operating system: the kernel itself. Windows is very dependent on the hardware it is installed on, which limits flexibility between computer systems. Installing Windows on one computer ties it to the exact hardware in that system; if the hardware changes significantly, the system may no longer boot, producing the famous "blue screen of death", or BSOD. It took Microsoft roughly fifteen years, counting from the early versions of Windows in the early 1990s, to address this central flaw in its kernel: Windows Vista, released in 2006, finally shipped with a kernel that can cope with multiple hardware types, or HALs (hardware abstraction layers). What most people do not know is that the stock Linux kernels shipped by vendors have been compatible with multiple machine types right out of the box for years. In most cases, you can pull the disk out of one Linux PC or server, put it into another, and it will boot. Yes, some devices such as network cards have to be reconfigured, since the operating system cannot guess their settings, but a booting system is better than a non-booting system. Linux can do this because the kernel carries everything it needs, including the drivers and software to run the devices, so moving a hard disk between systems usually is not an issue. Problems only arise when moving between machines of genuinely different architectures (for example, from a 32-bit x86 machine to a 64-bit or non-x86 machine).

[Screenshots: a critical Microsoft Outlook 2007 error caused by a corrupted PST file, and a single corrupted PST file with folders and emails all mixed up in Outlook 2007.]

The same goes for adding or removing lesser pieces of hardware on a computer or server running Linux. For instance, you can remove a network card and replace it with a completely different model, and chances are it will just work. There is no need to hunt for drivers, because, again, the drivers are already built into the kernel.
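After swapping a card like that, a quick look at the kernel's own messages will usually confirm that a built-in driver picked it up. A minimal sketch:

    # See which driver the kernel loaded for the new card.
    dmesg | grep -i eth

    # List the kernel modules currently loaded (the network driver
    # will appear here, with no separate installation needed).
    lsmod | less

    # Bring the interface up and confirm its settings.
    ifconfig eth0 up
    ifconfig eth0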

I have included various examples of commercial software that is faulty or limited in functionality, but there are also examples of software with faulty design. Take Microsoft Outlook. One of the design oddities I have never found a good answer for is the way Outlook stores mail on the local hard disk: everything goes into one single file (the PST file) that can grow, but that must be compacted manually in order to shrink. For instance, if you had 1.5 GB of mail in your PST file and one day decided to delete some old mail, the PST file would stay at 1.5 GB unless manually compacted.

There are many disadvantages to storing all mail in one single file the way Outlook does. Corruption, for one: if this file becomes corrupted, the entire contents can potentially be lost in one sweep (Microsoft ships a repair tool, "scanpst.exe", just to fix broken PST files). Keeping all of the mail in one file also causes problems when the mail is stored on a network drive or a file server: Outlook has to open the entire file just to get at one piece of it, so opening it over a slow network connection causes Outlook to become unresponsive or freeze. And as the PST file grows, backups become inefficient, since the whole file must be backed up to capture any change. Every other mail program I have encountered stores mail in individual files, usually one file per folder: the Inbox in its own file, Sent Mail in its own file, and so on. That avoids losing the entire mail store to one corrupted file, allows for more efficient backups, and seems more logical for management and speed, since the operating system handles many small files better than one large file. To me, Outlook's mail storage is illogical, and I often wonder why it was designed that way. As I have mentioned, Microsoft has a tendency to do things its own way.
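For comparison, this is roughly what the one-file-per-folder layout looks like in a program such as Mozilla Thunderbird on Linux; the exact profile path varies per installation, so treat the path below as an assumption:

    # Each mail folder is its own file (mbox format), so corruption or a
    # backup only ever touches one folder at a time.
    ls -lh ~/.thunderbird/*/Mail/Local\ Folders/
    # Inbox   Inbox.msf   Sent   Sent.msf   Trash   Trash.msf ...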

 


References

1. Mono Project opens .NET Framework on iPhone

2. Paul Thurrott, SuperSite for Windows blog