5 Things Linux does better than Windows

Status
Not open for further replies.
ESX is a VMware distro of Linux. OMG, it is the BEST thing I have ever seen in terms of server management.

It's a VM Host OS and to call it "potent" is an understatement.

I just installed an XP VM in 15 minutes while 2 other VMs were running AND the whole thing was being backed up!

Plus the fact that you can Snapshot your VM and revert back to that snapshot in about 30 seconds if something catastrophic happens is Über-awesome times 43.

Just yesterday I was working on a mission-critical server and I completely muffed it up the first go, so I just flipped the little switch and in 2 minutes flat it was completely reverted to its status of 10 minutes prior.

The sheer awesomeness of this thing makes me want to **** in my pants like 20 times. You have no idea.
 
zmatt, I've enjoyed the logical points that everybody is making about the pros and cons of both the Linux and Windows platforms. But taking the stance that Windows only releases things that are ready, whereas Linux releases sloppy distros every 6 months, is so far from accurate it's unreal.

So I'm going to respond with one word that'll completely break down the point you just made.


"Vista"


Yes, yes, I know it's not so bad now, and I even run it now with decent success, but it's also 3 years old. When it was released, the majority of the Microsoft market wouldn't touch it with a 10-foot pole - and for good reason, too. Also, keep in mind: 9.04 was released a few days ago, but Ubuntu's developers are allocated so that some are looking far into the future (beyond 9.10) while others have been concentrating on 9.10 for the last year. They work a lot farther ahead than you think. If 9.04 came out, they took a week off, then went back and tried to slap together 9.10 in 6 months, I could see your point. But that's not how things are. Things are organized to a much higher degree than you think.

Speaking from experience testing operating systems and software on spare computers, I can say that from what I've seen, Linux beta releases are often as stable as Microsoft's finalized products. Every time I install an Ubuntu beta release on my spare rig and test it out, I wonder how it's even considered beta. Take it as you want, argue it as you want, but it's what I've experienced, so... it is what it is.

I'm not really seeing how saying certain Ubuntu releases have been more like service packs is a bad thing. It doesn't cost anything. It's a simple download, the same way an actual service pack for XP/Vista is. If the distro cost 280 bucks like Windows does, and it was pumped out every 6 months, all right... I'd tack a -1 on Ubuntu then. But that's not how things are.

At the end of the day, use what works. The reality is, all 3 main platforms work. I own XP Pro/Vista Ultimate/Mac OSX/Ubuntu 8.10 + 9.04 at home. I use them all for different reasons. It's how you utilize them to benefit your needs that makes an OS suddenly turn a golden color in your eyes. Some people's slice of cake is that of Microsoft, others Mac, etc., etc.


Dude, I have three words for you

Nvidia and Creative

Many companies, especially those two, failed to make decent Vista drivers at launch. To be fair, you can attribute a lot of that to poor planning and management on M$'s part. But a great deal of the blame goes to an industry that wasn't willing to prepare for Vista, and this is often overlooked.

But I don't think you can judge all of M$'s performance on one OS. The perfect counterpoint to the Vista argument is Windows 7. Puddle Jumper has been sitting here across the table from me and says that the hard drive on his laptop hasn't been accessed in 3 minutes. Aside from the massive speed increase in Windows 7, it also intelligently installs the needed third-party drivers. Almost everything worked out of the box. Linux can't do that.

All of M$'s OS releases, aside from Vista to an extent and ME, have been polished and worked fairly well. Ubuntu, on the other hand, always has some beta feature in a major release. That is unacceptable. When you hit a major release, everything should work. If it's not ready, then it needs to wait until the next release. Didn't Ubuntu have an issue a few years back where it killed laptop hard drives due to its pre-fetching?


ESX is a VMware distro of Linux. OMG, it is the BEST thing I have ever seen in terms of server management.

Actually, it's not a distro. It starts with a Linux kernel as the first VM, but it runs on a proprietary microkernel. If it ran the Linux kernel it would have to be free, which it isn't, because the GPL is a viral license. They could use BSD, which is more permissive, but it is impossible for a Linux-based OS to be proprietary.

Also, I think it's a bit misleading to say that an OS designed to run VMs is good at running VMs. Of course it's good at it; if it wasn't, it would be a total failure. And the VMware products are pretty well made. This OS could never be a desktop OS, as it is too niche.
 
Getting back to package management, I am a big fan of the Windows .exe system. In Windows, dependency **** is almost a non-issue. The only dependency problems I have encountered are with the M$ C++ redistributable and the .NET Framework, and that's because a lot of programs use them. Other than that, it's a non-issue. I can't tell you how many times I have had dependency troubles in Linux: program X needs Lib Z v2.01 and program Y needs Lib Z v2.02, and you can't have both versions at the same time. WTF!?! In this regard I think the Linux community is like Apple; whenever someone brings up this problem they all stick their fingers in their ears and scream LALALALALALALA.
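For anyone who hasn't hit it, here's a toy sketch of that exact conflict - two programs each linked against a different exact version of the same library, on a system that only provides one copy. All the program names, library names, and version numbers here are invented for illustration:

```shell
#!/bin/sh
# Toy illustration of "dependency hell": two programs that each demand
# a different exact version of the same library, where the system can
# only hold one installed copy at a time.

LIBZ_VERSION="2.01"   # the one version of "Lib Z" the system provides

run_program() {
  # $1 = program name, $2 = the Lib Z version it was linked against
  if [ "$2" = "$LIBZ_VERSION" ]; then
    echo "$1: OK"
  else
    echo "$1: needs Lib Z $2, found $LIBZ_VERSION - cannot start"
  fi
}

run_program "program-x" "2.01"   # happy: versions match
run_program "program-y" "2.02"   # broken: wants the version we don't have
```

Whichever version you install, one of the two programs breaks - that's the complaint in a nutshell.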

I'd like to know what you were doing that you ran into dependency issues. Were you trying to install a KDE application in Gnome or something?

When you install a package, whether via the terminal using apt-get or via the Synaptic Package Manager, everything is already set up to download all of the dependencies needed. So when you run sudo apt-get install amarok, you're pulling in all of the necessary libraries and dependencies automatically. To go back to the original post, to say Windows has any sense of package management is nothing short of a joke.
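The resolution step a tool like apt-get performs can be sketched as a toy script - the dependency table here is made up, and real apt also handles versions, conflicts, and the actual downloads, but the recursive walk is the core idea:

```shell
#!/bin/sh
# Rough sketch of dependency resolution: given a package, install its
# dependencies first (depth-first), skipping anything already installed.
# The package names and the dependency table are invented for this demo.

deps_of() {
  case "$1" in
    amarok) echo "libqt taglib" ;;
    libqt)  echo "libc" ;;
    taglib) echo "libc" ;;
    *)      echo "" ;;
  esac
}

INSTALLED=""

install() {
  # Skip packages we've already handled, like apt's "already installed".
  case " $INSTALLED " in *" $1 "*) return ;; esac
  # Recurse into dependencies before installing the package itself.
  for d in $(deps_of "$1"); do
    install "$d"
  done
  INSTALLED="$INSTALLED $1"
  echo "installing $1"
}

install amarok
```

Running it prints the packages in dependency order, with shared dependencies (libc here) installed only once - which is why one apt-get command can pull in a whole stack of libraries without duplicating any.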

I'm not saying that dependency issues are nonexistent; there are times right after a distro is released when a dependency isn't pulled properly, but each time I ran into that, it was fixed in a matter of a day or two.

And no, the Linux community is not like Apple. I'm not debating that - I'm simply telling you that. Have you even been to the UbuntuForums? My gosh, there are actual developers there who spend a good portion of their day helping out other people who are having issues, are just starting on Ubuntu, etc.

Think of it like this... take Mak, the godfather of Microsoft software, who helps out countless people on TF all day long, myself included when I've run into issues. Replace that Windows side with Ubuntu, and right there is an example of a lot of the people on the UbuntuForums.

If it weren't for the dedicated posters on the forum who helped me out when I was trying to adjust from my Microsoft know-how and get to know Ubuntu, I wouldn't have even bothered learning it.
 
When you install a package, whether via the terminal using apt-get or via the Synaptic Package Manager, everything is already set up to download all of the dependencies needed. So when you run sudo apt-get install amarok, you're pulling in all of the necessary libraries and dependencies automatically. To go back to the original post, to say Windows has any sense of package management is nothing short of a joke.

Like I said, that's an unfair comparison. Windows doesn't have package management support because it doesn't need it. The system of self-contained programs makes it immune to dependency ****. Your argument is like blaming the Ferrari F50 for poor off-road performance. It's a race car!

Package management is antithetical to an .exe-based system. In fact, I would argue that it could hamper commercial applications. If you have to rely on everyone else's libraries, it would be a nightmare for companies. They couldn't host other companies' libraries online in a repository. It would never work. Fully contained programs work best on a commercial OS, and I would argue they work best overall. It's simpler, more intuitive, doesn't have the issues of package systems, and makes manual program installation and initialization much easier. If, for some reason, launching a program from the menu in Windows doesn't work, I can track down the .exe in the file system and manually start it. You can't do that in Linux, because the .exe doesn't exist.

And no, the Linux community is not like Apple. I'm not debating that - I'm simply telling you that. Have you even been to the UbuntuForums? My gosh, there are actual developers there that spend a good portion of their day dedicating their time to help out other people who are having issues, just starting on Ubuntu, etc.

No, they aren't like Apple; I'm not saying they are. What I am saying is, they don't take outside observations well, and like Apple they are so xenophobic and elitist that they ignore real issues. And yes, I have been to the Ubuntu forums; I don't know how many times I have been told, "You are on your own, may God be with you." If you can get Flash support working in the PowerPC release of Ubuntu, I'm all ears. Nobody in the Ubuntu forums can, and Saxon can't either. All Gnash does is give me ads; YouTube is a no-no.
 
I just did a little benchmark on my XP VM versus my real XP desktop:

           VM       Real    Units
CPU   13540.5    3820.25    DacriMarks
RAM      4231       3270    MB/s
HDD     34.53      53.15    MB/s

The VM's CPU score is through the roof - even while sharing with other machines - and the server-grade memory makes a difference too, but HDD sharing while replicating is the only disadvantage for the VM.
 
What are the specs of the server versus the desktop? I'm assuming that ESX has good multi-CPU support, so any half-decent server should clobber a desktop with XP. ESX will intelligently handle the multithreading for XP, whereas real XP's multi-core support is practically nonexistent.
 
What are the specs of the server versus the desktop? I'm assuming that ESX has good multi-CPU support, so any half-decent server should clobber a desktop with XP. ESX will intelligently handle the multithreading for XP, whereas real XP's multi-core support is practically nonexistent.

I don't have the full specs of the server, I just know that it's a brand new HP G5 with a Quad Core and 16 GB of RAM.

My point was that ESX was able to manage those resources so extraordinarily well that an XP VM outperforms a regular desktop - even when the ESX host is doing other things.

My desktop is a dual-core OptiPlex 745 with 1 GB of RAM.

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

I wonder if I could run F@H on my XP VM..... LOL. That might get me in trouble.
 
Dude, I have three words for you

Nvidia and Creative

Why would you mention Nvidia? Nvidia has had better Linux support than ATI from the start; ATI was the one lacking Linux drivers for quite a while. However, both Nvidia and ATI have come a long way in getting video drivers in and working the way they should. In fact, my hat goes off to them; they've both done a tremendous amount of work. I've used Nvidia with Ubuntu since the beginning with flying success. The one time I used ATI with Ubuntu, I had great success too.

But hey, while we're at it, let's throw a curve ball your way.

I had a Creative sound card that didn't work in Vista either. So right there in that particular example, the door swung both ways.

Many companies, especially those two, failed to make decent Vista drivers at launch. To be fair, you can attribute a lot of that to poor planning and management on M$'s part. But a great deal of the blame goes to an industry that wasn't willing to prepare for Vista, and this is often overlooked.

Oh definitely. I'm not dogging Vista for not supporting hardware. But when people say "It works in Vista but not Ubuntu. Go Vista!" it makes me sick because it's so contradictory.

But I don't think you can judge all of M$'s performance on one OS. The perfect counterpoint to the Vista argument is Windows 7. Puddle Jumper has been sitting here across the table from me and says that the hard drive on his laptop hasn't been accessed in 3 minutes. Aside from the massive speed increase in Windows 7, it also intelligently installs the needed third-party drivers. Almost everything worked out of the box. Linux can't do that.

Actually, yes... yes it can. That's the point behind the way the Linux kernel is constructed: things are built as modules within the kernel itself, which lets you boot up and have things just work. I just installed Ubuntu 9.04 on 3 computers last night: two laptops of different models and a desktop. All of them worked. The only thing I had to do, on one of the laptops, was activate the wireless card. But hey, what's 1 button to click? :p

All of M$'s OS releases, aside from Vista to an extent and ME, have been polished and worked fairly well. Ubuntu, on the other hand, always has some beta feature in a major release. That is unacceptable. When you hit a major release, everything should work. If it's not ready, then it needs to wait until the next release. Didn't Ubuntu have an issue a few years back where it killed laptop hard drives due to its pre-fetching?

Well, you're one of the few. I'm not going to bother trying to weigh my good experiences against your unfortunate experiences with Ubuntu. I have had great success with it, and keep in mind, I have several rigs here at work that run Ubuntu, both as servers and desktops. So far I haven't had any major problems worth noting.

Like I've said many times, you have to use what works for you. For me, it screams Linux. I can't tell you how much easier Linux operating systems, utilities, and open-source software have made my job.
 
Package management is antithetical to an .exe-based system. In fact, I would argue that it could hamper commercial applications. If you have to rely on everyone else's libraries, it would be a nightmare for companies. They couldn't host other companies' libraries online in a repository. It would never work. Fully contained programs work best on a commercial OS, and I would argue they work best overall. It's simpler, more intuitive, doesn't have the issues of package systems, and makes manual program installation and initialization much easier. If, for some reason, launching a program from the menu in Windows doesn't work, I can track down the .exe in the file system and manually start it. You can't do that in Linux, because the .exe doesn't exist.

I don't mean to sound like a jerk, but have you used Ubuntu extensively at all? You certainly can launch programs in Linux without clicking the actual program file. Any installed program can be executed from the terminal. In fact, this is what you do when you're troubleshooting a program that's acting funny: when you launch a program from the terminal, the terminal displays a running report of what the program is doing, and if there are any errors, you can see them there. When the program crashes (assuming that was the problem in the first place), you see it right in the terminal output and can troubleshoot accordingly.

The only time I had to use that was when I downloaded Frostwire and it would randomly shut off. In the terminal I was able to see that I did not have the proper version of Java for Frostwire to run properly. I downloaded the latest Java via the terminal, started Frostwire... bam. It worked. Just like that.
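That troubleshooting pattern can be sketched like this - `badprog` here is a made-up stand-in for any misbehaving app (in the story above it was Frostwire with the wrong Java), faked so the script is self-contained:

```shell
#!/bin/sh
# Sketch of diagnosing a crashing app from the terminal. "badprog" is a
# fake program that fails the way a real one might: it prints its
# complaint to stderr and exits non-zero.

badprog() {
  echo "Error: unsupported Java runtime version" >&2
  return 1
}

# Launching from the terminal (instead of a menu icon) lets you capture
# the error text and the exit status.
errlog=$(badprog 2>&1)
status=$?

if [ "$status" -ne 0 ]; then
  echo "badprog exited with status $status: $errlog"
fi
```

The error message that would be invisible from a desktop launcher is right there in `errlog`, which is exactly how the Java-version problem showed itself.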

You can argue that having a package management system is a bad thing, but I fail to see how it's a problem at all. Synaptic Package Manager lets you see every little thing that's installed. In Windows, okay, fine: you run the .exe and install it accordingly. But have you ever tried to uninstall a program manually by removing the registry entries it leaves behind? The same argument can be made against Windows.
 
Why would you mention Nvidia? Nvidia has had better Linux support than ATI from the start; ATI was the one lacking Linux drivers for quite a while. However, both Nvidia and ATI have come a long way in getting video drivers in and working the way they should. In fact, my hat goes off to them; they've both done a tremendous amount of work. I've used Nvidia with Ubuntu since the beginning with flying success. The one time I used ATI with Ubuntu, I had great success too.

I was talking about Vista having poor support because of Creative and Nvidia dragging their feet.
And I have had great experiences with ATI cards in Linux rigs. However, my PowerBook has weird screen corruption that I can't fix, and it has an Nvidia card. Linux users don't need great 3D support anyway; gaming in Linux is a joke.


But hey, while we're at it, let's throw a curve ball your way.

I had a Creative sound card that didn't work in Vista either. So right there in that particular example, the door swung both ways.

So you're proving my point: Creative's poor drivers contributed to Vista's poor support. Thank you for misreading my post and inadvertently proving my point.


Oh definitely. I'm not dogging Vista for not supporting hardware. But when people say "It works in Vista but not Ubuntu. Go Vista!" it makes me sick because it's so contradictory.
In 2007 it was contradictory, but today it is very much true. Linux's biggest nemesis is the Broadcom-based Wi-Fi card; Windows uses it fine.



Actually, yes... yes it can. That's the point behind the way the Linux kernel is constructed: things are built as modules within the kernel itself, which lets you boot up and have things just work. I just installed Ubuntu 9.04 on 3 computers last night: two laptops of different models and a desktop. All of them worked. The only thing I had to do, on one of the laptops, was activate the wireless card. But hey, what's 1 button to click? :p

There is a major debate in the CS community over what kind of kernel works best, and I don't think you are prepared to get into that, so I suggest you stop there. There are a million reasons to use a different kernel model.



Well, you're one of the few. I'm not going to bother trying to weigh my good experiences against your unfortunate experiences with Ubuntu. I have had great success with it, and keep in mind, I have several rigs here at work that run Ubuntu, both as servers and desktops. So far I haven't had any major problems worth noting.

Linux has good server hardware support - that's an assumption, but the PC support, especially in the desktop realm, is pretty bad. Windows just works with the hardware (see what I did there?), and Linux doesn't. Blame it on what you want, but the fact remains. If Linux isn't easier to use than Windows, doesn't have better support than Windows, and lacks compelling killer consumer-level functionality, then there is no reason for the masses to switch. And that, my friend, is why Linux is marginalized in the desktop market and will continue to be.

Notice how most netbooks launched with Linux? Windows has a dominating majority of the netbook market now, and most returned netbooks are the Linux models. That was the test to see whether Linux was acceptable to the general public, and it isn't. Instead of spending the past 15 years on things that matter, like usability and drivers, they spent time on version control systems and on improving systems that already worked fine. The Linux community had the opportunity to gain a real foothold in the OS market, and they failed the test of popular consent. I don't know how else to say it. If Linux were that much better, then 90% of netbooks would run Linux instead of Windows today.
 