Regarding benchmarks and Linux-focused hardware roundups, one thing worth considering is that while Microsoft pours resources into OS development to create features that push end users toward buying the latest and most powerful hardware, Linux puts its effort into letting end users keep their old hardware and still get the best experience while running the latest and greatest software.
So the benchmarks could compare the user experience when running popular software on Windows and Linux across machines of varying power.
For this, you could pick some popular open-source and proprietary applications (or their free equivalents) that run on both Linux and Windows 7, and compare the price, time, and power consumption for retrieving, saving, processing, compiling, encrypting, decrypting, compacting, extracting, encoding, decoding, backing up, restoring, number of frames, etc., on machines with a range of CPU and memory capacities.
Quick startup: OK, Windows is fast - at first. Well, let's say that's if you install it yourself without all the bloatware that comes standard on store-bought Windows PCs (we bought a Toshiba laptop for work with Vista that took 12 minutes after boot-up to respond to a click on the Start menu - even on the third boot).
Windows startup is often burdened by auto-updaters from Microsoft, anti-virus, Sun Java, Acrobat Reader, etc. that slow down the computer on boot-up to the point where your original idea of "hey, I just want to start my computer and check my email for a minute before work" can take at least 5 minutes. I can do this reliably on Linux in 1. Yes, if you know a lot about Windows, you can stop all the auto-updaters and maintain them yourself, but 99% of Windows users don't have the time or know how to do this.
Trouble-free: For example, I installed Linux (Mepis Linux) on a computer for my wife's parents 7 years ago for email, pictures, games, web, and letter writing, and haven't had to touch it since. This is typical.
For Windows, I have often done fresh installs on trojan/virus-infected computers - installed working antivirus and all Windows updates (a process that takes about 2-4 hours of updates upon updates, plus downloads of the proper drivers from the manufacturers' websites, versus about 1 hour for an Ubuntu install with all updates done, including any extra work for codecs and graphics drivers) - only to come back a couple of months later to a computer that is slow again from users installing adware, infected games, etc.
Free: Nearly every Windows reinstall I've had to do starts with a computer loaded with MS Office, games, etc., but upon reinstall nobody has the disks for these. There is a lot of "sharing" of computer programs in the Windows world that is not very honest.
With Linux, you can have the operating system plus pretty much anything else you would need, without having to cheat.
Adaptable performance: You can get a well-performing Linux installation (LXDE desktop) on a PIII computer with 256MB of RAM. The only thing that will seem slow to an average mom/pop user is surfing Flash-heavy web pages, but with Adblock on Firefox it's not too bad. With Vista loaded on this computer, it would be annoyingly slow. You can often continue to use/re-use computer hardware with Linux for years after it has become unsuitable for Windows.
I think these features are of high value to the average user -- maybe not the average Anandtech computer user -- but the average surf/email/do homework/look at photos/play solitaire/balance my checkbook user.
Using SMB for network performance is extremely biased. It's a proprietary Microsoft protocol; of course Microsoft is going to win that one. Use NFS, HTTP, FTP, SSH or some other open protocol for network performance benchmarking. A lot of NASes do support these, as they are Linux-based.
Furthermore, using a Windows server with SMB on the argument that most consumer NASes use SMB is pretty ridiculous: these NASes are most likely running Samba, not native SMB - the same Samba implemented in GNU/Linux distributions and Mac OS X - and most of the NASes that I've seen offer at least one of these other protocols as an alternative.
The ISO thing is pretty ridiculous too: creating a simple GUI in both GTK and Qt and integrating them into GNOME and KDE should be pretty damn easy, though I suppose integration with the respective virtual file systems would be in order, in which case it might get slightly more complex for those (like me) not familiar with the code. There's even a FUSE (userspace filesystem) module now, so you wouldn't even need to be root to mount an image.
About the command-line support, IMO that's a good thing. It's a lot easier both for the person helping and the person needing help to write/copy-paste a few commands than it is to tell someone to click that button, then that one, then another one, etc. It's also a lot easier for the person needing help to simply paste the result if it didn't work, and that makes it much easier to diagnose the problem than if the user had to describe the output. And you usually get much more useful information from command-line utilities than you do from GUIs; the GUI simplifies the information so anyone can understand it, but at the price of making debugging a hell of a lot more difficult.
It must be said that Ubuntu and the major Linux distributors have all had 64-bit OS versions for a long time. The reason is to let users benefit from more memory (beyond 4GB) and from 64-bit CPUs (almost all of them today), gaining a better computing experience.
If this article were a private work by the author to help him decide whether to move to Linux, people should simply advise him of the above. But for an article intended to be read by thousands, it must be pointed out that its conclusion is misleading.
In the face of today's reality (and not the author's reality), why did he never mention the 64-bit Ubuntu systems? I guess his final thoughts would then have been much more in favor of Linux.
I have read all 17 pages of comments… a lot of Linux lovers out there… and they all purposely ignore a few important things that make Windows successful, which in turn make most Linux distributions marketing failures. I have used Linux on my netbook and my PS3, and I absolutely hate it.
1. User friendly. No, the CLI is not user friendly no matter what you say; no matter what excuse you use; no matter how blind you are. NOT ONE COMPANY dares to make its mainstream products CLI-only, from things as simple as ATMs and iPods to things as complicated as cellphones, cars, and airplanes. That ought to tell you something - the CLI is not intuitive, not intuitive = sucks, so CLI = sucks. You command-line fanboys are more than welcome to program punched cards, except no one uses punched cards and machine language anymore because they are counter-intuitive. Having to use the CLI is a pain for the average user, and having to use the CLI every time to install a new program/driver is a nightmare. A GUI is a big selling point, and a seamless computer-human user experience is what every software company is looking to achieve.
2. There is NOTHING Linux can do that Windows cannot. On the contrary, there are a lot of things Windows can do that Linux cannot. I'd like to challenge any Linux user to find engineering software alternatives on Linux, like MATLAB, Simulink, Xilinx, OrCAD, LabVIEW, CAD… you cannot. For people who actually use their computer for productive purposes (not saying typing documents isn't productive, but you can type a document on a typewriter with no CPU required whatsoever), there is nothing, again, I repeat, NOTHING that Linux can offer me.
3. Security issues. I disagree about the security issues that Windows supposedly has. I can set up a Vista machine, turn it on, lock it in a cage, and it will be as secure as any Linux machine out there. Hell, if I bought a rock, pretended it was a computer and stared at it all day, it would be the most secure system known to mankind. Linux's security is largely due to one of two reasons: 1. Not popular, so not enough software to support and to play with. 2. Not popular, because it is not user-friendly. Neither is a good distinction to have. It is like being exempt from a tax increase not because you have your business set up to write off all your expenses, but because you don't make any money and thus don't have to pay tax.
4. There is nothing revolutionary about Linux for an average user, other than it being free. If free is your biggest selling point, you are in serious trouble. Most people, if not all, would rather pay for a quality product than take a free one, unless the free one is just as good. Obviously Ubuntu is never going to be as good as Windows because they don't have the money that MS has. So what does Ubuntu have that really makes me want to switch and take a few weeks of classes to understand those commands?
Be honest, people. If you only had ONE OS to use, most of you would choose Windows.
I hope you realize that your hatred is showing so strongly that absolutely no one cares what you say.
That said, I don't know how to use a CLI and have been successfully using Linux for 3 years. I found the article to be a fairly fair one even though the author is so unfamiliar with Linux/Ubuntu. Just as he does not use only the default apps in Windows, Linux users don't use only the defaults in Linux. K3b is far superior to Brasero, and so on. In addition, I don't think he conveyed very well the extent of the software available in the repositories (with additional repositories easy to add). Several hundred apps, 20,000 programs, even security apps, and programs ranging from easy as pie to complicated (for those of us who have a computer that is more than a rock). I personally do audio mixing, video transcoding, advanced printing... all with graphical interfaces.
BTW, I learned how to turn on a computer for the 1st time 3 1/2 years ago, I stopped using windows a little over 3 years ago and have no reason to go back. I find it too hard, limiting and frustrating to use. Plus, I can't live w/o multiple desktops, the author didn't get it yet, but once you get used to them you can't go back.
Well, I've said enough for now, can't wait for your next article.
Your ignorance and stupidity is showing here. No engineering software for Linux? Hello? Matlab is available, Simulink is available, Labview the same. Xilinx and Altera have supported Linux for a long time and so do the smaller FPGA houses like Lattice and Actel. Mentor Graphics too. Orcad is the only one you mentioned that isn't available on Linux, but Cadence does support Linux with their Allegro product and so does Mentor Graphics with PADS and Board Station and Expedition.
I have to disagree. You are NOT talking about the average Joe/Jane. I think that even the article's author is somewhat biased towards the enthusiast user. Ubuntu actually covers all the needs of the average Joe/Jane user: you can browse the web, you can do email/scheduling, you can play games (easy, non-enthusiast games), you can download pictures from your camera and edit them, you can even play back MP3s/CDs and video and do basic office work, all out of the box. The GNOME learning curve for PC beginners is much shorter than with Windows. Most average Joes/Janes don't install apps or peripherals by themselves; believe me, I have had to install them many, many times on Windows systems (the best is "installing" a digital camera: plug one end of the wire into the camera, the other into the PC). Yes, I agree that installing Ubuntu so that ALL of it runs right can be a pain in the ass, but the average Joe/Jane never installs their own system (not Windows, not Mac OS); when they get a PC with Ubuntu preinstalled, you are done. With Windows you have to worry that they will "bother" you every few months with a non-working system. Yes, that might be a nice source of income for a PC technician, but it is not always welcome advertising for reliability (for customers to come back).
I did some installations of Ubuntu for my customers, mostly as a "safe" web/mail PC. They were all used to the Windows platform already, but after one week of using Ubuntu even the harshest critics were comfortable with it (some even asked me to install it on their home PCs). The biggest "problem" was that no one could read their "Excel" files. So I showed them that the file has to be saved with the .xls extension and voila, no more problems. I was NEVER asked for any CAD system, nor MATLAB, not even graphics apps; everything they used in the office was already there! Then there are home users; the only complaint was that they had Windows at work, but after a few hours all was fine - only the kids had the problem that they cannot play enthusiast games on it. My wife has been running Ubuntu for three years now with no problem. When my 62-year-old mother asked me for a computer, I brought her a notebook with Ubuntu; I had no time to explain it and was coming back the next morning. My mom had never used a computer before (OK, shooting ships on my ATARI doesn't count). The next morning I came over and she was already browsing. I asked her how she did that and she said it's easy: tap Applications, then Internet, and one of the apps was "internet". She had even installed the snake game. I asked how she did that, and she said that in the Applications section there is "install new application", then she clicked on Games, picked what she thought would be the game for her, and then Install. "Was that wrong?" she asked. I said no, it's right.
BTW, none of them know that they can use a CLI or that there is a terminal window in Ubuntu. They are average Joes/Janes.
Not everyone is an enthusiast with a PC full of stuff that, be honest, you don't use on a daily basis.
The truth is that Ubuntu will not be a successful system for enthusiasts or high-level professionals until the big software houses (Adobe, hello!) and game producers start porting software to Linux. But that is not the fault of Ubuntu or Linux, and again, we are not talking here about the majority of users (I mean the Joes/Janes).
To all the folks who think Linux is hard: have you tried PCLinuxOS? It is easier to install and use than Windows XP, 2003, and Vista, period.
There is no Windows hatred here, but you have to try it before you complain.
I have access to all Windows OSes at work, including the latest Win 7 RC, but I find PCLinuxOS easy to set up and use. It needs no special admin skills; every config is GUI-driven.
Linux has come a long way from where it was 5 years ago!
There are two things I'd like to comment on that bothered me about this article. First, most regular users do not use LTS; the software is just too old, and the latest releases of Ubuntu are quite stable. LTS is mostly about guaranteed stability for corporate environments.
Second, this package manager hatred is based on the flawed idea that no packages exist outside of the official repositories. A simple Google search for deb packages leads to GetDeb.net, a website dedicated to providing up-to-date packages of all kinds of software specifically for Ubuntu. Google search too hard, you say? It's even easier than that, because many project sites (such as Wine, featured in this article) provide multiple packages for various distributions and even PACKAGE TYPES.
Overall not a bad article. The author definitely knows technology and I'm grateful for that, but he did not seem to do much research on the actual community itself or the Linux Way of doing things. These are minor issues which will resolve themselves with time and I'm looking forward to seeing more linux articles on this site in the future.
I was concerned as well with the constant releases...until I upgraded the first time. I had set aside the better part of an evening because I was *sure* there were going to be plenty of headaches. I've done three such version upgrades now and am happy (not to mention shocked) to report that it's literally a one click upgrade. Simply amazing. I'm sure something will get mucked up in the future with one of the version upgrades for me...but for now all has gone amazingly smooth.
That being the case, I have to disagree with you on the "they release too often" point. I understand it's a pain to sift through all the search results on the forums, but I have also found some older threads (sometimes 3 versions back) where the same fixes work for my issue. I agree they need to tag posts with version info... that would make it far easier. Also, there's far more useful information in the (versionally-diluted) forums than I've found for any other piece of software or OS I've used. I almost don't cringe when I have a problem or issue now because I'm quite confident I can find the information without too much digging.
I'd encourage you to upgrade versions from your current install (don't wipe) and comment on how the process goes. Maybe I've just had an extremely easy (and lucky) go of things with no problems...it'll be interesting to read your experiences. Honestly with how easy my upgrades have been, I look forward to new releases (but still give them a few weeks before upgrading...just to see the comments from other users).
Very good read as usual. Personally, I'd like to see Kubuntu reviewed at some point (I hear Kubuntu 9.10 is due in October); as you know, it's the KDE version. A GNOME vs. KDE comparison would also be interesting.
I think the main problem for new Linux users is deciding which one to go with; sure, they are all free, but it can be confusing and time-consuming to try them all, and some, like Ubuntu/Mint, are more noob-friendly than others.
In this sentence:
"It’s undoubtedly a smart choice, because if Ubuntu wiped out Windows like Windows does Ubuntu, it would be neigh impossible to get anyone to try it out since “try out” and “make it so you can’t boot Windows” are mutually incompatible"
The more common phrase is 'nigh on impossible' (as in close to impossible) or you could say it's nigh-impossible. Definitely not neigh. Sorry to point out grammar issues, but this is a pet peeve, right along with pique being spelt peak or peek (as in pique my interest).
I've been a 100% Linux desktop (Ubuntu 9.04) user at home ever since I bought my last i7 920. Gaming, multimedia, web -- everything a typical desktop user does under Windows. The inconvenience of migrating and re-activating an existing Windows install was outweighed by the convenience of Linux, which simply booted and worked on the new hardware.
Yes, there are times where you must fire up Google and search for solutions, some of which are commands to be pasted into a terminal window. Yes, sometimes you need to upgrade software packages (Wine is horribly out of date for instance).
On the other hand, with Windows you get approximately 1,337 updaters which run on startup, virus checkers, malware checkers, browser parasite checkers, firewalls, DRM and miscellaneous layers of barnacles which accumulate the longer you use the system. Thankfully the gathering of cruft is not a bane on the typical Linux system yet.
Try 9.04 and see if it is more to your liking. LTS means nothing when most open source problems are "supported" by simply upgrading to the latest software.
I use both XP64 and Hardy (Ubuntu 8.04).
I am also a power user.
Both these operating systems have pros and cons.
Cons for XP64:
1. It does not recognize my hardware properly.
2. Finding 64 bit drivers was/is a mission.
Cons for Hardy:
1. It does not plug and play with my hardware (I have to compile the drivers).
2. Not as user friendly as windows.
Pros for XP64:
1. Windowing system is super fast.
2. User friendly.
Pros for Hardy:
1. Recognizes my hardware.
2. Command line tools are awesome.
Conclusion:
I think that the article was good.
I am one of those people who has always had problems installing windows straight out of the box and thus find that paying a large amount of money for their buggy OS is unacceptable.
I can get a lot of stuff done with Hardy and it is free and if I find a problem with it I can potentially fix that problem.
I also find it unacceptable that manufacturers do not write software (drivers or application software for their devices) for Linux.
For me, it is difficult to live without both XP64 and Hardy.
How can people use something like Aero and its Linux or OS X equivalents (which pre-date it, if I am not mistaken)? The noise is just hiding the information. Transparency is one issue; another is those icons that are more like pictures: one loses the instant recognition. With Aero, knowing which window is active is not obvious; you have to look at small details. The title of the window is surrounded by mist, making it more difficult to read. Even with XP, the colour gradient in a title bar is just noise: there is no information conveyed by it.
The OS GUIs look more and more like those weird media players, with an image of a rotary knob that has to be manipulated like a slider.
The evolution of all applications to a web interface reminds me of the prehistory of personal computers : each program has its own interface.
The MS Office Ribbon UI is just in the same vein: more than 20 icons on each tab. The icon interface is based on instant recognition and comprehension, when you have so many it turns into a mnemonics exercise. And of course with MS one does not have a choice : you just have to adapt to the program. An end user is only there to be of service to the programs ;-)
If I want to look at a beautiful image, I will do so, but when I want to write a letter or update a database, all those ultra-kitsch visual effects are just annoying.
In summary, the noise is killing the information and thus the usability.
This thing with Windows has been seen before: back in the Win 3.1/OS/2 days it was found that while one instance of Excel didn't run any faster under OS/2, two instances in separate VMs (OK, not technically the same thing) ran in about the same time as one on Windows.
I like the package management, and hate when I have to install something that doesn't support it; it means I have to worry about updates all by myself. If a program has a repository, then I get updates every time I check for Ubuntu updates, which is very handy. Nice to get the nightly Google Chrome builds, for instance (still alpha/very beta).
Frankly, supporting binary kernel drivers would be insane. The kernel developers would be stuck supporting code they cannot look at and cannot fix; they couldn't fix their own mistakes (or would be stuck emulating them forever). If they supported binary drivers there would be even more of them, and whenever they wanted to fix something broken or badly designed, they would have to wait a reasonable amount of time before doing it so the drivers would stay supported. Frankly, I don't see why vendors don't have automated frameworks for this, with automated deb/rpm repository generation: I add their repository, and when I get a kernel update, perhaps it is held up a day while their system automatically builds a new version, but then it all installs. Instead, I am stuck having to run a very old kernel, or not having 3D on my laptop, for instance.
I found this article very interesting, because it is oriented toward Windows users and is helpful to them - you didn't just give up trying it.
But you can't blame Ubuntu (or any distro) for what a pain in the ass a video card driver can be to install; blame ATI and NVIDIA for being lazy. And if using Wine to play games is not as good as playing in Windows, blame the game companies for not releasing GNU/Linux versions.
Also, the reason GNU/Linux surpasses Windows in file management is that NTFS is a BAD file system; maybe if Windows could somehow run on ext3 it would be even better than it is.
And why this reluctance to use a console (stop saying CLI, please)? You are not opening your mind; you are trying to use GNU/Linux as if it were Windows, but it is a completely different OS precisely because it is not Windows. Look at it from this point of view: something that takes 5 clicks in Windows can maybe be done in GNU/Linux with a single line of bash. So sometimes you will use the GUI and other times the console, and you will find that having both options is very comfortable. So start using the console and write the same article a year later.
I hope some day to see a paid version of GNU/Linux (still open source) that could pay salaries to programmers to fix specific issues in the OS.
On the other hand, when you do the IT benchmarks it is very disappointing that you don't use Linux on those beautiful Xeons. The server environment is where GNU/Linux is strongest. And Xeons with Windows are just toys compared with Unix on SPARC or POWER architectures.
PS: try to get 450 days of uptime on Windows 2003.
Many people considering linux are still on dial-up. These are often folks with lesser expertise who just want to get connected and use their computer in basic ways. But getting connected with dial-up is something of an adventure with many distros and/or versions. Ubuntu 9.04 has moved away from easy dial-up, but Mint7KDE includes KPPP for simple dial-up connection. Mint7KDE has other nice features as well.
I am asking you to expand your current picture of the landscape to include people who want to use linux with a dial-up connection. This of course would have to include a brief discussion of 1) appropriate modems and 2) distro differences. Thanks,
r kerns
Really glad to find this in depth article after all this time. Thank you Ryan. I too have run Ubuntu as my main OS even though most of my experience is in Windows and have had similar experiences. Because this was a very long article it got into detail about things like the Package Manager or the multiple desktops that I have not seen discussed elsewhere from a user perspective. As someone else pointed out it is moot what people would like or complain about if they were moving from Linux to Windows or OSX, but imagine for a moment if they were used to getting the OS and all their apps updated in one hit and were asked to do it one app at a time and expected to pay for the privilege!
If you go on with the Linux series, I'd like to see discussion of the upcoming Ubuntu and other distros - I've been impressed with SUSE. I'd also like to see projects on how to build a Linux server and HTPC - including choice of distro and the kind of hardware needed. I'm less sure of where benchmarking is really useful - the tradition of detailed benchmarking at AT arose from the interest in overclocking and gaming, which I think is a much lesser consideration in Linux. More relevant might be comparisons of netbook-specific distros, or how to work out if that old P4 will do as a home server. There is a lot of buzz in the tech world about things like Symbian, Chrome OS, Moblin, and Maemo on portable devices that could possibly draw new readers to the Linux tab at AT. A great start in any case.
I just wanted to say thank you to the author for a very thorough article. After reading it, I decided to use Ubuntu for a PC I'm building out of spare parts for a retired friend who's on a fixed income. My friend just uses web, e-mail, and some word processing, so this will be perfect.
The article gave me a good idea of what to expect -- a good honest appraisal with all the good and bad. After installing Ubuntu 9.04, I am very impressed. The install was very quick, and easier than XP. Everything is quite snappy, even though it's running on an AMD 3800+ single-core processor and an old hard drive.
I only skimmed the article (I saw the part on gaming being poor), I'd like to see a comparison of several games using the same hardware on windows and linux (results given in fps). If this has been mentioned sorry and good day.
The article is a good job overall. Below are a few of my own observations concerning the article and linux in general.
###Package manager
I have never used Ubuntu, so I cannot comment on APT. I tried SuSE and Mandriva and eventually ended up with Gentoo. I think package managers are the best part of Linux. Gentoo's portage and 'emerge --sync' allow you to always be up to date in terms of software, with no need to ever reinstall the system. It is CLI-based and sometimes you do have to play a bit with USE flags and compatibility, but in general 99.99% of my software needs can be satisfied by portage.
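For reference, a minimal sketch of that routine (the exact flags are just my habit, not gospel):

# refresh the portage tree
sudo emerge --sync
# update every installed package, pulling in new dependencies and changed USE flags
sudo emerge --ask --update --deep --newuse world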
###Command Line Interface
For my home needs I switched from Windows to Linux about 4 years ago. At work I still use Windows, but that is due to corporate policies rather than preference. Originally the CLI was something terrifying. It took me some time to learn and adapt, but now I do most of my system tasks in the CLI.
###Video drivers
It does take some effort when doing something non-standard (a 1080p projector connected over HDMI via an AV receiver). But in general, in Gentoo it is usually as easy as typing 'emerge nvidia-drivers' or 'emerge ati-drivers'. Even dual-monitor setups, of which I have two. Don't forget the open-source xorg drivers, which are usually fine for a simple desktop and come pre-bundled.
###Gaming...
...is much harder under Linux. In my case this is not relevant, as I have spent less than 1% of my time on it over the past 2 years. Others may find this a real obstacle to migration.
###Syncing...
...your devices is also a headache. I once even succeeded in syncing my Windows Mobile 5 device with Evolution, but the amount of effort it took was far too great. I never even tried syncing my Nokia phones after that.
###Usage scenarios
I think you judge Ubuntu (and Linux in general) by very Windows-centric usage patterns. Ubuntu is unlikely to out-Windows the original, although OS X seems to have done just that. Even the approach is Windows-centric - you take a Windows app and compare Linux against it. The point is that Linux in general lets the user discover and develop completely different usage scenarios, which are beyond Windows. Allow me to elaborate on the basis of my own experience.
###In search of killer app...
You note in the article that there is no linux killer app. I disagree.
1. I run a Linux home server. It has a proxy (squid) with an attached antivirus filter (clamav + squidclamav). It has a software RAID array (mdadm). I also run a web server (apache) with a photo gallery (gallery2). There is a mail server (postfix) for a few accounts. Filesharing is done via NFS and Samba (a minimal config sketch follows this list). Finally, I run a MythTV backend server with 4 tuners. I never tried to replicate this software stack in Windows; it is likely possible but would require some pretty expensive licenses, and some of the pieces, like the MythTV server, are impossible under Windows to the best of my knowledge.
2. dvdrip + transcode allow me to rip DVDs and transcode them simultaneously on 4 client machines with a total of 12 processing cores. Transcoding is usually done in under 15 minutes.
3. I mentioned MythTV. I have a centralised server and a number of clients. TV in the house is distributed via LAN, i.e., small x86 boxes with output to the TV screens. The integrated MythTV client interface allows watching movies or listening to music from central storage, light browsing, and so on at every TV.
4. The small x86 boxes are network-booted from the server (in.tftpd + NFS). This allows easy management of software, i.e., a single image for all clients. I never tried that under Windows; it is likely possible but costly.
5. One of the clients is an HP thin client with only 1GB of local storage. I ended up network-booting it anyway, but initially I compiled a full Gentoo system (kernel, X, a fully-fledged window manager (XFCE 4), MythTV client, browser (Firefox), mail client (Claws Mail), media player (MPlayer + GUI)) in under 1GB. If I spent a bit more time, I think I could even fit an office suite in that space. Not possible under Windows.
6. There are other things that I have not even explored, like Asterisk for IP telephony, or projects like OpenGoo, which let you run your own server-based set of office apps.
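Since I mentioned NFS + Samba sharing in point 1, here is a minimal sketch of what that looks like on my box (the paths and the 192.168.1.0/24 subnet are just examples, adjust to your network):

# /etc/exports - export the media directory read-only to the local subnet
/srv/media  192.168.1.0/24(ro,sync,no_subtree_check)

# /etc/samba/smb.conf - the matching Samba share for the Windows boxes
[media]
   path = /srv/media
   read only = yes
   guest ok = yes

After editing, 'sudo exportfs -ra' re-reads the exports, and a Samba restart picks up the new share.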
###The bottomline
I think a fair comparison should not focus on the things that Windows is known to do best. I think getting familiar with Linux will enable one to find one's own killer app, which can't be replicated in Windows at reasonable cost. But this would require reasonable time and effort, which is beyond the scope of the article.
A couple more things which I forgot to mention in my previous post:
###mounting network shares
Your troubles seemed quite strange to me; maybe this is because of Ubuntu's implementation. This is usually handled very well by Linux. I do it in /etc/fstab (see the sketch below). I agree with some previous posters that this is largely due to lack of Linux experience, so it should not be used in the final assessment.
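A minimal sketch of what I mean (the server name and paths are placeholders):

# /etc/fstab - mount an NFS share automatically at boot
fileserver:/export/media   /mnt/media   nfs   defaults   0   0

After adding the line, 'sudo mount -a' mounts it without a reboot.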
###benchmarks
are quite useless. I don't think you should dedicate much time in the article to them. For the majority of desktop applications, plus or minus 10-15% does not make much difference.
###other interesting projects
another example is LTSP. Again, there are alternatives in Windows, but they are licensed and pricey.
If you have a Windows CD lying around, I highly recommend installing Windows inside VirtualBox. For any low-performance application it'll work 100%, though it's not made for gaming; but apart from gaming it should reduce your dual-boot time to near zero. Wine is great for hackers and idealists, but unless the application has a Platinum/Gold rating, new users should not use it.
As for support time, let me put it this way... do you keep XP for 10 years to run 10-year-old installations? As the distro is for the most part supplying the applications, sticking with an old release is like being stuck ten years in the past. This is more like a free upgrade from Office 97 -> XP (2000) -> 2003 -> 2007 -> 2010 every few years.
Is it perfect? No. But a lot of it is that Canonical can't tell you the easy ways of fixing things. Most of the things on this page should be done the first time you boot a fresh Ubuntu install:
It's great to see Linux on the front page of Anandtech once again!
Linux use on the desktop is getting more attention, and its use is growing (though it is still only a very small fraction of the total market). I'm a huge fan of Linux (as well as Solaris and BSD), and I found Ryan's review very fair and informative. As with any OS, it is important to accept weaknesses as well as strengths, so that it can grow and improve. The strength of Linux has mostly been in the server world, but especially in the last couple of years it has become much more user-friendly on the desktop. These days, I have been able to install it on machines for people with no prior familiarity with Linux and have found, in most cases, no significant issues. There are definitely still things that can be improved (audio issues being the biggest one), but overall it is good news. Some people have commented 'Why bother? Windows does everything I need!', but I think competition is a good thing for desktop computing (and just about everywhere), as the saga of Internet Explorer's stagnation after the demise of Netscape proved. Mac OS X of course is there, especially at the higher end, but Linux is also beginning to make an impact on generic hardware (netbooks being a good example), which has meant that Microsoft has had to lift its game and drop prices.
To answer Ryan's question, in terms of what I would like to see for Linux on AnandTech: in addition to benchmarks for consumer hardware running Linux, guides such as building a home server or HTPC, where using Linux is an appropriate option alongside Mac or Windows; for some things, such as gaming, there's much less point of course. AnandTech IT used to feature Linux server and virtualisation benchmarks, which made a lot of sense since Linux's greatest strength is in servers, although these benchmarks and reviews have been mysteriously absent lately. And of course, news from the world of Linux, such as new products with Linux on them, new kernel features, popular distro releases such as Ubuntu, and general headlines (e.g., Dell's recent comment that netbook return rates were no higher for Linux than for Windows).
Great to read the comments. Many thanks, Anandtech!
This explains why I'm not using, and won't be using, Linux on my regular PC in the near future. As long as I need Windows anyway (and pay for it), there is, even for an "above average" user like me, no need for Linux. I think I can do anything I need on Windows, so why should I run Linux in addition to Windows? It makes no sense for me, nor for most other users who are not developers, geeks, or idealists.
That will only change if you do not need Windows at all, e.g., when you can game on Linux with the same performance as on Windows.
The only place I can imagine using Linux is on an HTPC. Take one of the new ION nettops without an OS and put LinuxMCE, or Linux with XBMC, on it. But I have no idea how easy that is or how well it works. There I would really benefit from the fact that Linux is free (otherwise the whole price would go up by like 50% ;)).
I might actually try.
As someone who has used Ubuntu as my primary OS for at least 4 years now, I can say there are things that only come with time - namely, which applications to use for what. I primarily use Ubuntu because it installs most of the stuff I need right out of the box. However, that doesn't mean it installs EVERYTHING you need, particularly when it comes to audio management. The package manager is your GOD in more ways than one once you realize that, aside from the very latest drivers (which not many people outside our crowd run, and you aren't gaming on Linux unless you are using Cedega or CrossOver), it provides you with just about any program you need.
So...
Audio management - Remember you can mix and match KDE/GNOME apps. For this I would use Amarok 1.4. It can sync with iPods and can keep your music collection in a SQL database. That alone had me smiling for days. I have about 70GB of music which it can catalog in about 5 minutes - try that in iTunes or WMP. With the collection in MySQL, any program I want can have access to it. In 9.04 you'll need to add a repository, as it installs Amarok 2.0 by default (which isn't as feature-rich).
As an aside, Rhythmbox can now rip from inside the program. It might just be because you are using 8.04 but I know it does it now. However, Sound Juicer gives you more options with MP3 Tags as well as file types.
Video Editing / Mastering
If you want a program like Nero Vision, check out DeVeDe; ManDVD is pretty good for DVD mastering. Both of those should be in the package manager. If they aren't, their websites offer debs - just download and double-click.
Burning
If you don't like Brasero head to the package manager and get K3B. It's about as close to Nero as one can get.
SMB
The problems you've experienced are due to version. Browsing the root of a server has been fixed. It takes a little longer for the shares to appear than I would like it to, but it does work now.
Mapping "drives" as it were depends on the route you take. "Connect to Server" I believe creates a mount point under your media folder. I really haven't had much of a problem with this as most programs do recognize bookmarks, and for all of my shares one of the first things I do is set it in my FSTAB which essentially hard links my shares which avoids the problem entirely as every program can see a mount point. Basically there's about 5 different ways to go about this issue just choose the one that works best.
ISOs
One thing with ISOs is that File Roller reads them natively, so you can just extract one and install from there. It's not like hard drive space is scarce nowadays.
Customization
My closing thought is to remember that Ubuntu is not GNOME, meaning that Ubuntu just creates the packages for its distribution. Everything in Linux is customizable. Don't like the bar at the bottom? Delete it. Want a dock like OS X? There are tons to choose from. I've done some OS modding and I can make GNOME look like Windows XP or Mac OS X (and mimic its functionality). You aren't limited here.
That's the thing with Linux: the more time you spend (and really I'm talking about days here, not months), the more programs you'll come across which do what you want. Don't like Transmission? Use KTorrent. Hell, even uTorrent works very well with Wine.
Overall I liked your review.... even though I've had to wait for it...
Great article, congrats. Looking forward to part 2.
My only criticism is that for both the intermediate (topic-specific) verdicts and the final (first?) thoughts, the non-techie average-joe end user point of view is given way too much weight. Not only do these people not read AT, but they also don't use objective judgement to choose their OS. These are the people who will go with the flow, which will be dictated by other people. In short, the article focuses on helping joe6pack, but joe6pack doesn't care.
I believe it would be more productive, when passing comparative judgement, to narrow the focus on aspects more strongly connected to wide adoption - deployment (installation & out-of-the-box functionality), large-scale maintenance ease, support, productivity (office) software, and why not, bang for buck. Some of them are already being considered, and maybe they just need to be given higher relevance.
I have used Windows since the 3.0 days and am currently running RC7 on my home desktop. I think that it is significant that the hardware resources required by Ubuntu and Linux in general are much lower than Windows and can give new life to an old laptop or desktop. Ubuntu recognized and set up all the hardware without a hitch on my Dell Inspiron E1505 and my newer office Q9550 quad. A reasonably older system running on Linux can perform standard tasks at speeds close or equal to newer hardware running any of the Windows operating systems. The boot time for Ubuntu is also significantly shorter.
It's no coincidence that the ASUS PC1000 netbook, with its solid-state drive, ships with Linux. Mine came with a version of Mandriva Linux that is obviously designed for newbies and doesn't provide nearly as large a set of software repositories as Ubuntu. I replaced it with the Eee-specific version of Ubuntu and have been very pleased with the results. It quickly performs daily tasks such as office work via OpenOffice 2.5, browsing (I use Opera), and email (Thunderbird). The difference in observable speed between my Eee and the i7 system I recently built is negligible. Of course, I'm not talking about CPU-intensive tasks, including Google Earth, where the i7 is many magnitudes faster. But for everyday mobile tasks I use the Eee. (The Inspiron is way too hot.)
I have to disagree about the Synaptic Package Manager. You can Google third-party Linux repositories, such as Medibuntu, that contain multimedia codecs and other interesting Linux software. (Be careful and research third-party repositories to avoid unpleasant surprises.) You can add these third-party repositories to your permanent list of software sources, so that when updates to that software are released Ubuntu notifies you and lets you accept the updates you want. Using a USB-powered external DVD drive I can watch movies on my Eee, or rip and compress them on the i7 system and put them on a flash drive or card. Once you get the compression formats tuned you have a good, lightweight computer/multimedia system with excellent battery life; good for two movies. All this on a two-year-old Intel Atom powered netbook.
I've played around with Linux for some time, but I found it too alien and command-line driven, using a command set that has very little in common with MS or DR DOS. But over the last several years Linux has been improving at an accelerating rate. All that's needed now is a high-quality Wine (Windows emulator) that will let you run all your Windows-specific software on Linux without a performance loss. It's a difficult undertaking, but Wine has been steadily improving and there are now commercial Linux distros that guarantee compatibility with specific Windows programs.
I recommend putting it on an old hard drive and giving it a whirl. And try several versions; there are a lot of free ones out there. I recommend Ubuntu, OpenSUSE and Mandriva for starters. Why pay for the cow when you can get the milk for free?
Unfortunately, most of the same problems mentioned in the article are keeping me from switching to Ubuntu as my primary OS. As far as gaming is concerned, I'll be going with Win7 (I'm still using XP atm).
If I ever get a netbook or non-gaming laptop, I'll certainly consider Ubuntu. However, the likelihood of moving to Ubuntu when the laptop comes with Win7 will probably be very small, unless I can somehow get a refund for not using Win7.
Also, I guess some people do like the new GUI in MS Office. I found it annoying, as I'm still used to the old GUI. But I suppose in time I may learn to accept it, like how I eventually accepted the XP look.
Excellent article, thank you. I will definitely be passing it on.
I completely agree with superfrie2 about the CLI. Why resist it?
Versions: I, like you, originally plumped for Hardy Heron because it is an LTS version. I recently changed my mind, and now run the latest stable Ubuntu. As a single user, at home, the benefits of a long-term unchanging OS are pretty small, and in the end it was more important to me to have more recent versions of software. Now if I were administering a network for an office, it would be a different matter...
Package management: Yes, this is absolutely the most amazing part of free software! How cool is it to get all your software, no matter who wrote it, from one source, which spends all its time diligently tracking its dependencies, checking it for compatibility, monitoring its security flaws, filtering out malware, imposing sensible standards, and resisting all attempts by big corporations to shove stuff down your throat that you don't want, all completely for free? And you can upgrade *everything* to the latest versions, at your own convenience, in a single command. I still don't quite believe it.
Unpackaged software: Yes, I agree, unpackaged software is not nearly as good as packaged software. It's non-uniform, may not have a good uninstaller, might require me to install something else first, might not work, and might conceal malware of some sort. That's no different from any other OS. However, it's not as bad as you make out. There *is* a slightly more old-fashioned way of installing software: tarballs. They're primitive, but they are standard across all versions of Unix (certainly all Linux distributions), they work, and pretty much all Linux software is available in this form. It never gets worse than that.
Games: A fair cop. Linux is bad for games.
GPUs: Another fair cop. I lived with manually installing binary nVidia drivers for five years, but life's too short for that kind of nonsense. These days I buy Intel graphics only.
40 second boot: More like 20 for me on my desktop machine, and about 12 on my netbook (which boots off SSD). After I installed, I spent a couple of minutes removing software I didn't use (e.g. nautilus, gdm, and most of the task bar applets), and it pays off every time I boot.
Separate menu bar and task bar: I, like you, prefer a Windows-ish layout with everything at the bottom, so after I installed I spent a minute or two dragging-and-dropping it all down there.
I use GNU/Linux for 100% of my needs, but then I have for years and my hardware and software reflect this. For example I have a Creative Zen 32gb SSD music player and only buy DRM free MP3s. In Linux I plug it in and fire up Amarok and it automatically appears in the menus and I can move tracks back and forth. I knew this when I bought it, I would never buy an iPod as I know it would make life difficult.
The lesson here is that if you live in a Linux world you make your choices and purchases accordingly. A few minutes with Google can save you a lot of hassle when it comes to buying hardware.
There are three web sites any Ubuntu neophyte needs to learn.
1. www.medibuntu.org - where multimedia hassles evaporate.
2. http://ubuntuguide.org/wiki/Ubuntu:Jaunty - the missing manual, where you will find the solution to just about any issue.
3. http://www.getdeb.net/ - where new versions of packages are published outside of the normal repositories. You need to learn how to use the gdebi installer, but essentially you download a deb and double-click on it.
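If you prefer the terminal to double-clicking, the same job looks roughly like this (the .deb filename is just an example):

# gdebi resolves and installs the dependencies of a standalone .deb
sudo gdebi someapp_1.2-1_i386.deb
# or with plain dpkg, fixing up any missing dependencies afterwards
sudo dpkg -i someapp_1.2-1_i386.deb
sudo apt-get -f install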
Then there are PPA repositories for the true bleeding edge. This is the realm of the advanced user.
For a home user it is always best to keep up to date. The software is updated daily; what did not work yesterday works today. Hardware drivers appear all the time, and by sticking with LTS releases you are frozen in time. Six months is a long time; a year is ancient history. An example is USB TV sticks: buy one and plug it into 8.10 and nothing happens; plug it into 9.04 and it just works - or it still does not work, but will in 9.10.
Yes it is a wild ride, but never boring. For some it is an adventure, for others it is too anarchic.
I use Debian Sid, which is a rolling release. That means there are no new versions; every day is an update that goes on forever. Ubuntu is good for beginners and the experienced alike; the more you learn, the deeper you can go into a world of more than 30,000 programs that are all free in both senses.
I look forward to part 2 of this article, but remember that the author is a Linux beginner, clearly technically adept but still a Linux beginner.
> I use Debian Sid, which is a rolling release.
> That means there are no new versions; every day is an update that goes on forever.
This is actually one of the best things about Ubuntu and Debian - you NEVER have to reinstall your OS.
With Windows you may live with one OS for years (few manage to do that without reinstalling, but it is definitely possible) - but eventually you HAVE to wipe everything clean and install a new OS. With Debian and Ubuntu you can simply keep upgrading and be happy. At the same time, no one forces you to upgrade ALL the time, or to upgrade EVERYTHING - if you are happy with, say, Firefox v2 and don't want to go to v3 because your favorite skin isn't there yet, just don't upgrade that one app (and decide for yourself whether you need the security fixes) - see the sketch below.
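A quick sketch of how you keep that one app back while upgrading everything else (Firefox is just the example from above; aptitude's 'hold' command does the same thing):

# mark firefox as held so upgrades leave it alone
echo "firefox hold" | sudo dpkg --set-selections
# later, to let it upgrade again
echo "firefox install" | sudo dpkg --set-selections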
Some time ago I turned on a Debian box which had been offline/turned off for 2+ years and managed to update it (to a new release) with just two reboots (one for the new kernel to take effect). That was it; it worked right after that. To be fair, I did have to update a few config files manually afterwards to make it flawless, but even without the manual updates the OS at least booted "into" the new release. Naturally, all my user data stayed intact, as did most of the OS settings. Most (99%) of the programs worked as expected as well - the problematic 1% being some GUI programs not dealing well with the new X/window manager. And I had no garbage files or whatever left over after the update (unlike what you get if you try to upgrade WinXP to, say, Vista).
Having said all that, I 100% agree that linux has its problems as a desktop OS (I use windows more than linux day-to-day), but I totally disagree that using one OS for a long time is a weak point of Ubuntu.
P.S. One thing I never tried is upgrading a 32-bit distro to 64-bit - I wonder if this is possible on a live OS using a package manager.
A good article but I have a few pointers.
1) More Linux distros need to be reviewed. Your "out of the box" review was informative but seemed more in tune with commercial products aimed at making a profit (i.e., is this a good buy for your money?). I, for one, used to check AnandTech.com before making a big computer purchase. However, Linux is free to the public, so the tradeoff for the user is how much time to invest in learning and customizing a particular distro. Multi-distro comparisons, along with a few customized snapshots, would help the average user decide what to spend his valuable time and effort on.
2) Include Linux compatibility in hardware reviews. Like I said earlier, I once used AnandTech.com as my guide for all PC-related purchases, and I have to say about 80% of the time it was correct. But try to imagine my horror about 1.5 years ago when my brand spanking new HD4850 video card refused to do anything related to 3D on Ubuntu. I spent weeks trying to get it to work but ended up selling it and going with NVIDIA. Of course it was a driver issue, but nowhere did AnandTech.com mention this, other than saying it was a best buy.
Thanks for listening, I feel better now. I'm looking forward to reading your Ubuntu 9.04 review and please keep adding more linux related articles.
When I first saw that there was going to be a "first time with Linux" article on Anandtech, I have to admit I was a bit worried. While the hardware reviews here are excellent, that's already something you guys are familiar with - it's not new ground, you know what to look for. I sadly expected Ryan would enter with the wrong mindset, trip over something small, and end up with an unfair review like almost all "first time with Linux" reviews end up being.
Boy, was I wrong.
With only one major issue (about APT, which I explained in another post) and only a handful of little things (which I expect will be largely remedied in Part 2), this article was excellent. Pretty much every major thing that needed to be touched on was hit, and most of Ubuntu's major pluses and minuses were fairly reviewed. It's evident you really did your homework, Ryan. Very well done. I should have known better than to doubt anyone from Anandtech; you guys are brilliant :D
One last thing I forgot to say....
Good job on the article. I (and many others) would have liked to see 9.04 instead (I don't know of anyone who uses the LTS releases; those seem to be aimed at system integrators, such as Dell's netbooks with Ubuntu), but the article itself was quality.
I'd like to make one last addition in similar spirit. I appreciate this article as a generally unbiased review that covers many important aspects of a general-purpose OS.
And just to be sure: I'm not a Linux fanatic, in fact, for some reason, I'm writing up this post on Vista x64 ;)
You're right that there are historical reasons that dictate that one Linux binary might be in /usr/bin, another in /sbin or /usr/sbin, yet another one in /usr/local/bin, etc.
However, you really couldn't care less as long as the binary is in your path; 'which foo' will then tell you the location. Furthermore, there's hardly any need to manually configure something in the installation directory. Virtually anything that can be user-configured (and there's a lot more that can be than on Windows) can be configured in a file below ~ (your home). The name of the config file is usually intuitive.
But yes, for things that you configure as admin (think X11 in /etc/X11/xorg.conf, or Postgres usually somewhere under /usr/local/pgsql) you might need to know the directory. However, the admin installs the app, so he should know. Furthermore, GUIs exist to configure most admin-ish things (I don't know what it is in Ubuntu for X, but it's sax2 in SUSE; and it's pgAdmin for PostgreSQL in both Ubuntu and SUSE).
Even though it is technically possible to reorder the directory structure, Ubuntu isn't going to do it for a variety of reasons:
First and foremost, one must remember Ubuntu is essentially just a snapshot of Debian's in-development branch (unstable, aka Sid) with some polish aimed at user-friendliness and (paid) support. Other than the user-friendly tweaks and support, Ubuntu is whatever Debian is at the time of the snapshot. And while Debian has a lot of great qualities, user-friendliness isn't one of them (hence the need for Ubuntu). Debian focuses on F/OSS principles (DFSG), stability, security, and portability - Debian isn't going to reorder everything in the name of user-friendliness.
Second, it'd break compatibility with every other Linux program out there. Despite the fact that Ryan seemed to think it's a pain to install things that aren't from Ubuntu's servers, it's quite common. If Ubuntu rearranges things, it'd break everything else from everyone else.
Third, it would be a tremendous amount of work. I don't have a number off-hand, but Ubuntu has a huge number of programs available in its repos that would have to be changed. Theoretically it could be done with a script, but it risks breaking quite a lot for no real gain. And this would have to be done every six months from the latest Debian freeze.
I disagree with the evaluation of the package manager.
First, there's a repo for almost anything. I quickly got used to adding a repo containing newer builds of a desired app and then installing via apt-get.
Second, with a few exceptions, you can just download source code and then install via "./configure; make; sudo make install" (see the sketch after this list). This usually works very well if, before running those commands, you have a quick look at the README and install the required dependencies via apt-get (the versions of the dependencies in the package manager are almost always fine).
Third, and most importantly, you can simply update your whole Ubuntu distribution via dist-upgrade. True, you might occasionally get issues from doing that (the ATI/NVIDIA driver comes to mind), but think of the convenience. You get a coffee while "sudo apt-get dist-upgrade" runs, and when you get back, virtually EVERYTHING is upgraded to a recent version. Compare that with Windows, where you might waste hours upgrading all your apps (think of coming back to your parents' PC after 10 months and discovering all the apps are outdated).
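To put the second and third points in concrete terms, here's a rough sketch (assuming a standard autotools-style source tree; adjust to whatever the README says):
sudo apt-get install build-essential
./configure
make
sudo make install
And the whole-system upgrade mentioned above is just:
sudo apt-get update
sudo apt-get dist-upgrade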
I concur - while most of the article is quite good, Ryan really seemed to have missed quite a bit here. His analysis of it seemed rather limited if not misleading.
Not everything *has* to be a package - I have various scripts strewn around, along with Firefox 3.6a1 and a bunch of other things without having them organized properly as .deb's with APT. The packaging system is convenient if you want to use it, but it is not required.
Additionally, Ryan made it seem as though everything must be installed through Synaptic or Add/Remove and that there were no stand-alone installers along the lines of Windows' .msi files. It's quite easy on Ubuntu to download a .deb file and double-click on it. In fact, it's much simpler than Windows' .msi files - there are no questions and no hitting Next. You just give it your password and it takes care of everything else.
The one area where I agree with Ryan is that there needs to be a standardized, easy, GUI way to add a repository (both the address and the key) to APT. I have no problem with doing things like >>/etc/apt/sources.list, but I can see where others might. I suspect this could be done through a .deb, but I've never seen it done that way.
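In the meantime, the manual route isn't too bad either; it looks something like this (the repository line and key ID below are placeholders):
echo "deb http://ppa.launchpad.net/someuser/ubuntu hardy main" | sudo tee -a /etc/apt/sources.list
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys KEYID
sudo apt-get update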
Something I've been fishing for here and have not yet seen much of are requests for benchmarks. Part 2 is going to be 9.04 (no multi-Linux comparisons at this point, maybe later) and I'd like to know what you guys would like to see with respect to performance.
We'll have a new i7 rig for 9.04, so I'll be taking a look at a few system level things (e.g. startup time) along side a look at what's new between 8.04 and 9.04. I'll also be taking a quick look at some compiler stuff and GPU-related items.
Beyond that the board is open. Are there specific performance areas or applications that you guys would like to see (no laptops, please)? We're open to suggestions, so here's your chance to help us build a testing suite for future Linux articles.
I'd like to see the differences in PPD in World Community Grid between Windows and various Linux distros.
I never really see AT talk about WCG or other distributed computing, but I figure if I'm gonna OC the crap out of my cpu, I might as well put it to good use.
Cross platform testing is pretty difficult, considering there are a multitude of different apps to accomplish the same task, some faster, some slower. And then there's the compiler optimizations for the same cross platform app as you mentioned in the article. However, I understand that from an end user's perspective, it's all about doing a "task". So just to throw a few ideas out there involving cross platform apps so that it's a bit more comparable...
- Image or video conversion using GIMP or VLC.
- Spreadsheet calculations using the Open Office Calc app.
- Performance tests through VMware.
- How about something java related? Java compiling, a java pi calculator app, or some other java single/multi threaded test app.
- Perl or python script tests.
- FTP transfer tests.
- 802.11 b/g/whatever wireless transfer tests.
- Hard drive tests, AHCI. (I read bad things about AMD's AHCI drivers, and that Windows AHCI drivers were OK. What about in Ubuntu?)
- Linux software RAID vs "motherboard RAID", which is usually only available to Windows.
- Linux fat32/NTFS format time/read/write tests vs Windows
- Weren't there some thread scheduling issues with AMD Cool'n'Quiet and Windows that dropped AMD's performance? What about in Linux?
While I'm brainstorming, here are a few tests that are more about functionality and features than performance:
- bluetooth connectivity, ip over bluetooth, etc
- printing, detecting local/network printers
- connected accessories, such as ipods, flash drives, or cameras through usb or firewire
- detecting computers on the local network (Places -> Network)
- multi channel audio, multi monitor video
Just for fun:
- Find a Windows virus/trojan/whatever that deletes files, unleash it in Ubuntu through Wine, see how much damage it does.
I know you've said in previous comments that using Phoronix Test Suite for benchmarking different OSes (e.g. Ubuntu vs Vista) won't work because PTS doesn't install in Windows, but you could probably use a list of the available tests/suites in PTS as a place to get ideas for commonly available programs in Windows/OSX/Linux.
I'm pretty sure that Unigine's Tropics/Sanctuary demos/benchmarks are available in Windows, so those could bench OpenGL/graphics.
Maybe either UT2004 or some version of Quake or Doom 3 would work as gaming benchmarks. It's all going to be OpenGL stuff, but it's better than nothing. You could also do WoW in Wine, or Eve under Wine to test some game compatibility/performance.
Once you get VDPAU working, I'd love to see CPU usage comparisons between windows/linux for media playback of H.264 videos. And also, I guess, a test without VDPAU/VAAPI working. Too bad for ATI that XvBA isn't supported yet... might be worth mentioning that in the article.
You also might want to search around for any available OpenCL demos which exist. Nvidia's newest Linux driver supports OpenCL, so that might give you a common platform/API for programs to test.
I've often felt that DVD Shrink runs faster in Wine than in Windows, so the time to run a DVD rip would be nice, but might have legal implications.
Some sort of multitasking benchmark would be nice, but I'm not sure how you'd do it. I can see a way of writing a shell script to automatically launch multiple benchmarks simultaneously (and time them all), but the Windows side is a little tougher for me (some sort of batch script might work). Web browsing + file copy + transcoding a video (or similar).
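On the Linux side, a minimal sketch of the idea could look like this (the workloads and file names are placeholders; anything comparable would do):
#!/bin/bash
# run three workloads at once and report how long each takes
( time tar czf /tmp/docs.tar.gz ~/Documents ) &
( time ffmpeg -i input.avi output.mpg ) &
( time cp big.iso /tmp/ ) &
wait   # block until all three finish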
Ooh... Encryption performance benchmarks might be nice. Either a test of how many PGP/GPG signs per second, or copying data between a normal disk partition, and a TrueCrypt partition. The TrueCrypt file copy test would be interesting to me, and would cover both encryption performance and some disk I/O.
One last suggestion: Folding@Home benchmarks. F@H is available at least in CPU-driven form in Windows/Linux, and there's F@H benchmark methodologies already developed by other sites (e.g. techreport.com's CPU articles).
Well, that's enough for now. Take or leave the suggestions as you see fit.
you are out of luck here ... linux does not compare to windows because they are both different architectures. you already did what you could in the article.
especially in a binary distribution like Ubuntu, compilation speed tests are meaningless (but Gentoo folks would kiss your feet for that).
boot-up times are also not useful. the init scripts and even the init mechanisms are different from distro to distro.
compression/filesystem benchmarks are halfway usable. on windows you only have NTFS these days. on linux there are like 20 different filesystems that you can use (ext3/4, reiser, jfs and xfs are the most used). also quite a few distros offer lvm/evms backends or software raid.
I do not think there is much benchmarking you can do that will help in linux vs windows, or even ubuntu vs windows, because the same benchmarks will differ between ubuntu versions.
the only usable types are wine+game vs windows+game, native linux game vs the same windows game (mostly limited to the Unreal and Quake engines), some povray/blender tests and application comparisons (like you did with the firefox javascript speed).
Not really a benchmark per se, but I'd be curious to know how the stereotypes of Windows being bloated and Ubuntu being slim and efficient translate to power consumption. Load and idle would be nice, but if at all possible I’d be much more curious to see a comparison of task energy draw, i.e. not so much how long it takes them to finish various tasks, but how much energy they need to finish them.
I know that'd be a very difficult test to perform for what will probably be a boring and indeterminate result, but you asked. :)
is there some kind of cross platform test that can be done to test memory usage? Maybe Firefox on both platforms? not sure.
By "no laptops", I presume you mean, no battery tests (therefore power and as a consequence, heat). That would have been nice though. Maybe for those looking for a 'quiet' setup.
but yes, definitely GPU (including video acceleration) and the GCC vs Visual Studio vs Intel compiler arena (along with some technical explanation why there are such huge differences)
If you can, find games that are reported to work well under WINE and benchmark those against the same games running natively in Windows. It'd be interesting to see how the various differences between the two systems, and WINE itself, could affect benchmarks.
Number 1 use of Ubuntu is probably going to be for netbooks/low end desktops for people who just wanna browse the net.
In that case, the browsing experience (including flash) should be investigated.
Boot up time is important.
Performance with differing memory amounts would be nice to see (say 256MB, 512MB, 1GB, and 2GB or higher). Scaling across cpus would be nice.
Ubuntu as a programming environment versus windows would be good to see, including available IDEs and compiler and runtime performance.
Ubuntu as a media server/HTPC would be good to see. Personally, I have my windows box using DAAP shares since Ubuntu interfaces with it much better than Samba. And as an HTPC, XBMC and Boxee are nice, cross-platform apps.
Finally, how Ubuntu compares for more specific applications. For instance, scientific computing, audio editing, video editing, and image manipulation. Can it (with the addition of apps found through its Add/Remove Programs app) function well enough in a variety of areas to be an all-around replacement for OSX or Windows?
Speedwise, how do GIMP and Photoshop compare doing similar tasks? Is there anything even on par with Windows Movie Maker?
What's Linux like game wise? Do flash games take a noticeable performance hit? Are the smattering of id software games and quake/super mario bros/tetris/etc clones any good? How does it handle some of the more popular WINE games?
1.) I've been completely ignorant of software development on Windows over the last few years. Comparison of MS Visual Studio vs Eclipse or vs Netbeans/Sun Studio? How fast are CLI C++ apps on Windows vs. Linux? Perhaps using both GNU and Intel C++ Compiler toolchains on Linux. And possibly MS Visual C++ and Intel Visual C++ on Windows.
Perhaps less esoteric, 2.) instead of benching SMB/CIFS on Windows vs Samba on *nix, bench something *nix native such as scp/sftp or nfs. Netperf.
3.) Number-crunching stuff. I guess this is sort of similar to running at least a few synthetic benches. LINPACK or some other test that uses BLAS or LAPACK, tests that use FFTW. Maybe even SPEC (I wouldn't expect any exciting results here, though, or are there?)
I have a question about your benchmarks that involve files, such as copying and zipping. When you run your benchmarks, do you run them multiple times and then get an average? I ask that because I have learned that in Linux, files get cached into memory, so subsequent runs will appear faster. I suspect the same thing happens in Windows. Do you take that into account by clearing cached memory before each run?
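If it helps, the usual trick on Linux for flushing the page cache between runs (assuming a reasonably recent 2.6 kernel) is:
sync
echo 3 | sudo tee /proc/sys/vm/drop_caches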
The LTS is really for the same types of people who avoid grabbing the latest MS service pack - i.e., anyone who's still running Windows XP SP2 with IE6. Do that comparison and see how they stack up.
Ubuntu is little more than a tight integration of many well-tested packages; there's no reason to go with Ubuntu's LTS when everything else already goes through its own extensive testing. Given how quickly open source software advances, I'd say the LTS is probably less stable than the most up-to-date versions, and certainly far behind on usability.
You want the equivalent of Ubuntu's LTS in Windows? It most closely matches the progression that the Windows server versions follow.
To put things in perspective, 8.04 was released shortly after Vista SP1 and XP SP3 were. So Hardy vs. XP SP2 (a 4 year old SP) is a pretty poor comparison.
You'll see an up to date comparison in part 2 when we look at 9.04.
I'm glad you did this article. It really has been something I think about. I'm ready to read your Part II. As others have mentioned, I have a couple of other articles that would be great.
1) The comparison of the various versions as mentioned. SuSe, Ubuntu 9.04, BSD, etc...
2) Someone mentioned VirtualBox. I'd love to hear more about this including a detailed setup for the normal user. I'd love to be able to surf while in Linux, but able to play games in Windows and keep them separate for added security.
Thanks for the article! Hope to see one or both of the ideas mentioned above covered.
"Not that it would necessarily be of much use, the last time I saw any statistics for instant messaging network usage, the vast majority of North American users were on AOL’s AIM network."
IM use is highly regionalized. As such, AIM is clearly the dominant IM in the USA. However, Canada is dominated by MSN Messenger, and has been for many years (most of us migrated from ICQ to MSN around the release of Windows XP, I believe, due to the bundling of then Windows Messenger).
So, if Canada is dominated by MSN (and I can't speak for Mexico), it's misleading to claim that "the vast majority of North American users" are on AIM. As a Canadian, I can't think of anybody I know in person who uses AIM. They all use MSN or Google Talk, without exception.
For myself, the thing that most bugs me when I have to go back to Windows is all the missing features from the window manager. I've come to rely on having multiple workspaces on my desktop, but I can adjust to having just one fairly easily when I'm not working on a lot of different stuff at once. What really bugs me, though, is how much more effort it takes to move or resize windows in Windows. On Linux I can press ALT and then click anywhere on the window, but with Windows I have to carefully click the title bar or the very edge of the window and that takes a noticeably longer time once you're used to doing things differently.
Oh, and I find that the Linux scheduler seems to be noticeably better than the Windows one in preserving responsiveness when the system is under load.
How about some benchmarks of "minimal" distros (like Puppy, Tiny Core, ...)?
I like the idea of "resurrecting" an old PC, but I would like to see benchmarks on quad cores and i7 too!
Anandtech is great, Bench(beta) is awesome!!
(sorry for my bad English)
A few months ago most major trackers unbanned Transmission, but it still doesn't seem to be universally accepted on private trackers.
I remember offhand (I could be wrong) that the main gripe was due to the fact it made excessive queries and thus flooded trackers with requests, or had the ability to.
I think you really need to mention the big picture here.
I myself just tried Ubuntu for the first time 2 months ago. I will admit that I spent up to 8 hours trying to figure out how to install a specific program (before I found out there was a way to get the package manager to find it), and at times I wanted to smash my computer. But now that I have learned quite a bit more, I realize that the few things I have installed work great and flawlessly.
Anyhow, back to the big picture. I can understand some of your concerns with how the OS will work with specific programs, but what I have found is that most people I know use their computers for 2 things: email and web browsing. Most of these people are constantly having problems with the system running too slowly and can't seem to get rid of hidden viruses/malware. So I think those people could easily be much happier with a simple OS like Ubuntu just for email and web browsing (and I would get a heck of a lot fewer calls from my dad asking me why his computer is running too slow). Let's also not forget that everything is moving to be browser compatible (like you mentioned).
Also, for people like myself, I use my Ubuntu system for a file server as well as a media center (XBMC is Awesome).
So yes, for burning DVDs/CDs, playing games, or Microsoft Office, I see no reason why you wouldn't use Windows, but I think 95% of users would be perfectly fine with Ubuntu, which is something Mr. Bill would not be very happy about once the public realizes it.
I think you have missed one small but important part.
I have been an Ubuntu user since 8.04. I came to Linux because of the constant threat of viruses.
Last month I installed 7, and it is very user friendly, but after Avira AntiVir got crashed by a virus I installed Kaspersky Internet Security 2010. It then took almost twice as long to boot, so I gladly returned to Ubuntu 9.04. Because Microsoft cannot exist without antivirus, I think you should do some real benchmarking and test Windows WITH antivirus.
On Ubuntu I have ClamWin just in case I get some files from Windows users :)
Thanks
just wanted to point out that you can install software under the LiveCD. Of course it does not install on the hard drive. It remains on a ram-drive, so when you reboot, it's gone. It's still useful, if you wish to test out some package or perform some task with a tool not installed by default on the LiveCD
Even more useful (and not mentioned) is that Ubuntu can easily run off a flash drive, and more recent versions even include a GUI tool for installing it to one. Then all installs and other changes are saved from session to session, and everything runs much more quickly than the LiveCD.
It would be great if you could do more articles on compiler and especially driver performance differences. That was the most interesting part of this article.
This is what Part 2 will look at. I can compile some stuff by hand to see if it closes the Windows/Ubuntu gap, and I have plenty of video cards on hand to test what I can when it comes to graphics.
Maybe I'm missing something, but this appears to be a new article.
Why are you reviewing a year-old version of Ubuntu? There have been nearly 3 releases since then (Ubuntu is on 9.04 now, with 9.10 coming very soon).
It's important to review the most recent version, as Ubuntu is totally unlike the Microsoft world in that new releases are frequent (every 6 months) and bring real practical improvements.
Great article. I look forward to reading the follow up.
One comment on security that I would like to make. The commercial Linux vendors (IBM, Novell, Redhat, etc.) are all VERY dedicated to ensuring Linux security, as many/all of their server products use Linux, and changes they make will filter back down to the Linux desktop community. This is something that OSX does not have to nearly the same degree.
My experience with running Linux on the desktop sounds pretty much the same as yours.
-Games killed it in general. I don't usually have a top-of-the-line system, so I'm usually pushing my computer to its limits to run newer games under Windows. Also, I hate dual booting, and most of the FOSS I use is available as a compiled binary for Windows.
-Drivers killed it in one specific instance with an older laptop, as I never got NdisWrapper (required for my wifi card's Windows drivers) to run better than intermittently. I spent way too much time messing with it.
[quote]and for the price you’re only giving up official support.[/quote]
Ubuntu doesn't have free official support, but neither does Microsoft. Apple does give 90 days free phone support, to their credit, but after that you have to pay.
You can always hire an expert (from ms, or apple, or a third party) to help you, but that's also true with ubuntu, though I expect there are fewer such experts to be found.
MS, Apple, and Ubuntu all offer free web-based help, both community maintained and "officially" maintained.
So I think it's misleading to imply that going from Windows or Mac to Ubuntu means you're downgrading your support options. People overestimate just how "supported" their operating systems are. Also, Linux / Ubuntu releases fixes and updates much more quickly than Apple or MS, so your chances of hitting a bug are lower in the first place. (MS maintains a huge knowledgebase of bugs they haven't bothered to fix yet and might have a workaround for - but I hardly see that as a positive.)
While I have used computers for 20 years or more, I am not a techie. I am much more interested in an experience that "just works".
When Vista came out I decided to explore the Linux desktop world. I have been using it as my primary system (still keep the dual boot option for XP) for just under 2 years.
I agree that "free" and security are big considerations for moving to a Linux desktop environment. However, there are some other items (and you might class them under security) that I like - because of the file structure, you don't have to periodically defrag your system. Both systems have a lot of updates, but so far I have not gotten the feeling that my Ubuntu system is gradually slowing down and clogging up with a lot of useless files (you don't see a lot of ads for utilities such as registry cleaners :)). I no longer experience the MS ripple effect - when MS sneezes, other Windows apps may get a cold.
That is not to say that there cannot be issues. My pet peeve has been that my sound has disappeared on a couple of occasions after downloading updates. Using Google and the Ubuntu documentation, I have been able to get it back up - but I wish that wouldn't happen. Then again, Windows updates can on occasion cause issues too.
I think you made a very valid point about the issue of tech support. Google has made a big difference in problem solving.
I've tried using Linux (usually Ubuntu) as a full replacement desktop on and off for the last few years. I've gone back to Windows every time after a while. Some key points:
1. For my desktop usage, there honestly isn't anything that Linux does better, in terms of functionality, than Windows
2. Windows is cheap enough that I do not mind spending the money on it. For the $100 that I spent for Vista 64 Home Premium OEM, it is quite worthwhile even if I only use it for 3 years. Yes, there are more apps out of the box for Linux, but it's usually easy to find freeware for Windows with the same functionality. Even Office is now pretty affordable with the Home & Office version.
3. Games - Wine just doesn't cut it. When I want to play a new game, I want to buy it and play it immediately! I do not want to have to do research to see whether some game would work on Wine before I even buy it. I do not want to spend hours troubleshooting on the internet if something doesn't work right.
4. There's always something that you want to change in Linux that you can't figure out. Yes, usually the solution is on the internet. And I used to even enjoy spending time and looking for the solution. But, it eventually grew old. Now I just want things to work and keep working.
Note that I do love Linux and actually have a server that doubles as a MythTV HTPC setup. It's a beautiful thing. I am comfortable with shell commands and frequently use SSH to perform multiple functions remotely. My opinions above are purely based on desktop usage.
Great article, Ryan! Putting out some well written Linux articles really adds depth to your site. I have been reading this site daily for years and this article is prompting my first post.
For future articles it would be great to see some Linux benchmarks in most of the hardware reviews. There are some excellent tools out there (check out http://www.phoronix-test-suite.com/). This would also give some closer apples-to-apples comparisons for Mac vs. Linux performance. I for one would LOVE to see SSD articles report some Linux (and OpenSolaris/ZFS) benchmarks along with all the Windows tests.
Users often don't realize how much they benefit daily from open source software. I don't think most Mac users realize all the OSX pieces that are used in the background for which Apple leverages open source code (Samba for SMB access and sharing, Webkit for Safari, etc.). Home NAS and enterprise storage which serve files in Windows environments are often *nix based.
It is also a myth that open source means that developers aren't paid. Most enterprises recognize that implementing even commercial apps can require considerable internal development manpower. If enterprise developers can utilize open source code internally and contribute back to the code base, the companies save considerable money and benefit from a healthy software development ecosystem. There are thousands if not millions of developers employed to work on open source code.
Please keep up the good work. I am looking forward to your next article.
Unfortunately the Phoronix Test Suite doesn't work under Windows, so it's of limited utility. It's something we may be able to work in to hardware reviews, but it's not really applicable to OS reviews.
What I'd like to see in the next Ubuntu version is softer and smoother graphics and font rendering. I hate the way GNOME renders graphics and fonts; they look like an old operating system. Using the MS core fonts somehow helps, but not much.
I know there's Compiz and friends, but I just wish it came by default, so there's no need to hassle with Compiz and its settings. I wish rendering could be as soft and smooth as in Windows and Mac OS X.
The look and feel should be tweaked more often! :D
Nobody's going to agree with the entire article. I'm just glad to see Anandtech paying some attention, and would welcome any articles, tests, reviews, etc.
It's embarrassing to visit the "Linux" tab and see the latest article was posted in July of 2005...
Mint is based on Ubuntu, and I installed it this past weekend. I am having certain issues with it. Yes, it is free. Overall I like it very much and am pleasantly surprised. But this has shown me that Windows 7 will be a comparative bargain. I do not have the time to sit in front of the computer and play with Linux, trying to find out why certain videos don't play, why I am having eye strain, why clicking on an audio link does nothing, and a few more things. When I go to the Mint forums I am confronted with a Tower of Babel what with all of the acronyms, and I'm told to go to the terminal and type $surun%(8#**#. Ok, now turn your head and cough.
I'll keep Linux on this machine to boot up and play with now and then. It beats solitaire for the time being.
You hit on a good point. People I've set up with dual-booting Linux distros and Windows begin to appreciate what they are paying for with Windows. The typical response is: "This is cool (Ubuntu) and I can see why some people like it. But I'm going to stick with Windows, it's worth the money to me."
They appreciate that Linux could work, but see the "value" in paying for something familiar.
I run Vista on my main PC. Vista on all the spare LAN gaming PCs. I have an Ubuntu 9.04 VM and Ubuntu Netbook Edition on my old tablet PC (small and netbook like).
Just out of curiosity, what user mode were you having guests run in? Even in Vista I don't provide anything greater than a standard user account. With that, guests need my password (which they don't have) to mess my machine up. Going back as far as Windows 2000, as long as you pair Windows with good anti-spyware (Spybot, or Defender for XP if you choose) and antivirus (I like Avast and AVG, both free with nil footprints), you basically don't have to worry about system security as long as the person is running a standard user account.
On my parents' system, we went from having to wipe and reinstall Windows every time I came home from college to a rock-solid system that absolutely never failed once I performed these steps. I still like the XP/2000 behaviour of simply denying access better than the current UAC implementation. But Vista 64 + UAC (active) seems to be secure enough, particularly when paired with the aforementioned antivirus software.
For what it's worth, it's an admin account. I know, I know, I could do Limited User. But that tends to just elicit complaints. XP's Limited User mode is embarrassing compared to how well Vista/Win7 does it.
Since it's basically just a web browsing laptop anyhow, it's basically a perfect fit for Ubuntu since I wouldn't need to be concerned with Windows malware period.
I have to agree. Even XP in its standard/limited user account mode makes it quite hard for stuff to install itself, though not impossible. (Vista and Win7 with UAC on and a standard account, with the admin account password-protected, should prevent the system from being messed up.)
* for me, the best possible way to install applications on any OS, but especially in one that is free (libre), is as follows: you search on the internet for the best program to meet your needs, you find it, you copy some code that identifies it, and paste that into your package manager, which then connects to some database, checks that the program is not malware, looks for the latest version, and proceeds to download and install it, not caring whether it's open source or not; this would beat Windows/OSX by a wide margin, and also the current Ubuntu system, whose "we don't like this software, on philosophical grounds, so it's going to be a pain in the ass for you to install it" attitude is a bit too problematic
* it would be nice if the "auto" option in the installer told you what it's going to do with your hard disk before going on to do it; I never use it, out of fear it might try to do something I don't like
* I missed some comment on that section on how Photoshop CS3 costs a lot of $$$, while GIMP is free
* along these lines, the comparison of total costs in time and money of installing windows/OSX/ubuntu, with all their companion programs, is striking
* I haven't checked this lately, but aren't there still problems with VBA compatibility? If I can open my xls/xlsm files but I can't run my macros, it's no good; I have a ton of stuff written in VBA, and I'm definitely not doing all that work again
* the ribbon UI in office 2007 is a royal pain: it's only good for the "It looks like you're writing a letter" users, and you can't get rid of it; there's a lot of people doing real work on excel, and none I talked to likes that ribbon thing, they'd all rather stay with excel 2003
Ryan, nowadays you don't need to dual boot. You can just set up a virtual machine. If you are a gamer use Windows as host and setup a Linux distro as guest. If you have enough memory, 4GB is very good, you can have both perfectly usable at the same time. I'm using Virtual Box and it works great.
I haven't been able to read the whole thing because I'm currently at work, but so far it seems good. Some people have been saying you should be testing 9.04, and I can see their point, but on the other hand, I agree that since 8.04 is the latest LTS release, it should still be pretty stable.
Nonetheless, perhaps you could compare a later non-LTS release to a service pack for Windows? I mean, there is some new functionality and some fixes. Granted, new versions of Ubuntu contain a lot more new features than Windows service packs.
I agree that the 6 month release cycle is too fast. I don't develop for Ubuntu myself, but I imagine a lot of time is wasted on preparing for release twice a year. I mean, there's a lot of testing, bugfixing and documentation to be done, and I would think if you only did that once a year, you would have more time for development. Although, I guess the more changes you make in a release the more you should test, so maybe that's invalid.
I've also never really liked the Linux filesystem and package manager idea. Granted, package managers especially have improved a lot lately, and personally I think we have Ubuntu to thank for that, with its huge focus on usability, which historically Linux hasn't cared at all about.
I also don't like the over-reliance on the terminal/CLI. I don't like that there are certain things that can only be done with it. It's easier and faster for me to do things with a GUI, because we are visual creatures and a GUI is a much better way of displaying information than just plain text. I think until a lot of the Linux developers get over the idea that the CLI is "the only way to go", the GUI will be underdeveloped. As I said, it's only recently that some Linux developers have actually bothered to try to get the various desktop managers up to scratch.
The other thing I find interesting about Ubuntu, is the nerd rage that some Debian developers exhibit towards Ubuntu.
Anyway... when 9.10 comes out, I would love to see your impressions of the difference.
I was one of those waiting for this article. I do remember getting excited when it was promised back in ... (can't recall the year, sorry, it's been too long :) ). Anyway, the wait seems to have been worth it. Excellent article.
A suggestion for part 2: install LinuxMint 7 (apart from Ubuntu 9.04) and see which of the problems you found in part 1 with Ubuntu 8.04 are solved in LinuxMint "out of the box"
I totally agree! To hell with Ubuntu, Mint7 is the best linux distro by far. Before I settled on Mint I tried Ubuntu, Kubuntu, PCLinuxOS (my previous fave), Mepis, Scientific, openSUSE, Fedora, Slackware, CentOS, Mandriva, and RedHat. None could come close to the complete awesomeness, beauty, out-of-the-box completeness, and ease of use as Mint7.
I'm a scientist and I'm using it for sequence and image analysis, so far.
so I got to page before installation and I have so many comments I cannot read further :-)
I have been using Linux on and off as my main desktop system since Red Hat 6.0 (that's kernel 2.2, IIRC), so some 10 years. My job is Unix admin, so I am obviously biased :-)
1. virtual desktops - while this heavily depends on your workflow, it helps keep unrelated windows from occupying the same space. I used to have one for IM/email, one with just a web browser, one with my IDE and work stuff, and one for GIMP and Blender. While this is my preference, it helps to kill the notification hell that is Windows. I hate how Windows steals focus from whatever I am working on just because some unimportant IM event just occurred.
2. package manager and filesystem - given my background, the Linux FHS is second nature to me. However, you failed to grasp the importance of the package manager here. It effectively hides the FHS from you so you do not need to clean up manually after an uninstall. The only directories you should ever go into manually are /etc, your home dir, the system mount directory and whatever the log directory is. If you need to access other directories manually, then you are either a system developer, a programmer or too curious :-)
Also, you can usually one-click install .deb packages and they appear in the package manager as usual; you just have to manage dependencies manually in that case. Repositories are nice, as you need to set them up ONCE and then all your updates/future versions are taken care of.
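For what it's worth, the manual .deb route usually boils down to this (the package name is made up):
sudo dpkg -i some-app_1.0_i386.deb
sudo apt-get install -f
The second command pulls in any dependencies that dpkg couldn't resolve on its own.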
3. missing executable icons - this has a lot more background to it, but it is a mistake to use Nautilus in the default icon mode. You basically cannot live without ownership/permissions displayed on a Unix system; trying to hide this in any way in a GUI is a capital mistake. That's why a Windows Explorer-like file manager is not usable under Linux. Good old MC :-) Anyway, an executable file can be anything from a shell script to a binary, and you just have to have the correct launcher registered in the system to open anything - basically the same as Windows, just not as GUI-friendly.
4. NVIDIA/ATI drivers - this is a story in itself. Use NVIDIA if you want ease of use; use ATI if you want to learn about the kernel and X :-) Dig through phoronix.com for more info.
ok I will post more comments as I read further :-)
so I read the whole article. I would have some more comments :-)
1. installation - for me this was never a problem on any Linux distro I was using. My partition scheme does not change much, and it is usually the trickiest part of the whole installation process. Try out the full Gentoo 3-stage installation if you want some fun (OK, it is not available via normal means anymore).
2. fonts - as you mentioned with codecs, there are software restrictions and licensing policies governing Linux distributions. MS fonts are licensed under different terms than GPL software - yes, even FONTS have licenses - so they are generally not included in Linux distributions by default.
What I missed from the article is the amount of customisation you can do with a typical Linux distro. Ubuntu alone has 3 main variants, and you can mix and match them at will. You can even have all 3 installed and switch between the window managers per user preference.
Since you did not like the package manager anyway, you missed on the main Linux strength - application variability.
From a common user perspective however, the article is quite correct. I would expect more from a seasoned windows user and AT editor.
Ubuntu 8.04 is a 14-month-old release.
Two versions have been released after it, and a third should arrive in October.
In Windows terms that's a short time, but for Linux it's a lot of time.
I suggest your next review should be done on Ubuntu 9.10 instead of 9.04 (which IMHO is better than 8.04 but still lacks some polish).
As mentioned before, the advantage of CLI instructions is that it will work on any Desktop Environment (Gnome, KDE, XFCE etc.) if it's not related to the DE itself. Moreover it will work on different versions (older/newer).
For example, in Vista/7 I couldn't find Network Connections in the GUI.
But who can stop me from typing "Network Connections" in Explorer's address bar? Sometimes the GUI changes, and even if only a little, most people will fail to follow screenshots. Not to mention that most desktops are so customized (on real geeks' computers) that they look too different. I'm not talking about icons or desktop backgrounds; I'm talking about panels (if any at all), docks, menus, context menus etc. In Linux almost everything can be changed, and old-school geeks who have had their Linux installations for years do these things, so each DE is probably unique. (I have had my GNOME and app settings/tweaks for over 7 years; some of them have probably never changed.)
The trick is that even when you reinstall the system, your personal settings can stay with you. (I jumped from Debian to Ubuntu to Gentoo, back to Ubuntu, to Ubuntu x86_64 and finally to Gentoo x86_64.) After all this, I have not lost any user customization or settings. On the system level it's harder, since Debian and Gentoo are very different. All this gives you motivation to change and to tweak to make things better. Windows users can't really customize, and when they do, it's only valid until they have to reinstall or upgrade their OS. Since most of the Windows users I know reinstall at least once a year, after a few cycles they stay with the defaults for both the OS and applications.
Switching to Linux is not the easiest thing. It's usually not a "love at first sight" story. But if somehow you stay around and get to know it, you can't be separated from it afterwards :)
Even on Windows 7 I feel handicapped in terms of usability and effectiveness/productivity. (And I spend more time in front of Windows than Linux computers.)
Thanks for the time you spent writing this article. I made the jump from Windows to Ubuntu (and Xubuntu for my older computers) back around 7.10 and 8.04, and I went through some of the headaches in adjusting to Ubuntu, but I eventually solved all of them and I'm quite settled in now.
One comment about finding help in the form of command line instructions, rather than GUI instructions. GUI instructions for Ubuntu would not be useful for Kubuntu or Xubuntu, since they use different window managers. The command line solutions usually work for all three.
Also, boot times were noticeably improved in 9.04. Perhaps you can run a quick retest on it.
And you CAN install stuff when using the live CD. I've installed a couple of temperature monitoring utilities when I was stress testing my motherboard.
Finally, thanks again for writing such a thorough look into your Ubuntu experiences. It was a great read in seeing how far Ubuntu has come and what it still lacks.
Yeah, you can set the APT sources to use a CD. There is an option for it under System > Administration > Software Sources, or you can edit the /etc/apt/sources.list file.
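If I remember right, the CLI equivalent is simply:
sudo apt-cdrom add
which scans the disc and adds it as an APT source for you.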
[quote]since SMB is the predominant protocol for consumer file server gear, it’s a fair test of such use.[/quote]
While this comment is not false, it presents a lazy approach to comparison; it's a one-sided contest, and Linux, pitted against Windows on home turf, doesn't stand much of a chance.
You as much as acknowledge this in the article, so why not provide some counterpoint? For example, consumer file server gear, even if it supports SMB almost ubiquitously, is usually *nix-based. So instead of just showing Windows and Linux clients interacting with Windows servers, show them interacting with *nix servers as well. Do some NFS transfers as well; NFS is well supported in consumer NAS these days.
You also really missed the boat on the video drivers. 8.04 was not the first Ubuntu release to include the Restricted Drivers Manager (known simply as "Hardware Drivers" in later releases). This handy app will identify hardware, such as AMD and NVIDIA GPUs, that can take advantage of proprietary drivers, and will offer to install them via Synaptic (APT) with just a click of the mouse. No CLI, no headaches.
Still, a thorough review, and generally well-researched. I'm looking forward to the 9.04 follow-up.
Since you mentioned hardware HD decoding, I recommend taking a look at smplayer from the testing PPA (https://launchpad.net/~rvm/+archive/testing). Unfortunately VDPAU doesn't work with the NVIDIA blobs in the default Ubuntu repos, but I believe there's a PPA providing VDPAU-compatible blobs for anybody not wanting to do CLI installs.
[quote]While this comment is not false, it presents a lazy approach to comparison; it's a one-sided contest, and Linux, pitted against Windows on home turf, doesn't stand much of a chance. [/quote]
This isn't Linux pitted against Windows on home turf, it's Linux pitted against Windows in the real world.
Well, no doubt SMB is the dominant method of sharing files for consumers in general. Obviously comparing Linux to Windows makes sense in a world where Windows is the incumbent, but it's not the whole story.
I hope Part 2 will address some of the objective benefits of Ubuntu, and not fall into the trap of "worse because it's not the same as Windows".
I agree in principle, but there has to be a distinction between "Worse because it's not compatible with Windows," "Worse because it's not as easy as Windows," and "Worse because it's not the same as Windows." Die-hard *nix advocates tend to dismiss the first two as if they were the latter, and this tends to undermine their argument.
Also, in some cases "Worse because it's not the same as Windows" can be a valid point, because the public has been trained to the point that the Windows way is the "intuitive" way. Of course, this isn't truly intuitive, as people who learned Linux first would find Linux methodologies more intuitive - but that's largely a moot point, as that's not the reality we live in today. You could say the same thing about the color red - in the western world, when we see red we can intuitively guess that it means Stop, or Warning, or Error, etc. The fact that this is not an understanding we're born with but rather a socially acquired intuition does not mean it would be any easier to suddenly change the color of traffic lights and expect people to adjust without problems.
All of the NAS gear I can get my hands on is either SMB only, or is a Time Capsule which is SMB + AFP. I don't have anything that does NFS, which isn't so much a comment on testing (I could always set up another box) as it is usefulness. NFS just isn't common on consumer gear; SMB is a more important metric if you're looking at file transfer performance, because that's what most people are going to be working with. This doesn't preclude doing NFS at a later time though.
And the Restricted Drivers Manager is limited to the drivers in the Hardy repository, which means they're a year+ out of date.
Interestingly, if one checks the SmallNetBuilder NAS charts, it looks like out of 87 NAS devices, 49 have NFS. 56% in other words. And you say NFS isn't common? Really now? Seems a little biased to me.
While a lot of your issues have complicated solutions or lengthy technical backstories, I can solve your complaint about SMB shares mounted in Nautilus not being useful in non-GTK applications with one simple command (or, since you seem to hate commands, the GUI can do it too).
theory: make a symlink to the directory Nautilus mounts to so it can be easily accessed. Symlinks to directories or files are transparently (to users and applications) identical to the location they refer to. Windows doesn't have symlinks (only useless shortcuts), so it isn't surprising you weren't aware of the trick.
howto: gvfs uses the directory /home/$USER/.gvfs as a mount point so link to it:
ln -s ~/.gvfs ~/linkname
howto gui: in nautilus go to your home folder then choose view -> show hidden files. Right click on .gvfs and choose make link. Then you can rename the link to whatever you want and hide hidden files again.
hint: symlinks are your best friend. My home dir is littered with links to places on the filesystem I visit a lot to avoid a lot of clicking/typing
The symlink is a very elegant solution, I'm embarrassed I didn't think of that myself. It's a bit of a lousy solution in that there even needs to be a solution, but as far as things go that's a very insightful suggestion.
ntfs has symlinks but the windows shell can't create or manipulate them. Pretty pointless. MS can (and does) use them in vista/7 but you can't make your own
gnome bookmarks are very handy I just find symlinks to be more flexible since they work regardless of gnome vs kde, gtk vs qt and gui vs cli. Even wine can take advantage of them
One more thing you should have covered is battery life on laptops. Linux in general is pretty awful at managing battery life. Just web browsing, I get 4 hours on Vista on my Vostro 1310 (not using 7), but only about 2 1/2 with Ubuntu. It's a huge difference, but oh well.
Laptops are out of our domain; that would be Jarred. If this two-part series is successful, I'll see what I can do about talking him into putting some Ubuntu (or any Linux, for that matter) battery benchmarks in. But I'm told a complete workup takes a while.
On my Thinkpad T43, battery life is essentially equal between XP and Ubuntu. Ubuntu may even be slightly better, though I have never bothered with a formal test to put real numbers on both. Have you looked at whether the processor is throttling down properly or not while in Ubuntu?
"Now we have yet to touch on hardware accelerated playback, which is something we’re going to hold off on until we take a look at Ubuntu 9.04. Linux does not have a common media framework like Windows and Mac OS X have DirectShow/DXVA and QuickTime respectively. Rather the desktop environment that Ubuntu is based off of (GNOME) includes a lesser framework called GStreamer, which is closer to a basic collection of codecs and an interface to them. As such hardware accelerated playback is not as easy to do under Ubuntu as it is under Windows and Mac OS X. We’ll take look at the APIs and the software for this in our look at Ubuntu 9.04."
Well, not exactly. There is VA-API (http://en.wikipedia.org/wiki/VaAPI), which is not exactly widespread yet. And there's NVIDIA's VDPAU, which provides hardware acceleration for H.264 and, if you have the latest version of PureVideo in your card, VC-1 as well. It can work with VA-API or on its own.
Also, while it is wacky that bin or binaries are in one spot, lib or libraries are in another, and anything you install yourself ends up under /usr/local, it does keep all related files in one spot. Keeping all your libraries registered as packages also makes it easy to repair things.
Also, one-click installs are possible on openSUSE as well, though it does involve a small GUI process and adding a repository. But doesn't any program require this?
Also, I believe your problem with SMB shares is something I run into as well, but only on GNOME. On KDE, browsing shares is quite normal. Since I never bother mounting the share, I can't directly access it without KDE caching the file locally.
Isn't /home/$Your Name$ intuitive as to where your stuff would be? That is very nice, as I can keep my stuff separate from the OS, thus I can reformat the OS partition without having to touch my data. Imagine reinstalling Windows and finding all your apps working exactly as before with no work. Can OSX do that (not rhetorical)?
I'm sure the comment section will quickly be swamped with quibbles, so I just wanted to say that I found this article to be very informative, accurate (WRT my own Ubuntu experiences), and thorough. Kudos - now it's time to ask for a raise. :)
I see you shared a lot of the same problems I had with Ubuntu when I first got it. Yeah, it's harder, I won't lie, and it's a pain in the ass when it doesn't work. But when it works, you love it, and you feel like more of a man. I use it for my web server, runs very nicely.
Ubuntu sometimes makes you want to shoot it with a m249, but at other times you feel superior to other users. But that's because you are using the terminal all the time and are actually smart, Mac users just need to be shot in the face for their ignorance.
I think a lot of your problems would have gone away by using the newer versions, though, specifically with the package manager. There's much less need for finding things outside of it when you're using the new versions. Even video drivers can usually be put off for 6 months or so if you're not too cutting edge. Leaving the package manager behind is a pain, though, as you found out. You tried to explain that the LTS version was more comparable to Windows/OSX, but in truth very very few desktop users continue to use it. In fact, I'm not aware of any. It's really only used by companies for work machines who don't want to make large changes every 6 months like home users can.
MSTT fonts: good luck trying to get those by default; they're owned by Microsoft, who is in no mood to simply give them away to their competitors. Installing them is like installing the patent-encumbered video codecs - at your own risk, which is minimal as long as you aren't trying to make money off of it.
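If you do want them, the package is in the multiverse repo as far as I recall:
sudo apt-get install msttcorefonts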
It should be mentioned that Red Hat put down some money to buy some nice new fonts a while ago, called Liberation, that are much nicer than the default serif ones this old Ubuntu version was using. Still different than the MS ones, though, which is going to cause some people problems. Also, the font anti-aliasing differences are again due to patents owned by other companies, but there's good news there. They're supposed to expire later this year so better font rendering in Linux should be coming soon! You can already get it working manually, but the distros make it hard to setup.
You mentioned you chose Ubuntu because it was supposed to be user-friendly, which I regard as one of the more puzzling wide-spread myths that go around. Sure, it's a lot simpler than Debian, or some other choices, but it is definitely NOT the distro to choose if you're looking to avoid the CLI, as you found out.
On that note, I would HIGHLY encourage you to eventually go back and do another review (part 3?) that uses a KDE based distro. Maybe try out OpenSUSE next fall, for example. Although KDE is going through a bit of a transition now, it's definitely where all the more interesting stuff is going on. As you said, Gnome is a lot like a boring Windows XP environment, which is both a positive and a negative. KDE is quite different, for better or worse, and is worth a look I think. For one thing, that smb://COMPUTERNAME address will work out of the box in KDE apps. If you do try KDE, I highly recommend another distro besides (K)Ubuntu, though, because they simply don't put any resources into their KDE implementation and it shows.
Ubuntu's KDE flavour has more options to play with that are missing in GNOME (though GNOME's task monitor is far better than KDE's; in the long time I've used Linux, it has been like a Linux version of the Windows XP Task Manager - just the processes page, but very detailed).
Ubuntu should be easy to use, but it lacks easy driver installation and still does not offer a failsafe VGA mode: if X fails to start you're stuck with a command line. It should try a second time in safe VGA mode, but it does not.
You mentioned Linux on netbooks, and I thought I would mention that I found Moblin (www.moblin.org) from Intel very impressive. It's still in beta and a little rough around the edges, but it boots faster than XP resumes from hibernate - around 15 seconds from the BIOS screen - and the UI is designed around small screens. After using it for a few hours and then installing Windows 7, I immediately missed how well Moblin was optimized for the low-res small screen. I had to install W7 because the ath9k kernel module drivers are unstable in Moblin; if not for this I would probably keep it as the primary OS on my netbook.
I've been using Ubuntu 9.0 for a year on my Dell notebook and I love it. I don't see limitations in my work; the only problem is my company doesn't allow it on the network, but it is my OS at home.
I'm still reading it, but on my xubuntu 8.04, my firefox is located in /usr/bin/firefox. Most apps are under /usr/bin.
Also, the directory structure is definitely VERY different from Windows. One main difference is that everything that belongs to the user is supposed to be under /home. Everything that belongs to the "system" is everywhere else. I think the theory is that the user stuff is "sandboxed" in /home, so he doesn't mess things up in the system for everyone else.
You have the same in Windows under %SystemDrive%\Documents and Settings\user, although many settings are stored in the registry (which can be said to be the equivalent of /etc). It's nevertheless where programs like Firefox save their settings and where you have your My Documents and temp files.
* %SystemDrive% is a variable that substitutes for the drive letter on which Windows is installed, which can be something other than C:.
The question is, who cares where Firefox or any other application's binary is installed? It's not as if you'll go searching for it to run it. They are on your execution PATH, which means you can just press Ctrl+F2 and type their name, or open a terminal, or access them from the applications menu.
My favourite way is to use something like gnome-go (or krunner in Kubuntu)
PS: yes, all package-manager-provided applications have their binaries in /usr/bin, and most user-built ones go in /usr/local/bin by default, which is also in your $PATH.
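And if you ever build something into a non-standard location, adding it to your path is a one-liner in ~/.bashrc (the directory here is just an example):
export PATH="$PATH:$HOME/myapps/bin"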
For incoming connections I don't quite grasp what good a firewall will do on a system with no internet-facing services. With no open ports you stand little to gain from adding a firewall, and any internet-facing service you might add, well, you don't want to firewall that anyway.
I can see two theoretically plausible arguments for a host-based firewall, but even these don't really stand up in real-world use: 1) a machine that has open ports out of the box (I'm looking at you, Windows), and 2) for the folks who want to police outgoing connections.
In the case of the former, why would we open ports and then block them with a firewall, right out of the box? This makes as much sense to me as MS marketing their own antivirus. Third-party firewalls were rightfully introduced to remedy the silly situation of computers listening on networks where they shouldn't be, but the idea of MS producing a host-based firewall instead of just cleaning up their services profile defies common sense.
In the case of outbound firewalling, I've yet to meet a home user that understood his/her outbound firewall and managed it half-way effectively. Good in theory, usually worse than useless in practice.
Just because a port/service is open, doesn't mean you want it open to the whole world.
Examples:
SMB
NFS
VNC
RDP
SSH
Web (intranet sites, for example)
And the list could go on... and on and on and on, really.
Also, it's erroneous to assume that only 1st party software will want to open ports.
And that is to say nothing of the possibility of ports being unintentionally opened by rogue software, poorly documented software, naughty admins, or clumsy admins.
Host-based firewalls help with all of these situations.
Windows firewall doesn't filter by source. In other words, if you want SMB or any other service open to some peers and not others, Windows firewall can't help you; you'll need a more sophisticated product or a hardware firewall for that.
I'm not saying there's no case for host-based firewalls, I'm just saying it's pointless for most users out of the box, where Ubuntu doesn't need it and Windows should be looking at fixing the problem of unneeded services running, rather than just bolting on another fix.
"I can see two theoretically plausible arguments for a host-based firewall, but even these don't really stand up in real-world use"
That sounds to me like a claim that there is little or no case for a host-based firewall; at least, that's how I interpreted it.
"Windows firewall doesn't filter by source. In other words, if you want SMB or any other service open to some peers and not others, Windows firewall can't help you"
That is incorrect, and you should check your facts before making such statements. The Windows Firewall can filter by source. Any firewall exception that is created can be made to apply to all sources, to the local subnet only, or to a custom list of IPs and subnets.
The firewall in Vista and Windows 7 goes a step further, as it is location aware. Different ports and services are opened depending on the network you're plugged into, as exemplified by the default behavior of treating all new networks as "Public" (unknown and untrusted) until instructed otherwise.
"The Windows Firewall can filter by source. Any firewall exception that is created can be made to apply to all sources, to the local subnet only, or to a custom list of IPs and subnets. "
In that case I retract my assertion that an out-of-the-box firewall makes no sense in the case of Windows.
As for Ubuntu, or any other desktop OS having no open ports by default, I still see including an enabled firewall by default as superfluous. Meanwhile, firewall GUIs exist for those wishing to add them.
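For what it's worth, if you do want per-source filtering on Ubuntu, here is a minimal sketch with ufw (the firewall front end shipped with Ubuntu); the subnet and port are just placeholders:

sudo ufw allow from 192.168.1.0/24 to any port 22 proto tcp   # SSH only from the local subnet
sudo ufw enable                                               # turn the firewall on
sudo ufw status verbose                                       # confirm the rule and default policies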
I'm not quite sure I agree with your criticism of .iso mounting in Linux. The mount -o loop command is very easy to use after you've done it a couple of times. In fact, I think it is far better than using Daemon Tools in Windows, because you don't have to worry about unticking all the junkware it tries to get you to install.
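For anyone who hasn't tried the loop-mount route, a minimal sketch; the .iso filename and mount point are placeholders:

sudo mkdir -p /mnt/iso
sudo mount -o loop ~/example.iso /mnt/iso   # attach the image via a loop device
ls /mnt/iso                                 # browse the contents
sudo umount /mnt/iso                        # detach when finished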
Also, I'm not sure I agree with your apparent dislike for some forms of the CLI. The CLI is far more powerful than the GUI-based copies that try to replicate it. As a matter of fact, the more I learn about the Linux CLI, the less I use the GUI. I find myself only using the GUI for web browsing on a regular basis.
However, when looking at the linux GUI, compiz fusion is simply amazing. When I have a shitload of stuff open, compiz allows me to organize all of my windows and access them very efficiently. In fact, when I use windows for games, I feel handicapped.
The most interesting part of your testing was that Windows applications running under Wine outperformed native Linux applications. I look forward to hearing more about that aspect, as you mentioned.
Let me say this: I am a Senior Software QA Engineer, and I have been testing Windows, Windows apps, databases, and web sites for over 10 years now. I am what you could consider a Windows guru of sorts.
I have, off and on, tried Linux, from Red Hat 5 and 6 to Ubuntu, SUSE, Fedora, etc. Linux is not, and has not been, ready for mainstream users. Sure, for simple email, word documents, and web browsing it is OK.
But for many of the things that I and many advanced Windows users want to do, the author and many commenters are right: Linux people need to get out of their little shell and wake up.
Linux has such great potential to be a true contender to Windows and OS X. But it lacks simple usability. Out of the box it comes nowhere close to the MS or Apple offerings; the out-of-the-box experience is truly horrible.
Hardware drivers? Good luck. I run RAID cards that have no support. Forget the newest graphics and sound cards. Connecting to shares is, just as the author mentioned, a hassle of a workaround.
Again, as stated elsewhere, Linux needs someone who programs and/or scripts to get things done right. I have neither the time nor the patience for that. I use the command line when needed, but I would rather be done in 2 or 3 clicks than have to remember a CLI incantation for everything I need to do.
Time is money, not a commodity to waste, and Linux wastes too much of it.
It is getting better with each distro, true. But it has been 11 years since Red Hat 5, and Linux is not a whole lot better than it was then.
What is needed, if Linux really wants to make a stand in the desktop space, is a unified pulling together of all the distros: sit down and truly plan out the desktop, and put together a solid platform that out of the box can really put the hurt on MS or Apple.
Look what Apple did with OS X! And how many developers are working on it? How many developers are working on Linux across all distros? OS X is a jewel; in 7 years it has matured much farther than any *nix distro, and has a following that cannot yet be challenged by any distro available.
Why is it that when Win2K came out Linux was claiming to be superior, and yet after 10 years of development it is hardly comparable to XP, let alone Vista/Win 7 or OS X?
You guys really need to wake up and smell the coffee!
Of course it's not ready for consumer desktops, there are no serious distributions for that.
It means no DVD playback out of the box, no proprietary codecs, no video editing software, no proprietary drivers that magically work. Nor are SLED and RHEL Desktop ready for normal users; they are targeted at Linux admins who set up the environment, and community distributions won't be as easy for those users to set up. Community distros will also always lack the above-mentioned stuff; it's simply not legal for them to offer it out of the box. OS X is actually older than Linux and ran on x86 before Apple bought Jobs' NeXT company. It is also supported by an OEM (the OEM being Apple itself), which no Linux distro is. And it uses many GNU technologies: GCC, X11 (optional, but included on the disc), the bash shell, and of course Samba for SMB/CIFS; on the server edition they use a modified OpenLDAP server, Dovecot and Postfix for mail, Apache, PHP, Perl, MySQL, etc. That is all software developed on Linux that has matured thanks to it.
There are a lot of problems with having only community-supported stuff, but that doesn't mean it's useless or that it sucks. Sure, the kernel developers aren't really helping get drivers in by locking out closed-source code, but proprietary drivers end up useless anyway once they stop being updated. For servers, just buy RHEL- or SLES-certified hardware and you get all the hardware support you need. On the other hand, you wouldn't be very successful running 7-year-old video drivers in Windows either.
Community distros definitely don't need to cease existing for a commercial one to be created. But there will never be one Linux, and that's really the beauty of it, not the curse. It was never meant to rival Windows, and the kernel developers have no desire to create a distro. That's why we see Linux in things like Android and Maemo, and in everything from home routers to mainframes and supercomputers. For a commercial entity, targeting that many devices wouldn't be possible, not with the same basic code and libraries. There are definitely some top-notch products and solutions based on Linux and GNU, but Linux doesn't "want" anything, because it isn't an entity.
It's really up to GNOME and KDE to create the desktop environment; it's not the distros that shape them or that write all the libraries software developers use to build their applications. As there is no major consumer-desktop distro maker, there is also no one who can really steer them by sponsoring work and holding discussions, at least not towards a unified desktop environment for normal, non-technical users. And GNOME and KDE have no desire to create an exclusive platform around their software.
OS X is an innovative 20-year-old OS (counting from its commercial release) and is actually based on even older work (BSD code). The OS X UI is really 20 years in the making and builds heavily on the NeXT/OpenStep framework. Other Unixes had no such heritage to build on; X was a total mess on the commercial Unixes, and I would say it's a lot better and more streamlined now. There's just Xorg today, and sure, there are a lot of window managers, but only two major environments, so it's still better than when every vendor had its own and couldn't make up its mind on which direction to standardize. In the middle of the '90s there were at least four major Unix vendors, each with its own workstations.
Most end users are not comfortable with the command line. Linux, even Ubuntu, is still not ready for the masses. This shouldn't be confused with the quality of the OS; it's mostly a GUI issue. I've also had some issues with installers failing. Some were solved from an xterm and others just didn't work.
It wasn't a big deal in most cases, because there's generally another program that can get the job done, but for the typical home user, it's a deal killer. Nevertheless, I must give credit where credit is due, and Ubuntu has made huge strides in the right direction. The UI isn't close to Windows 7 and I suspect it's not close to OS X either, but Canonical is moving in the right direction.
See, this is the problem with some Linux users: you are somewhat closed up in your own shell. What you think is easy does not mean the rest of the world will agree with you. In this day and age, people want to get things done quickly and spend as little time as possible on them. On Mac OS X and Windows, getting a simple task done takes about 3 simple clicks; on Ubuntu, performing the same task can require the user to do an extensive amount of research just to complete it.
I'm glad this article was written by an author who had not headed into Linux territory before; it shows the true side of Linux from the perspective of a new user.
Just because you like to do ramen coding and so forth does not mean that others will. If Linux wants to become mainstream, then it really needs to stand in the shoes of Joe or Jane.
I use mac, windows and linux and I must disagree with your assessment of "this is the problem with some linux users"
This article, and this site for that matter, comes from the perspective of a windows (and some mac) user looking at linux. More specifically Ubuntu. From this point of view of course Linux is difficult. A person who is linux focused thinks windows is difficult at first too and is likely to criticize. If you take the time to learn something instead of just criticizing something because it is different you may be a lot happier.
Again - those are done by Linux people. His points are right on. . .someone a while ago did a "Mom" test, which is closer to what is needed, not people who know computers doing studies on usability.
So I tried to find some links about this relating to GNOME, but only got some pretty old ones. There are other methods they are using as well, like the One Hundred Paper Cuts idea. Honestly, have a look around and you'll see how much of a focus it is, particularly with Ubuntu.
Face it, Linux is still back in the Windows 2000 days. Try getting SLI working, 1080p working right, games working. It is WAY too much trouble and damn near impossible for regular users. In Windows or on a Mac it's next to no work and very little issue. Wake up, guys: Linux has potential, but that's it, BECAUSE those who advocate it spend so much energy defending what is "easy" to them when they ought to use that energy making it ACTUALLY easy and USER USER USER USER (do you understand this word?) FRIENDLY... NOT PROGRAMMER FRIENDLY...
All of the things you mention are probably not that easy for grandma to do either. People thrive on saying it's so hard to do things in Linux, but I think it's generally not intuitive to use most computer systems. Imagine if you had no exposure to computers how difficult any system would be. A few years ago a friend of mine wanted me to install some software on her Mac. I had no idea how to do it. I've been using computers since I was 5 years old, but had to google for information on installing software.
I actually think that yum/apt repos make it significantly easier to install software. The other day I wanted an application to take a photo with my webcam. I simply ran "yum search webcam", looked at the descriptions of the included software, and found Cheese, which did exactly what I wanted.
When you know exactly what you want and it's not available in the repos you use, I agree it is more difficult to get it installed. Still, with both Red Hat/Fedora and Debian/Ubuntu, you can do an install by downloading a package file. This doesn't get you the benefit of automatic updates, but it's just as easy to install as an MSI file.
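As a rough illustration of both routes; 'cheese' is the package mentioned above, and 'some-package.deb' is a placeholder for a manually downloaded file:

yum search webcam                 # Fedora/Red Hat: search the repositories
sudo yum install cheese           # install the match you picked
apt-cache search webcam           # the Debian/Ubuntu equivalent search
sudo apt-get install cheese
sudo dpkg -i some-package.deb     # install a downloaded .deb (no automatic updates)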
Well, maybe they would want 1080p, but I'm not sure how that could be a problem unless you have some strange hardware that requires a specific driver... much like another OS sometimes needs you to go to a manufacturer's website ;)
Installing nVidia drivers and XBMC or mplayer isn't that hard.
But keep in mind there are only homebrew codecs on Linux, which OEMs like Dell can never ship with their computers, and which have limited support for proprietary formats such as Blu-ray. They're the same codecs as ffdshow, or as used in XBMC or VLC on Windows; what's lacking is a PowerDVD with BD support. w32codecs is also available for GStreamer, giving alternative support for WMV and the like. Installing ubuntu-restricted-extras is essentially the only thing you need for playback to work in Totem if you don't need/have VDPAU support (see the example below). XBMC is definitely a decent platform for playing back warez, though you need to rip Blu-rays to be able to play them back at all. An Ion is definitely powerful enough for 1080p H.264 under Linux. But because all that software contains unlicensed patented codecs, Canonical doesn't officially support any of it, so it won't work on Ubuntu out of the box. Codecs aren't free.
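For reference, pulling those codecs onto Ubuntu yourself is one command, assuming the multiverse repository is enabled:

sudo apt-get install ubuntu-restricted-extras   # Flash, common media codecs, MS fonts, etc.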
Great review. Thank you for reviewing 8.04 LTS. Please review 10.04 when it comes out; I am interested to see whether the Software Center has changed the author's opinions.
zerobug - Monday, February 1, 2010 - link
Regarding benchmarks and Linux-focused hardware roundups, one thing worth of consideration is that while Microsoft places strong resources on O/S development to create features that will require the end users the need to get the latest and greatest powerful hardware, Linux places their efforts in order that the end user will still be able to use their old hardware and get the best user experience while running the latest and greatest software.So,the benchmarks could compare the user experience when running popular software on Microsoft and Linux O/S's, with different powerful machines.
For this, you could pick up some popular open source and proprietary (or their free equivalents) application that can run both Linux and W7. and compare the price, time and power consumption for retrieving, saving, processing, compiling, encrypting,decrypting compacting, extracting, encoding, decoding, backup, restore, nº of frames,etc, with machines in a range of different CPU and memory capacities.
MarcusAsleep - Thursday, December 17, 2009 - link
Quick Startup: OK, Windows is fast - at first, well let's say that is if you install it yourself without all the bloatware that come standard on Windows store-bought PS's (we bought a Toshiba laptop for work with Vista that took 12 minutes after boot-up for it to respond to a click on the start menu - even on the third time booting.)Windows startup is often burdened by auto-updates from Microsoft, anti-virus, Sun-Java, Acrobat Reader, etc. etc. that slow down the computer on boot-up to where your original idea of "hey I just want to start my computer and check my email for a minute before work" can take at least 5. I can do this reliably on Linux in 1. Yes, if you know a lot about Windows, you can stop all the auto-updates and maintain them yourself but 99% of Windows users don't have time/or know how to do this.
Trouble-free: E.G. I installed Linux on a computer for my wife's parents (Mepis Linux) 7 years ago for email, pictures, games, web, letter use and haven't had to touch it since then. This is typical.
For Windows, often I have done fresh installs on trojan/virus infected computers - installed working antivirus and all Windows Updates (not to mention this process takes about 2-4 hours of updates upon updates + downloads of the proper drives from the manufacturers websites vs about 1 hour for an Ubuntu install with all updates done including any extra work for codecs and graphic drivers) - only to have to come back a couple months later to a slow again computer from users installing adware, infected games, etc.
Free: Nearly every Windows reinstall I've had to do starts with a computer loaded with MS Office, games, etc. but upon reinstall nobody has the disks for these. There is a lot of "sharing" of computer programs in the Windows world that is not very honest
With Linux, you can have the operating system plus pretty much anything else you would need, without having to cheat.
Adaptable Performance: You can get a well-performing Linux installation (LXDE Desktop) on a PIII computer with 256MB of ram. The only thing that will seem slow to an average mom/pop user would be surfing on flash loaded web pages, but with adblock on Firefox, it's not too bad. With Vista loaded on this computer, it would be annoyingly slow. You can often continue to use/re-use computer hardware with Linux for years after it would be unsuitable for Windows.
I think these features are of high value to the average user -- maybe not the average Anandtech computer user -- but the average surf/email/do homework/look at photos/play solitaire/balance my checkbook user.
Cheers!
Mark.
SwedishPenguin - Wednesday, October 28, 2009 - link
Using SMB for network performance testing is extremely biased: it's a proprietary Microsoft protocol, so of course Microsoft is going to win that one. Use NFS, HTTP, FTP, SSH, or some other open protocol for network performance benchmarking. A lot of NASes do support these, as they are Linux-based. Furthermore, using a Windows server for the SMB tests with the argument that most consumer NASes use SMB is pretty ridiculous; those NASes are most likely going to use Samba, not native SMB, the same Samba that is implemented in GNU/Linux distributions and Mac OS X, not to mention that most of the NASes I've seen offer at least one of these other protocols as an alternative.
SwedishPenguin - Wednesday, October 28, 2009 - link
The ISO thing is pretty ridiculous; creating a simple GUI in both GTK and Qt and integrating them into GNOME and KDE should be pretty damn easy, though I suppose integration with the respective virtual file systems would be in order, in which case it might get slightly more complex for those (like me) not familiar with the code. There's even a FUSE (userspace filesystem) module now, so you wouldn't even need to be root to mount an image. About the command-line support: IMO that's a good thing. It's a lot easier, both for the person helping and for the person needing help, to write or copy-paste a few commands than it is to tell someone to click that button, then that one, then another one, etc. It's also a lot easier for the person needing help to simply paste the result if it didn't work, and it makes it much easier to diagnose the problem than if the user tried to describe the output. And you usually get much more useful information from the command-line utilities than you do from GUIs; the GUI simplifies the information so anyone can understand it, but at the price of making debugging a lot more difficult.
nillbug - Wednesday, September 30, 2009 - link
It must be said that Ubuntu and the major Linux distributions have all had 64-bit O/S versions for a long time. The reason is to let users benefit from more memory (beyond 4GB) and from 64-bit CPUs (almost all of them today), gaining a better computing experience. If this article was a private work by the author to help him answer whether he may or may not move to Linux, people should simply advise him of the above. But for an article intended to be read by thousands, it must be pointed out that its conclusion is misleading.
In the face of today's reality (and not the author's reality), why did he never mention the 64-bit Ubuntu builds? I suspect his final thoughts would then have been much more in favor of Linux.
seanlee - Tuesday, September 15, 2009 - link
I have read all 17 pages of comments... a lot of Linux lovers out there... and they all purposely ignore a few important things that make Windows successful, and that in turn make most Linux distributions marketing failures. I have used Linux on my netbook and my PS3, and I absolutely hate it.
1. User-friendliness. No, the CLI is not user friendly, no matter what you say, no matter what excuse you use, no matter how blind you are. NOT ONE COMPANY dares to make its mainstream products CLI-only, from things as simple as an ATM or an iPod to things as complicated as cell phones, cars, and airplanes. That ought to tell you something: the CLI is not intuitive, not intuitive = sucks, so CLI = sucks. You command-line fanboys are more than welcome to program punched cards, except no one uses punched cards and machine language anymore because they are counter-intuitive. Having to use the CLI is a pain for the average user, and having to use the CLI every time you install a new program or driver is a nightmare. A GUI is a big selling point, and a seamless computer-human experience is what every software company is trying to achieve.
2. There is NOTHING Linux can do that Windows cannot. On the contrary, there are a lot of things Windows can do that Linux cannot. I'd like to challenge any Linux user to find engineering software alternatives on Linux, like Matlab, Simulink, Xilinx, OrCAD, LabVIEW, CAD... you cannot. For people who actually use their computer for productive work (not saying typing documents isn't productive, but you can type a document on a typewriter with no CPU required whatsoever), there is nothing, again, I repeat, NOTHING that Linux can offer me.
3. Security issues. I disagree about the security issues Windows supposedly has. I can set up a Vista machine, turn it on, lock it in a cage, and it will be as secure as any Linux machine out there. Hell, if I bought a rock, pretended it was a computer, and stared at it all day, it would be the most secure system known to mankind. Linux's security is largely due to one of two reasons: 1. not popular, so not enough software to support it and to play with; 2. not popular because it's not user-friendly. Neither is a good title to have. It is like being free from a tax increase not because you have your business set up to write off all your expenses, but because you don't make any money and thus don't have to pay tax at all.
4. There is nothing revolutionary about Linux for an average user, other than it being free. If free is your biggest selling point, you are in serious trouble. Most people, if not all, would rather pay for a quality product than take free stuff, unless the free one is just as good. Obviously Ubuntu is never going to be as good as Windows, because they don't have the money that MS has. So what does Ubuntu have that would really make me want to switch and take a few weeks of classes to understand those commands?
Be honest, people. If you could only have ONE O/S to use, most of you would choose Windows.
kensolar - Monday, October 26, 2009 - link
I hope you realize that your hatred is showing so strongly that absolutely no one cares what you say.
That said, I don't know how to use a CLI and have been successfully using Linux for 3 years. I found the article to be a fairly fair one, even though the author is so unfamiliar with Linux/Ubuntu. Just as he does not use only the default apps in Windows, Linux users don't use only the defaults in Linux: K3b is far superior to Brasero, and so on. In addition, I don't think he conveyed very well the extent of the software available in the repositories (with additional repositories easy to add): several hundred apps and some 20,000 programs, even security tools, ranging from easy as pie to complicated (for those of us who have a computer that is more than a rock). I personally do audio mixing, video transcoding, and advanced printing... all with graphical interfaces.
BTW, I learned how to turn on a computer for the first time 3 1/2 years ago; I stopped using Windows a little over 3 years ago and have no reason to go back. I find it too hard, limiting, and frustrating to use. Plus, I can't live without multiple desktops; the author didn't get it yet, but once you get used to them you can't go back.
Well, I've said enough for now, can't wait for your next article.
amrs - Saturday, September 26, 2009 - link
Your ignorance and stupidity are showing here. No engineering software for Linux? Hello? Matlab is available, Simulink is available, LabVIEW the same. Xilinx and Altera have supported Linux for a long time, and so do the smaller FPGA houses like Lattice and Actel. Mentor Graphics too. OrCAD is the only one you mentioned that isn't available on Linux, but Cadence does support Linux with their Allegro product, and so does Mentor Graphics with PADS, Board Station, and Expedition.
MadIgor - Thursday, September 24, 2009 - link
I have to disagree. You are NOT talking about the average Joe/Jane. I think that even the article's author is somewhat biased towards the enthusiast user. Ubuntu actually covers all the needs of the average Joe/Jane: you can browse the web, do email and scheduling, play games (easy, non-enthusiast games), download pictures from your camera and edit them, even play back MP3s/CDs and video and do basic office work, all out of the box. The GNOME learning curve for PC beginners is much shorter than with Windows. Most average Joes/Janes don't install apps or peripherals by themselves; believe me, I have had to do it for them many, many times on Windows systems (the best is "installing" a digital camera: plug one end of the wire into the camera, the other into the PC). Yes, I agree that installing Ubuntu so that ALL of it runs right may be a pain, but the average Joe/Jane never installs their own system anyway (not Windows, nor Mac OS), so once they get a PC with Ubuntu preinstalled, you are done. With Windows you have to expect that they will "bother" you every few months with a broken system. Yes, that can be a nice source of income for a PC technician, but constant breakage is not the kind of reliability advertising that brings customers back.
I did some installations of Ubuntu for my customers, mostly as a "safe" web/mail PC. They were all used to the Windows platform already, but after one week of using Ubuntu even the harshest critics were comfortable with it (some even asked me to install it on their home PCs). The biggest "problem" was that no one could read our "Excel" files, so I showed them that files have to be saved with the .xls extension, and voila, no more problems. I was NEVER asked for any CAD system, nor MATLAB, not even graphics apps; everything they used in the office was already there. Then there are home users; the only complaint was that they had Windows at work, but after a few hours all was fine. Only the kids had a problem: they couldn't play enthusiast games on it. My wife has been running Ubuntu for three years now with no problem.
When my 62-year-old mother asked me for a computer, I brought her a notebook with Ubuntu; I had no time to explain anything and said I'd come back the next morning. My mom had never used a computer before (OK, shooting ships on my ATARI doesn't count). The next morning I came over and she was already browsing. I asked her how she did it, and she said it was easy: tap Applications, then Internet, and one of the apps was the internet. She had even installed the snake game. I asked how she did that, and she said that in the Applications section there is "install new application"; she clicked on games, picked the one she thought would suit her, and installed it. "Was that wrong?" she asked. I said no, that's exactly right.
BTW, none of them know that they can use a CLI or that there is a terminal window somewhere in Ubuntu. They are average Joes/Janes.
Not everyone is an enthusiast with a PC full of stuff that, be honest, you don't use on a daily basis.
The truth is that Ubuntu will not be a successful system for enthusiasts or high-level professionals until the big software houses (Adobe, hello!) and game producers start to port software to Linux. But that is not the fault of Ubuntu or Linux, and again, we are not talking here about the majority of users (I mean the Joes/Janes).
fazer150 - Friday, September 4, 2009 - link
To all the folks who think Linux is hard: have you tried PCLinuxOS? It is easier to install and use than Windows XP, 2003, and Vista, period. There is no Windows hatred here, but you have to try it before you complain.
I have access to every Windows OS at work, including the latest Win 7 RC, but I find PCLinuxOS easy to set up and use. It needs no special admin skills; every config option is GUI-driven.
Linux has come a long way from where it was 5 years ago!
Cynicist - Sunday, September 6, 2009 - link
There are two things that bothered me about this article that I'd like to comment on. Firstly, most regular users do not use the LTS release; its software is just too old, and the latest releases of Ubuntu are quite stable. LTS is mostly about guaranteed stability for corporate environments. Second, this package-manager hatred is based on the flawed idea that no packages exist outside of the official repositories. A simple Google search for deb packages leads to GetDeb.net, a website dedicated to providing up-to-date packages of all kinds of software specifically for Ubuntu. Google search too hard, you say? It's even less difficult than that, because many project sites (such as Wine, featured in this article) offer multiple packages for various distributions and even multiple PACKAGE TYPES.
Overall not a bad article. The author definitely knows technology and I'm grateful for that, but he did not seem to do much research on the actual community itself or the Linux Way of doing things. These are minor issues which will resolve themselves with time and I'm looking forward to seeing more linux articles on this site in the future.
cliffa3 - Thursday, September 3, 2009 - link
I was concerned as well about the constant releases... until I upgraded the first time. I had set aside the better part of an evening because I was *sure* there were going to be plenty of headaches. I've done three such version upgrades now and am happy (not to mention shocked) to report that it's literally a one-click upgrade. Simply amazing. I'm sure something will get mucked up in a future version upgrade for me, but for now everything has gone amazingly smoothly. That being the case, I have to disagree with you on the "they release too often" point. I understand it's a pain to sift through all the search results on the forums, but I have also found some older threads (sometimes three versions back) where the same fixes work for my issue. I agree they need to tag posts with version info; that would make it far easier. Also, there's far more useful information in the (versionally diluted) forums than I've found for any other piece of software or OS I've used. I almost don't cringe when I have a problem or issue now, because I'm quite confident I can find the information without too much digging.
I'd encourage you to upgrade versions from your current install (don't wipe) and comment on how the process goes. Maybe I've just had an extremely easy (and lucky) go of things with no problems...it'll be interesting to read your experiences. Honestly with how easy my upgrades have been, I look forward to new releases (but still give them a few weeks before upgrading...just to see the comments from other users).
Mem - Wednesday, September 2, 2009 - link
Very good read, as usual. Personally I'd like to see Kubuntu reviewed at some point (I hear Kubuntu 9.10 is due in October); as you know, it's the KDE version. A GNOME vs. KDE comparison would also be interesting. I think the main problem for new Linux users is deciding which distro to go with; sure, they are all free, but it can be confusing and time-consuming to try them all, and some are more noob-friendly than others, like Ubuntu/Mint.
lishi - Wednesday, September 2, 2009 - link
Since you spent so much time dealing with the windowing side of things, it's worth pointing out that Compiz is actually much more powerful than what you described. Install the package simple-ccsm for more options (like a different application switcher, different window animations, etc.).
Or install ccsm for the complete configuration tool. Although most of its options are eye candy, there are some that can improve the desktop experience.
sethk - Tuesday, September 1, 2009 - link
In this sentence:"It’s undoubtedly a smart choice, because if Ubuntu wiped out Windows like Windows does Ubuntu, it would be neigh impossible to get anyone to try it out since “try out” and “make it so you can’t boot Windows” are mutually incompatible"
The more common phrase is 'nigh on impossible' (as in close to impossible) or you could say it's nigh-impossible. Definitely not neigh. Sorry to point out grammar issues, but this is a pet peeve, right along with pique being spelt peak or peek (as in pique my interest).
v8envy - Tuesday, September 1, 2009 - link
I've been a 100% Linux desktop (Ubuntu 9.04) user at home ever since I bought my last i7 920: gaming, multimedia, web, everything a typical desktop user does under Windows. The inconvenience of migrating an existing Windows install and re-activating it was outweighed by the convenience of Linux, which simply booted and worked on the new hardware. Yes, there are times when you must fire up Google and search for solutions, some of which are commands to be pasted into a terminal window. Yes, sometimes you need to upgrade software packages (Wine is horribly out of date, for instance).
On the other hand, with Windows you get approximately 1,337 updaters which run on startup, virus checkers, malware checkers, browser-parasite checkers, firewalls, DRM, and miscellaneous layers of barnacles which accumulate the longer you use the system. Thankfully the gathering of cruft is not yet a bane on the typical Linux system.
Try 9.04 and see if it is more to your liking. LTS means nothing when most open source problems are "supported" by simply upgrading to the latest software.
trexpesto - Monday, August 31, 2009 - link
"linux" is "niche" spelled inside out and backwards..in rot13.
brennans - Sunday, August 30, 2009 - link
I use both XP64 and Hardy (Ubuntu 8.04).
I am also a power user.
Both these operating systems have pros and cons.
Cons for XP64:
1. It does not recognize my hardware properly.
2. Finding 64 bit drivers was/is a mission.
Cons for Hardy:
1. It does not plug and play with my hardware (i have to compile the drivers).
2. Not as user friendly as windows.
Pros for XP64:
1. Windowing system is super fast.
2. User friendly.
Pros for Hardy:
1. Recognizes my hardware.
2. Command line tools are awesome.
Conclusion:
I think that the article was good.
I am one of those people who has always had problems installing windows straight out of the box and thus find that paying a large amount of money for their buggy OS is unacceptable.
I can get a lot of stuff done with Hardy and it is free and if I find a problem with it I can potentially fix that problem.
I also find it unacceptable that manufacturers do not write software (drivers or application software for their devices) for Linux.
For me, it is difficult to live without both XP64 and Hardy.
ciukacz - Sunday, August 30, 2009 - link
http://www.iozone.org/
JJWV - Sunday, August 30, 2009 - link
How can people use something like Aero and its Linux or OS X equivalents (which pre-date it, if I am not mistaken)? The noise just hides the information. Transparency is one issue; another is those icons that are more like pictures: you lose the instant recognition. With Aero, knowing which window is active is not obvious; you have to look at small details, and the title of the window is surrounded by mist, making it more difficult to read. Even with XP, the colour gradient in a title bar is just noise: there is no information conveyed by it. The OS GUIs look more and more like those weird media players, with an image of a rotary knob that has to be manipulated like a slider.
The evolution of all applications towards a web interface reminds me of the prehistory of personal computers: each program has its own interface.
The MS Office Ribbon UI is in the same vein: more than 20 icons on each tab. An icon interface is based on instant recognition and comprehension; when you have that many, it turns into a mnemonics exercise. And of course with MS you don't have a choice: you just have to adapt to the program. An end user is only there to be of service to the programs ;-)
If I want to look at a beautiful image I will do so, but when I want to write a letter or update a database, all those ultra-kitsch visual effects are just annoying.
As a summary the noise is killing the information and thus the usability.
Ronald Pottol - Saturday, August 29, 2009 - link
The thing with Windows has been seen before: back in the Win 3.1/OS/2 days it was found that while one instance of Excel didn't run any faster under OS/2, two instances in separate VMs (OK, not technically the same thing) ran in about the same time as one did on Windows. I like the package management, and I hate when I have to install something that doesn't support it, because then I have to worry about updates all by myself. If a project provides a repository, I get its updates every time I check for Ubuntu updates, which is very handy. It's nice to get the nightly Google Chrome builds, for instance (still alpha/very beta).
Frankly, supporting binary kernel drivers would be insane. The kernel developers would be stuck supporting code they cannot look at and cannot fix; they couldn't correct their own mistakes (or would be stuck emulating them forever). If they supported binary drivers, there would be even more of them, and when the developers wanted to fix something broken or badly designed, they would have to wait a reasonable amount of time before doing it so the drivers stayed supported. Frankly, I don't see why vendors don't have automated frameworks for this, with automated deb/rpm repository generation: I add the vendor's repository, and when I get a kernel update, perhaps it's held up a day while their system automatically builds a new version, but then it all just installs. Instead, I am stuck with having to run a very old kernel, or not having 3D on my laptop, for instance.
cesarc - Sunday, August 30, 2009 - link
I found this article very interesting, because it is oriented towards Windows users and is helpful to them, because you didn't just give up while trying it. But you can't blame Ubuntu (or any distro) for how painful a video card driver can be to install; blame ATI and NVIDIA for being lazy. And if playing games through Wine is not as good as playing on Windows, blame the game companies for not releasing GNU/Linux versions.
Also, the reason GNU/Linux surpasses Windows in file management is that NTFS is a BAD file system; maybe if Windows could somehow run on ext3 it would be even better than it is.
And why the reluctance to use a console (stop saying CLI, please)? You are not opening your mind if you try to use GNU/Linux as if it were Windows; it is not Windows, it is a completely different OS. Look at it from this point of view: something that takes 5 clicks in Windows can sometimes be done in GNU/Linux with just one line of bash. So sometimes you will use the GUI and other times the console, and you will find that having both options is very comfortable. So start using the console and write the same article a year later.
I hope that some day there is a paid version of GNU/Linux (still open source) that could pay salaries to programmers to fix specific issues in the OS.
On the other hand, when you do the IT benchmarks it is very disappointing that you don't run Linux on those beautiful Xeons. The server environment is where GNU/Linux is strongest. And Xeons with Windows are just toys compared with Unix on SPARC or POWER architectures.
PS: try to get 450 days of uptime in a windows 2003.
rkerns - Saturday, August 29, 2009 - link
Ryan,
Thanks for your good work.
Many people considering linux are still on dial-up. These are often folks with lesser expertise who just want to get connected and use their computer in basic ways. But getting connected with dial-up is something of an adventure with many distros and/or versions. Ubuntu 9.04 has moved away from easy dial-up, but Mint7KDE includes KPPP for simple dial-up connection. Mint7KDE has other nice features as well.
I am asking you to expand your current picture of the landscape to include people who want to use linux with a dial-up connection. This of course would have to include a brief discussion of 1) appropriate modems and 2) distro differences. Thanks,
r kerns
William Gaatjes - Saturday, August 29, 2009 - link
FREE! *pant pant pant pant*
lgude - Saturday, August 29, 2009 - link
Really glad to find this in-depth article after all this time. Thank you, Ryan. I too have run Ubuntu as my main OS even though most of my experience is in Windows, and I have had similar experiences. Because this was a very long article, it got into detail about things like the package manager and the multiple desktops that I have not seen discussed elsewhere from a user perspective. As someone else pointed out, it is moot what people would like or complain about if they were moving from Linux to Windows or OS X, but imagine for a moment if they were used to getting the OS and all their apps updated in one hit and were asked to do it one app at a time, and expected to pay for the privilege! If you go on with the Linux series, I'd like to see discussion of the upcoming Ubuntu and other distros; I've been impressed with SUSE. I'd also like to see projects on how to build a Linux server and HTPC, including choice of distro and the kind of hardware needed. I'm less sure of where benchmarking is really useful; the tradition of detailed benchmarking at AT arose from the interest in overclocking and gaming, which I think is a much lesser consideration in Linux. More relevant might be comparisons of netbook-specific distros, or how to work out whether that old P4 will do as a home server. There is a lot of buzz in the tech world about things like Symbian, Chrome OS, Moblin, and Maemo on portable devices that could possibly draw new readers to the Linux tab at AT. A great start in any case.
jmvaughn - Saturday, August 29, 2009 - link
I just wanted to say thank you to the author for a very thorough article. After reading it, I decided to use Ubuntu for a PC I'm building out of spare parts for a retired friend who's on a fixed income. My friend just uses the web, e-mail, and some word processing, so this will be perfect. The article gave me a good idea of what to expect: a good, honest appraisal with all the good and the bad. After installing Ubuntu 9.04, I am very impressed. The install was very quick, and easier than XP. Everything is quite snappy, even though it's running on an AMD 3800+ single-core processor and an old hard drive.
xchrissypoox - Saturday, August 29, 2009 - link
I only skimmed the article (I saw the part about gaming being poor), but I'd like to see a comparison of several games using the same hardware on Windows and Linux (results given in fps). If this has already been mentioned, sorry, and good day.
tabuvudu - Saturday, August 29, 2009 - link
The article is a good job overall. Below are a few of my own observations concerning the article and Linux in general.
###Package manager
I never used Ubuntu, so cannot comment on apt. I tried SuSE, mandriva and eventually ended up with Gentoo. I think package managers are the best part of linux. Gentoo's portage and 'emerge --sync' allows you to be always up-to-date in terms of software, no need to ever reinstall the system. It is CLI-based and sometimes you do have to play a bit with USE-flags and compatibility, but in general 99.99% of my software needs can be satisfied by portage.
###Command Line Interface
For my home needs I switched from windows to linux about 4 years ago. At work I still use windows, but that is due to corporate policies rather than preference. Originally CLI was something terrifying. It took me some time to learn and adapt, but now I do most of system tasks in CLI.
###Video drivers
It does take some effort when doing something non-standard (a 1080p projector connected over HDMI through an AV receiver). But in general, in Gentoo it is usually as easy as typing 'emerge nvidia-drivers' or 'emerge ati-drivers' (see the sketch below), even for dual-monitor setups, of which I have two. Don't forget the open-source Xorg drivers, which are usually fine for a simple desktop and come pre-bundled.
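For the curious, a rough sketch of what that looks like (package atoms as they were at the time; check the portage tree for your own setup):

sudo emerge --ask x11-drivers/nvidia-drivers   # proprietary NVIDIA driver
sudo emerge --ask x11-drivers/ati-drivers      # proprietary ATI/AMD (fglrx) driver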
###Gaming...
...is much harder under linux. In my case not relevant, as I spent less than 1% of my time to that over the past 2 years. Others may find this a real obstacle for migration.
###Syncing...
...your devices is also a headache. I once even succeeded in syncing my Windows Mobile 5 device with Evolution, but the amount of effort it took was far too great. I never even tried syncing my Nokia phones after that.
###Usage scenarios
I think you judge Ubuntu (and Linux in general) by very Windows-centric usage patterns. Ubuntu is unlikely to out-Windows the original, although OS X seems to have done just that. Even the approach is Windows-centric: you take a Windows app and compare Linux against it. The point is that Linux in general lets the user open up and develop completely different usage scenarios, which are beyond Windows. Allow me to elaborate on the basis of my own experience.
###In search of killer app...
You note in the article that there is no linux killer app. I disagree.
1. I run a Linux home server. It has a proxy (Squid) with an attached antivirus filter (ClamAV + SquidClamav). It has a software RAID array (mdadm). I also run a web server (Apache) with a photo gallery (Gallery2). There is a mail server (Postfix) for a few accounts. File sharing is done via NFS and Samba. Finally, I run a MythTV backend server with 4 tuners. I never tried to replicate this software stack in Windows; it is probably possible, but it would require some pretty expensive licenses, and some things, like the MythTV server, are impossible under Windows to the best of my knowledge.
2. dvdrip + transcode lets me rip DVDs and transcode them simultaneously on 4 client machines with a total of 12 processing cores. Transcoding is usually done in under 15 minutes.
3. I mentioned MythTV. I have a centralised server and a number of clients. TV in the house is delivered over the LAN, i.e. small x86 boxes with output to the TV screens. The integrated MythTV client interface allows watching movies or listening to music from central storage, light browsing, and so on at every TV.
4. The small x86 boxes are network-booted from the server (in.tftpd + NFS). This allows easy management of software, i.e. a single image for all clients. I never tried that under Windows; it's likely possible, but costly.
5. One of the clients is an HP thin client with only 1GB of local storage. I ended up network-booting it anyway, but I initially compiled a full Gentoo system (kernel, X, a fully-fledged window manager (Xfce 4), MythTV client, browser (Firefox), mail client (Claws Mail), media player (MPlayer + GUI)) in under 1GB. If I spent a bit more time, I think I could even fit an office suite in that space. Not possible under Windows.
6. There are other things that I have not even explored, like Asterisk for IP telephony, or projects like OpenGoo, which let you run your own server-based set of office apps.
###The bottomline
I think a fair comparison should not focus only on the things that Windows is known to do best. Getting familiar with Linux will let you find your own killer app, one which can't be replicated in Windows at reasonable cost. But that requires a reasonable amount of time and effort, which is beyond the scope of the article.
tabuvudu - Saturday, August 29, 2009 - link
A couple more things which I forgot to mention in my previous post:
###Mounting network shares
Your troubles seemed quite strange to me; maybe they come down to Ubuntu's implementation, because this is usually something Linux does very well. I do it in /etc/fstab. I agree with some previous posters that this is largely due to a lack of Linux experience, so it should not weigh on the final assessment.
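For illustration, a minimal /etc/fstab entry for an NFS share; the server name, export path, and mount point are placeholders:

fileserver:/export/media  /mnt/media  nfs  defaults  0  0
# then: sudo mkdir -p /mnt/media && sudo mount /mnt/media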
###benchmarks
are quite useless. I don't think you should dedicate much time in the article to them. For the majority of desktop applications, plus or minus 10-15% does not make much difference.
###other interesting projects
Another example is LTSP. Again, there are alternatives on Windows, but they are licensed and pricey.
Kjella - Friday, August 28, 2009 - link
If you have a Windows CD lying around, I highly recommend installing Windows in VirtualBox. For any low-performance application it'll work 100%, though it's not made for gaming; but aside from gaming it should reduce your dual-boot time to near zero. Wine is great for hackers and idealists, but unless the application has a Platinum/Gold rating, new users should not use it. As for the support period, let me put it this way: do you keep XP for 10 years to run 10-year-old installations? As the distro supplies most of your applications, staying on an old release is like being stuck ten years in the past. This is more like getting a free upgrade from Office 97 -> XP (2000) -> 2003 -> 2007 -> 2010 every few years.
Is it perfect? No. But a lot of it is that Canonical can't tell you the easy ways of fixing things. Most of the things on this page should be done the first time you boot a fresh Ubuntu install:
https://help.ubuntu.com/community/RestrictedFormat...
Good luck to everyone that feel like trying :)
yasbane - Friday, August 28, 2009 - link
It's great to see Linux on the front page of AnandTech once again! Linux use on the desktop is getting more attention, and its use is growing (though it is still only a very small fraction of the total market). I'm a huge fan of Linux (as well as Solaris and BSD), and I found Ryan's review very fair and informative. As with any OS, it is important to accept weaknesses as well as strengths, so that it can grow and improve. The strength of Linux has mostly been in the server world, but especially in the last couple of years it has become much more user-friendly on the desktop. These days, I have been able to install it on machines for people with no prior familiarity with Linux and have found, in most cases, no significant issues. There are definitely still things that can be improved (audio issues are the biggest one), but overall it is good news. Some people have commented "Why bother? Windows does everything I need!", but I think competition is a good thing for desktop computing (and just about everywhere else), as the saga of Internet Explorer's decline after the demise of Netscape proved. Mac OS X of course is there, especially at the higher end, but Linux is also beginning to make an impact on generic hardware (netbooks being a good example), which has meant that Microsoft has had to lift its game and drop prices.
To answer Ryan's question, in terms of what I would like to see for Linux on Anandtech: in addition to benchmarks for consumer hardware running Linux, guides such as building a home server or HTPC, where using Linux is an appropriate option, alongside Mac or Windows; for some things, such as gaming there's much less point of course. IT Anandtech used to feature Linux server and virtualisation benchmarks, which made a lot of sense since Linux's greatest strength is in servers, although these benchmarks and reviews have been mysteriously absent lately. And of course, news from the world of Linux, such as new products with Linux on them, new kernel features, popular distro releases such as Ubuntu, and general headlines (e.g., Dell's recent comment that netbook return rates were no higher for Linux than for Windows).
Great to read the comments. Many thanks, Anandtech!
beginner99 - Friday, August 28, 2009 - link
This explains why I'm not using and won't use Linux in the near future for my regular PC. As long as I need Windows anyway (and pay for it), there is, for me, even as an "above average" user, no need for Linux. I think I can do anything I need on Windows, so why should I get Linux in addition to Windows? It makes no sense for me, and not for most other users who are not developers, geeks or idealists. That will only change if you do not need Windows at all, e.g. if you can game on Linux with the same performance as on Windows.
The only place I can imagine using Linux is on an HTPC. Take one of the new ION nettops without an OS and put LinuxMCE or Linux with XBMC on it. But I have no idea how easy that is or how well it works. There I would really benefit from the fact that Linux is free (otherwise the whole price would go up by something like 50% ;)).
I might actually try.
kc77 - Friday, August 28, 2009 - link
As someone who has used Ubuntu primarily for at least 4 years now, I can say there are things that only come with time, namely what applications to use for what. I primarily use Ubuntu because it installs most of the stuff I need right out of the box. However, that doesn't mean it installs EVERYTHING you need, particularly when it comes to audio management. That package manager is your GOD in more ways than one once you realize that, aside from running the latest drivers (which, aside from us, not many other people do, and you aren't gaming on Linux unless you are using Cedega or Crossover), it provides you with just about any program you need. So...
Audio Management - Remember you can mix and match KDE/Gnome. So for this I would use Amarok 1.4. It can sync with iPods and offers the ability to have your music collection residing in a SQL database. That alone had me smiling for days. I have about 70GB of music which it can catalog in about 5 min. Try that in iTunes or WMP. With it being in MySQL, any program I want to have access to my
collection can. In 9.04 you'll need to add the repository as it installs 2.0 of Amarok by default (which isn't as feature rich).
As an aside, Rhythmbox can now rip from inside the program. It might just be that you are using 8.04, but I know it does now. However, Sound Juicer gives you more options for MP3 tags as well as file types.
Video Editing / Mastering
If you want a program like Nero Vision, check out DeVeDe; ManDVD is also pretty good for DVD mastering. Both of those should be in the Package Manager. If they aren't, their web sites offer debs. Just download and double-click.
Burning
If you don't like Brasero, head to the package manager and get K3b. It's about as close to Nero as one can get.
SMB
The problems you've experienced are due to the version you're running. Browsing the root of a server has been fixed. It takes a little longer for the shares to appear than I would like, but it does work now.
Mapping "drives" as it were depends on the route you take. "Connect to Server" I believe creates a mount point under your media folder. I really haven't had much of a problem with this as most programs do recognize bookmarks, and for all of my shares one of the first things I do is set it in my FSTAB which essentially hard links my shares which avoids the problem entirely as every program can see a mount point. Basically there's about 5 different ways to go about this issue just choose the one that works best.
ISOs
One thing with ISOs is that File Roller reads them natively, so you can just extract one and install from there. It's not like hard drive space is scarce nowadays.
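If you'd rather mount an ISO than extract it, a minimal sketch (the file path and mount point are just examples):

    sudo mkdir -p /mnt/iso
    sudo mount -o loop ~/Downloads/image.iso /mnt/iso   # mount the image read-only via the loop device
    # ... use the contents, then unmount when finished:
    sudo umount /mnt/iso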
Customization
My closing thought is to remember that Ubuntu is not Gnome. Ubuntu just creates the packages for its distribution. Everything in Linux is customizable. Don't like the bar at the bottom? Delete it. Want a tray like OS X? There are tons to choose from. I've done some OS modding and I can make Gnome look like Windows XP or Mac OS X (and mimic its functionality). You aren't limited here.
That's the thing with Linux: the more time you spend (and really I'm talking about days here, not months), the more programs you'll come across that will do what you want. Don't like Transmission? Use KTorrent. Hell, even uTorrent works very well under Wine.
Overall I liked your review.... even though I've had to wait for it...
dfonseca - Thursday, August 27, 2009 - link
Great article, congrats. Looking forward to part 2. My only criticism is that for both the intermediate (topic-specific) verdicts and the final (first?) thoughts, the non-techie average-joe end user point of view is given way too much weight. Not only do these people not read AT, but they also don't use objective judgement to decide their OSs. These are the people who will go with the flow, which will be dictated by other people. In short, the article focuses on helping joe6pack, but joe6pack doesn't care.
I believe it would be more productive, when passing comparative judgement, to narrow the focus on aspects more strongly connected to wide adoption - deployment (installation & out-of-the-box functionality), large-scale maintenance ease, support, productivity (office) software, and why not, bang for buck. Some of them are already being considered, and maybe they just need to be given higher relevance.
Cheers
dontruman - Thursday, August 27, 2009 - link
I have used Windows since the 3.0 days and am currently running RC7 on my home desktop. I think it is significant that the hardware resources required by Ubuntu, and Linux in general, are much lower than Windows and can give new life to an old laptop or desktop. Ubuntu recognized and set up all the hardware without a hitch on my Dell Inspiron E1505 and my newer office Q9550 quad. A reasonably older system running Linux can perform standard tasks at speeds close or equal to newer hardware running any of the Windows operating systems. The boot time for Ubuntu is also significantly shorter. It's no coincidence that the ASUS PC1000 netbook, with its solid state hard drive, ships with Linux. Mine came with a version of Mandriva Linux that is obviously designed for newbies and doesn't provide nearly as large a set of software repositories as Ubuntu. I replaced it with an Eee-specific version of Ubuntu and have been very pleased with the results. It quickly performs daily tasks such as office work via OpenOffice 2.5, browsing (I use Opera) and email (Thunderbird). The difference in observable speed between my Eee and the i7 system I recently built is negligible. Of course I'm not talking about CPU-intensive tasks, including Google Earth, where the i7 is orders of magnitude faster. But for everyday mobile tasks I use the Eee. (The Inspiron is way too hot.)
I have to disagree about the Synaptic Package Manager. You can Google 3rd-party Linux repositories, such as Medibuntu, that contain multimedia codecs and other interesting Linux software. (Be careful & research 3rd-party repositories to avoid unpleasant surprises.) You can add these 3rd-party repositories to your permanent list of software sources so that when updates to that software are created, Ubuntu notifies you and allows you to accept the updates you want. Using a USB-powered external DVD drive I can watch movies on my Eee, or rip & compress them on the i7 system and put them on a flash drive or card. Once you get the compression formats tuned you have a good, lightweight computer/multimedia system with excellent battery life; good for two movies. All this on a two-year-old Intel Atom-powered netbook.
I've played around with Linux for some time, but I found it too alien & command-line driven, using a command set that has very little in common with MS or DR DOS. But over the last several years Linux has been improving at an accelerating rate. All that's needed now is a high-quality Wine (Windows compatibility layer) that will allow you to run all your Windows-specific software on Linux without a performance loss. It's a difficult undertaking, but Wine has been steadily improving and there are now commercial Linux distros that guarantee compatibility with specific Windows programs.
I recommend putting it on an old hard drive & giving it a whirl. And try several versions; there are a lot of free ones out there. I recommend Ubuntu, openSUSE and Mandriva for starters. Why pay for the cow when you can get the milk for free?
fffblackmage - Thursday, August 27, 2009 - link
Unfortunately, most of the same problems mentioned in the article are keeping me from switching to Ubuntu as my primary OS. As far as gaming is concerned, I'll be going with Win7 (I'm still using XP atm). If I ever get a netbook or non-gaming laptop, I'll certainly consider Ubuntu. However, the likelihood of moving to Ubuntu when the laptop comes with Win7 will probably be very small, unless I can somehow get a refund for not using Win7.
Also, I guess some people do like the new GUI in MS Office. I found it annoying, as I'm still used to the old GUI. But I suppose in time I may learn to accept it, like how I eventually accepted the XP look.
sanjeev - Thursday, August 27, 2009 - link
Yesterday the manufacturer was Canon, today it's "Canonical". Is Ubuntu manufactured by different vendors (or types of vendor)? ... just asking :).
I've yet to finish 25 (+ 11) more pages.
apt1002 - Thursday, August 27, 2009 - link
Excellent article, thank you. I will definitely be passing it on. I completely agree with superfrie2 about the CLI. Why resist it?
Versions: I, like you, originally plumped for Hardy Heron because it is an LTS version. I recently changed my mind, and now run the latest stable Ubuntu. As a single user, at home, the benefits of a long-term unchanging OS are pretty small, and in the end it was more important to me to have more recent versions of software. Now if I were administering a network for an office, it would be a different matter...
Package management: Yes, this is absolutely the most amazing part of free software! How cool is it to get all your software, no matter who wrote it, from one source, which spends all its time diligently tracking its dependencies, checking it for compatibility, monitoring its security flaws, filtering out malware, imposing sensible standards, and resisting all attempts by big corporations to shove stuff down your throat that you don't want, all completely for free? And you can upgrade *everything* to the latest versions, at your own convenience, in a single command. I still don't quite believe it.
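As a concrete sketch of that single-command convenience, using the standard apt tools on Ubuntu/Debian:

    sudo apt-get update     # refresh the package lists from every configured repository
    sudo apt-get upgrade    # upgrade every installed package to the newest packaged version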
Unpackaged software: Yes, I agree, unpackaged software is not nearly as good as packaged software. It's non-uniform, may not have a good uninstaller, might require me to install something else first, might not work, and might conceal malware of some sort. That's no different from any other OS. However, it's not as bad as you make out. There *is* a slightly more old-fashioned way of installing software: tarballs. They're primitive, but they are standard across all versions of Unix (certainly all Linux distributions), they work, and pretty much all Linux software is available in this form. It never gets worse than that.
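A typical tarball install starts roughly like this (the package name and version are placeholders; the README or INSTALL file carries the real instructions):

    tar xzf someprogram-1.0.tar.gz    # unpack the source tree
    cd someprogram-1.0
    less README                       # build and install instructions usually live here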
Games: A fair cop. Linux is bad for games.
GPUs: Another fair cop. I lived with manually installing binary nVidia drivers for five years, but life's too short for that kind of nonsense. These days I buy Intel graphics only.
40 second boot: More like 20 for me on my desktop machine, and about 12 on my netbook (which boots off SSD). After I installed, I spent a couple of minutes removing software I didn't use (e.g. nautilus, gdm, and most of the task bar applets), and it pays off every time I boot.
Separate menu bar and task bar: I, like you, prefer a Windows-ish layout with everything at the bottom, so after I installed I spent a minute or two dragging-and-dropping it all down there.
GregE - Wednesday, August 26, 2009 - link
I use GNU/Linux for 100% of my needs, but then I have for years, and my hardware and software reflect this. For example, I have a Creative Zen 32GB SSD music player and only buy DRM-free MP3s. In Linux I plug it in, fire up Amarok, and it automatically appears in the menus and I can move tracks back and forth. I knew this when I bought it; I would never buy an iPod, as I know it would make life difficult. The lesson here is that if you live in a Linux world, you make your choices and purchases accordingly. A few minutes with Google can save you a lot of hassle when it comes to buying hardware.
There are three web sites any Ubuntu neophyte needs to learn.
1. www.medibuntu.org, where multimedia hassles evaporate.
2. http://ubuntuguide.org/wiki/Ubuntu:Jaunty - the missing manual, where you will find the solution to just about any issue.
3. http://www.getdeb.net/ - where new versions of packages are published outside of the normal repositories. You need to learn how to use the gdebi installer, but essentially you download a deb and double-click on it.
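If you prefer the command line over double-clicking, gdebi can also be run directly (the filename here is just an example):

    sudo gdebi some-package_1.0_i386.deb    # installs the .deb and pulls its dependencies from the repositories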
Then there are PPA repositories for the true bleeding edge. This is the realm of the advanced user.
For a home user it is always best to keep up to date. The software is updated daily; what did not work yesterday works today. Hardware drivers appear all the time, and by sticking with LTS releases you are frozen in time. Six months is a long time, a year is ancient history. An example is USB TV sticks: buy one and plug it into 8.10 and nothing happens; plug it into 9.04 and it just works, or it still does not work but will in 9.10.
Yes it is a wild ride, but never boring. For some it is an adventure, for others it is too anarchic.
I use Debian Sid which is a rolling release. That means that there are no new versions, every day is an update that goes on forever. Ubuntu is good for beginners and the experienced, the more you learn the deeper you can go into a world of software that exceeds 30,000 programs that are all free in both senses.
I look forward to part 2 of this article, but remember that the author is a Linux beginner, clearly technically adept but still a Linux beginner.
It all comes down to choice.
allasm - Thursday, August 27, 2009 - link
> I use Debian Sid which is a rolling release.
> That means that there are no new versions, every day is an update that goes on forever.
This is actually one of the best things about Ubuntu and Debian - you NEVER have to reinstall your OS.
With Windows you may live with one OS for years (few manage to do that without reinstalling, but it is definitely possible) - but you HAVE to wipe everything clean and install a new OS eventually. With Debian and Ubuntu you can simply keep upgrading and be happy. At the same time, no one forces you to upgrade ALL the time, or upgrade EVERYTHING - if you are happy with, say, Firefox v2 and don't want to go to v3 because your favourite skin is not there yet, just don't upgrade that one app (and decide for yourself whether you need the security fixes).
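One hedged sketch of holding a single package back while everything else upgrades, using dpkg's package-selection mechanism (the package name is just an example):

    echo "firefox hold" | sudo dpkg --set-selections      # pin firefox at its current version
    sudo apt-get upgrade                                   # everything else still upgrades normally
    echo "firefox install" | sudo dpkg --set-selections   # release the hold later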
Some time ago I turned on a Debian box which had been offline/turned off for 2+ years and managed to update it (to a new release) with just two reboots (one for the new kernel to take effect). That was it; it worked right after that. To be fair, I did have to update a few config files manually afterwards to make it flawless, but even without the manual updates the OS at least booted "into" the new release. Naturally, all my user data stayed intact, as did most of the OS settings. Most (99%) of programs worked as expected as well - the problematic 1% being some GUI programs not dealing well with the new X/window manager. And I had no garbage files or anything left over after the update (unlike what you get if you try to upgrade WinXP to, say, Vista).
Having said all that, I 100% agree that linux has its problems as a desktop OS (I use windows more than linux day-to-day), but I totally disagree that using one OS for a long time is a weak point of Ubuntu.
P.S. One thing I never tried is upgrading a 32-bit distro to 64-bit - I wonder if this is possible on a live OS using a package manager.
wolfdale - Wednesday, August 26, 2009 - link
A good article, but I have a few pointers. 1) More Linux distros need to be reviewed. Your "out of the box" review was informative but seemed more in tune with commercial products aimed at making a profit (i.e., is this a good buy for your money?). I, for one, used to check AnandTech.com before making a big computer purchase. However, Linux is free to the public, so the tradeoff for the user is now how much time to invest in learning and customizing a particular distro. Multi-distro comparisons along with a few customized snapshots would help the average user decide what to spend his valuable time and effort on.
2) Include Linux compatibility in hardware reviews. Like I said earlier, I once used AnandTech.com as my guide for all PC-related purchases, and I have to say about 80% of the time it was correct. But try to imagine my horror about 1.5 years ago when my brand spanking new HD4850 video card refused to do anything related to 3D on Ubuntu. I spent weeks trying to get it to work but ended up selling it and going with Nvidia. Of course it was a driver issue, but nowhere did AnandTech.com mention this other than saying it was a best buy.
Thanks for listening, I feel better now. I'm looking forward to reading your Ubuntu 9.04 review and please keep adding more linux related articles.
ParadigmComplex - Wednesday, August 26, 2009 - link
When I first saw that there was going to be a "first time with Linux" article on Anandtech, I have to admit I was a bit worried. While the hardware reviews here are excellent, that's already something you guys are familiar with - it's not new ground; you know what to look for. I sadly expected Ryan would enter with the wrong mindset, trip over something small and end up with an unfair review, like almost all "first time with Linux" reviews end up being. Boy, was I wrong.
With only one major issue (about APT, which I explained in another post) and only a handful of little things (which I expect will be largely remedied in Part 2), this article was excellent. Pretty much every major thing that needed to be touched on was hit, and most of Ubuntu's major pluses and minuses were fairly reviewed. It's evident you really did your homework, Ryan. Very well done. I should have known better than to doubt anyone from Anandtech; you guys are brilliant :D
Fox5 - Wednesday, August 26, 2009 - link
One last thing I forgot to say... good job on the article. I (and many others) would have liked to see 9.04 instead (I don't know of anyone who uses the LTS releases; those seem to be aimed at system integrators, such as Dell's netbooks with Ubuntu), but the article itself was quality.
jasperjones - Wednesday, August 26, 2009 - link
I'd like to make one last addition in a similar spirit. I appreciate this article as a generally unbiased review that covers many important aspects of a general-purpose OS. And just to be sure: I'm not a Linux fanatic; in fact, for some reason, I'm writing up this post on Vista x64 ;)
jasperjones - Wednesday, August 26, 2009 - link
You're right that there are historical reasons that dictate that one Linux binary might be in /usr/bin, another in /sbin or /usr/sbin, yet another one in /usr/local/bin, etc. However, you really couldn't care less as long as the binary is in your PATH; "which foo" will then tell you its location. Furthermore, there's hardly any need to manually configure something in the installation directory. Virtually anything that can be user-configured (and there's a lot more that can than on Windows) can be configured in a file below ~ (your home). The name of the config file is usually intuitive.
But yeah, for things that you configure as admin (think X11 in /etc/X11/xorg.conf, or Postgres, usually somewhere under /usr/local/pgsql) you might need to know the directory. However, the admin installs the app, so he should know. Furthermore, GUIs exist to configure most admin-ish things (I don't know what it is in Ubuntu for X, but it's sax2 in SUSE; and it's pgAdmin for PostgreSQL in both Ubuntu and SUSE).
ParadigmComplex - Wednesday, August 26, 2009 - link
Again, if I may extend from what you've said: Even though it is technically possible to reorder the directory structure, Ubuntu isn't going to do it, for a variety of reasons:
First and foremost, one must remember Ubuntu is essentially just a snapshot of Debian's in-development branch (unstable, aka Sid) with some polish aimed towards user-friendliness and (paid) support. Other than the user-friendly tweaks and support, Ubuntu is whatever Debian is at the time of the snapshot. And while Debian has a lot of great qualities, user-friendliness isn't one of them (hence the need for Ubuntu). Debian focuses on F/OSS principles (DFSG), stability, security, and portability - Debian isn't going to reorder everything in the name of user-friendliness.
Second, it'd break compatibility with every other Linux program out there. Despite the fact that Ryan seemed to think it's a pain to install things that aren't from Ubuntu's servers, it's quite common. If Ubuntu rearranges things, it'd break everything else from everyone else.
Third, it would be a tremendous amount of work. I don't have a number off-hand, but Ubuntu has a huge number of programs available in its repos that would have to be changed. Theoretically it could be done with a script, but it risks breaking quite a lot for no real gain. And this would have to be done every six months from the latest Debian freeze.
jasperjones - Wednesday, August 26, 2009 - link
I disagree with the evaluation of the package manager. First, there's a repo for almost anything. I quickly got used to adding a repo containing newer builds of a desired app and then installing via apt-get.
Second, with a few exceptions, you can just download source code and then install via "./configure; make; sudo make install." This usually works very well if, before running those commands, you have a quick look at the README and install required dependencies via apt-get (the versions of the dependencies in the package manager almost always are fine).
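A slightly fuller sketch of that flow, assuming the program already exists in the repositories (so apt can fetch its build dependencies) and that deb-src lines are enabled; the program name and version are placeholders:

    sudo apt-get build-dep someprogram         # install the build dependencies of the packaged version
    tar xzf someprogram-2.0.tar.gz
    cd someprogram-2.0
    ./configure && make && sudo make install   # the README may document extra configure flags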
Third, and most importantly, you can simply update your whole Ubuntu installation via dist-upgrade. True, you might occasionally get issues from doing that (the ATI/NVIDIA driver comes to mind), but think of the convenience. You get a coffee while "sudo apt-get dist-upgrade" runs, and when you get back, virtually EVERYTHING is upgraded to a recent version. Compare that with Windows, where you might waste hours upgrading all your apps (think of coming back to your parents' PC after 10 months and discovering all the apps are outdated).
ParadigmComplex - Wednesday, August 26, 2009 - link
I concur - while most of the article is quite good, Ryan really seemed to have missed quite a bit here. His analysis of it seemed rather limited if not misleading. Not everything *has* to be a package - I have various scripts strewn around, along with Firefox 3.6a1 and a bunch of other things, without having them organized properly as .deb's with APT. The packaging system is convenient if you want to use it, but it is not required.
Additionally, Ryan made it seem as though everything must be installed through Synaptic or Add/Remove and that there were no stand-alone installers along the lines of Windows' .msi files. It's quite easy on Ubuntu to download a .deb file and double-click on it. In fact, it's much simpler than Windows' .msi files - there are no questions or hitting Next. You just give it your password and it takes care of everything else.
The one area where I agree with Ryan is that there needs to be a standardized, easy, GUI way to add a repository (both the address and the key) to APT. I have no problems with doing things like >>/etc/apt/sources.list, but I could see where others might. I suspect this could be done through a .deb, but I've never seen it done that way.
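For what it's worth, a minimal sketch of adding a repository and its key from the command line; the repository URL and key ID here are made up, so substitute the real ones from the project's instructions:

    echo "deb http://ppa.example.org/ubuntu hardy main" | sudo tee -a /etc/apt/sources.list
    sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 0x12345678   # import the signing key
    sudo apt-get update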
Ryan Smith - Wednesday, August 26, 2009 - link
Something I've been fishing for here and have not yet seen much of are requests for benchmarks. Part 2 is going to be 9.04 (no multi-Linux comparisons at this point, maybe later) and I'd like to know what you guys would like to see with respect to performance. We'll have a new i7 rig for 9.04, so I'll be taking a look at a few system-level things (e.g. startup time) alongside a look at what's new between 8.04 and 9.04. I'll also be taking a quick look at some compiler stuff and GPU-related items.
Beyond that the board is open. Are there specific performance areas or applications that you guys would like to see (no laptops, please)? We're open to suggestions, so here's your chance to help us build a testing suite for future Linux articles.
cyriene - Monday, August 31, 2009 - link
I'd like to see differences in PPD in World Community Grid between various Windows and Linux distros. I never really see AT talk about WCG or other distributed computing, but I figure if I'm gonna OC the crap out of my CPU, I might as well put it to good use.
Eeqmcsq - Thursday, August 27, 2009 - link
Cross platform testing is pretty difficult, considering there are a multitude of different apps to accomplish the same task, some faster, some slower. And then there's the compiler optimizations for the same cross platform app, as you mentioned in the article. However, I understand that from an end user's perspective, it's all about doing a "task". So just to throw a few ideas out there involving cross platform apps so that it's a bit more comparable...
- Image or video conversion using GIMP or VLC.
- Spreadsheet calculations using the Open Office Calc app.
- Performance tests through VMware.
- How about something java related? Java compiling, a java pi calculator app, or some other java single/multi threaded test app.
- Perl or python script tests.
- FTP transfer tests.
- 802.11 b/g/whatever wireless transfer tests.
- Hard drive tests, AHCI. (I read bad things about AMD's AHCI drivers, and that Windows AHCI drivers were OK. What about in Ubuntu?)
- Linux software RAID vs "motherboard RAID", which is usually only available to Windows.
- Linux fat32/NTFS format time/read/write tests vs Windows
- Weren't there some thread scheduling issues with AMD Cool'n'Quiet and Windows that dropped AMD's performance? What about in Linux?
While I'm brainstorming, here are a few tests that are more about functionality and features than performance:
- bluetooth connectivity, ip over bluetooth, etc
- printing, detecting local/network printers
- connected accessories, such as ipods, flash drives, or cameras through usb or firewire
- detecting computers on the local network (Places -> Network)
- multi channel audio, multi monitor video
Just for fun:
- Find a Windows virus/trojan/whatever that deletes files, unleash it in Ubuntu through Wine, see how much damage it does.
Veerappan - Thursday, August 27, 2009 - link
I know you've said in previous comments that using Phoronix Test Suite for benchmarking different OSes (e.g. Ubuntu vs Vista) won't work because PTS doesn't install in Windows, but you could probably use a list of the available tests/suites in PTS as a place to get ideas for commonly available programs in Windows/OSX/Linux. I'm pretty sure that Unigine's Tropics/Sanctuary demos/benchmarks are available in Windows, so those could bench OpenGL/graphics.
Maybe either UT2004 or some version of Quake or Doom 3 would work as gaming benchmarks. It's all going to be OpenGL stuff, but it's better than nothing. You could also do WoW in Wine, or Eve under Wine to test some game compatibility/performance.
Once you get VDPAU working, I'd love to see CPU usage comparisons between windows/linux for media playback of H.264 videos. And also, I guess, a test without VDPAU/VAAPI working. Too bad for ATI that XvBA isn't supported yet... might be worth mentioning that in the article.
You also might want to search around for any available OpenCL demos which exist. Nvidia's newest Linux driver supports OpenCL, so that might give you a common platform/API for programs to test.
I've often felt that DVD Shrink runs faster in Wine than in Windows, so the time to run a DVD rip would be nice, but might have legal implications.
Some sort of multitasking benchmark would be nice, but I'm not sure how you'd do it. Yeah, I can see a way of writing a shell script to automatically launch multiple benchmarks simultaneously (and time them all), but the windows side is a little tougher to me (some sort of batch script might work). Web Browsing + File Copy + Transcoding a video (or similar).
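A rough sketch of what such a script could look like on the Linux side; the individual workloads below are placeholders, and the point is simply launching them in parallel and timing the whole batch:

    #!/bin/bash
    # launch three workloads in parallel and report wall-clock time for the whole batch
    time (
        cp -r ~/testdata /tmp/copytest &                        # disk-heavy file copy
        gzip -9 -c ~/testdata/video.avi > /tmp/video.avi.gz &   # CPU-heavy compression
        wget -q -O /dev/null http://localhost/largefile.bin &   # network transfer
        wait    # block until all three background jobs finish
    )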
Ooh... Encryption performance benchmarks might be nice. Either a test of how many PGP/GPG signs per second, or copying data between a normal disk partition, and a TrueCrypt partition. The TrueCrypt file copy test would be interesting to me, and would cover both encryption performance and some disk I/O.
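A quick-and-dirty sketch of the signing half of that test (it assumes a default GPG secret key already exists; the file name is a placeholder):

    # time 100 detached GPG signatures of a small test file
    time bash -c 'for i in $(seq 1 100); do gpg --batch --yes --detach-sign --output /dev/null testfile.bin; done'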
One last suggestion: Folding@Home benchmarks. F@H is available at least in CPU-driven form in Windows/Linux, and there's F@H benchmark methodologies already developed by other sites (e.g. techreport.com's CPU articles).
Well, that's enough for now. Take or leave the suggestions as you see fit.
haplo602 - Thursday, August 27, 2009 - link
You are out of luck here... Linux does not compare directly to Windows because they are different architectures. You already did what you could in the article. Especially in a binary distribution like Ubuntu, compilation speed tests are meaningless (but Gentoo folks would kiss your feet for them).
Boot-up times are also not useful; the init scripts and even the init mechanisms are different from distro to distro.
Compression/filesystem benchmarks are halfway usable. On Windows you only have NTFS these days; on Linux there are like 20 different filesystems you can use (ext3/4, ReiserFS, JFS and XFS are the most used). Also, quite a few distros offer LVM/EVMS backends or software RAID.
I do not think there is much benchmarking you can do that will help in Linux vs. Windows, or even Ubuntu vs. Windows, because the same benchmarks will differ between Ubuntu versions.
The only usable types are Wine+game vs. Windows+game, a native Linux game vs. the same Windows game (mostly limited to Unreal and Quake engines), some POV-Ray/Blender tests, and application comparisons (like you did with the Firefox JavaScript speed).
GeorgeH - Wednesday, August 26, 2009 - link
Not really a benchmark per se, but I'd be curious to know how the stereotypes of Windows being bloated and Ubuntu being slim and efficient translate to power consumption. Load and idle would be nice, but if at all possible I'd be much more curious to see a comparison of task energy draw, i.e. not so much how long it takes them to finish various tasks, but how much energy they need to finish them. I know that'd be a very difficult test to perform for what will probably be a boring and indeterminate result, but you asked. :)
ioannis - Wednesday, August 26, 2009 - link
Is there some kind of cross-platform test that can be done to test memory usage? Maybe Firefox on both platforms? Not sure. By "no laptops", I presume you mean no battery tests (and therefore no power and, as a consequence, heat). That would have been nice though, maybe for those looking for a 'quiet' setup.
but yes, definitely GPU (including video acceleration) and the GCC vs Visual Studio vs Intel compiler arena (along with some technical explanation why there are such huge differences)
ParadigmComplex - Wednesday, August 26, 2009 - link
If you can, find games that are reported to work well under WINE and benchmark those against the same games running natively in Windows. It'd be interesting to see how the various differences between the two systems, and WINE itself, could affect benchmarks.
Fox5 - Wednesday, August 26, 2009 - link
The number 1 use of Ubuntu is probably going to be netbooks/low-end desktops for people who just wanna browse the net. In that case, the browsing experience (including Flash) should be investigated.
Boot up time is important.
Performance with differing memory amounts would be nice to see (say 256MB, 512MB, 1GB, and 2GB or higher). Scaling across CPUs would also be nice.
Ubuntu as a programming environment versus windows would be good to see, including available IDEs and compiler and runtime performance.
Ubuntu as a media server/HTPC would be good to see. Personally, I have my windows box using DAAP shares since Ubuntu interfaces with it much better than Samba. And as an HTPC, XBMC and Boxee are nice, cross-platform apps.
Finally, how Ubuntu compares for more specific applications, for instance scientific computing, audio editing, video editing, and image manipulation. Can it (with the addition of apps found through its Add/Remove Programs app) function well enough in a variety of areas to be an all-around replacement for OS X or Windows?
Speedwise, how do GIMP and Photoshop compare doing similar tasks? Is there anything even on par with Windows Movie Maker?
What's Linux like game-wise? Do Flash games take a noticeable performance hit? Are the smattering of id Software games and Quake/Super Mario Bros/Tetris/etc. clones any good? How does it handle some of the more popular WINE games?
jasperjones - Wednesday, August 26, 2009 - link
I second most of Fox5's suggestions. 1) I've been completely ignorant of software development on Windows over the last few years. Comparison of MS Visual Studio vs. Eclipse or vs. NetBeans/Sun Studio? How fast are CLI C++ apps on Windows vs. Linux? Perhaps using both GNU and Intel C++ compiler toolchains on Linux, and possibly MS Visual C++ and Intel Visual C++ on Windows.
Perhaps less esoteric: 2) instead of benching SMB/CIFS on Windows vs. Samba on *nix, bench something *nix-native such as scp/sftp or NFS. Netperf.
3.) Number-crunching stuff. I guess this is sort of similar to running at least a few synthetic benches. LINPACK or some other test that uses BLAS or LAPACK, tests that use FFTW. Maybe even SPEC (I wouldn't expect any exciting results here, though, or are there?)
Eeqmcsq - Wednesday, August 26, 2009 - link
Are you looking for benchmarks in Windows vs Ubuntu with the same hardware? Or benchmarks in different CPUs/motherboards/etc with the same Ubuntu?
Ryan Smith - Wednesday, August 26, 2009 - link
Cross-platform. There's no problem coming up with Linux-only benchmarks for hardware.
Eeqmcsq - Wednesday, August 26, 2009 - link
I have a question about your benchmarks that involve files, such as copying and zipping. When you run your benchmarks, do you run them multiple times and then get an average? I ask that because I have learned that in Linux, files get cached into memory, so subsequent runs will appear faster. I suspect the same thing happens in Windows. Do you take that into account by clearing cached memory before each run?
Ryan Smith - Wednesday, August 26, 2009 - link
We reboot between runs to avoid cache issues (and in the case of Windows, wait for it to finish filling the SuperFetch cache).
fri2219 - Wednesday, August 26, 2009 - link
I heard Sony is coming out with this thing they call a Walkman. You should review that next!
StuckMojo - Wednesday, August 26, 2009 - link
ROFL!
Fox5 - Wednesday, August 26, 2009 - link
The LTS is really for the same types of people that avoid grabbing the latest MS service pack, i.e. anyone who's still running Windows XP SP2 with IE6. Do that comparison and see how they stack up. Ubuntu is little more than a tight integration of many well-tested packages; there's no reason to go with Ubuntu's LTS when everything else already goes through its own extensive testing. Given how quickly open source software advances, I'd say the LTS is probably less stable than the most up-to-date versions, and certainly far behind on usability.
You want the equivalent of Ubuntu's LTS in Windows? It most closely matches the progression that the Windows server versions follow.
Ryan Smith - Wednesday, August 26, 2009 - link
To put things in perspective, 8.04 was released shortly after Vista SP1 and XP SP3 were. So Hardy vs. XP SP2 (a 4-year-old SP) is a pretty poor comparison. You'll see an up-to-date comparison in part 2 when we look at 9.04.
awaken688 - Wednesday, August 26, 2009 - link
I'm glad you did this article. It really has been something I think about. I'm ready to read your Part II. As others have mentioned, I have a couple of other articles that would be great. 1) The comparison of the various versions as mentioned: SuSE, Ubuntu 9.04, BSD, etc...
2) Someone mentioned VirtualBox. I'd love to hear more about this including a detailed setup for the normal user. I'd love to be able to surf while in Linux, but able to play games in Windows and keep them separate for added security.
Thanks for the article! Hope to see one or both of the ideas mentioned above covered.
Guspaz - Wednesday, August 26, 2009 - link
"Not that it would necessarily be of much use, the last time I saw any statistics for instant messaging network usage, the vast majority of North American users were on AOL’s AIM network."IM use is highly regionalized. As such, AIM is clearly the dominant IM in the USA. However, Canada is dominated by MSN Messenger, and has been for many years (most of us migrated from ICQ to MSN around the release of Windows XP, I believe, due to the bundling of then Windows Messenger).
So, if Canada is dominated by MSN, while I can't speak for Mexico, it's misleading to refer to "the vast majority of North American users" being on AIM. As a Canadian, I can't think of anybody I know in person who uses AIM. They all use MSN or Google Talk without exception.
Aclough - Wednesday, August 26, 2009 - link
For myself, the thing that most bugs me when I have to go back to Windows is all the features missing from the window manager. I've come to rely on having multiple workspaces on my desktop, but I can adjust to having just one fairly easily when I'm not working on a lot of different stuff at once. What really bugs me, though, is how much more effort it takes to move or resize windows in Windows. On Linux I can press ALT and then click anywhere on the window, but with Windows I have to carefully click the title bar or the very edge of the window, and that takes noticeably longer once you're used to doing things the other way. Oh, and I find that the Linux scheduler seems to be noticeably better than the Windows one at preserving responsiveness when the system is under load.
fumacapena - Wednesday, August 26, 2009 - link
Great article! How about some benchmarks of "minimal" distros (like Puppy, Tiny Core, ...)?
I like the idea of "resurrecting" an old PC, but I would like to see benchmarks on quad cores and the i7 too!
Anandtech is great, Bench(beta) is awesome!!
(sorry for my bad English)
Thanks
InGraphite - Wednesday, August 26, 2009 - link
A few months ago most major trackers unbanned Transmission, but it still doesn't seem to be universally accepted on private trackers.I remember offhand (I could be wrong) that the main gripe was due to the fact it made excessive queries and thus flooded trackers with requests, or had the ability to.
chomlee - Wednesday, August 26, 2009 - link
I think you really need to mention the big picture here. I myself just tried Ubuntu for the first time 2 months ago, and I will admit that I spent up to 8 hours trying to figure out how to install a specific program (before I found out there was a way to get the package manager to find the install) and wanted to smash my computer at times. Now that I have learned quite a bit more, I realize that the few things I have installed work great and flawlessly.
Anyhow, back to the big picture. I can understand some of your concerns with how the OS will work with specific programs, but what I have found is that most people I know use their computers for two things: email and web browsing. Most of these people are constantly having problems with the system running too slowly and can't seem to get rid of hidden viruses/malware. So I think that those people could easily be much happier with a simple OS like Ubuntu just for email and web browsing (and I would get a heck of a lot fewer calls from my dad asking me why his computer is running too slowly). Let's also not forget that everything is moving to be browser-compatible (like you mentioned).
Also, for people like myself, I use my Ubuntu system for a file server as well as a media center (XBMC is Awesome).
So, yes, for burning DVDs/CDs, playing games, and Microsoft Office, I see no reason why you wouldn't use Windows, but I think 95% of users would be perfectly fine with Ubuntu, which is something Mr. Bill would not be very happy about once the public realizes it.
Keno - Thursday, August 27, 2009 - link
I think you have missed one small but important part. I have been an Ubuntu user since 8.04. I came to Linux because of the constant threat of viruses.
Last month I installed Windows 7, and it is very user friendly, but after Avira AntiVir got crashed by a virus I installed Kaspersky Internet Security 2010, and then it took almost twice as long to boot. So I gladly returned to Ubuntu 9.04. Because Microsoft Windows cannot realistically run without antivirus, I think you should do some real benchmarking and test Windows WITH antivirus installed.
On Ubuntu I have ClamWin just in case I get some files from Windows users :)
Thanks
ioannis - Wednesday, August 26, 2009 - link
Just wanted to point out that you can install software under the LiveCD. Of course it does not install to the hard drive; it remains on a RAM drive, so when you reboot, it's gone. It's still useful if you wish to test out some package or perform some task with a tool not installed by default on the LiveCD.
strikeback03 - Wednesday, August 26, 2009 - link
Even more useful (and not mentioned) is that Ubuntu can easily run off a flash drive, and more recent versions even include a GUI tool for installing it to one. Then all installs and other changes are saved from session to session, and everything runs much more quickly than the LiveCD.
Mr Pearce - Wednesday, August 26, 2009 - link
It would be great if you could do more articles on compiler and especially driver performance differences. That was the most interesting part of this article.
Ryan Smith - Wednesday, August 26, 2009 - link
This is what Part 2 will look at. I can compile some stuff by hand to see if it closes the Windows/Ubuntu gap, and I have plenty of video cards on hand to test what I can when it comes to graphics.
justniz - Wednesday, August 26, 2009 - link
Maybe I'm missing something, but this appears to be a new article. Why are you reviewing a year-old version of Ubuntu? There have been nearly 3 releases since then (Ubuntu is on 9.04 now, with 9.10 coming very soon).
It's important to review the most recent version, as Ubuntu is totally unlike the Microsoft world in that new releases are frequent (every 6 months) and bring real practical improvements.
ioannis - Wednesday, August 26, 2009 - link
I couldn't help myself, but...RTFA!!
:-D
PS: if you read the article, you will also get the joke ;)
nafhan - Wednesday, August 26, 2009 - link
Great article. I look forward to reading the follow-up. One comment on security that I would like to make: the commercial Linux vendors (IBM, Novell, Red Hat, etc.) are all VERY dedicated to ensuring Linux security, as many/all of their server products use Linux, and the changes they make filter back down to the Linux desktop community. This is something that OS X does not have to nearly the same degree.
My experience with running Linux on the desktop sounds pretty much the same as yours.
-Games killed it in general. I don't usually have a top-of-the-line system, so I'm usually pushing my computer to its limits to run newer games under Windows. Also, I hate dual booting, and most of the FOSS I use is available as a compiled binary for Windows.
-Drivers killed it in one specific instance with an older laptop, as I never got NdisWrapper (required for my wifi card's Windows drivers) to run better than intermittently. I spent way too much time messing with it.
crimson117 - Wednesday, August 26, 2009 - link
[quote]and for the price you're only giving up official support.[/quote]
Ubuntu doesn't have free official support, but neither does Microsoft. Apple does give 90 days free phone support, to their credit, but after that you have to pay.
You can always hire an expert (from MS, Apple, or a third party) to help you, but that's also true with Ubuntu, though I expect there are fewer such experts to be found.
MS, Apple, and Ubuntu all offer free web-based help, both community maintained and "officially" maintained.
So I think it's misleading to imply that going from Windows or Mac to Ubuntu means you're downgrading your support options. People overestimate just how "supported" their operating systems are. Also, Linux/Ubuntu releases fixes and updates much more quickly than Apple or MS, so your chances of hitting a bug are lower in the first place. (MS maintains a huge knowledge base of bugs they haven't bothered to fix yet and might have a workaround for - but I hardly see that as a positive.)
crimson117 - Wednesday, August 26, 2009 - link
I'm probably being too hard on Apple here. The Genius Bar offers free 15-minute appointments to diagnose problems and offer software tips/advice. I'd say Apple has the best "official" support, followed by a fuzzy tie between Ubuntu and Microsoft.
gordonsmall - Wednesday, August 26, 2009 - link
While I have used computers for 20 years or more, I am not a techie. I am much more interested in an experience that "just works". When Vista came out I decided to explore the Linux desktop world. I have been using it as my primary system (I still keep the dual-boot option for XP) for just under 2 years.
I agree that "free" and security are big considerations for moving to a Linux desktop environment. However, there are some other items (and you might class them under security) that I like - because of the file structure, you don't have to periodically defrag your system. Both systems have a lot of updates, but so far I have not gotten the feeling that my Ubuntu system is gradually slowing down and clogging up with a lot of useless files (you don't see a lot of adds for such utilities as Registry Cleaners:). I no longer experience the MS ripple effect - when MS sneezes, other Windows apps may get a cold.
That is not to say that there cannot be issues. My pet peeve has been that my sound has disappeared on a couple of occasions after downloading updates. Using Google and the Ubuntu documentation, I have been able to get it back up - but I wish that wouldn't happen. Then again, Windows updates can on occasion cause some issues too.
I think you made a very valid point about the issue of tech support. Google has made a big difference in problem solving.
Enjoyed your review.
Gordon Small
yuchai - Wednesday, August 26, 2009 - link
I've tried using Linux (usually Ubuntu) as a full replacement desktop on and off for the last few years. I've gone back to Windows every time after a while. Some key points:
1. For my desktop usage, there honestly isn't anything that Linux does better, in terms of functionality, than Windows.
2. Windows is cheap enough that I do not mind spending the money on it. For the $100 that I spent for Vista 64 Home Premium OEM, it is quite worthwhile even if I only use it for 3 years. Yes, there are more apps out of the box for Linux, but it's usually easy to find freeware for Windows with the same functionality. Even Office is now pretty affordable with the Home & Office version.
3. Games - Wine just doesn't cut it. When I want to play a new game, I want to buy it and play it immediately! I do not want to have to do research to see whether some game would work on Wine even before I buy it. I do not want to spend hours troubleshooting on the internet if something doesn't work right.
4. There's always something that you want to change in Linux that you can't figure out. Yes, usually the solution is on the internet. And I used to even enjoy spending time and looking for the solution. But, it eventually grew old. Now I just want things to work and keep working.
Note that I do love Linux and actually have a server that doubles as a MythTV HTPC setup. It's a beautiful thing. I am comfortable with shell commands and frequently use SSH to perform multiple functions remotely. My opinions above are purely based on desktop usage.
cciemd - Wednesday, August 26, 2009 - link
Great article, Ryan! Putting out some well-written Linux articles really adds depth to your site. I have been reading this site daily for years, and this article is prompting my first post. For future articles it would be great to see some Linux benchmarks in most of the hardware reviews. There are some excellent tools out there (check out http://www.phoronix-test-suite.com/). This would also give some closer apples-to-apples comparisons for Mac vs. Linux performance. I for one would LOVE to see SSD articles report some Linux (and OpenSolaris/ZFS) benchmarks along with all the Windows tests.
Users often don't realize how much they benefit daily from open source software. I don't think most Mac users realize all the OSX pieces that are used in the background for which Apple leverages open source code (Samba for SMB access and sharing, Webkit for Safari, etc.). Home NAS and enterprise storage which serve files in Windows environments are often *nix based.
It is also a myth that open source means that developers aren't paid. Most enterprises recognize that implementing even commercial apps can require considerable internal development manpower. If enterprise developers can utilize open source code internally and contribute back to the code base, the companies save considerable money and benefit from a healthy software development ecosystem. There are thousands if not millions of developers employed to work on open source code.
Please keep up the good work. I am looking forward to your next article.
Ryan Smith - Wednesday, August 26, 2009 - link
Unfortunately the Phoronix Test Suite doesn't work under Windows, so it's of limited utility. It's something we may be able to work into hardware reviews, but it's not really applicable to OS reviews.
chrone - Wednesday, August 26, 2009 - link
What I'd like to see in the next Ubuntu version is softer and smoother graphics and font rendering. I hate the way GNOME renders graphics and fonts; they make it look like an old operating system. Using the MS core fonts somehow helps, but not much. I know there's Compiz and friends, but I just wish it came by default, so there's no need to hassle with Compiz and its settings. I wish rendering could be as soft and smooth as in Windows and Mac OS X.
The look and feel should be tweaked more often! :D
Telkwa - Wednesday, August 26, 2009 - link
Nobody's going to agree with the entire article. I'm just glad to see Anandtech paying some attention, and would welcome any articles, tests, reviews, etc. It's embarrassing to visit the "Linux" tab and see the latest article was posted in July of 2005...
Geraldo8022 - Wednesday, August 26, 2009 - link
This is based on Ubuntu, and I installed it this past weekend. I am having certain issues with it. Yes, it is free, and overall I like it very much and am pleasantly surprised. But this has shown me that Windows 7 will be a comparative bargain. I do not have the time to sit in front of the computer and play with Linux, trying to find out why certain videos don't play, why I am having eye strain, why clicking on an audio link doesn't play anything, and a few more things. When I go to the Mint forums I am confronted with a Tower of Babel, what with all of the acronyms, and told to go to the terminal and type $surun%(8#**#. OK, now turn your head and cough. I'll keep Linux on this machine to boot up and play with now and then. It beats solitaire for the time being.
VooDooAddict - Friday, August 28, 2009 - link
You hit on a good point. People I've set up with dual-booting Linux distros and Windows begin to appreciate what they are paying for with Windows. The typical response is "This is cool (Ubuntu) and I can see why some people like it. But I'm going to stick with Windows; it's worth the money to me." They appreciate that Linux could work, but see the "value" in paying for something familiar.
VooDooAddict - Friday, August 28, 2009 - link
I run Vista on my main PC, Vista on all the spare LAN gaming PCs, and I have an Ubuntu 9.04 VM and Ubuntu Netbook Edition on my old tablet PC (small and netbook-like).
Locutus465 - Wednesday, August 26, 2009 - link
Just out of curiosity, what user mode were you having guests run in? Even in Vista I don't provide anything greater than a standard user account. With that, guests need my password (which they don't have) to mess my machine up. Going back as far as Windows 2000, as long as you pair Windows with good anti-spyware (Spybot, or Defender for XP if you choose) and antivirus (I like Avast and AVG; both are free and have nil footprints), you basically don't have to worry about system security as long as the person is running a standard user account. On my parents' system, we went from having to wipe and reinstall Windows every time I came home from college to a rock-solid system that absolutely never failed once I performed these steps. I still like the XP/2000 behaviour of simply denying access better than the current UAC implementation. But Vista 64 + UAC (active) seems to be secure enough, particularly when paired with the aforementioned antivirus software.
Ryan Smith - Wednesday, August 26, 2009 - link
For what it's worth, it's an admin account. I know, I know, I could do Limited User, but that tends to just elicit complaints. XP's Limited User mode is embarrassing compared to how well Vista/Win7 does it. Since it's basically just a web browsing laptop anyhow, it's a perfect fit for Ubuntu, since I wouldn't need to be concerned with Windows malware, period.
leexgx - Wednesday, August 26, 2009 - link
I have to agree: even in XP's standard/limited user account mode it's quite hard for stuff to install itself, though not impossible. (Vista and Win7 with UAC on and a standard account, with the admin account password-protected, should prevent the system from being messed up.)
aguilpa1 - Wednesday, August 26, 2009 - link
It seems the OS does not like Core 2 Duos and NVIDIA 9800 GTX graphics, something even OS X was able to handle.
samspqr - Wednesday, August 26, 2009 - link
* For me, the best possible way to install applications on any OS, but especially on one that is free (libre), is as follows: you search the internet for the best program to meet your needs, you find it, you copy some code that identifies it, and you paste that into your package manager, which then connects to some database, checks that the program is not malware, looks for the latest version, and proceeds to download and install it, not caring whether it's open source or not. This would beat Windows/OS X by a wide margin, and also the current Ubuntu system, whose "we don't like this software, on philosophical grounds, so it's going to be a pain in the ass for you to install it" attitude is a bit too problematic.
* It would be nice if the "auto" option in the installer told you what it's going to do with your hard disk before going on to do it; I never use it, out of fear it might try to do something I don't like.
* I missed a comment in that section on how Photoshop CS3 costs a lot of $$$ while GIMP is free.
* along these lines, the comparison of total costs in time and money of installing windows/OSX/ubuntu, with all their companion programs, is striking
samspqr - Wednesday, August 26, 2009 - link
And about OpenOffice:
* I didn't check this lately, but aren't there still problems with VBA compatibility? If I can open my xls/xlsm files but I can't run my macros, it's no good; I have a ton of stuff written in VBA, and I'm definitely not doing all that work again.
* The ribbon UI in Office 2007 is a royal pain: it's only good for the "It looks like you're writing a letter" users, and you can't get rid of it. There are a lot of people doing real work in Excel, and none I talked to likes that ribbon thing; they'd all rather stay with Excel 2003.
Kakao - Wednesday, August 26, 2009 - link
Ryan, nowadays you don't need to dual boot. You can just set up a virtual machine. If you are a gamer, use Windows as the host and set up a Linux distro as a guest. If you have enough memory (4GB is very good), you can have both perfectly usable at the same time. I'm using VirtualBox and it works great.
VaultDweller - Wednesday, August 26, 2009 - link
"Manufacturer: Canon"I think you mean Canonical.
Ryan Smith - Wednesday, August 26, 2009 - link
It wasn't in our DB when I wrote the article; it was supposed to be added before it went live. Whoops. Thank you.
Proteusza - Wednesday, August 26, 2009 - link
I havent been able to read the whole cos I'm currently at work, but so far it seems good. Some people have been saying you should be testing 9.04, and I can see their point, but on the other hand, I agree that since 8.04 is the latest LTS release, it should be pretty stable still.Nonetheless, perhaps you could compare a later non LTS release to a service pack for Windows? I mean, there is some new functionality and some fixes. Granted, new versions of Ubuntu contain a lot more few features than Windows service packs.
I agree that the 6 month release cycle is too fast. I dont develop for Ubuntu myself, but I imagine a lot of time will be wasted on preparing for release twice a year. I mean, theres a lot of testing, bugfixing and documentation to be done, and I would think if you would only did that once a year, you would have more time for development. Although, I guess the more changes you do in a release the more you should test, so maybe thats invalid.
I've also never really liked the Linux filesystem and package manager idea. Granted, package managers especially have improved a lot lately, and personally I think we have Ubuntu to thank for that, with its huge focus on usability, which historically Linux hasnt cared at all about.
I also dont like over reliance on the terminal/CLI. I dont like that there are certain things that can only be done with it. Its easier and faster for me to do things with a GUI, because we are visual creatures and a GUI is a much better way of displaying information than just plain text. I think until a lot of the Linux developers get over the idea that the CLI is "the only way to go", the GUI will be underdeveloped. As I said, its only recently that some Linux developers have actually bothered to try to get the various desktop managers up to scratch.
The other thing I find interesting about Ubuntu, is the nerd rage that some Debian developers exhibit towards Ubuntu.
Anyway... when 9.10 comes out, I would love to see your impressions of the difference.
R3MF - Wednesday, August 26, 2009 - link
i thoroughly approve of AT running Linux articles... however I didn't bother to read this one, as anything from Q2 2008 is of zero interest to me now.
may i suggest a group-test to be published around Xmas of the following Q4 2009 distro releases:
Ubuntu 9.10
openSUSE 11.2
Fedora 12 (?)
Mandriva 2010
that would be awesome AND relevant to your readers.
CityZen - Wednesday, August 26, 2009 - link
I was one of those waiting for this article. I do remember getting excited when it was promised back in... (can't recall the year, sorry, it's been too long :) ). Anyway, the wait seems to have been worth it. Excellent article.
A suggestion for part 2: install LinuxMint 7 (apart from Ubuntu 9.04) and see which of the problems you found in part 1 with Ubuntu 8.04 are solved in LinuxMint "out of the box"
captainentropy - Tuesday, September 1, 2009 - link
I totally agree! To hell with Ubuntu, Mint7 is the best Linux distro by far. Before I settled on Mint I tried Ubuntu, Kubuntu, PCLinuxOS (my previous fave), Mepis, Scientific, openSUSE, Fedora, Slackware, CentOS, Mandriva, and RedHat. None could come close to the complete awesomeness, beauty, out-of-the-box completeness, and ease of use of Mint7.
I'm a scientist and I'm using it for sequence and image analysis, so far.
haplo602 - Wednesday, August 26, 2009 - link
so I got to the page before installation and I have so many comments I cannot read further :-)
I have been using Linux on and off as my main desktop system since Red Hat 6.0 (that's kernel 2.2, iirc), so some 10 years. My job is a unix admin, so I am obviously biased :-)
1. virtual desktops - while this heavily depends on your workflow, it helps to organise non-conflicting windows so they don't occupy the same space. I used to have one for IM/email, one with just a web browser, one with my IDE and work stuff, and one for GIMP and Blender. While this is my preference, it helps to kill the notification hell that is Windows. I hate how Windows steals focus from whatever I am working on just because some unimportant IM event just occurred.
2. package manager and filesystem - given my background, the Linux FHS is second nature to me. However, you failed to grasp the importance of the package manager here: it effectively hides the FHS from you, so you do not need to clean up manually after an uninstall. The only directories you should ever go into manually are /etc, your home dir, the system mount directory and whatever the log directory is. If you need to access other directories manually, then you are either a system developer, a programmer or too curious :-)
also, you can usually one-click install .deb packages and they appear in the package manager as usual; you just have to manage dependencies manually in that case. Repositories are nice because you need to set them up ONCE and then all your updates/future versions are taken care of.
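A minimal sketch of the command-line equivalent of that, assuming a downloaded file named package.deb (the name is a placeholder):

sudo dpkg -i package.deb    # install the local .deb file
sudo apt-get -f install     # pull in any missing dependencies from the repositories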
3. missing executable icons - this has a lot more background to it, but it is a mistake to use nautilus in the default icon mode. You basically cannot live without ownership/permissions displayed on a unix system; trying to hide this in any way in a GUI is a capital mistake. That's why a Windows Explorer-like file manager is not usable under Linux. Good old MC :-) Anyway, an executable file can be anything from a shell script to a binary file; you just have to have the correct launcher registered in the system and you can open anything. Basically the same as Windows, just not as GUI-friendly.
4. NVIDIA/ATI drivers - this is a story in itself. Use NVIDIA if you want ease of use; use ATI if you want to learn about the kernel and X :-) Dig through phoronix.com for more info.
ok I will post more comments as I read further :-)
haplo602 - Wednesday, August 26, 2009 - link
so I read the whole article. I would have some more comments :-)
1. installation - for me this was never a problem on any Linux distro I was using. My partition scheme does not change much and it is usually the trickiest part of the whole installation process. Try out the full Gentoo 3-stage installation if you want some fun (ok, it is not available via normal means anymore).
2. fonts - as you mentioned with codecs, there are software restrictions and licensing policies governing Linux distributions. MS fonts are licensed under different terms than GPL software; yes, even FONTS have licenses. So they are generally not included in Linux distributions by default.
What I missed from the article is the amount of customisation you can do with a typical Linux distro. Ubuntu alone has 3 main variants and you can mix and match them at will; you can even have all 3 installed and switch between the window managers by user preference.
Since you did not like the package manager anyway, you missed out on the main Linux strength: application variability.
From a common user perspective however, the article is quite correct. I would expect more from a seasoned windows user and AT editor.
n0nsense - Wednesday, August 26, 2009 - link
Ubuntu 8.04 is a 14-month-old creature. Two versions have been released after it, and the third one should arrive in October.
In terms of Windows that's a short time, but for Linux it's a lot of time.
I suggest your next review should be done on Ubuntu 9.10 instead of 9.04 (which IMHO is better than 8.04 but still lacks some polish).
As mentioned before, the advantage of CLI instructions is that they will work on any Desktop Environment (Gnome, KDE, XFCE etc.) as long as they're not related to the DE itself. Moreover, they will work on different versions (older/newer).
For example, in Vista/7 I couldn't find Network Connections in the GUI.
But who can stop me from typing "Network Connections" in Explorer's address bar? Sometimes the GUI changes, and even if only a little, most people will fail to follow screenshots. Not to mention that most desktops are so customized (on real geeks' computers) that they look too different. I'm not talking about icons or desktop backgrounds; I'm talking about panels (if any at all), docks, menus, context menus etc. In Linux almost everything can be changed, and old-school geeks who have had their Linux installations for years do these things, so each DE is probably unique. (I have Gnome and app settings/tweaks from over 7 years; some of them have probably never changed.)
The trick is that even when you reinstall the system, your personal settings may stay with you. (I jumped from Debian to Ubuntu to Gentoo, back to Ubuntu, to Ubuntu x86_64 and finally to Gentoo x86_64.) After all this, I have not lost any user customization/setting. On the system level it's harder, since Debian and Gentoo are very different. All this gives you motivation to change and to tweak, to make it better.
Windows users can't really customize, and when they do, it's only valid until they have to reinstall/upgrade their OS. Since most of the Windows users I know reinstall at least once a year, after a few cycles they will stay with the defaults for both OS and applications.
Switching to Linux is not the easiest thing. It's usually not a "love at first sight" story. But if you somehow stay around and get to know it, you can't be separated from it after :)
Even on Windows 7 I feel handicapped in terms of usability and effectiveness/productivity. (I spend more time in front of Windows than Linux computers.)
Eeqmcsq - Wednesday, August 26, 2009 - link
Thanks for your time spent on writing this article. I've made the jump from Windows to Ubuntu (and Xubuntu for my older computers) back around 7.10 and 8.04, and I went through some of the headaches in adjusting to Ubuntu, but I eventually solved all of them and I'm quite settled in now.
One comment about finding help in the form of command line instructions rather than GUI instructions: GUI instructions for Ubuntu would not be useful for Kubuntu or Xubuntu, since they use different window managers. The command line solutions usually work for all three.
Also, boot times were noticeably improved in 9.04. Perhaps you can run a quick retest on it.
And you CAN install stuff when using the live CD. I've installed a couple of temperature monitoring utilities when I was stress testing my motherboard.
Finally, thanks again for writing such a thorough look into your Ubuntu experiences. It was a great read in seeing how far Ubuntu has come and what it still lacks.
fepple - Thursday, August 27, 2009 - link
Yeah, you can set the APT sources to use a CD. There is an option for it under System > Administration > Software Sources, or you can edit the /etc/apt/sources.list file.
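A rough sketch of the CD route from a terminal; the exact label APT writes into sources.list depends on the disc, so the commented line below is only an approximation:

sudo apt-cdrom add    # scan the inserted disc and register it as a package source
# this appends a line to /etc/apt/sources.list roughly like:
# deb cdrom:[Ubuntu 8.04 _Hardy Heron_ - Release i386]/ hardy main restricted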
clarkn0va - Wednesday, August 26, 2009 - link
[quote]since SMB is the predominant protocol for consumer file server gear, it’s a fair test of such use.[/quote]
While this comment is not false, it presents a lazy approach to comparison; it's a one-sided contest, and Linux, pitted against Windows on home turf, doesn't stand much of a chance.
You as much as acknowledge this in the article, so why not provide some counterpoint? For example, consumer file server gear, even if it supports SMB almost ubiquitously, is usually *nix-based. So instead of just showing Windows and Linux clients interacting with Windows servers, show them interacting with *nix servers as well. Do some NFS transfers as well; NFS is well supported in consumer NAS these days.
You also really missed the boat on the video drivers. 8.04 was not the first Ubuntu release to include the Restricted Drivers Manager (known simply as "Hardware Drivers" in later releases). This handy app will identify hardware, such as AMD and NVIDIA GPUs, that can take advantage of proprietary drivers, and will offer to install them via Synaptic (APT) with just a click of the mouse. No CLI, no headaches.
Still, a thorough review, and generally well-researched. I'm looking forward to the 9.04 follow-up.
Since you mentioned hardware HD decoding, I recommend taking a look at smplayer from the testing PPA (https://launchpad.net/~rvm/+archive/testing). Unfortunately vdpau doesn't work with the nvidia blobs in the default Ubuntu repos, but I believe there's a PPA providing vdpau-compatible blobs for anybody not wanting to do CLI installs.
db
VaultDweller - Wednesday, August 26, 2009 - link
[quote]While this comment is not false, it presents a lazy approach to comparison; it's a one-sided contest, and Linux, pitted against Windows on home turf, doesn't stand much of a chance.[/quote]
This isn't Linux pitted against Windows on home turf, it's Linux pitted against Windows in the real world.
clarkn0va - Wednesday, August 26, 2009 - link
Well, no doubt SMB is the dominant method of sharing files for consumers in general. Obviously comparing Linux to Windows makes sense in a world where Windows is the incumbent, but it's not the whole story.
I hope Part 2 will address some of the objective benefits of Ubuntu, and not fall into the trap of "worse because it's not the same as Windows".
VaultDweller - Wednesday, August 26, 2009 - link
I agree in principle, but there has to be a distinction between "Worse because it's not compatible with Windows," "Worse because it's not as easy as Windows," and "Worse because it's not the same as Windows." Die-hard *nix advocates tend to dismiss the first two as if they were the third, and this tends to undermine their argument.
Also, in some cases "Worse because it's not the same as Windows" can be a valid point, because the public has been trained to the point that the Windows way is the "intuitive" way. Of course, this isn't truly intuitive, as people who learned Linux first would find Linux methodologies more intuitive - but that's largely a moot point, as that's not the reality we live in today. You could say the same thing about the color red - in the western world, when we see red we can intuitively guess that it means Stop, or Warning, or Error, etc. The fact that this is not an understanding we're born with but rather a socially acquired intuition does not mean it would be any easier to suddenly change the color of traffic lights and expect people to adjust without problems.
Ryan Smith - Wednesday, August 26, 2009 - link
All of the NAS gear I can get my hands on is either SMB only, or is a Time Capsule which is SMB + AFP. I don't have anything that does NFS, which isn't so much a comment on testing (I could always set up another box) as it is usefulness. NFS just isn't common on consumer gear; SMB is a more important metric if you're looking at file transfer performance, because that's what most people are going to be working with. This doesn't preclude doing NFS at a later time though.
And the Restricted Drivers Manager is limited to the drivers in the Hardy repository, which means they're a year+ out of date.
amrs - Wednesday, September 30, 2009 - link
Interestingly, if one checks the SmallNetBuilder NAS charts, it looks like out of 87 NAS devices, 49 have NFS. 56% in other words. And you say NFS isn't common? Really now? Seems a little biased to me.
ekul - Wednesday, August 26, 2009 - link
While a lot of your issues have complicated solutions or lengthy technical backstories, I can solve your complaint of smb shares mounted in nautilus not being useful in non-gtk applications with one simple command (or, as you seem to hate commands, the GUI can do it too).
theory: make a symlink to the directory nautilus mounts to so it can be easily accessed. Symlinks to directories or files are transparently (to users and applications) identical to the location they refer to. Windows doesn't have symlinks (only useless shortcuts), so it isn't surprising you weren't aware of this option.
howto: gvfs uses the directory /home/$USER/.gvfs as a mount point so link to it:
ln -s ~/.gvfs ~/linkname    # creates a symlink in your home dir pointing at the gvfs mount root
howto gui: in nautilus go to your home folder then choose view -> show hidden files. Right click on .gvfs and choose make link. Then you can rename the link to whatever you want and hide hidden files again.
hint: symlinks are your best friend. My home dir is littered with links to places on the filesystem I visit a lot to avoid a lot of clicking/typing
Ryan Smith - Wednesday, August 26, 2009 - link
I suddenly feel very humiliated... The symlink is a very elegant solution; I'm embarrassed I didn't think of that myself. It's a bit of a lousy solution in that there even needs to be a solution, but as far as things go that's a very insightful suggestion.
LittleMic - Wednesday, August 26, 2009 - link
http://en.wikipedia.org/wiki/NTFS_symbolic_link
Well, Windows 2000 has had symbolic links for a long time :-p (only for directories until Vista though)
ekul - Wednesday, August 26, 2009 - link
ntfs has symlinks but the windows shell can't create or manipulate them. Pretty pointless. MS can (and does) use them in vista/7 but you can't make your own.
Eeqmcsq - Wednesday, August 26, 2009 - link
"hint: symlinks are your best friend. My home dir is littered with links to places on the filesystem I visit a lot to avoid a lot of clicking/typing"I use Gnome's bookmarks for that. Those bookmarks even include SMB shares on my other computers.
ekul - Wednesday, August 26, 2009 - link
gnome bookmarks are very handy; I just find symlinks to be more flexible since they work regardless of gnome vs kde, gtk vs qt, and gui vs cli. Even wine can take advantage of them.
jigglywiggly - Wednesday, August 26, 2009 - link
one more thing you should have covered is battery life on laptops. Linux in general is pretty awful at managing battery life. I get about 4 hrs of web browsing on Vista on my Vostro 1310 (not using 7), but only about 2 1/2 with Ubuntu. It's a huge difference, but oh well.
Ryan Smith - Wednesday, August 26, 2009 - link
Laptops are out of our domain; that would be Jarred. If this two-part series is successful, I'll see what I can do about talking him into putting some Ubuntu (or any Linux for that matter) battery benchmarks in. But I'm told a complete workup takes a while.
strikeback03 - Wednesday, August 26, 2009 - link
On my Thinkpad T43, battery life is essentially equal between XP and Ubuntu. Ubuntu may even be slightly better, though I have never bothered with a formal test to put real numbers on both. Have you looked at whether the processor is throttling down properly or not while in Ubuntu?
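A quick way to check that from a terminal; the sysfs paths below assume the cpufreq driver is loaded, and cpufreq-info needs the cpufrequtils package installed:

cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor    # e.g. "ondemand"
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq    # current frequency in kHz
cpufreq-info                                                 # friendlier summary, if installed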
"Now we have yet to touch on hardware accelerated playback, which is something we’re going to hold off on until we take a look at Ubuntu 9.04. Linux does not have a common media framework like Windows and Mac OS X have DirectShow/DXVA and QuickTime respectively. Rather the desktop environment that Ubuntu is based off of (GNOME) includes a lesser framework called GStreamer, which is closer to a basic collection of codecs and an interface to them. As such hardware accelerated playback is not as easy to do under Ubuntu as it is under Windows and Mac OS X. We’ll take look at the APIs and the software for this in our look at Ubuntu 9.04."Well, not exactly. There is this: http://en.wikipedia.org/wiki/VaAPI">http://en.wikipedia.org/wiki/VaAPI, which is not exactly widespread yet. nVidia's VDPAU, which provides hardware acceleration for h.264 and if you have the latest version of PureVideo in your card, it does VC-1 as well. It can work with this or alone as well.
Also, while it is wacky that bin (binaries) are in one spot, lib (libraries) are in another, and anything you install yourself goes in /usr/local, it does keep all related files in one spot. Keeping all your libraries registered as packages also makes it easy to repair.
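A small illustration of that repairability; the library path and package name here are only examples of the general pattern:

dpkg -S /usr/lib/libgtk-x11-2.0.so.0           # ask which package owns a given file
sudo apt-get install --reinstall libgtk2.0-0   # restore that package's files from the repository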
Also, one-click installs are possible on openSUSE too, though it does involve a small GUI process and adding a repository as well. But doesn't any program require this?
Also, I believe your problem with SMB shares is something I run into as well, but only on GNOME. On KDE, browsing shares is quite normal. Since I never bother mounting the share, I can't directly access it without KDE caching the file locally.
Isn't /home/$Your Name$ intuitive as to where your stuff would be? That is very nice, as I can keep my stuff separate from the OS, thus I can reformat the OS partition without having to touch my data. Imagine reinstalling Windows and finding all your apps working exactly as before with no work. Can OSX do that (not rhetorical)?
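That separation is usually achieved by giving /home its own partition; a sketch of the relevant /etc/fstab line, assuming the partition is /dev/sda3 formatted as ext3:

# device     mount point   fs type   options    dump  pass
/dev/sda3    /home         ext3      defaults   0     2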
Ryan Smith - Wednesday, August 26, 2009 - link
VDPAU is something we'll specifically be covering in Part 2; in fact it's what I'm working on at this moment.
GeorgeH - Wednesday, August 26, 2009 - link
I'm sure the comment section will quickly be swamped with quibbles, so I just wanted to say that I found this article to be very informative, accurate (WRT my own Ubuntu experiences), and thorough. Kudos - now it's time to ask for a raise. :)
jigglywiggly - Wednesday, August 26, 2009 - link
I see you shared a lot of the same problems I had with Ubuntu when I first got it. Yeah, it's harder, I won't lie, and it's a pain in the ass when it doesn't work. But when it works, you love it, and you feel like more of a man. I use it for my web server; it runs very nicely.
Ubuntu sometimes makes you want to shoot it with an M249, but at other times you feel superior to other users. But that's because you are using the terminal all the time and are actually smart; Mac users just need to be shot in the face for their ignorance.
smitty3268 - Wednesday, August 26, 2009 - link
I agreed with a lot of what was in this review.
I think a lot of your problems would have gone away by using the newer versions, though, specifically with the package manager. There's much less need for finding things outside of it when you're using the new versions. Even video drivers can usually be put off for 6 months or so if you're not too cutting edge. Leaving the package manager behind is a pain, though, as you found out. You tried to explain that the LTS version was more comparable to Windows/OSX, but in truth very, very few desktop users continue to use it. In fact, I'm not aware of any. It's really only used by companies for work machines that don't want to make large changes every 6 months like home users can.
MS TT fonts: good luck trying to get those by default; they're owned by Microsoft, who is in no mood to simply give them away to their competitors. Installing them is like installing the patent-encumbered video codecs - at your own risk, which is minimal as long as you aren't trying to make money off of it.
It should be mentioned that Red Hat put down some money a while ago to buy some nice new fonts, called Liberation, that are much nicer than the default serif ones this old Ubuntu version was using. They're still different than the MS ones, though, which is going to cause some people problems. Also, the font anti-aliasing differences are again due to patents owned by other companies, but there's good news there: they're supposed to expire later this year, so better font rendering in Linux should be coming soon! You can already get it working manually, but the distros make it hard to set up.
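For anyone who wants to try both font sets on a Hardy-era system, the package names below are the ones used at the time and changed in later releases, so treat them as assumptions:

sudo apt-get install ttf-liberation    # Red Hat's metric-compatible Liberation fonts
sudo apt-get install msttcorefonts     # downloader/installer for the Microsoft core fonts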
You mentioned you chose Ubuntu because it was supposed to be user-friendly, which I regard as one of the more puzzling wide-spread myths that go around. Sure, it's a lot simpler than Debian, or some other choices, but it is definitely NOT the distro to choose if you're looking to avoid the CLI, as you found out.
On that note, I would HIGHLY encourage you to eventually go back and do another review (part 3?) that uses a KDE based distro. Maybe try out OpenSUSE next fall, for example. Although KDE is going through a bit of a transition now, it's definitely where all the more interesting stuff is going on. As you said, Gnome is a lot like a boring Windows XP environment, which is both a positive and a negative. KDE is quite different, for better or worse, and is worth a look I think. For one thing, that smb://COMPUTERNAME address will work out of the box in KDE apps. If you do try KDE, I highly recommend another distro besides (K)Ubuntu, though, because they simply don't put any resources into their KDE implementation and it shows.
leexgx - Wednesday, August 26, 2009 - link
Ubuntu KDE has more options to play with that are missing in gnome (but gnome's task monitor is far better than KDE's; it's the Linux version of the Windows XP task manager, only the process page, but very detailed).
Ubuntu should be easy to use, but it lacks an easy install for drivers and still does not offer a fail-safe VGA mode: if X fails to start, you're stuck with a command line. It should try a second time in safe VGA mode, but it does not.
Badkarma - Wednesday, August 26, 2009 - link
Thought I'd mention that the Linux-specific site Phoronix has an "Open Letter to Tech Review sites" (http://www.phoronix.com/scan.php?page=article&...).
You mentioned Linux on netbooks, and I thought I would mention that I found Moblin (www.moblin.org) from Intel very impressive. It's still in beta and a little rough around the edges, but it boots faster than XP resumes from hibernate, around 15 sec from the BIOS screen, and the UI is designed around small screens. After using it for a few hours and then installing Windows 7, I immediately missed how well Moblin was optimized for the low-res small screen. I had to install W7 because the ath9k kernel module drivers are unstable in Moblin; if not for this I would probably keep it as the primary OS on my netbook.
colonel - Wednesday, August 26, 2009 - link
I've been using Ubuntu 9.0 for a year with my Dell notebook and I love it. I don't see limitations in my work; the only problem is my company doesn't allow it on the network, but it is my OS in the house.
Eeqmcsq - Wednesday, August 26, 2009 - link
I'm still reading it, but on my xubuntu 8.04, my firefox is located in /usr/bin/firefox. Most apps are under /usr/bin.
Also, the directory structure is definitely VERY different from Windows. One main difference is that everything that belongs to the user is supposed to be under /home. Everything that belongs to the "system" is everywhere else. I think the theory is that the user stuff is "sandboxed" in /home, so he doesn't mess things up in the system for everyone else.
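Two quick commands make this easy to see for yourself; firefox here is just an example binary:

which firefox            # prints the path the shell would run, typically /usr/bin/firefox
ls -l /usr/bin/firefox   # shows whether it is a regular file or a symlink into /usr/lib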
Penti - Tuesday, September 1, 2009 - link
You have the same in Windows under %SystemDrive%\Documents and Settings\user, although many settings are stored in the registry (which can be said to be the equivalent of /etc). It's there, however, that programs like Firefox save their settings and where you have your My Documents and temp files.
* %SystemDrive% is a variable and a substitute for the drive letter on which Windows is installed, which can be something other than C:.
fepple - Wednesday, August 26, 2009 - link
On the normal Ubuntu install, the /usr/bin/firefox is actually a symlink that points to the firefox install in /usr/lib :)
ioannis - Wednesday, August 26, 2009 - link
the question is, who cares where firefox or any other application's binary is installed? It's not as if you'll go searching for it to run it. They are on your execution 'PATH', which means you can just press ctrl+F2 and type their name, or use a terminal, or access them from the application menu.
My favourite way is to use something like gnome-do (or krunner in Kubuntu)
PS: yes, all package-manager-provided applications have their binaries in /usr/bin, and most user-built ones go in /usr/local/bin by default, which is also in your $PATH.
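Both claims are easy to verify from any terminal; the output shown in the comment is only a typical default, not a guaranteed value:

echo $PATH        # e.g. /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games
type -a firefox   # lists every match for "firefox" found along that PATH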
fepple - Wednesday, August 26, 2009 - link
As a developer that has to deal with custom paths or managing symlinks in default paths, I can say I do care where binaries are located ;)
ioannis - Wednesday, August 26, 2009 - link
...sorry, I think it's Alt+F2 by default. I'm talking about the 'Run Command' dialog.
Eeqmcsq - Wednesday, August 26, 2009 - link
Oh, yes you're right. I stand corrected.
sprockkets - Wednesday, August 26, 2009 - link
Ubuntu doesn't ship with the firewall on eh? Weird. SuSE's is on, and that has been the default for quite some time. GUI management of it is easy too.clarkn0va - Wednesday, August 26, 2009 - link
For incoming connections I don't quite grasp what good a firewall will do on a system with no internet-facing services. With no open ports you stand little to gain from adding a firewall, and any internet-facing service you might add, well, you don't want to firewall that anyway.I can see two theoretically plausible arguments for a host-based firewall, but even these don't really stand up in real-world use: 1) a machine that has open ports out of the box (I'm looking at you, Windows), and 2) for the folks who want to police outgoing connections.
In the case of the former, why would we open ports and then block them with a firewall, right out of the box? This makes as much sense to me as MS marketing their own antivirus. Third-party firewalls were rightfully introduced to remedy the silly situation of computers listening on networks where they shouldn't be, but the idea of MS producing a host-based firewall instead of just cleaning up their services profile defies common sense.
In the case of outbound firewalling, I've yet to meet a home user that understood his/her outbound firewall and managed it half-way effectively. Good in theory, usually worse than useless in practice.
db
VaultDweller - Wednesday, August 26, 2009 - link
Just because a port/service is open, doesn't mean you want it open to the whole world.
Examples:
SMB
NFS
VNC
RDP
SSH
Web (intranet sites, for example)
And the list could go on... and on and on and on, really.
Also, it's erroneous to assume that only 1st party software will want to open ports.
And that is to say nothing of the possibility of ports being unintentionally opened by rogue software, poorly documented software, naughty admins, or clumsy admins.
Host-based firewalls help with all of these situations.
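On the Ubuntu side, 8.04 ships a simple front end, ufw, for exactly this kind of per-source rule; a minimal sketch, with 192.168.1.0/24 standing in for whatever the trusted LAN actually is:

sudo ufw enable                                                # default policy denies incoming connections
sudo ufw allow from 192.168.1.0/24 to any port 445 proto tcp   # allow SMB, but only from the local subnet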
clarkn0va - Wednesday, August 26, 2009 - link
Windows firewall doesn't filter by source. In other words, if you want SMB or any other service open to some peers and not others, Windows firewall can't help you; you'll need a more sophisticated product or a hardware firewall for that.
I'm not saying there's no case for host-based firewalls, I'm just saying it's pointless for most users out of the box, where Ubuntu doesn't need it and Windows should be looking at fixing the problem of unneeded services running, rather than just bolting on another fix.
VaultDweller - Wednesday, August 26, 2009 - link
"I can see two theoretically plausible arguments for a host-based firewall, but even these don't really stand up in real-world use"That sounds to me like a claim that there is little or no case for a host-based firewall; at least, that's how I interpreted it.
"Windows firewall doesn't filter by source. In other words, if you want SMB or any other service open to some peers and not others, Windows firewall can't help you"
That is incorrect, and you should check your facts before making such statements. The Windows Firewall can filter by source. Any firewall exception that is created can be made to apply to all sources, to the local subnet only, or to a custom list of IPs and subnets.
The firewall in Vista and Windows 7 goes a step further, as it is location aware. Different ports and services are opened depending on the network you're plugged into, as exemplified by the default behavior of treating all new networks as "Public" (unknown and untrusted) until instructed otherwise.
clarkn0va - Wednesday, August 26, 2009 - link
"The Windows Firewall can filter by source. Any firewall exception that is created can be made to apply to all sources, to the local subnet only, or to a custom list of IPs and subnets. "In that case I retract my assertion that an out-of-the-box firewall makes no sense in the case of Windows.
As for Ubuntu, or any other desktop OS having no open ports by default, I still see including an enabled firewall by default as superfluous. Meanwhile, firewall GUIs exist for those wishing to add them.
Paazel - Wednesday, August 26, 2009 - link
...not enough pictures. Admittedly my interest additionally waned when I read that the newest Ubuntu isn't being reviewed.
philosofool - Wednesday, August 26, 2009 - link
I'm not done with this article, which I'm loving. However, there's a grammatical/spelling quibble that's driving me nuts: "nevertheless" is one world.
sheh - Thursday, August 27, 2009 - link
Also, it's "into", not "in to".Anyway, an interesting read. Thanks.
ssj4Gogeta - Thursday, August 27, 2009 - link
nevertheless is one "world"? :P
Ryan Smith - Wednesday, August 26, 2009 - link
Noted and fixed. Thank you.
ClownPuncher - Wednesday, August 26, 2009 - link
Web browsing page - Ariel should read Arial when talking about fonts?
pcfxer - Wednesday, August 26, 2009 - link
Ease of use of Ubuntu is superseded by PC-BSD and its PBI packages. PC-BSD also takes MUCH less time to install than Ubuntu.
Souka - Wednesday, August 26, 2009 - link
I use PC-DOS 1.0a. Runs very fast on my Core i7 setup, and I haven't even overclocked it yet.
ap90033 - Friday, August 28, 2009 - link
You probably can run more games in that than linux LOL...
Penti - Tuesday, September 1, 2009 - link
You can run dosbox or dosemu in Linux just like in Windows...
superfrie2 - Wednesday, August 26, 2009 - link
I'm not quite sure I agree with your criticism of .iso mounting in linux. The mount -o loop command is very easy to use after you've done it a couple of times (a short example follows below). In fact, I think it is far better than using Daemon Tools in Windows because you don't have to worry about unticking all the junk-ware it tries to get you to install.
Also, I'm not sure I agree with your pseudo-dislike for some forms of CLI. The CLI is far more powerful than what its GUI-based copies try to accomplish. As a matter of fact, the more I learn about Linux's CLI, the less I use the GUI. I find myself only using the GUI for web browsing on a regular basis.
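A minimal example of the loop mount in question; image.iso and the mount point are placeholders:

sudo mkdir -p /mnt/iso
sudo mount -o loop ~/image.iso /mnt/iso   # expose the ISO contents read-only under /mnt/iso
sudo umount /mnt/iso                      # detach it when done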
However, when looking at the linux GUI, compiz fusion is simply amazing. When I have a shitload of stuff open, compiz allows me to organize all of my windows and access them very efficiently. In fact, when I use windows for games, I feel handicapped.
The most interesting part of your testing was that Windows applications running under Wine outperformed Linux native applications. I look forward to hearing more about that aspect, like you mentioned.
zerobug - Monday, February 1, 2010 - link
Regarding benchmarks and Linux-focused hardware roundups, one thing worth considering is that while Microsoft places strong resources on O/S development to create features that will require end users to get the latest and greatest powerful hardware, Linux places its efforts so that the end user will still be able to use their old hardware and get the best user experience while running the latest and greatest software. So the benchmarks could compare the user experience when running popular software on Microsoft and Linux O/S's, with differently powerful machines.
For this, you could pick some popular open source and proprietary (or their free equivalents) applications that can run on both Linux and W7, and compare the price, time and power consumption for retrieving, saving, processing, compiling, encrypting, decrypting, compacting, extracting, encoding, decoding, backup, restore, nº of frames, etc., with machines in a range of different CPU and memory capacities.
abnderby - Thursday, September 3, 2009 - link
Let me say this: I am a Senior Software QA Engineer; I have been testing Windows, Windows apps, DBs and web sites for over 10 years now. I am what you could consider a Windows guru of sorts.
I have off and on gone and tried Linux, from Red Hat 5 and 6 to Ubuntu, SUSE, Fedora, etc... Linux is not and has not been ready for mainstream users. Sure, for simple email, word docs and web browsing it is ok.
But for many of the things that I, and many advanced Windows users, want to do, the author and many commenters are right. Linux people need to get out of their little shell and wake up.
Linux has such great potential to be a true contender to Windows and OSX. But it lacks simple usability. Out of the box it can come nowhere close to the MS or Apple offerings. The out-of-the-box experience is truly horrible.
Hardware drivers? Good luck; I run RAID cards that have no support. Forget the newest graphics and sound cards. Connecting to shares, just as the author mentioned, is a hassle of a workaround.
Again, as stated elsewhere, Linux needs someone who programs and/or scripts to get things done right. I have neither the time nor the patience for such. I use the command line when needed. I would rather have 2 or 3 clicks and be done than have to remember every CLI command for every thing I need to do.
Time is money; time is not a commodity. Linux wastes too much time.
It is getting better with each distro, true. But it has been 11 years since Red Hat 5, and Linux is not a whole lot better than it was then.
What is needed, if Linux really wants to make a stand in the desktop space, is a unified pull-together of all distros. Sit down and truly plan out the desktop. Put together a solid platform that out of the box can really put the hurt on MS or Apple.
Look what Apple did with OSX! And how many developers are working on it? How many developers are working on Linux across all distros? OSX is a jewel; in 7 years it has matured much farther than any *nix distro. And it has a following that cannot yet be challenged by any distro available.
Why is it that when Win2k came out Linux was claiming to be superior, and yet after 10 years of development it is hardly comparable to XP, let alone Vista/Win 7 or OSX?
You guys really need to wake up and smell the coffee!
Penti - Monday, September 7, 2009 - link
Of course it's not ready for consumer desktops; there are no serious distributions for that.
It means no DVD player OOB, no proprietary codecs, no video editing software, no proprietary drivers which work magically. Nor are SLED and RHEL Desktop ready for normal users; they're targeted at Linux admins to set up the environment. Community distributions won't be as easy for those users to set up. Community distros will also always lack the above-mentioned stuff; it's simply not legal for them to offer it OOB. OS X is actually older than Linux and ran on x86 before Apple bought Jobs' NeXT company. It's also supported by an OEM (OEM = themselves), which no Linux dist is. It also uses many GNU technologies like GCC, X11 (optional but included on disc), the bash shell and so on, and of course SAMBA for SMB/CIFS; on the server edition they use a modified OpenLDAP server, Dovecot and Postfix for mail, Apache, PHP, Perl, MySQL etc. Stuff that's developed on Linux and has matured thanks to it.
There are a lot of problems with having just community-supported stuff, but that doesn't mean it's useless or sucks. Sure, the kernel developers aren't really helping to get drivers in there by locking out closed-source stuff, but those drivers end up useless if they are proprietary and not updated anyway. For servers, just buy RHEL- or SLES-certified stuff and you get all the hardware support needed. But on the other hand, you wouldn't be much more successful running 7-year-old video drivers in Windows either.
Community distros definitely don't need to cease existing for a commercial one to be created. But there will never be one Linux, and that's really the beauty of it, not the curse. It wasn't meant to be something rivaling Windows, and the kernel developers have no desire to create a distro. That's why we can see Linux in stuff like Android and Maemo, and in everything from home routers to mainframes and supercomputers. For a commercial entity, targeting that many devices wouldn't be possible, not with the same basic code and libraries. There are definitely some top-notch products and solutions based on Linux and GNU. But Linux doesn't want anything, as it's not an entity.
And it's really up to GNOME and KDE to create the desktop environment. It's not the distros that shape them and write all the libraries that software developers use to create their software. As there is no major consumer desktop distro maker, there is also no one that can really steer them by sponsoring work and holding discussions, not towards a unified desktop environment for normal non-tech users anyway. Also, GNOME and KDE have no desire to create an exclusive platform around their software.
OS X is an innovative 20-year-old OS (counting from its commercial release) and is actually based on work from before then (BSD code). The OS X UI is really 20 years in the making and builds heavily on the NeXT/OpenStep framework. On other Unixes there hasn't been any such heritage to build on; X was a total mess on commercial Unixes and I would actually say it's a lot better and more streamlined now. There's just Xorg now; sure, there are a lot of window managers, but only two major environments, so it's still better than when all the vendors had their own and couldn't make up their minds on which direction to go and standardize on. In the middle of the 90's there were at least 4 major Unix vendors that all had their own workstations.
fazer150 - Friday, September 4, 2009 - link
which Linux distro have you tried? did you try PCLinuxOS, which is at least as usable as Windows XP or 2003?
nilepez - Sunday, August 30, 2009 - link
Most end users are not comfortable with the command line. Linux, even Ubuntu, is still not ready for the masses. This shouldn't be confused with the quality of the OS; it's mostly a GUI issue. I've also had some issues with installers failing. Some were solved from an xterm and others just didn't work.
It wasn't a big deal in most cases, because there's generally another program that can get the job done, but for the typical home user, it's a deal killer. Nevertheless, I must give credit where credit is due, and Ubuntu has made huge strides in the right direction. The UI isn't close to Windows 7 and I suspect it's not close to OS X either, but Canonical is moving in the right direction.
Etern205 - Thursday, August 27, 2009 - link
See, this is the problem with some Linux users: you guys are somewhat always closed up in a nutshell. What you may think is easy does not mean the rest of the world will agree with you. In this day and age, people want to get things done quickly and use the least amount of time possible. For Mac OS X and Windows, getting a simple task done takes like 3 simple clicks; for Ubuntu, performing the same task requires the user to do an extensive amount of research just to complete it.
I'm glad this article was written by an author who has not headed into Linux territory before, and it shows the true side of Linux from the perspective of a new user.
Just because you like to do ramen coding and so forth does not mean the others will. If Linux wants to become mainstream, then they really need to stand in the shoes of Joe or Jane.
forkd - Saturday, October 31, 2009 - link
I use mac, windows and linux and I must disagree with your assessment of "this is the problem with some linux users".
This article, and this site for that matter, comes from the perspective of a windows (and some mac) user looking at linux. More specifically Ubuntu. From this point of view of course Linux is difficult. A person who is linux focused thinks windows is difficult at first too and is likely to criticize. If you take the time to learn something instead of just criticizing something because it is different you may be a lot happier.
fepple - Thursday, August 27, 2009 - link
Check out all the usability studies the Gnome Project does, then come back and make some more generalizations :)
SoCalBoomer - Thursday, August 27, 2009 - link
Again - those are done by Linux people. His points are right on... someone a while ago did a "Mom" test, which is closer to what is needed, not people who know computers doing studies on usability.
fepple - Friday, August 28, 2009 - link
That is exactly how the usability tests are performed. Developer asks Mom "can you change the background" then records what they do.
fepple - Friday, August 28, 2009 - link
So I tried to find some links about this relating to Gnome, but only got some pretty old ones. There are other methods they are using as well, like the 100 papercuts idea. Honestly, have a look around and you'll see how much of a focus it is, particularly with Ubuntu.
ap90033 - Friday, August 28, 2009 - link
Face it, Linux is still back in the Windows 2000 days. Try getting SLI working, 1080P working right, games working. IT IS way too much trouble and damn near impossible for regular users. In Windows or Mac it's next to no work and very little issue. Wake up guys, Linux has potential, but that's it. BECAUSE those who advocate it spend so much energy defending what is "easy" to them when they ought to use that energy making it ACTUALLY easy and USER USER USER USER (DO YOU UNDERSTAND THIS WORD?) FRIENDLY... NOT PROGRAMMER FRIENDLY...
newend - Wednesday, September 2, 2009 - link
All of the things you mention are probably not that easy for grandma to do either. People thrive on saying it's so hard to do things in Linux, but I think it's generally not intuitive to use most computer systems. Imagine how difficult any system would be if you had no exposure to computers. A few years ago a friend of mine wanted me to install some software on her Mac. I had no idea how to do it. I've been using computers since I was 5 years old, but had to google for information on installing software.
I actually think that Yum/Apt repos make it significantly easier to install software. The other day I wanted an application to take a photo with my webcam. I simply did a search, "yum search webcam", looked at the descriptions of the included software and found Cheese, which did exactly what I wanted.
When you know exactly what you want, and it's not available in the repos you use, I agree it is more difficult to get it installed. Still with both Red Hat/Fedora and Debian/Ubuntu, you can do an install by downloading a package file. This doesn't get you the benefit of automatic updates, but it's just as easy to install as an MSI file.
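The workflow described above, spelled out; "cheese" is the package the commenter happened to find, so treat it as an example rather than a given:

yum search webcam        # keyword search across the enabled repositories
sudo yum install cheese  # install the chosen hit, dependencies resolved automatically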
fepple - Friday, August 28, 2009 - link
Well, maybe they would want '1080p', but I'm not sure how that could be a problem unless you have some strange hardware that requires a specific driver... like another OS sometimes needs you to go to a manufacturer's website ;)
Penti - Tuesday, September 1, 2009 - link
Installing nVidia drivers and XBMC or mplayer isn't that hard.
But keep in mind there are only homebrew codecs on Linux, which OEMs like Dell can never ship with their computers, and which have limited support for proprietary formats such as BD. It's the same codecs as ffdshow, or as in XBMC or VLC on Windows. What's lacking is a PowerDVD with BD support. w32codecs is also available for gstreamer, giving alternative support for WMV and such. Installing ubuntu-restricted-extras is essentially the only thing you need for it to work in Totem if you don't need/have VDPAU support. XBMC is definitely a decent platform to play back warez. You need to rip Blu-rays to be able to play them back at all though. But an Ion is definitely powerful enough for 1080p h264 under Linux. Because all that software contains unlicensed patented codecs, Canonical doesn't officially support any of it, so it won't work on Ubuntu OOB. Codecs aren't free.
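The package mentioned is installed the same way as anything else; a one-line sketch:

sudo apt-get install ubuntu-restricted-extras   # pulls in the commonly wanted codecs, fonts and Flash plugin in one go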
fepple - Friday, August 28, 2009 - link
Thing is, 'regular users' don't care about SLI, 1080P and Windows games... "where is the browser/word processor/email?" :)
CastleFox - Friday, April 9, 2010 - link
Great review. Thank you for reviewing 8.04 LTS. Please review 10.04 when it comes out. I am interested to see if the software center has changed the author's opinions.
cosplay - Thursday, July 8, 2010 - link
Your ignorance and stupidity is showing here. No engineering software for Linux? Hello? Matlab is available, Simulink is available, Labview the same. Xilinx and Altera have supported Linux for a long time, and so do the smaller FPGA houses like Lattice and Actel. Mentor Graphics too. Orcad is the only one you mentioned that isn't available on Linux, but Cadence does support Linux with their Allegro product, and so does Mentor Graphics with PADS and Board Station and Expedition.
nar321 - Thursday, September 23, 2010 - link
Love it; more importantly, it installs on drives that BSOD when trying to install WinXP.
kevith - Monday, December 6, 2010 - link
to see the Danish - as in inhabitants of my country, Denmark, and not as in pastry - gratis = free here at Anand's :-)
ultraseksy - Thursday, March 8, 2012 - link
best blog ever. I have read all 17 pages of comments... a lot of Linux lovers out there... and they all purposely ignore a few important things that make Windows successful, which in turn makes most Linux distributions marketing failures. I have used Linux on my netbook and my PS3, and I absolutely...
juweliefol - Tuesday, April 24, 2012 - link
8.04 is a long LTS release; there are still lots of people using it.