
Archive for the ‘Blast from the Past’ Category

Blast from the Past: Phoenix Rising

December 14th, 2016

This post was originally published on December 12, 2009. The original can be found here.


As we prepare to bring The Linux Experiment to a close over the coming weeks, I find that this has been a time of (mostly solemn) reflection for myself and others. At the very least, it’s been an interesting experience with various flavours of Linux and what it has to offer. At its peak, it’s been a roller-coaster of controversial posts (my bad), positive experiences, and the urge to shatter our screens into pieces.

Let me share with you some of the things I’ve personally taken away from this experiment over the last three-and-a-half months.

Fedora 12 is on the bleeding edge of Linux development

This has been a point of discussion on both of our podcasts so far, and a particular sore spot for both Tyler and myself. It's gotten to the point where I'm sort of… afraid to perform updates to my system out of fear of bricking it entirely. While this is admittedly something that could happen under any operating system on any platform, it's never been as bad for me as it has been under Fedora 12.

As an example, the last *six* kernel updates to my machine, across Fedora 11 and 12 combined, have completely broken graphics support for my adapter (a GeForce 8600M GS). Yes, I know that the Fedora development team is not responsible for ensuring that my graphics card works with their operating system – but this is not something the average user should have to worry about. Tyler has also had this issue, and I think he would tend to agree with me.

Linux is fun, too

There have been so many frustrating moments over the last four months that I have been tempted to just format everything and go back to my native Windows 7 (previously the release candidate, now the RTM). Through all of this, though, Fedora – and Linux in general – has never stopped interesting me.

This could just be due to the fact that I've been learning so much – I can definitely do a lot more under a Linux environment now than I ever could before, and am reasonably pleased with this – but I've never sat down at my laptop and been bored while playing around with getting stuff to work. In addition, with some software (such as Wine or CrossOver) I've been able to get a number of Windows games working. Linux can play, too!

Customizing my UI has also been a very nice experience. It looks roughly like Sasha's now – no bottom panel, GNOME Do with Docky, and Compiz effects… it's quite pretty.

There’s always another way

If there’s one thing I’ve chosen to take away from this experiment it’s that there is ALWAYS some kind of alternative to any of my problems, or anything I can do under another platform or operating system. Cisco VPN client won’t install under Wine, nor will the Linux client version? BAM, say hello to vpnc.

Need a comprehensive messaging platform with support for multiple services? Welcome Pidgin into the ring.

No, I still can’t do everything I could do in Windows… but I’m sure, given enough time, I could make Fedora 12 an extremely viable alternative to Windows 7 for me.

The long and short of it

There’s a reason I’ve chosen my clever and rather cliche title for this post. According to lore, a phoenix is a bird that would rise up from its own ashes in a rebirth cycle after igniting its nest at the end of a life cycle. So is the case for Fedora 12 and my experience with Linux.

At this point, I could not see myself continuing my tenure with the Fedora operating system. For a Linux user with my relatively low level of experience, it is too advanced and too likely to brick itself with a round of updates to be viable for me. Perhaps after quite a bit more experience with Linux on the whole, I could revisit it – but not for a good long while. This is not to say it’s unstable – it’s been rock solid, never crashing once – but it’s just not for me.

To that end, Fedora 12 rests after a long and interest-filled tenure with me. Rising from the ashes is a new user in the world of Linux – me. I can say with confidence that I will be experimenting with Linux distributions in the future – maybe dipping my feet in the somewhat familiar waters of Ubuntu once more before wading into the deep end.

Watch out, Linux community… here I come.

 

Blast from the Past: Coming to Grips with Reality

December 12th, 2016

This post was originally published on December 8, 2009. The original can be found here.


The following is a cautionary tale about putting more trust in the software installed on your system than in your own knowledge.

Recently, while preparing for a big presentation that relied on me running a Java applet in Iceweasel, I discovered that I needed to install an additional package to make it work. This being nothing out of the ordinary, I opened up a terminal, and used apt-cache search to locate the package in question. Upon doing so, my system notified me that I had well over 50 ‘unnecessary’ packages installed. It recommended that I take care of the issue with the apt-get autoremove command.

Bad idea.

On restart, I found that my system was virtually destroyed. It seemed to start X11, but refused to give me either a terminal or a gdm login prompt. After booting into Debian’s rescue mode and messing about in the terminal for some time trying to fix a few circular dependencies and get my system back, I decided that it wasn’t worth my time, backed up my files with an Ubuntu live disk, and reinstalled from a netinst nightly build disk of the testing repositories. (Whew, that was a long sentence)

Unfortunately, just as soon as I rebooted from the install, I found that my system lacked a graphical display manager, and that I could only log in to my terminal, even though I had explicitly told the installer to add GNOME to my system. I headed over to #debian for some help, and found out that the testing repositories were broken, and that my system lacked gdm for some unknown reason. After following their instructions to work around the problem, I got my desktop back, and once more have a fully functioning system.

The moral of the story is a hard one for me to swallow. You see, I have come to the revelation that I don't know what I'm doing. Over the course of the last three months, I have learned an awful lot about running and maintaining a Linux system, but I still lack the ability to fix even the simplest of problems without running for help. Sure, I can install and configure a Debian box like nobody's business, having done it about five times since this experiment started; but I still lack the ability to diagnose a catastrophic failure and recover from it without a good dose of help. I have also realized something that, as a software developer, I already knew and should have been paying attention to when I used that fatal autoremove command: when something seems wrong, trust your instincts over your software, because they're usually correct.
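For what it's worth, the habit I'm trying to build now is to make apt tell me what it plans to do before I agree to anything. A dry run costs nothing:

apt-get -s autoremove

The -s (simulate) flag prints the full list of packages that would be removed without actually touching any of them. Had I skimmed that list and noticed how much of my desktop environment was on it, my instincts probably would have kicked in a lot sooner.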

This entire experiment has been a huge learning experience for me. I installed an operating system that I had never used before, and eschewed the user-friendly Ubuntu for Debian, a distribution that adheres strictly to free software ideals and isn’t nearly as easy for beginners to use. That done, after a month of experience, I switched over from the stable version of Debian to the testing repositories, figuring that it would net me some newer software that occasionally worked better (especially in the case of Open Office and Gnome Network Manager), and some experience with running a somewhat less stable system. I certainly got what I wished for.

Overall, I don’t regret a thing, and I intend to keep the testing repositories installed on my laptop. I don’t usually use it for anything but note taking in class, so as long as I back it up regularly, I don’t mind if it breaks on occasion; I enjoy learning new things, and Debian keeps me on my toes. In addition, I think that I’ll install Kubuntu on my desktop machine when this whole thing is over. I like Debian a lot, but I’ve heard good things about Ubuntu and its variants, and feel that I should give them a try now that I’ve had my taste of what a distribution that isn’t written with beginners in mind is like. I have been very impressed by Linux, and have no doubts that it will become a major part of my computing experience, if not replacing Windows entirely – but I recognize that I still have a long way to go before I’ve really accomplished my goals.

As an afterthought: If anybody is familiar with some good tutorials for somebody who has basic knowledge but needs to learn more about what’s going on below the surface of a Linux install, please recommend them to me.

 

Blast from the Past: Top 10 things I have learned since the start of this experiment

December 9th, 2016

This post was originally published on October 2, 2009. The original can be found here.


In a nod to Dave’s classic top ten segment I will now share with you the top 10 things I have learned since starting this experiment one month ago.

10. IRC is not dead

Who knew? I’m joking of course but I had no idea that so many people still actively participated in IRC chats. As for the characters who hang out in these channels… well some are very helpful and some… answer questions like this:

Tyler: Hey everyone. I’m looking for some help with Gnome’s Empathy IM client. I can’t seem to get it to connect to MSN.

Some asshat: Tyler, if I wanted a pidgin clone, I would just use pidgin

It’s this kind of ‘you’re doing it wrong because that’s not how I would do it’ attitude can be very damaging to new Linux users. There is nothing more frustrating than trying to get help and someone throwing BS like that back in your face.

9. Jokes about Linux for nerds can actually be funny

Stolen from Sasha’s post.

Admit it, you laughed too

8. Buy hardware for your Linux install, not the other way around

Believe me, if you know that your hardware is going to be 100% compatible ahead of time you will have a much more enjoyable experience. At the start of this experiment Jon pointed out this useful website. Many similar sites also exist and you should really take advantage of them if you want the optimal Linux experience.

7. When it works, it’s unparalleled

Linux seems faster, more fully featured, and less of a resource hog than a comparable operating system from either Redmond or Cupertino. That is, assuming it's working correctly…

6. Linux seems to fail for random or trivial reasons

If you need proof of this, just take a look back at the last couple of posts on here. There are times when I really think Linux could be used by everyone… and then there are moments when I don't see how anyone outside of the most hardcore computer users could ever even attempt it. A brand new user should not have to know about xorg.conf or how to edit their DNS resolver.

Mixer - buttons unchecked

5. Linux might actually have a better game selection than the Mac!

Obviously there was some jest in there, but Linux really does have some gems for games out there. Best of all, most of them are completely free! Then again, some are free for a reason…

Armagetron

4. A Linux distribution defines a lot of your user experience

This can be especially frustrating when the exact same hardware performs so differently. I know there are a number of technical reasons why this is the case but things seem so utterly inconsistent that a new Linux user paired with the wrong distribution might be easily turned off.

3. Just because it's open source doesn't mean it will support everything

Even though it should, damn it! The best example I have for this happens to be MSN clients. Pidgin is by far my favourite, as it seems to work well and even supports a plethora of useful plugins! However, unlike many other clients, it doesn't support a lot of MSN features such as voice/video chat, reliable file transfers, and those god-awful winks and nudges that have appeared in the most recent version of the official client. Is there really that good of a reason holding the Pidgin developers back from just making use of the other open source libraries that already support these features?

2. I love the terminal

I can’t believe I actually just said that but it’s true. On a Windows machine I would never touch the command line because it is awful. However on Linux I feel empowered by using the terminal. It lets me quickly perform tasks that might take a lot of mouse clicks through a cumbersome UI to otherwise perform.

And the #1 thing I have learned since the start of this experiment? Drum roll please…

1. Linux might actually be ready to replace Windows for me

But I guess in order to find out if that statement ends up being true you’ll have to keep following along 😉

 

Blast from the Past: How is it doing that?

December 7th, 2016

This post was originally published on December 15, 2009. The original can be found here.


Just about everything that I’ve ever read about media playback on Linux has been negative. As I understand the situation, the general consensus of the internet is that Linux should not be relied on to play media of any kind. Further, I know that the other guys have had troubles with video playback in the past.

All of which added up to me being extremely confused when I accidentally discovered that my system takes video playback like a champ. Now, from the outset, you should know that my system is extremely underpowered where high definition video playback is concerned. I'm running Debian testing on a laptop with a 1.73 GHz single-core processor, 758MB of RAM, and an Intel GMA 900 integrated graphics chipset with 128MB of shared video memory.

Incredibly enough, it turns out that this humble setup is capable of playing almost every video file that I can find, even with compiz effects fully enabled and just a base install of vlc media player.

Most impressively, the machine can flawlessly stream a 1280x528px 1536kb/s *.mkv file over my wireless network.

As a comparison, I have a Windows Vista machine with a 2.3GHz processor, 4GB of RAM, and a 512MB video card upstairs that can’t play the same file without special codecs and the help of a program called CoreAVC. Even with these, it plays the file imperfectly.

I can’t explain how this is possible, but needless to say, I am astounded at the ability of Linux.

 

Blast from the Past: A lengthy, detailed meta-analysis of studies of GNOME Do

December 5th, 2016

This post was originally published on November 23, 2009. The original can be found here.


GNOME Do is a fantastic little program that makes Linux Mint a very comfortable experience. At first glance, GNOME Do just looks like a collection of launchers that can be docked to your window, with a search function attached for completeness. What stands out about Do, though, is that the search function offers a lot of versatility. Through Do, I can launch programs, mount and unmount drives, bring up folders, and execute a variety of actions through the plug-ins. I’ve found that it saves me a lot of mouse movement (yes, I’m that lazy) when I’m working on assignments. In less than two seconds, I can call up Kate to start up my data entry, start up R in terminal, open the folder containing all of my data, and start a conversation in Pidgin. Best of all, since the search function can be called up with the Super+Space key combination, I can do all of this without ever having to switch windows.

I also find that Do helps to clean up the clutter on my desktop. I've got it set up with the Docky theme at the bottom of my screen. Since I have no need for the panel, I've set it to autohide at the top of my monitor. This means when I have something maximized, it legitimately takes up the entire monitor.

What a beautifully clean desktop.

Adding or removing programs to or from Do is a cinch too – it’s as simple as dragging and dropping.

Unfortunately, it’s not all great

Like every other Linux program, Do saves time and effort. Like every other Linux program, Do also costs time and effort through the bugs that it has. The most frustrating bug I've had so far is that Do simply disappears on a restart. It runs, and in a manner it "exists", since I can resize it on my desktop, but I can't actually see or use it. Apparently this is a known bug, and I haven't been able to find a decent solution to it. It's especially unfortunate because Do provides so much convenience that when it doesn't work properly, I feel like I'm reverting to some primitive age where I'm dependent on my mouse (the horror!).

Notice how the cursor is cut off? In reality, it's a resizing cursor, used to resize an invisible panel. It technically does work, since after I reboot I find that GNOME Do inadvertently takes up half my screen.

Regardless, I’d recommend Do for anyone who can install it. When it works, it’s great for saving you some time and effort; when it doesn’t, well, ’tis better to have loved and lost….

 

Blast from the Past: Open formats are… the best formats?

December 2nd, 2016

This post was originally published on January 17, 2016. The original can be found here.


Over the past few years there has been a big push to replace proprietary formats with open formats. For example, Open Document Format and Office Open XML have largely replaced the legacy binary formats, we're now seeing HTML5 + JavaScript supplant Silverlight and Java applets, and even the once venerable Flash is on its deathbed.

This of course all makes sense. We're now in an era where the computing platforms, be they Microsoft Windows, Apple OS X, Android, iOS, Linux, etc., simply don't command the individual market shares (or at least mind shares) that they once did. Things are… more diversified now. And while the underlying differences in technologies may not matter to the user, they certainly matter to the developer. This is one of the many reasons you see so much movement toward open formats, where the same format can be implemented, relatively easily, on all of the aforementioned platforms.

So then the question must be asked: does this trend mean that open formats are the best formats? That is obviously quite a simple question about a deep (and perhaps subjective) subject, so perhaps it's better to look at it from a user adoption perspective. Does being an open format, with all of its advantages, translate into market adoption? There the answer is not as clear.

Open by example

Let’s take a look a few instances where a clear format winner exists and see if it is an open format or a closed/proprietary format.

Documents

When it comes to documents, the Open Document Format and Office Open XML have largely taken over. This has been driven largely by Microsoft making Office Open XML the default file format in all versions of Microsoft Office since 2007. Additionally, many governments and organizations around the world have standardized on the use of Open Document Format. That said, the older Microsoft Office binary formats (i.e. .doc, .xls, etc.) are still widely in use.

Verdict: open formats have largely won out.

Audio

For the purposes of the "audio" category, let's consider simply the audio codec that most people use to consume their music. In that regard MP3 is still the absolutely dominant format. While it is somewhat encumbered by patents, you will hardly find a single device out there that doesn't support it. This is true even though there are better lossy compression formats (including the proprietary AAC and the open Ogg Vorbis) as well as lossless formats like FLAC.

Verdict: the closed/proprietary MP3 format is the de facto standard.

Video

Similarly, for the "video" category I'll only be focusing on the codecs. While there are plenty of open video formats (Theora, WebM, etc.), they are not nearly as well supported as proprietary formats like MPEG-2, H.264, etc. Additionally, the open formats (in general) don't offer quite as good a quality-to-size ratio as the proprietary ones, which is often why you'll see websites choosing the proprietary formats in order to save on bandwidth.

Verdict: closed/proprietary formats have largely won out.

File Compression

Compression is something that most people think of more as an algorithm than a format, which is why I'll be focusing on the compressed file container formats for this category. In that regard the ZIP file format is by far the most common. It has native support in every modern operating system and offers decent compression. Other open formats, such as 7-Zip's 7z, offer better compression, and even some proprietary formats, like RAR, have seen widespread use, but for the most part ZIP is the go-to format. What muddies the waters here a bit is that the base ZIP format is open but some of the features added later on were not. However, the majority of uses stick to the open parts of the standard.

Verdict: the open ZIP format is the most widely used standard.

Native Applications vs Web Apps

While applications may not, strictly speaking, be a format, it does seem to be the case that every year there are stories about how web apps will soon replace native applications. So far, however, the results are a little mixed, with e-mail being a perfect example of this paradox. For personal desktop e-mail, web apps – mostly Gmail and the like – have largely replaced native applications like Microsoft Outlook and Thunderbird. On mobile, however, the majority of users still access their e-mail via native "apps". And even then, in enterprises the majority of e-mail usage is still done via native applications. I'm honestly not sure which will eventually win out, if either, but for now let's call it a tie.

Verdict: tie.

The answer to the question is…

Well, based on just the five quick examples above, we've got wins for two open formats, wins for two closed/proprietary formats, and one tie. So clearly, going by market adoption, we're at a standstill.

Personally I’d prefer if open formats would take over because then I wouldn’t have to worry about my device supporting the format in question or not. Who knows, maybe by next year we’ll see one of the two pull ahead.

This post originally appeared on my website here.