This isn’t going well.

July 26th, 2013 No comments

Today I started out by going into work, only to discover that it is NEXT Friday that I need to cover.

So I came home and decided to get a jump start on installing Kubuntu.

I am now at a screeching halt because the hardware I am using has Win8 installed on it and when I boot into the Start Up settings, I lose the ability to use my keyboard. This is going swimmingly.

So, it is NOW about 3 hours later.

In this time, I have cursed, yelled, felt exasperated and been downright pissed.

This is mainly because Windows 8 does not make it easy to get to the boot loader. In fact, the handy Windows-made video that is supposed to walk you through how EASY and user-friendly the process of changing system settings is fails to mention what to do if the “Use a Device” option is nowhere to be found (as it was in my case).

So I relied on Google, which is usually pretty good about answering questions about stupid computer issues. I FINALLY came across one post stating that Windows 8 boots so quickly that there is no time to press F2 or F8. However, I tried anyway. F8 is the key for selecting which device you want to boot from, as you will see later in this post.

What you will want to do when installing any version of Linux is to first format a USB stick to hold your Linux distro. I used Universal USB Loader. The nice thing about this loader is that you don’t have to have the .iso for the distro you want already downloaded; you have the option of downloading it right in the program.
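(A side note: if you already have a working Linux machine handy you can skip the Windows-side loader entirely and write the .iso straight to the stick from a terminal. This is just a rough sketch – the image name and the device name /dev/sdX are placeholders, so double-check the device with lsblk first, because dd will cheerfully wipe whatever you point it at.)

# kubuntu.iso and /dev/sdX are placeholders; verify the device name with lsblk before running dd
lsblk
sudo dd if=kubuntu.iso of=/dev/sdX bs=4M
sync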

After you have selected your distro, downloaded the .iso and loaded it onto your USB stick, now comes the fun part. Plug your USB stick into the computer you wish to load Linux onto.

Considering how easy this was once I figured it all out, I do feel rather silly. If I were to have to do it again, I would feel much more knowledgeable.

If you are using balls-ass Windows 8, like I was, the EASIEST way to select an alternate device to boot from is to restart the computer and press F8 a billion times until a menu pops up, letting you choose from multiple devices. Choose the device with the name of the USB stick; for me it was PENDRIVE.

Once you press enter (from a keyboard that is attached directly to the computer via USB cable, because apparently Win8 loses the ability to use wireless USB devices before the OS has fully booted… at least that was my experience), the machine boots from the USB stick.

So now, I am being prompted to install Kubuntu (good news, I already know it supports my projector, because I can see this happening).

Now, I have had to plug in a wired USB keyboard and mouse for this process so far. This makes life a little bit difficult because the computer I am using sits in a closet, too far away from my projector screen. This makes it almost impossible for me to see what is going on on the screen. So installing the drivers for my wireless USB devices is a bit of a pain.

However, the hard part is over. The OS is installed successfully. My next post will detail how the hell to install wireless USB devices. I will probably also make a fancy signature, so you all know what I am running.

Come on, really?!

July 25th, 2013 3 comments

So it is 9:40 PM and I started my “Find a Linux distro to install” process. Like many people, I decided to type exactly what I wanted to search into Google. Literally, I typed “Linux Distro Chooser” into Google. Complex and requiring great technical skill, I know.

My next mission was to pick the site that had a description with the least amount of “sketch”. Meaning, I picked the first site in the Google results. I then used my well-honed multiple-choice skills (ignore the question, pick B) to find my perfect Linux distro match.

After several pages of clicking through, I was presented with a list of Linux distributions that fit my needs and hardware.

See, a nice list, with percents and everything.

This picture has everything… percents, mints, Man Drivers…

So naturally, I do what everyone does with lists… look at my options and pick the one with the prettiest picture.

For me that distro was Kubuntu. It has a cool sounding name that starts with the same letter as my name.

So I follow the link through to the website to pull the .iso and this pops up.


God damn Drupal!

I have dealt with Drupal before, as it was the platform the website I did data entry for was built on. Needless to say, I hate it. Hey Web Dev with Trev, if you are out there, I hope you burn your toast the next time you make some.

So, to be productive while waiting for Drupal to fix its shit, I decided to start a post and rant. In the time this took, the website for Kubuntu has recovered (for now).

So, I downloaded my .iso and am ready to move it onto a USB stick.

I’m debating whether I want to install it now or later, as I would really like to watch some West Wing tonight. I know that if I start this process and fuck it up, I am going to be forced to move upstairs where there is another TV, but it is small :(

Well, here I go, we’ll see how long it takes me to install it. If you are reading this, go ahead and time me… it may be a while.

Announcing The Linux Experiment (The Third!)

July 23rd, 2013 No comments

That’s right, time for round three!

If you’ve been following this website in the past you know what this means. This time around our rules are simple: give back to the community. With this idea in mind, the participants, known as guinea pigs, will attempt to use their unique passions, interests and talents to give something back to the world of Linux and open source software. The goal is purposely designed to be generic and open-ended – we want each guinea pig to interpret the rule in their own way and let their creativity determine how they will give back.

Some ideas we’ve tossed around to accomplish this goal have been:

  • Writing about a unique way to setup open source software to accomplish something
  • Trying out and writing about a new distribution that offers a different experience
  • Giving back to a distribution you use or like
  • Helping a project by submitting code patches, documentation updates, artwork, UX design or simply spreading the word
  • Starting your own project to accomplish something
  • Etc.

So join us dear reader as we chronicle giving back to the community. Oh and don’t be afraid to take part in your own way either ;)





Make printing easy with the Samsung Unified Linux Driver Repository

July 13th, 2013 No comments

I recently picked up a cheap Samsung laser printer and decided to give the Samsung Unified Linux Driver Repository a shot while installing it. Basically the SULDR is a repository you add to your /etc/apt/sources.list file, which allows you to install one of their driver management applications. Once that is installed, any time you hook up a new printer the management application automatically searches the repository, full of the official Samsung printer drivers, finds the correct one for you and installs it. Needless to say I didn’t have any problems getting this printer to work on Linux!
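For reference, the whole setup boils down to adding one repository line and installing the management package from it. Treat the repository line and package name below as assumptions from my memory of the SULDR instructions rather than gospel – check the SULDR site for the current values:

# Repository line and package name are assumptions; confirm them on the SULDR website
echo "deb http://www.bchemnet.com/suldr/ debian extra" | sudo tee -a /etc/apt/sources.list
sudo apt-get update
sudo apt-get install suldr-keyring   # assumed name of the repository keyring package
# then install whichever driver/management package the SULDR site lists for your printer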





Listener Feedback Podcast Update (July 2013)

July 12th, 2013 No comments

A couple of new Listener Feedback podcast episodes have been released, in case you missed them:

So grab the MP3 or Ogg version of this Creative Commons podcast and enjoy!





Big distributions, little RAM 6

July 9th, 2013 3 comments

It’s that time again when I install the major, full-desktop distributions onto a machine with limited hardware and report on how they perform. Once again I’ve decided to re-run my previous tests, this time using the following distributions:

  • Fedora 18 (GNOME)
  • Fedora 18 (KDE)
  • Fedora 19 (GNOME)
  • Fedora 19 (KDE)
  • Kubuntu 13.04 (KDE)
  • Linux Mint 15 (Cinnamon)
  • Linux Mint 15 (MATE)
  • Mageia 3 (GNOME)
  • Mageia 3 (KDE)
  • OpenSUSE 12.3 (GNOME)
  • OpenSUSE 12.3 (KDE)
  • Ubuntu 13.04 (Unity)
  • Xubuntu 13.04 (Xfce)

I even happened to have a Windows 7 (64-bit) VM lying around and, while I think you would be a fool to run a 64-bit OS on the limited test hardware, I’ve included it as a sort of benchmark.

All of the tests were done within VirtualBox on ‘machines’ with the following specifications:

  • Total RAM: 512MB
  • Hard drive: 8GB
  • CPU type: x86 with PAE/NX
  • Graphics: 3D Acceleration enabled

The tests were all done using VirtualBox 4.2.16, and I did not install VirtualBox tools (although some distributions may have shipped with them). I also left the screen resolution at the default (whatever the distribution chose) and accepted the installation defaults. All tests were run between July 1st, 2013 and July 5th, 2013 so your results may not be identical.
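If you would rather script an equivalent VM than click through the VirtualBox GUI, something along these lines should give you the same specifications. This is only a sketch; the VM name and disk filename are my own placeholders:

# "distro-test" and distro-test.vdi are placeholder names; adjust to taste
VBoxManage createvm --name "distro-test" --ostype Linux --register
VBoxManage modifyvm "distro-test" --memory 512 --pae on --accelerate3d on
VBoxManage createhd --filename distro-test.vdi --size 8192
VBoxManage storagectl "distro-test" --name "SATA" --add sata
VBoxManage storageattach "distro-test" --storagectl "SATA" --port 0 --device 0 --type hdd --medium distro-test.vdi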

Results

Just as before I have compiled a series of bar graphs to show you how each installation stacks up against the others. This time around, however, I’ve changed how things are measured slightly in order to be more accurate. Measurements (on Linux) were taken using the free -m command for memory and the df -h command for disk usage. On Windows I used Task Manager and Windows Explorer.
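If you want to reproduce the numbers yourself, the raw measurements come from just these two commands (which rows and columns I read off them is spelled out by the graph titles below; mapping the root filesystem’s Used column to “install size” is my reading of it):

free -m   # memory: the Mem, -/+ buffers/cache and Swap rows
df -h /   # disk: the Used column for the root filesystem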

In addition this will be the first time where I provide the results file as a download so you can see exactly what the numbers were or create your own custom comparisons (see below for link).

Things to know before looking at the graphs

First off, if your distribution of choice didn’t appear in the list above, it’s probably because it isn’t reasonably possible for me to install it (i.e. I don’t have hours to compile Gentoo) or I didn’t feel it was mainstream enough (pretty much anything with LXDE). Secondly, some distributions don’t appear on all of the graphs; for example, because I was using an existing Windows 7 VM I didn’t have a ‘first boot’ result. As always, feel free to run your own tests. Thirdly, you may be asking yourself ‘why do Fedora 18 and 19 both make the list?’ Basically because I had already run the tests for 18 when 19 happened to be released. Finally, Fedora 19 (GNOME), while included, does not have any data because I simply could not get it to install.

First boot memory (RAM) usage

This test was measured on the first startup after finishing a fresh install.

 

[Graphs: All Data Points · RAM · Buffers/Cache · RAM – Buffers/Cache · Swap Usage · RAM – Buffers/Cache + Swap]

Memory (RAM) usage after updates

This test was performed after all updates were installed and a reboot was performed.

[Graphs: All Data Points · RAM · Buffers/Cache · RAM – Buffers/Cache · Swap · RAM – Buffers/Cache + Swap]

Memory (RAM) usage change after updates

The net growth or decline in RAM usage after applying all of the updates.

[Graphs: All Data Points · RAM · Buffers/Cache · RAM – Buffers/Cache · Swap · RAM – Buffers/Cache + Swap]

Install size after updates

The hard drive space used by the distribution after applying all of the updates.

[Graph: Install Size]

Conclusion

Once again I will leave the conclusions to you. This time however, as promised above, I will provide my source data for you to plunder and enjoy.

Source Data





How to backup and restore a Trac project

July 4th, 2013 No comments

If you use Trac as your bug and progress tracking tool then you too may one day need to take a backup of it or move it to a new server, like I had to the other day. Thankfully, as I discovered, it is a relatively straightforward process. Here are the steps to back up and restore a Trac project.

Take a hot backup of your existing install. This is essentially a backup from a fixed point in time that you can take while still using your Trac install (great for having no downtime).

trac-admin [/path/to/projenv] hotcopy [/path/to/backupdir]

For example:

trac-admin /var/www/trac/projectx hotcopy /home/awesomeadmin/trac_backup/projectx

In order to restore it on another server you just need to create the project from scratch (i.e. using initenv) like this

trac-admin [targetdir] initenv

and then simply replace the new project directory’s contents with the backed-up contents. Strictly speaking I’m not even sure if you need to run initenv first, but that’s how I did it and it worked.
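Putting the two restore steps together with the same example paths as above (the cp -a incantation is my own sketch of “replace the contents”, not an official Trac procedure):

# initenv will prompt for a project name and database connection string
trac-admin /var/www/trac/projectx initenv
# then overwrite the fresh project with the hot copy taken earlier
cp -a /home/awesomeadmin/trac_backup/projectx/. /var/www/trac/projectx/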

Hopefully this works for you as well. Happy… err… Trac-ing?





How to backup and restore an SVN repository with full commit history

July 2nd, 2013 No comments

Sometimes you need to move an SVN repository from one server to another but maintain the full commit history (i.e. comments and changes). Here is a very simple way to do so.

1. Dump (and compress) the source SVN in one line:

svnadmin dump [path to source SVNrepository] | gzip -9 > [path to destination gzipped dump file]

For example:

svnadmin dump /var/svn/projectx | gzip -9 > /home/awesomeadmin/svn_backup/projectx.dump.gz

2. Transfer gzipped dump to new server

3. Decompress dump

gunzip projectx.dump.gz

4. Restore dump to new SVN repository

svnadmin load [path to new SVN repository] < [path to dump file]

For example:

svnadmin load /var/svn/projecty < /home/awesomeadmin/svn_backup/projectx.dump
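One small addition: svnadmin load expects the target repository to already exist, so if /var/svn/projecty is brand new, create the empty repository first:

# Create the (empty) target repository before loading the dump into it
svnadmin create /var/svn/projecty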

That’s it. Pretty simple, no?





An Experiment in Transitioning to Open Document Formats

June 15th, 2013 2 comments

Recently I read an interesting article about Vint Cerf, best known as the man behind the TCP/IP protocol that underpins modern Internet communication, in which he brought up a very scary problem with everything going digital. I’ll quote from the article (Cerf sees a problem: Today’s digital data could be gone tomorrow – posted June 4, 2013) to explain:

One of the computer scientists who turned on the Internet in 1983, Vinton Cerf, is concerned that much of the data created since then, and for years still to come, will be lost to time.

Cerf warned that digital things created today — spreadsheets, documents, presentations as well as mountains of scientific data — won’t be readable in the years and centuries ahead.

Cerf illustrated the problem in a simple way. He runs Microsoft Office 2011 on Macintosh, but it cannot read a 1997 PowerPoint file. “It doesn’t know what it is,” he said.

“I’m not blaming Microsoft,” said Cerf, who is Google’s vice president and chief Internet evangelist. “What I’m saying is that backward compatibility is very hard to preserve over very long periods of time.”

The data objects are only meaningful if the application software is available to interpret them, Cerf said. “We won’t lose the disk, but we may lose the ability to understand the disk.”

This is a well-known problem for anyone who has used a computer for quite some time. Occasionally you’ll get sent a file that you simply can’t open because the modern application you now run has ‘lost’ the ability to read the format created by the (now) ‘ancient’ application. But beyond this minor inconvenience it also brings up the question of how future generations, specifically historians, will be able to look back on our time and make any sense of it. We’ve benefited greatly in the past from having mediums that allow a more or less easy interpretation of written text and art. Newspaper clippings, personal diaries, heck even cave drawings are all relatively easy to translate and interpret when compared to unknown, seemingly random, digital content. That isn’t to say it is an impossible task; it is, however, one that has (perceivably) little market value (relatively speaking at least) and thus would likely be de-emphasized or underfunded.

A Solution?

So what can we do to avoid these long-term problems? Realistically, probably nothing. I hate to sound so down about it, but at some point technology will yet again make its next leap forward and likely render our current formats completely obsolete (again) in the process. The only thing we can do today that will likely have a meaningful impact that far into the future is to make use of very well documented and open standards. That means transitioning away from so-called binary formats, like .doc and .xls, and embracing the newer open standards meant to replace them. By doing so we can ensure large-scale compliance (today) and work toward a sort of saturation effect wherein the likelihood of a complete ‘loss’ of the ability to interpret our current formats decreases. This solution isn’t just a pie-in-the-sky pipe dream for hippies either. Many large multinational organizations, governments, scientific and statistical groups, and individuals are beginning to recognize this same issue, and many have begun to take action to counteract it.

Enter OpenDocument/Office Open XML

Back in 2005 the Organization for the Advancement of Structured Information Standards (OASIS) created a technical committee to help develop a completely transparent and open standardized document format, the end result of which was the OpenDocument standard. This standard has gone on to be the default file format in most open source applications (such as LibreOffice, OpenOffice.org, Calligra Suite, etc.) and has seen widespread adoption by many groups and applications (like Microsoft Office). According to Wikipedia, OpenDocument is supported and promoted by over 600 companies and organizations (including Apple, Adobe, Google, IBM, Intel, Microsoft, Novell, Red Hat, Oracle, the Wikimedia Foundation, etc.) and is currently the mandatory standard for all NATO members. It is also the default format (or at least a supported format) in more than 25 different countries and many more regions and cities.

Not to be outdone, and at risk of losing their position as the dominant office document format creator, Microsoft introduced a somewhat competing format called Office Open XML in 2006. The two formats have much in common, both being based on XML and structured as a collection of files within a ZIP container. However, they differ enough that 1) they are not interoperable and 2) software written to import/export one format cannot easily be made to support the other. While OOXML is also an open standard, there have been some concerns about just how open it actually is. For instance, take these (completely biased) comparisons done by the OpenDocument Fellowship: Part I / Part II. Wikipedia (Office Open XML – from June 9, 2013) elaborates:

Starting with Microsoft Office 2007, the Office Open XML file formats have become the default file format of Microsoft Office. However, due to the changes introduced in the Office Open XML standard, Office 2007 is not entirely in compliance with ISO/IEC 29500:2008. Microsoft Office 2010 includes support for the ISO/IEC 29500:2008 compliant version of Office Open XML, but it can only save documents conforming to the transitional schemas of the specification, not the strict schemas.

It is important to note that OpenDocument is not without its own set of issues; however, its (continuing) standardization process is far more transparent. In practice I will say that (at least as of the time of writing this article) only Microsoft Office 2007 and 2010 can consistently edit and display OOXML documents without issue, whereas most other applications (like LibreOffice and OpenOffice) have a much easier time handling OpenDocument. The flip side is that while Microsoft Office can open and save to the OpenDocument format, it constantly lags behind the official standard in feature compliance. Without sounding too conspiratorial, this is likely due to Microsoft wishing to show how much ‘better’ its own standard is in comparison. That said, with the forthcoming 2013 version Microsoft is set to drastically improve its compatibility with OpenDocument, so the overall situation should get better with time.

Today, however, I think that, technologically, both standards are on more or less equal footing. Initially both had issues and were lacking some features, but both have since evolved to cover 99% of what’s needed in a document format.

What to do?

As discussed above there are two different, some would argue competing, open standards for the replacement of the old closed formats. Ten years ago I would have said that the choice between the two was simple: Office Open XML all the way. However the landscape of computing has changed drastically in the last decade and will likely continue to diversify in the coming one. Cell phone sales have surpassed computer sales, and while Microsoft Windows is still the market leader on PCs, alternative operating systems like Apple’s Mac OS X and Linux have been gaining ground. Then you have the new cloud computing contenders like Google Docs, which lets you view and edit documents right within a web browser, making the operating system irrelevant. All of this heterogeneity has thrown a curve ball into how standards are established, and being completely interoperable is now key – you can’t just be the market leader on PCs and expect everyone else to follow your lead anymore. I don’t want to be limited in where I can use my documents; I want them to work on my PC (running Windows 7), my laptop (running Ubuntu 12.04), my cellphone (running iOS 5) and my tablet (running Android 4.2). It is for these reasons that, for me, the conclusion, in an ideal world, is OpenDocument. For others the choice may very well be Office Open XML and that’s fine too – both attempt to solve the same problem and a little market competition may end up being beneficial in the short term.

Is it possible to transition to OpenDocument?

This is the tricky part of the conversation. Let’s say you want to jump 100% over to OpenDocument… how do you do so? Converting between the different formats, like the old .doc or even the newer Office Open XML .docx, and OpenDocument’s .odt is far from problem free. For most things the conversion process should be as simple as opening the current document and re-saving it as OpenDocument – there are even wizards that will automate this process for you on a large number of documents. In my experience, however, things are almost never quite that simple. From what I’ve seen, any document that has a bulleted list ends up being converted with far from perfect accuracy. I’ve come close to re-creating the original formatting manually, making heavy use of custom styles in the process, but it’s still not a fun or straightforward task – perhaps in these situations continuing to use Microsoft formatting, via Office Open XML, is the best solution.
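As a tooling aside, if you have LibreOffice installed its command line can do the bulk conversion in one go, leaving only the formatting clean-up to do by hand. A rough sketch, assuming all the .doc files sit in the current directory:

# Convert every .doc in the current directory to OpenDocument text (.odt);
# complex formatting (bulleted lists especially) will still need a manual once-over
libreoffice --headless --convert-to odt *.doc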

If however you are starting fresh or just converting simple documents with little formatting there is no reason why you couldn’t make the jump to OpenDocument. For me personally I’m going to attempt to convert my existing .doc documents to OpenDocument (if possible) or Office Open XML (where there are formatting issues). By the end I should be using exclusively open formats which is a good thing.

I’ll write a follow up post on my successes or any issues encountered if I think it warrants it. In the meantime I’m curious as to the success others have had with a process like this. If you have any comments or insight into how to make a transition like this go more smoothly I’d love to hear it. Leave a comment below.

This post originally appeared on my personal website here.





The apps of KDE 4.10 Part VII: Dragon Player

May 27th, 2013 2 comments

Rounding out this little series I took a look at KDE’s video player of choice: Dragon Player.

Dragon Player

For those of you familiar with similar applications such as VLC, Totem or even Windows Media Player, Dragon Player is a simple interface on top of quite powerful video playback.

Everyone loves Big Buck Bunny!

Dragon Player’s power comes from Phonon, the media backend integrated into KDE. What this means for the user is that it is completely compatible with all installed system codecs. Speaking of codecs, Dragon Player prompts you whenever it doesn’t recognize a new piece of media and offers to automatically search for and install the required codecs. This works very well and allows you to keep your system relatively free of nonsense codecs you’ll never actually use, instead installing what you need as you need it.

For a KDE application Dragon Player is surprisingly streamlined and doesn’t offer very many configuration options. In fact, almost any other video player has more configuration options than Dragon Player does. The only real settings I could find were for changing how the video playback looks:

Video Settings

And that’s it. No, seriously, there isn’t anything else to mention about this application, and believe it or not that’s a good thing! This program is designed to do exactly one thing and it does it well. If you’re looking for a single-purpose video player, and you’re not already a VLC fan, I would highly suggest this as an alternative.

More in this series



