Archive for the ‘Hardware’ Category

Fix PulseAudio loopback delay

July 1st, 2011 13 comments

Sort of a follow-up (in spirit) to two of Jon’s previous posts regarding PulseAudio loopback: I noticed that there was quite a bit of delay (~500 ms to 1 second) in the default configuration and began searching for a way to fix it. After some research I found an alternative way to achieve the loopback, but with much less delay.

1. Install paman

First install the PulseAudio Manager application so that you can correctly identify the input device (i.e. your mic or line-in) and your output device (i.e. the sound card you are using).

sudo apt-get install paman

You can find the input sources under the Sources section and the output devices under the Sinks section of the Devices tab. Make note of the names of the two devices.
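If you prefer the command line, the same device names can also be read from `pactl list short sources` and `pactl list short sinks` (pactl ships with PulseAudio). A minimal sketch of pulling the name out of that output; a sample line, built with printf so the tabs survive, stands in for the real call so the snippet is self-contained:

```shell
# Sample line of `pactl list short sources` output (tab-separated fields:
# index, name, driver, sample spec, state). On a real system, pipe the
# actual command output instead.
sample="$(printf '0\talsa_input.pci-0000_05_02.0.analog-stereo\tmodule-alsa-card.c\ts16le 2ch 44100Hz\tRUNNING')"
# The device name is the second tab-separated field.
name="$(printf '%s\n' "$sample" | cut -f2)"
echo "$name"
```

On a live system, `pactl list short sources | cut -f2` lists every source name in one shot.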

2. Unload any previous loopback modules

If you had followed Jon’s previous posts then you will need to unload those modules (and potentially change your PulseAudio configuration so they don’t get loaded again on the next restart). This prevents all loopback sound from being doubled.
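Unloading can be done at runtime with pactl. Newer PulseAudio releases accept `pactl unload-module module-loopback` directly; older ones want the numeric module index from `pactl list short modules`. A sketch of finding that index, with sample output standing in for the real call:

```shell
# Sample `pactl list short modules` output (tab-separated: index, name, args).
modules="$(printf '17\tmodule-loopback\tlatency_msec=200')"
# Pick out the index of the loopback module.
idx="$(printf '%s\n' "$modules" | awk '$2 == "module-loopback" {print $1}')"
echo "$idx"
# On a live system you would then run: pactl unload-module "$idx"
```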

3. Create an executable script

Create a script and copy the following command into it:

pacat -r --latency-msec=1 -d [input] | pacat -p --latency-msec=1 -d [output]

where [input] is the name of your input device found in step 1 and [output] is the name of the output device. In my case it would look like:

pacat -r --latency-msec=1 -d alsa_input.pci-0000_05_02.0.analog-stereo | pacat -p --latency-msec=1 -d alsa_output.pci-0000_05_02.0.analog-surround-51

4. Run script

Simply running the script should now give you correct loopback with much less delay than the default loopback module. Even better, if you set this script to run at startup you won’t have to worry about it ever again.

I am currently running a variety of distributions, primarily Linux Mint 17.
Previously I was running KDE 4.3.3 on top of Fedora 11 (for the first experiment) and KDE 4.6.5 on top of Gentoo (for the second experiment).

Unwanted Effects on my Line-in Interface

August 26th, 2010 No comments

Shortly after purchasing an Xbox 360, I wrote a short piece that gave instructions for forwarding your line-in audio through your PC speakers. By using this method and sharing my network connection, I’ve managed to run my Xbox as a peripheral to my main computer setup, saving me space and money.

Lately however, the line-in loopback has not been working as expected. At times, it sounds like effects have been applied to the line. In particular, it sounds like somebody has applied a phaser or a delay effect to the input signal.

For the last week or so, I’ve been scratching my head about this issue, trying to figure out what part of my system may have applied effects to my loopback, but not to other audio on the system. Tonight, I was reviewing my original instructions for setting the thing up, and noticed that the module was being loaded on startup after being added to a system config file:

sudo sh -c ' echo "load-module module-loopback" >>  /etc/pulse/ '

On a hunch, I took a look at the end of the file, and found the following lines:

### Make some devices default
#set-default-sink output
#set-default-source input
load-module module-loopback
load-module module-loopback

It looked like the instruction to load the loopback module had ended up in the config file twice! Because of this, the module was being loaded twice on startup.
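A quick way to spot this kind of duplication is to look for repeated lines in the config file. A self-contained sketch (a sample file stands in for the real PulseAudio config, whose exact path is truncated in this post):

```shell
# Recreate the tail of the config file as sample data.
cat > /tmp/default_pa_sample <<'EOF'
### Make some devices default
#set-default-sink output
#set-default-source input
load-module module-loopback
load-module module-loopback
# Print any line that appears more than once.
sort /tmp/default_pa_sample | uniq -d
```

Running this prints `load-module module-loopback`, confirming the doubled entry; run the same `sort | uniq -d` against your real config to check it.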

So what does this have to do with the effects on the line? Well, if you play two copies of the same sound with a half-second gap between them, your ears will be tricked into thinking that you’re hearing one copy of the sound, but that it’s all echoey, as if a delay effect had been applied. If you repeat the experiment but this time decrease the gap between the two sounds even further, say to a few milliseconds, your ears will hear one copy of the sound with a phaser effect applied.

Essentially, when the module loaded twice, it was capturing the mix from the line-in port twice and playing back two separate copies of the audio. Depending on how close together these instances were, the result sounded normal, phased, or delayed. I fixed the issue by removing one of the lines and then restarting the machine. This time, it started only one copy of the service, and everything sounded fine.

The moral of the story: If you’re loading modules at startup, make sure that you only start one copy of them.

On my Laptop, I am running Linux Mint 12.
On my home media server, I am running Ubuntu 12.04
Check out my profile for more information.

PulseAudio: Monitoring your Line-In Interface

July 11th, 2010 22 comments

At home, my setup consists of three machines – a laptop, a PC, and an XBOX 360. The latter two share a set of speakers, but I hate having to climb under the desk to switch the cables around, and wanted a better way to switch them back and forth. My good friend Tyler B suggested that I run the line out from the XBOX into the line-in on my sound card, and just let my computer handle the audio in the same way that it handles music and movies. In theory, this works great. In practice, I had one hell of a time figuring out how to force the GNOME sound manager applet into doing my bidding.

After quite a bit of googling, I found the answer on the Ubuntu forums. It turns out that the secret lies in a PulseAudio module that isn’t enabled by default. Open up a terminal and use the following commands to permanently enable this behaviour. As always, make sure that you understand what’s up before running random commands that you find on the internet as root:

pactl load-module module-loopback
sudo sh -c ' echo "load-module module-loopback" >>  /etc/pulse/ '

The first line instructs PulseAudio (one of the many ways that your system talks with the underlying sound hardware) to load a module called loopback, which unsurprisingly, loops incoming audio back through your outputs. This means that you can hear everything that comes into your line-in port in real time. Note that this behaviour does not extend to your microphone input by design. The second line simply tells PulseAudio to load this module whenever the system starts.

Now if you’ll excuse me, I have jerks to run over in GTA…


Setting up a RocketRaid 2320 controller on Linux Mint 9

July 4th, 2010 7 comments

After the most recently recorded podcast, I decided to take a stab at running Linux on my primary media server. The machine sports a Highpoint RocketRaid 2320 storage controller, which has support for running up to eight SATA drives. Over the course of last evening, I found out that the solution wasn’t quite as plug-and-play as running the same card under Windows. Here’s what I learned and how you can avoid the same mistakes.

Remove the RocketRaid card when installing Mint.

Make sure you have decent physical access to the machine, as the Mint installer apparently does not play nicely with this card. I hit a reproducible complete system freeze (no keyboard or mouse input) after progressing past the keyboard layout section of the installer. Temporarily removing the 2320 from its PCI-Express slot avoided this problem; I was then able to re-insert the card after installation was complete.

Compile the Open Source driver for best results.

Highpoint has a download page for their 2300-series cards, which points to Debian and Ubuntu (x86 and x64) compatible versions of the rr232x driver. Unfortunately, the Ubuntu 64-bit version did not seem to successfully initialize – the device just wasn’t present.

A post on the Ubuntu forums (for version 9.04) was quite helpful in pointing out the required steps, but had a broken link that wasn’t easy to find. To obtain the Open Source driver, click through to the “Archive Driver Downloads for Linux and FreeBSD” page, then scroll to the bottom and grab the 32/64-bit .tar.gz file with a penguin icon. I’ve mirrored version 1.10 here in case the URLs on the HighPoint site change again: rr232x-linux-src-v1.10-090716-0928.tar.gz

The process for building the driver is as in the original post:

  • Extract the .tar.gz file to a reasonably permanent location. I say this because you will likely need to rebuild the module for any kernel upgrades. I’m going to assume you’ve created something under /opt, such as /opt/rr232x.
  • Change to the extraction directory and run:
    cd product/rr232x/linux
    sudo make
    sudo make install
  • Reboot your system after the installation process and the kernel will load the rr232x driver as a module.

Install gnome-disk-utility to verify and mount your filesystem.

I’m not sure why this utility disappeared as a default between Mint 8 and 9, but gnome-disk-utility will display all connected devices and allow you to directly mount partitions. It will also let you know if it “sees” the RR2320 controller. In my case, after installing the driver and rebooting, I was able to click on the 3.5TB NTFS-formatted storage and assign it a mount point of /media/Raid5 in two clicks.

What’s next?

Most of the remaining complaints online revolve around booting to the RR2320 itself, which seems like more of a pain than it’s worth (even under Windows this would seem to be the case). I personally run a separate system drive; the actual Ubuntu installation manual from Highpoint may have additional details on actually booting to your RAID volume.

I’ve yet to install the Web or CLI management interface for Linux, which should happen in the next few days. One of the really neat items about this controller is that it can email you if a disk falls out of the array, but I’ll need to get the Web interface running in order to change some outgoing mail servers.

I also haven’t done any performance testing or benchmarking of the controller versus Windows, or checked whether there would be an improvement from migrating the filesystem to ext4 as opposed to NTFS. I do plan to stick with NTFS, as I’d like portability across all major platforms with this array. From initial observations, I can play back HD content from the array without stuttering while large files are being decompressed and checksummed, which is my main goal.

Fix ATI vsync & video tearing issue once and for all!

May 6th, 2010 23 comments

NOTE: ATI’s most recent drivers now include a no tearing option in the driver control panel. Enabling it there is now the preferred method.

Two of the Linux machines that I use have ATI graphics cards from the 4xxx series in them. They work well enough for what I do (very casual gaming and lots of video watching), but one thing has always bothered me to no end: video tearing. I assumed that this was due to vsync being off by default (probably for performance’s sake), but even after installing the proprietary drivers in the new Ubuntu 10.04 and trying to force it on, I still could not get the issue to resolve itself. After some long googling I found what seems to be a solution, at least in my case. I’ll walk you through what I did.

Before you continue, read this: in order to fix this issue on my computers I had to trash xorg.conf and start over. If you are afraid of ruining your setup, or if you have a custom configuration already, please be very careful and read before doing what I suggest, or don’t continue at all. Be sure to make a backup!

1 ) Install the ATI proprietary drivers and restart so that they can take effect.

2 ) Make a backup of your xorg.conf file. Do this by opening a terminal and copying it to a backup location. For example I ran the following code:

sudo cp /etc/X11/xorg.conf /etc/X11/backup.xorg.conf

3 ) Remove your existing (original) xorg.conf file:

sudo rm /etc/X11/xorg.conf

4 ) Generate a new default xorg.conf file using aticonfig (that’s two dashes below):

sudo aticonfig --initial

5 ) Enable video syncing (again two dashes before each command):

sudo aticonfig --sync-video=on --vs=on

6 ) If possible also enable full anti-aliasing:

sudo aticonfig --fsaa=on --fsaa-samples=4

7 ) Restart now so that your computer will load the new xorg.conf file.

8 ) Open up Catalyst Control Center and under 3D -> More Settings make sure the slider under Wait for vertical refresh is set to Always On.

That should be it. Please note that this trick may not work with all media players (I noticed Totem still seemed to have some issues). One other thing I tried in VLC was to change the video output to OpenGL, which seemed to help a lot.

Good luck!


Pulse Audio Nonsense

January 4th, 2010 3 comments

Just a heads up: This isn’t the kind of post that contains answers to your problems. It is, unfortunately, the kind of post that contains a lot of the steps that I took to fix a problem, without much information about the order in which I performed them, why I performed them, or what they did. All that I can tell you is that after doing some or all of these things in an arbitrary order, stuff seemed to work better than it did before.

It’s funny how these posts often seem to come about when trying to get hardware related things working. I distinctly remember writing one of these about getting hardware compositing working on Debian. This one is about getting reliable audio on Kubuntu 9.10.

You see, I have recently been experiencing some odd behaviour from my audio stack in Kubuntu. My machine almost always plays the startup/shutdown noises, Banshee usually provides audio by way of GStreamer, videos playing in VLC are sometimes accompanied by audio, and Flash videos almost never have working sound. Generally speaking, restarting the machine will change one or all of these items, and sometimes none. The system is usable, but frustrating (although I might be forgiven for saying that having no audio in Flash prevents me from wasting so much time watching YouTube videos when I ought to be working).

Tonight, after some time on the #kubuntu IRC channel and the #pulseaudio channel on freenode, I managed to fix all of that, and my system now supports full 5.1 surround audio, at all times, and from all applications. Cool, no? Basically, the fix was to install some PulseAudio apps:

sudo apt-get install pulseaudio pavucontrol padevchooser

Next, go to System Settings > Multimedia, and set PulseAudio as the preferred audio device in each of the categories on the left. Finally, restart the machine a couple of times. If you’re lucky, once you restart and run pavucontrol from the terminal, you’ll see a dialog box called Volume Control. Head over to the Configuration tab, and start choosing different profiles until you can hear some audio from your system. Also, I found that most of these profiles were muted by default – you can change that on the Output Devices tab. If one of the profiles works for you, congratulations! If not, well, I guess you’re no worse off than you were before. I warned you that this was that kind of post.

Also, while attempting to fix my audio problems, I found some neat sites:

  • Colin Guthrie – I spoke to this guy on IRC, and he was really helpful. He also seems to write a lot of stuff for the PulseAudio/Phonon stack in KDE. His site is a wealth of information about the stack that I really don’t understand, but makes for good reading.
  • Musings on Maintaining Ubuntu – Some guy named Dan who seems to be a lead audio developer for the Ubuntu project. Also a very interesting read, and full of interesting information about audio support in Karmic.
  • A Script that Profiles your Audio Setup – This bash script compiles a readout of what your machine thinks is going on with your audio hardware, and automatically hosts it on the web so that you can share it with people trying to help you out.
  • A Handy Diagram of the Linux Audio Stack – This really explains a lot about what the hell is going on when an application tries to play audio in Linux.
  • What the Linux Audio Stack Seems Like – This diagram reflects my level of understanding of Linux audio. It also reminds me of XKCD.
  • Ardour – The Digital Audio Workstation – In the classic tradition of running before walking, I just have to try this app out.


Kubuntu 9.10 (Part II)

January 4th, 2010 No comments

Well, I managed to fix my compositing problem, but I honestly don’t know why the fix worked. Basically, I went into the System Settings > Desktop > Desktop Effects menu and manually turned off all desktop effects. Next, I used jockey-text to disable the ATI driver. After a quick restart I re-enabled the ATI driver and restarted again. Once I logged in, I went back into the System Settings > Desktop > Desktop Effects menu and enabled desktop effects. This magically worked… but only until I restarted. In order to actually get it to start enabled, I had to go back into System Settings > Desktop > Desktop Effects, click on the Advanced tab, and disable functionality checks. I am sure this is dangerous or something, but it’s the only way I can get my computer to restart with the effects enabled by default.

I’m really starting to hate this graphics card…


Going Linux, Once and for All

December 23rd, 2009 7 comments

With the linux experiment coming to an end, and my Vista PC requiring a reinstall, I decided to take the leap and go all linux all the time. To that end, I’ve installed Kubuntu on my desktop PC.

I would like to be able to report that the Kubuntu install experience was better than the Debian one, or even on par with a Windows install. Unfortunately, that just isn’t the case.

My machine contains three 500GB hard drives. One is used as the system drive, while an integrated hardware RAID controller binds the other two together as a RAID1 array. Under Windows, this setup worked perfectly. Under Kubuntu, it crashed the graphical installer, and threw the text-based installer into fits of rage.

With plenty of help from the #kubuntu IRC channel on freenode, I managed to complete the Kubuntu install by running it with the two RAID drives disconnected from the motherboard. After finishing the install, I shut down, reconnected the RAID drives, and booted back up. At this point, the RAID drives were visible from Dolphin, but appeared as two discrete drives.

It was explained to me via this article that the hardware RAID support that I had always enjoyed under windows was in fact a ‘fake RAID,’ and is not supported on Linux. Instead, I need to reformat the two drives, and then link them together with a software RAID. More on that process in a later post, once I figure out how to actually do it.

At this point, I have my desktop back up and running, reasonably customized, and looking good. After trying KDE’s default Amarok media player and failing to figure out how to properly import an m3u playlist, I opted to use Gnome’s Banshee player for the time being instead. It is a predictable yet stable iTunes clone that has proved more than capable of handling my library for the time being. I will probably look into Amarok and a few other media players in the future. On that note, if you’re having trouble playing your MP3 files on Linux, check out this post on the ubuntu forums for information about a few of the necessary GStreamer plugins.

For now, my main tasks include setting up my RAID array, getting my ergonomic bluetooth wireless mouse working, and working out folder and printer sharing on our local Windows network. In addition, I would like to set up a Windows XP image inside of Sun’s Virtual Box so that I can continue to use Microsoft Visual Studio, the only Windows application that I’ve yet to find a Linux replacement for.

This is just the beginning of the next chapter of my own personal Linux experiment; stay tuned for more excitement.

This post first appeared at Index out of Bounds.


This… looks… awesome!

December 8th, 2009 No comments

Looks being the key word there, because I haven’t actually been able to successfully run either of these seemingly awesome pieces of software.

Amahi is the name of an open source software collection, for lack of a better term, that resembles what Windows Home Server has to offer. I first came across this while listening to an episode of Going Linux (I think it was episode #85 but I can’t remember anymore!) and instantly looked it up. Here is a quick rundown of what Amahi offers for you:

  • Currently built on top of Fedora 10, but they are hoping to move it to the most recent version of Ubuntu
  • Audio streaming to various apps like iTunes and Rhythmbox over your home network
  • Media streaming to other networked appliances including the Xbox 360
  • Acts as a NAS and can even act as a professional grade DHCP server (taking over the job from your router) making things even easier
  • Built in VPN so that you can securely connect to your home network from remote locations
  • SMB and NFS file sharing for your whole network
  • Provides smart feedback of your drives and system, including things like disk space and temperature
  • Built-in Wiki so that you can easily organize yourself with your fellow co-workers, roommates or family members
  • Allows you to use the server as a place to automate backups to
  • Windows, Mac & Linux calendar integration, letting you share a single calendar with everyone on the network
  • Implements the OpenSearch protocol so that you can add the server as a search location in your favorite browser. This lets you search your server files from right within your web browser!
  • Includes an always-on BitTorrent client that lets you drop torrent files onto the server and have it download them for you
  • Supports all Linux file systems and can also read/write to FAT32 and read from NTFS.
  • Sports a plugin architecture that lets developers expand the platform in new and exciting ways
  • Inherits all of the features from Fedora 10
  • Finally Amahi offers a free DNS service so you only have to remember a web address, not your changing home IP address

FreeNAS is a similar product (and likewise open source), except that instead of being based on Linux, FreeNAS is currently based on FreeBSD 7.2. Plans are currently in the works to fork the project and build a parallel Linux-based version. Unlike Amahi, FreeNAS sticks closer to the true definition of a NAS and only includes a few additional features in the base install, letting the user truly customize it to their needs. Once installed, it can take up less than 64MB of disk space. Through extensions, it can include the following features:

  • SMB and NFS as well as TFTP, FTP, SSH, rsync, AFP, and UPnP
  • Media streaming support for iTunes and Xbox 360
  • BitTorrent support allowing you to centralize your torrenting
  • Built-in support for Dynamic DNS through major players like DynDNS, etc.
  • Includes full support for ZFS, UFS, ext2, ext3. Can also fully use FAT32 (just not install to), and can read from NTFS formatted drives.
  • Small enough footprint to boot from a USB drive
  • Many supported hardware and software RAID levels
  • Full disk encryption via geli

Both of these can be fully operated via a web browser interface and seem very powerful. Unfortunately I was unable to get either up and running inside of a VirtualBox environment. While I recognize that I could just install a regular Linux machine and then add most of these features myself, it is nice to see projects like these package them all up for ease of use.

This is definitely something that I will be looking more closely at in the future; you know once these pesky exams are finished. In the mean time if anyone has any experience with either of these I would love to hear about it.


While I was publishing this, the folks over at Amahi sent out an e-mail detailing many new improvements. It turns out they have released a new version, now based on Fedora 12. Here are their notable improvements:

  • Amahi in the cloud! This release has support for VPS servers (Virtual Private Servers).
  • Major performance and memory improvements, providing a much faster web interface and a 30% smaller memory footprint.
  • Based on Fedora 12, with optimizations for Atom processors built-in, preliminary support in SAMBA for PDC (Primary Domain Controller) with Windows 7 clients and much more.
  • Completely revamped web-based installer.
  • Users are now set up more easily and securely, with password-protected pages and admin users.
  • Brand new architecture, with future growth in mind, supporting more types of apps and, more importantly, bringing us closer to supporting Ubuntu and other platforms. Over 100 apps are working in this release out of the gate!

It all sounds great. I will be looking into this new version as soon as I have a moment to do so.


Today, the search engines…

November 23rd, 2009 No comments

I would just like to point out that you, the readers (who, I’d like to reinforce, are fantastic and have been a huge help to us, as well as making us feel good that people are making use of the site!), have catapulted us to previously unknown heights in the world of Canadian search engine fame!

The big three search engines with Canadian domains – Google, Bing, and Yahoo – have all launched us up to top-shelf status on their search pages for the search string ‘The Linux Experiment’:

  • first search result (yay!)
  • first search result (double yay!)
  • second search result (darn you, PC World)

Let’s collectively step it up and get us to the top of the Google search charts.  With a scant 38 days left in the Experiment, time is quickly running out!

Today, the search engines… tomorrow, the (PC) world!

Twelve to twelve

November 5th, 2009 3 comments

Well, it’s official – twelve more days remain until the November 17 release of Fedora 12 (Constantine).  I, for one, can hardly wait – Fedora 11 has been rock-solid for me so far (under Gnome, anyways – but I’ll leave that subject alone) and I can only imagine that Fedora 12 is going to bring more of the same my way.

Among some of the more notable changes being made that caught my interest:

  • Gnome 2.28 – the current version bundled into my Fedora 11 distribution, 2.26.3, has been nothing but amazing.  Unflinchingly stable, fast, and reliable – it’s everything I want in a desktop environment.
  • Better webcam support – not sure how this can get any better from my perspective since my LG P300’s built-in webcam worked straight out of the box on Fedora 11, but I’m interested to see exactly what they bring to the table here
  • Better IPv6 support – since our router does actively support this protocol, it’s nice to see Fedora taking charge and always improving the standard
  • Better power management – for me, this is a major headache under Gnome (I know, I know…) since it really doesn’t let me customize anything as much as I would like to.   Among other things, it’s supposed to offer better support for wake-from-disk and wake-from-RAM.  We’ll see.

I’m sure that Tyler and I will keep you posted as the due date gets closer, and especially once we’ve done the upgrade itself!

Interesting Linux article

October 26th, 2009 4 comments

I stumbled across a very interesting post linked off of Digg, which I browse on a fairly regular basis.  In it, the author attempts to put to rest some of the more common (and, for the most part, completely inaccurate) stories that revolve around various Linux distributions.

Though I think Jake B might have something to say about the first point on the list, it made for interesting reading at the very least – and for the most part, I agree with the author wholeheartedly.  Link after the jump!

Debunking Some Linux Myths

Categories: Dana H, Free Software, Hardware, Linux

Well shit, that was easy

October 12th, 2009 1 comment

One of my big gripes with Mint was that the sound was far too quiet. I assumed this was some sort of hardware compatibility issue. Apparently it’s not, and it’s really easy to fix: the default “front speaker” volume is not at the max level. While fixing this has given me a great max volume, my latest problem is getting Mint to increment/decrement the volume properly – the master volume is essentially muted at 70%. That being said, I’m glad I can finally watch online videos from my laptop without needing headphones or a soundproof room.

Categories: Hardware, Linux Mint, Sasha D

Blackberry Sync Attempt #3: Compiling from Source

October 5th, 2009 7 comments

After my first two attempts at getting my Blackberry to sync with Mozilla Thunderbird, I got pissed off and went right to the source of my problems. I emailed the developer of the opensync-plugin-mozilla package that (allegedly) allows Thunderbird to play nicely with OpenSync, and (politely) gave him the what-for, asking what I should do. He suggested that I follow the updated installation instructions for checking out and compiling the latest version of his plugin from scratch, instead of using the older, precompiled versions that are no longer supported.

I set to it, first removing all of the packages that I had installed during my last two attempts, excluding Barry, as I had already built and installed the latest version of its libraries. Everything else, including OpenSync and all of its plugins went, and I started from scratch. Luckily, the instructions were easy to follow, although they recommended that I get the latest versions of some libraries by adding Debian’s sid repositories to my sources list. This resulted in me shitting my pants later in the day, when I saw 642 available updates for my system in Synaptic. I figured out what was going on pretty quickly and disabled updates from sid, without ruining my system. If there’s one thing that Windows has taught me over the years, it is to never set a machine to auto-install updates.

Once I had the source code and dependency libraries, the install was a snap. The plugin source came with a utils directory full of easy to use scripts that automated most of the process. With everything going swimmingly, I was jarred out of my good mood by a nasty error that occurred when I ran the script:

CMake Error at cmake/modules/FindPkgConfig.cmake:357 (message):
None of the required 'libopensync1;>=0.39' found
Call Stack (most recent call first):
cmake/modules/FindOpenSync.cmake:27 (PKG_SEARCH_MODULE)
CMakeLists.txt:15 (FIND_PACKAGE)

CMake Error at cmake/modules/FindOpenSync.cmake:46 (MESSAGE):
OpenSync cmake modules not found.  Have you installed opensync core or did
you set your PKG_CONFIG_PATH if installing in a non system directory ?
Call Stack (most recent call first):
CMakeLists.txt:15 (FIND_PACKAGE)

It turns out that the plugin requires OpenSync v0.39 or greater to be installed to work. Of course, the latest version available in either the Debian main or lenny-backports repositories is v0.22-2. This well-aged philosophy of the Debian Stable build has irked me a couple of times now, and I fully intend to update my system to the testing repositories before the end of the month. In any case, I quickly made my way over to the OpenSync homepage to obtain a newer build of their libraries. There I found out not only that version 0.39 had just been released on September 21st, but also that it isn’t all that stable:

Releases 0.22 (and 0.2x svn branch) and before are considered stable and suitable for production. 0.3x releases introduce major architecture and API changes and are targeted for developers and testers only and may not even compile or are likely to contain severe bugs.

0.3x releases are not recommended for end users or distribution packaging.

Throwing caution to the wind, I grabbed a tarball of compilation scripts from the website, and went about my merry way gentooing it up. After a couple of minor tweaks to the script, I got the cmpOpensync script to run, which checked out the latest trunk from the svn and automatically compiled and installed it for me. By running the command msynctool --version, I found out that I now had OpenSync v0.40-snapshot installed. Relieved, I headed back to my BlueZync installation. This time around, I managed to get right up to the script before encountering another horrible dependency error:

— checking for one of the modules ‘glib-2.0’
—   found glib-2.0, version 2.16.6
— Found GLib2: glib-2.0 /usr/include/glib-2.0;/usr/lib/glib-2.0/include
— Looking for include files HAVE_GLIB_GREGEX_H
— Looking for include files HAVE_GLIB_GREGEX_H – found
— checking for one of the modules ‘libxml-2.0’
—   found libxml-2.0, version 2.6.32
— checking for one of the modules ‘libopensync1’
—   found libopensync1, version 0.40-snapshot
— checking for one of the modules ‘thunderbird-xpcom;icedove-xpcom’
—   found icedove-xpcom, version
—     THUNDERBIRD_XPCOM_MAIN_INCLUDE_DIR /usr/include/icedove
—     NSPR_MAIN_INCLUDE_DIR /usr/include/nspr
—     THUNDERBIRD_XPCOM_LIBRARIES xpcom;plds4;plc4;nspr4;pthread;dl
— checking for one of the modules ‘sunbird-xpcom;iceowl-xpcom’
—   found iceowl-xpcom, version 0.8
SUNBIRD_INCLUDE_DIRS /usr/include/iceowl;/usr/include/iceowl/xpcom;/usr/include/iceowl/string;/usr/include/nspr
—      SUNBIRD_MAIN_INCLUDE_DIR /usr/include/iceowl
— Found xpcom (thunderbird and sunbird):
—   XPCOM_INCLUDE_DIRS /usr/include/nspr;/usr/include/icedove;/usr/include/icedove/addrbook;/usr/include/icedove/extensions;/usr/include/icedove/rdf;/usr/include/icedove/string;/usr/include/icedove/xpcom_obsolete;/usr/include/icedove/xpcom;/usr/include/icedove/xulapp;/usr/include/iceowl
—   XPCOM_LIBRARY_DIRS /usr/lib/icedove
—   XPCOM_LIBRARIES xpcom;plds4;plc4;nspr4;pthread;dl
XPCOM_LIBRARIES  xpcom;plds4;plc4;nspr4;pthread;dl
— checking for one of the modules ‘check’
CMake Error at cmake/modules/FindPkgConfig.cmake:357 (message):
None of the required ‘check’ found
Call Stack (most recent call first):
cmake/modules/FindCheck.cmake:27 (PKG_SEARCH_MODULE)
CMakeLists.txt:73 (FIND_PACKAGE)

CMAKING mozilla-sync 0.1.7
— Configuring done

From what I can gather from this output, the configure step was checking for dependencies and got hung up on one called “check.” Unfortunately, this gave me zero information that I could use to solve the problem. I can verify that the install failed by running msynctool --listplugins, which returns:

Available plugins:
msynctool: symbol lookup error: msynctool: undefined symbol: osync_plugin_env_num_plugins

Ah, shit. Looks like I’m stuck again. Maybe one day I’ll figure it out. Until then, if any of our readers has ever seen something like this, I could use a couple of pointers.
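In hindsight, the unhelpfully-named ‘check’ module is almost certainly Check, the C unit-testing framework, which ships a pkg-config file called exactly check. On Debian the fix should be as simple as (package name assumed):

```shell
# Install Check, the C unit-testing framework, which provides the
# 'check' pkg-config module that CMake was hunting for:
sudo apt-get install check
pkg-config --exists check && echo "check is now visible"
```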


October 4th, 2009 No comments

I swear that I’ve encountered this before…

That is all.

Categories: Flash, God Damnit Linux, Hardware, Jon F, Linux

WTF #17(qq)

October 2nd, 2009 No comments

It’s no secret that Linux, as with any other operating system (and yes, I realize that I just grouped all Linux distributions into a collective) has its idiosyncrasies.  The little things that just sort of make me cock my head to the side and wonder why I’m doing this to myself, or make me want to snap my entire laptop in half.

One of these things is something Tyler previously complained about – a kernel update on Fedora 11 that just happened to tank his graphics capabilities.  Now, I might just be lucky, but why in the hell would Fedora release a kernel update before compatible drivers from the two major graphics card manufacturers were available?

Fortunately for Tyler, a kmod-catalyst driver was released for his ATI graphics card yesterday (today?) and he’s now rocking the latest kernel with the latest video drivers.  Unfortunately for me, some slacker has yet to update my kmod-nvidia drivers to operate properly with the latest kernel.

While this is more of a rant than anything else, it’s still a valid point.  I’ve never had trouble on a Windows-based machine wherein a major update causes a driver to no longer function (short of an actual version increment – so of course, I would expect Windows XP drivers not to function in Vista, and Vista drivers not to function in Windows 7; similarly, I would not expect Fedora 11 drivers to function in Fedora 12).

<end rant>

Top 10 things I have learned since the start of this experiment

October 2nd, 2009 4 comments

In a nod to Dave’s classic top ten segment, I will now share with you the top 10 things I have learned since starting this experiment one month ago.

10. IRC is not dead

Who knew? I’m joking of course but I had no idea that so many people still actively participated in IRC chats. As for the characters who hang out in these channels… well some are very helpful and some… answer questions like this:

Tyler: Hey everyone. I’m looking for some help with Gnome’s Empathy IM client. I can’t seem to get it to connect to MSN.

Some asshat: Tyler, if I wanted a pidgin clone, I would just use pidgin

It’s this kind of ‘you’re doing it wrong because that’s not how I would do it’ attitude that can be very damaging to new Linux users. There is nothing more frustrating than trying to get help and someone throwing BS like that back in your face.

9. Jokes about Linux for nerds can actually be funny

Stolen from Sasha’s post.

Admit it, you laughed too

8. Buy hardware for your Linux install, not the other way around

Believe me, if you know that your hardware is going to be 100% compatible ahead of time you will have a much more enjoyable experience. At the start of this experiment Jon pointed out this useful website. Many similar sites also exist and you should really take advantage of them if you want the optimal Linux experience.
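The easiest way to use those databases is to search them for your exact hardware IDs, which you can pull from a terminal (lspci and lsusb come from the pciutils and usbutils packages):

```shell
# The [vendor:device] IDs in the -nn output are what the hardware
# compatibility databases index on:
lspci -nn
lsusb
```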

7. When it works, it’s unparalleled

Linux seems faster, more fully featured, and less resource-hungry than a comparable operating system from either Redmond or Cupertino. That is, assuming it’s working correctly…

6. Linux seems to fail for random or trivial reasons

If you need proof of this, just go take a look back at the last couple of posts on here. There are times when I really think Linux could be used by everyone… and then there are moments when I don’t see how anyone outside of the most hardcore computer users could ever even attempt it. A brand new user should not have to know about xorg.conf or how to edit their DNS resolver.

Mixer - buttons unchecked

5. Linux might actually have a better game selection than the Mac!

Obviously there was some jest in there, but Linux really does have some gems for games out there. Best of all, most of them are completely free! Then again, some are free for a reason…



4. A Linux distribution defines a lot of your user experience

This can be especially frustrating when the exact same hardware performs so differently. I know there are a number of technical reasons why this is the case but things seem so utterly inconsistent that a new Linux user paired with the wrong distribution might be easily turned off.

3. Just because it’s open source doesn’t mean it will support everything

Even though it should, damn it! The best example I have for this happens to be MSN clients. Pidgin is by far my favourite, as it seems to work well and even supports a plethora of useful plugins! However, unlike many other clients, it doesn’t support a lot of MSN features such as voice/video chat, reliable file transfers, and those god awful winks and nudges that have appeared in the most recent version of the official client. Is there really that good of a reason holding the Pidgin developers back from just making use of the other open source libraries that already support these features?

2. I love the terminal

I can’t believe I actually just said that but it’s true. On a Windows machine I would never touch the command line because it is awful. However on Linux I feel empowered by using the terminal. It lets me quickly perform tasks that might take a lot of mouse clicks through a cumbersome UI to otherwise perform.
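To make that concrete, here is the sort of one-liner I mean — renaming a folder full of .JPG files to the lowercase extension, which in a file manager would be a long session of clicking:

```shell
# Rename every *.JPG in the current directory to *.jpg;
# ${f%.JPG} strips the old extension before the new one is appended.
for f in *.JPG; do mv "$f" "${f%.JPG}.jpg"; done
```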

And the #1 thing I have learned since the start of this experiment? Drum roll please…

1. Linux might actually be ready to replace Windows for me

But I guess in order to find out if that statement ends up being true you’ll have to keep following along 😉

Barry: Round Two with the Blogosphere riding Shotgun

September 30th, 2009 2 comments

Given the problems that I’ve been having lately with getting my Blackberry calendar and contacts to synchronize with anything in Linux, I was quite surprised when I almost got it working tonight. Forgetting everything that I’ve learned about the process, I started over, following these helpful tutorials and working through the entire install from the beginning. Unfortunately, aside from some excellent documentation of the install process (finally), the only new idea that those blogs provided me with was to try syncing the phone with different pieces of software. Specifically, Chip recommended KDEPIM, although I opted to  jump through a few more hoops before giving in and dropping the Thunderbird/Lightning combination entirely.

After a bit more mucking about, I decided to give up Lightning and installed Iceowl, Debian’s rebranding of Mozilla Sunbird, instead. Iceowl is the standalone calendar application that Lightning is based on, and is a very lightweight solution that is supposed to cooperate with the opensync-plugin-iceowl package. In theory, this allows calendar data to be shared between my device and the Iceowl calendar after configuring the plugin to read my Iceowl calendar from the /home/username/.mozilla/iceowl/crazyfoldername/storage.sdb file. In practice, the sync process gets locked up every time:

Screenshot-PIM Synchronization - KitchenSync-1

Why must you tease me?

Well, I’ve tried everything that I can think of to get my phone to synchronize with any Mozilla product. I’m very close to giving up, which is a shame, because they really are superior products. The ridiculousness of the entire thing is that I can easily dump my PIM data to a folder, and Thunderbird stores its data in an SQLite database. If this were Windows, I’d have written a VB app to fix my problems hours ago… Anybody know any Python?
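Both stores really are just SQLite files, so the data is at least inspectable from a terminal. A sketch using the same placeholder profile path as above:

```shell
# List the tables inside Iceowl's calendar store (profile folder
# name is a placeholder; yours will differ):
sqlite3 ~/.mozilla/iceowl/crazyfoldername/storage.sdb ".tables"
```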

Update: I’ve also managed to successfully synchronize my phone with the Evolution mail client. Unfortunately, Evolution looks rather pale next to Thunderbird. In fact, the entire reason that I switched to Thunderbird about a week ago is that Evolution mysteriously stopped receiving my IMAP email with no explanation. No new email comes in, and the Send/Receive button is grayed out. Until now, I was happy with my decision, as Thunderbird is a superior application.

Barry: The Open-Sourced Blackberry Utility

September 30th, 2009 No comments

There is no denying that the installation process for the Barry project sucks. That said, the promise of having the ability to sync my Blackberry with a Linux-based calendar application like Mozilla’s Thunderbird or the Evolution mail client kept me working at it through the wee hours of the night. The Barry site at Sourceforge provides not one, not two, but four Debian packages (which rely on an additional two undocumented packages) that need to be downloaded and installed in a specific and undocumented order:

  1. libbarry0_0.15-0_i386.deb (sourceforge)
  2. barry-util_0.15-0_i386.deb (sourceforge)
  3. libglademm-2.4-1c2a
  4. barrybackup-gui_0.15-0_i386.deb (sourceforge)
  5. libopensync0
  6. opensync-plugin-barry_0.15-0_i386.deb (sourceforge)

With the packages installed, I launched a terminal and used the auto-complete feature to find the command barrybackup. At first, I couldn’t figure out what its syntax was, until I realized that it doesn’t need any arguments, because it simply launches a GUI (which doesn’t appear anywhere in my Applications menu) that lets you back up your device databases:

Screenshot-Barry Backup

Well, that’s a handy utility, assuming that it is also capable of restoring the backups to the device. I shied away from trying the restore feature, as I didn’t have access to a Windows box with which to fix the device should the worst happen.

I’m currently using Mozilla’s Thunderbird (re-branded in Debian as Icedove) as my primary mail client, along with the Lightning calendar plugin, and would be thrilled if I could synchronize it with my Blackberry. You’ll note that libopensync and a Barry opensync plugin were both a part of the installation process; having never used libopensync, I had a tough time figuring out how to make them cooperate.

The OpenSync page on Wikipedia led me to install the multisync-tools package, which claims to be able to “synchronize calendars, address books and other PIM data between programs on your computer and other computers, mobile devices, PDAs or cell phones. It relies on the OpenSync framework to do the actual synchronisation.” I have PIM data that I would like to sync! I have the OpenSync framework! We’re on a roll!

Finally, I installed the multisync-0.90 GUI and opensync-plugin-evolution v0.22-2 opensync plugin packages, which should have allowed me to sync between the Evolution mail client and my phone. I chose to try the process with this software first, as a plugin for Thunderbird was not immediately available. Unfortunately, when attempting to sync, I got this message:

Surprisingly, it was the evolution plugin that failed to connect

Useful? Sort of. The Add button let me set up a Blackberry profile with both the barry and evolution plugins, but no matter how I tweaked the settings, I couldn’t get the evolution plugin to connect to my PIM data. Further, after making a synchronization group and adding plugins to it, I couldn’t find a way of replacing a plugin with a different one.
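The same pairing can also be driven from the command line, skipping both GUIs. These are the 0.2x-era msynctool flags, and the plugin names (barry-sync, evo2-sync) are my assumption based on the installed packages:

```shell
# Build a sync group from the Barry and Evolution plugins, then sync:
msynctool --addgroup blackberry
msynctool --addmember blackberry barry-sync
msynctool --addmember blackberry evo2-sync
msynctool --showgroup blackberry     # note each member's numeric ID
msynctool --configure blackberry 2   # edit the Evolution member's config
msynctool --sync blackberry
```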

Sick of the limited GUI, I moved on to try KitchenSync, the KDE-based alternative. While it was uglier, I found it to be a far more useful front-end, and managed to get it to sync my device calendar and contacts with my filesystem:

Screenshot-PIM Synchronization - KitchenSync

This process exported all of the calendar and contact information from my Blackberry to a folder full of vCalendar and vContact files on my machine. Now if only I could get Thunderbird to read these files.
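A quick sanity check that the export worked is to pull the event titles out of the dumped files (the export folder name here is hypothetical):

```shell
# vCalendar stores one event title per SUMMARY line; -h suppresses
# the per-file name prefix in the output:
grep -h '^SUMMARY' ~/kitchensync-export/*.vcs
```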

After a bit more looking around on the OpenSync webpage, I found a link to these guys, who claim to have programmed an opensync plugin called libopensync-plugin-mozilla-0.1.6 that allows Thunderbird and Lightning to talk to the OpenSync manager. They provide the plugin as a tarball that contains a *.so binary file and a sample *.xml configuration file… but no instructions on how to install them.

Thoroughly lost, I turned to the #opensync channel on for help. Until they see fit to help me out, I’m taking a break from this. No sense in giving myself a heart attack out of extreme frustration.

Edit: I got some help from the members of the #opensync channel, who recommended that I drop the file into the /usr/lib/opensync/plugins/ directory. While this didn’t immediately allow OpenSync to see the plugin, I noticed that every other plugin in the directory has an associated *.la configuration file. So I fabricated my own *.la file, and tried again. That didn’t work either.

The members of the channel then recommended that I try downloading the source code directly from the creators. I did as much, and found that it didn’t include a configure or make script, but just the source code. Not knowing how to proceed, I attempted to follow these instructions, which entailed downloading another 20 or so packages, including the sunbird-xpcom-devel package, which again lacks documentation on how to proceed with installation.

Lacking that package, and again frustrated beyond belief, I decided to drop the issue for another hour or so and do some math homework. That’s right, I chose to do math homework over playing with my computer, because this process has been that frustrating.

It doesn’t help that this entire process seems to be aimed at installing BlueZync, and not the opensync-mozilla-plugin. What the hell is going on here?

DNS Not Satisfactory

September 25th, 2009 No comments

While trying to connect to a remote webserver via SSH last night, I found that my machine refused to resolve the hostname to an IP address. I couldn’t ping the server either, but could view a webpage hosted on it. Now this was a new one on me – I figured that my machine was caching a bad DNS record for the webserver, and couldn’t connect because the server’s IP had since changed. That didn’t really explain why I was able to access the server from a webbrowser, but I ran with it. So how do you refresh your DNS cache in Linux? It’s easy to do in Windows, but the Goog and the Bing let me down spectacularly on this issue.

This morning, I tried to connect via SSH from my school network, and couldn’t get a connection there either. This reinforced the idea that a local DNS cache might have an outdated record in it, because at school, I was using a different nameserver than at home, and a whole 12 hours had elapsed. Out of theories, and lacking a method to refresh my local DNS cache, I hit the #debian channel on IRC for some guidance. Unlike my last two trips to this channel, I got help from a number of people within minutes (must be a timezone thing), and found out that unless I manually installed one, Debian does not maintain a DNS cache. Well, there goes that idea.
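For readers who do have a local cache: on Debian it is usually provided by the optional nscd package, and flushing it looks like this (only applies if nscd is actually installed):

```shell
# Invalidate nscd's hosts cache, or restart the daemon outright:
sudo nscd -i hosts
sudo /etc/init.d/nscd restart
```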

So where was I getting my DNS lookup service? A quick look at my /etc/resolv.conf file showed that the only entry in it was, which is the IP of my home router. The file also has a huge warning banner that claims that any changes will be overwritten by the operating system. Makes sense, as when I connect to a new network, I presumably get DNS resolution from their router, which may have a different IP address than mine. The guys on IRC instructed me to try to connect to the server with its IP address instead of its hostname, thereby taking the DNS resolution at the router out of the picture. This worked just fine.

They then instructed me to add a line to the file with the IP address of the nameserver that the router is using. In the case of our home network, we use OpenDNS, a local company with static servers. I did so, and could immediately resolve the IP of my remote server, and obtain an SSH connection to it.
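For reference, the added line looks like this ( is one of OpenDNS’s well-known static public resolvers; the router’s own entry stays above it):

```shell
# /etc/resolv.conf -- manually appended below the existing router entry;
# is OpenDNS's static, public resolver address:
nameserver
```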

Well fine, my problem is solved by bypassing DNS resolution at the router, but it still doesn’t explain what’s going on here. Why, if DNS resolution was failing at the router level (presumably because the router maintains some kind of DNS cache), did it work for my webbrowser, but not for the ssh, scp, or ping commands? Don’t they all resolve hostnames in the same way? Further, if it was the router cache that had a bad record in it, why did the problem also manifest itself at school, where the router is taken entirely out of the picture?

Further, will the file actually be overwritten by the OS the next time I connect to a different wireless network? If so, will my manual entry be erased, and will the problem return? Time will tell. Something smells fishy here, and it all points to the fact that my machine is in fact retaining a local DNS cache. How else can I explain away the problem manifesting itself on the school network? Further, even if I do have a local cache that is corrupted or contains a bad record, why did Iceweasel bypass it and resolve the address of the webserver at the router level (thereby allowing it to connect, even though the ssh, scp, and ping commands could not)?