Archive

Archive for the ‘Ubuntu’ Category

And I thought this would be easy…

September 22nd, 2013 1 comment

Some of you may remember my earlier post about contemplating an upgrade from Windows Home Server (Version 1) to a Linux alternative. Since then, I have decided the following:

Amahi isn’t worth my time

 

This conclusion was reached after a fruitless attempt at installing the latest Amahi 7 release on the 500 GB ‘system’ drive included with the EX470. After backing up the Windows Home Server to a single external 2 TB drive (talk about nerve-wracking!), I popped the drive into a spare PC and installed Amahi with the default options.

No, I’m not 13. Yes, this image accurately reflects my frustrations.

Moving the drive back into the EX470 yielded precisely zero results, no matter what I tried – the machine would not respond to a ‘ping’ command, and since I’ve opted to try and do this without a debug board, I don’t even have VGA to tell me what the hell is going on. So, that’s it for Amahi.

When all else fails, Ubuntu

 

After deciding that I really didn’t feel like a repeat of my earlier Fedora experiment, I decided to try out the Linux ‘Old Faithful’ as it were – Ubuntu 12.04 LTS. I opted for the LTS version due to – well, you know – the ‘long-term support’ deal.

Oh, and I upgraded my storage (new 1 TB system drive not shown, and I apologize for the potato-quality image):

The only kind of ‘TB’ I like. Not tuberculosis.

 

Following the earlier Amahi procedure, I popped the primary 1 TB drive into a spare machine and allowed the Ubuntu installer to do its thing. Easy enough! From there, I installed the following two additional packages (having to add an extra repository for the latter):

  • openssh-server

This allows me to easily control the machine through SSH, and – as I understand it – is pretty much a must for someone wanting to control a headless box. Setup was easy-breezy, in that it required nothing at all.

  • Greyhole

For those unfamiliar, Greyhole is – in their own words – an ‘Easily expandable and redundant storage pool for home servers’. One of my favourite things about WHS v1 was its ‘disk pooling’ capability – essentially a JBOD with software-managed share duplication, ensuring that each selected share was copied over to one other disk in the array.
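For the curious, that duplication behaviour is driven by a handful of lines in /etc/greyhole.conf. The snippet below is only a rough sketch from memory (directive names and paths are illustrative; the sample config that ships with the package is the real reference), but it gives a sense of how the pool is described:

# Disks that make up the storage pool (one line per drive)
storage_pool_directory = /mnt/hdd1/gh, min_free: 10gb
storage_pool_directory = /mnt/hdd2/gh, min_free: 10gb

# Keep two copies of everything in the 'Videos' Samba share
num_copies[Videos] = 2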

After those were done with, I popped the drive into the EX470, and – lo and behold! – I was able to SSH in.

This? This is what relatively minor success looks like.

So at this point, I’m feeling relatively confident. I shut down the server (don’t forget -h!) over SSH, popped in the first of the three 3 TB drives, and…

…nothing. Nada. Zip. Zilch. The server happily blinks away like a small puppy wagging its tail, excited to see its owner but clearly bereft of purpose when left to its own devices. I can’t ping it, I can’t… well, that’s really it. I can’t ping it, so there’s nothing I can do. Looking to see if GRUB was stuck at the menu, I stuck in a USB keyboard and hit ‘Enter’ to no effect. Yes, my troubleshooting skills are that good.

My next step was to pop both the 1 TB and 3 TB drives into the ‘spare’ machine; this ran fine. Running lshw -short -c disk shows a 1 TB and 3 TB drive without issue. I also ran these parted commands:

mklabel gpt

mkpart primary 1 -1

 

(I think that last command is right.) So, all set, right? Cool. Pop the drive back into the EX470, and…

STILL NOTHING. At this point, I’m ready to go pick up a new four-bay NAS, but I feel like that may be overkill. If anyone has any recommendations on how to get the stupid thing to boot with a 3 TB drive, I’m open to suggestions.

 

WTF Ubuntu

September 7th, 2013 2 comments

I’m not even sure what to say about this one… it looks like I might have an angry video card.

I sat down at my machine after it had been sitting for three or four days to find this… wtf?

Categories: God Damnit Linux, Jon F, Ubuntu Tags:

Dual Booting Ubuntu 13.04 and Windows 8 on a Lenovo Y400 IdeaPad

July 27th, 2013 1 comment

With the third edition of The Linux Experiment already underway, I decided to get my new laptop set up with an Ubuntu partition to work with over the next few months. A little while back, I purchased this laptop with intent to use it as a gaming rig. It shipped with Windows 8, which was a serious pain in the ass to get used to. Now that I’ve dealt with that and have Steam and Origin set up on the Windows partition, it’s time to make this my primary machine and start taking advantage of the power under its hood by dual-booting an Ubuntu partition for development and experiment work.

I started my adventure by downloading an ISO of the latest release of Ubuntu – at the time of this writing, that’s 13.04. Because my new laptop has UEFI instead of BIOS, I made sure to grab the x64 version of the distribution.

Aside: If you’re using NoScript while browsing Ubuntu’s website, you’ll want to keep an eye on the address bar while navigating through the download steps. In my case, the screen that asks you to donate to the project redirected me to a different version of the ISO until I enabled JavaScript.

After using Ubuntu’s Startup Disk Creator to create a bootable USB stick, I started my first adventure – figuring out how to get the IdeaPad to boot from USB. A bit of quick googling told me that the trick was to alternately tap F10 and F12 during the boot sequence. This brought up a boot menu that allowed me to select the USB stick.

Once Ubuntu had booted off of the USB stick, I opened up GParted and went about making some space for my new operating system. The process was straightforward – I selected the largest existing partition (it also helped that it was labelled WINDOWS_OS), and split it in half. My only mistake in this process was to choose to put the new partition in front of the existing partition on the drive. Because of this, GParted had to copy all of the data on the Windows partition to a new physical location on the hard drive, a process that took about three hours.

The final partitioning scheme with my new Linux partition highlighted

With my hard drive appropriately partitioned, it was time to install the operating system. The modern Ubuntu installer pretty much takes care of everything, even going so far as selecting an appropriate space to use on the hard drive. I simply told it to install alongside the existing Windows partition, and let it take care of the details.

The installer finished its business in short order, and I restarted the machine. Ubuntu booted with no issues, but my Windows 8 partition refused to cooperate. It would seem as though something that the installer did wasn’t getting along well with UEFI/SecureBoot. Upon attempting to boot Windows, I got the following message:

error: Secure Boot forbids loading module from (hd0,gpt8)/boot/grub/x86_64-efi/ntfs.mod.
error: failure reading sector 0x0 from ‘cd0’
error: no such device: 0030DA4030DA3C7A
error: can’t find command ‘drivemap’
error: invalid EFI file path

Press any key to continue…

Uh oh.

Like I said, I could boot Ubuntu, so I headed on over to their website and read their page on UEFI. At first glance, it seemed as though I had done everything correctly. The only place that I deviated from these instructions was in manually resizing my Windows partition to create space for my new Ubuntu partition.

Thinking that I might be experiencing troubles with my boot partition, I took a shot at running Ubuntu’s Boot-Repair utility. It seemed to do something, but upon restarting the machine, I found that I had even more problems – now a Master Boot Record wasn’t found at all:

It would appear as though I may have made things worse…

After dismissing the boot device error, I was prompted to choose which device to boot from. I chose to boot Windows’ UEFI Repair partition, and was (luckily) able to get to a desktop. Unfortunately, none of the other partitions on the device seem to work, so I’m back where I started, except that now, in addition to having to put up with Windows 8, I also have a broken master boot record.

Lenovo: 1 / Jon: 0.

Airing of grievances: in which upgrading Ubuntu wreaks havoc

February 24th, 2013 4 comments

I’ve had a few nasty experiences this week with Linux and figured I’d vent here. Unlike my previous efforts with Linux From Scratch and Gentoo, my complaints this time around are related to upgrading Ubuntu.

Ubuntu 10.04 to 12.04: Save yourself the trouble

At this point the current Ubuntu LTS release (12.04) is my preferred distribution to work with: it has become widespread enough that troubleshooting and previous solutions online are easy to locate. In a professional capacity, I also maintain systems that are still on 8.04 LTS (supported until April 2013, so we have to be pretty aggressive about replacing them) or 10.04 LTS (good until April 2015).

This week I attempted two upgrades from the 10.04 release to 12.04 – one 10.04 LTS “desktop” installation, and one 10.04 LTS headless server installation. Both were virtual machines running under VMWare ESXi, but neither had given me any trouble during normal use.
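For anyone following along at home, the usual way to kick off this sort of upgrade on a headless 10.04 box is Canonical’s release upgrader (this assumes the update-manager-core package is present, which it is on a stock install):

sudo do-release-upgrade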

Canonical’s updater process (the wrapper around dist-upgrade) appears to be pretty slick; it gives you appropriate warnings, attempts to start an SSH daemon as a fallback mechanism and starts on its merry way to download the necessary packages to bring your system completely up to date. On my 10.04 desktop VM, the installer fell apart completely during the package replacement/removal/installation sequence. I was left with two nasty message boxes: one advising that my system was now in a broken state, and another filled entirely with rectangular, unprintable characters.

To put it bluntly, I was not amused, but it wasn’t a critical system and I was content to replace it with a fresh 12.04 installation rather than waste additional time troubleshooting with apt or dpkg. Strike one for the upgrader.

At least the server came back up!

Next on the upgrade schedule was the 10.04 server VM. Install, package replacement and reboot went fine, but I had several custom PPAs installed to support development of XenonMKV (Github page) – specifically ppa:krull/deadsnakes to add Python 2.7 to Ubuntu 10.04.

Python 2.7 still worked when the server came back up, and all my usual tools of choice like SABnzbd+, SickBeard and CouchPotato were still functional.

For some reason, though, I’d gotten it into my head this evening to check out Mezzanine as a potential WordPress replacement. Mezzanine uses Django, a Python Web framework, and the list of supported features is pretty encompassing.

Sidebar: Django and mod_wsgi – complicated enough?

One of the most irritating things from a system administration point of view is getting Web applications to run in a standard server environment – typically a Linux base system and Apache or nginx to serve content. I suppose I’ve been spoiled with how easy it is to get PHP-based sites up and running these days in that configuration by adding an Apache module through apt. A lot of new Web app frameworks come with their own small webservers for development and testing, but generally their creators recommend that when you’re ready to put your site live, that the product run under a well-known Web or application server.

The Django folks recommend using mod_wsgi in their documentation, which in and of itself really just says “RTFM for mod_wsgi and then you’ll have a much better idea of how to do this.” I had to go poking around on Google for the installation article since there are some broken links, but okay, it’s an Apache module with a small bit of configuration (even though a simple walkthrough in the Django documentation would go a long way to making deployment easier.) This is where I ran into my dependency/PPA problem on Ubuntu 10.04.
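To be fair, once you do find the right page the Apache side of a mod_wsgi deployment is pretty small. The sketch below is from memory and assumes a hypothetical project living in /srv/mezzanine with a WSGI entry point at /srv/mezzanine/deploy/wsgi.py; the mod_wsgi documentation remains the authority on the details:

<VirtualHost *:80>
    ServerName mezzanine.example.com

    # Hand everything under / to the Django application
    WSGIScriptAlias / /srv/mezzanine/deploy/wsgi.py
    WSGIDaemonProcess mezzanine python-path=/srv/mezzanine
    WSGIProcessGroup mezzanine
</VirtualHost>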

I suppose I’ve screwed the pooch…

Running the suggested command, sudo apt-get install libapache2-mod-wsgi, I got the following:

The following packages have unmet dependencies:
libapache2-mod-wsgi : Depends: libpython2.7 (>= 2.7) but it is not going to be installed
E: Unable to correct problems, you have held broken packages.

Backtracking, I then found out why the library wasn’t going to get installed:


The following packages have unmet dependencies:
libpython2.7 : Depends: python2.7 (= 2.7.3-0ubuntu3.1) but 2.7.3-2+lucid1 is to be installed

Aha! The Python installation from the PPA for Lucid – 10.04 – was installed and acting as the 2.7 package. Since the newly-upgraded Ubuntu 12.04 uses Python 2.7 as a dependency for a good portion of the default applications, I couldn’t just purge or uninstall it, and my attempts to force a reinstallation all ended in:


Reinstallation of python2.7 is not possible, since it cannot be downloaded.

Rebuild?

At this point it looks like I’ll have to rebuild the server VM as well, but if any readers have any bright ideas on fixing this dependency hell – please comment with your suggestions!
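(One untested idea before resorting to a rebuild: the ppa-purge tool exists for exactly this kind of mess, and downgrades everything from a given PPA back to the official archive versions. Whether it can untangle a post-upgrade Python is another question entirely.)

sudo apt-get install ppa-purge
sudo ppa-purge ppa:krull/deadsnakes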

Categories: God Damnit Linux, Jake B, Ubuntu Tags:

Using ATI Catalyst drivers on Ubuntu 12.10 with old hardware

February 14th, 2013 No comments

As of version 12.10, Ubuntu has upgraded the version of X.org they include to the latest, and unfortunately it is no longer compatible with the official ATI Catalyst drivers for some cards, specifically the HD2xxx, 3xxx and 4xxx models. The open source driver is the only officially supported alternative and, while it is fine for most uses, it doesn’t support the advanced power settings that the ATI driver does. This means that on my laptop in particular the fan runs constantly as it tries to cool down the overheating card.

So… no Ubuntu 12.10+ then?

Thankfully someone has created a PPA that successfully downgrades the version of X.org to the maximum supported version for the official ATI driver. This step is obviously quite drastic and should not be used on production systems. However, from the limited time that I have been running it, things seem pretty stable. The PPA (and instructions) can be found at this link: AMD Catalyst Legacy

Categories: Tyler B, Ubuntu, Xorg/X11 Tags: , , ,

Limit Bandwidth Used by apt-get

October 22nd, 2012 No comments

It’s easy. Simply throw "-o Acquire::http::Dl-Limit=X" in your apt-get command, where X is the KB/s you wish to limit it to. So, for example, let’s say that you want to limit an apt-get upgrade command to roughly 50 KB/s of bandwidth. Simply issue the following command:

sudo apt-get -o Acquire::http::Dl-Limit=50 upgrade

Simple right?
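If you would rather not type the option every time, the same setting can live in an apt configuration snippet instead (the file name below is arbitrary; anything under /etc/apt/apt.conf.d/ will be picked up):

echo 'Acquire::http::Dl-Limit "50";' | sudo tee /etc/apt/apt.conf.d/75download-limit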

Ubuntu 12.10 Beta 1 (Report #3)

September 22nd, 2012 No comments

Just a quick update on my experience running the pre-release version of Ubuntu (this time upgraded to Ubuntu 12.10 Beta 1!). Not a whole lot new to report – Beta 1 is basically the same as Alpha 3 but with the addition of an option to connect to a Remote Server directly from the login screen. Unfortunately the bugs that I have filed so far have yet to be resolved, but I’m still hopeful someone has a chance to correct them prior to release.

It is already almost the end of September which means there are only a couple more weeks before the official 12.10 launch. From what I’ve seen so far this upgrade will be a pretty small, evolutionary update to the already good 12.04 release.

Previous posts in this series:

Categories: Tyler B, Ubuntu Tags: ,

Ubuntu 12.10 Alpha 3 (Report #2)

September 1st, 2012 No comments

Running an alpha version of an operating system, Linux or otherwise, is quite a different experience. It means, for instance, that you are not allowed to complain when minor things have bugs or simply don’t work – it is all par for the course; after all, this is alpha software. That doesn’t mean, however, that it doesn’t still suck when you do run into problems.

I ran into one of these problems earlier today while trying to connect via SSH to a remote computer within Nautilus. It seems that this release of the software is currently broken, resulting in the following error message every time I try to browse my remote server’s directories:

The second really annoying issue I ran into was GIMP no longer showing menu items in Ubuntu’s global appmenu. This was especially infuriating because, prior to installing some updates today, it had worked perfectly fine in the past. I even had to hunt down a sub-par paint (GNU Paint) application just to crop the above screenshot.

Hopefully my annoying experiences, and subsequent bug filings, will prevent other users from experiencing the same pains when 12.10 is finally released to all. Here’s hoping anyway…

Update: It turns out that it wasn’t just the GIMP that wasn’t displaying menu items, no applications are. Off to file another bug…

Previous posts in this series:

Categories: Tyler B, Ubuntu Tags:

Ubuntu 12.10 Alpha 3 (Report #1)

August 27th, 2012 No comments

Well it’s been a little while since I made the mistake (joking) of installing Ubuntu 12.10 Alpha 3. Here is what I’ve learned so far.

  1. My laptop really does not like the open source ATI graphics driver – and there are no proprietary drivers for this release yet. It’s not that the driver doesn’t perform well enough graphically, it’s just that it causes my card to give off more heat than the proprietary driver. This in turn causes my laptop’s fan to run non-stop and drains my battery at a considerable rate.
  2. Ubuntu has changed the way they do updates in this release. Instead of the old Update Manager there is a new application (maybe just a re-skinning of the old) that is much more refined and really quite simple. Interestingly enough, the old hardware drivers application is also now gone; instead it has been merged into the update manager. Overall I’m neutral on both changes.

    Updates are quite frequent when running an alpha release

  3. There is a new Online Accounts application (part of the system settings) included in this release. This application seems to work like an extension of the GNOME keyring – saving passwords for your various online accounts (go figure). I haven’t really had a chance to play around with it too much yet but it seems to work well enough.

That’s it for now. I’m off to file a bug over this open source driver that is currently melting my computer. I’ll keep you posted on how that goes.

Categories: Tyler B, Ubuntu Tags: ,

Test driving the new Ubuntu (12.10)

August 26th, 2012 No comments

Call it crazy but I’ve decided to actually install an Ubuntu Alpha release, specifically Ubuntu 12.10 Alpha 3. Why would anyone in their right mind install an operating system that is bound to be full of bugs and likely to destroy all of my data? My reasons are twofold:

  1. I regularly use Ubuntu or Ubuntu derivatives and would like to help in the process of making them better
  2. There are still a few quirks with my particular laptop that I would like to help iron out once and for all, hopefully correcting them in a more universal sense for Linux as a whole

So join me over the next few posts as I relate my most recent experiences running… shall we say, less than production code.

 

Categories: Tyler B, Ubuntu Tags: ,

Building glibc for LFS from Ubuntu by replacing awk

November 23rd, 2011 No comments

If you run into the following error trying to build LFS from a Ubuntu installation:


make[1]: *** No rule to make target `/mnt/lfs/sources/glibc-build/Versions.all', needed by `/mnt/lfs/sources/glibc-build/abi-versions.h'. Stop.

The mawk utility, installed with Ubuntu and symlinked to /usr/bin/awk by default, does not properly handle the regular expressions in this package. Perform the following commands:


# apt-get install gawk
# rm -f /usr/bin/awk
# ln -snf /usr/bin/gawk /usr/bin/awk
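
Before kicking the build off again it’s worth a quick sanity check that awk really does resolve to gawk now (on a stock Ubuntu install the awk symlink is normally managed through the alternatives system, so double-checking doesn’t hurt):

readlink -f /usr/bin/awk
awk --version | head -n 1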

Then you’re just a make clean; ./configure --obnoxious-dash-commands; make; make install away from success.

Ubuntu 11.10’s WiFi crashes my router

October 19th, 2011 9 comments

No seriously, it does. Whenever this machine connects to the router, the router enters some bad state wherein it refuses to allow any new connections. This also has the effect of booting all other machines off the network. Apparently I’m not the only one to have this problem either.

I did manage to find a bit of a workaround though:

  1. Set your wireless router to Mixed B/G mode only (yes I know, you lose out on N by doing this…)
  2. Enter the following into a terminal:
    echo "options iwlagn 11n_disable=1" | tee /etc/modprobe.d/iwlagn.confg
    sudo modprobe -rf iwlagn
    sudo modprobe -v iwlagn
    sudo service network-manager restart
  3. Maybe reboot?
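If you want to confirm that the module actually picked the option up after being reloaded, the current value should be readable from sysfs (assuming the driver exports the parameter, which iwlagn normally does):

cat /sys/module/iwlagn/parameters/11n_disable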

I’ve also heard of some people getting it to work by enabling this instead of disabling it. To do so simply change the 11n_disable=1 line above to 11n_disable=0.

Hopefully they will have this annoying bug fixed soon.

Categories: Tyler B, Ubuntu Tags: , , , ,

How to install sun-java6-jdk and Netbeans in Ubuntu 11.10

October 14th, 2011 9 comments

If you’ve recently upgraded to Ubuntu 11.10 and are a developer, you may notice some things missing. For one, there is no longer an option to install the sun-java6-jdk or JRE from the repositories. Worse, they also removed the Netbeans IDE. Apparently this had something to do with licensing, but if you’re going to offer MP3 support, the least you could do is make software like this available for those who are willing to look for it.

Anyway, with that rant out of the way, I did manage to find a way to install both.

Install sun-java6-jdk

Following the instructions on this excellent post I was able to successfully install sun-java6-jdk using the following commands:

sudo add-apt-repository ppa:ferramroberto/java
sudo apt-get update
sudo apt-get install sun-java6-jdk sun-java6-plugin
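Once that completes, a quick sanity check shows which Java the system is actually using (and lets you switch between installed JVMs if you have more than one):

java -version
sudo update-alternatives --config java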

There are alternative instructions for installing Java 7 as well.

Install Netbeans

My first attempt at installing both was to head to the official Oracle Java website and download the Netbeans + JDK installer. Unfortunately the installer seems to crash in this version of Ubuntu. However, since the above process had installed the JRE, I was able to simply grab the Netbeans-only installer from Oracle, which ended up working surprisingly well. Just remember to run it using sudo if you want other users to be able to use it as well.

Categories: Tyler B, Ubuntu Tags: , ,

On Veetle, Linux Mint, and ICEauthority

September 21st, 2011 4 comments

Like most people, I use my computer for multimedia. Recently I’ve discovered a multi-platform program called Veetle. It’s a pretty good program, but I ran into an issue after having installed it on my system (currently running Linux Mint 11): while I was using it to stream video, my computer basically locked up – every running process continued working, but I had no control over it. Since I was watching a full-screen video, this was pretty unfortunate. After all, it often helps to be able to maneuver your windows when you’re in a bind. I also immediately noticed that I lost all sound control on my keyboard. I rebooted my computer, but when I tried to log in, I got an error telling me that my computer could not update /home/user/.ICEauthority, followed by another error message, which I’m assuming was related but of less importance.

I actually ran into this exact problem before on an older machine, but before I had the chance to investigate, the hard disk died (for unrelated reasons). Luckily, I recognized the error on my newer machine and put two and two together: both failures coincided with the installation of Veetle. Now, because I’m a nerd, I have two functioning and constantly active computers right next to each other, for just such an occasion! It may also be related to the fact that websites that stream media tend to be a bit iffy, so I feel more secure not using my Windows machine while exploring them, but enough about that! I Googled (or Binged, assuming “Bong” or “Bung” isn’t the past tense) a solution.

The solution

As it turns out, other people have run into this same problem, and it’s been covered on the Ubuntu Forums and elsewhere. Basically, I ran the Veetle script as root (D’oh!), and this royally boned everything. This post by mjcritchie at the Ubuntu Forums (which follows the advice of tommcd at LinuxQuestions.org) explained what to do:

I have had the same problem twice, both times after updating (currently running 64bit Karmic).

Tried various solutions on the net, but this is the only one that worked for me:

Open a terminal and run:

Quote:
sudo chown -R user:user /home/user/.*

Where user is your user_name. This should change ownership of all the hidden files and directories in your home directory to: user:user, as they should be.

This comes courtesy of tommcd over at this post on LinuxQuestions.org
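One small note on the quoted command: in bash the .* glob also matches the parent directory .., so with -R it can wander up into /home itself. Pointing chown at the home directory instead sidesteps that entirely (again, substitute your own user name):

sudo chown -R user:user /home/user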

So there you have it. My machine currently works, and now I can get back to streaming media. At least until the next time I get too adventurous when installing things.

Ubuntu 11.04 Installer Fail

August 24th, 2011 3 comments

So I decided to have a go at Ubuntu 11.04 in a virtual machine before taking the leap and installing it for real. As I understand it, the new Unity desktop is a pretty major departure from the Gnome 2.x desktop that I’m used to, and I want to see if it’s as bad as it looks in the screenshots.

Unfortunately, I’ve yet to make it to the desktop, as Ubuntu has decided that it will take 42 minutes to download some language packs that I neither want nor need.

Didn’t I tell it what language I speak as the first step of the install process? Surely this can be done later.

Categories: God Damnit Linux, Jon F, Ubuntu Tags:

Linux Multimedia Studio on Ubuntu 10.04

July 31st, 2011 1 comment

Recently, Tyler linked me to Linux Multimedia Studio, a Fruityloops-type application for Linux. Since I’m big into music recording and production, he figured that I’d be interested in trying it out, and he was right. Unfortunately, the developers of same were not as interested.

To start off, I installed the application from a PPA with the following terminal commands:

sudo apt-add-repository ppa:dns/sound
sudo aptitude update
sudo aptitude install lmms

After the install process finished, I tried to launch the application from the command line, only to see a bunch of nasty error messages:

jonf@THE-LINUX-EXPERIMENT:~$ sudo lmms
bt_audio_service_open: connect() failed: Connection refused (111)
bt_audio_service_open: connect() failed: Connection refused (111)
bt_audio_service_open: connect() failed: Connection refused (111)
bt_audio_service_open: connect() failed: Connection refused (111)
bt_audio_service_open: connect() failed: Connection refused (111)
bt_audio_service_open: connect() failed: Connection refused (111)
bt_audio_service_open: connect() failed: Connection refused (111)
bt_audio_service_open: connect() failed: Connection refused (111)
Segmentation fault

I dumped the errors into Google, and found a helpful thread on the Ubuntu forums that suggested that I uninstall Bluetooth Audio Services from my machine. Since I don’t use bluetooth audio in any capacity, I happily obliged. When finished, my list of installed items with Bluetooth in the name looked like this:

A list of installed software matching the search term "bluetooth" in Ubuntu Software Centre

Unfortunately, I didn't think ahead enough to note down the names of the packages that I uninstalled.

After ridding myself of Bluetooth audio support, I tried to launch the application again. Unfortunately, I got another Segmentation fault error:

jonf@THE-LINUX-EXPERIMENT:~$ sudo lmms
Segmentation fault

Reading on in the thread, I saw somebody suggest that I check the dmesg tail for messages pertaining to the crash:

jonf@THE-LINUX-EXPERIMENT:~$ dmesg | tail
[  233.302221] JFS: nTxBlock = 8192, nTxLock = 65536
[  233.314247] NTFS driver 2.1.29 [Flags: R/O MODULE].
[  233.343361] QNX4 filesystem 0.2.3 registered.
[  233.367738] Btrfs loaded
[ 2233.118020] __ratelimit: 33 callbacks suppressed
[ 2233.118026] lmms[10706]: segfault at 7f241c7fdd80 ip 00007f241c7fdd80 sp 00007f24187f8a38 error 14 in zm1_1428.so[7f241ca01000+1000]
[ 2523.015245] lmms[10808]: segfault at 7fd80e9bcd80 ip 00007fd80e9bcd80 sp 00007fd80a9b7a38 error 14 in zm1_1428.so[7fd80ebc0000+1000]
[ 2671.323363] lmms[10845]: segfault at 7fbe39a77d80 ip 00007fbe39a77d80 sp 00007fbe35a72a38 error 14 in zm1_1428.so[7fbe39c7b000+1000]
[ 2836.885480] lmms[11246]: segfault at 7f885b71ed80 ip 00007f885b71ed80 sp 00007f8857719a38 error 14 in zm1_1428.so[7f885b922000+1000]
[ 3039.773287] lmms[11413]: segfault at 7ff83056ed80 ip 00007ff83056ed80 sp 00007ff82c569a38 error 14 in zm1_1428.so[7ff830772000+1000]

On the last few lines, you can see that the error was thrown in a module called zm1_1428.so. A bit of Googling turned up the fact that this module is a part of the LADSPA (Linux Audio Developers Simple Plugin API) stack, which provides developers with a standard, cross-platform API for dealing with audio filters and effects.

Scrolling down in the aforementioned thread, I found a post that suggested that I kill all PulseAudio activities on my system before attempting to run the application. PulseAudio is another part of the Linux audio layer that allows user-land applications to talk to your sound hardware via a simple API. It also provides some effects plugins and mixdown capabilities. I went ahead and killed the PulseAudio server on my machine with the following command:

jonf@THE-LINUX-EXPERIMENT:~$ killall pulseaudio

After executing this command, I still got a Segmentation fault when starting LMMS under my user account, but did actually get to a Settings panel when running it with sudo:

jonf@THE-LINUX-EXPERIMENT:~$ sudo lmms
Home directory /home/jfritz not ours.
ALSA lib pcm_dmix.c:1010:(snd_pcm_dmix_open) unable to open slave
Playback open error: Device or resource busy
Expression 'snd_pcm_hw_params_set_buffer_size_near( self->pcm, hwParams, &bufSz )' failed in 'src/hostapi/alsa/pa_linux_alsa.c', line: 1331
Expression 'PaAlsaStreamComponent_FinishConfigure( &self->playback, hwParamsPlayback, outParams, self->primeBuffers, realSr, outputLatency )' failed in 'src/hostapi/alsa/pa_linux_alsa.c', line: 1889
Expression 'PaAlsaStream_Configure( stream, inputParameters, outputParameters, sampleRate, framesPerBuffer, &inputLatency, &outputLatency, &hostBufferSizeMode )' failed in 'src/hostapi/alsa/pa_linux_alsa.c', line: 1994
Couldn't open PortAudio: Unanticipated host error
Home directory /home/jfritz not ours.
Home directory /home/jfritz not ours.

The output appeared to be riddled with audio layer errors, and the Audio Settings tab of the Setup panel gave me a clue as to why:

Notice how the Audio Interface setting in that image says “Pulse Audio (bad latency!)”. I would hazard a guess that the latency issues with PulseAudio have something to do with the fact that I killed it just prior to getting this damned thing to launch. When I hit the OK button, I was able to see the application, but there was no sound.

Figuring that sound was a necessary component of an audio production application, I booted back to the Setup menu, and told the app to funnel its audio through JACK instead of PulseAudio. The JACK Audio Connection Kit is another sound subsystem, kind of like PulseAudio, that provides an API that developers can use to interface with a machine’s audio hardware. Because of its low latency performance, JACK is often considered to be the standard API for high-quality audio recording and production apps. Unfortunately, it doesn’t work worth a damn in LMMS:

jonf@THE-LINUX-EXPERIMENT:~$ sudo lmms
jackd 0.118.0
Copyright 2001-2009 Paul Davis, Stephane Letz, Jack O'Quinn, Torben Hohn and others.
jackd comes with ABSOLUTELY NO WARRANTY
This is free software, and you are welcome to redistribute it
under certain conditions; see the file COPYING for details

no message buffer overruns
JACK compiled with System V SHM support.
loading driver ..
SSE2 detected
creating alsa driver ... hw:0|hw:0|1024|2|48000|0|0|nomon|swmeter|-|32bit
control device hw:0
SSE2 detected
all 32 bit float mono audio port buffers in use!
cannot assign buffer for port
cannot deliver port registration request
no more JACK-ports available!
No audio-driver working - falling back to dummy-audio-driver
You can render your songs and listen to the output files...
Home directory /home/jfritz not ours.
Home directory /home/jfritz not ours.
the playback device "hw:0" is already in use. Please stop the application using it and run JACK again
cannot load driver module alsa
Home directory /home/jfritz not ours.

Having dealt with JACK on a previous install, I had one more trick up my sleeve in my effort to get this bastard application to make a sound. I installed the JACK Control Panel from the Ubuntu Software Centre. It’s a Qt app that interfaces with the JACK server and allows you to modify settings and stuff.


With it installed, I pressed the big green (or is it red – I’m colour blind, and hate when developers use these two colours for important status messages) Start button, only to encounter some nasty errors:


That might be a problem. I hit the messages button and found a message advising me to make a change to the /etc/security/limits.conf file so that JACK would be allowed to use realtime scheduling:

JACK is running in realtime mode, but you are not allowed to use realtime scheduling.
Please check your /etc/security/limits.conf for the following lines
and correct/add them:
@audio - rtprio 100
@audio - nice -10
After applying these changes, please re-login in order for them to take effect.
You don't appear to have a sane system configuration. It is very likely that you
encounter xruns. Please apply all the above mentioned changes and start jack again!
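For the record, appending those two lines can be done straight from a terminal rather than hand-editing the file (same values as in the message above):

echo '@audio - rtprio 100' | sudo tee -a /etc/security/limits.conf
echo '@audio - nice -10' | sudo tee -a /etc/security/limits.conf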

I figured that it was worth a shot, considering how far I’ve already gone just to try out a piece of software that I don’t really even need. I made the requested changes in the config file, restarted my machine and tried again… only to be greeted by the same damned error message.

At this point, I decided to give up on LMMS. It’s too damned complicated, and ultimately not worth my time. Perhaps when they release a version that I can install and start using without an hour of troubleshooting, I’ll come back and give it another shot. In the mean time, if you’re looking for a decent drum machine with more than a few tricks up its sleeve, check out Hydrogen Drum Machine. It works very well, and I’ve created some neat stuff in it.

How to Compile Banshee 1.9.0 on Ubuntu 10.04

December 9th, 2010 1 comment

Regular readers of this site will know that I’m no fan of Rhythmbox. When I recently installed Ubuntu 10.04 on my desktop PC, I decided to give Gnome’s default media player a few days to win back my affection. Unfortunately, while Novell’s Banshee project appears to be moving ahead with lots of great new features, Rhythmbox still suffers from the issues that I outlined in my now infamous lambasting of it, nearly 8 months ago. To be fair, the pre-installed version of Rhythmbox is only 0.12.8 on Ubuntu 10.04 (the same one that I reviewed previously), while the project has forged ahead to version 0.13.2.

Regardless, I prefer to listen to my music with Banshee, and I’m itching to try the latest version. On November 10th, the project released Banshee 1.9.0, and it looks positively excellent. I decided to give it a go, and downloaded the source tarball from the project’s website. Following are the steps that were necessary to install it:

  1. Head over to a terminal and install intltool, libgtk2.0-dev, libgtk2.0-cil-dev, libmono-dev, mono-gmcs, libmono-addins-cil-dev, monodoc-base, boo, libboo-cil-dev, libmono-addins-gui-cil-dev, libndesk-dbus-glib1.0-cil-dev, libgdata-dev, libgdata-cil-dev, libtag1-dev, libtaglib-cil-dev, sqlite3, libsqlite3-dev, libgconf2.0-cil-dev, libmtp-dev, libmono-zeroconf1.0-cil, libmono-zeroconf1.0-cil-dev, libwebkit-dev, libwebkit-cil-dev, and libsoup-gnome2.4-dev with the following command:

    sudo apt-get install intltool libgtk2.0-dev libgtk2.0-cil-dev libmono-dev mono-gmcs libmono-addins-cil-dev libmono-addins-gui-cil-dev monodoc-base boo libboo-cil-dev libndesk-dbus-glib1.0-cil-dev libgdata-dev libgdata-cil-dev libtag1-dev libtaglib-cil-dev sqlite3 libsqlite3-dev libgconf2.0-cil-dev libmtp-dev libmono-zeroconf1.0-cil libmono-zeroconf1.0-cil-dev libwebkit-dev libwebkit-cil-dev libsoup-gnome2.4-dev

  2. Next, you’ll need GStreamer and a few of its base plugin packages: libgstreamer0.10-dev and libgstreamer-plugins-base0.10-dev

    sudo apt-get install libgstreamer0.10-dev libgstreamer-plugins-base0.10-dev

  3. If you want to play music encoded in non-free formats like mp3, you’ll also need a few restricted GStreamer plugin packages: gstreamer0.10-plugins-good, gstreamer0.10-plugins-bad, gstreamer0.10-plugins-bad-multiverse, gstreamer0.10-plugins-ugly, and gstreamer0.10-plugins-ugly-multiverse.

    sudo apt-get install gstreamer0.10-plugins-good gstreamer0.10-plugins-bad gstreamer0.10-plugins-bad-multiverse gstreamer0.10-plugins-ugly gstreamer0.10-plugins-ugly-multiverse

  4. Since I don’t have an iPod or similar Apple device, I’ve configured my installation to disable Apple device support. If you have an iPod, you can lose the --disable-appledevice and --disable-ipod flags after the configure command, but you’ll also need to add a couple of extra libraries to your system. To compile and install Banshee, navigate to the folder where you unzipped the tarball, and type the following in your terminal:

    ./configure --disable-appledevice --disable-ipod
    sudo make
    sudo make install

Banshee should now be installed. From your terminal, type

banshee-1

as a sanity check. Once the application launches, select Help > About and ensure that the version number is 1.9.0. If so, you should be good to go.

I’ll try to post a full review of this latest version of Banshee within a couple of days. In the mean time, happy listening!

Create a GTK+ application on Linux with Objective-C

December 8th, 2010 8 comments

As a sort of follow-up in spirit to my older post, I decided to share a really straightforward way to use Objective-C to build GTK+ applications.

Objective-what?

Objective-C is a superset of the iconic C programming language that remains backwards compatible while adding many new and interesting features. Chief among these additions is syntax for real objects (and thus object-oriented programming). Popularized by NeXT and eventually Apple, Objective-C is most commonly seen in development for Apple OSX and iOS based platforms. It can be used with or without a large standard library (sometimes referred to as the Foundation Kit library) that makes it very easy for developers to quickly create fast and efficient programs. The result is a language that compiles down to binary, requires no virtual machines (just a runtime library), and achieves performance comparable to C and C++.

Marrying Objective-C with GTK+

Normally when writing a GTK+ application the language (or a library) will supply you with bindings that let you create GUIs in a way native to that language. So for instance in C++ you would create GTK+ objects, whereas in C you would create structures or ask functions for pointers back to the objects. Unfortunately while there used to exist a couple of different Objective-C bindings for GTK+, all of them are quite out of date. So instead we are going to rely on the fact that Objective-C is backwards compatible with C to get our program to work.

What you need to start

I’m going to assume that Ubuntu will be our operating system for development. To ensure that we have what we need to compile the programs, just install the following packages:

  1. gnustep-core-devel
  2. libgtk2.0-dev

As you can see from the list above we will be using GNUstep as our Objective-C library of choice.

Setting it all up

In order to make this work we will be creating two Objective-C classes, one that will house our GTK+ window and another that will actually start our program. I’m going to call my GTK+ object MainWindow and create the two necessary files: MainWindow.h and MainWindow.m. Finally I will create a main.m that will start the program and clean it up after it is done.

Let me apologize here for the poor code formatting; apparently WordPress likes to destroy whatever I try and do to make it better. If you want properly indented code please see the download link below.

MainWindow.h

In the MainWindow.h file put the following code:

#import <gtk/gtk.h>
#import <Foundation/NSObject.h>
#import <Foundation/NSString.h>

//A pointer to this object (set on init) so C functions can call
//Objective-C functions
id myMainWindow;

/*
* This class is responsible for initializing the GTK render loop
* as well as setting up the GUI for the user. It also handles all GTK
* callbacks for the winMain GtkWindow.
*/
@interface MainWindow : NSObject
{
//The main GtkWindow
GtkWidget *winMain;
GtkWidget *button;
}

/*
* Constructs the object and initializes GTK and the GUI for the
* application.
*
* *********************************************************************
* Input
* *********************************************************************
* argc (int *): A pointer to the arg count variable that was passed
* in at the application start. It will be returned
* with the count of the modified argv array.
* argv (char *[]): A pointer to the argument array that was passed in
* at the application start. It will be returned with
* the GTK arguments removed.
*
* *********************************************************************
* Returns
* *********************************************************************
* MainWindow (id): The constructed object or nil
* arc (int *): The modified input int as described above
* argv (char *[]): The modified input array modified as described above
*/
-(id)initWithArgCount:(int *)argc andArgVals:(char *[])argv;

/*
* Frees the Gtk widgets that we have control over
*/
-(void)destroyWidget;

/*
* Starts and hands off execution to the GTK main loop
*/
-(void)startGtkMainLoop;

/*
* Example Objective-C function that prints some output
*/
-(void)printSomething;

/*
********************************************************
* C callback functions
********************************************************
*/

/*
* Called when the user closes the window
*/
void on_MainWindow_destroy(GtkObject *object, gpointer user_data);

/*
* Called when the user presses the button
*/
void on_btnPushMe_clicked(GtkObject *object, gpointer user_data);

@end

MainWindow.m

For the class’ actual code file, fill it in as shown below. This class will create a GTK+ window with a single button and will react to both the user pressing the button, and closing the window.

#import "MainWindow.h"

/*
* For documentation see MainWindow.h
*/

@implementation MainWindow

-(id)initWithArgCount:(int *)argc andArgVals:(char *[])argv
{
//call parent class’ init
if (self = [super init]) {

//setup the window
winMain = gtk_window_new (GTK_WINDOW_TOPLEVEL);

gtk_window_set_title (GTK_WINDOW (winMain), "Hello World");
gtk_window_set_default_size(GTK_WINDOW(winMain), 230, 150);

//setup the button
button = gtk_button_new_with_label ("Push me!");

gtk_container_add (GTK_CONTAINER (winMain), button);

//connect the signals
g_signal_connect (winMain, "destroy", G_CALLBACK (on_MainWindow_destroy), NULL);
g_signal_connect (button, "clicked", G_CALLBACK (on_btnPushMe_clicked), NULL);

//force show all
gtk_widget_show_all(winMain);
}

//assign C-compatible pointer
myMainWindow = self;

//return pointer to this object
return self;
}

-(void)startGtkMainLoop
{
//start gtk loop
gtk_main();
}

-(void)printSomething{
NSLog(@"Printed from Objective-C's NSLog function.");
printf("Also printed from standard printf function.\n");
}

-(void)destroyWidget{

myMainWindow = NULL;

if(GTK_IS_WIDGET (button))
{
//clean up the button
gtk_widget_destroy(button);
}

if(GTK_IS_WIDGET (winMain))
{
//clean up the main window
gtk_widget_destroy(winMain);
}
}

-(void)dealloc{
[self destroyWidget];

[super dealloc];
}

void on_MainWindow_destroy(GtkObject *object, gpointer user_data)
{
//exit the main loop
gtk_main_quit();
}

void on_btnPushMe_clicked(GtkObject *object, gpointer user_data)
{
printf(“Button was clicked\n”);

//call Objective-C function from C function using global object pointer
[myMainWindow printSomething];
}

@end

main.m

To finish I will write a main file and function that creates the MainWindow object and eventually cleans it up. Objective-C (1.0) does not support automatic garbage collection so it is important that we don’t forget to clean up after ourselves.

#import "MainWindow.h"
#import <Foundation/NSAutoreleasePool.h>

int main(int argc, char *argv[]) {

//create an AutoreleasePool
NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];

//init gtk engine
gtk_init(&argc, &argv);

//set up GUI
MainWindow *mainWindow = [[MainWindow alloc] initWithArgCount:&argc andArgVals:argv];

//begin the GTK loop
[mainWindow startGtkMainLoop];

//free the GUI
[mainWindow release];

//drain the pool
[pool release];

//exit application
return 0;
}

Compiling it all together

Use the following command to compile the program. This will automatically include all .m files in the current directory so be careful when and where you run this.

gcc `pkg-config --cflags --libs gtk+-2.0` -lgnustep-base -fconstant-string-class=NSConstantString -o "./myprogram" $(find . -name '*.m') -I /usr/include/GNUstep/ -L /usr/lib/GNUstep/ -std=c99 -O3

Once complete you will notice a new executable in the directory called myprogram. Start this program and you will see our GTK+ window in action.

If you run it from the command line you can see the output that we coded when the button is pushed.

Wrapping it up

There you have it. We now have a program that is written in Objective-C, using C’s native GTK+ ‘bindings’ for the GUI, that can call both regular C and Objective-C functions and code. In addition, thanks to the porting of both GTK+ and GNUstep to Windows, this same code will also produce a cross-platform application that works on both Mac OSX and Windows.

Source Code Downloads

Source Only Package
File name: objective_c_gtk_source.zip
File hashes: Download Here
File size: 2.4KB
File download: Download Here

Originally posted on my personal website here.

Django Development on Ubuntu 10.04

December 8th, 2010 2 comments

When I’m not rocking out my ninja-like linux skillz here at The Linux Experiment, I like to spend my spare time working on SlightlySauced, a weekly round table podcast. When we started the show, we chose to host it on a simple Tumblr blog, because it offered a fast setup experience and didn’t require much additional configuration to work well enough for our purposes. In light of this week’s Tumblr outages, we’ve decided to move the show off of the cloud and onto the same hosting provider that this site resides on.

Since I find myself with a little bit of spare time recently, I’ve also decided to write a custom site for the show using Django, my new favourite web framework. If you’re interested in trying your hand at Django development (and honestly, if you’re doing web development of any kind, why haven’t you tried it yet?), you can follow along with my progress here.

Step 1: Installing MySql

Because Django is a Python-based web framework, it can use Python’s built-in SQLite support out of the box. My web host of choice provides solid MySQL support, so I’ve decided to swap out SQLite for MySql. This requires that I install a local MySQL server for development purposes. Ubuntu has posted some handy documentation that I followed loosely. I’ll repeat the relevant steps here for posterity and ease of use.

From your terminal, type:

sudo apt-get install mysql-server

During the installation process, you’ll be prompted to enter a password for MySql’s root user account. If your server is going to be public-facing, it’s a good idea to enter a strong password. If it’s just for development purposes, you can probably use something weaker and easier to type.

Once the installation has finished, check that your server is running by typing:

sudo netstat -tap | grep mysql

This command should output something like the following:

tcp     0     0     localhost:mysql     *:*     LISTEN 2556/mysqld

Note: This command didn’t actually work for me. I had to remove the pipe and type just

sudo netstat -tap

and then search the resulting list for the MySql entry. I found it easily enough, and was convinced that the daemon was running and waiting for clients.
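If the piped version comes up empty for you as well, matching on the port number rather than the service name is another quick way to confirm that the daemon is listening:

sudo netstat -tlnp | grep 3306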

Step 2: MySQL Workbench (Optional)

Once your MySql daemon is up and running, you could edit the /etc/mysql/my.cnf file to alter its configuration. Instead, I opted to use MySQL Workbench, a decent graphical management tool that is distributed by Oracle (the same folks who make MySql). I’ve used it extensively at work, so I’m familiar with it and comfortable with its quirks. If you care to use it, you’ll have to grab it from Oracle’s website, as it’s not in the Ubuntu repositories. Luckily, Oracle provides a Ubuntu 10.04 64-bit *.deb that can be easily installed with GDebi. For those who care about such things, MySQL Workbench is a fully OSS GPL-licensed product, so there’s no funny stuff going on with regards to licensing.

With MySQL Workbench up and running, you’ll be presented with a screen like this one:

Click on New Connection under the SQL Development column in the bottom left of the screen, and enter the connection details of your local MySql server. It should be available via the loopback IP 127.0.0.1 on port 3306. The default username is root, and the password is whatever you set during the installation process. Once you get access, you can create a new schema and fire a few commands at it to test your setup.

Head back over to the Home tab and click on New Server Instance under the Server Administration column at the bottom right of the screen. In the dialog that pops up, select Take Parameters from Existing Database Connection and hit Next a bunch of times. The resulting window is a full MySQL daemon monitoring window that details traffic, the number of connections to the server, etc. More importantly, it allows you to set up user accounts and change configuration variables from a handy graphical front end instead of wading through MySQL’s extensive configuration files.

I headed over to the Accounts tab and created a user account for Django. At this stage of development, you’ll want to give this account full root access to the database, as Django will automatically create and drop schemas and tables as you code your website. Once development is done, you can pare these down to only those that are necessary.
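If you would rather skip the GUI for this step, the equivalent can be done from the mysql command-line client (the user name and password below are placeholders, obviously):

mysql -u root -p -e "CREATE USER 'django'@'localhost' IDENTIFIED BY 'changeme'; GRANT ALL PRIVILEGES ON *.* TO 'django'@'localhost'; FLUSH PRIVILEGES;"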

Step 3: Installing Django

Holy crap, that was a lot of work, and we haven’t even gotten our framework of choice installed yet! Let’s get on with that. The project has some excellent documentation on this issue. I’ll repeat the basic steps here for your convenience, but strongly suggest that you read through the full instruction set if you encounter any issues or want to perform a customized installation.

Since Django is a python-based framework, you’ll need to make sure that you have a compatible version of Python installed on your system. At the time of writing, Ubuntu 10.04 ships with Python version 2.6.5. Django only works with Python versions 2.4 through 2.7. If you’re not running Ubuntu 10.04, you can check which version you have installed by typing

python --version

in your terminal. Once you’ve ensured that you have a compatible Python version installed, type

sudo apt-get install python-django

in your terminal to install version 1.1.1 of the framework from your repositories. Once the installation has finished, you should check the installed version. Since Django lives inside of python, you’ll need to start a python terminal by typing

python

in your terminal. Once started, type

import django
print django.get_version()

If you don’t see any horrendous-looking error messages, you’re good to go. As a side note, if you type

apt-cache search django

you’ll see that the Ubuntu repositories include quite a few handy Django plugins and applications that you might want to use in your projects, including a URL shortener, a user-registration module, and a contact form. Each of these can be installed on your system and included in any Django project quite easily. I’ll probably end up using one or more in my project to save me some time.

Finally, you’ll need to install an extra database connector for python in order to use MySql from within Django. In Ubuntu 10.04, this package is called python-mysqldb.
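As with everything else so far, it’s a one-liner:

sudo apt-get install python-mysqldb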

Step 4: Write Some Code!

So you’re up and running. If you’re not familiar with Django, I suggest that you run through their online tutorial. It’s well-written and provides a great introduction to some of the stuff that the framework can do.

Whatever you do, have fun! In my experience, Django makes web development a pleasure because it takes care of a lot of the nitty-gritty crap for you and lets you get on with solving harder problems.

Let me know what you think in the comments.

Edit: Added an extra database connector package that’s necessary if you want to use MySql with Django.

Setting up an Ubuntu-based ASP.NET Server with Mono

November 21st, 2010 5 comments

Introduction:

In my day job, I work as an infrastructure developer for a small company. While I wouldn’t call us a Microsoft shop by any stretch (we actually make web design tools), we do maintain a large code base in C#, which includes our website and a number of web-based administrative tools. In planning for a future project, I recently spent some time figuring out how to host our existing ASP.NET-based web site on a Linux server. After a great deal of research, and just a bit of trial and error, I came up with the following steps:

VirtualBox Setup:

The server is going to run in a virtual machine, primarily because I don’t have any available hardware to throw at the problem right now. This has the added benefit of being easily expandable, and our web hosting company will actually accept *.vdi files, which allows us to easily pick up the finished machine and put it live with no added hassle. In our case, the host machine was a Windows Server 2008 machine, but these steps would work just as well on a Linux host.

I started off with VirtualBox 3.2.10 r66523, although like I said, grabbing the OSE edition from your repositories will work just as well. The host machine that we’re using is a bit underpowered, so I only gave the virtual machine 512MB of RAM and 10GB of dynamically expanding storage. One important thing – because I’ll want this server to live on our LAN and interact with our other machines, I was careful to change the network card settings to Bridged Adapter and to make sure that the Ethernet adapter of the host machine is selected in the hardware drop down. This is important because we want the virtual machine to ask our office router for an IP address instead of using the host machine as a private subnet.

Installing the Operating System:

For the initial install, I went with the Ubuntu 10.10 Maverick Meerkat 32-bit Desktop Edition. Any server admins reading this will probably pull out their hair over the fact, but in our office, we have administrators who are very used to using Windows’ Remote Desktop utility to log into remote machines, and I don’t feel like training everybody on the intricacies of PuTTy and SSH. If you want to, you can install the Server version instead, and forgo all of the additional overhead of a windowing system on your server. Since all of my installation was done from the terminal, these instructions will work just as well with or without a GUI.

From VirtualBox, you’ll want to mount the Ubuntu ISO in the IDE CD-ROM drive, and start the machine. When prompted, click your way through Ubuntu’s slick new installer, and tell it to erase and use the entire disk, since we don’t need any fancy partitioning for this setup. When I went through these steps, I opted to encrypt the home folder of the VM, mostly out of habit, but that’s up to you. Once you make it to a desktop, install VirtualBox Guest Additions.

From a terminal, run sudo apt-get update followed by sudo apt-get upgrade to apply any patches that might be available.

Setting up a Static IP Address:

From a terminal, type ifconfig and find the HWaddr entry for your ethernet card, usually eth0. It will probably look something like 08:00:27:1c:17:6c. Next, you’ll need to log in to your router and set it up so that any device with this hardware address (also called a MAC address) is always given the same IP address. In my case, I chose to assign the virtual server an IP address of 192.168.1.10 because it was easy to remember. There are other ways that you can go about setting up a static IP, but I find this to be the easiest.
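If you can’t (or don’t want to) touch the router, the same thing can be accomplished on the guest itself by giving eth0 a static configuration in /etc/network/interfaces. Something along these lines, with the netmask and gateway adjusted to match your own network, followed by sudo /etc/init.d/networking restart:

auto eth0
iface eth0 inet static
    address 192.168.1.10
    netmask 255.255.255.0
    gateway 192.168.1.1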

Getting Remote Desktop support up and running:

As I mentioned above, the guys in our office are used to administering remote machines by logging in via Windows’ remote desktop client. In order to provide this functionality, I chose to set up the xrdp project on my little server. Installing this is as easy as typing sudo apt-get install xrdp in your terminal. The installation process will also require the vnc4server and xbase-clients packages.

When the installation has completed, the xrdp service will run on startup and will provide an encrypted remote desktop server that runs on port 3389. From Windows, you can now connect to 192.168.1.10 with the standard rdp client. When prompted for login, make sure that sesman-Xvnc is selected as the protocol, and you should be able to log in with the username and password combination that you chose above.

Installing a Graphical Firewall Utility:

Ubuntu ships with a firewall baked into the kernel that can be accessed from the terminal with the ufw tool. Because some of our administrators are afraid of the command line, I also chose to install a graphical firewall manager. In the terminal, type sudo apt-get install gufw to install an easy to use gui for the firewall. Once complete, it will show up in the standard Gnome menu system under System > Administration > Firewall Configuration.

Let’s do a bit of setup. Open up the Firewall Configuration utility, and check off the box to enable the firewall. Below that box, make sure that all incoming traffic is automatically denied while all outgoing is allowed. These rules can be tightened up later, but are a good starting point for now. To allow incoming remote desktop connections, you’ll need to create a new rule to allow all TCP connections on port 3389. If this server is to be used on the live Internet, you may also consider limiting the IP addresses that these connections can come from so that not just anybody can log in to your server. Remember, defense in depth is your best friend.
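For the command-line inclined, the same starting rules can be set with ufw directly; gufw is really just a front end for these:

sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow 3389/tcp
sudo ufw enable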

Adding SSH Support:

Unlike my coworkers, I prefer to manage my server machines via command line. As such, an SSH server is necessary. Later, the SSH connection can be used for SFTP or a secure tunnel over which we can communicate with our source control and database servers. In terminal, type sudo apt-get install openssh-server to start the OpenSSH installation process. Once it’s done, you’ll want to back up its default configuration file with the command cp /etc/ssh/sshd_config /etc/ssh/sshd_config_old. Next, open up the config file in your text editor of choice (mine is nano) and change a couple of the default options:

  • Change the Port to 5000, or some other easy to remember port. Running an SSH server on port 22 can lead to high discoverability, and is regarded by some as a security no-no.
  • Change PermitRootLogin to no. This will ensure that only normal user accounts can log in.
  • At the end of the file, add the line AllowUsers <your-username> to limit the user accounts that can log in to the machine. It is good practice to create a user account with limited privileges and only allow it to log in via SSH. This way, if an attacker does get in, they are limited in the amount of damage that they can do.
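
Taken together, the relevant lines of /etc/ssh/sshd_config end up looking something like this:

Port 5000
PermitRootLogin no
AllowUsers <your-username>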

Back in your terminal, type sudo /etc/init.d/ssh restart to load the new settings. Using the instructions above, open up your firewall utility and create a new rule to allow all TCP connections on port 5000. Once again, if this server is to be used on the live Internet, it’s a good idea to limit the IP addresses that this traffic can originate from.

With this done, you can log in to the server from any other Linux-based machine using the ssh command in your terminal. From Windows, you’ll need a third-party utility like PuTTy.

Installing Apache and ModMono:

For simplicity’s sake, we’ll install both Apache (the web server) and mod_mono (a module responsible for processing ASP.NET requests) from Ubuntu’s repositories. The downside is that the code base is a bit older, but the upside is that everything should just work, and the code is stable. These instructions are a modified version of the ones found on the HBY Consultancy blog. Credit where credit is due, after all. From your terminal, enter the following:

$ sudo apt-get install monodevelop mono-devel monodevelop-database mono-debugger mono-xsp2 libapache2-mod-mono mono-apache-server2 apache2

$ sudo a2dismod mod_mono

$ sudo a2enmod mod_mono_auto

With this done, Apache and mod_mono are installed. We’ll need to do a bit of configuration before they’re ready to go. Open up mod_mono’s configuration file in your text editor of choice with something like sudo nano /etc/apache2/mods-available/mod_mono_auto.conf. Scroll down to the bottom and append the following text to the file:

MonoPath default "/usr/lib/mono/3.5"

MonoServerPath default /usr/bin/mod-mono-server2

AddMonoApplications default "/:/var/www"

Finally, restart the Apache web server so that the changes take effect with the command sudo /etc/init.d/apache2 restart. This configuration will allow us to run aspx files out of our /var/www/ directory, just like html or php files that you may have seen hosted in the past.
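To confirm that the whole chain is working, it’s handy to drop a trivial page into the web root and load it in a browser (the file name and contents here are just an example):

echo '<%@ Page Language="C#" %><% Response.Write("Hello from Mono at " + System.DateTime.Now); %>' | sudo tee /var/www/test.aspx

Browsing to http://192.168.1.10/test.aspx should print the greeting; if it does, mod_mono is wired up correctly.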

Having a Beer:

That was a fair bit of work, but I think that it was worth it. If everything went well, you’ve now got a fully functional Apache web server that’s reasonably secure, and can run any ASP.NET code that you throw at it.

The one hiccup that I encountered with this setup was that Mono doesn’t yet have support for .NET’s Entity Framework, which is the object-relational mapping framework that we use as a part of our database stack on the application that we wanted to host. This means that if I want to host the existing code on Linux, I’ll have to modify it so that it uses a different database back end. It’s kind of a pain, but not the end of the world, and certainly a situation that can be avoided if you’re coding up a website from scratch. You can read more about the status of Mono’s ASP.NET implementation on their website.

Hopefully this helped somebody. Let me know in the comments if there’s anything that isn’t quite clear or if you encounter any snags with the process.