Archive for the ‘Linux’ Category

Using screen to keep your terminal sessions alive

December 15th, 2016

Have you ever connected to a remote Linux computer over a… let’s say less than ideal WiFi connection, and started running a command only to have your ssh connection drop and your command killed off in a half-finished state? In the best of cases this is simply annoying, but if it happens during something of consequence, like a system upgrade, it can leave you in a dire state. Thankfully there is a really simple way to prevent this from happening.

Enter: screen

screen is a simple terminal application that basically allows you to create additional persistent terminals. So instead of ssh-ing into your computer and running the command directly in that session, you can start a screen session and then run your commands within that. If your connection drops you lose your view of the ‘screen’, but the session itself continues uninterrupted on the remote computer. Then you can simply re-connect and resume the screen.

Explain with an example!

OK, fine. Let’s say I want to write a document over ssh. First connect to the computer, then start your favourite text editor and begin typing:

ssh user@computer
user@computer’s password:

user@computer: nano doc.txt

What a wonderful document!

Now if I lost my connection at this point all of my hard work would also be lost because I hadn’t saved it yet. Instead, let’s say I used screen:

ssh user@computer
user@computer’s password:

user@computer: screen

Welcome to screen!

Now with screen running I can just use my terminal like normal and write my story. But oh no, I lost my connection! Now what will I do? Well, simply re-connect and re-run screen, telling it to resume the previous session.

ssh user@computer
user@computer’s password:

user@computer: screen -r
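
A few other screen invocations are also worth knowing. These are standard options, though it never hurts to double-check man screen on your system:

screen -S writing     # start a new session with a friendly name
screen -ls            # list the sessions currently running on this computer
screen -r writing     # resume the named session after a disconnect

You can also detach from a running session on purpose, without killing it, by pressing Ctrl-a followed by d.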

Voila! There you have it – a simple way to somewhat protect your long-running terminal applications from random network disconnects.


Blast from the Past: Phoenix Rising

December 14th, 2016

This post was originally published on December 12, 2009. The original can be found here.


As we prepare to bring The Linux Experiment to a close over the coming weeks, I find that this has been a time of (mostly solemn) reflection for myself and others. At the very least, it’s been an interesting experience with various flavours of Linux and what it has to offer. At its peak, it’s been a roller-coaster of controversial posts (my bad), positive experiences, and the urge to shatter our screens into pieces.

Let me share with you some of the things I’ve personally taken away from this experiment over the last three-and-a-half months.

Fedora 12 is on the bleeding edge of Linux development

This has been a point of discussion on both of our podcasts at this point, and a particular sore spot with both myself and Tyler. It’s come to a place wherein I’m sort of… afraid to perform updates to my system out of fear of just bricking it entirely. While this is admittedly something that could happen under any operating system and any platform, it’s never been as bad for me as it has been under Fedora 12.

As an example, the last *six* kernel updates for me to both Fedora 11 and 12 combined have completely broken graphics capability with my adapter (a GeForce 8600M GS). Yes, I know that the Fedora development team is not responsible for ensuring that my graphics card works with their operating system – but this is not something the average user should have to worry about. Tyler has also had this issue, and I think would tend to agree with me.

Linux is fun, too

There have been so many frustrating moments over the last four months that I have been tempted to just format everything and go back to my native Windows 7 (previously the release candidate, now the RTM). Through all of this, though, Fedora – and Linux in general – has never stopped interesting me.

This could just be due to the fact that I’ve been learning so much – I can definitely do a lot more now than I ever could before under a Linux environment, and am reasonably pleased with this – but I’ve never sat down at my laptop and been bored while playing around with getting stuff to work. In addition, with some software (such as Wine or CrossOver) I’ve been able to get a number of Windows games working as well. Linux can play, too!

Customizing my UI has also been a very nice experience. It looks roughly like Sasha’s now – no bottom panel, GnomeDo with Docky, and Compiz effects… it’s quite pretty now.

There’s always another way

If there’s one thing I’ve chosen to take away from this experiment it’s that there is ALWAYS some kind of alternative to any of my problems, or anything I can do under another platform or operating system. Cisco VPN client won’t install under Wine, nor will the Linux client version? BAM, say hello to vpnc.

Need a comprehensive messaging platform with support for multiple services? Welcome Pidgin into the ring.

No, I still can’t do everything I could do in Windows… but I’m sure, given enough time, I could make Fedora 12 an extremely viable alternative to Windows 7 for me.

The long and short of it

There’s a reason I’ve chosen my clever and rather cliché title for this post. According to lore, a phoenix is a bird that rises from its own ashes in a cycle of rebirth after igniting its nest at the end of its life. Such is the case for Fedora 12 and my experience with Linux.

At this point, I could not see myself continuing my tenure with the Fedora operating system. For a Linux user with my relatively low level of experience, it is too advanced and too likely to brick itself with a round of updates to be viable for me. Perhaps after quite a bit more experience with Linux on the whole, I could revisit it – but not for a good long while. This is not to say it’s unstable – it’s been rock solid, never crashing once – but it’s just not for me.

To that end, Fedora 12 rests after a long and interest-filled tenure with me. Rising from the ashes is a new user in the world of Linux – me. I can say with confidence that I will be experimenting with Linux distributions in the future – maybe dipping my feet in the somewhat familiar waters of Ubuntu once more before wading into the deep-end.

Watch out, Linux community… here I come.


Blast from the Past: Coming to Grips with Reality

December 12th, 2016

This post was originally published on December 8, 2009. The original can be found here.


The following is a cautionary tale about putting more trust in the software installed on your system than in your own knowledge.

Recently, while preparing for a big presentation that relied on me running a Java applet in Iceweasel, I discovered that I needed to install an additional package to make it work. This being nothing out of the ordinary, I opened up a terminal, and used apt-cache search to locate the package in question. Upon doing so, my system notified me that I had well over 50 ‘unnecessary’ packages installed. It recommended that I take care of the issue with the apt-get autoremove command.

Bad idea.

On restart, I found that my system was virtually destroyed. It seemed to start X11, but refused to give me either a terminal or a gdm login prompt. After booting into Debian’s rescue mode and messing about in the terminal for some time trying to fix a few circular dependencies and get my system back, I decided that it wasn’t worth my time, backed up my files with an Ubuntu live disk, and reinstalled from a netinst nightly build disk of the testing repositories. (Whew, that was a long sentence)

Unfortunately, just as soon as I rebooted from the install, I found that my system lacked a graphical display manager, and that I could only log in to my terminal, even though I had explicitly told the installer to add GNOME to my system. I headed over to #debian for some help, and found out that the testing repositories were broken, and that my system lacked gdm for some unknown reason. After following their instructions to work around the problem, I got my desktop back, and once more have a fully functioning system.

The moral of the story is a hard one for me to swallow. You see, I have come to the revelation that I don’t know what I’m doing. Over the course of the last 3 months, I have learned an awful lot about running and maintaining a Linux system, but I still lack the ability to fix even the simplest of problems without running for help. Sure, I can install and configure a Debian box like nobody’s business, having done it about 5 times since this experiment started; but I still lack the ability to diagnose a catastrophic failure and to recover from it without a good dose of help. I have also realized something that, as a software developer, I already knew and should have been paying attention to when I used that fatal autoremove command – when something seems wrong, trust your instincts over your software, because they’re usually correct.
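
One habit worth adopting from this: apt-get can simulate an operation before committing to it. A dry run like the one below (the -s flag is a standard apt-get option) shows exactly which packages autoremove would take with it, without touching anything:

apt-get -s autoremove    # -s / --simulate: print the planned removals without performing them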

This entire experiment has been a huge learning experience for me. I installed an operating system that I had never used before, and eschewed the user-friendly Ubuntu for Debian, a distribution that adheres strictly to free software ideals and isn’t nearly as easy for beginners to use. That done, after a month of experience, I switched over from the stable version of Debian to the testing repositories, figuring that it would net me some newer software that occasionally worked better (especially in the case of Open Office and Gnome Network Manager), and some experience with running a somewhat less stable system. I certainly got what I wished for.

Overall, I don’t regret a thing, and I intend to keep the testing repositories installed on my laptop. I don’t usually use it for anything but note taking in class, so as long as I back it up regularly, I don’t mind if it breaks on occasion; I enjoy learning new things, and Debian keeps me on my toes. In addition, I think that I’ll install Kubuntu on my desktop machine when this whole thing is over. I like Debian a lot, but I’ve heard good things about Ubuntu and its variants, and feel that I should give them a try now that I’ve had my taste of what a distribution that isn’t written with beginners in mind is like. I have been very impressed by Linux, and have no doubts that it will become a major part of my computing experience, if not replacing Windows entirely – but I recognize that I still have a long way to go before I’ve really accomplished my goals.

As an afterthought: If anybody is familiar with some good tutorials for somebody who has basic knowledge but needs to learn more about what’s going on below the surface of a Linux install, please recommend them to me.


Blast from the Past: Top 10 things I have learned since the start of this experiment

December 9th, 2016

This post was originally published on October 2, 2009. The original can be found here.


In a nod to Dave’s classic top ten segment I will now share with you the top 10 things I have learned since starting this experiment one month ago.

10: IRC is not dead

Who knew? I’m joking of course but I had no idea that so many people still actively participated in IRC chats. As for the characters who hang out in these channels… well some are very helpful and some… answer questions like this:

Tyler: Hey everyone. I’m looking for some help with Gnome’s Empathy IM client. I can’t seem to get it to connect to MSN.

Some asshat: Tyler, if I wanted a pidgin clone, I would just use pidgin

It’s this kind of ‘you’re doing it wrong because that’s not how I would do it’ attitude that can be very damaging to new Linux users. There is nothing more frustrating than trying to get help and having someone throw BS like that back in your face.

9: Jokes about Linux for nerds can actually be funny

Stolen from Sasha’s post.

Admit it, you laughed too

8. Buy hardware for your Linux install, not the other way around

Believe me, if you know that your hardware is going to be 100% compatible ahead of time you will have a much more enjoyable experience. At the start of this experiment Jon pointed out this useful website. Many similar sites also exist and you should really take advantage of them if you want the optimal Linux experience.

7. When it works, it’s unparalleled

Linux seems faster, more fully featured and less resource-hungry than a comparable operating system from either Redmond or Cupertino. That is, assuming it’s working correctly…

6. Linux seems to fail for random or trivial reasons

If you need proof of this just go take a look back at the last couple of posts on here. There are times when I really think Linux could be used by everyone… and then there are moments when I don’t see how anyone outside of the most hardcore computer users could ever even attempt it. A brand new user should not have to know about xorg.conf or how to edit their DNS resolver.

Mixer - buttons unchecked

5. Linux might actually have a better game selection than the Mac!

Obviously there was some jest in there, but Linux really does have some gems for games out there. Best of all, most of them are completely free! Then again, some are free for a reason…

Armagetron

4. A Linux distribution defines a lot of your user experience

This can be especially frustrating when the exact same hardware performs so differently. I know there are a number of technical reasons why this is the case but things seem so utterly inconsistent that a new Linux user paired with the wrong distribution might be easily turned off.

3. Just because it’s open source doesn’t mean it will support everything

Even though it should damn it! The best example I have for this happens to be MSN clients. Pidgin is by far my favourite as it seems to work well and even supports a plethora of useful plugins! However, unlike many other clients, it doesn’t support a lot of MSN features such as voice/video chat, reliable file transfers, and those god awful winks and nudges that have appeared in the most recent version of the official client. Is there really that good of a reason holding the Pidgin developers back from just making use of the other open source libraries that already support these features?

2. I love the terminal

I can’t believe I actually just said that but it’s true. On a Windows machine I would never touch the command line because it is awful. However on Linux I feel empowered by using the terminal. It lets me quickly perform tasks that would otherwise take a lot of mouse clicks through a cumbersome UI.

And the #1 thing I have learned since the start of this experiment? Drum roll please…

1. Linux might actually be ready to replace Windows for me

But I guess in order to find out if that statement ends up being true you’ll have to keep following along 😉


KWLUG: C Language, WebOS (2016-12)

December 8th, 2016

This is a podcast presentation from the Kitchener Waterloo Linux Users Group on the topic of C Language, WebOS published on December 6th 2016. You can find the original Kitchener Waterloo Linux Users Group post here.


KWLUG: OpenWRT customization (2016-11)

December 7th, 2016

This is a podcast presentation from the Kitchener Waterloo Linux Users Group on the topic of OpenWRT customization published on December 6th 2016. You can find the original Kitchener Waterloo Linux Users Group post here.


Blast from the Past: How is it doing that?

December 7th, 2016

This post was originally published on December 15, 2009. The original can be found here.


Just about everything that I’ve ever read about media playback on Linux has been negative. As I understand the situation, the general consensus of the internet is that Linux should not be relied on to play media of any kind. Further, I know that the other guys have had troubles with video playback in the past.

All of which added up to me being extremely confused when I accidentally discovered that my system takes video playback like a champ. Now from the outset, you should know that my system is extremely underpowered where high definition video playback is concerned. I’m running Debian testing on a laptop with a 1.73 GHz single-core processor, 758MB shared video RAM, and a 128MB Intel GMA 900 integrated graphics card.

Incredibly enough, it turns out that this humble setup is capable of playing almost every video file that I can find, even with compiz effects fully enabled and just a base install of vlc media player.

Most impressively, the machine can flawlessly stream a 1280x528px 1536kb/s *.mkv file over my wireless network.

As a comparison, I have a Windows Vista machine with a 2.3GHz processor, 4GB of RAM, and a 512MB video card upstairs that can’t play the same file without special codecs and the help of a program called CoreAVC. Even with these, it plays the file imperfectly.

I can’t explain how this is possible, but needless to say, I am astounded at the ability of Linux.


Blast from the Past: A lengthy, detailed meta-analysis of studies of GNOME Do

December 5th, 2016

This post was originally published on November 23, 2009. The original can be found here.


GNOME Do is a fantastic little program that makes Linux Mint a very comfortable experience. At first glance, GNOME Do just looks like a collection of launchers that can be docked to your desktop, with a search function attached for completeness. What stands out about Do, though, is that the search function offers a lot of versatility. Through Do, I can launch programs, mount and unmount drives, bring up folders, and execute a variety of actions through the plug-ins. I’ve found that it saves me a lot of mouse movement (yes, I’m that lazy) when I’m working on assignments. In less than two seconds, I can call up Kate to start up my data entry, start up R in terminal, open the folder containing all of my data, and start a conversation in Pidgin. Best of all, since the search function can be called up with the Super+Space key combination, I can do all of this without ever having to switch windows.

I also find that Do helps to clean up the clutter on my desktop. I’ve got it set up as the Docky theme on the bottom of my screen. Since I have no need for the panel, I’ve got it set up to autohide at the top of my monitor. This means when I have something maximized, it legitimately takes up the entire monitor.

What a beautifully clean desktop.

Adding or removing programs to or from Do is a cinch too – it’s as simple as dragging and dropping.

Unfortunately, it’s not all great

Like every other Linux program, Do saves time and effort. Like every other Linux program, Do also costs time and effort in the bugs that it has. The most frustrating bug I’ve had so far is that Do simply disappears on a restart. It runs, and in a manner it “exists”, since I can resize it on my desktop, but I can’t actually see or use it. Apparently this is a known bug, and I haven’t been able to find a decent solution to it. It’s especially unfortunate because Do provides so much convenience that when it doesn’t work properly, I feel like I’m reverting to some primitive age where I’m dependent on my mouse (the horror!).

Notice how the cursor is cut off? In reality, it’s a resizing cursor, used to resize an invisible panel. It technically does function, since after I reboot I find that GNOME Do inadvertently takes up half my screen.

Regardless, I’d recommend Do for anyone who can install it. When it works, it’s great for saving you some time and effort; when it doesn’t, well, ’tis better to have loved and lost….


Stop screen tearing with Nvidia/Intel graphics combo

November 29th, 2016

Ever since upgrading my laptop to Linux Mint 18 I’ve noticed some pronounced screen tearing happening. Initially I figured this was something I would simply have to live with due to the open source driver being used instead of the proprietary one, but after some Googling I found a way to actually fix the issue.

Following this post on Ask Ubuntu I created a new directory at /etc/X11/xorg.conf.d/ and then created a new file in there called 20-intel.conf. Inside of this file I placed the following text:

Section "Device"
Identifier      "Intel Graphics"
Driver          "intel"
Option          "TearFree"          "true"
EndSection
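
For reference, the terminal steps look roughly like this (assuming you use nano as your editor):

sudo mkdir -p /etc/X11/xorg.conf.d
sudo nano /etc/X11/xorg.conf.d/20-intel.conf    # paste the section above, then save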

A quick reboot later and I’m happy to say that my screen no longer tears as I scroll down long web pages.

Even Borat agrees!

Alternative software: Vocal Podcast Client

November 1st, 2016

In my never-ending quest to seek out the hidden gems amongst the Linux alternative software pile I decided to take a look into what was offered in terms of podcast clients or podcatchers if you prefer. It wasn’t long into my Googling that I stumbled across a beautiful piece of software that I had never even heard of before: the Vocal podcast client.

What a nice, clean interface

Originally designed for elementaryOS, this application presents a very clean, attractive interface for managing both your audio and video podcasts. It comes with the basic options you’d expect – the ability to stream versus download podcasts, or to quickly skip forward/backward – but it was how it walked me through setting it up the first time that actually impressed me the most. Here’s a look at that process.

When you first open the application you are presented with the following screen:

Two pretty standard options and one very intriguing one

As you can see in the screenshot there are two pretty standard options – Add a new Feed or Import Subscriptions from another application – but it was the third option that really intrigued me. So what exactly is the Vocal Starter Pack? It’s a curated list of high-quality podcasts that gives a good spread of different podcast types and topics – a perfect place for a new user to start getting into podcasts. Seriously, this is a really awesome idea!

The Starter Pack imports just like any other export you may have brought over

So once you’ve selected your podcasts or imported them you can begin the fun part – the actual listening to or watching of your episodes. Selecting an audio episode will display the embedded show notes and other information about it. This is a neat touch and lets you quickly see what other episodes are in the feed that you may want to listen to as well.

Podcast feed and related info

Or if video podcasts are more your thing Vocal has you covered there as well.

That’s an unfortunate screenshot

Overall, for as simple as this application is, I’m very impressed with Vocal. Sure it only does the basics, but it does them really well! If the feature set of the upcoming version 2 is anything to go by, Vocal has a good future ahead of it (What? Built-in iTunes store podcast browser? Heck yeah!).

Alternative software: Midori Browser

October 30th, 2016

In my previous post I spoke about how the Linux platform has an incredible amount of alternative software and wrote a bit about my experiences using one of those applications: the Konqueror browser. I decided to stay in the same genre of applications and take a look at another alternative web browser: Midori.

Midori is an interesting browser whose main goal seems to be to strip away the clutter and really streamline the web browsing experience. It’s no surprise, then, that Midori has ended up as the default web browser for lightweight and streamlined distributions such as elementary OS, Bodhi Linux and SliTaz at one time or another. It is also neat from a technical perspective, as portions of the browser are written in the Vala programming language.

So what does it look like when you first launch the browser then?

Sigh… another alternative browser that shows an error on first launch…

Midori itself is a very nice looking browser, but I was disappointed to immediately see an error just like the first time I tried Konqueror. To its credit, however, I’m almost certain that this error is a result of me running it on Linux Mint 18 – and thus missing the Ubuntu-related file it was looking for. So really… this is more of a bug on Linux Mint’s end than a problem with Midori.

Poking around in the application preferences shows a commitment to that streamlined design even in the settings menus. Beyond that there wasn’t too much to note there.

Browsing The Linux Experiment

So how does Midori handle as a web browser then? First off, let me say that it fares remarkably better than Konqueror did. Pages seemed to render fine and I only had minor issues overall.

The first issue I hit was that some embedded media and plugins didn’t seem to work. For example I couldn’t get an embedded PDF to display at all. Perhaps this is something that can be fixed by finding a Midori-specific plugin?

Another oddity was that sometimes the right fonts wouldn’t be used, or the website text would be rendered slightly larger than it would be in Firefox or Chrome, for example. The larger font issue is kind of strange to describe… it’s as if Midori shows the text as bolded while the other browsers don’t.

I figured that as a lightweight, streamlined browser it might be a decent idea to quickly see memory usage differences between it and Firefox (just to give a baseline). At first the results showed a clear memory usage advantage to Midori when only viewing one website:

Browser Memory Usage
Firefox 144MB
Midori 46MB

However after opening 4 additional tabs and waiting for them to all finish loading the story reversed quite substantially:

Browser Memory Usage
Firefox 183MB
Midori 507MB

I have no idea why there would be such a difference between the two or why Midori’s memory usage would skyrocket like that but I guess the bottom line is that you may want to reconsider your choice if you’re planning on using Midori on a system with low RAM.

Finally, if I had to give one last piece of criticism, it would be that even as a stripped-down, streamlined browser Midori still doesn’t feel quite as fast as something like Chrome.

Other than those mostly minor issues though Midori did really well. Even YouTube’s HTML5 playback controls worked as expected! I might even recommend people try out Midori if they’re looking for an alternative web browser to use in their day-to-day computing.

Removing old Kernels in Ubuntu 16.04/Linux Mint 18

October 25th, 2016

Recently I’ve noticed that my /boot partition has become full and I’ve even had some new kernel updates fail as a result. It seems the culprit is all of the older kernels still lying around on my system even though they are no longer in use. Here are the steps I took in order to remove these old kernels and reclaim my /boot partition space.
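
If you want to see exactly how full the partition is, df will tell you:

df -h /boot    # -h prints the sizes in human-readable units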

A few warnings:

  • Always understand commands you are running on your machine before you run them. Especially when they start with sudo.
  • Be very careful when removing kernels – you may end up with a system that doesn’t boot!
  • My rule of thumb is to only remove kernels older than the most recent two (assuming I haven’t had any bad experiences with either of them). This allows me to revert to a slightly older version if I find something that no longer works in the latest one.

First determine what kernel your machine is actually currently running

For example running the command:

uname -a

prints out a line containing the text “4.4.0-45-generic”. This is the name of the kernel my system is currently using. I do not want to remove this one!
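
As an aside, if you just want the kernel release string on its own (handy for scripting), uname’s -r flag prints exactly that:

uname -r    # prints e.g. 4.4.0-45-generic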

Next get a list of all installed kernels

You can do this a few different ways but I like using the following command:

dpkg --list | grep linux-image

This should print out a list similar to the one in the screenshot below.

Example list of installed kernels

From this list you can identify which ones you want to remove to clear up space. On my system I had versions 4.4.0-21.37, 4.4.0-36.55, 4.4.0-38.57 and 4.4.0-45.66, so following my rule above I want to remove both 4.4.0-21.37 and 4.4.0-36.55.

Remove the old kernels

Again this can be done a number of different ways, but seeing as we’re already in the terminal why not use our trusty apt-get command to do the job?

sudo apt-get purge linux-image-4.4.0-21-generic linux-image-4.4.0-36-generic

and just like that almost 500MB of disk space is freed up!

Trying out KeePassX

October 23rd, 2016

KeePassX is an independent implementation of the popular password manager that supports the KeePass (kdb) and KeePass2 (kdbx) database formats. Like the official KeePass application, KeePassX is open source, but the main difference is that KeePass requires Microsoft’s .NET Framework or the Mono runtime to be installed whereas KeePassX does not.

The feature list from their website shows that KeePassX offers:

  • Extensive management
    • title for each entry for its better identification
    • possibility to determine different expiration dates
    • insertion of attachments
    • user-defined symbols for groups and entries
    • fast entry duplication
    • sorting entries in groups
  • Search function
    • search either in specific groups or in complete database
  • Autofill (experimental)
  • Database security
    • access to the KeePassX database is granted either with a password, a key-file (e.g. a CD or a memory-stick) or even both.
  • Automatic generation of secure passwords
    • extremely customizable password generator for fast and easy creation of secure passwords
  • Precaution features
    • quality indicator for chosen passwords
    • hiding all passwords behind asterisks
  • Encryption
    • either the Advanced Encryption Standard (AES) or the Twofish algorithm is used
    • encryption of the database in 256 bit sized increments
  • Import and export of entries
    • import from PwManager (*.pwm) and KWallet (*.xml) files
    • export as textfile (*.txt)
  • Operating system independent
    • KeePassX is cross-platform, and so are its databases
  • Free software
    • KeePassX is free software, published under the terms of the General Public License, so you are not only free to use it free of charge, but also to redistribute it, to examine and/or modify its source code and to publish your modifications as long as you provide the same freedoms for your modified version.

I’ve been a long-time user of KeePass and figured I would check out KeePassX to see if there were any advantages to making the switch. Opening up my existing KeePass2 database was a breeze and even the ‘experimental’ autofill seemed to work just fine. I should also point out that, at least on Linux, KeePassX seems to be much quicker and definitely feels more native compared to the WinForms+Mono official version (I imagine the opposite is true while running on Windows).

The password generation tool in KeePassX is also very similar to the one in the official KeePass; however, they’ve opted for some defaults which could actually reduce the randomness, and thus the security, of a password: excluding look-alike characters, ensuring that the password contains characters from every group, etc.

The Password Generator in the official KeePass application

These defaults do make it a bit easier to read or transcribe the passwords should you ever need to and given a long enough password the impact on security should be minimal.

The Password Generator in KeePassX

So what are my feelings on KeePassX overall? In my limited use it seems like an excellent alternative to the official KeePass application and one that may almost be preferred on non-Windows platforms. I think I’ll be making the switch to KeePassX for my Linux-based installs.

Update: after some slow progress a few developers decided to fork the KeePassX project over at KeePassX Reboot. We’ll have to see how things with this fork play out but I wanted to mention it here in case you decided that the fork was the better version for you.

KWLUG: Emulating Tor (2016-10)

October 4th, 2016

This is a podcast presentation from the Kitchener Waterloo Linux Users Group on the topic of Emulating Tor published on October 4th 2016. You can find the original Kitchener Waterloo Linux Users Group post here.


KWLUG: Watcamp calendar, Indieweb, Key Retention using Guile (2016-09)

October 4th, 2016

This is a podcast presentation from the Kitchener Waterloo Linux Users Group on the topic of Watcamp calendar, Indieweb, Key Retention using Guile published on September 13th 2016. You can find the original Kitchener Waterloo Linux Users Group post here.


Ubuntu 16.04 VNC woes? Try this!

October 2nd, 2016

You may recall a few years back I made a very similar post about Ubuntu 14.04’s ‘VNC woes’. Well, unfortunately it seems things have changed slightly between 14.04 and 16.04: the setting that once fixed everything no longer persists and is only good for that session. Thankfully it is pretty easy to adapt the existing workaround into a script that gets run on startup in order to ‘fix it’ forever. Note that these steps should also work on any Ubuntu derivatives such as Linux Mint 18, etc.

Credit goes to the excellent post over at ThinkingMedia for confirming that the fix is basically the same as the one I had for 14.04. What follows are their instructions for creating a startup script:

1. Create a text file called vino-fix.sh and place the following in it:

#!/bin/bash
# target the local X display (note the leading colon; adjust if yours isn't :0)
export DISPLAY=:0
# tell Vino to stop requiring encryption for incoming VNC connections
gsettings set org.gnome.Vino require-encryption false

2. Modify the file’s permissions so that it becomes executable. You can do this via the terminal with the following command:

chmod +x vino-fix.sh

3. Create a new startup application and point it at your script. Now every time you reboot it will run that script for you and ‘fix’ the issue.
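
To confirm the setting stuck after a reboot you can query it directly:

gsettings get org.gnome.Vino require-encryption    # should print: false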

One last thing I should point out – this workaround disables the built-in VNC encryption. Generally I would absolutely not recommend disabling any sort of security like this, however VNC at its core is not really a secure protocol to begin with. You are far better off setting up VNC to only listen for local connections and then using SSH+VNC for your secure remote desktop needs. Just my two cents.

How To Set Up An OpenVPN Client On Linux

September 28th, 2016

Getting a VPN set up right on your Linux machine has a number of advantages, especially today when online privacy is a must and files are being shared remotely more extensively than ever. First off, securing your connection with a virtual private network will keep your online traffic encrypted and safe from hackers and other people with malicious intents. But originally, VPNs weren’t used for that reason at all; rather, they were exactly what the name suggests: virtual private networks. By connecting to a VPN, your computer and, for example, your colleague’s remote computer (that’s not physically connected to it via a LAN cable), can “see” each other as if they were part of a local area network and share files via the Internet. VPNs can also be utilized for remotely accessing a computer to offer assistance, or for whatever other reason you’d need to.

OpenVPN is regarded as one of the most secure and most efficient tunneling protocols for VPNs, and fortunately enough it’s quite simple to set up an OpenVPN client on a Linux computer if you know your way around the terminal.

Installing and Configuring The Client

First of all, you have to install the OpenVPN package, which you can easily do via the terminal command sudo apt-get install openvpn. Enter your sudo password (the password of your account) and press Enter. A few dependencies ask for permission to be installed, so just accept all of them for the installation to finish.

Then you’ll have to grab a few certificates off the server that the client side needs in order for OpenVPN to work. Locate the following files on your server PC and put them on a flash drive, so that you can copy them to your client PC:

  • /etc/openvpn/easy-rsa/keys/hostname.crt

  • /etc/openvpn/easy-rsa/keys/hostname.key

  • /etc/openvpn/ca.crt

  • /etc/openvpn/ta.key

Copy all of the files to the /etc/openvpn directory of your client PC (note that instead of “hostname”, in the first two files, it will be the hostname of your client). To further configure the client you have to use the command sudo cp /usr/share/doc/openvpn/examples/sample-config-files/client.conf /etc/openvpn, which copies a sample configuration file to the right directory.
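
As an aside, if the client can already reach the server over SSH you could skip the flash drive and copy the files directly with scp – a sketch, assuming the server is reachable at server.example.com and your account can read those paths:

scp user@server.example.com:/etc/openvpn/easy-rsa/keys/hostname.crt /etc/openvpn/
scp user@server.example.com:/etc/openvpn/easy-rsa/keys/hostname.key /etc/openvpn/
scp user@server.example.com:/etc/openvpn/ca.crt /etc/openvpn/
scp user@server.example.com:/etc/openvpn/ta.key /etc/openvpn/
# note: reading the keys and writing into /etc/openvpn may require root on both ends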

Editing The Configuration File

Use a text editor such as gedit to open the client.conf file and locate the following text:

dev tap
remote vpn.example.com 1181
cert hostname.crt
key hostname.key
tls-auth ta.key 1

You need to make a few changes here. Instead of “vpn.example.com”, put your server’s address. “1181” should be the port of your OpenVPN server, and “hostname” should, once again, be the actual name of the certificates that you copied into /etc/openvpn a moment ago.

Now that you’ve set all of this up, you need to restart OpenVPN with the following command: sudo /etc/init.d/openvpn restart. Your remote local area network should be accessible now, which you can check by pinging the server’s VPN IP address.
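
For example, assuming the server hands itself the common OpenVPN default address of 10.8.0.1 on the tunnel (yours may differ):

ping 10.8.0.1    # steady replies mean the tunnel is up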

Setting Up A Graphic UI Tool for OpenVPN

Unless you feel like using the terminal to navigate to every file and folder on your virtual network, it’s a good idea to set up some kind of a GUI. The Gadmin OpenVPN client does a fantastic job at this, and it’s real simple to set up, either via the Ubuntu Software Center, Synaptic or PackageKit. No matter what you choose, once it’s installed simply run the command sudo gadmin-openvpn-client and a neat graphic user interface will appear on the screen.

Now all you have to do is input some information about the server, and you’re set. Fill in the Connection name (what you’d like the connection to your VPN to be called), the Server address (the IP address of your OpenVPN server), the Server port, and the location of the certificates (the ca.crt and ta.key files mentioned earlier). Once you’re done with that, click the Add button, select the connection that you’ve just created and click Activate. Your VPN network will now be accessible.

That’s it, you’re done! You now have your own OpenVPN connection that you can use to share data. Note that there are plenty of other GUI tools for VPNs to be found in the Software store, so if you don’t like Gadmin, you can always use something else and still have access to OpenVPN, just through a different interface.

Summary

As you can see, it’s pretty simple to set up an OpenVPN client and connect to an existing VPN server. Setting up an OpenVPN server on Linux is a bit more of a challenge, though it’s perfectly possible. For a better and smoother experience, though, you might want to think about subscribing to a dedicated VPN provider, such as ExpressVPN. It’s not free, but it’ll give you greater security and stability, and save you the hassle of maintaining an OpenVPN server by yourself. If you’re interested, you should check out some ExpressVPN reviews before you make your choice.

Thomas Milva is an IT Security Analyst, Web entrepreneur and Tech enthusiast. He is the co-editor of http://wefollowtech.com

KWLUG: Summer Smorgasboard (2016-08)

August 21st, 2016

This is a podcast presentation from the Kitchener Waterloo Linux Users Group on the topic of Summer Smorgasboard published on August 11th 2016. You can find the original Kitchener Waterloo Linux Users Group post here.


KWLUG: Personal Information Manager Synchronization (2016-07)

July 9th, 2016

This is a podcast presentation from the Kitchener Waterloo Linux Users Group on the topic of Personal Information Manager Synchronization published on July 5th 2016. You can find the original Kitchener Waterloo Linux Users Group post here.


RetroPie – turning your Raspberry Pi into a retro-gaming console!

June 12th, 2016

Recently I decided to pick up a new Raspberry Pi 3 B from BuyaPi.ca. I wasn’t exactly sure what I was going to do with it but I figured with all of the neat little projects going on for the device I would find something. After doing some searching I stumbled upon a few candidate projects before finally settling on RetroPie as my first shot at playing around with the Raspberry Pi.

RetroPie works great on other Raspberry Pi models as well but performance is much better on the 3

RetroPie, as their site says, “allows you to turn your Raspberry Pi into a retro-gaming machine.” It does this by linking together multiple Raspberry Pi projects, including Raspbian, EmulationStation, RetroArch and more, into a really nice interface that essentially just works out of the box.

Setup

The setup couldn’t be easier. Simply follow the instructions to download a ready-made image for your SD Card, put the RetroPie image on your SD Card, plug in a controller (I used a wired Xbox 360 controller), power it on and follow the setup instructions.

When it gets to the controller configuration settings screen be careful what you select. If you follow the on-screen button pushes by default (i.e. button “A” for “A” and button “B” for “B”, etc.) you will end up with something that matches the name of the button but not the placement you’re expecting. This is because RetroPie/RetroArch uses the SNES Controller layout as its default.

The ‘default’ SNES controller layout

So if you simply followed the on-screen wizard and pushed the Xbox 360 controller’s “A” button instead of its “B” button (which is the location of the “A” button on the SNES controller) you’ll experience all sorts of weird behaviour in the various emulators. So be sure to actually follow the setup guide for your particular controller (see below for example).

Notice how you actually have to push “B” when it asks for “A” and so on during the initial controller configuration

The one confusing downside to this workaround is that all of the menus in RetroPie itself still ask you to push “A” or “B” but they really mean whatever you mapped those buttons to, so it’s kind of backwards until you actually get into a game. That said, it’s a minor thing and one that I’m sure I could fix, if I cared enough to do so, by setting a custom alternative controller layout for the menus only.

Games

RetroPie supports a crazy number of emulators. No, seriously, it’s a bit ridiculous. Look at this list (as of the time of writing):

RetroPie automatically detects if you have games for the systems. So if you had a SNES game, for example, you would get a SNES system to choose from on the main menu.

Additionally you get PC emulators like DOSBox and the Apple II, and there are a number of custom ports of PC games including DOOM, Duke Nukem 3D, Minecraft Pi Edition, OpenTTD and more!

Now obviously not all of the above emulators work flawlessly. Some are still labeled experimental and some systems even offer multiple emulators so you can customize it to the game you are trying to play – just in case one emulator happens to offer better compatibility than another. That said for the majority of the emulators I tried, especially for the older systems, things work great.

The RetroPie SD Card contains various folders that you simply copy the ROM or various bits of game data to. Once the files are there you just restart EmulationStation and it automatically discovers the new games.

Remote Storage

One thing I had to try was to see if I could use a remote share to play the games on the RetroPie off of my NAS instead. This would save quite a bit of space on the SD Card and, as long as the transfer speeds between the Raspberry Pi and the NAS were decent enough, it should actually work.

I figured using a Windows share from the NAS was the easiest (this would also let you share games from basically any computer on your network). Here are the steps to set it up:

SSH into the Raspberry Pi

The default login for RetroPie is username pi and password raspberry. You can usually find it on the network by simply connecting to the device name retropie.
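
In other words, connecting looks something like this:

ssh pi@retropie    # password: raspberry – worth changing with passwd once you’re in
# if the plain name doesn’t resolve on your network, try retropie.local or the Pi’s IP address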

Add remote mounts to fstab

The simplest way to set up the remote mounts is to use fstab. This will ensure that the system mounts the share as soon as it boots up. However you might run into problems booting the Raspberry Pi if it can’t find the share on the network… so that is something to keep in mind.

Open up /etc/fstab (I used nano):

sudo nano /etc/fstab

Then add a line that looks like this to the end of the file:

//{the location of the share}    /home/pi/RetroPie/roms/{the location to mount it}    cifs    guest,uid=1000,iocharset=utf8    0    0

replacing the pieces in { brackets } with where you actually want things to mount. So for example let’s say the NAS is at IP address 192.168.1.50 and you wanted to mount a share on the NAS called SNES that contains SNES ROMs for RetroPie. First I would recommend creating a new sub-directory in the standard SNES ROMs location so that you can have both ROMs on the SD Card and remote ones:

mkdir /home/pi/RetroPie/roms/snes/NASGames

Then you would add something like this to your fstab file:

//192.168.1.50/SNES    /home/pi/RetroPie/roms/snes/NASGames    cifs    guest,uid=1000,iocharset=utf8    0    0
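
To test the new entry without rebooting you can ask the system to mount everything listed in fstab right away:

sudo mount -a    # mounts any fstab entries that aren’t already mounted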

The next time you boot up your Raspberry Pi it should successfully add that remote share and show you any SNES ROMs that are on the NAS in RetroPie!

After testing a few remote games this way I can say that it does indeed work well (via WiFi no less!). This is especially true for the older systems where game sizes are only a few KiB or MiB. When you start to get into larger PC or disc-based games where the sizes are in the hundreds of MiB it still works decently well, but the first time you access something you might notice a bit of a delay. Thankfully Linux does a decent job of caching the file data after it’s read it once, so subsequent reads are much faster. That said, if you had a good wired connection I have no doubt that things would work even more smoothly.

Portable Console? Best Console? A bit of both.

The RetroPie project is really neat, not only for its feature set but also because as a games console it’s one of the smallest around, with the potential to have one of the largest game libraries ever!

My setup is pretty plain but some people have done awesome things with theirs like turning it into a full arcade cabinet!

If you like to play classic games then I would seriously recommend giving RetroPie a try.

This post originally appeared on my website here.