Archive for the ‘Linux’ Category

Blast from the Past: Automatically put computer to sleep and wake it up on a schedule

December 26th, 2016

This post was originally published on June 24, 2012. The original can be found here.


Ever wanted your computer to be on when you need it but automatically put itself to sleep (suspended) when you don’t? Or maybe you just wanted to create a really elaborate alarm clock?

I stumbled across this very useful command a while back but only recently created a script that I now run to control when my computer is suspended and when it is awake.

#!/bin/sh
# Compute the epoch timestamp for today's 17:00
t=$(date --date "17:00" +%s)
# Prompt for the sudo password now so the commands below don't block on it
sudo /bin/true
# Program the real-time clock to wake the machine at $t
sudo rtcwake -u -t $t -m on &
# Give rtcwake a moment to finish
sleep 2
# Suspend the machine
sudo pm-suspend

This creates a variable, t above, with an assigned time and then runs the command rtcwake to tell the computer to automatically wake itself up at that time. In the above example I’m telling the computer that it should wake itself up automatically at 17:00 (5pm). It then sleeps for 2 seconds (just to let the rtcwake command finish what it is doing) and runs pm-suspend which actually puts the computer to sleep. When run the computer will put itself right to sleep and then wake up at whatever time you specify.
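One wrinkle worth noting: date --date "17:00" always resolves to today's 17:00, so if the script happens to run after that time the wake timestamp will already be in the past. A minimal sketch of a guard against that, assuming GNU date:

```shell
#!/bin/sh
# Compute the next occurrence of 17:00, rolling over to tomorrow
# if today's 17:00 has already passed (assumes GNU date).
t=$(date --date "17:00" +%s)
now=$(date +%s)
if [ "$t" -le "$now" ]; then
    t=$(date --date "tomorrow 17:00" +%s)
fi
echo "$t"
```

The resulting $t can then be handed to rtcwake exactly as in the script above.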

For the final piece of the puzzle, I’ve scheduled this script to run daily (when I want the PC to actually go to sleep) and the rest is taken care of for me. As an example, say you use your PC from 5pm to midnight but the rest of the time you are sleeping or at work. Simply schedule the above script to run at midnight and when you get home from work it will be already up and running and waiting for you.
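For scheduling, cron works well. A hypothetical crontab entry (the script path here is made up) that fires the script at midnight every day might look like:

```shell
# min hour day month weekday command
  0    0    *    *      *    /home/user/bin/suspend-until-5pm.sh
```

Because the script calls sudo, it is easiest to put this entry in root's crontab (sudo crontab -e) or to configure passwordless sudo for rtcwake and pm-suspend.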

I should note that your computer must have compatible hardware to make advanced power management features like suspend and wake work so, as with everything, your mileage may vary.

This post originally appeared on my personal website here.


Blast from the Past: Linux from Scratch: I’ve had it up to here!

December 23rd, 2016

This post was originally published on November 27, 2011. The original can be found here.


As you may be able to tell from my recent, snooze-worthy technical posts about compilers and makefiles and other assorted garbage, my experience with Linux from Scratch has been equally educational and enraging. Like Dave, I’ve had the pleasure of trying to compile various desktop environments and software packages from scratch, into some god-awful contraption that will let me check my damn email and look at the Twitters.

To be clear, when anyone says I have nobody to blame but myself, that’s complete hokum. From the beginning, this entire process was flawed. The last official LFS LiveCD has a kernel that’s enough revisions behind to cause grief during the setup process. But I really can’t blame the guys behind LFS for all my woes; their documentation is really well-written and explains why you have to pass fifty --do-not-compile-this-obscure-component-or-your-cat-will-crap-on-the-rug arguments.

Patch Your Cares Away

CC attribution licensed from benchilada

Read more…

5 apt tips and tricks

December 22nd, 2016

Everyone loves apt. It’s a simple command line tool to install new programs and update your system, but beyond the standard commands like update, install, and upgrade, did you know there are a load of other useful apt-based commands you can run?

1) Search for a package name with apt-cache search

Can’t remember the exact package name but you know some of it? Make it easy on yourself and search using apt-cache. For example:

apt-cache search Firefox

It lists all results for your search. Nice and easy!


2) Search for package information with apt-cache show

Want details of a package before you install it? Simple – just look it up with apt-cache show.

apt-cache show firefox

More details than you probably even wanted!


3) Upgrade only a specific package

So you already know that you can upgrade your whole system by running

apt-get upgrade

but did you know you can upgrade a specific package instead of the whole system? One catch: apt-get upgrade itself doesn’t accept package names, so use install with the --only-upgrade flag instead. For example, to upgrade just firefox run:

apt-get install --only-upgrade firefox

4) Install specific package version

Normally when you apt-get install something you get the latest version available but what if that’s not what you wanted? What if you wanted a specific version of the package instead? Again, simple, just specify it when you run the install command. For example run:

apt-get install firefox=version

Where version is the version number you wish to install.
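To discover what version strings are available in the first place, standard apt-cache subcommands can list them (the output depends entirely on your configured repositories):

```shell
apt-cache policy firefox   # installed version plus the candidate for upgrade
apt-cache madison firefox  # every version available from each repository
```

Copy the exact version string from that output into the firefox=version form above.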

5) Free up disk space with clean

When you download and install packages, apt automatically caches them on your hard drive. This can be useful for a number of reasons; for example, some distributions use delta packages so that only what has changed between versions is re-downloaded, which requires a base cached file to already be on your hard drive. However, these files can take up a lot of space, and the older ones often never get used again. Thankfully there are two quick commands that free up this disk space.

apt-get clean

apt-get autoclean

Both of these do essentially the same thing, but the difference is that autoclean only gets rid of cached files that have a newer version also cached on your hard drive. Those older packages will never be used again, so removing them is an easy, safe way to free up some space.
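A sketch of what autoclean's rule means in practice, run against a throwaway directory with made-up package file names rather than the real /var/cache/apt/archives:

```shell
# Emulate autoclean's "keep only the newest cached version" rule
# in a scratch directory (package names here are hypothetical).
cache=$(mktemp -d)
touch "$cache/firefox_49.0_amd64.deb" "$cache/firefox_50.0_amd64.deb"

for pkg in $(ls "$cache" | cut -d_ -f1 | sort -u); do
    # Sort the cached versions, then delete everything but the newest
    ls "$cache/$pkg"_*.deb | sort -V | head -n -1 | xargs -r rm
done

ls "$cache"   # only firefox_50.0_amd64.deb remains
```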

There you have it, you are now officially 5 apt commands smarter. Happy computing!


Blast from the Past: 10 reasons why Mint might not fail in India

December 21st, 2016

This post was originally published on July 7, 2010. The original can be found here.


Last evening while reading the SA forums, I encountered a thread about Linux and what was required to bring it to the general public. One of the goons mentioned a post that indicated ten reasons why Ubuntu wasn’t ready for the desktop in India. I kid you not – the most ridiculous reason was because users couldn’t perform the important ritual of right click/Refreshing on the desktop five or more times before getting down to work.

Here are Bharat’s reasons why Ubuntu fails, followed by why I think Mint might succeed instead in its place (while still employing his dubious logic.) When I refer to Indian users, of course, I’m taking his word for it – he’s obviously the authority here.

GRUB Boot Loader does not have an Aesthetic Appeal.

Bharat complains about the visual appearance of Grub – how it does not create a good first impression. This is, of course, in spite of Windows’ horrible boot menu when there’s more than one operating system or boot option to select. Apparently Indian users all have full-color splash screens with aesthetic appeal for BIOS, video card and PCI add-in initialization as well; this is just the icing on the cake that makes them go “eurrrgh” and completely discount Ubuntu.

To improve relations with India and eliminate this eyesore, Mint has added a background image during this phase of boot. My good friend Tyler also informs me that there’s a simple option in the Mint Control Center called “Start-Up Manager” that allows easy configuration of GRUB to match a system’s native resolution and color depth.

Login Screen-Users are required to type in their username.

Again, another seemingly impenetrable barrier. Has nobody in India worked in an environment where typing in usernames AND passwords is required – like, for example, posting a blog entry on WordPress or signing into Gmail? In any event, Mint’s GNOME installation definitely gives a clickable list for this awfully onerous task.

Desktop-The Refresh option is missing!

I’m just going to directly lift this description as to the burning need for right click / Refresh:

What does an average Indian user do when the desktop loads in Windows?He rights clicks on the desktop and refreshes the desktop about 5-6 times or until he is satisfied.This is a ritual performed by most Indian Users after switching on the computer and just before shutting down the computer.
When this average user tries to perform his ‘Refresh’ ritual in Ubuntu,he is in for a rude shock.The Ubuntu Desktop does not have a Refresh Option or any other simliar option like Reload in the Right Click Menu.
So I advice Ubuntu Developers to include to a Refresh or a Reload option in the right click menu on the Desktop and in the Nautilus File Manager.The option should be equivalent of pressing Ctrl+R.As of now ,pressing Ctrl+R refreshes the Desktop in Ubuntu.

Mint’s developers have unfortunately not come around to this clearly superior way of thinking by default yet.

Read more…

Blast from the Past: Linux Saves the Day

December 19th, 2016

This post was originally published on December 23, 2009. The original can be found here.


Earlier this week I had an experience where using Linux got me out of trouble in a relatively quick and easy manner. The catch? It was kind of Linux’s fault that I was in trouble in the first place.

Around halfway through November my Linux install on my laptop crapped out, and really fucked things up hard. However, my Windows install wasn’t affected, so I started using Windows on my laptop primarily, while switching to an openSUSE VM on my desktop for my home computing needs.

About a week back I decided it was time to reinstall Linux on my laptop, since exams and my 600 hojillion final projects were out of the way. I booted into Win7, nuked the partitions being used by Linux and… went and got some pizza and forgot to finish my install. Turns out I hadn’t restarted my PC anywhere between that day and when shit hit the fan. When I did restart, I was informed to the merry tune of a PC Speaker screech that my computer had no bootable media.

… Well shit.

My first reaction was to try again, my second was to check to make sure the hard drive was plugged in firmly. After doing this a few times, I was so enraged about my lost data that I was about ready to repave the whole drive when I had the good sense to throw in a BartPE live CD and check to see if there was any data left on the drive. To my elation, all of my data was still intact! It was at this precise moment I thought to myself “Oh drat, I bet I uninstalled that darned GRUB bootloader. Fiddlesticks!”

However, all was not lost. I know that Linux is great and is capable of finding other OS installs during its install and setting them up in GRUB without me having to look around for a windows boot point and do it myself. 20 minutes and an openSUSE install later, everything was back to normal on my laptop, Win7 and openSUSE 11.1 included!

As we speak I’m attempting an in-place upgrade to openSUSE 11.2 so hopefully I get lucky and everything goes smoothly!


CoreGTK 3.18.0 Released!

December 18th, 2016

The next version of CoreGTK, version 3.18.0, has been tagged for release! This is the first version of CoreGTK to support GTK+ 3.18.

Highlights for this release:

  • Rebased on GTK+ 3.18
  • New supported GtkWidgets in this release:
    • GtkActionBar
    • GtkFlowBox
    • GtkFlowBoxChild
    • GtkGLArea
    • GtkModelButton
    • GtkPopover
    • GtkPopoverMenu
    • GtkStackSidebar
  • Reverted to using GCC as the default compiler (but clang can still be used)

CoreGTK is an Objective-C language binding for the GTK+ widget toolkit. Like other “core” Objective-C libraries, CoreGTK is designed to be a thin wrapper. CoreGTK is free software, licensed under the GNU LGPL.

Read more about this release here.

This post originally appeared on my personal website here.

Using screen to keep your terminal sessions alive

December 15th, 2016

Have you ever connected to a remote Linux computer, using a… let’s say less than ideal WiFi connection, and started running a command only to have your ssh connection drop and your command killed off in a half finished state? In the best of cases this is simply annoying but if it happens during something of consequence, like a system upgrade, it can leave you in a dire state. Thankfully there is a really simple way to prevent this from happening.

Enter: screen

screen is a simple terminal application that basically allows you to create additional persistent terminals. So instead of ssh-ing into your computer and running the command in that session you can instead start a screen session and then run your commands within that. If your connection drops you lose your ‘screen’ but the screen session continues uninterrupted on the computer. Then you can simply re-connect and resume the screen.

Explain with an example!

OK fine. Let’s say I want to write a document over ssh. First you connect to the computer then you start your favourite text editor and begin typing:

ssh user@computer
user@computer’s password:

user@computer: nano doc.txt

What a wonderful document!


Now if I lost my connection at this point all of my hard work would also be lost because I haven’t saved it yet. Instead let’s say I used screen:

ssh user@computer
user@computer’s password:

user@computer: screen

Welcome to screen!


Now with screen running I can just use my terminal like normal and write my story. But oh no I lost my connection! Now what will I do? Well simply re-connect and re-run screen telling it to resume the previous session.

ssh user@computer
user@computer’s password:

user@computer: screen -r

Voila! There you have it – a simple way to somewhat protect your long-running terminal applications from random network disconnects.
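If you run more than one session, a few extra (standard) screen flags are worth knowing; shown here purely as an illustrative cheat sheet:

```shell
screen -S upgrade     # start a session with a memorable name
screen -ls            # list sessions and whether each is attached
screen -r upgrade     # resume the named session
screen -d -r upgrade  # detach it from any other terminal and resume here
```

From inside a session, pressing Ctrl+A and then D detaches it manually while leaving everything running on the remote machine.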


Blast from the Past: Phoenix Rising

December 14th, 2016

This post was originally published on December 12, 2009. The original can be found here.


As we prepare to bring The Linux Experiment to a close over the coming weeks, I find that this has been a time of (mostly solemn) reflection for myself and others. At the very least, it’s been an interesting experience with various flavours of Linux and what it has to offer. At its peak, it’s been a roller-coaster of controversial posts (my bad), positive experiences, and the urge to shatter our screens into pieces.

Let me share with you some of the things I’ve personally taken away from this experiment over the last three-and-a-half months.

Fedora 12 is on the bleeding edge of Linux development

This has been a point of discussion on both of our podcasts at this point, and a particular sore spot with both myself and Tyler. It’s come to a place wherein I’m sort of… afraid to perform updates to my system out of fear of just bricking it entirely. While this is admittedly something that could happen under any operating system and any platform, it’s never been as bad for me as it has been under Fedora 12.

As an example, the last *six* kernel updates for me to both Fedora 11 and 12 combined have completely broken graphics capability with my adapter (a GeForce 8600 M GS). Yes, I know that the Fedora development team is not responsible for ensuring that my graphics card works with their operating system – but this is not something the average user should have to worry about. Tyler has also had this issue, and I think would tend to agree with me.

Linux is fun, too

There have been so many frustrating moments over the last four months that I have been tempted to just format everything and go back to my native Windows 7 (previously: release candidate, now RTM). Through all of this, though, Fedora – and Linux in general – has never stopped interesting me.

This could just be due to the fact that I’ve been learning so much – I can definitely do a lot more now than I ever could before under a Linux environment, and am reasonably pleased with this – but I’ve never sat down at my laptop and been bored while playing around with getting stuff to work. In addition, with some software (such as Wine or CrossOver) I’ve been able to get a number of Windows games working as well. Linux can play, too!

Customizing my UI has also been a very nice experience. It looks roughly like Sasha’s now – no bottom panel, GnomeDo with Docky, and Compiz effects… it’s quite pretty.

There’s always another way

If there’s one thing I’ve chosen to take away from this experiment it’s that there is ALWAYS some kind of alternative to any of my problems, or anything I can do under another platform or operating system. Cisco VPN client won’t install under Wine, nor will the Linux client version? BAM, say hello to vpnc.

Need a comprehensive messaging platform with support for multiple services? Welcome Pidgin into the ring.

No, I still can’t do everything I could do in Windows… but I’m sure, given enough time, I could make Fedora 12 an extremely viable alternative to Windows 7 for me.

The long and short of it

There’s a reason I’ve chosen my clever and rather cliche title for this post. According to lore, a phoenix is a bird that would rise up from its own ashes in a rebirth cycle after igniting its nest at the end of a life cycle. So is the case for Fedora 12 and my experience with Linux.

At this point, I could not see myself continuing my tenure with the Fedora operating system. For a Linux user with my relatively low level of experience, it is too advanced and too likely to brick itself with a round of updates to be viable for me. Perhaps after quite a bit more experience with Linux on the whole, I could revisit it – but not for a good long while. This is not to say it’s unstable – it’s been rock solid, never crashing once – but it’s just not for me.

To that end, Fedora 12 rests after a long and interest-filled tenure with me. Rising from the ashes is a new user in the world of Linux – me. I can say with confidence that I will be experimenting with Linux distributions in the future – maybe dipping my feet in the somewhat familiar waters of Ubuntu once more before wading into the deep-end.

Watch out, Linux community… here I come.


Blast from the Past: Coming to Grips with Reality

December 12th, 2016

This post was originally published on December 8, 2009. The original can be found here.


The following is a cautionary tale about putting more trust in the software installed on your system than in your own knowledge.

Recently, while preparing for a big presentation that relied on me running a Java applet in Iceweasel, I discovered that I needed to install an additional package to make it work. This being nothing out of the ordinary, I opened up a terminal, and used apt-cache search to locate the package in question. Upon doing so, my system notified me that I had well over 50 ‘unnecessary’ packages installed. It recommended that I take care of the issue with the apt-get autoremove command.

Bad idea.

On restart, I found that my system was virtually destroyed. It seemed to start X11, but refused to give me either a terminal or a gdm login prompt. After booting into Debian’s rescue mode and messing about in the terminal for some time trying to fix a few circular dependencies and get my system back, I decided that it wasn’t worth my time, backed up my files with an Ubuntu live disk, and reinstalled from a netinst nightly build disk of the testing repositories. (Whew, that was a long sentence)

Unfortunately, just as soon as I rebooted from the install, I found that my system lacked a graphical display manager, and that I could only log in to my terminal, even though I had explicitly told the installer to add GNOME to my system. I headed over to #debian for some help, and found out that the testing repositories were broken, and that my system lacked gdm for some unknown reason. After following their instructions to work around the problem, I got my desktop back, and once more have a fully functioning system.

The moral of the story is a hard one for me to swallow. You see, I have come to the revelation that I don’t know what I’m doing. Over the course of the last 3 months, I have learned an awful lot about running and maintaining a Linux system, but I still lack the ability to fix even the simplest of problems without running for help. Sure, I can install and configure a Debian box like nobody’s business, having done it about 5 times since this experiment started; but I still lack the ability to diagnose a catastrophic failure and to recover from it without a good dose of help. I have also realized something that as a software developer, I know and should have been paying attention to when I used that fatal autoremove command – when something seems wrong, trust your instincts over your software, because they’re usually correct.

This entire experiment has been a huge learning experience for me. I installed an operating system that I had never used before, and eschewed the user-friendly Ubuntu for Debian, a distribution that adheres strictly to free software ideals and isn’t nearly as easy for beginners to use. That done, after a month of experience, I switched over from the stable version of Debian to the testing repositories, figuring that it would net me some newer software that occasionally worked better (especially in the case of Open Office and Gnome Network Manager), and some experience with running a somewhat less stable system. I certainly got what I wished for.

Overall, I don’t regret a thing, and I intend to keep the testing repositories installed on my laptop. I don’t usually use it for anything but note taking in class, so as long as I back it up regularly, I don’t mind if it breaks on occasion; I enjoy learning new things, and Debian keeps me on my toes. In addition, I think that I’ll install Kubuntu on my desktop machine when this whole thing is over. I like Debian a lot, but I’ve heard good things about Ubuntu and its variants, and feel that I should give them a try now that I’ve had my taste of what a distribution that isn’t written with beginners in mind is like. I have been very impressed by Linux, and have no doubts that it will become a major part of my computing experience, if not replacing Windows entirely – but I recognize that I still have a long way to go before I’ve really accomplished my goals.

As an afterthought: If anybody is familiar with some good tutorials for somebody who has basic knowledge but needs to learn more about what’s going on below the surface of a Linux install, please recommend them to me.


Blast from the Past: Top 10 things I have learned since the start of this experiment

December 9th, 2016

This post was originally published on October 2, 2009. The original can be found here.


In a nod to Dave’s classic top ten segment I will now share with you the top 10 things I have learned since starting this experiment one month ago.

10: IRC is not dead

Who knew? I’m joking of course but I had no idea that so many people still actively participated in IRC chats. As for the characters who hang out in these channels… well some are very helpful and some… answer questions like this:

Tyler: Hey everyone. I’m looking for some help with Gnome’s Empathy IM client. I can’t seem to get it to connect to MSN.

Some asshat: Tyler, if I wanted a pidgin clone, I would just use pidgin

It’s this kind of ‘you’re doing it wrong because that’s not how I would do it’ attitude that can be very damaging to new Linux users. There is nothing more frustrating than trying to get help and having someone throw BS like that back in your face.

9: Jokes about Linux for nerds can actually be funny

Stolen from Sasha’s post.

Admit it, you laughed too


8. Buy hardware for your Linux install, not the other way around

Believe me, if you know that your hardware is going to be 100% compatible ahead of time you will have a much more enjoyable experience. At the start of this experiment Jon pointed out this useful website. Many similar sites also exist and you should really take advantage of them if you want the optimal Linux experience.

7. When it works, it’s unparalleled

Linux seems faster, more featured and less resource hogging than a comparable operating system from either Redmond or Cupertino. That is assuming it’s working correctly…

6. Linux seems to fail for random or trivial reasons

If you need proof of these just go take a look back on the last couple of posts on here. There are times when I really think Linux could be used by everyone… and then there are moments when I don’t see how anyone outside of the most hardcore computer users could ever even attempt it. A brand new user should not have to know about xorg.conf or how to edit their DNS resolver.

Mixer - buttons unchecked

5. Linux might actually have a better game selection than the Mac!

Obviously there was some jest in there but Linux really does have some gems for games out there. Best of all, most of them are completely free! Then again, some are free for a reason…

Armagetron


4. A Linux distribution defines a lot of your user experience

This can be especially frustrating when the exact same hardware performs so differently. I know there are a number of technical reasons why this is the case but things seem so utterly inconsistent that a new Linux user paired with the wrong distribution might be easily turned off.

3. Just because it’s open source doesn’t mean it will support everything

Even though it should damn it! The best example I have for this happens to be MSN clients. Pidgin is by far my favourite as it seems to work well and even supports a plethora of useful plugins! However, unlike many other clients, it doesn’t support a lot of MSN features such as voice/video chat, reliable file transfers, and those god awful winks and nudges that have appeared in the most recent version of the official client. Is there really that good of a reason holding the Pidgin developers back from just making use of the other open source libraries that already support these features?

2. I love the terminal

I can’t believe I actually just said that but it’s true. On a Windows machine I would never touch the command line because it is awful. However on Linux I feel empowered by using the terminal. It lets me quickly perform tasks that might take a lot of mouse clicks through a cumbersome UI to otherwise perform.

And the #1 thing I have learned since the start of this experiment? Drum roll please…

1. Linux might actually be ready to replace Windows for me

But I guess in order to find out if that statement ends up being true you’ll have to keep following along 😉


KWLUG: C Language, WebOS (2016-12)

December 8th, 2016

This is a podcast presentation from the Kitchener Waterloo Linux Users Group on the topic of C Language, WebOS published on December 6th 2016. You can find the original Kitchener Waterloo Linux Users Group post here.

Read more…

Categories: Linux, Podcast, Tyler B

KWLUG: OpenWRT customization (2016-11)

December 7th, 2016

This is a podcast presentation from the Kitchener Waterloo Linux Users Group on the topic of OpenWRT customization published on December 6th 2016. You can find the original Kitchener Waterloo Linux Users Group post here.

Read more…

Categories: Linux, Podcast, Tyler B

Blast from the Past: How is it doing that?

December 7th, 2016

This post was originally published on December 15, 2009. The original can be found here.


Just about everything that I’ve ever read about media playback on Linux has been negative. As I understand the situation, the general consensus of the internet is that Linux should not be relied on to play media of any kind. Further, I know that the other guys have had troubles with video playback in the past.

All of which added up to me being extremely confused when I accidentally discovered that my system takes video playback like a champ. Now from the outset, you should know that my system is extremely underpowered where high definition video playback is concerned. I’m running Debian testing on a laptop with a 1.73 GHz single-core processor, 758MB shared video RAM, and a 128MB Intel GMA 900 integrated graphics card.

Incredibly enough, it turns out that this humble setup is capable of playing almost every video file that I can find, even with compiz effects fully enabled and just a base install of vlc media player.

Most impressively, the machine can flawlessly stream a 1280x528px 1536kb/s *.mkv file over my wireless network.

As a comparison, I have a Windows Vista machine with a 2.3GHz processor, 4GB of RAM, and a 512MB video card upstairs that can’t play the same file without special codecs and the help of a program called CoreAVC. Even with these, it plays the file imperfectly.

I can’t explain how this is possible, but needless to say, I am astounded at the ability of Linux.


Blast from the Past: A lengthy, detailed meta-analysis of studies of GNOME Do

December 5th, 2016

This post was originally published on November 23, 2009. The original can be found here.


GNOME Do is a fantastic little program that makes Linux Mint a very comfortable experience. At first glance, GNOME Do just looks like a collection of launchers that can be docked to your window, with a search function attached for completeness. What stands out about Do, though, is that the search function offers a lot of versatility. Through Do, I can launch programs, mount and unmount drives, bring up folders, and execute a variety of actions through the plug-ins. I’ve found that it saves me a lot of mouse movement (yes, I’m that lazy) when I’m working on assignments. In less than two seconds, I can call up Kate to start up my data entry, start up R in terminal, open the folder containing all of my data, and start a conversation in Pidgin. Best of all, since the search function can be called up with the Super+Space key combination, I can do all of this without ever having to switch windows.

I also find that Do helps to clean up the clutter on my desktop. I’ve got it set up as the Docky theme on the bottom of my screen. Since I have no need for the panel, I’ve got it set up to autohide at the top of my monitor. This means when I have something maximized, it legitimately takes up the entire monitor.

What a beautifully clean desktop.


Adding or removing programs to or from Do is a cinch too – it’s as simple as dragging and dropping.

Unfortunately, it’s not all great

Like every other Linux program, Do saves time and effort. Like every other Linux program, Do also costs time and effort in the bugs that it has. The most frustrating bug I’ve had so far is that Do simply disappears on a restart. It runs and in a manner it “exists” since I can resize it on my desktop, but I can’t actually see or use it. Apparently this is a known bug, and I haven’t been able to find a decent solution to it. It’s especially unfortunate because Do provides so much convenience that when it doesn’t work properly, I feel like I’m reverting to some primitive age where I’m dependent on my mouse (the horror!)


Notice how the cursor is cut off? In reality, it’s a resizing cursor, used to resize an invisible panel. It technically does function, since after I reboot I find that GNOME Do inadvertently takes up half my screen.

Regardless, I’d recommend Do for anyone who can install it. When it works, it’s great for saving you some time and effort; when it doesn’t, well, ’tis better to have loved and lost….


Stop screen tearing with Nvidia/Intel graphics combo

November 29th, 2016

Ever since upgrading my laptop to Linux Mint 18 I’ve noticed some pronounced screen tearing happening. Initially I figured this was something I would simply have to live with due to the open source driver being used instead of the proprietary one, but after some Googling I found a way to actually fix the issue.

Following this post on Ask Ubuntu I created a new directory at /etc/X11/xorg.conf.d/ and then created a new file in there called 20-intel.conf. Inside of this file I placed the following text:

Section "Device"
    Identifier "Intel Graphics"
    Driver     "intel"
    Option     "TearFree" "true"
EndSection
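The directory and file can also be created from the terminal in one go (a sketch of the same steps; the path matches the one described above and the quoted heredoc keeps the config verbatim):

```shell
# Create the drop-in directory and write the TearFree config (requires root).
sudo mkdir -p /etc/X11/xorg.conf.d
sudo tee /etc/X11/xorg.conf.d/20-intel.conf > /dev/null <<'EOF'
Section "Device"
    Identifier "Intel Graphics"
    Driver     "intel"
    Option     "TearFree" "true"
EndSection
EOF
```

Note that plain ASCII double quotes are required in the config; curly quotes copied from a web page will break X server parsing.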

A quick reboot later and I’m happy to say that my screen no longer tears as I scroll down long web pages.

Even Borat agrees!

Alternative software: Vocal Podcast Client

November 1st, 2016 No comments

In my never-ending quest to seek out the hidden gems amongst the Linux alternative software pile I decided to take a look into what was offered in terms of podcast clients or podcatchers if you prefer. It wasn’t long into my Googling that I stumbled across a beautiful piece of software that I had never even heard of before: the Vocal podcast client.

What a nice, clean interface

Originally designed for elementaryOS, this application presents a very clean, attractive interface for managing both your audio and video podcasts. It comes with the basics – the ability to stream or download podcasts and to quickly skip forward/backward – but it was how it walked me through setting it up the first time that actually impressed me the most. Here’s a look at that process.

When you first open the application you are presented with the following screen:

Two pretty standard options and one very intriguing one

As you can see in the screenshot there are two pretty standard options – Add a new Feed or Import Subscriptions from another application – but it was the third option that really intrigued me. So what exactly is the Vocal Starter Pack? It’s a curated list of high-quality podcasts that gives a good spread of different podcast types and topics – a perfect place for a new user to start getting into podcasts. Seriously, this is a really awesome idea!

The Starter Pack imports just like any other export you may have brought over

So once you’ve selected or imported your podcasts, you can begin the fun part: actually listening to or watching your episodes. Selecting an audio episode displays the embedded show notes and other information about it. This is a neat touch that lets you quickly see which other episodes in the feed you may want to listen to as well.

Podcast feed and related info

Or if video podcasts are more your thing Vocal has you covered there as well.

That’s an unfortunate screenshot

Overall, for as simple as this application is, I’m very impressed with Vocal. Sure, it only does the basics, but it does them really well! If the feature set of the upcoming version 2 is anything to go by, Vocal has a good future ahead of it (What? Built-in iTunes store podcast browser? Heck yeah!).

Alternative software: Midori Browser

October 30th, 2016 No comments

In my previous post I spoke about how the Linux platform has an incredible amount of alternative software and wrote a bit about my experiences using one of those applications: the Konqueror browser. I decided to stay in the same genre of applications and take a look at another alternative web browser: Midori.

Midori is an interesting browser whose main goal seems to be to strip away the clutter and really streamline the web browsing experience. It’s no surprise then that Midori has ended up as the default web browser for other lightweight and streamlined distributions such as elementary OS, Bodhi Linux and SliTaz at one time or another. It is also neat from a technical perspective as portions of the browser are written in the Vala programming language.

So what does it look like when you first launch the browser then?

Sigh… another alternative browser that shows an error on first launch…

Midori itself is a very nice looking browser but I was disappointed to immediately see an error, just like the first time I tried Konqueror. To its credit, however, I’m almost certain that this error is a result of me running it on Linux Mint 18 – and thus missing the Ubuntu-related file it was looking for. So really… this is more of a bug on Linux Mint’s end than a problem with Midori.

Poking around in the application preferences shows a commitment to that streamlined design even in the settings menus. Beyond that there wasn’t too much to note there.

Browsing The Linux Experiment

So how does Midori handle as a web browser then? First off, let me say that it fares remarkably better than Konqueror did. Pages seemed to render fine and I only had minor issues overall.

The first issue I hit was that some embedded media and plugins didn’t seem to work. For example, I couldn’t get an embedded PDF to display at all. Perhaps this is something that can be fixed by finding a Midori-specific plugin?

Another oddity was that sometimes the right fonts wouldn’t be used, or the website text would render slightly larger than it does in Firefox or Chrome, for example. The larger-font issue is kind of strange to describe… it’s as if Midori shows the text as bolded while the other browsers don’t.

Since Midori bills itself as lightweight and streamlined, I figured it would be worth quickly comparing its memory usage against Firefox (just to give a baseline). At first, the results showed a clear memory usage advantage to Midori when viewing only one website:

Browser Memory Usage
Firefox 144MB
Midori 46MB

However after opening 4 additional tabs and waiting for them to all finish loading the story reversed quite substantially:

Browser Memory Usage
Firefox 183MB
Midori 507MB

I have no idea why there would be such a difference between the two or why Midori’s memory usage would skyrocket like that but I guess the bottom line is that you may want to reconsider your choice if you’re planning on using Midori on a system with low RAM.
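If you’d like to reproduce this kind of comparison, summing resident set size (RSS) across a browser’s processes gives a rough number (a sketch; the process names are assumptions, and RSS double-counts memory shared between processes, so treat the results as approximate):

```shell
#!/bin/sh
# Print the total RSS, in MB, of all processes matching a command name.
# ps prints RSS in kilobytes, one line per process; awk sums and converts.
mem_mb() {
    ps -o rss= -C "$1" | awk '{ sum += $1 } END { printf "%.0fMB\n", sum / 1024 }'
}

mem_mb firefox
mem_mb midori
```

Running it once per browser state (one tab, five tabs) makes the comparison repeatable.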

Finally, if I had to give one last piece of criticism, it would be that even as a stripped-down, streamlined browser Midori still doesn’t feel quite as fast as something like Chrome.

Other than those mostly minor issues though Midori did really well. Even YouTube’s HTML5 playback controls worked as expected! I might even recommend people try out Midori if they’re looking for an alternative web browser to use in their day-to-day computing.

Removing old Kernels in Ubuntu 16.04/Linux Mint 18

October 25th, 2016 No comments

Recently I’ve noticed that my /boot partition has become full and I’ve even had some new kernel updates fail as a result. It seems the culprit is all of the older kernels still lying around on my system even though they are no longer in use. Here are the steps I took in order to remove these old kernels and reclaim my /boot partition space.

A few warnings:

  • Always understand commands you are running on your machine before you run them. Especially when they start with sudo.
  • Be very careful when removing kernels – you may end up with a system that doesn’t boot!
  • My rule of thumb is to only remove kernels older than the most recent 2 (assuming I haven’t had any bad experiences with either of them). This allows me to revert back to a slightly older version if I find something that no longer works in the latest version.
First determine what kernel your machine is actually currently running

For example, running the command:

uname -r

prints out the text “4.4.0-45-generic“. This is the release string of the kernel my system is currently running. I do not want to remove this one!

Next get a list of all installed kernels

You can do this a few different ways but I like using the following command:

dpkg --list | grep linux-image

This should print out a list similar to the one in the screenshot below.

Example list of installed kernels

From this list you can identify which ones you want to remove to clear up space. On my system I had versions 4.4.0-21.37, 4.4.0-36.55, 4.4.0-38.57 and 4.4.0-45.66 so following my rule above I want to remove both 4.4.0-21.37 and 4.4.0-36.55.

Remove the old kernels

Again this can be done a number of different ways but seeing as we’re already in the terminal why not use our trusty apt-get command to do the job?

sudo apt-get purge linux-image-4.4.0-21-generic linux-image-4.4.0-36-generic

and just like that almost 500MB of disk space is freed up!
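If there are many old kernels to clean up, the package list can be built programmatically instead of typed by hand (a sketch; always review the printed list before passing it to apt-get purge, and note it assumes the running release string from uname -r appears verbatim in the package names):

```shell
# List installed kernel image packages, excluding the currently running one.
# "^ii" limits the output to packages in the installed state.
dpkg --list | awk '/^ii +linux-image-[0-9]/ { print $2 }' | grep -v "$(uname -r)"
```

Once you are satisfied the output only contains kernels you no longer need, it can be handed to sudo apt-get purge.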

Trying out KeePassX

October 23rd, 2016 No comments

KeePassX is an independent implementation of the popular password manager that supports the KeePass (kdb) and KeePass2 (kdbx) database formats. Like the official KeePass application, KeePassX is open source but the main difference is that KeePass requires Microsoft’s .NET framework or the Mono runtime to be installed whereas KeePassX does not.

The feature list from their website shows that KeePassX offers:

  • Extensive management
    • title for each entry for its better identification
    • possibility to determine different expiration dates
    • insertion of attachments
    • user-defined symbols for groups and entries
    • fast entry duplication
    • sorting entries in groups
  • Search function
    • search either in specific groups or in complete database
  • Autofill (experimental)
  • Database security
    • access to the KeePassX database is granted either with a password, a key-file (e.g. a CD or a memory-stick) or even both.
  • Automatic generation of secure passwords
    • extremely customizable password generator for fast and easy creation of secure passwords
  • Precaution features
    • quality indicator for chosen passwords
    • hiding all passwords behind asterisks
  • Encryption
    • either the Advanced Encryption Standard (AES) or the Twofish algorithm are used
    • encryption of the database in 256 bit sized increments
  • Import and export of entries
    • import from PwManager (*.pwm) and KWallet (*.xml) files
    • export as textfile (*.txt)
  • Operating system independent
    • KeePassX is cross platform, so are the databases, as well
  • Free software
    • KeePassX is free software, published under the terms of the General Public License, so you are not only free to use it free of charge, but also to redistribute it, to examine and/or modify its source code and to publish your modifications as long as you provide the same freedoms for your modified version.

I’ve been a long time user of KeePass and figured I would check out KeePassX to see if there were any advantages to making the switch. Opening up my existing KeePass2 database was a breeze and even the ‘experimental’ autofill seemed to work just fine. I should also point out that, at least on Linux, KeePassX seems to be much quicker and definitely feels more native compared to the WinForms+Mono official version (I imagine the opposite is true while running on Windows).

The password generation tool in KeePassX is also very similar to the one in the official KeePass; however, they’ve opted for some defaults which could actually reduce the randomness, and thus the security, of a password: excluding look-alike characters, ensuring that the password contains characters from every group, etc.

The Password Generator in the official KeePass application

These defaults do make it a bit easier to read or transcribe the passwords should you ever need to, and given a long enough password the impact on security should be minimal.
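To put rough numbers behind that claim: the entropy of a randomly generated password is length × log2(alphabet size), so for a 20-character password, trimming a 62-character alphabet (a–z, A–Z, 0–9) down to 58 characters by excluding look-alikes costs only about two bits (a quick sketch; the four-character exclusion list l, 1, O, 0 is an assumption):

```shell
# Entropy in bits = password length * log2(alphabet size).
# awk's log() is the natural logarithm, so divide by log(2) for log base 2.
awk 'BEGIN {
    printf "62-character alphabet: %.1f bits\n", 20 * log(62) / log(2)
    printf "58-character alphabet: %.1f bits\n", 20 * log(58) / log(2)
}'
```

Compare that two-bit loss to the roughly six bits gained by adding a single extra character, and the trade-off for readability looks quite reasonable.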

The Password Generator in KeePassX

So what are my feelings on KeePassX overall? In my limited use it seems like an excellent alternative to the official KeePass application and one that may almost be preferred on non-Windows platforms. I think I’ll be making the switch to KeePassX for my Linux-based installs.

Update: after some slow progress a few developers decided to fork the KeePassX project over at KeePassX Reboot. We’ll have to see how things with this fork play out but I wanted to mention it here in case you decided that the fork was the better version for you.

KWLUG: Emulating Tor (2016-10)

October 4th, 2016 No comments

This is a podcast recording of a presentation from the Kitchener Waterloo Linux Users Group on the topic of Emulating Tor, published on October 4th, 2016. You can find the original Kitchener Waterloo Linux Users Group post here.

Read more…

Categories: Linux, Podcast, Tyler B Tags: ,