Archive

Archive for the ‘Hardware’ Category

Extend the life of your SSD on linux

February 9th, 2014 2 comments

This past year I purchased a laptop that came with two drives, a small 24GB SSD and a larger 1TB HDD. I placed the root filesystem (i.e. /) on the SSD and my home directory (i.e. /home) on the HDD so that I benefit from very fast system booting and application loading while still having loads of space for my personal files. The only downside to this configuration is that linux is sometimes not the best at ensuring your SSD lives a long life.

Unlike HDDs, SSDs have a finite number of write operations before they are guaranteed to fail (although you could argue HDDs aren’t all that great either…). Quite a few linux distributions have not yet been updated to detect and configure SSDs in such a way as to extend their life. Luckily for us it isn’t all that difficult to make the changes ourselves.

Change #1 – noatime

The first change that I make is to configure my system so that it no longer updates each file’s access time on the SSD partition. By default Linux records information about when files were created and last modified as well as when they were last accessed. There is a cost associated with recording the last access time, and disabling it with this option can not only significantly reduce the number of writes to the drive but also give you a slight performance improvement as well. Note that if you care about access times (for example if you like to perform filesystem audits or something like that) then obviously disabling this may not be an option for you.

Open /etc/fstab as root. For example I used nano so I ran:

sudo nano /etc/fstab

Find the SSD partition(s) (remember mine is just the root, /, partition) and add noatime to the mounting options:

UUID=<some hex string> /               ext4    noatime,errors=remount-ro
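
If you don’t want to reboot to apply the change you should be able to just remount the partition in place. A quick sketch for my setup (adjust the mount point to match your own SSD partition):

sudo mount -o remount /
mount | grep ' / '

The second command simply lists the active mount options – noatime should now be among them.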

Change #2 – discard

UPDATE: Starting with Ubuntu 14.04 you no longer need to add discard to your fstab file. TRIM is now handled automatically for you by a scheduled fstrim job instead.

TRIM is a technology that allows a filesystem to immediately notify the SSD when a file is deleted so that it can more efficiently manage the underlying storage and improve the lifespan of the drive. Not all filesystems support TRIM but if you are like most people and use ext4 then you can safely enable this feature. Note that some people have actually had drastic write performance decreases when enabling this option but personally I’d rather have that than a dead drive.

To enable TRIM support start by again opening /etc/fstab as root and find the SSD partition(s). This time add discard to the mounting options:

UUID=<some hex string> /               ext4    noatime,errors=remount-ro,discard
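
As an aside, TRIM can also be run by hand (or from a scheduled job) with fstrim instead of on every delete, which is handy if you are on a release that already handles this for you or if the discard option hurts your write performance. For example:

sudo fstrim -v /

The -v flag makes it report how much space was trimmed.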

Change #3 – tmpfs

If you have enough RAM you can also dedicate some of it to mounting specific partitions via tmpfs. Tmpfs essentially makes a fake hard drive, known as a RAM disk, that exists only in your computer’s RAM while it is running. You could use this for commonly written to temporary filesystems like /tmp or log file locations such as /var/log.

This has a number of consequences. For one, anything that gets written to tmpfs will not be there the second you restart or turn the computer off – it never gets written back to a real drive. This means that while you can save your SSD all of those log file writes, you also won’t be able to use those log files to debug a crash after the fact. Also, being a RAM disk means that it will eat into your RAM, growing larger the more you write to it between restarts. There are options for putting limits on how large a tmpfs partition can grow (the sketch further down shows one of them).

To set this up open /etc/fstab as root. This time add new tmpfs lines using the following format:

tmpfs   /tmp    tmpfs   defaults  0       0

You can lock it down even more by adding some additional options like noexec (disallows execution of binaries on the filesystem) and nosuid (blocks the effect of the suid and sgid bits). Some other locations you may consider adding are /var/log, /var/cache/apt, etc. Please read up on each of these before applying them as YMMV.
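
As a rough example, a more locked-down /tmp entry with a size cap might look something like this (512M is an arbitrary number I picked for illustration – size it to your amount of RAM):

tmpfs   /tmp    tmpfs   defaults,noatime,nosuid,noexec,size=512M  0       0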

Categories: Hardware, Tyler B

A tale of a gillion installs

January 21st, 2014 1 comment

Install number one: LMDE 201303.  I was hoping for the best of both worlds, but I got driver issues instead.  LMDE has known ATI proprietary driver install issues.  I followed the Mint instructions and got it working, then got a blank screen after too much tinkering.  I was surprised that LMDE had this problem since Debian doesn’t, and LMDE should be a more polished version of Debian.  This wasn’t a big deal, but I decided to give Debian a chance.

Install number two: debian stable (7.3).  The debian website has a convoluted maze of installation links, but it’s still fairly easy to find an ISO for the stable version you need.  I installed from the live ISO using a USB key.  The installation and ATI driver update went smoothly, and I thought all was well at first.  I soon realized that about 50% of reboots failed; the audio driver was the culprit.  I installed the latest driver from Realtek/ALSA and it sort of worked, but I was still getting some crap from # dmesg and the audio would crackle with some files.

LMDE.  I live booted LMDE to see if the same issue existed there and it did.

Time for Mint 16.  As expected everything worked.  Man I really wish Ubuntu hadn’t chosen the dark side – their OS is really good.  All of these distros use ALSA audio drivers, so why is Ubuntu the only one that works?   Kernel versions:

debian stable (7.3):
cat /proc/asound/version
Advanced Linux Sound Architecture Driver Version 1.0.24.
Mint 16:
cat /proc/asound/version
Advanced Linux Sound Architecture Driver Version k3.11.0-12-generic.
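
You can also just ask each install which kernel it is actually running, which is a more direct comparison:

uname -r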

One more thing to check.  What kernel version is the real debian testing “jessie” using:

http://packages.debian.org/testing/kernel/linux-image-3.12-1-amd64

LMDE 201303 = 3.2
debian stable 7.3 = 3.2
Mint 16 = 3.11
debian testing “jessie - Jan 2014” = 3.12!

I was determined to try debian testing before settling for Mint.  I tried a netinstall from a USB key, which killed my PC and grub bootloader.  The debian stable live ISO USB key decided to stop working as well.   I finally got a real DVD debian stable install to work, changed the repositories to point to “jessie” and upgraded.  I was very surprised to see this worked!   I’m having some problems with bash, but all of my day to day software is up and running.  Nice.
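
For anyone curious, the repository switch itself was nothing fancy – roughly the following, assuming a default wheezy sources.list (back it up first):

sudo sed -i 's/wheezy/jessie/g' /etc/apt/sources.list
sudo apt-get update
sudo apt-get dist-upgrade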

TL;DR: LMDE was using an old kernel so I needed the real debian testing (jessie) to solve my driver problems.

Screen brightness work around (part 2)

January 19th, 2014 No comments

As mentioned before I am having some issues with my laptop’s hardware and controlling the screen brightness. Previously my work around was to set acpi_backlight=vendor in the grub command line options. While this resulted in having full screen brightness it also removed my ability to use my keyboard function keys to adjust the screen brightness on the fly (not so good when you’re on battery). Removing this option allowed me to manually adjust my screen brightness again but once again always started the laptop at zero brightness. What to do?

While far from a perfect solution my current work around is to use xdotool to simulate key presses on login which raise the screen brightness for me automatically. Here is the script that I run on startup:

#!/bin/bash
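# simulate 20 presses of the brightness-up key to bring the screen back to a usable level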
for i in {1..20}
do
     xdotool key XF86MonBrightnessUp
done

While this works great it still isn’t perfect. Because xdotool requires an X session it can only run once my own session exists. If you were unaware, the login screen (in my case MDM) runs before your user session is started, so while this will automatically brighten my screen it won’t do so until I type in my username and password, leaving me to type into a fully dark screen or manually adjust the brightness up enough to see what I’m doing. Hopefully I’ll have a better solution sooner rather than later…
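
In case it helps anyone reproduce this, one common way to run a script like this at login (not necessarily the only way) is an XDG autostart entry. A minimal sketch – the file name and the Exec path below are just placeholders, so point Exec at wherever you saved the script:

cat > ~/.config/autostart/brightness-fix.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=Brightness fix
Exec=/home/youruser/bin/brightness-fix.sh
EOF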





Fix no screen brightness on boot problem

October 14th, 2013 No comments

I recently upgraded my laptop to a brand new Lenovo Y410P and promptly replaced Windows 8 with a Linux install. Unfortunately I immediately ran into a very strange driver(?) issue where, on boot, the computer would default to the absolute lowest screen brightness level. This meant that I would need to manually adjust the screen brightness up just to see the login screen. Thankfully after some help from the excellent people over on the Ubuntu Forums I managed to find a very easy work around.

1) As root open up /etc/default/grub

I did this by simply issuing the following command:

sudo nano /etc/default/grub

2) Find the line that says GRUB_CMDLINE_LINUX= and add “acpi_backlight=vendor” to the list of options.
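
For example, if the line started out empty it should end up looking something like this (keep any options that were already listed there):

GRUB_CMDLINE_LINUX="acpi_backlight=vendor"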

3) From a terminal run this command to update GRUB

sudo update-grub

4) Reboot!

That’s pretty much it. My computer now boots with the correct screen brightness as one would expect.





And I thought this would be easy…

September 22nd, 2013 1 comment

Some of you may remember my earlier post about contemplating an upgrade from Windows Home Server (Version 1) to a Linux alternative. Since then, I have decided the following:

Amahi isn’t worth my time

 

This conclusion was reached after a fruitless install of the latest Amahi 7 release onto the 500 GB ‘system’ drive included with the EX470. After backing up the Windows Home Server to a single external 2 TB drive (talk about nerve-wracking!), I popped the drive into a spare PC and installed Amahi with the default options.


No, I’m not 13. Yes, this image accurately reflects my frustrations.

Moving the drive back into the EX470 yielded precisely zero results, no matter what I tried – the machine would not respond to a ‘ping’ command, and since I’ve opted to try and do this without a debug board, I don’t even have VGA to tell me what the hell is going on. So, that’s it for Amahi.

When all else fails, Ubuntu

 

After deciding that I really didn’t feel like a repeat of my earlier Fedora experiment, I decided to try out the Linux ‘Old Faithful’ as it were – Ubuntu 12.04 LTS. I opted for the LTS version due to – well, you know – the ‘long-term support’ deal.

Oh, and I upgraded my storage (new 1 TB system drive not shown, and I apologize for the potato-quality image):


The only kind of ‘TB’ I like. Not tuberculosis.

 

Following from the earlier Amahi instructions, I popped the primary 1 TB drive into a spare machine and allowed the Ubuntu installer to do its thing. Easy enough! From there, I installed the following two additional items, having to add an additional repository for the latter (rough install commands follow the list):

  • Openssh-Server

This allows me to easily control the machine through SSH, and – as I understand it – is pretty much a must for someone wanting to control a headless box. Setup was easy-breezy, in that it required nothing at all.

  • Greyhole

For those unfamiliar, Greyhole is – in their own words – an ‘Easily expandable and redundant storage pool for home servers’. One of my favourite things about WHS v1 was its ‘disk pooling’ capability – essentially a JBOD with software-managed share duplication, ensuring that each selected share was copied over to one other disk in the array.
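
For reference, the installs themselves boiled down to roughly this – openssh-server comes straight from the Ubuntu repositories, while Greyhole needs its own repository added first per the instructions on their site (the greyhole package name here is from memory, so double-check it against their docs):

sudo apt-get install openssh-server
# after adding the Greyhole repository and key per their install guide:
sudo apt-get update
sudo apt-get install greyhole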

After those were done with, I popped the drive into the EX470, and – lo and behold! – I was able to SSH in.


This? This is what relatively minor success looks like.

So at this point, I’m feeling relatively confident. I shut down the server (don’t forget -h!) over SSH, popped in the first of the three 3 TB drives, and…

…nothing.  Nada.  Zip.  Zilch.  The server happily blinks away like a small puppy wagging its tail, excited to see its owner but clearly bereft of purpose when left to its own devices.  I can’t ping it, I can’t… well, that’s really it.  I can’t ping it, so there’s nothing I can do.  Looking to see if GRUB was stuck at the menu, I stuck in a USB keyboard and hit ‘Enter’ to no effect.  Yes, my troubleshooting skills are that good.

My next step was to pop both the 1 TB and 3 TB drives into the ‘spare’ machine; this ran fine. Running lshw -short -c disk shows a 1 TB and 3 TB drive without issue. I also ran these parted commands:

mklabel gpt

mkpart primary -1 1

 

(I think that last command is right – see my note at the end of this post.) So, all set, right? Cool. Pop the drive back into the EX470, and…

STILL NOTHING. At this point, I’m ready to go pick up a new four-bay NAS, but I feel like that may be overkill. If anyone has any recommendations on how to get the stupid thing to boot with a 3 TB drive, I’m open to suggestions.
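
For the record, I believe the standard parted invocation puts the start before the end – something closer to this, with /dev/sdX replaced by the actual 3 TB drive (and note that it wipes whatever is on it):

sudo parted /dev/sdX mklabel gpt
sudo parted /dev/sdX mkpart primary 0% 100%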

 

Finding a replacement for Windows Home Server

July 29th, 2013 6 comments

Hello, everyone! It’s great to be back in the hot seat for this, our third installment of The Linux Experiment. I know that last time I caused a bit of a stir with my KDE-bashing post, so will try to keep it relatively PG this time around.

Not many people know about it or have used it, but – through an employee purchase program about five years ago – I was able to get my hands on the HP EX470 MediaSmart Home Server. What manner of witchcraft is this particular device, you may ask? Here’s a photo preview:

Server

It really is about as simple as it looks. The EX470 (stock) came equipped with a 500 GB drive, pre-loaded with Windows Home Server – which in turn was built on Windows Server 2003. 512 MB of RAM and an AMD Sempron 3400+ rounded it off; the device is completely headless, meaning that no monitor hookup is possible without a debug cable. The server also comes with four(?) USB ports, eSATA, and gigabit ethernet.

My current configuration is 3 x 1 TB drives, plus the original 500 GB, and an upgraded 2 GB DIMM. One of the things I’ve always loved about Windows Home Server is its ‘folder duplication’. Not merely content to RAID the drives together, Microsoft cooked up an idea to have each folder able to duplicate itself over to another drive in case of failure. It’s sort of like RAID 1, but without entirely-mirrored disks. Still, pretty solid redundancy.

Unfortunately for me, this feature was removed in the latest update to Windows Home Server 2011 – and support for that is even waning now, leading me to believe that patches for this OS may stop coming entirely within the next year or two. So, where does that leave me? I’m not keen to run a non-supported OS on this thing (it is internet-connected), so I’m definitely looking into alternatives.

Over the next few days, I plan to write about my upcoming ‘adventures’ in finding a suitable Linux-based alternative to Windows Home Server. Will I find one that sticks, or will I end up going with a Windows 8 Pro install? Only time will tell. Stay tuned!

Categories: Dana H, Hardware, Linux

Installing Bluetooth devices on Kubuntu

July 27th, 2013 No comments

This is actually a much easier process than I imagined it would be.

First: Ensure your devices (mouse, headphones, keyboard, etc…) are charged and turned on.

Next click on the “Start” menu icon in the bottom left of the desktop screen.

Then click on the “Computer” icon along the bottom, followed by System Settings.

Computer Tab

This will take you into the System Settings folder where you can change many things. Here we will select Bluetooth, since that is the type of device you want to install.

Bluetooth Menu

I took these pictures after I successfully installed my wireless USB keyboard and mouse. So you know I am not bullshitting about this process actually working.

Like most Bluetooth devices, mine have a red “Connect” button on the bottom. Ignore the sweet, sweet compulsion to press that button. I’m convinced it is nearly useless. Instead, use the “Add devices” method, as seen here.

Add Device

More awesome Photoshop.

Now, if you followed my first instruction (charge and turn on your Bluetooth Device) you should see them appear in this menu. Select the item you would like to add and click next. This will prompt you to enter a PIN on the device you wish to install (if installing a keyboard), or it will just add your device. If you have done this process successfully, your device will show up in the device menu. If it does not, you fucked up.

 

Dual Booting Ubuntu 13.04 and Windows 8 on a Lenovo Y400 IdeaPad

July 27th, 2013 1 comment

With the third edition of The Linux Experiment already underway, I decided to get my new laptop set up with an Ubuntu partition to work with over the next few months. A little while back, I purchased this laptop with intent to use it as a gaming rig. It shipped with Windows 8, which was a serious pain in the ass to get used to. Now that I’ve dealt with that and have Steam and Origin set up on the Windows partition, it’s time to make this my primary machine and start taking advantage of the power under its hood by dual-booting an Ubuntu partition for development and experiment work.

I started my adventure by downloading an ISO of the latest release of Ubuntu – at the time of this writing, that’s 13.04. Because my new laptop has UEFI instead of BIOS, I made sure to grab the x64 version of the distribution.

Aside: If you’re using NoScript while browsing Ubuntu’s website, you’ll want to keep an eye on the address bar while navigating through the download steps. In my case, the screen that asks you to donate to the project redirected me to a different version of the ISO until I enabled JavaScript.

After using Ubuntu’s Startup Disk Creator to create a bootable USB stick, I started my first adventure – figuring out how to get the IdeaPad to boot from USB. A bit of quick googling told me that the trick was to alternately tap F10 and F12 during the boot sequence. This brought up a boot menu that allowed me to select the USB stick.

Once Ubuntu had booted off of the USB stick, I opened up GParted and went about making some space for my new operating system. The process was straightforward – I selected the largest existing partition (it also helped that it was labelled WINDOWS_OS), and split it in half. My only mistake in this process was to choose to put the new partition in front of the existing partition on the drive. Because of this, GParted had to copy all of the data on the Windows partition to a new physical location on the hard drive, a process that took about three hours.

The final partitioning scheme with my new Linux partition highlighted

With my hard drive appropriately partitioned, it was time to install the operating system. The modern Ubuntu installer pretty much takes care of everything, even going so far as selecting an appropriate space to use on the hard drive. I simply told it to install alongside the existing Windows partition, and let it take care of the details.

The installer finished its business in short order, and I restarted the machine. Ubuntu booted with no issues, but my Windows 8 partition refused to cooperate. It would seem as though something that the installer did wasn’t getting along well with UEFI/SecureBoot. Upon attempting to boot Windows, I got the following message:

error: Secure Boot forbids loading module from (hd0,gpt8)/boot/grub/x86_64-efi/ntfs.mod.
error: failure reading sector 0x0 from 'cd0'
error: no such device: 0030DA4030DA3C7A
error: can't find command 'drivemap'
error: invalid EFI file path

Press any key to continue…

Uh oh.

Like I said, I could boot Ubuntu, so I headed on over to their website and read their page on UEFI. At first glance, it seemed as though I had done everything correctly. The only place that I deviated from these instructions was in manually resizing my Windows partition to create space for my new Ubuntu partition.

Thinking that I might be experiencing troubles with  my boot partition, I took a shot at running Ubuntu’s Boot-Repair utility. It seemed to do something, but upon restarting the machine, I found that I had even more problems – now a Master Boot Record wasn’t found at all:

It would appear as though I may have made things worse…

After dismissing the boot device error, I was prompted to choose which device to boot from. I chose to boot Windows’ UEFI Repair partition, and was (luckily) able to get to a desktop. Unfortunately, none of the other partitions on the device seem to work, so I’m back where I started, except that now in addition to having to put up with Windows 8, I also have a broken master boot record.

Lenovo: 1 / Jon: 0.





What is this, text for ants? Part I

July 26th, 2013 No comments

Unlike many people who may be installing a version of Linux, I am doing so on a machine that has a projector with a 92″ screen as its main display.

So, upon initial installation of Kubuntu, I couldn’t see ANY of the text on the desktop – it was itty bitty.

Font for Ants

I can’t even read this standing inches away.

In order to fix this, I had to hook up an additional display.

Thankfully, living in a house with a computer guru, I had many to choose from.

In order to get my secondary display to appear, I had to first plug it into the display port on the machine I am using. I then had to turn off the current display (projector) and reboot the machine so that it would initialize the use of my new monitor.

Sounds easy enough, and it was, albeit with some gentle guidance from Jake B.

From here, I am able to properly configure my display.

The thing I am enjoying most about Kubuntu so far is that it is very user friendly. It seems almost intuitive where each setting can be found in menus.

So these are the steps I followed to change my display configuration.

I went into Menu > Computer > System Settings

Computer Tab

Check out my sweet Photoshop Skills. I may have taken this picture with a potato.

Once you get into the System Settings folder, you have the option to change a lot of things. For example, your display resolution.

System Settings

Looks a lot like the OSX System Preferences layout.

Now that you are in this menu, you will want to select Display and Monitor from the options. Here you can set your resolution, monitor priority, mirroring, and multiple displays. Since I will only be using this display on the Projector, I ensured that the resolution was set so that I could read the text properly on the Projector Screen. Before disabling my secondary monitor, I also set up my Bluetooth keyboard and mouse, which I will talk about in another post.

This process only took a few moments. I will still have to tweak the font scaling, as I have shit-tastic eyesight.

Make printing easy with the Samsung Unified Linux Driver Repository

July 13th, 2013 No comments

I recently picked up a cheap Samsung laser printer and decided to give the Samsung Unified Linux Driver Repository a shot while installing it. Basically the SULDR is a repository you add to your /etc/apt/sources.list file, which lets you install one of their driver management applications. Once that is installed, any time you go to hook up a new printer the management application automatically searches the repository (full of the official Samsung printer drivers), finds the correct one for you and installs it. Needless to say I didn’t have any problems getting this printer to work on linux!
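
For anyone wanting to try it, the setup amounts to appending their repository line to sources.list and refreshing apt – something like the following, where the deb line is a placeholder you should replace with the exact one given in the SULDR instructions:

echo 'deb <suldr-repository-line>' | sudo tee -a /etc/apt/sources.list
sudo apt-get update

After that the driver management application is just an apt-get install away.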




Categories: Hardware, Linux, Tyler B