How To Patch The Debian 6 Squeeze Shellshock Bug

I run a few webservers at work that are internal facing only (intranet) and run Debian 6 Squeeze.  I’ve been monitoring the Shellshock exploit since it was discovered a few weeks ago and have been looking for a way to get those few systems patched…despite them existing only internally.  Patches for Squeeze-lts (long term support) were released quickly, and then just last week another patch was put into play as well.  I decided to go ahead and patch these internal systems, and since I couldn’t find many blog posts out there on how to do it…I’m sharing how I did it.

Difference Between Squeeze and Squeeze-lts

The difference between plain Squeeze and Squeeze-lts is that the LTS (long term support) repositories continue to receive backported patches from the current release tree (which is version 7 for Debian).  I didn’t originally install or set up these two internal servers, so the first thing I had to do was check which version of Debian they are running and then see whether they are using the LTS repositories.

Finding Your Version of Debian

lsb_release -a

This command reported a vanilla Squeeze install for me.
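
For a vanilla Squeeze box, the output should look something like the following (the LSB module line and the exact point release will vary):

No LSB modules are available.
Distributor ID: Debian
Description:    Debian GNU/Linux 6.0.x (squeeze)
Release:        6.0.x
Codename:       squeeze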

Changing Repositories to LTS

Now to see which repositories are enabled.

nano /etc/apt/sources.list

Open your sources list with your favorite text editor.  If you have vanilla sources like my two servers do, you can simply comment out the sources listed there and paste in the following:


deb http://http.debian.net/debian/ squeeze main contrib non-free
deb-src http://http.debian.net/debian/ squeeze main contrib non-free

deb http://security.debian.org/ squeeze/updates main contrib non-free
deb-src http://security.debian.org/ squeeze/updates main contrib non-free

deb http://http.debian.net/debian squeeze-lts main contrib non-free
deb-src http://http.debian.net/debian squeeze-lts main contrib non-free

Now that your sources have changed, update and patch your system:

 apt-get update && apt-get upgrade && apt-get dist-upgrade
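
Once the upgrade finishes, it’s worth confirming where the new bash package came from.  One quick check (the exact version string on your system will differ):

apt-cache policy bash

The output shows the installed version of bash along with the repositories it can come from; after the upgrade, the installed version should be the one offered by the squeeze-lts entry you just added.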

Checking To See if You’re Still Vulnerable

You can use bash itself to see if you’re vulnerable to the bug.  Execute the following command:

env x='() { :;}; echo vulnerable' bash -c 'echo hello'

This should return the following if you are patched:

bash: warning: x: ignoring function definition attempt
bash: error importing function definition for `x'
hello

If you’re not patched…the word ‘vulnerable’ will appear in your results.
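
The first Shellshock fix turned out to be incomplete, which is why the second patch mentioned earlier was released.  One widely circulated test for that follow-up bug (CVE-2014-7169) is below; run it from a scratch directory such as /tmp, because on a vulnerable system it creates a file named ‘echo’ there:

cd /tmp
env X='() { (a)=>\' bash -c "echo date"; cat echo

On a patched bash you should simply see the word ‘date’ plus a complaint from cat that the file ‘echo’ does not exist.  On a vulnerable bash the current date gets written into a file called echo, which cat then displays…delete that file when you’re done testing.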

Further Reading on Shellshock

You can read further about how to switch to LTS repositories here:  https://wiki.debian.org/LTS/Using

For more reading on the Shellshock bug, how it is being exploited and the history/timeline, see here:  http://www.troyhunt.com/2014/09/everything-you-need-to-know-about.html

Finding Files Modified in the Past Few Days

It’s said that with age comes distinction and wisdom. If we believe that, then we’re talking about people and not files.  Working with older files doesn’t make you wise beyond your years…one could argue that it makes you a glutton for punishment :).  It doesn’t have to be that way, though, because finding and working with older files is exactly the kind of problem the ‘find’ command solves.

Recently, I was tasked with finding files that had been modified in the past 5 days. I was to copy these files from a SAN snapshot and move them over to a recovery area that anyone could get to (read: a Windows file share).

We were doing this in Linux because the snapshot, which was an NTFS filesystem, would only mount in Linux.  It seems that Linux is more forgiving than Windows of errors on a hard disk when dealing with NTFS.

So, the recovered files needed to land on a Windows file server, designated as X.X.X.X below.  I decided to use the find command to locate all files that were modified in the past 5 days.  The find command can be summarized succinctly with the following logic statement:  find where-to-look criteria what-to-do.  Keeping this logic in mind, I used the following command to get what I needed:

find . -daystart -mtime -5 -exec cp -a {} /home/devnet/fileshare\$\ on\ X.X.X.X/RECOVER/ \;

Let’s break down what the above command is doing.  First and foremost, giving find a period means to search the current directory (the where-to-look in the logic statement above).  If you need to search somewhere else, replace the period with the path to the directory you want to search.  Next, I’ve added the following flags (the criteria in the logic statement above), which I’ll define:

  1. -daystart:  This flag measures times from the beginning of the current day instead of from 24 hours ago, which is the default.  It has to appear before -mtime on the command line to take effect.
  2. -mtime -5:  -mtime stands for ‘modified time’, and the -5 means files modified within the past 5 days.  Combined with -daystart, that covers everything modified since midnight five days ago rather than since an arbitrary point 120 hours ago.
  3. -exec:  specifies that, for each result, a new command should be executed.

The {} above is where each result of the find command is passed in; the command after -exec is run once for every result.

So we’re copying with cp -a; the -a flag copies directories recursively and preserves attributes such as ownership, permissions, and timestamps.  That copies everything find turned up to the path stated next (the what-to-do in our logic statement above).  The trailing \; terminates the -exec clause.  It must always be present…and -exec should generally be the last option given to find as well, since it swallows everything up to that terminator.
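
As a side note, GNU find also accepts + in place of \; as the terminator, which hands many results to a single cp instead of launching one copy per file.  A rough equivalent of the command above, using GNU cp’s -t option to name the destination (since {} has to come last before the +), would be:

find . -daystart -mtime -5 -exec cp -a -t /home/devnet/fileshare\$\ on\ X.X.X.X/RECOVER/ {} +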

It’s important to note that I mounted the NTFS SAN snapshot using the GUI, just like I would any NTFS volume on a Linux desktop, and that I ran this find command from the root of the directory I wanted to search on that snapshot.  The server I was copying the files to, noted as X.X.X.X above, was a Windows file server on our network with permissions open enough for me to copy to it.  I used Samba to mount that server inside the ‘fileshare’ directory in my home directory.  The RECOVER directory is one I created beforehand to hold all the files I found, so they’d stay separate from anything else in the root of the file server share.
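
For the curious, the same share can also be mounted without the desktop tools.  This is only a sketch, and the share name, mount point, and username below are assumptions rather than what I actually used:

# Assumed share name and username; requires the cifs-utils package.
mkdir -p /mnt/fileshare
mount -t cifs '//X.X.X.X/fileshare$' /mnt/fileshare -o username=devnet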

There are more than a couple of ways to do what I did above, and numerous ways to alter the command and adapt it to your needs.  For example, perhaps you want to find all files modified within the past 3 days and delete them…and you’re not a stickler for the -daystart option.  In that case:

find . -mtime -3 -exec rm -rf {} \;

Maybe you want to copy MP3s from a directory to a separate location:

find . -name '*.mp3' -exec cp -a {} /path/to/copy/stuff/to \;

There are lots of ways to adapt this to help locate and deal with files.  The command line/shell is always more than powerful enough to help you get what you need.  I hope this helps you, and if you have questions or just want to say thanks…please don’t hesitate to let me know in the comments below.

Finding Files with locate

Many Linux users use the ‘find’ utility when searching for files using the command line on their system. They’ll do a simple:

find / -name 'pattern'

Really though, the power of find isn’t just in finding names of files but rather specific details about those files. For example, if you wanted to find files which are writable by both their owner and their group:

find / -perm -220

or perhaps find any file that’s been altered in your Downloads directory in the past 24 hours:

find /home/user/Downloads/ -mtime 0

As you can see, the find command is very versatile and can match files on an array of different attributes.  There are times, though, when I’m just looking for something and don’t want to wait for find to scan the entire directory tree to track it down.  That’s where locate comes in with quick and simple results.

Using the Locate Command

The locate command requires the mlocate package.  Most major distributions have it available; if not, head over to the mlocate homepage and install it manually.  Once that’s done, you’ll want to run a command to index your filesystem right away…otherwise you’ll have to wait for it to run automatically, since the package registers a cron job to do so at the system level.  Open a terminal, change to the root user, then execute the following:

updatedb &

This builds the mlocate database that indexes your files; the ‘&’ sends the process to the background.  You can now log out of the root terminal and the process will quietly keep working in the background.

After the command completes, using mlocate is as easy as using the locate command:

locate firefox | less

The command above will list every path with firefox in it and pipe the output through less so you can use the spacebar or enter key to scroll through it.  We pipe it through less because anything that lives inside a ‘firefox’ directory gets reported too, so the output can be long.  While this tool isn’t as granular as the find command, it is a quick way to track down paths, directories, and files you know should exist.  Since the data is indexed ahead of time by updatedb (via cron), the results come back very quickly and the command does not have to scan the filesystem to return them.

There are plenty of more advanced options via flags (such as following symbolic links, making the search case insensitive, and even using regular expressions).  See the man page for details on how each of these options works.  Play around with locate and see what you can do!  It’s a powerful and quick search command!
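
For example, with the mlocate version of locate, -i ignores case and -r takes a basic regular expression:

locate -i firefox
locate -r '\.mp3$'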

Convert PNG to GIF via Command Line

I installed a bare bones Arch Linux system today and took a screenshot.  With no graphics utilities installed, I needed a way to convert a PNG to a GIF for a Simple Machines forum template thumbnail.  I figured I’d use a command line utility to help me, and ImageMagick is available in just about every distribution’s repositories.  A quick read through the ImageMagick manpage turned up the convert command, so I thought I’d share it with everyone.  Use convert in the following fashion:  convert [input-options] input-file [output-options] output-file

convert SMFPress.png -channel Alpha -threshold 80% -resize 120x120 thumbnail.gif

This gave me a quick conversion with little visible loss, resized to fit the thumbnail I needed to display online.  For more information on the options I used, and others I didn’t, take a peek at the ImageMagick online help page for convert.
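
If you want to sanity-check the result without a graphical viewer, ImageMagick’s identify command prints the format and geometry of the new file:

identify thumbnail.gif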

Read & Display Single Line of a File

Sometimes when I’m troubleshooting a PHP error, the debugger points me at a specific line number in a file, and I want to know what that line says without opening the file.  From the command line, you can accomplish this in the following way:

head -n 96 filename.php | tail -n 1

This quickly displays the 96th line of filename.php: head prints the first 96 lines and tail keeps only the last one of those. Hope this helps someone as much as it has helped me.
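
If you’d rather not chain two commands, sed can do the same thing by printing only the line you name:

sed -n '96p' filename.php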

Photo Albums in one shell command!

Managing photo albums with programs or flat files can be time consuming and tedious. However, there is an alternative. Through the use of ImageMagick and album, two fantastic programs, you can build your own photo album and even design your own skins using CSS (cascading style sheets). I gave this tutorial a try and had a sharp-looking photo gallery in a matter of minutes. The great thing for me is that it was all command line, so I didn’t need an X session for it to build an album on my server. I just SFTPed in, dropped the photos, opened up a telnet session, executed the command, and voilà! A sharp and clean photo album in the directory where I executed the command.
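
I won’t repeat the tutorial here, but the basic invocation, as best I recall it (double-check the album documentation linked below), is simply to point the album script at a directory full of photos and let it generate the thumbnails and HTML in place:

album /path/to/photos/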

Give it a try and you’ll agree…this software is very nice and very handy in case you need to add a photo album to your site.

Tutorial: http://www.linuxplanet.com/linuxplanet/tutorials/5681/2/

ImageMagick: http://www.imagemagick.org/

Album: http://marginalhacks.com/Hacks/album/
