Linux/Unix

Posts about using Linux and Unix

Scanning Linux For Intrusion With RKHunter

RKHunter (or Rootkit Hunter) is a program that can be used to scan a Linux machine for anything that might be a sign of a security breach. It scans the files on the system, looking for suspicious files and unexpected changes to core system files. Just like anti-virus systems it has a database of rootkit definitions that it compares files against to see if they are infected, in addition to checking for changes to core system files.
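As a sketch of typical usage (assuming rkhunter has been installed via your package manager), a run usually consists of refreshing its data files and then running a full check:

```shell
# Refresh rkhunter's data files before scanning
sudo rkhunter --update
# Run all checks; --skip-keypress stops it pausing between test groups
sudo rkhunter --check --skip-keypress
```

The results are also written to a log file (usually /var/log/rkhunter.log) so warnings can be reviewed after the scan finishes.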

Some Useful Curl Snippets

Curl is an incredibly useful tool and has all sorts of flags and options available for every situation. I tend to use curl quite a lot for all kinds of stuff, not just downloading large files. So I thought I would post a few of the most common things that I use the tool for. Note that most of the following URLs don't really exist; they are just for demo purposes. I have also left out the output of these commands as it varies from a few lines to many pages.
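A few snippets of the kind the post covers might look like the following (the URLs are placeholders, as noted above):

```shell
# Fetch just the response headers with a HEAD request
curl -I http://www.example.com/

# Follow any redirects (-L) and save the body to a named file (-o)
curl -L -o page.html http://www.example.com/

# Send form data as a POST request
curl -d "name=value&other=thing" http://www.example.com/submit

# Resume a partially downloaded file, keeping the remote filename
curl -C - -O http://www.example.com/largefile.iso
```

Each of these flags can be combined with the others, which is a large part of what makes curl so flexible.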

Find Architecture And Version Of A Linux Box

When doing an audit of an existing Linux server a good first step is to find out what distribution is running and if the server is running a 32 or 64 bit architecture.

To find out what architecture a server is running you can use the uname command, which prints out system information. Supplying the -a flag makes it print out as much information as possible.

uname -a

This will print out a line similar to the following on an Ubuntu system.

Linux vlad 3.2.0-23-generic #36-Ubuntu SMP Tue Apr 10 20:39:51 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux

This can be broken down bit by bit and will contain the following information.
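Each field can also be printed on its own with an individual uname flag, which is often handier in scripts than parsing the full -a output:

```shell
uname -s   # kernel name, e.g. Linux
uname -n   # network hostname, e.g. vlad
uname -r   # kernel release, e.g. 3.2.0-23-generic
uname -m   # machine hardware name, e.g. x86_64 on a 64 bit system
```

The -m output is the quickest way to answer the 32 or 64 bit question: x86_64 indicates a 64 bit system, while i686 or similar indicates 32 bit.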

Print A Specific Block Of Lines From A File In Linux

If you have a large file of data that you are trying to import, or a log file you are trying to dissect, then you'll rarely want to print it directly out to the screen. Using commands like more or programs like vim can make things a little easier, but you still have to scroll through potentially thousands of lines to find the correct block.

To load a few specific lines from a file you can use a combination of the head and tail commands. The following command will print out lines 200 to 220 from a large file called 'bigfile'. The head command prints the first 220 lines of the file, which are then piped into a tail command that prints only the last 21 lines of that output, i.e. lines 200 to 220 inclusive.

head -n 220 bigfile | tail -n 21

Alternatively, you can use sed to print out the same block from the large file.
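A sketch of the sed version, using a generated demo file in place of 'bigfile':

```shell
# Build a 300-line numbered demo file standing in for 'bigfile'
seq 1 300 > bigfile
# -n suppresses sed's default output; '200,220p' prints only that range
sed -n '200,220p' bigfile
```

The sed form is arguably clearer since both line numbers appear literally in the command, rather than one of them being implied by the arithmetic between head and tail.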

Automatically List Directory Contents When Changing Directory In Linux

When navigating around a Linux box I tend to find I use the same two commands a lot. The first is 'cd' to change directory, and the second is 'ls' to see what is in the new directory. Rather than do this over and over again I decided to look around for a good solution to automate this.

I found a variety of results on the internet, but some were simply creating a different alias that wrapped the same two commands. I found this example on superuser, which solves the problem quite nicely. Here is the example in full.
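The exact superuser snippet isn't reproduced in this excerpt, but a typical version of this solution defines cd as a shell function in ~/.bashrc that falls through to the builtin and then runs ls on success:

```shell
# Redefine cd so every successful directory change is followed by ls;
# 'builtin' avoids the function calling itself recursively
cd() {
    builtin cd "$@" && ls
}
```

Because the function only runs ls when the builtin succeeds, a failed cd (for example to a non-existent directory) behaves exactly as before.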

Adding Apache Reporting To Munin

When you first install a Munin node it will try to install as many plugins as it can so that it can report on different things. For example, if you have a Varnish server running then Munin will detect this and enable the plugins so that it can report on the activity of Varnish. Once you have started getting data through to your Munin server then you can turn on plugins on the nodes to get more data.

The data from any plugin is presented in a standard format and so is understood by the Munin server. Perhaps the most important plugin for my work is the Apache status plugin, which shows what is going on inside Apache. This plugin isn't always enabled along with the Munin node, so you might have to enable it yourself. This is a good way of getting familiar with Munin plugins.
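On a Debian-style system, enabling a plugin is usually a matter of symlinking it from the available set into the node's active plugin directory and restarting the node (the paths shown here are the usual defaults and may differ on your distribution):

```shell
# Link the Apache plugins from the available set into the active set
sudo ln -s /usr/share/munin/plugins/apache_accesses /etc/munin/plugins/apache_accesses
sudo ln -s /usr/share/munin/plugins/apache_processes /etc/munin/plugins/apache_processes
sudo ln -s /usr/share/munin/plugins/apache_volume /etc/munin/plugins/apache_volume
# Restart the node so it picks up the new plugins
sudo service munin-node restart
```

The Apache plugins also rely on mod_status being enabled in Apache itself, so the server-status page needs to be reachable from the node.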

Exporting And Importing Munin Graph Data

When Munin does a data update it stores all of the data from the nodes as a set of rrd files. These files are then picked up by the munin-graph and munin-html programs and turned into the graph images and web pages that you are probably familiar with if you use Munin.

The default location for Munin to store these data files is the directory /var/lib/munin. Each group you define in your config is given its own subdirectory, and the rrd data files for all servers within each group are kept within that directory. If you kept the default Munin config file you will probably have a directory called localhost, which will contain all of the rrd files for your Munin server.
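Since these are standard rrd databases, one way to move them between machines is rrdtool's dump and restore commands, which convert to and from a portable XML format (the filename below is a made-up example; real Munin rrd names encode the host, plugin and field):

```shell
# Dump an rrd file to XML, which is safe to move between architectures
rrdtool dump /var/lib/munin/localhost/localhost-load-g.rrd > load.xml
# Recreate the rrd file from the XML on the destination machine
rrdtool restore load.xml localhost-load-g.rrd
```

The XML round trip matters because raw rrd files are architecture-dependent, so copying them directly between, say, a 32 bit and a 64 bit machine won't work.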

Copying Files With Secure Copy

The secure copy command (run using scp) is a Linux command that allows files to be transferred between two computers. This can be from a local machine to a remote server, from a remote server to a local machine, or even between two remote servers.

When copying to or from a remote host, scp uses ssh for the data transfer. This means that authentication is required, but the files are copied securely. When starting an scp request the command first sets up an ssh connection to the remote location, which is then used for the rest of the transfer.

It is also possible to copy files between two locations on a local hard drive, but in that case you should probably use the standard cp command instead.
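The three directions mentioned above look roughly like this (the hosts, users and paths are invented for illustration):

```shell
# Local file to a remote server
scp backup.tar.gz user@remote.example.com:/var/backups/

# Remote file to the local machine
scp user@remote.example.com:/var/log/syslog ./syslog-copy

# Directly between two remote servers
scp user@host1.example.com:/tmp/report.txt user@host2.example.com:/tmp/
```

In each case the remote side is written as user@host:path, and scp will prompt for a password unless ssh keys are set up.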

Monitoring Performance With Munin

I have been searching for a good server monitoring solution for a while so that I can keep an eye on some of the servers that I run. Tools like Smokeping, Cacti and Nagios seemed promising at the outset, but they are more concerned with bandwidth and server status than with how the server is running. What I really needed was a way to find out how much memory a server was using, how many Apache requests were being made, what the average load of the server was, and also some way of letting me know when things were under strain.

Using Tar To Compress And Uncompress Files

The tar command can be used to compress or extract one or more files in Linux. A tar file isn't actually a compressed format; instead it is a collection of files bundled into a single file. The tar command can take one or more files, combine them into a tar archive, and then compress that archive with gzip. The resulting file will have the extension .tar.gz.

There are a large number of flags that can be used, but only a handful are needed for everyday use.
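As a self-contained sketch of the usual create-and-extract round trip (the file and directory names are demo values, not from the original post):

```shell
# Make a demo directory containing one file
mkdir -p demo && echo "hello" > demo/file.txt
# c = create an archive, z = compress with gzip, f = name of the archive file
tar -czf demo.tar.gz demo
# x = extract; -C unpacks into a different directory
mkdir -p out && tar -xzf demo.tar.gz -C out
cat out/demo/file.txt   # prints "hello"
```

Adding the v flag (e.g. tar -czvf) lists each file as it is archived or extracted, which is useful feedback on large trees.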