
Joe Dog Software

Proudly serving the Internets since 1999

A sh script INI parser

An INI file provides an old-timey way to configure applications. On MS-DOS and early versions of Windows, it was the primary configuration mechanism. (Sadly, it’s been mostly replaced by the stupid registry.) Fortunately, INI files are still around because they remain very useful. On Linux, many developers have adopted the format.

Years ago Your JoeDog wrote an INI parser in sh. This enabled him to use one config file across multiple SAP landscapes. Each section of the file is headed by the landscape name: [D] [Q] [P] etc. A script’s configuration varied depending on which landscape it was launched in. This enabled us to test a script on one landscape and promote it to the next without any changes.
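Such a file might look something like this (a hypothetical sketch; the host names are made up):

[D]
 srv = sap-dev.joedog.org
[Q]
 srv = sap-qa.joedog.org
[P]
 srv = sap-prod.joedog.org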

Recently a colleague needed this type of one-with-many configuration. She needed to loop through a list of servers and reference their many attributes. Your JoeDog dusted off this parser for her and now he’s passing it along to you.

Consider this INI file:

#
# The parser supports comments
[WEB]
 srv = www.joedog.org
 usr = jdfulmer
 file = haha.txt
 path = /usr/local/content/www
[FTP]
 srv = ftp.joedog.org
 usr = jeff
 file = papa.txt
 path = /usr/local/content/ftp

Now here’s a script that contains our parser along with an example of how to use it:

#!/bin/sh
##++
## Parses an INI style file:
## [section]
## attr=thing
## key=val
## [header]
## thing=another
## foo=bar
## <p>
## @param file full path to the INI file
## @param section a header that matches the stuff in brackets [section]
## @return void (the variables are made available to your script)
##--
ini_parser() {
 FILE=$1
 SECTION=$2
 eval $(sed -e 's/[[:space:]]*=[[:space:]]*/=/g' \
    -e 's/[;#].*$//' \
    -e 's/[[:space:]]*$//' \
    -e 's/^[[:space:]]*//' \
    -e "s/^\(.*\)=\([^\"']*\)$/\1=\"\2\"/" \
    < $FILE \
    | sed -n -e "/^\[$SECTION\]/I,/^\s*\[/{/^[^;].*=.*/p;}")
}
# A list of sections that we'll loop through
SECTIONS="WEB FTP"
for SEC in $SECTIONS; do
 ini_parser "papa.conf" $SEC
 echo "scp $file $usr@$srv:$path/$file"
done
exit

Now let’s run this script and see what happens:

Pom $ sh papa
scp haha.txt jdfulmer@www.joedog.org:/usr/local/content/www/haha.txt
scp papa.txt jeff@ftp.joedog.org:/usr/local/content/ftp/papa.txt
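If you’re wondering where those variables come from, the sed pipeline boils the requested section down to plain shell assignments, and eval defines them in the current shell. For the [WEB] section it hands eval roughly this (a sketch; exact quoting may vary):

srv="www.joedog.org"
usr="jdfulmer"
file="haha.txt"
path="/usr/local/content/www"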





So Are You Vulnerable To Shell-shock?

Here’s a quick command line test to see if you’re vulnerable to shell-shock, the bash vulnerability that everyone — I mean everyone — is talking about:

$ env x='() { :;}; echo 1. env' bash -c "echo 2. bash"

If your bash is vulnerable, it will execute the echo command embedded in the environment variable; if it’s not vulnerable, it will only execute the command after -c.

A vulnerable system prints this:

$ env x='() { :;}; echo 1. env' bash -c "echo 2. bash"
1. env
2. bash

A non-vulnerable system prints this:

$ env x='() { :;}; echo 1. env' bash -c "echo 2. bash"
2. bash

On the vulnerable system, the echo command that is set in the environment is executed by bash when the shell is invoked:

env x='() { :;}; echo 1. env' bash -c "echo 2. bash"

The ‘echo 1. env’ portion, the part smuggled in through the environment variable, should NOT be executed. That’s a bug; it needs to be fixed.
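For context, the flaw piggybacks on bash’s legitimate mechanism for passing functions to child shells through the environment. Run from a bash prompt, this harmless sketch shows the intended feature:

# Define a function, export it, and call it from a child bash.
# This is the mechanism Shellshock abuses.
f() { echo "hello from an exported function"; }
export -f f
bash -c f

A vulnerable bash parsed any environment variable whose value began with ‘() {’ as one of these function definitions and then kept going, executing whatever trailing commands followed the closing brace.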

NOTE: The second command was run on the server that hosts this blog entry. You guys can quit trying, mmmkay?

 



Shellshocked

Wired provides an interesting angle on the bash shell bug that has all your panties in a bind:

[Brian] Fox drove those tapes to California and went back to work on Bash, other engineers started using the software and even helped build it. And as UNIX gave rise to GNU and Linux—the OS that drives so much of the modern internet—Bash found its way onto tens of thousands of machines. But somewhere along the way, in about 1992, one engineer typed a bug into the code. Last week, more than twenty years later, security researchers finally noticed this flaw in Fox’s ancient program. They called it Shellshock, and they warned it could allow hackers to wreak havoc on the modern internet.

[Wired: The Internet Is Broken]

 



sh Script: Read lines and omit comments and blanks

I perform remote operations on a series of servers. Rather than maintain the server list in several scripts, I consolidated it into a single file called servers.txt. Exciting! The second I did that, I raised my own bar. You’d expect a config file parser to omit comments and blank lines, right? I do.

Writing that parser took longer than I expected. In order to save you time, dear reader, I decided to post it here. This sh script reads a file into an array while skipping comment lines and blanks.

Here’s a sample config file:

#
# Comment line 
# Wed Feb 25 19:28:54 EST 2014
homer.joedog.org
 marge.joedog.org
bart.joedog.org
 lisa.joedog.org
burns.joedog.org
# EOF

And here’s a script to parse it:

#!/bin/sh
# NOTE: relies on bash features ([[ =~ ]], arrays, let), so /bin/sh
# must be bash, or run the script with bash directly.
let X=0
while read line ; do
  # skip lines with no non-whitespace characters
  if [[ ! $line =~ [^[:space:]] ]] ; then
    continue
  fi
  # skip comment lines (those beginning with '#')
  [ -z "`echo $line | grep '^#'`" ] || continue
  SRV[$X]=$line
  let X=$X+1
done < servers.txt
for (( i=0; i<${#SRV[@]}; i++ )); do
  echo ${SRV[$i]}
done

Here’s the output from the script:

$ sh ./haha
homer.joedog.org
marge.joedog.org
bart.joedog.org
lisa.joedog.org
burns.joedog.org

Here’s another way to coax the data out of the $SRV array. You can convert it into a space separated string and loop through it in a traditional manner:

SRV=${SRV[@]}
for S in $SRV ; do
  echo $S
done

After you guys vet this in the comments, I’ll add it to the sh scripting cheat sheet. Happy hacking.

UPDATE: A reader sends me a one-liner which implements similar functionality. If you don’t require an indexed array, its only drawback is its perl dependency.

SRV=$(egrep '[^[:space:]$]' servers.txt|egrep -v '^#'|perl -pe 's/^\s+//')
for S in $SRV ; do
  echo $S
done
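If you’d rather skip the perl dependency altogether, a rough grep/sed equivalent (an untested sketch) looks like this:

SRV=$(grep -v '^[[:space:]]*#' servers.txt | grep '[^[:space:]]' | sed 's/^[[:space:]]*//')
for S in $SRV ; do
  echo $S
done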


Creating Config Files For sh Scripts

Your JoeDog uses Mondo Rescue for bare-metal Linux restoration. We use mondorestore to recover the OS and NetBackup to recover its content. Since we’re only concerned about archiving the OS for bare-metal recovery, it’s necessary to exclude directories when we run mondoarchive.

My exclude requirement varies from server to server so I wanted to build the list dynamically. As a coder, I have a religious aversion to altering scripts for the purpose of configuring them. If we set config variables inside the script, then we have a different version on every server. That’s a paddlin’.
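The obvious fix is to move the settings out of the script. When the settings are simple name=value pairs, sh can even source them directly; a minimal sketch (the file and variable names here are hypothetical):

# /etc/archiver.conf (hypothetical): the only file that differs per server
NFSHOST=backup.joedog.org
EXCLUDE_CONF=/etc/mondo-exclude.conf

# meanwhile, the script itself never changes:
. /etc/archiver.conf
echo "archiving to $NFSHOST"

My exclude list needed a little more than that, though.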

For my mondoarchive script, I developed a pretty slick way to read a configuration file and build an exclude list. The list is configured in a conf file that ignores comment lines and superfluous white space. A typical configuration looks like this:

#
# This file is maintained by the Puppet Master 
# 
# This is the exclude list for mondoarchive. Directories inside
# this list will not be archived for bare metal recovery.
#
/tmp
/export
/usr/src
/var/mail
/var/cache
/var/log

My mondoarchive script builds a string of pipe separated directories like this:

/tmp|/export|/usr/src|/var/mail|/var/cache|/var/log
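Joining the entries is just string concatenation with a ‘|’ separator. Here’s a minimal sketch of that step (the conf file name, and how the list is handed to mondoarchive, are assumptions on my part):

EXCLUDE=""
while read line ; do
  case $line in
    "#"*|"") ;;                        # skip comments and blank lines
    *) if [ -z "$EXCLUDE" ] ; then
         EXCLUDE="$line"
       else
         EXCLUDE="$EXCLUDE|$line"
       fi ;;
  esac
done < exclude.conf
# mondoarchive ... -E "$EXCLUDE"       # exact option usage may differ
echo "$EXCLUDE"    # /tmp|/export|/usr/src|/var/mail|/var/cache|/var/log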

Since very few of you will have a similar use case, I wrote an example that reads the file into a list. This version loops through the list and prints each entry.

#!/bin/sh
# An example script that reads a list from a config
# file into a sh script array.
CONF="haha.conf"
LIST=""
#
# Read the directory list from $CONF
if [[ -e $CONF ]] ; then
  while read line ; do
    chr=${line:0:1}
    # XXX: Use awk's substr on older systems like
    # HPUX which don't support the above syntax.
    # chr=$(echo $line | awk '{print substr($1,0,1)}')
    case $chr in
     '#')
       # ignore comments
       ;;
     *)
       if [[ ${#line} -gt 2 ]] ; then
         if [[ -z $LIST ]] ; then
           LIST="$line"
         else
         LIST="$LIST $line"
         fi
       fi
       ;;
    esac
  done < $CONF
else
  echo "$0: [error] unable to locate $CONF"
fi
let X=1
for I in $LIST ; do
  echo "$X: $I"
  let X=$X+1
done

Let’s run this bad boy and see what happens:

$ sh haha
1: /tmp
2: /export
3: /usr/src
4: /var/mail
5: /var/cache
6: /var/log

If some of the concepts listed don’t make sense, then you might want to see our sh scripting cheat sheet. It will help you understand things like ‘-e $CONF’ and sh script arrays. Happy hacking.

UPDATE: Given the introduction to this post, it’s likely that many of you have arrived here in search of a mondoarchive backup script. Well, we won’t let you leave empty handed. You can grab my archive script here: Mondo Rescue Archive Script

This script writes both NFS-recoverable archives and DVD images to an NFS-mounted volume. Here’s its usage banner:

Usage: archiver [-c|-n]
Requires either a '-c' or a '-n' argument
  -c      create a CD Rom archive
  -n      create an NFS archive
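For the curious, the argument handling behind a banner like that can be done with plain getopts; here’s a minimal sketch (the mode names are placeholders, not the actual archiver internals):

#!/bin/sh
usage() { echo "Usage: archiver [-c|-n]" ; exit 1 ; }
MODE=""
while getopts "cn" OPT ; do
  case $OPT in
    c) MODE="cdrom" ;;   # create a CD Rom archive
    n) MODE="nfs"   ;;   # create an NFS archive
    *) usage        ;;
  esac
done
[ -n "$MODE" ] || usage
echo "building a $MODE archive"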



Use Fido To Process FTP Uploads

Did you ever want to process a file immediately after it was uploaded via FTP? You could have the upload script execute a remote command after the file is uploaded. That requires shell access that you may or may not be able to grant. On the server, you could run a processing script every minute out of cron but that could get messy.

Fido provides an alternative method.

Starting with version 1.0.7, Fido has the ability to monitor a file or directory by its modification date. When the date changes, fido launches a script. We can use this feature to process files that are uploaded via ftp.

In this example, we’ll monitor a directory. In fido.conf, we’ll set up a file block that points to a directory. (For more information about configuring fido, see the user’s manual). This is our configuration:

/home/jdfulmer/incoming {
 rules = modified
 action = /home/jdfulmer/bin/process
 log = /home/jdfulmer/var/log/fido.log
}

With this configuration, fido will continuously watch /home/jdfulmer/incoming for a modification change. When a file is uploaded, the date will change and fido will launch /home/jdfulmer/bin/process. Pretty sweet, huh?

Not quite. The modification date will change the second ftp lays down the first byte. Our script would start to process the file before it’s fully uploaded. How do we get around that? We can make our script smarter.

For the purpose of this exercise, I’m just going to move uploaded files from incoming to my home directory. Here’s a script that will do that:

#!/bin/sh
PREFIX="/home/jdfulmer/incoming"
FILES=$(ls $PREFIX)
for F in $FILES ; do
  # wait until no process holds the file open, i.e. the upload has finished
  while [ -n "$(lsof | grep $F)" ] ; do
    sleep 1
  done
  mv $PREFIX/$F /home/jdfulmer
done

In order to ensure the file is fully uploaded, I check lsof for its name. If there’s an open file handle under that name, then the script will continue to loop until it’s cleared. When the while loop breaks, the script moves the file.

There’s just one more thing to think about. When the script moves the file, what happens to the directory fido is watching? Yes. Its modification date changes. In my example, process runs a second time but does nothing since nothing is there. Depending on your situation, you may need to make the script a little smarter.
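For instance, a cheap guard at the top of the script (a sketch) makes that second run an explicit no-op:

# bail out immediately when the directory is empty, so the trigger
# caused by our own mv costs nothing
if [ -z "$(ls -A /home/jdfulmer/incoming)" ] ; then
  exit 0
fi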