Tag: BASH

Folding@Home Init Script Additions: throttle & unthrottle

January 13, 2010 » Geek

As I’ve posted before, I’ve started running Folding@Home on my machines. One issue I’ve found is that on a dual-core machine I will sometimes bog down as Folding@Home consumes a whole core. That plus a lot of busy Firefox tabs and my box starts to crawl.

To fix that, I added a few pieces to my Folding@Home init script, which was originally scavenged from this site, though on Googling there is a much nicer one on the Folding@Home wiki. You might just want to apply my changes to that one.

In any case, I just added two commands to throttle and unthrottle the Folding@Home application using cpulimit. This way I can add a cron job to manage it (there’s an example crontab after the script), or just throttle it when it starts to bug me.

Here it is if you want it!

#!/bin/sh

export DIRECTORY=/var/cache/fah
USER=fah
export OUTPUT=/dev/null

test -f $DIRECTORY/fah6 || exit 0


title() {
  echo $1
  error=0
}

status() {
  error=0
}

case "$1" in

  start)
    title "Starting [email protected]"
    cd $DIRECTORY
    # DIRECTORY and OUTPUT are exported, so they expand in the shell that su spawns
    su $USER -c 'nohup $DIRECTORY/fah6 >$OUTPUT 2>&1 &'
    error=$?
    status
;;

  stop)
    title "Stopping [email protected]"
    killall -15 $DIRECTORY/fah6 || error=$?
    status
;;

  restart)
    $0 stop; $0 start
;;

  unthrottle)
    # Find the FahCore worker (running or stopped, niced) and any cpulimit already attached to it
    FHPID=$(ps aux | grep FahCore | grep '[TR]N' | grep -v grep | awk '{print $2}')
    CLPID=$(ps aux | grep "cpulimit -p $FHPID -l" | grep -v grep | awk '{print $2}')
    if [ "$CLPID" != "" ]; then
      echo "Killing existing cpulimit, $CLPID"
      kill -9 $CLPID
    fi
    kill -18 $FHPID # It may be in SIGSTOP, so send it a SIGCONT
;;

  throttle)
    $0 unthrottle;
    FHPID=$(ps aux | grep FahCore | grep '[TR]N' | grep -v grep | awk '{print $2}')
    if [ "$FHPID" != "" ]; then
      echo "Found process $FHPID, throttle to 50%"
      nohup cpulimit -p $FHPID -l 50 >$OUTPUT 2>&1 &
    else
      echo "Could not find fah process!"
    fi
;;


  *)
    echo "Usage: $0 { start | stop | restart | throttle | unthrottle }"
    exit 1
;;

esac

exit 0
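
If you go the cron route, a couple of entries in root’s crontab are all it takes. This is just a sketch; I’m assuming the script is installed as /etc/init.d/fah, so adjust the path to wherever yours actually lives.

# Throttle Folding@Home during the work day, let it run free at night
# (assumes the init script above is installed as /etc/init.d/fah)
0 8 * * * /etc/init.d/fah throttle
0 22 * * * /etc/init.d/fah unthrottle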

Folding@Home Team Statistics Scraper

December 11, 2009 » Geek

I created a team for Little Filament on Folding@Home. Our team number is 172406 (in case you want to join), and I wanted to put our latest stats on the Little Filament site. As far as I can tell there is no API for the stats, so I worked up a scraper in bash.

Basically all it does is fetch the page, then grep and sed its way to the variables, finally dumping them into a JSON file (for easy JavaScript consumption).

The kicker is that the stats server is overloaded or down a lot, so we can’t rely on it and we don’t want to stress it further. My decision was to poll it at a large interval, 12-24 hours. I don’t have enough clients on the team to effect significant change over 6-12 hours, but I don’t want to fall too far out of date either. So if the server is overloaded and a fetch gets dropped once or twice, it’s not a big deal.

Without further ado, here is the script.

#!/bin/bash

NOW=$(date +%s)
# Default to 0 if the lock file doesn't exist yet (e.g. the first run)
THEN=$(cat fah_check.lock 2>/dev/null | tr -d '\n')
THEN=${THEN:-0}

if [ $NOW -gt $(($THEN + 86400)) ]; then
	wget "http://fah-web.stanford.edu/cgi-bin/main.py?qtype=teampage&teamnum=172406" -O fah_check.html
	if [ "$?" == "0" ]; then
		grep "Grand Score" fah_check.html > /dev/null 2&>1
		if [ "$?" == "0" ]; then
			SCORE=$(grep -C 2 "Grand Score" fah_check.html | sed 's/[^0-9]//gm' | tr -d '\n')
			WU=$(grep -C 2 "Work Unit Count" fah_check.html | sed 's/[^0-9]//gm' | tr -d '\n')
			RANK=$(grep -C 1 "Team Ranking" fah_check.html | sed 's/[^0-9of]//gm' | tr -d '\n' | sed 's/f\([0-9]*\)of\([0-9]*\)/\1 of \2/')
			echo "{\"score\": \"$SCORE\", \"work_units\": \"$WU\", \"rank\": \"$RANK\" }" > fah_check.json
			echo "[$NOW] - Success!" >> fah_check.log
			echo $NOW > fah_check.lock
		else
			echo "[$NOW] - Filter Failed" >> fah_check.log
		fi
	else
		echo "[$NOW] - Download Failed" >> fah_check.log
	fi
else
	echo "[$NOW] - Skip Update" >> fah_check.log
fi

That cranks out fah_check.json, which looks like this:

{"score": "4355", "work_units": "20", "rank": "39881 of 169721" }

To see it in action, check out the Little Filament Folding page.
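
If you want to run the scraper yourself, a single crontab entry is plenty. The path and script name below are just placeholders; the lock file check inside the script keeps actual fetches to roughly one per day no matter how often cron fires it.

# Run hourly; the script's lock file limits real fetches to about one a day
# (path and script name are examples, point them at wherever you keep it)
0 * * * * cd /home/user/fah_stats && ./fah_check.sh >> cron.log 2>&1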

Great bash calculator

November 3, 2009 » Geek

I spend a lot of time on the command line, and one thing I run up against every once in a while is doing math. I normally jump through the bc hoops, but today this bit came through my feed reader.

calc(){ awk "BEGIN{ print $* }" ;}

Just drop it into your .bashrc, .alias, or whatever else you use.

user@host:~$ calc 9*100+14/10
901.4
user@host:~$
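
One small gotcha: the arguments pass through the shell before awk sees them, so expressions with parentheses need quoting (the shell chokes on bare parens), and quoting anything with a * in it doesn’t hurt either, in case a file name happens to match it as a glob.

user@host:~$ calc "(9*100+14)/10"
91.4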

Great solution using existing tools, props to TinyHacker.com for this one.


ImageMagick Thumbnails and Contact Sheets

March 30, 2009 » Geek

Update (2010-06-14)
Thanks to Glenn Turnbull I’ve fixed a bug where the last contact sheet would not be created when the number of photos is evenly divisible by the contact sheet size.

Additionally, this script and others will now be kept updated at http://github.com/jmhobbs/helper-scripts

Update (2012-04-10)
Phillip Vuchetich wrote a neat script for making composite 4×6 out of wallet sized images, which he has allowed me to post about here.

Wow, long time no post. Darcy and I got a digital camera about a week ago, a Nikon D90. We haven’t really had a chance to put it through its paces, but we’ve taken a few pictures around the house to play with it.

At about 3 MB each (JPEGs), the images are really slow to preview in Konqueror. I decided it would be better to download all the photos from the card, then run a script to make my thumbnails. That way I wouldn’t have to wait around while viewing photos; instead I could just wait once at the beginning of the process.

My resulting script may have some holes, but it works well for me on Sidux. It takes all of the images in the current directory, makes 600×600 base thumbnails in a directory called “thumb”, then uses those to build 12-image contact sheets in a directory called “contact”.

Method   real        user        sys
resize   0m43.478s   0m40.625s   0m2.525s
scale    0m25.449s   0m22.975s   0m2.236s
sample   0m18.362s   0m15.983s   0m2.211s

Script times for 16 JPEG images at 3 MB each,
to 600×600 thumbnails and 200×200 contact sheet frames.

Your results will vary, but I ran it with three different scaling types (resize, scale, sample). I’m fine with the output from the fastest one (sample) but you can do as you please. I didn’t add command line options because I wanted to have consistent sizes and qualities every time I use it.
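
If you want to run the same comparison on your own images, a throwaway loop like this will do it. This is only a sketch of that kind of timing run (the output directory is arbitrary and the settings just mirror the script’s defaults), not the exact harness I used.

#!/bin/bash
# Time each ImageMagick scaling method over the JPEGs in the current directory.
mkdir -p /tmp/method-test
for METHOD in resize scale sample; do
    echo "== $METHOD =="
    time for i in *.jpg; do
        convert -strip -quality 80 -$METHOD 600x600 "$i" "/tmp/method-test/${METHOD}_${i}"
    done
done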


[Image: Side by side resize method comparison]

[Image: Sample contact sheet]

It keeps you updated so you know it hasn’t stalled; here is a sample run.

user@host:~/Desktop/D90/dcim/example$ digiCamProc.sh
Processing 16 Images

Creating Thumbnails
100%

Creating Contact Sheets
1 of 2
2 of 2
user@host:~/Desktop/D90/dcim/example$

And here it is. Feel free to comment your changes!

#!/bin/bash

# Digital camera thumbnail/contact sheet tool.
# http://www.velvetcache.org/2009/03/30/imagemagick-thumbnails-and-contact-sheets
# http://github.com/jmhobbs/helper-scripts
#
# Copyright (c) 2009-2010 John Hobbs
#
# Permission is hereby granted, free of charge, to any person
# obtaining a copy of this software and associated documentation
# files (the "Software"), to deal in the Software without
# restriction, including without limitation the rights to use,
# copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following
# conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
# OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
# HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
# WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
# OTHER DEALINGS IN THE SOFTWARE.

# CHANGELOG
# 2010-06-14 - Fixed contact sheet problem, thanks to Glenn Turnbull. (John Hobbs)
# 2009-03-30 - Created script. (John Hobbs)

### SETTINGS ###

# Scaling Methods:
# resize (Best/Slow)
# scale (Middle/Middle)
# sample (Worst/Fast)
METHOD="sample"

# Thumbnail Size
THUMBSIZE="600x600"
# Thumbnail Directory
THUMBDIR="thumb"
# Thumbnail Quality
THUMBQUALITY="80"

# Contact Item Size
CONTACTSIZE="200x200"
# Contact Sheet Max Width
CONTACTWIDTH="3"
# Contact Sheet Max Height
CONTACTHEIGHT="4"
# Horizontal Spacing
CONTACTSPACINGH="3"
# Vertical Spacing
CONTACTSPACINGV="3"
# Contact Sheet Directory
CONTACTDIR="contact"
# Contact Sheet Quality
CONTACTQUALITY="100"

################


CONTACTCOUNT=$(($CONTACTWIDTH * $CONTACTHEIGHT))
PIX=$(ls -l *.jpg | wc -l)

echo "Processing $PIX Images"
echo
echo "Creating Thumbnails"

mkdir -p $THUMBDIR
CTR=0
echo -n "0%"
for i in *.jpg; do
    echo -ne "\r"
    echo -n "$((100 * $CTR / $PIX))%"
    convert -strip -quality ${THUMBQUALITY} -${METHOD} ${THUMBSIZE} "$i" "${THUMBDIR}/${i}"
    CTR=$(($CTR + 1))
done

echo -ne "\r"
echo "100%"

echo
echo "Creating Contact Sheets"

mkdir -p $CONTACTDIR
CTR=0
PAGES=$(($PIX / $CONTACTCOUNT))
if [ $(($PIX % $CONTACTCOUNT)) -ne 0 ]; then
    PAGES=$(($PAGES + 1))
fi

PAGE=1
LIST=""
for i in ${THUMBDIR}/*.jpg; do
    if [ $(($CTR % $CONTACTCOUNT)) -eq 0 ] && [ $CTR -ne 0 ]; then
        echo "$PAGE of $PAGES"
        montage -label %f -quality $CONTACTQUALITY -frame 5 -tile ${CONTACTWIDTH}x${CONTACTHEIGHT} -geometry ${CONTACTSIZE}+${CONTACTSPACINGH}+${CONTACTSPACINGV} $LIST jpg:- > ${CONTACTDIR}/${PAGE}.jpg
        LIST=""
        PAGE=$(($PAGE + 1))
    fi
    LIST="$LIST $i"
    CTR=$(($CTR + 1))
done

if [ "" != "$LIST" ]; then
    echo "$PAGE of $PAGES"
    montage -label %f -quality $CONTACTQUALITY -frame 5 -tile ${CONTACTWIDTH}x${CONTACTHEIGHT} -geometry ${CONTACTSIZE}+${CONTACTSPACINGH}+${CONTACTSPACINGV} $LIST jpg:- > ${CONTACTDIR}/${PAGE}.jpg
fi

Move Subversion repository without svnadmin

November 12, 2008 » Geek

Update (2008-11-13)
Okay, so I did actually end up finding a way to move from SF.net for real. It seems they provide read-only rsync access straight to the repository directory. So here is what I did instead.

$ rsync -av blowpass.svn.sourceforge.net::svn/blowpass/* blowpass
$ svnadmin dump blowpass/ > blowpass.dump
$ svnadmin create clearpass
$ svnadmin load clearpass < blowpass.dump

Update (2008-11-12)
Made a small performance change then ran it on the ClearPass repository. Worked flawlessly.

I have been having a terrible time trying to figure out how I can get the Subversion repository for ClearPass out of SourceForge. I could not find a single reference to svnadmin on SourceForge, and no examples of exporting without it. So I took matters into my own hands. Below is a quick and dirty shell script that exports and imports a repository one revision at a time, using common Linux command line tools and the svn command. I'm going to do more testing before using it for real, but so far it has done well. Hope this helps someone else in my position.

Download it: svncrossload

#!/bin/sh

################################################################################
# LICENSE
################################################################################
# Copyright 2008 John Hobbs
################################################################################
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the
# Free Software Foundation, Inc.,
# 59 Temple Place - Suite 330, Boston, MA  02111-1307, USA.
################################################################################

################################################################################
# ABOUT
################################################################################
#
# Home: http://www.velvetcache.org/
#
# This is a script to cross load subversion repositories (kind of) keeping history
# intact without access to svnadmin.  Import into a FRESH repository only, and
# be sure to do a comprehensive diff at the end.
#
# Also be sure to do this in an empty directory. Temp files get added and removed
# without sincere thought put into them.
#
# Log messages get eaten and re-inserted as shown below. Edit to taste.
#   $ svn log -r 1
#   ------------------------------------------------------------------------
#   r1 | jmhobbs | 2008-11-12 18:19:43 -0600 (Wed, 12 Nov 2008) | 7 lines
#
#   Imported from file:///srv/svn/scs using svncrossload
#
#     |r1 | jmhobbs | 2008-10-27 17:32:44 -0500 (Mon, 27 Oct 2008) | 2 lines
#     |
#     |Initial import.
#     |
#
#   ------------------------------------------------------------------------
#   $

echo "Checking out initial revisions"
svn co $2 importing > /dev/null
svn co -r 0 $1 updateme > /dev/null

echo "Getting most recent revision number"
LATESTREVISION=$(svn info $1 | grep Revision | sed 's/^Revision: *\([0-9]*\)/\1/')

for i in $(seq 1 $LATESTREVISION); do

  echo -e "\nCopying revision $i"

  cd updateme
  svn update -r $i > ../_update
  echo -e "Imported from $1 using svncrossload\n" > ../_log
  # The '\-\-\-\-\...' looks ridiculous, but it works.
  svn log -r $i | grep -v '\-\-\-\-\-\-\-\-\-\-\-\-\-\-\-' | sed 's/\(.*\)/   |\1/'  >> ../_log
  cd ..

  cat _update | grep -E '^A' | sed 's/^A *//' > _update_add
  cat _update | grep -E '^D' | sed 's/^D *//' > _update_del
  cat _update | grep -E '^U' | sed 's/^U *//' > _update_mod

  echo "$(wc -l _update_add | sed 's/^\([0-9]*\).*/\1/') Files To Add"
  echo "$(wc -l _update_mod | sed 's/^\([0-9]*\).*/\1/') Files To Modify"
  echo "$(wc -l _update_del | sed 's/^\([0-9]*\).*/\1/') Files To Delete"

  # Copy
  for j in $(cat _update_add | tr ' ' '@'); do
    if [ -d "updateme/${j//@/ }" ]; then
      mkdir "importing/${j//@/ }"
    else
      cp -f "updateme/${j//@/ }" "importing/${j//@/ }"
    fi
    cd importing
    # We send cerr to null because it warns when we add existing stuff
    svn add "${j//@/ }" 2> /dev/null
    cd ..
  done

  # Modify
  for j in $(cat _update_mod | tr ' ' '@'); do
    if [ -f "updateme/${j//@/ }" ]; then
      cp -f "updateme/${j//@/ }" "importing/${j//@/ }"
    fi
  done

  # Delete
  for j in $(cat _update_del | tr ' ' '@'); do
    cd importing
    svn rm "${j//@/ }"
    cd ..
  done

  echo "Committing"
  cd importing
  svn commit -F ../_log
  cd ..

done

echo "Cleaning up"
rm -rf importing _log _update _update_add _update_del updateme _update_mod
echo "Done!"