Thursday Quote: Mike Solomon
“Linux is good in terms of manholes, there’s always a way to get in and see how your system is behaving.”
– Mike Solomon
Scalability at YouTube
Big files happen, and sometimes they need to be moved.
I’ve found that moving big files between local computers is often fastest with a USB drive and my shoes, especially since I usually don’t have access to a wired network.
But sometimes, files are just too big for your thumb drive. Some of us don’t carry huge drives around; I usually only have a spare 2GB on me.
Suppose you have a 4.4GB ZIP file (don’t ask, just suppose). I’ve only got a 2GB (1.9GB usable) thumb drive on me, but I need to move it over to another machine and I don’t have all day.
In the past I’ve used dd tricks, but I knew there had to be a better way.
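For reference, the dd version looks something like this. It's a sketch with illustrative sizes; every chunk has to be carved out by hand with a block count and a skip offset, which is exactly why I wanted something better:

```shell
# The old dd approach: carve out each chunk by hand with an explicit
# skip offset (sizes here are illustrative).
dd if=big.zip of=chunk1 bs=1M count=1500
dd if=big.zip of=chunk2 bs=1M count=1500 skip=1500
dd if=big.zip of=chunk3 bs=1M count=1500 skip=3000
dd if=big.zip of=chunk4 bs=1M skip=4500
# And on the far side, cat them back together:
cat chunk1 chunk2 chunk3 chunk4 > big.zip
```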
Enter split and cat. cat concatenates files, and its cousin split, well, splits them.
Perfect! Just what I needed! I’ll split my file into chunks with split, sneaker it over to the other machine, and cat it all back together.
computer-one $ split -b 1500M big.zip
(...wait...)
computer-one $ ls -l
-rw-rw-r-- 1 jmhobbs jmhobbs 1572864000 2012-02-27 14:44 xaa
-rw-rw-r-- 1 jmhobbs jmhobbs 1572864000 2012-02-27 14:45 xab
-rw-rw-r-- 1 jmhobbs jmhobbs 1572864000 2012-02-27 14:46 xac
-rw-rw-r-- 1 jmhobbs jmhobbs 7637844 2012-02-27 14:46 xad
computer-one $
computer-two $ cat xaa xab xac xad > big.zip
(...wait...)
computer-two $
computer-two $ ls -l
-rw-rw-r-- 1 jmhobbs jmhobbs 4726229844 2012-02-27 15:02 big.zip
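One thing worth adding: a single flipped bit in transit will corrupt the reassembled archive, so it’s cheap insurance to checksum before and after. A sketch, using md5sum from GNU coreutils and the same filenames as the transcript above:

```shell
# On computer-one, record a checksum before splitting:
md5sum big.zip > big.zip.md5
split -b 1500M big.zip

# On computer-two, after reassembly, verify the copy matches:
cat xaa xab xac xad > big.zip
md5sum -c big.zip.md5
```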
I’ve used a lot of Twitter clients over the years; I even wrote one back in ’08. TweetDeck is pretty good, but Adobe AIR is shoddy on 64-bit Linux, so it didn’t integrate well and would crash.
I rejected Gwibber and Choqok; they’re just not my style. My old web standby Brizzly seems pretty lame these days too. Then I found Hotot. It’s awesome. It’s like a Linux-native TweetDeck, but with a friendlier, more welcoming feel.
If you are looking for a Linux twitter client, you have got to give Hotot a try.
So I’m working on a little admin interface and I decided to tail some logs. It’s in PHP, and Google turned up some existing solutions, but they were all a bit finicky and not well documented. So I wrote my own!
Here’s the basic concept: seek to the end of the file, read backwards in fixed-size chunks, and count newlines until we’ve collected enough to cover the requested number of lines.
Here’s the code:
function tail( $file, $lines, $max_chunk_size = 4096 ) {
	// We actually want to look for +1 newline so we can get the whole first line
	$rows = $lines + 1;
	// Open up the file
	$fh = fopen( $file, 'r' );
	if( false === $fh ) { return false; }
	// Go to the end
	fseek( $fh, 0, SEEK_END );
	$position = ftell( $fh );
	$buffer = '';
	$found_newlines = 0;
	// Walk backwards through the file until we hit the start or find enough lines
	while( $position > 0 ) {
		// We can't seek past the 0 position obviously, so figure out a good chunk size
		$chunk_size = min( $max_chunk_size, $position );
		// Okay, now seek there and read the chunk
		$position -= $chunk_size;
		fseek( $fh, $position );
		$chunk = fread( $fh, $chunk_size );
		// Count the newlines in this chunk
		$found_newlines += substr_count( $chunk, "\n" );
		// Prepend
		$buffer = $chunk . $buffer;
		// Have we exceeded our desired rows?
		if( $found_newlines > $rows ) { break; }
	}
	fclose( $fh );
	// Now extract only the lines we requested
	$buffer = explode( "\n", $buffer );
	return implode( "\n", array_slice( $buffer, -$lines ) );
}
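For comparison: when the file is small enough to slurp into memory, PHP can get the same result without any seeking. This isn’t the function above, just a naive baseline I’d reach for on small files (tail_naive is a made-up name for illustration):

```php
<?php
// A naive baseline for small files: read every line into memory and
// slice off the end. Fine for small logs, wasteful for big ones.
function tail_naive( $file, $lines ) {
	$all = file( $file, FILE_IGNORE_NEW_LINES );
	return implode( "\n", array_slice( $all, -$lines ) );
}

// Quick demo against a throwaway file:
$path = tempnam( sys_get_temp_dir(), 'tail' );
file_put_contents( $path, implode( "\n", range( 1, 100 ) ) . "\n" );
echo tail_naive( $path, 3 ), "\n"; // prints 98, 99 and 100, one per line
unlink( $path );
```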
You can give it a try on some junk data here: http://static.velvetcache.org/pages/2010/12/03/tail-in-php/
I did some tinkering with an Exim server today, and it’s been probably a year at least since I last touched one. Found an invaluable cheat sheet at http://bradthemad.org/tech/notes/exim_cheatsheet.php but it’s not terribly printer friendly.
I knocked up a more printable one for myself with AbiWord; hope it can help you too.