Archived entries for bash

Converting OpenSSH keys to PuTTY format via command line

Information on converting to and from PuTTY key formats without starting up the PuTTYgen GUI is surprisingly hard to come by, as I’ve found.

The key bit is that PuTTY is available as a package for most Linux distros, and comes with PuTTYgen as well.

To convert your existing key, simply use:

puttygen openssh_private_key -o output_filename.ppk
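
Going the other way works from the command line as well; puttygen should be able to export a .ppk back to OpenSSH format with something like this (the filenames here are just examples):

puttygen putty_key.ppk -O private-openssh -o openssh_private_key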

And if you’d like to generate new OpenSSH and PuTTY keys, use something like:

username=testy
fullname="Testy McTest"
ssh-keygen -t rsa -b 2048 -C "$fullname's Key" -N "" -f "$username" && \
puttygen "$username" -o "$username.ppk"

That will generate the files testy, testy.pub, and testy.ppk.

I’d imagine that the PuTTYgen utility will accept the same command format on Windows as well if you’ve already got your OpenSSH keys handy there.

Create and mount a file as a disk in Linux

If you need to test something where a disk needs to be mounted, but don’t want to go through the hassle of actually attaching a physical disk or provisioning a virtual one, you can simply create a ‘loopback’ device in Linux.

I’ve been working on a project where, in the future, I might want to export a big pile of disks via NFS, but I don’t feel like provisioning a bunch of temporary devices for it. There are a few other tutorials out there on how to do this, but they either include extraneous commands [i.e. losetup] or don’t deal with actually creating the image. Continue reading…
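
For the impatient, the gist looks roughly like this (filenames and sizes are just placeholders; the full post covers the details):

# create a 100 MB file of zeroes to act as the ‘disk’
dd if=/dev/zero of=fakedisk.img bs=1M count=100
# put a filesystem on it; -F skips the "not a block device" prompt
mkfs.ext3 -F fakedisk.img
# mount it via the loop driver, no losetup required
mkdir -p /mnt/fakedisk
mount -o loop fakedisk.img /mnt/fakedisk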

Copy and Paste Files Between SSH Sessions

I’ve found that the permissions on the directories on the servers I’ve been working with recently are not very friendly to scp or rsync [root owns the dir, but sshd has PermitRootLogin = No], yet I need to copy files around regularly. I’ve used a simple cat file | base64 to embed file contents in scripts before, so why not pair it up with tar to move many files?

I’ll save the sob story about how I found that tar by default pads with a LOT of null bytes, but that’s why -z and -b 1 are your friends.
Continue reading…
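
To sketch the rough shape of the trick (directory names are just examples, and some systems spell the decode flag -D instead of -d):

# in the source SSH session: archive with compression and a blocking factor of 1, then encode
tar -c -z -b 1 -f - somedir/ | base64
# copy the output from the terminal, then in the destination session paste it into:
base64 -d | tar -x -z -f -
# and press Ctrl-D when you’re done pasting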

Binary to Text Using Bash

So I was looking at some job postings when I came across one with a block of binary data in the description. I was curious to see if it was anything actually relevant to the posting that might help my application, something clever slipped under the HR dept’s nose, or complete gibberish that someone thought looked cool.

Here it is for reference:

01001100 01101111 01101111 01101011 01101001 01101110 01100111 00100000 01100110 01101111 01110010 00111010 00100000 01001001 01010100 00100000 01000010 01110101 01110011 01101001 01101110 01100101 01110011 01110011 00100000 01000001 01101110 01100001 01101100 01111001 01110011 01110100 00100000 01110111 01101000 01101111 00100000 01100011 01100001 01101110 00100000 01110100 01110010 01100001 01101110 01110011 01101100 01100001 01110100 01100101 00100000 01001001 01010100 00100000 01101100 01101001 01101110 01100111 01101111 00100000 01101001 01101110 01110100 01101111 00100000 01110000 01101100 01100001 01101001 01101110 00100000 01101100 01100001 01101110 01100111 01110101 01100001 01100111 01100101 00100000 01110100 01101111 00100000 01100011 01110010 01100101 01100001 01110100 00100000 01110011 01101111 01100110 01110100 01110111 01100001 01110010 01100101 00100000 01110011 01101111 01101100 01110101 01110100 01101001 01101111 01101110 01110011 00100000 01100110 01101111 01110010 00100000 01100011 01101100 01101001 01100101 01101110 01110100 01110011 00001101 00001010

After a little Googling and some finagling with the commands [apparently bc on my system freaks out if obase is not declared first], I came up with this command:

for item in `cat bin.txt`; do echo "obase=10;ibase=2;$item" | bc | awk '{printf("%c", $1)}'; done

That translated the block of binary octets in bin.txt to this, complete with a spelling mistake:

Looking for: IT Business Analyst who can translate IT lingo into plain language to creat software solutions for clients

Turned out kind of lame, but at least I made a handy little one-liner to share.
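
As an aside, if bc isn’t handy, bash’s own base-notation arithmetic can do the conversion; something along these lines should produce the same output:

# $((2#...)) converts binary to decimal, and printf’s \xHH escape prints the character
for item in `cat bin.txt`; do printf "\x$(printf '%x' $((2#$item)))"; done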


