
Many oneliners rely on the magic of Perl's -i switch, which causes the files supplied on the command line to be edited in place; if an extension is passed along with it, a backup copy with that extension is kept. -e specifies the Perl code to run. See perlrun(1) for any other switches used here -- in particular, -n and -p make powerful allies for -i.
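
For instance, the following replaces every "foo" with "bar" in the named files in place, keeping backups with a .orig extension (a minimal illustration; the file names are made up):

perl -pi.orig -e's/foo/bar/g' config.txt notes.txt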

The unsurpassed power of Perl's RegularExpression flavour contributes a great deal to the usefulness of nearly every oneliner, so you will also want to read the perlretut(1) and perlre(1) manpages to learn about it.

Removing empty lines from a file

perl -ni.bak -e'/\S/ && print' file1 file2

In Shell
for F in file1 file2 ; do mv "$F"{,.bak} ; grep '[^[:space:]]' "$F.bak" > "$F" ; done

Collapse consecutive blank lines to a single one

perl -00 -pi.bak -e1 file1 file2

Note the use of 1 as a no-op piece of Perl code. The -00 switch puts Perl into paragraph mode, in which any run of blank lines between paragraphs reads back as a single blank line, and -p reads and reprints each paragraph; since those two switches already do all the work, only a dummy program needs to be supplied.
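
Spelled out longhand, the program the switches build up looks roughly like this (a sketch of the equivalent script, ignoring the in-place editing that -i.bak adds):

$/ = "";        # paragraph mode: runs of blank lines act as a single record separator
while (<>) {    # the read-print loop supplied by -p
    1;          # the dummy "program"
    print;
}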

Binary dump of a string

perl -e 'printf "%08b\n", $_ for unpack "C*", shift' 'My String'
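
For reference, the same dump written out longhand (a sketch; the fallback string is only illustrative):

my $string = @ARGV ? shift : 'My String';
for my $byte (unpack "C*", $string) {    # "C*" unpacks the string into its unsigned byte values
    printf "%08b\n", $byte;              # %08b renders each byte as eight binary digits
}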

Replace literal "\n" and "\t" in a file with newlines and tabs

cat $file | perl -ne 's/\\n/\012/g; s/\\t/\011/g; print'

This has a useless use of cat, and it uses -n but then calls print explicitly, which is exactly what -p would do for us. There is also no need for octal character escapes. It should be rewritten as
perl -pe 's!\\n!\n!g; s!\\t!\t!g' $file

You can use almost any punctuation character as the delimiter of an s/// command; when the pattern contains backslashes, or even needs literal slashes, picking a different delimiter can increase clarity.
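
The two substitutions can also be folded into a single pass with a small lookup table (a sketch; the %map name and the file name are made up):

perl -pe 'BEGIN { %map = ("n" => "\n", "t" => "\t") } s!\\([nt])!$map{$1}!g' somefile.txt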

List all currently running processes

This is useful if you suspect that ps(1) is not reliable, whether due to a RootKit or some other cause. It prints the process ID and command line of every running process on the system (except for certain "special" kernel processes, which have no command line to report).

perl -0777 -pe 'BEGIN { chdir "/proc"; @ARGV = sort { $a <=> $b } glob("*/cmdline") }
    $ARGV =~ m!^(\d+)/!; print "$1\t"; s/\0/ /g; $_ .= "\n";'

It runs an implicit loop over the /proc/*/cmdline files, by priming @ARGV with a list of files sorted numerically (which needs to be done explicitly using <=> -- the default sort is ASCIIbetical) and then employing the -p switch. -0777 forces files to be slurped wholesale. Per file, the digits that lead the filename are printed, followed by a tab. Since a null separates the arguments in these files, all of them are replaced by spaces to make the output printable. Finally, a newline is appended. The print call implicit in the -p switch then takes care of outputting the massaged command line.
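
For comparison, here is roughly the same logic as a longhand script (a sketch, not a literal expansion of the one-liner):

chdir "/proc" or die "chdir /proc: $!";
for my $file (sort { $a <=> $b } glob "*/cmdline") {
    my ($pid) = $file =~ m!^(\d+)/!;
    open my $fh, '<', $file or next;                   # the process may have exited already
    local $/;                                          # slurp the whole file, as -0777 does
    my $cmdline = <$fh>;
    next unless defined $cmdline && length $cmdline;   # kernel threads have an empty cmdline
    $cmdline =~ s/\0/ /g;                              # arguments are NUL-separated
    print "$pid\t$cmdline\n";
}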

List all currently running processes, but nicer

See above for what this does. The how is different, though.

perl -MFile::Slurp -0le 'for (sort { $a <=> $b } grep !/\D/, read_dir "/proc")
    { @ARGV = "/proc/$_/cmdline"; printf "%6d %s\n", $_, join(" ", <>); }'

This time the loop is explicit. Again, there are two parts to the program -- selecting files and doing I/O on them.

To read the directory, a convenience function is pulled in from the File::Slurp module, loaded using the -M switch. The module is not part of the core distribution, but it is readily available from CPAN. Reading a directory manually is straightforward, but the code would be longer and clumsier.
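
Without File::Slurp, the directory read might look like this (a sketch using plain opendir/readdir):

opendir my $dh, "/proc" or die "opendir /proc: $!";
my @pids = sort { $a <=> $b } grep !/\D/, readdir $dh;   # keep only the all-digit entries
closedir $dh;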

Selecting the files is pretty simple, if a little obtuse: it's done by reading the contents of /proc, then using grep !/\D/ to reject any entries from the list that contain non-digit characters. The results are then sorted numerically, done explicitly using <=> because the default sort is ASCIIbetical. Each entry is then concatenated into a full path and stuck into @ARGV one by one, from where the <> "diamond operator" will pick it up, auto-open it and read it for us, even autoreporting any errors in a nicely verbose format.

Producing human-readable output is a little more involved, using some switches to abbreviate a bit of magic. The -0 switch sets the $/ variable: here, because it is not followed by a digit, it sets it to a null character. This means that null characters will be regarded as line separators on input. The -l switch has two effects, of which only one is relevant to us: it automatically chomp()s lines read using the diamond operator. (The other is to set the $\ variable, which we aren't interested in or affected by.)

Note that the -0 and -l switches are order-sensitive, both in syntax and in semantics. Here the order is dictated by syntax: both switches can take an octal number as an argument, and we don't want to pass one to either of them (in particular, if they were swapped, the 0 would be mistaken for an octal argument to -l).
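
This is easy to check from the shell (a quick illustration; printf is used because, unlike print, it is not affected by $\):

perl -0l -e 'printf "rs=%d ors=%d\n", ord $/, ord $\'
perl -l0 -e 'printf "rs=%d ors=%d\n", ord $/, ord $\'

The first prints rs=0 ors=0, since both variables end up holding the null character; in the second, the 0 is taken as an octal argument to -l, so $/ keeps its default newline and the output is rs=10 ors=0.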

Together, these switches effectively mean that we get null-terminated lines from files, with the nulls removed on input. So we get the command line arguments listed in a /proc/*/cmdline file as a nice list of separate strings. And because join() puts <> in list context, it returns all "lines" (i.e. command line arguments) at once, which join() then dutifully puts together with spaces in between.

The printf(3) is straightforward.


AddToMe