Notes about open source software, computers, other stuff.

Tag: Bash

Using Emacs key bindings in Gnome, Firefox and other applications

As an avid Emacs user, I love to have my Emacs key bindings available in as many places as possible. For example, even though I still regularly use the arrow keys to move the cursor around, I also use Emacs’ Alt-f and Alt-b to move one word forward or back, respectively. Similarly, Ctrl-a to me doesn’t mean “select all”, but rather “go to the beginning of the line” (like the Home key). Especially this latter key binding has a huge potential to mess things up: if I follow it by typing text, the typed text replaces the selection (i.e. all text), when the only thing I really intended to do was to go to the beginning of the line.

Another clash: in Emacs Ctrl-k means “kill to end of line”, i.e. “delete everything from the cursor position up to the end of the current line”, but in Firefox it sends your cursor to the Search box (for those of you who, like me, still use that for searching instead of just typing your query in the address bar). Similarly, Ctrl-n moves to the next line in Emacs, but in Firefox it opens a new window.

Luckily for me, I have managed to tailor the settings of various tools and the Gnome desktop environment to accommodate at least some of the more common Emacs key bindings. Unfortunately, applications built using other frameworks, like the Signal and Mattermost desktop apps, don’t follow these settings.

Below are the settings I’m currently using. Most of them have been with me for several years at this point and have been migrated across various Ubuntu Linux upgrades, so I hope they are complete. For the record, I’m currently running the 24.04 Noble Numbat release.

Gnome

Let’s start with the Gnome desktop environment. My Linux desktop of choice for roughly the past twenty years has been Ubuntu, which uses Gnome. There is a gsettings entry that allows you to enable Emacs key bindings in most Gnome/Gtk applications, including Thunderbird. The entry can be enabled with the “Emacs input” toggle in the Keyboard section of the Gnome Tweaks tool, or set directly on the command line with

gsettings set org.gnome.desktop.interface gtk-key-theme "Emacs"

The current value can be checked like this:

$ gsettings get org.gnome.desktop.interface gtk-key-theme
'Emacs'

The Arch Linux wiki also lists options for GTK-2.0 and GTK-3.0, but I haven’t got those configured (any more).
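
For reference, those per-toolkit settings boil down to something like the following (a sketch based on the Arch wiki instructions; the paths are the standard GTK configuration locations):

# ~/.gtkrc-2.0 (GTK 2 applications)
gtk-key-theme-name = "Emacs"

# ~/.config/gtk-3.0/settings.ini (GTK 3 applications)
[Settings]
gtk-key-theme-name = Emacs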

Gnome terminal

By default, Gnome Terminal steals the Alt key and uses e.g. Alt-f to open the File menu. This can be turned off via the hamburger menu in the top right corner: under “Global” → “General”, uncheck the box for “Enable mnemonics (such as Alt-F to open the File menu)”.

Shells (Bash, Zsh)

As Emacs has been around for so many years, many shells (well, actually, the readline library if I’m not mistaken) support the basic Emacs key bindings for editing the commands you type on the command line. Both Bash and Zsh use the Emacs bindings by default (others might too, but I don’t have any experience with other shells, except tcsh a long, long time ago). In fact, you have to run set -o vi in order to be able to use Vim key bindings.
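
For reference, the corresponding toggles look roughly like this (set -o emacs and bindkey -e are the standard Bash and Zsh commands; the last part uses readline’s ~/.inputrc syntax):

# Bash: explicitly select Emacs editing mode for the current session
set -o emacs

# Zsh equivalent
bindkey -e

# To make Emacs mode the default for all readline-based programs,
# put this line in ~/.inputrc:
#   set editing-mode emacs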

Byobu & Screen

I often use Byobu as a terminal multiplexer. Like Screen, it likes to “steal” Ctrl-a as its “attention” or escape key. Luckily, when the user presses Ctrl-a for the first time in Byobu, they are asked whether they’d like to use Emacs key bindings or not. My answer is obvious, and I generally tell it to use Ctrl-o instead. This can also be done via a menu by pressing F9 and selecting “Change escape sequence”.

Alternatively, this can be changed in the ~/.byobu/keybindings file by adding the following code:

# replace ctrl-A by ctrl-o
escape ^Oo

For Screen the same line should be added to ~/.screenrc.

Firefox

My solution for Firefox is to replace the Ctrl key with the Alt key. This way, I can open new tabs with Alt-t, new windows with Alt-n, etc. Together with the Gnome settings for Emacs key bindings (see above), this means I can use Ctrl-a, Ctrl-f, Ctrl-b, etc. for moving the cursor in text fields, Ctrl-d for delete, etc. Interestingly enough, Alt-f and Alt-b (“move one word forward” and “move one word backward”, respectively) keep working in text fields as well. Note that this also means that “Undo” is handled by Alt-z instead of Ctrl-z, which is fine with me because in the shell Ctrl-z is normally used to suspend the running application.

Unfortunately, some sites define their own extra key bindings that interfere with my settings. For example, when creating or commenting on a GitHub issue, Ctrl-e inserts a backtick (`) instead of going to the end of the line. I haven’t yet found out how to disable or override that. I’m glad that I mainly use GitLab, which behaves properly.

To change the accelerator key, open about:config in the address bar of the browser, find the entry ui.key.accelKey and change its value to 18, which is the key code for the Alt key (see the documentation). You may also want to set the entries ui.key.generalAccessKey and ui.key.menuAccessKey to 0 to disable e.g. using Alt for accessing the menus, but I haven’t done so myself.
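
These preferences can also be kept in a user.js file in your Firefox profile directory, which applies them at every start; a minimal sketch (the two access-key lines are the optional part mentioned above and are left commented out):

// user.js in the Firefox profile directory
user_pref("ui.key.accelKey", 18);        // 18 = Alt, used as the accelerator key
// Optional: also stop Alt from acting as the menu/access key:
// user_pref("ui.key.generalAccessKey", 0);
// user_pref("ui.key.menuAccessKey", 0);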

Additional documentation about Emacs key bindings in Firefox can be found in the Mozilla knowledgebase article.

There are, and have been, various Firefox extensions and other methods that allow(ed) one to use Emacs for editing text in text fields like those used in forum posts, etc. However, the last one I used, “Emacs Everywhere”, unfortunately doesn’t work under Wayland yet, although work to fix that seems to be under way.

LibreOffice

Unfortunately I regularly have to edit MS Word documents (or their LibreOffice counterparts). Fortunately, Marcus Nitzschke created a customisation for LibreOffice Writer that sets a series of basic Emacs movement key bindings! On his site he links to a Zip file that can be imported via Tools → Customize → Keyboard → Load. After that, the following keys should work in LibreOffice Writer (thanks to Marcus for this list):

Binding       Function
C-f           forward-char
C-b           backward-char
C-n           next-line
C-p           previous-line
M-f           forward-word
M-b           backward-word
C-v           next page
M-v           previous page
C-a           beginning-of-line
C-e           end-of-line
C-k           kill-line
M-d           kill-word
M-backspace   backward-kill-word


Bulk downloading and renaming of Expensify PDF reports

In my company, we have been using Expensify to manage small receipts, travel expenses, etc. Recently, however, I decided to switch to another platform, which is part of the SaaS suite our accountant uses. Even though it lacks some of the functionality provided by Expensify, having all receipts in a single location reduces the amount of time I have to spend on administrative tasks.

Every quarter, Dutch companies have to file a VAT return, so every quarter I exported the Expensify reports to CSV files (to send to my accountant) and to PDF as a more “visual” backup, which lists the reported expenses sorted by category and, importantly, also includes the scans of the various receipts.

As we changed accountants a couple of years ago, I wasn’t sure whether I had actually downloaded both the CSV and the PDF file for each Expensify report. Keeping records is required by Dutch law, so I decided to make sure and download all PDF files and back them up somewhere.

Unfortunately, the Expensify website doesn’t offer an option for bulk downloading of the PDF files. They do offer a kind of REST API (they call it the Integration Server), which I had played with years ago, so I decided to try that. Luckily, the credentials I had saved in my password manager still worked.

The process for downloading the PDFs consists of two steps:

  • Run a command to generate the reports; this returns the file names of the PDF files.
  • Use those file names to download the PDFs.

The first step took a couple of minutes to run and then listed the file names of the PDFs on stdout:

curl -X POST 'https://integrations.expensify.com/Integration-Server/ExpensifyIntegrations' \
    -d 'requestJobDescription={
        "type":"file",
        "credentials":{
            "partnerUserID":"XXXXXXXXXX",
            "partnerUserSecret":"YYYYYYYYYY"
        },
        "onReceive":{
            "immediateResponse":["returnRandomFileName"]
        },
        "inputSettings":{
            "type":"combinedReportData",
            "filters":{
                "startDate":"2013-01-01"
            }
        },
        "outputSettings":{
            "fileExtension":"pdf",
            "includeFullPageReceiptsPdf":"true"
        }
    }' \
    --data-urlencode 'template@expensify_template.ftl'

I’m not sure what the expensify_template.ftl file does in this command, but it was necessary to create that file locally, otherwise the curl call would return an error. I simply copied the sample provided in the documentation for the Expensify Integration Server. I saved the long list of PDF file names output by the above command. A typical file name looks like this: exportc992bd79-aa4a-4b04-a76a-1149194bac94-34589514.pdf. Not very descriptive… As expected (and confirmed in the web UI), there were 191 file names.
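
For reference, after saving the curl output to a file, the file names can be pulled out with a simple grep; a sketch, where response.txt is a made-up name for the saved output:

# hypothetical: the output of the curl call above was saved to response.txt
grep -oE 'export[[:alnum:]-]+\.pdf' response.txt > pdflist
wc -l pdflist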

Next, step two: actually downloading the files. The basic call for that is:

curl -X POST 'https://integrations.expensify.com/Integration-Server/ExpensifyIntegrations' \
    -d 'requestJobDescription={
        "type":"download",
        "credentials":{
            "partnerUserID":"XXXXXXXXXX",
            "partnerUserSecret":"YYYYYYYYYY"
        },
        "fileName":"exportc992bd79-aa4a-4b04-a76a-1149194bac94-5803035.pdf",
        "fileSystem":"integrationServer"}
    }' \
    --data-urlencode 'template@expensify_template.ftl' --output "my_output.pdf"

So, in order to download all PDFs, I saved the file names in a file called pdflist and checked that they are all unique:

$ wc -l pdflist
191 pdflist
$ sort pdflist| uniq | wc -l
191

Next, I used a loop to read each line from the pdflist file, fiddled a bit with the quotes so that I could use the pdf variable in the curl call, and downloaded each file:

cat pdflist | while read pdf; do
curl -X POST 'https://integrations.expensify.com/Integration-Server/ExpensifyIntegrations' \
    -d "requestJobDescription={
        'type':'download',
        'credentials':{
            'partnerUserID':'XXXXXXXXXX',
            'partnerUserSecret':'YYYYYYYYYY'
        },
        'fileName':${pdf},
        'fileSystem':'integrationServer'}
    }" \
    --data-urlencode 'template@expensify_template.ftl' --output ${pdf}
done

This indeed gave me 191 Expensify report PDFs, with very uninformative names 😐. To fix that I resorted to some more shell “scripting”. Every report has a title (usually something like “Small expenses 2020 Q4”) and, judging from the output of the pdftotext utility, it looked like this title was always on the third line. So I moved the original PDFs to a separate “archive” directory OriginalExports and ran the following to copy each PDF to a new name equal to its title. My first attempt failed somewhat, because the number of renamed PDF files was smaller than the number of original PDFs. I guessed this would happen when two reports have the same title, and indeed, adding -i to the cp command to warn me about this showed I was right. As this only happened for four files, I converted those manually.

for pdf in OriginalExports/export*.pdf; do
    echo "${pdf}"
    title=$(pdftotext "${pdf}" - | head -3 | tail -1 | tr "/" "_")
    cp -i "${pdf}" "${title}.pdf"
done

So there I had my backup of all receipts since we started using Expensify. And if the tax office or the accountant ever wants to see those receipts, I am now sure I can provide them.


Using rsync to backup a ZFS file system to a remote Synology Diskstation

Some time ago I moved from using LVM to using ZFS on my home server. This meant I also had to change the backup script I use to make backups on a remote Synology DiskStation. I also updated the script so that it now takes a single command line argument: the hostname of the DiskStation to back up to (because I now have two DiskStations at different locations). If you want to run this script from cron, you should first set up key-based SSH login (see also here and here); a rough sketch of that is shown below, followed by the updated backup script itself.
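
Setting up the key-based login amounts to generating a key pair and copying the public key to the NAS; a minimal sketch (the user and host name are examples and should match the DESTUSER and hostname used by the script):

# generate a key pair if you don't have one yet
# (use an empty passphrase if cron needs to run the backup unattended)
ssh-keygen -t ed25519

# copy the public key to the Synology NAS (user and host name are examples)
ssh-copy-id root@diskstation.example.org

# verify that password-less login now works
ssh root@diskstation.example.org true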

#!/bin/bash
#
# This script makes a backup of my home dirs to a Synology DiskStation at
# another location. I use ZFS for my /home, so I make a snapshot first and
# backup from there.
#
# This script requires that the first command line argument is the
# host name of the remote backup server (the Synology NAS). It also
# assumes that the location of the backups is the same on each
# remote backup server.
#
# Time-stamp: <2014-10-27 11:35:39 (L.C. Karssen)>
# This script is licensed under the GNU GPLv3.
 
set -u
 
if [ ${#} -lt 1 ]; then
    echo -n "ERROR: Please specify a host name as first command" 1>&2
    echo " line option" 1>&2
    exit -1
fi
 
###############################
# Some settings
###############################
# Options for the remote (Synology) backup destination
DESTHOST=$1
DESTUSER=root
DESTPATH=/volume1/Backups/
DEST=${DESTUSER}@${DESTHOST}:${DESTPATH}
 
# Options for the client (the data to be backed up)
# ZFS options
ZFS_POOL=storage
ZFS_DATASET=home
ZFS_SNAPSHOT=rsync_snapshot
SNAPDIR="/home/.zfs/snapshot/$ZFS_SNAPSHOT"
 
# Backup source path. Don't forget to have trailing / otherwise
# rsync's --delete option won't work
SRC=${SNAPDIR}/
 
# rsync options
OPTIONS="--delete -azvhHSP --numeric-ids --stats"
OPTIONS="$OPTIONS --timeout=60 --delete-excluded"
OPTIONS="$OPTIONS --skip-compress=gz/jpg/mp[34]/7z/bz2/ace/avi/deb/gpg/iso/jpeg/lz/lzma/lzo/mov/ogg/png/rar/CR2/JPG/MOV"
EXCLUSIONS="--exclude lost+found --exclude .thumbnails --exclude .gvfs"
EXCLUSIONS="$EXCLUSIONS --exclude .cache --exclude Cache"
EXCLUSIONS="$EXCLUSIONS --exclude .local/share/Trash"
EXCLUSIONS="$EXCLUSIONS --exclude home/lennart/tmp/Downloads/*.iso"
EXCLUSIONS="$EXCLUSIONS --exclude home/lennart/.recycle"
EXCLUSIONS="$EXCLUSIONS --exclude _dev_dvb_adapter0_Philips_TDA10023_DVB*"
 
 
 
###############################
# The real work
###############################
 
# Create the ZFS snapshot
if [ -d $SNAPDIR ]; then
    # If the directory exists, another backup process may be running
    echo "Directory $SNAPDIR already exists! Is another backup still running?"
    exit -1
else
    # Let's make snapshots
    zfs snapshot $ZFS_POOL/$ZFS_DATASET@$ZFS_SNAPSHOT
fi
 
 
# Do the actual backup
rsync -e 'ssh' $OPTIONS $EXCLUSIONS $SRC $DEST
 
# Remove the ZFS snapshot
if [ -d $SNAPDIR ]; then
    zfs destroy $ZFS_POOL/$ZFS_DATASET@$ZFS_SNAPSHOT
else
    echo "$SNAPDIR does not exist!" 1>&2
    exit 2
fi
 
exit 0


Using ‘expect’ to distribute files among users

I’m currently teaching at the Summer School in Statistical Omics in Split, Croatia. A great experience!

Because of the computations involved in the project work, we have access to a server. However, since the machine is part of a university cluster, I haven’t been given full root permissions (in fact, I’m only allowed to use sudo to install packages).

Now, the problem I had to solve was that I needed to distribute a certain file (.Renviron) to each student’s home directory. Normally I’d use sudo to do that, but the admin hadn’t allowed me to use cp via sudo. Fortunately, I had a list of user names and passwords for the students (because I had to distribute those), so I thought I’d use su - to change to each student’s account and copy the file, something along the lines of

echo PASSWORD | su -

and then loop over each account. Unfortunately, while testing the script I found out it wouldn’t work, since su complained:

su: must be run from a terminal

Then I remembered the expect tool, which executes commands based on what it ‘sees’ on the command line. In this case I wanted it to enter the password at su‘s prompt. This is the expect script I came up with; it accepts two command line arguments, the user name and the password:

#!/usr/bin/expect -f
 
set user [lindex $argv 0]
set pass [lindex $argv 1]
 
spawn su - $user
expect "Password: "
send "$pass\r"
expect "$ "
send "cp -i /common/WORK/school/lennart/.Renviron .\r"
expect "$ "
send "ls -l .Renviron\r"
expect "$ "
send "exit\r"

The script was wrapped in the Bash script that I had already written:

#!/bin/bash
#
# This script is used to copy files from this directory to the
# home directories of the users listed in $USERFILE.
 
USERFILE=accounts.txt
SRCFILE=/common/WORK/school/lennart/.Renviron
 
while read user passw; do
    ./copy_file_to_users.expect $user $passw
done < $USERFILE
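
For completeness: the accounts.txt file is assumed to contain one user name and password per line, separated by whitespace, for example (made-up values):

student01 s3cr3tPass1
student02 s3cr3tPass2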


Fixing colours in git output after upgrading to Ubuntu 14.04

After upgrading my Ubuntu 13.10 installation to 14.04, I noticed that the output of several git commands (e.g. git diff and git log) didn’t show colours as they used to, but showed ESC[ ANSI codes instead.
A quick internet search led to this post on unix.stackexchange.com where the LESS environment variable was ‘blamed’. Indeed, I have my LESS variable (re-)defined in my .bashrc and .zshrc files.

The solution was to add -R to the environment variable, which allows raw control characters to be displayed. I now have the environment variable defined as:

LESS='--quiet -X -F -R'
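
Note that the variable needs to be exported for it to reach the less process that git starts, so in ~/.bashrc or ~/.zshrc the line should look something like this:

# export so that child processes (such as git's pager) see it
export LESS='--quiet -X -F -R'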


Replacing a character in a Bash variable name

Today I needed to replace a : in a bunch of file names with a -, so I wanted to write a Bash for-loop to do just that. I vaguely remembered that you can do character replacements within variables, but couldn’t remember the details.

This is how it’s done:

for filename in *; do
    mv "$filename" "${filename/:/-}"
done

I put the variables in double quotes, because the file names contained spaces.
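
Note that a single slash only replaces the first occurrence of the pattern; a double slash replaces all occurrences. A quick sketch (the file name is made up):

f="a:b:c.txt"
echo "${f/:/-}"     # prints a-b:c.txt  (first colon only)
echo "${f//:/-}"    # prints a-b-c.txt  (all colons)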


Exit a Bash script if an error occurs

Last week I found out that a Bash script I wrote to do some data QC gave me a false sense of security: a script continues even if one (or more) of the statements in the script fails (with an exit status not equal to 0). It turned out that for some of the data sets the QC wasn’t done correctly because I didn’t check the exit status after each step.

My first thought was: oh boy, that means I have to check $? for every step. That means a lot of repetitive code to write! Luckily my colleague came up with the answer: add

set -e

at the top of your Bash script and the script will stop as soon as one of its statements fails (for the fine print see the top answer in this StackOverflow post).
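
A minimal sketch of the effect (the file names are made up): without set -e the final echo would still run even though the cp failed.

#!/bin/bash
set -e

# this command fails with a non-zero exit status...
cp /nonexistent/input.csv data.csv

# ...so the script stops here and the next line is never reached
echo "QC finished"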


Embedding album art in FLAC files

I recently wanted to add cover art to my collection of FLAC-encoded audio files. I wrote the following simple script to help me automate the process. Running this script in a given directory (I group my music in directories per artist, with a subdirectory for each album) with the name of the album art image file as argument automatically embeds the image in the FLAC/Vorbis tags.

#!/bin/bash
#
# This script embeds a given image (usually .jpg) as album art in the
# FLAC files in the present directory (and its subdirectories).
#
# Time-stamp: <2011-07-31 20:43:23 (lennart)>
 
coverart=$1
 
find . -name "*.flac" -print0 |xargs -0 metaflac --import-picture-from="$coverart"
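
Usage is then a matter of changing into an album directory and calling the script with the image file as argument, for example (the paths and script name are just examples):

cd ~/Music/Some_Artist/Some_Album    # hypothetical album directory
embed-album-art.sh cover.jpg         # assuming the script is saved under this name in your PATH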


Enable incremental-search-forward in Bash

I just read Ruslan Spivak’s blog posting on how to get Ctrl-s (which is bound to incremental-search-forward in Emacs) working to search incrementally through the command history in Bash.

Normally this behaviour is shadowed by a terminal flow-control key binding. To turn that off and ‘reveal’ the search-forward function, simply type

stty -ixon

(of course, adding this line to your ~/.bashrc file makes it permanent).

Great to get this working! Thanks Ruslan.

