Obtain image file information using the Linux shell

Sometimes you have to evaluate an image, i.e. find out its width and height, check them against a maximum or minimum, and take certain actions based on the outcome. You want to do this using the Linux shell.
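One way to do this is with ImageMagick's identify tool. A minimal sketch, assuming ImageMagick is installed (the file name and limit in the usage example below are placeholders):

```shell
#!/bin/sh
# Read an image's width and height with ImageMagick's `identify`
# and compare the width against a maximum.

check_width() {  # usage: check_width FILE MAX_WIDTH
    w=$(identify -format '%w' "$1") || return 2   # width in pixels
    h=$(identify -format '%h' "$1") || return 2   # height in pixels
    if [ "$w" -gt "$2" ]; then
        echo "too wide: ${w}x${h}"
        return 1
    fi
    echo "ok: ${w}x${h}"
}
```

Calling e.g. `check_width photo.jpg 1024` prints either `ok: WxH` or `too wide: WxH` and sets the exit status accordingly, so it can drive further actions in a script.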


RTC device behavior in Linux

Linux features a command called rtcwake which can turn the computer off, suspend it to memory, suspend it to the hard drive, or enter some other power-saving mode, and then turn it back on using the motherboard's RTC device. Using Ubuntu Server 11.x, rtcwake with the "off" option worked as expected. On the very same machine with Ubuntu Desktop 10.10, after executing rtcwake the computer stays on and rtcwake returns control to the command line without any feedback (no error, no success message). How can the "off" option (and the other options) be made to work with rtcwake?
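For reference, these are the typical invocations (a sketch; whether each mode actually works depends on kernel, BIOS and ACPI support on the machine):

```shell
# Suspend to RAM and wake again after 60 seconds (requires root):
rtcwake -m mem -s 60

# Power off and program the RTC to boot the machine at an absolute time
# (here: tomorrow 06:00, converted to seconds since the epoch):
rtcwake -m off -t "$(date +%s -d 'tomorrow 06:00')"

# Arm the alarm without suspending (-m no), then inspect it (-m show):
rtcwake -m no -s 3600
rtcwake -m show
```

Comparing the `-m no`/`-m show` output between the working and the non-working install can show whether the RTC alarm is being armed at all.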


Producing a screendump of a website on a Linux server

I'm in a situation where I need to produce a screenshot of a website as part of an automated process on a web server. While this is not a big deal on a desktop, making a screendump on a machine that has, in fact, no screen at all proves a bit more challenging. The solution should obviously be command-line based. I also do not want to rely on any third-party website (of which there are a ton out there) to produce the screenshots. The preferred output would be an image in any of the common formats (JPG, PNG, GIF).
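One self-hosted approach (an assumption, not the only option) is CutyCapt running inside a throwaway virtual X server provided by xvfb:

```shell
# Render a page to PNG on a headless server. Both packages are assumed
# to be installed first (e.g. apt-get install cutycapt xvfb).
# xvfb-run -a starts a temporary X server so the WebKit renderer
# has a display to draw on, even though no physical screen exists.
xvfb-run -a cutycapt --url=http://example.com/ --out=screenshot.png
```

The URL and output path are placeholders; the output format follows from the file extension.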

Synchronizing folders in Linux by using rsync

Linux has a nifty little tool called rsync that should be available on the distribution of your choice (provided it was updated at least once since the stone age). From the man pages:

Rsync is a fast and extraordinarily versatile file copying tool. It can copy locally, to/from another host over any remote shell, or to/from a remote rsync daemon. It offers a large number of options that control every aspect of its behavior and permit very flexible specification of the set of files to be copied. It is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination. Rsync is widely used for backups and mirroring and as an improved copy command for everyday use.

So rsync is perfect for the job of keeping folders on different machines up to date. The proposed solution uses a central server and looks as follows:

  • Rsync is running in daemon mode on the server. For authentication a file with user:password pairs is used. Information on how to set up an rsync daemon can be found in the related man pages.
  • On the client machines rsync is executed by a small script. Authentication again happens through a file stating the credentials, which is passed to rsync via the --password-file parameter. This allows for automation, achieved by running the script from a cron job as well as during boot and shutdown. How to achieve this depends on the distribution in use; on e.g. Arch Linux, rc.local and /etc/rc.local.shutdown would be good places for executing the script on boot and shutdown respectively. Information on how to set up a cron job can be found in the man pages for crontab.


Some caveats:

  • The user credentials for authentication are stored in a plain text file. While this is necessary for automated execution, it may be a no-go for some people. The alternatives are not using a password at all or using SSH as the connection protocol. Both options have their own pitfalls.
  • The setup (i.e. the exact calls and paths) on the client side will vary from distribution to distribution and will most likely need tweaking on each new machine; there is no solution that works out of the box.


Synchronise folders between Linux server and client

I sometimes find myself in a situation where I work on one project but use several different computer systems (all of which run Linux) for it. An example would be an exercise for university which I work on at home on my PC, but then take my laptop to university to discuss it with colleagues. Always keeping all the files up to date on both systems is tedious and I often forget to do so; hence my motivation for this challenge.

I have a small server at home which runs 24/7 and is also accessible from outside via SSH. I want to use it as something of a central storage unit for project folders. For that I need a system in place that automatically synchronizes selected local folders to and from that server. The synchronization should occur without me having to initiate it, as I'd surely forget that. "Synchronization" in this context means that all files present in a folder on one machine should also be present in an identically named folder on the other one. If a file has a more recent modification date on one machine than on the other, the more recent file should replace the older one. Tracking the changes or keeping a history of changes is not necessary. To sum it up:

  • file synchronization between a Linux client and server
  • no graphical user interface on the server side
  • SSH access to the server
  • automated, no user input necessary

Regards and thanks for any input.
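Since the server is already reachable over SSH, one low-tech way to meet these requirements (host name and paths are assumptions) is a pair of rsync calls run from cron; `-u` ensures the more recently modified file always wins:

```shell
# crontab entry (crontab -e): every 10 minutes, push then pull.
*/10 * * * * rsync -auq -e ssh "$HOME/project/" user@server:project/ && rsync -auq -e ssh user@server:project/ "$HOME/project/"
```

Key-based SSH authentication is needed so no password prompt appears. Note that this decides purely by modification time and will not merge concurrent edits to the same file.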

Setting up a Linux Server to use LDAP for Samba, SSH

The main issue is that Samba and SSH do not use LDAP directly; instead you have to go through NSS and PAM, which in turn access the LDAP directory. So in order to set up two services, about five components have to be configured, which seems difficult, and I have not yet been able to find proper documentation covering the whole process.
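For orientation, the NSS part of the chain boils down to a couple of lines like these (a sketch assuming the libnss-ldap route, not a complete setup):

```
# /etc/nsswitch.conf (excerpt): look up accounts in local files first,
# then in LDAP, so SSH and Samba see LDAP users as ordinary system users.
passwd: files ldap
group:  files ldap
shadow: files ldap
```

PAM then handles the actual authentication against LDAP, and Samba additionally needs its own LDAP passdb backend configured in smb.conf.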

Easy-to-use backup tool for Linux

Create an incremental backup of a Debian Linux server with an easy-to-use tool. The backup server is reachable via FTP and SFTP. The backup should be encrypted and compressed to be space- and bandwidth-efficient. There should be an easy way to restore either parts of the backup or the full system after a failure.
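One tool that matches these requirements is duplicity: it produces GPG-encrypted, compressed, incremental archives and supports FTP and SFTP backends. A sketch (host, paths and the full/incremental policy are assumptions):

```shell
# Full backup once a month, incrementals in between; archives are
# compressed and GPG-encrypted by default (set PASSPHRASE in the
# environment for unattended runs).
duplicity --full-if-older-than 1M /home sftp://backupuser@backup-server/home-backup

# Restore a single file (path relative to the backed-up directory):
duplicity --file-to-restore alice/notes.txt sftp://backupuser@backup-server/home-backup /tmp/notes.txt
```

Restoring everything works the same way without --file-to-restore, with the backup URL as source and a local directory as target.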

Perl script for that solution

This is a simple Perl script using mbox:parser to check for mails and store the attachments in a given folder. It can be invoked via crontab, which also takes care of emptying the mailbox afterwards, e.g.:

*/1 * * * * root sudo -u scans getscans >> /data/daten/000_Scans_Server/00logfile.txt; chmod a+r /data/daten/000_Scans_Server/00logfile.txt; echo -n > /var/mail/scans


Don't Use SBS as the Default Gateway

After having issues with dropped internet connections when a Small Business Server 2003 was used as the (default) gateway, I started to do some quick research. However, it didn't turn up anything too useful.

While Microsoft proposed using the SBS as the default gateway (with two NICs) as a viable solution, I wasn't so sure: if the server were compromised, it would be game over. There was no second layer of security, and the whole setup didn't work as expected anyway.

So instead of solving the problem (the security concerns couldn't be resolved anyway), I just mitigated it by using an old server as a Linux firewall. For easier handling I chose (no, I'm not affiliated with them, this is no hidden product placement) one of a bunch of free (GPL) unified threat management (UTM) systems. It offers inbound and outbound firewalls, proxies for HTTP, POP3, SMTP, ... (with capabilities for spam detection, virus scanning, ...), logging, VPN, ...
So it definitely adds an additional layer of protection.

After deploying the system, the internet connection has been rock-solid until today.

Lesson learned: think outside the box. By solving one problem (the internet connection), one can often achieve something else as well (better security).

Final note: Windows SBS 2008 cannot be used as a default gateway any more. Perhaps Microsoft wants to sell more ISA licenses, or perhaps it simply didn't work too well...


How to enable sound in Ubuntu Linux on Asus computers

I have installed Ubuntu 8.04 on my ASUS laptop and it works fine, except that I cannot get the sound to work. It stays silent even though the speakers are not muted. How can one get the sound to work?
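A common fix for silent HDA audio chips on laptops of that era (an assumption; it depends on the exact codec) is forcing a model quirk for the snd-hda-intel driver, as described in the ALSA documentation (Documentation/sound/alsa/HD-Audio-Models.txt in the kernel tree):

```
# /etc/modprobe.d/alsa-base.conf (excerpt); "3stack" is only an example
# value, the right one depends on the codec in the laptop.
options snd-hda-intel model=3stack
```

After adding the line, reload the module (or reboot) and check with alsamixer that no channel is muted.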

