Sunday, 11 October 2009
A plugin for using GCstar to ... Written by Jan Girlich in technique at 18:49
You might know that I have been collecting Lustige Taschenbücher (LTB) since I was a little kid. I have 258 issues by now and wanted to catalogue them with GCstar to get a better overview. Unfortunately, the plugins shipped with it for automatically fetching information about books do not cover any webpage listing LTBs. That's probably because LTBs are only sold in German-speaking areas and thus are only locally popular.
But there is a quite nice webpage listing a lot of them at http://www.lustige-taschenbuecher.de/. So all I had to do was write a plugin that grabs the data from this webpage and inserts it into the database, something dozens of people have done before me and which is documented.
I won't go into details here (I might at some other point in time if someone wants me to), but Perl's HTML::Parser comes straight out of hell! Anyway, I got it working.
If you want to use it as well, download the plugin and place it in your GCstar lib directory (on my Ubuntu that's /usr/share/gcstar/lib/GCPlugins/GCcomics/).
I changed the database layout of the generic GCcomics model a little to make it work with LTBs. See this diff for the changes and apply them to your copy, or the plugin won't work (the file GCcomics.gcm is most likely found at /usr/share/gcstar/lib/GCModels/).
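For reference, the whole installation boils down to a copy and a patch. A sketch, with placeholder file names standing in for whatever you downloaded:

sudo cp GCLustigeTaschenbuecher.pm /usr/share/gcstar/lib/GCPlugins/GCcomics/
sudo patch /usr/share/gcstar/lib/GCModels/GCcomics.gcm < gccomics-ltb.diff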
Now start GCstar, create a new comic book collection, add a new item, enter the issue number of the LTB as the volume, hit the internet search button, select to query by volume, select the LTB plugin, and be happy. I catalogued my whole collection within an hour. Be aware that some reprints are not listed on the webpage, especially the early reprints up to about volume 120.
PS: Anyone willing to swap some LTBs with me?
Tuesday, 6 October 2009
My Backup Solution using BackInTime Written by Jan Girlich in technique at 22:49
For a while now I have been trying to figure out the right way of doing backups on my Ubuntu Linux system. I have some requirements I want to describe briefly before continuing.
Don't do regular full backups
I used sbackup so far, but it regularly does a full backup, thus taking a long time (I have about 100 GB I want backed up) and using up a lot of disk space.
Store the backups on a remote system
Backup basics: keep your backup at a logically and spatially separate place from your computer. So I want it to work over the network, and please encrypt the file transfer. And only send the data of changed files, keeping network traffic low.
Don't bother me
Work in the background and just do your job. Don't tell me what you're doing unless I ask you to. I don't even want to know you exist. The only reason I interact with you is the worst case: I have to restore something.
Easy to set up and use
I want a solution that is easy to set up and doesn't need complicated settings or extensive care to keep running. I don't want to change tapes or drives, free up disk space, or tell you whether the target is available or not; figure that out for yourself and do something sensible. And if I ever have to restore files, I want an easy-to-use interface that lets me quickly restore everything.
Restoring is also easily possible without the backup program
No, looking inside incremental .tar.gz files is not easy. I'd rather have complete snapshots I can easily browse and copy files out of with any basic operating system. But keep the needed space low.
Try to back up every little change, ASAP
The most annoying thing about most backup solutions, I think, is that they only do a backup once a day or so. With a full backup of 100 GB over the network, this might take longer than my computer's longest stretch of uptime on a day at home. But I still want pretty much any changed file to be backed up, so I can undo very recent changes.
I really liked what I heard about TimeMachine from Apple. But I don't use Apple, so I was looking for something similar for Linux. There are a couple of promising projects like TimeVault and FlyBack, if only they hadn't been deserted for about two years.
So what does BackInTime do that makes me like it so much? It takes snapshots that look and browse like complete copies, but unchanged files are hard links to the previous snapshot, so only changed files take up new space. That covers pretty much all of the requirements above.
But it does not store backups on remote systems. Well, you can mount a remote share and do the backup onto that. Luckily, BackInTime internally uses rsync, which uses delta encoding to reduce the amount of data sent over the network. But whenever the target is not available, BackInTime gets annoying, telling me about it every 30 seconds or so. Unacceptable. But there is something you can do about it.
BackInTime is called every 5 minutes by cron, so I wrote a little wrapper script that checks whether the target computer can be found on the local network and mounts the share before calling BackInTime.
It's assumed that your server's share is accessible via SSH and that your user logs in with SSH keys. There's a good howto on that in Ubuntu's help pages.
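The wrapper is short; here is a minimal sketch of what such a script can look like, assuming the server is called slug and exports /backup for a user named backupuser (all of these names are placeholders):

#!/bin/sh
# backintimeprep.sh -- mount the backup share if the server is up,
# then hand over to BackInTime (hostname, user and paths are assumptions)
SERVER=slug
REMOTE=backupuser@$SERVER:/backup
MOUNTPOINT=$HOME/backup

# give up silently if the server does not answer a single ping
ping -c 1 -W 2 "$SERVER" >/dev/null 2>&1 || exit 0

# mount the share via SSHFS unless it is already mounted
if ! mountpoint -q "$MOUNTPOINT"; then
    mkdir -p "$MOUNTPOINT"
    sshfs "$REMOTE" "$MOUNTPOINT" || exit 1
fi

# take the actual snapshot
backintime --backup

# unmount again so the share does not linger around
fusermount -u "$MOUNTPOINT"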
Save it as e.g. ~/bin/backintimeprep.sh and change the crontab entry to something like
*/5 * * * * ~/bin/backintimeprep.sh >/dev/null 2>&1
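Two details that are easy to forget: the script must be executable, and the entry lives in your user crontab, which you can edit with crontab -e.

chmod +x ~/bin/backintimeprep.sh
crontab -e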
This script is very basic and fails if it finds another server with the same IP address.
Try it out and have fun! I just finished my initial backup and have it running in the background. Hopefully this ends my struggle for a good backup solution.
Using CIFS instead of SSHFS
My slug is very slow: I got a transfer rate of only about 1 MB/s while the CPU was totally hogged, rendering the machine unusable for anything else during a backup. So I was looking for another solution with less CPU usage and better transfer rates.
CIFS is quick and doesn't use much CPU time. But it lacks the security of an encrypted connection, which I decided is negligible in my local network.
I assume you know your way around with Samba, i.e. how to add users. My /etc/samba/smb.conf on the server contains a share section along these lines:
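# a sketch of the kind of share section meant here; share name,
# path and user are placeholders
[backup]
   path = /backup
   valid users = backupuser
   writable = yes
   browseable = no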
In the script backintimeprep.sh, change the lines for mounting and unmounting the shares along these lines (a sketch; you might need to fiddle around with the charset option to make e.g. Umlauts work):
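# mount via CIFS instead of SSHFS; iocharset=utf8 makes e.g. Umlauts
# in file names work (share name, user and uid handling are assumptions)
sudo mount -t cifs //$SERVER/backup "$MOUNTPOINT" \
    -o user=backupuser,iocharset=utf8,uid=$(id -u)

# and unmount with plain umount instead of fusermount
sudo umount "$MOUNTPOINT"

# note: mounting CIFS needs root, so the cron job either needs
# passwordless sudo for these two commands or a user-mountable
# /etc/fstab entry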
Works great like this. Much better performance.
Friday, 2 October 2009
Recently, a couple of services came up offering online storage and a software client for easy synchronisation between several machines, easy sharing, and backup. It seems the idea is actually quite old, but I think it is really sexy.
There is a nice video on DropBox's webpage explaining what those services are good for, so I won't explain the advantages here but instead redirect you to the video.
If I wanted to back up all my valuable data, I would need more than 100 GB, which is more than the 50 to 100 GB both services offer to paying customers. But just for quickly sharing files, making some important files available everywhere, and similar things, the 2 GB offered by both services are fine.
While UbuntuOne comes with an open source client, which can be trusted, adapted, and ported to other platforms, DropBox is a proprietary, closed-source product. Funnily enough, DropBox is also available for Mac and Windows while UbuntuOne isn't, yet. The DropBox client needs some tricks to install on Ubuntu and probably on other Linuxes, too. But in the long run I see bigger potential in UbuntuOne being ported to anything you could possibly wish for, just because it's open source.
Both clients integrate very nicely with Nautilus and let you share files or folders via right click. Only details differ here.
Feature-wise the clients are not very different either, but the way they work differs a little. There is only one feature I really miss in UbuntuOne, and it's the reason I use DropBox: with DropBox you can share files via a simple HTTP link, while with UbuntuOne you can only share files if both partners have a launchpad.net account. This is very bad for the very typical situation where you want to send a large file to someone. The direct link you can obtain in DropBox by right-clicking a file comes in very handy in this use case.
Right now I prefer DropBox over UbuntuOne because of the available Windows client and because sharing files via a direct web link needs no client or account on the receiving end. But I predict bigger potential for UbuntuOne because of better integration possibilities with Ubuntu, easier installation, and because it is open source, so there can be clients for any OS and architecture.