Unmanned Spaceflight.com _ Forum Management Topics _ Wanted - Cron-Job SQL/PHP Server guru

Posted by: djellison Nov 7 2008, 02:00 PM

Amongst our troops there must be someone who knows how to write cron jobs.

I'd like to set up the UMSF server to automate backups of the forum DB and attachments folder, but it's just a bit beyond me. It's a LAMP setup running cPanel 11.

Anyone who has 'the knowledge' to help me out - let me know!

No pay, but you can have a UMSF subdomain to put stuff on (like Mike has for MMB anims) as a UMSF perk biggrin.gif

Cheers

Doug

Posted by: imipak Nov 7 2008, 05:27 PM

Hi there! smile.gif

If it's a simple file copy (or compress, copy the file off-box, delete the local copy), that'd be pretty trivial to do. PHP probably isn't the ideal language for the script, though; a tiny shell script is the best way to do it.

Posted by: Fran Ontanaya Nov 7 2008, 07:24 PM

As imipak says, if it's just about copying files, it's quite simple. I use a bash script to copy, zip, encrypt and upload my docs to an external server. For a five-times-a-month backup, a simple script could look like this:

#!/bin/bash
# Back up ~/folder/mydocs on the 1st, 7th, 13th, 19th and 25th of each month.
THEDAY=$(date +%d)       # day of month, zero-padded: 01..31
THEDATE=$(date +%m-%d)
# Note the zero-padded comparisons: date +%d prints "07", not "7".
if [ "$THEDAY" = 01 -o "$THEDAY" = 07 -o "$THEDAY" = 13 -o "$THEDAY" = 19 -o "$THEDAY" = 25 ]
then
    zip -r ~/backups/backup-mydocs-"$THEDATE".zip ~/folder/mydocs/
fi
echo 'Backup finished. Closing...'
sleep 2

That zips everything in the folder '~/folder/mydocs/'. Since the filename only carries the month and day, each archive gets reused (overwritten) a year later.

You can edit the crontab with "crontab -e". I do my backups at 20:16 because I know the computer is on at that time. The line would look like this:

16 20 * * * ~/backups/backup.sh >/dev/null 2>&1 # Backup
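
The >/dev/null 2>&1 part just discards the script's output so cron doesn't email it to you on every run. To confirm the entry was saved, "crontab -l" prints the installed table:

crontab -l
16 20 * * * ~/backups/backup.sh >/dev/null 2>&1 # Backup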

---

Or you can use a script like this:

#!/bin/bash
# Unconditional version: cron itself decides which days to run it.
THEDATE=$(date +%m-%d)
zip -r ~/backups/backup-mydocs-"$THEDATE".zip ~/folder/mydocs/
echo 'Backup finished. Closing...'
sleep 2

And crontab lines like these, for the same effect:

16 20 1 * * ~/backups/backup.sh >/dev/null 2>&1 # Backup
16 20 7 * * ~/backups/backup.sh >/dev/null 2>&1 # Backup
16 20 13 * * ~/backups/backup.sh >/dev/null 2>&1 # Backup
16 20 19 * * ~/backups/backup.sh >/dev/null 2>&1 # Backup
16 20 25 * * ~/backups/backup.sh >/dev/null 2>&1 # Backup

PS. I decline the subdomain offer.

Posted by: Fran Ontanaya Nov 7 2008, 07:42 PM

Same script with FTP. It deletes the local copy afterwards:

#!/bin/bash
# Zip the docs, upload the archive over FTP, then delete the local copy.
# Note: plain FTP sends the password and data unencrypted.
HOST='xxxx.xxxxxx.com'
USER='xxxxxx'
PASSWD='xxxxxx'
THEDATE=$(date +%m-%d)
zip -r "$HOME/backups/backup-mydocs-$THEDATE.zip" ~/mydocs/
# $HOME rather than ~ below: the shell expands variables inside the
# here-document, but not tildes.
ftp -v -n "$HOST" <<**
user $USER $PASSWD
cd backups/mydocs/
bin
put $HOME/backups/backup-mydocs-$THEDATE.zip
bye
**
rm "$HOME/backups/backup-mydocs-$THEDATE.zip"
echo 'Backup finished. Closing...'
sleep 2
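
Doug's original ask also covers the forum database, which plain file copies won't capture consistently while MySQL is running. A dump step could go in front of the zip; here's a minimal sketch assuming MySQL, with made-up credentials, database name and attachments path (cPanel/IPB layouts vary):

#!/bin/bash
# Sketch: dump the forum DB, bundle it with the attachments, clean up.
# DBUSER, DBPASS, forum_db and the uploads path are placeholders.
THEDATE=$(date +%m-%d)
mysqldump -u DBUSER -pDBPASS forum_db > "$HOME/backups/forum-db-$THEDATE.sql"
zip -r "$HOME/backups/backup-forum-$THEDATE.zip" \
    "$HOME/backups/forum-db-$THEDATE.sql" ~/public_html/forum/uploads/
rm "$HOME/backups/forum-db-$THEDATE.sql"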

Posted by: djellison Nov 7 2008, 08:13 PM

See - that's genius - I may come back with more questions, but this is a great start!

Cheers

Doug

Posted by: imipak Nov 7 2008, 09:40 PM

Wow, there's not much I can add to Fran's comment. Tchah! And I had geekpr0n.unmannedspaceflight.com all worked out in my mind's eye... wink.gif biggrin.gif

If you want to get fancy / clever in future, you could try using rsync to do incremental (only save the changes) remote network backups with ssh crypto. A simple incantation is a one-liner:

rsync -a -k -v -e ssh /home/andrew/* andrew@sluggbox:/media/sdb1/home/backup/thinkpad/


It's less typing than the shell-script approach, but you have to be able to install rsync if it's not there already, set up the remote machine, and handle the bandwidth costs if you're doing this over the Internet rather than to the next machine down in the rack. The advantage of having the backups offsite is that you can easily be back up and running if the data centre / hosting location is offline for an extended period. There are surprisingly many mundane failures, like peering disputes partitioning the network (http://www.google.co.uk/search?hl=en&q=sprint%20cogent%20partition&meta=) [1], that can cause you to lose connectivity to the site, apart from the apocalyptic stuff the DR geeks get exercised about. Also, the traffic on the wire is encrypted. [2]
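
One wrinkle if this is to run unattended from cron: ssh has to log in without prompting for a password. A minimal sketch using a dedicated passphrase-less key, reusing the hostname and paths from the one-liner above:

# Generate a key with no passphrase and push it to the backup host.
ssh-keygen -t rsa -N '' -f ~/.ssh/backup_key
ssh-copy-id -i ~/.ssh/backup_key.pub andrew@sluggbox
# Then tell rsync's ssh to use that key:
rsync -a -v -e "ssh -i ~/.ssh/backup_key" /home/andrew/* andrew@sluggbox:/media/sdb1/home/backup/thinkpad/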

I hacked up something along these lines when my techno-paranoid father finally bought a digital camera, and started accumulating irreplaceable data which he stored on a single HD, in a cheap Dell box, situated in a single-glazed, ground-floor room, in a completely un-alarmed house, in front of a window next to a public footpath. Oh yeah, and the window doesn't have a lock. [3]

There are also a couple of projects that use rsync as the core of a more full-featured network backup setup. A friend swears by Bacula (http://www.bacula.org/en/), though I've not tried it myself. Lots of similar rsync-based projects are out there (http://www.google.co.uk/search?hl=en&safe=off&q=rsync+backup+site%3Asourceforge.net&btnG=Search&meta=).


[1] *cough* Gloucester floods 2007 fasthosts

[2] One of the dirty little secrets of infosec is that examples of unencrypted traffic being maliciously intercepted are pretty rare these days. Unless, that is, the victims are too embarrassed to call the cops -- or don't realise it's happening...

[3] Oh yes, and it's also an absolute tinderbox of 18th century untreated wood, protected only by haphazardly concealed battery-operated smoke alarms which I hide around the place whenever I'm there, but which they throw out when the battery goes flat and they start beeping. If you need to give a sparkie sleepless nights, ask me for a pic of the fuseboard... </tangent>


Posted by: RoverDriver Nov 21 2008, 05:55 PM

QUOTE (Fran Ontanaya @ Nov 7 2008, 11:42 AM) *
Same script with FTP. It deletes the local copy afterwards: [...]

On my server at home and on my workstation at work I use gtar instead of zip, but the principle is the same. I do full backups on Sundays and an incremental backup every day (a sketch follows the suggestions below).
Some suggestions:

- consider ISO 9660 images: if you want to retrieve one file or a handful, an image makes it easy to fetch single isolated files. I prefer tarballs myself, since even if they get corrupted you can still recover some files out of them.
- consider file system / ssh / rsync file size limits
- whatever you do, periodically check your backups: try to unpack them and see whether anything is missing.
- if you are going to provide the hardware (disks) for the backup, NEVER EVER use the same brand for your main disks and your backup disks (*). Purchase different brands at different dates so that their wear and tear differs.
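
The full-plus-incremental scheme is easy to reproduce with GNU tar's snapshot feature (the "gtar" mentioned above; plain "tar" on Linux). A minimal sketch with placeholder paths:

#!/bin/bash
# Full backup on Sundays, incremental (changes only) the rest of the week.
SNAP="$HOME/backups/mydocs.snar"   # GNU tar's record of what has changed
THEDATE=$(date +%m-%d)
if [ "$(date +%u)" = 7 ]; then     # %u: 7 = Sunday
    rm -f "$SNAP"                  # no snapshot file => full (level 0) backup
fi
tar --create --gzip --listed-incremental="$SNAP" \
    --file="$HOME/backups/backup-mydocs-$THEDATE.tar.gz" ~/mydocs/
# Per the advice above, sanity-check that the archive is readable:
tar --list --gzip --file="$HOME/backups/backup-mydocs-$THEDATE.tar.gz" >/dev/null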

Paolo

Posted by: djellison Nov 21 2008, 06:43 PM

You know how people suggest off-site backups? Have you ever thought of, you know, somewhere for something really important... I mean, it would be VERY off-site. It would be a good place to store passwords wink.gif

Doug

Posted by: RoverDriver Nov 21 2008, 07:10 PM

QUOTE (djellison @ Nov 21 2008, 10:43 AM) *
You know how people suggest off-site backups? Have you ever thought of, you know, somewhere for something really important... I mean, it would be VERY off-site. It would be a good place to store passwords wink.gif

Doug



Can't do that. We always get in trouble for taking up too much flash...

Paolo

Powered by Invision Power Board (http://www.invisionboard.com)
© Invision Power Services (http://www.invisionpower.com)