Wanted - Cron-Job SQL/PHP Server guru |
Nov 7 2008, 02:00 PM
Post
#1
|
|
Founder Group: Chairman Posts: 14434 Joined: 8-February 04 Member No.: 1 |
Amongst our troops there must be someone who knows how to write cron jobs.
I'd like to set up the UMSF server to automate backups of the forum DB and attachments folder, but it's just a bit beyond me. It's a LAMP setup using cPanel 11. Anyone who has 'the knowledge' to help me out - let me know! No pay, but you can have a UMSF subdomain to put stuff on (like Mike has for MMB anims) as a UMSF perk. Cheers Doug |
|
|
Nov 7 2008, 05:27 PM
Post
#2
|
|
Member Group: Members Posts: 646 Joined: 23-December 05 From: Forest of Dean Member No.: 617 |
Hi there!
If it's a simple file copy (or compress, copy the file off the box, delete the local copy), that'd be pretty trivial to do. PHP's not the ideal language to write the script in, though; a tiny shell script is probably the best way to do it. -------------------- --
Viva software libre! |
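A shell script for Doug's case needs a dump step for the DB half of the job as well as the file copy. A minimal sketch, with the caveat that 'forum_db' and all paths here are placeholders (not UMSF's real settings), credentials would live in ~/.my.cnf rather than in the script, and scratch directories stand in for the real site so the sketch runs anywhere:

```shell
#!/bin/bash
set -e

# Scratch dirs stand in for the real site and backup location;
# substitute real paths in production.
SITE=$(mktemp -d)
DEST=$(mktemp -d)
mkdir -p "$SITE/uploads"
echo 'sample attachment' > "$SITE/uploads/pic1.jpg"

STAMP=$(date +%Y-%m-%d)

# 1) Dump the database, if the mysqldump client is available.
#    'forum_db' is a placeholder database name.
if command -v mysqldump >/dev/null 2>&1; then
    mysqldump forum_db > "$DEST/forum-db-$STAMP.sql" || true
fi

# 2) Archive the attachments folder, datestamped.
tar czf "$DEST/attachments-$STAMP.tar.gz" -C "$SITE" uploads

# 3) Rotate: drop archives older than 30 days.
find "$DEST" -name 'attachments-*.tar.gz' -mtime +30 -delete
```

Cron then only has to run the script nightly; the rotation step keeps the backup folder from growing without bound.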
|
|
Nov 7 2008, 07:24 PM
Post
#3
|
|
Member Group: Members Posts: 293 Joined: 22-September 08 From: Spain Member No.: 4350 |
As imipak says, if it's just about copying files it's quite simple. I use a bash script to copy, zip, encrypt and upload my docs to an external server. For a five-times-a-month backup, a simple script could look like this:
#!/bin/bash
THEDAY=`date +%d`
THEDATE=`date +%m-%d`
# date +%d is zero-padded, so single-digit days must be written 07, not 7
if [ $THEDAY = 01 -o $THEDAY = 07 -o $THEDAY = 13 -o $THEDAY = 19 -o $THEDAY = 25 ] ; then
    zip -r ~/backups/backup-mydocs-$THEDATE.zip ~/folder/mydocs/
fi
echo 'Backup finished. Closing...'
sleep 2

That zips everything in the folder '~/folder/mydocs/'. Since the filename only carries month and day, the old backups get overwritten after a year. You can edit the cron with "crontab -e". I do my backups at 20:16 because I know the computer is on at that time. The line would look like this:

16 20 * * * ~/backups/backup.sh >/dev/null 2>&1 # Backup

---

Or you can use a script like this:

#!/bin/bash
THEDATE=`date +%m-%d`
zip -r ~/backups/backup-mydocs-$THEDATE.zip ~/folder/mydocs/
echo 'Backup finished. Closing...'
sleep 2

And cron entries like these, for the same effect (last day 25, matching the script above):

16 20 1 * * ~/backups/backup.sh >/dev/null 2>&1 # Backup
16 20 7 * * ~/backups/backup.sh >/dev/null 2>&1 # Backup
16 20 13 * * ~/backups/backup.sh >/dev/null 2>&1 # Backup
16 20 19 * * ~/backups/backup.sh >/dev/null 2>&1 # Backup
16 20 25 * * ~/backups/backup.sh >/dev/null 2>&1 # Backup

PS. I decline the subdomain offer. |
|
|
Nov 7 2008, 07:42 PM
Post
#4
|
|
Member Group: Members Posts: 293 Joined: 22-September 08 From: Spain Member No.: 4350 |
Same script with FTP. It deletes the local copy afterwards:
#!/bin/bash
HOST='xxxx.xxxxxx.com'
USER='xxxxxx'
PASSWD='xxxxxx'
THEDATE=`date +%m-%d`
zip -r ~/backups/backup-mydocs-$THEDATE.zip ~/mydocs/
# note: inside the here-document the shell expands $HOME but not ~
ftp -v -n $HOST <<**
user $USER $PASSWD
cd backups/mydocs/
bin
put $HOME/backups/backup-mydocs-$THEDATE.zip
bye
**
rm ~/backups/backup-mydocs-$THEDATE.zip
echo 'Backup finished. Closing...'
sleep 2 |
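One caveat with the FTP step: it sends the password and the archive in the clear. If the remote box accepts ssh, scp does the same job encrypted, with no password stored in the script. A sketch, in which local scratch directories stand in for the remote host so it runs anywhere (in real use the destination would be something like user@host:backups/mydocs/ with key-based ssh auth already set up):

```shell
#!/bin/bash
set -e

# Scratch dirs so the sketch runs anywhere; REMOTE stands in for
# a real 'user@host:backups/mydocs/' destination over ssh.
SRC=$(mktemp -d)
WORK=$(mktemp -d)
REMOTE=$(mktemp -d)
echo 'my document' > "$SRC/doc.txt"

THEDATE=$(date +%m-%d)
tar czf "$WORK/backup-mydocs-$THEDATE.tar.gz" -C "$SRC" .

# scp instead of ftp: encrypted transfer, no password in the script
scp -q "$WORK/backup-mydocs-$THEDATE.tar.gz" "$REMOTE/"

# delete the local copy afterwards, as in the ftp version
rm "$WORK/backup-mydocs-$THEDATE.tar.gz"
```

The structure is identical to the ftp version; only the transfer command changes.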
|
|
Nov 7 2008, 08:13 PM
Post
#5
|
|
Founder Group: Chairman Posts: 14434 Joined: 8-February 04 Member No.: 1 |
See - that's genius! I may come back with more questions, but this is a great start!
Cheers Doug |
|
|
Nov 7 2008, 09:40 PM
Post
#6
|
|
Member Group: Members Posts: 646 Joined: 23-December 05 From: Forest of Dean Member No.: 617 |
Wow, there's not much I can add to Fran's comment. Tchah! And I had geekpr0n.unmannedspaceflight.com all worked out in my mind's eye...
If you want to get fancy / clever in future, you could try using rsync to do incremental (only save the changes) remote network backups with ssh crypto. A simple incantation is a one-liner:

rsync -a -k -v -e ssh /home/andrew/* andrew@sluggbox:/media/sdb1/home/backup/thinkpad/

It's less typing than the shell-script approach, but you have to be able to install rsync if it's not there already, set up the remote machine, and be able to handle the bandwidth costs if you're doing this over the Internet rather than to the next machine down in the rack.

The advantage of having the backups offsite is that you can easily be back up and running if the data centre / hosting location is offline for an extended period. There are surprisingly many unexpected scenarios [1] that can cause you to lose connectivity to the site, apart from the apocalyptic stuff the DR geeks get exercised about. Also, the traffic on the wire is encrypted. [2]

I hacked up something along these lines when my techno-paranoid father finally bought a digital camera and started accumulating irreplaceable data, which he stored on a single HD, in a cheap Dell box, in a single-glazed, ground-floor room, in a completely un-alarmed house, in front of a window next to a public footpath. Oh yeah, and the window doesn't have a lock. [3]

There are also a couple of projects that use rsync as the core of a more full-featured network backup setup. A friend swears by Bacula, though I've not tried it myself. Lots of other recipes along the same lines are out there.

[1] *cough* Gloucester floods 2007 fasthosts
[2] One of the dirty little secrets of infosec is that examples of unencrypted traffic being maliciously intercepted are pretty rare these days. Unless, that is, the victims are too embarrassed to call the cops -- or don't realise it's happening...
[3] Oh yes, and it's also an absolute tinderbox of 18th-century untreated wood, protected only by haphazardly concealed battery-operated smoke alarms which I hide around the place whenever I'm there, but which they throw out when the battery goes flat and they start beeping. If you need to give a sparkie sleepless nights, ask me for a pic of the fuseboard... </tangent> -------------------- --
Viva software libre! |
|
|
Nov 21 2008, 05:55 PM
Post
#7
|
|
Member Group: Admin Posts: 976 Joined: 29-September 06 From: Pasadena, CA - USA Member No.: 1200 |
QUOTE: Same script with FTP. It deletes the local copy afterwards:

#!/bin/bash
HOST='xxxx.xxxxxx.com'
USER='xxxxxx'
PASSWD='xxxxxx'
THEDATE=`date +%m-%d`
zip -r ~/backups/backup-mydocs-$THEDATE.zip ~/mydocs/
ftp -v -n $HOST <<**
user $USER $PASSWD
cd backups/mydocs/
bin
put $HOME/backups/backup-mydocs-$THEDATE.zip
bye
**
rm ~/backups/backup-mydocs-$THEDATE.zip
echo 'Backup finished. Closing...'
sleep 2

On my server at home and on my workstation at work I use gtar instead of zip, but the principle is the same. I do full backups on Sundays and an incremental backup every day. Some suggestions:
- Consider iso9660 files: if you want to retrieve one file or a handful, they make it easier to fetch single isolated files. I prefer tarballs, since even if they get corrupted you can still recover some files out of them.
- Consider file-system / ssh / rsync file-size limits.
- Whatever you do, periodically check your backups: try to unpack them and see if you are missing something.
- If you are going to provide the hardware (disks) to do the backup, NEVER EVER use the same brand for your main disks and your backup disks (*). Purchase different brands, and at different dates, so that their wear and tear is different.
Paolo -------------------- Disclaimer: all opinions, ideas and information included here are my own, and should not be intended to represent the opinion or policy of my employer.
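The full-on-Sunday / incremental-daily scheme described above maps onto GNU tar's --listed-incremental mode, and the "check your backups" advice can be as cheap as listing the archive back after writing it. A sketch with scratch directories standing in for real paths:

```shell
#!/bin/bash
set -e

SRC=$(mktemp -d)
DEST=$(mktemp -d)
echo 'alpha' > "$SRC/a.txt"

# Sunday: level-0 (full) backup; the .snar snapshot file records
# what tar has seen so far
tar --listed-incremental="$DEST/state.snar" -czf "$DEST/full.tar.gz" -C "$SRC" .

# Weekday: a new file appears; the next run archives only the change
echo 'beta' > "$SRC/b.txt"
tar --listed-incremental="$DEST/state.snar" -czf "$DEST/incr.tar.gz" -C "$SRC" .

# Cheap verification: list both archives back to make sure they unpack
tar tzf "$DEST/full.tar.gz" >/dev/null
tar tzf "$DEST/incr.tar.gz" >/dev/null
```

Restoring means unpacking the last full archive, then each incremental in order, with the same --listed-incremental machinery.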
|
|
|
Nov 21 2008, 06:43 PM
Post
#8
|
|
Founder Group: Chairman Posts: 14434 Joined: 8-February 04 Member No.: 1 |
You know how people suggest off-site backups? Has nobody ever thought of, you know, using one for something really important... I mean, it is VERY off-site; it would be a good place to store passwords.
Doug |
|
|
Nov 21 2008, 07:10 PM
Post
#9
|
|
Member Group: Admin Posts: 976 Joined: 29-September 06 From: Pasadena, CA - USA Member No.: 1200 |
QUOTE: You know how people suggest off-site backups? Has nobody ever thought of, you know, using one for something really important... I mean, it is VERY off-site; it would be a good place to store passwords. Doug

Can't do that. We always get in trouble for taking up too much flash... Paolo -------------------- Disclaimer: all opinions, ideas and information included here are my own, and should not be intended to represent the opinion or policy of my employer.
|
|
|