S3cmd on CentOS 6

S3cmd is a command-line tool that we’ve used for uploading, retrieving and managing data in Amazon S3. To install S3cmd, download the .repo file for the tool, then run the install:

    cd /etc/yum.repos.d
    wget http://s3tools.org/repo/RHEL_6/s3tools.repo
    yum install s3cmd

Answer yes when asked to accept a new GPG key.

Note: the latest version of s3cmd requires Python 2.6. If you come across the error "ImportError: No module named S3.Exceptions", chances are you do not have the correct version. One workaround is as follows: ...
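Once installed, s3cmd needs AWS credentials before it can talk to S3. A minimal first-run sketch; the bucket name below is a placeholder:

    # Interactive setup: prompts for the access key, secret key and
    # encryption options, then writes ~/.s3cfg
    s3cmd --configure

    # Quick sanity checks against a placeholder bucket
    s3cmd ls s3://name-of-s3-bucket
    s3cmd put backup.tar.gz s3://name-of-s3-bucket/backup.tar.gz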

March 30, 2016

Various Backup Scripts

Here are some sample backup scripts that I’ve used to back up various things in our network. These scripts use Amazon S3 as the storage, together with s3cmd.

Application

    #!/bin/sh
    # Site Backup script
    #
    # application_backup.sh

    # Initialize variables specific for this server
    # To exclude a directory, update the code under "Exclude directory"
    LOG_FILE="/var/log/site-backup.log"
    SITE_PATH="/var/www/"
    SITE_BACKUP_PATH="/var/script/backup"
    SITE=( name_of_directory_one name_of_directory_two )

    # Definition
    TIMESTAMP=`date "+%Y-%m-%d %H:%M:%S"`
    CURRENT_YEAR=`date "+%Y"`
    CURRENT_MONTH=`date "+%Y-%m"`
    TODAY_DATE=`date "+%Y-%m-%d"`
    S3_PATH=s3://name-of-s3-bucket/application/${CURRENT_YEAR}/${CURRENT_MONTH}

    # Remove Site Backup older than 15 days
    /usr/bin/find ${SITE_BACKUP_PATH} -type f -mtime +15 -delete

    echo "Site Backup Log: " ${TIMESTAMP} >> ${LOG_FILE}
    echo -e "--------------------------------------------" >> ${LOG_FILE}
    echo -e "" >> ${LOG_FILE}

    # Loop through the Site Repository
    for i in ${SITE[@]}
    do
        # Exclude directory
        EXCLUDE=''
        case $i in
            account.fxprimus.com) EXCLUDE="--exclude api --exclude assets \
                --exclude nfiles --exclude nimages \
                --exclude nlanguages --exclude ntemplates";;
        esac

        # Backup Site
        cd ${SITE_PATH}
        tar -zcf ${SITE_BACKUP_PATH}/$i-${TODAY_DATE}.tar.gz ${EXCLUDE} $i

        # Transfer the file to Amazon S3
        s3cmd put --acl-private --guess-mime-type \
            ${SITE_BACKUP_PATH}/$i-${TODAY_DATE}.tar.gz \
            ${S3_PATH}/$i-${TODAY_DATE}.tar.gz >> ${LOG_FILE} 2>&1

        if [ "$?" -eq 1 ]
        then
            echo -e "***SITE BACKUP JOB, THERE WERE ERRORS***" >> ${LOG_FILE} 2>&1
        else
            echo -e "Script Completed Successfully!" >> ${LOG_FILE} 2>&1
        fi
    done

MySQL

    #!/bin/sh
    # MySQL Backup script
    #
    # db_backup.sh
    #
    # In summary, this is what is going to happen.
    # make directory
    # change directory
    # dump file
    # compress directory
    # remove directory
    # upload compressed file

    # Initialize variables specific for this server
    MYSQL_SOURCE_HOST=192.168.3.100
    MYSQL_DATABASE=this_is_my_database
    MYSQL_SOURCE_USER=i_am_db_user
    MYSQL_SOURCE_PASS=i_am_db_password
    SRC_CONN="-h${MYSQL_SOURCE_HOST} -u${MYSQL_SOURCE_USER} -p${MYSQL_SOURCE_PASS}"
    MYSQL_TABLE=( tbl_one tbl_two )
    LOG_FILE=/var/log/mysql-backup.log
    MYSQL_BACKUP_PATH=/var/script/backup

    # Definition
    TIMESTAMP=`date "+%Y-%m-%d %H:%M:%S"`
    CURRENT_YEAR=`date "+%Y"`
    CURRENT_MONTH=`date "+%Y-%m"`
    TODAY_DATE=`date "+%Y-%m-%d"`
    EPOCH=`date +%s`
    S3_PATH=s3://name-of-s3-bucket/db/${CURRENT_YEAR}/${CURRENT_MONTH}/${TODAY_DATE}

    # Remove MySQL Backup older than 3 days
    /usr/bin/find ${MYSQL_BACKUP_PATH} -type f -mtime +3 -delete

    echo "MySQL Backup Log: " ${TIMESTAMP} >> ${LOG_FILE}
    echo -e "--------------------------------------------" >> ${LOG_FILE}
    echo -e "" >> ${LOG_FILE}

    MYSQLDUMP_OPTIONS="--hex-blob --compress --lock-tables=false"

    # Loop through the tables
    for TBL in "${MYSQL_TABLE[@]}"
    do
        FILENAME=${TBL}-${TODAY_DATE}-${EPOCH}

        # Backup MySQL
        mysqldump ${SRC_CONN} ${MYSQLDUMP_OPTIONS} ${MYSQL_DATABASE} ${TBL} \
            > ${MYSQL_BACKUP_PATH}/${FILENAME}.sql

        # Compress today's dump
        tar -zcf ${MYSQL_BACKUP_PATH}/${FILENAME}.tar.gz -C ${MYSQL_BACKUP_PATH} ${FILENAME}.sql

        # Transfer the file to Amazon S3
        s3cmd put --acl-private --guess-mime-type \
            ${MYSQL_BACKUP_PATH}/${FILENAME}.tar.gz \
            ${S3_PATH}/${FILENAME}.tar.gz >> ${LOG_FILE} 2>&1

        if [ "$?" -eq 1 ]
        then
            echo -e "***MySQL BACKUP JOB, THERE WERE ERRORS***" >> ${LOG_FILE} 2>&1
        else
            echo -e "Script Completed Successfully!" >> ${LOG_FILE} 2>&1
        fi
    done

SVN Repository

    #!/bin/sh
    # SVN Off Site Backup script
    #
    # svn_backup.sh

    # Input from command line
    SVN_REPOSITORY=($1)

    # Definition
    TIMESTAMP=`date "+%Y-%m-%d %H:%M:%S"`
    CURRENT_YEAR=`date "+%Y"`
    CURRENT_MONTH=`date "+%Y-%m"`
    TODAY_DATE=`date "+%Y-%m-%d"`
    EPOCH=`date +%s`
    LOG_FILE=/var/log/svn-backup.log
    SVN_BACKUP_PATH=/var/script/backup
    SVN_PATH=/var/svn/repos
    S3_PATH=s3://name-of-s3-bucket/svn/${CURRENT_YEAR}/${CURRENT_MONTH}

    # Remove SVN Backup older than 7 days
    /usr/bin/find ${SVN_BACKUP_PATH} -type f -mtime +7 -delete

    echo "SVN Offsite Backup Log: " ${TIMESTAMP} >> ${LOG_FILE}
    echo -e "--------------------------------------------" >> ${LOG_FILE}
    echo -e "" >> ${LOG_FILE}

    # Loop through the SVN Repository
    for i in ${SVN_REPOSITORY[@]}
    do
        FILENAME=$i-${TODAY_DATE}-${EPOCH}

        # Backup SVN
        svnadmin dump ${SVN_PATH}/$i | gzip > ${SVN_BACKUP_PATH}/${FILENAME}.svndump.gz

        # Transfer the file to Amazon S3
        s3cmd put --acl-private --guess-mime-type \
            ${SVN_BACKUP_PATH}/${FILENAME}.svndump.gz \
            ${S3_PATH}/${FILENAME}.svndump.gz >> ${LOG_FILE} 2>&1

        if [ "$?" -eq 1 ]
        then
            echo -e "***SVN OFFSITE BACKUP JOB, THERE WERE ERRORS***" >> ${LOG_FILE} 2>&1
        else
            echo -e "Script Completed Successfully!" >> ${LOG_FILE} 2>&1
        fi
    done

SVN Repository to Backup

    #!/bin/sh
    # SVN Repository to Backup
    #
    # call_backup.sh

    svn_repository=(
        my-project-number-one
        my-project-number-two
        my-project-number-three
    )

    /var/script/svn_backup.sh "${svn_repository[*]}"
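To run these unattended, cron entries along these lines would do the job. The schedule here is purely illustrative, and the paths assume the scripts live in /var/script as above:

    # /etc/crontab -- illustrative schedule
    # minute hour day month weekday user command
    30 1 * * * root /var/script/application_backup.sh
    0  2 * * * root /var/script/db_backup.sh
    30 2 * * 0 root /var/script/call_backup.sh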

March 21, 2016

csync2 Installation and Setup

Csync2 is a cluster synchronization tool. It can be used to keep files on multiple hosts in a cluster in sync. Csync2 can handle complex setups with far more than just two hosts, handles file deletions, and can detect conflicts.

Installation

Build and install csync2 by running the script below - build-csync2.sh

    #!/bin/sh
    # build-csync2.sh

    # Make directories to store csync2-related files
    mkdir -p /data/build/
    mkdir -p /data/sync-db/
    mkdir -p /data/logs/csync2
    mkdir -p /data/sync-conflicts/
    cd /data/build/

    # Get the files from our own file repository
    wget http://downloads.sourceforge.net/librsync/librsync-0.9.7.tar.gz
    wget http://oss.linbit.com/csync2/csync2-2.0.tar.gz

    # Install the packages required to compile csync2
    yum install xinetd byacc flex gcc-c++ gnutls gnutls-devel openssl-devel openssl-static sqlite-devel -y

    # Untar the files
    tar -xzf librsync-0.9.7.tar.gz
    tar -xzf csync2-2.0.tar.gz

    # Build csync2
    cd /data/build/csync2-2.0
    ./configure \
        --prefix=/usr \
        --with-librsync-source=/data/build/librsync-0.9.7.tar.gz \
        --localstatedir=/var \
        --sysconfdir=/etc \
        --disable-gnutls
    make && make install

    # Remove the directories that we no longer need
    cd ..
    rm -rf librsync-0.9.7
    rm -rf csync2-2.0

    # Add the csync2 port number to /etc/services
    echo "csync2 30865/tcp" >> /etc/services

    CSYNCLOC=`which csync2`

    # Create an xinetd definition file for csync2
    echo -e "# default: on\n# description: csync2 xinetd server\n\nservice csync2\n{\n disable = no\n flags = REUSE\n socket_type = stream\n wait = no\n user = root\n group = root\n server = $CSYNCLOC\n server_args = -i -D /data/sync-db/\n port = 30865\n type = UNLISTED\n log_type = FILE /data/logs/csync2/csync2-xinetd.log\n log_on_failure += USERID\n}\n" > /etc/xinetd.d/csync2

    # Restart the service to include the newly created definition
    service xinetd restart

Setup

First, generate ONE self-signed SSL certificate that would be used by your cluster. ...
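Once installed on every node, csync2 is driven by /etc/csync2.cfg. A minimal sketch of what that file looks like; the host names, key path and sync directories below are placeholders, and the real setup continues in the full post:

    # /etc/csync2.cfg -- minimal two-node sketch
    group mycluster
    {
        host node1 node2;
        key /etc/csync2.key_mycluster;

        include /var/www;
        exclude *.log;

        # On conflict, keep the younger file
        auto younger;

        # Keep copies of conflicting files for inspection
        backup-directory /data/sync-conflicts/;
        backup-generations 3;
    }

The shared key can be generated once with csync2 -k /etc/csync2.key_mycluster and copied to all nodes; a synchronization run is then csync2 -xv.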

March 14, 2016

Running FTP on CentOS

Here’s how you can easily set up FTP on CentOS:

    yum install vsftpd

If you are running on AWS EC2, add these to the security group:

    Add port range 20-21
    Add port range 1024-1048

Edit the configuration:

    vi /etc/vsftpd/vsftpd.conf

Set:

    anonymous_enable=NO
    chroot_local_user=YES

Add these:

    pasv_enable=YES
    pasv_min_port=1024
    pasv_max_port=1048
    pasv_address=<Public IP of your instance>

Restart the service:

    sudo service vsftpd restart

To make sure that the service is started on reboot:

    chkconfig --level 345 vsftpd on

Create a new FTP user:

    useradd -d /path/to/new/home/dir -G apache userNameHere
    chown userNameHere:apache /path/to/new/home/dir
    passwd userNameHere

The log file can be found at /var/log/xferlog. ...
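To confirm the passive setup works from outside, curl can act as a quick FTP client; ftp.example.com is a placeholder for your server, and curl uses passive mode by default:

    # List the user's home directory; curl will prompt for the password
    curl --user userNameHere ftp://ftp.example.com/

    # Upload a test file to verify write access
    curl --user userNameHere -T test.txt ftp://ftp.example.com/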

March 11, 2016

SVN Users and Repositories

We have set up our SVN to be accessed through HTTP with the help of Apache. Here’s a sample configuration that we have used.

    vi /etc/httpd/conf.d/vhosts.conf

Add the below into subversion.conf:

    <VirtualHost *:443>
        ServerName svn.mydomain.com
        DocumentRoot /var/www/svn.mydomain.com

        <Directory /var/www/svn.mydomain.com>
            AllowOverride All
            Order Allow,Deny
            Allow from all
            Options -Indexes
            Require all granted
        </Directory>

        <Location /repos/newrepository>
            DAV svn
            SVNPath /var/svn/repos/newrepository
            AuthName "Subversion repository"
            AuthType Digest
            AuthUserFile /var/svn/svn-auth.htdigest
            AuthzSVNAccessFile /var/svn/svn-acl.conf
            Require valid-user
        </Location>

        SSLEngine on
        SSLProtocol all
        SSLCertificateFile /etc/pki/tls/certs/ca.svn.mydomain.com.crt
        SSLCertificateKeyFile /etc/pki/tls/private/ca.svn.mydomain.com.key

        ErrorLog /var/log/httpd/svn.mydomain.com-error_log
        CustomLog /var/log/httpd/svn.mydomain.com-access_log combined
    </VirtualHost>

You can create a new SVN user by using the command below. This would store the encrypted password into svn-auth.htdigest. ...
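The excerpt ends before the command itself, but given AuthType Digest and the AuthUserFile above, it would take this general shape; newuser is a placeholder, and the realm argument must match the AuthName:

    # -c creates the htdigest file; drop it when adding further users
    htdigest -c /var/svn/svn-auth.htdigest "Subversion repository" newuser

For completeness, the repository served at /repos/newrepository would have been created with:

    svnadmin create /var/svn/repos/newrepository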

March 7, 2016

Testing Yii with Codeception

We have an application written in Yii. Up until this point, it has been manually tested. Since it’s more efficient to automate the testing of this application, we have decided to set up Codeception to assist us with this task. Setup was easy and straightforward - this said after spending countless hours understanding the correct way to do it. To illustrate the setup process, we’ll create a skeleton Yii app. ...
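The full walkthrough is in the post itself; as a rough sketch, a Codeception setup of this kind usually begins along these lines (standard Codeception commands, not necessarily the exact steps used here):

    # Install Codeception into the project as a dev dependency
    composer require codeception/codeception --dev

    # Generate codeception.yml and the tests/ directory scaffolding
    vendor/bin/codecept bootstrap

    # Create a first acceptance test, then run the suites
    vendor/bin/codecept generate:cept acceptance Welcome
    vendor/bin/codecept run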

February 2, 2016

What I've Learnt Migrating a Blog from Wordpress to Blogger

As per the subject, migrating a blog from Wordpress to Blogger isn’t as hard as most would have imagined. On the downside, it’s a bit tedious if you have stored your images with your Wordpress blog. In my case, I have always stored my images on Picasa Web, so I did not need to go through the process of copying the images out of the Wordpress blog. Here’s the straightforward process of exporting your Wordpress blog into Blogger. ...

January 21, 2016

Share Your Knowledge and Pass It Along

Knowledge gained through hard work or experience is meant to be shared. You might think that holding on to this knowledge makes you indispensable, or that sharing it would jeopardize your position since someone else could then do the work. Yet, have you considered the possibility that you have limited yourself to the scope of this work? You’ll end up being the person doing it all the time, and you’ll lose the chance to gain new knowledge and experience. ...

January 8, 2016