The script below may come in handy for anyone who needs to pull files to a separate location for processing. I realize this is not the only way to do this, but it has been working for me for years.
I use the script to pull IIS log files from an IIS server and then run the logs through AWSTATS to generate website traffic stats daily.
I have this script running on a Linux server that does the AWSTATS processing once per day using cron. Although I pull HTTP log files, you can use it to pull other types of files such as mail server logs, Samba logs, or FTP logs; really, any file accessible by FTP.
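For reference, the cron part looks something like the entry below. The time and the script path are just placeholders; adjust them to suit your setup:

```
# m h dom mon dow  command
# Run the log-pull script once a day at 4:10 AM (path is a placeholder)
10 4 * * * /home/user/pull_logs.sh
```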
If you save the script as a .sh file, remember to make it executable.
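For example, assuming you saved it as pull_logs.sh (a placeholder name), a quick sketch:

```shell
# Make the saved script executable (pull_logs.sh is a placeholder filename)
chmod +x pull_logs.sh
# You can confirm with ls -l: the permissions should now include x bits,
# e.g. -rwxr-xr-x
ls -l pull_logs.sh
```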
If possible please post a comment letting me know you are using the script. I am just curious to see how many this might help.
#!/bin/bash
# This script will FTP a file from yesterday from a server to the local machine, then run a command on it
# The lines below define variables for various dates. The log file variables can be adjusted to form a filename that contains a date
yesterdaysdate=`date --date=yesterday +%y%m%d`
yesterdayslog=ex`date --date=yesterday +%y%m%d`.log
tomorrowdate=`date --date=tomorrow +%y%m%d`
tomorrowslog=ex`date --date=tomorrow +%y%m%d`.log
# The line below just echoes info to the screen to show progress
echo "Trying to get yesterday's logfile $yesterdayslog"
# The lines below run an FTP session (via a here-document) that uses the variables above to get specific files
ftp -n -v ww.xx.yy.zz << EOF
user usernamehere passwordhere
# The below line can be a script or other command