Optimize Your Check Point System with Our Efficient Log Deletion Script

Free up vital disk space on your Check Point firewall without compromising system performance!

If you manage a Check Point system, you’re likely aware that, despite improvements since the Solaris 10 era, the automatic log rotation and cleanup settings in versions up to R80.40 may not meet your needs. Our script lets you automate the deletion of outdated logs through a cron job, keeping your system running efficiently without manual oversight.

For later versions, refer to the official Check Point documentation:

Check Point Solution ID: sk117317
How to configure log / log indexes maintenance policy for Global SmartEvent Server / Log Server and MDS / MLM R80.10 and higher
Version: R80.10 (EOL), R80.20 (EOL), R80.30 (EOL), R80.40 (EOL), R81, R81.10, R81.20

Check Point Solution ID: sk123532
log_keep_on_days value is not available in R80.xx
Version: R80.10 (EOL), R80.20 (EOL), R80.30 (EOL)

Script Features:

  • Automatic Log Deletion: Configure the script to automatically delete logs older than a specified time period.

Simple Script Example:

#!/bin/bash
# Deletes Check Point logs older than 3 months to maintain optimal system performance.
# Path: /var/log/disk2/log/

find /var/log/disk2/log/ -type f -name "*.log*" -mtime +90 -delete
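
Before letting any deletion run unattended, consider a dry run first. The sketch below uses the same path and age threshold as above but replaces -delete with -print, so it only lists the files that would be removed:

#!/bin/bash
# Dry run: list Check Point logs older than 90 days without deleting anything.
find /var/log/disk2/log/ -type f -name "*.log*" -mtime +90 -print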

Check Point R60 to R81.40 - Full Script:

#!/bin/bash
###########################################################################
###     DELETE CHECK POINT LOG FILES OLDER THAN 3 MONTHS                ###
###########################################################################
#
# Delete Check Point logs after the system has rotated the log files
# AUTHOR: Brett Gardner - https://clarisyte.com - LATEST RELEASE: 20210803
# CODE DISCLAIMER: https://clarisyte.com/code-disclaimer/
# VENDOR & ORIGINAL BASE SCRIPT: Check Point
# CUSTOMISED DEVICE TYPE: Check Point GAIA
# TESTED: Check Point R80.10
#
# DESCRIPTION:
# ------------
# This script deletes log files that are older than 3 months.
#
# Reference: Check Point Solution ID sk77300
# The variables described in the above sk do not apply to VSX (see the sk for details)
#
# Clish: Create userid to run cron
# HostName> add user cronuser uid 0 homedir /home/cronuser
# HostName> save config
#
### Format of log files to delete:
# Path: /var/log/disk2/
# 2020-12-31_224903_5008.log
# 2020-12-31_224903_5008.log_stats
# 2020-12-31_224903_5008.logaccount_ptr
# 2020-12-31_224903_5008.loginitial_ptr
# 2020-12-31_224903_5008.logptr
# 2020-12-07_000000.adtlog
# 2020-12-06_000000.adtlogaccount_ptr
# testcpsms01__2020-11-24_000000.adtloginitial_ptr
# testcpsms01__2020-12-02_000000.adtlogptr
#
###########################################################################

###########################################################################
# Crontab Entry Required:
# Create cronjob: crontab -u cronuser -e
# File location: /var/spool/cron/cronuser
#    0 3 */1 * * /bin/cplogdelete
# Verify the crontab entry: crontab -u cronuser -l
#
###########################################################################

###########################################################################
# Dependent Files & Scripts Required for Script to Operate
# -Check Point environment script: /opt/CPshrd-R80/tmp/.CPprofile.sh
###########################################################################

# Export the CP environment variables
source /opt/CPshrd-R80/tmp/.CPprofile.sh

# Set the variables
days=$(( ( $(date '+%s') - $(date -d '3 months ago' '+%s') ) / 86400 ))
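# Note: $days works out to roughly 89-92, because "3 months ago" is measured in
# calendar months; the epoch-second difference is then converted to whole days.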

###########################################################################
# Find and delete log files
#
find /var/log/disk2/log/ -path '/var/log/disk2/log/*.log' -type f -mtime +$days -delete
find /var/log/disk2/log/ -path '/var/log/disk2/log/*.log_stats' -type f -mtime +$days -delete
find /var/log/disk2/log/ -path '/var/log/disk2/log/*.logaccount_ptr' -type f -mtime +$days -delete
find /var/log/disk2/log/ -path '/var/log/disk2/log/*.loginitial_ptr' -type f -mtime +$days -delete
find /var/log/disk2/log/ -path '/var/log/disk2/log/*.logptr' -type f -mtime +$days -delete
find /var/log/disk2/log/ -path '/var/log/disk2/log/*.adtlog' -type f -mtime +$days -delete
find /var/log/disk2/log/ -path '/var/log/disk2/log/*.adtlogaccount_ptr' -type f -mtime +$days -delete
find /var/log/disk2/log/ -path '/var/log/disk2/log/*.adtloginitial_ptr' -type f -mtime +$days -delete
find /var/log/disk2/log/ -path '/var/log/disk2/log/*.adtlogptr' -type f -mtime +$days -delete

# find /var/log/disk2/log/ -path '/var/log/disk2/log/*.log' -type f -mtime "+$(( ( $(date '+%s') - $(date -d '3 months ago' '+%s') ) / 86400 ))" -delete
# find /var/log/disk2/log/ -path '/var/log/disk2/log/*.log_stats' -type f -mtime "+$(( ( $(date '+%s') - $(date -d '3 months ago' '+%s') ) / 86400 ))" -delete
# find /var/log/disk2/log/ -path '/var/log/disk2/log/*.logaccount_ptr' -type f -mtime "+$(( ( $(date '+%s') - $(date -d '3 months ago' '+%s') ) / 86400 ))" -delete
# find /var/log/disk2/log/ -path '/var/log/disk2/log/*.loginitial_ptr' -type f -mtime "+$(( ( $(date '+%s') - $(date -d '3 months ago' '+%s') ) / 86400 ))" -delete
# find /var/log/disk2/log/ -path '/var/log/disk2/log/*.logptr' -type f -mtime "+$(( ( $(date '+%s') - $(date -d '3 months ago' '+%s') ) / 86400 ))" -delete
# find /var/log/disk2/log/ -path '/var/log/disk2/log/*.adtlog' -type f -mtime "+$(( ( $(date '+%s') - $(date -d '3 months ago' '+%s') ) / 86400 ))" -delete
# Generic find
# find /var/log/disk2/log/ -path '/var/log/disk2/log/*' -type f -mtime "+$(( ( $(date '+%s') - $(date -d '3 months ago' '+%s') ) / 86400 ))"
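
Before scheduling the full script, it is worth installing it and running it once by hand. A minimal sketch, assuming the script is saved as /bin/cplogdelete (the path used in the crontab entry in the header above):

# Install, make executable and run once manually
chmod +x /bin/cplogdelete
/bin/cplogdelete
# Confirm the expected files are gone, then add the cron job for cronuser
crontab -u cronuser -e    # add: 0 3 */1 * * /bin/cplogdelete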


Check Point R55 Script:

Below is a log rotation script I authored for Check Point R55.

#!/bin/bash
###########################################################################
###     ARCHIVE CHECKPOINT LOGS AND MAIL OUT DISK SPACE                 ###
###########################################################################
#
# Edited for Checkpoint Firewall-1 NGX-R55 on Solaris 9
# Revision underway
# File Name:            cplogarchive.sh
# File Location:        /usr/local/sbin/
# Version:              1.0
#
# DESCRIPTION:
# ------------
# Original author: Brett Gardner
#
# Functionality was changed to remove manual log rotation, which is now
# handled by the CP GUI.
# Archiving is now performed on logs older than 7 days, which means that
# all logs from the current week and the previous week remain accessible
# via SmartView Tracker.
#
###########################################################################

###########################################################################
# UPDATE HISTORY
#
# DATE          UPDATED BY      VER#    DETAIL
# 2021 underway    Brett Gardner       1.0     Script creation
#
###########################################################################

###########################################################################
# Crontab Entry Required:
# Archive CP logs every Monday at 01:00
# 00 01 * * 1 /usr/local/sbin/cplogarchive.sh
#
###########################################################################

###########################################################################
# Dependent Files & Scripts Required for Script to Operate
# -Check Point environment script: /opt/CPshrd-R55/tmp/.CPprofile.sh
# -Mail must be configured
###########################################################################

# Export the CP environment variables
. /opt/CPshrd-R55/tmp/.CPprofile.sh

# Set the working directory for this script
cd $FWDIR/log

# Set the constants
cEmailBody=/tmp/tmp.$$.txt
cTarList=/tmp/tarlist.$$.txt
cDaysToKeep=7
cExpectedCount=7
cHostname=`/usr/bin/hostname`
cDate=`/bin/date +%Y%b%d`
cFilename=./archive/$cHostname-fwlog-$cDate-$$
cSuccessEmailList=test@test.com
cErrorEmailList=test@test.com
#cErrorEmailList=test@test.com,test2@test.com,test3@test.com
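# Note: $$ (the PID of this shell) is appended to the temporary and archive
# file names above so that repeated or overlapping runs do not clobber each other.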

# Set the variables
vActualCount=`find . -mtime +$cDaysToKeep -name '*.log' | wc -l | tr -d " "`
vErrorCode=0

# Initialise the temporary files
touch $cEmailBody
touch $cTarList


###########################################################################
# List and archive log files
#

if [ "$vActualCount" -ne "$cExpectedCount" ]; then
  # Set error code for unexpected file search result
  vErrorCode=1
  echo -e "\nERROR: The log archive script expects to find $cExpectedCount log files to archive, but found $vActualCount!"
  echo "A full error report will be emailed to the QF Firewall Team."
else
  # Find all files with extension ".log" which have not been modified for $cDaysToKeep days
  # Then add all the "logaccount_ptr", "loginitial_ptr", etc files to the list of files to tar
  for i1 in `find . -mtime +$cDaysToKeep -name '*.log'`; do
    echo "`ls -1 $i1*`" >> $cTarList
  done
  echo "Tar'ing files into $cFilename.tar...."
  tar -cvf $cFilename.tar -I $cTarList
  if [ $? -ne 0 ]; then
    # Set error code for tar error/warning
    vErrorCode=2
    echo -e "\nERROR: There was an error/warning when attempting to complete the tar operation"
    echo "A full error report will be emailed to the QF Firewall Team."
  else
    echo "Gzip'ing tarfile...."
    gzip -9 $cFilename.tar
    if [ $? -ne 0 ]; then
      # Set error code for gzip error/warning
      vErrorCode=3
      echo -e "\nERROR: There was an error/warning when attempting to complete the gzip operation"
      echo "A full error report will be emailed to the QF Firewall Team."
    else
      echo "Tarfile gzipped successfully!"
      # Need to start generating successful email body before deleting original files (to get long listing)
      echo "##### The firewall logs for $cHostname have been archived #####" >> $cEmailBody
      echo -e "\nThe following $vActualCount log files were archived:" >> $cEmailBody
      echo "`find . -mtime +$cDaysToKeep -name '*.log' -ls`" >> $cEmailBody
      echo -e "\nAlong with their associated pointer files:" >> $cEmailBody
      # Get long listing of associated pointer files
      echo "`find . -name 'archive' -prune -o -mtime +$cDaysToKeep -a -name '*.log*' -a ! -name '*.log' -ls`" >> $cEmailBody
      # Delete all files which have been archived
      for i2 in `cat $cTarList`; do
        rm $i2
      done
    fi
  fi
fi

###########################################################################
# Mail out info - Report errors if any occurred
#

case $vErrorCode in
0)
  echo -e "\nThe gzipped file is:\n`ls -alp $cFilename*`" >> $cEmailBody
  echo -e "\n---------------" >> $cEmailBody
  echo -e "Running uptime:\n`uptime`" >> $cEmailBody
  echo -e "\n$cHostname disk space as of $cDate:\n`df -h`\n\n\n" >> $cEmailBody
  /bin/mailx -s "$cHostname logs archived on $cDate" $cSuccessEmailList < $cEmailBody
  if [ $? -ne 0 ]; then
    # Note: I may try and generate an SNMP trap here in future
    echo -e "\nERROR: There was a problem sending the report with mailx, please investigate"
    echo -e "\nHere are the contents of the Email file:\n`cat $cEmailBody`"
  fi
  ;;
1)
  rm $cEmailBody; touch $cEmailBody
  echo "##### ATTENTION! There has been an error archiving the firewall logs for $cHostname #####" >> $cEmailBody
  echo -e "\nThe log archive script expects to find $cExpectedCount log files to archive, but found $vActualCount!" >> $cEmailBody
  echo -e "\nThe following log files were found:" >> $cEmailBody
  echo "`find . -mtime +$cDaysToKeep -name '*.log' -ls`" >> $cEmailBody
  echo -e "\nInvestigate this error, then run cplogarchive.sh again, TODAY!\n\n\n" >> $cEmailBody
  /bin/mailx -s "ERROR: $cHostname logs failed to archive on $cDate" $cErrorEmailList < $cEmailBody
  if [ $? -ne 0 ]; then
    # Note: I may try and generate an SNMP trap here in future
    echo -e "\nERROR: There was a problem sending the report with mailx, please investigate"
    echo -e "\nHere are the contents of the Email file:\n`cat $cEmailBody`"
  fi
  ;;
2)
  rm $cEmailBody; touch $cEmailBody
  echo "##### ATTENTION! There has been an error archiving the firewall logs for $cHostname #####" >> $cEmailBody
  echo -e "\nThere was an error/warning when attempting to tar the following files in $FWDIR/log:" >> $cEmailBody
  cat $cTarList >> $cEmailBody
  if [ -e "$cFilename.tar" ]; then
    rm $cFilename.tar       # Remove tar file to save disk space, probably corrupt/incomplete anyway
    if [ $? -ne 0 ]; then
      echo -e "\nERROR: Could not delete failed tar file: $cFilename.tar\nPlease investigate and manually delete ASAP:\n`ls -l $cFilename.tar`"
      echo -e "\nERROR: Could not delete failed tar file: $cFilename.tar\nPlease investigate and manually delete ASAP:\n`ls -l $cFilename.tar`" >> $cEmailBody
    else
      echo -e "\nSuccessfully deleted failed tar file: $cFilename.tar"
      echo -e "\nSuccessfully deleted failed tar file: $cFilename.tar" >> $cEmailBody
    fi
  else
    echo -e "\nFYI: Sometimes a partial tar file is created and needs to be deleted, but no tar file was created as a result of this error."
    echo -e "\nFYI: Sometimes a partial tar file is created and needs to be deleted, but no tar file was created as a result of this error." >> $cEmailBody
  fi
  echo -e "\nInvestigate this error, then run cplogarchive.sh again, TODAY!\n\n\n" >> $cEmailBody
  /bin/mailx -s "ERROR: $cHostname logs failed to archive on $cDate" $cErrorEmailList < $cEmailBody
  if [ $? -ne 0 ]; then
    # Note: I may try and generate an SNMP trap here in future
    echo -e "\nERROR: There was a problem sending the report with mailx, please investigate"
    echo -e "\nHere are the contents of the Email file:\n`cat $cEmailBody`"
  fi
  ;;
3)
  rm $cEmailBody; touch $cEmailBody
  echo "##### ATTENTION! There has been an error archiving the firewall logs for $cHostname #####" >> $cEmailBody
  echo -e "\nThere was an error/warning when attempting to gzip the following tar file in $FWDIR/log:" >> $cEmailBody
  echo "$cFilename.tar" >> $cEmailBody
  if [ -e "$cFilename.tar" -a -e "$cFilename.tar.gz" ]; then
    rm $cFilename.tar       # Remove tar file to save disk space, probably corrupt/incomplete anyway
    if [ $? -ne 0 ]; then
      echo -e "\nERROR: Could not delete the tar file: $cFilename.tar\nPlease investigate and manually delete ASAP:\n`ls -l $cFilename.tar`"
      echo -e "\nERROR: Could not delete the tar file: $cFilename.tar\nPlease investigate and manually delete ASAP:\n`ls -l $cFilename.tar`" >> $cEmailBody
    else
      echo -e "\nSuccessfully deleted tar file: $cFilename.tar"
      echo -e "\nSuccessfully deleted tar file: $cFilename.tar" >> $cEmailBody
    fi
    rm $cFilename.tar.gz    # Remove gz file to save disk space, probably corrupt/incomplete anyway
    if [ $? -ne 0 ]; then
      echo -e "\nERROR: Could not delete the failed gz file: $cFilename.tar.gz\nPlease investigate and manually delete ASAP:\n`ls -l $cFilename.tar.gz`"
      echo -e "\nERROR: Could not delete the failed gz file: $cFilename.tar.gz\nPlease investigate and manually delete ASAP:\n`ls -l $cFilename.tar.gz`" >> $cEmailBody
    else
      echo -e "\nSuccessfully deleted failed gz file: $cFilename.tar.gz"
      echo -e "\nSuccessfully deleted failed gz file: $cFilename.tar.gz" >> $cEmailBody
    fi
  elif [ -e "$cFilename.tar" ]; then
    rm $cFilename.tar       # Clean up tar file which failed to gzip
    if [ $? -ne 0 ]; then
      echo -e "\nERROR: Could not delete the tar file: $cFilename.tar\nPlease investigate and manually delete ASAP:\n`ls -l $cFilename.tar`"
      echo -e "\nERROR: Could not delete the tar file: $cFilename.tar\nPlease investigate and manually delete ASAP:\n`ls -l $cFilename.tar`" >> $cEmailBody
    else
      echo -e "\nSuccessfully deleted tar file: $cFilename.tar"
      echo -e "\nSuccessfully deleted tar file: $cFilename.tar" >> $cEmailBody
    fi
    echo -e "\nFYI: Sometimes a partial gz file is created and needs to be deleted, but no gz file was created as a result of this error."
    echo -e "\nFYI: Sometimes a partial gz file is created and needs to be deleted, but no gz file was created as a result of this error." >> $cEmailBody
  elif [ -e "$cFilename.tar.gz" ]; then
    echo -e "\nFYI: Seems that the gz file has replaced the tar file OK, but compressed gz file may be incomplete/corrupted due to gzip error/warning."
    echo -e "File $cFilename.tar.gz will be deleted. Please investigate cause of error/warning ASAP."
    echo -e "\nFYI: Seems that the gz file has replaced the tar file OK, but compressed gz file may be incomplete/corrupted due to gzip error/warning." >> $cEmailBody
    echo -e "File $cFilename.tar.gz will be deleted. Please investigate cause of error/warning ASAP." >> $cEmailBody
    rm $cFilename.tar.gz    # Remove gz file to save disk space, probably corrupt/incomplete anyway
    if [ $? -ne 0 ]; then
      echo -e "\nERROR: Could not delete the failed gz file: $cFilename.tar.gz\nPlease investigate and manually delete ASAP:\n`ls -l $cFilename.tar.gz`"
      echo -e "\nERROR: Could not delete the failed gz file: $cFilename.tar.gz\nPlease investigate and manually delete ASAP:\n`ls -l $cFilename.tar.gz`" >> $cEmailBody
    else
      echo -e "\nSuccessfully deleted failed gz file: $cFilename.tar.gz"
      echo -e "\nSuccessfully deleted failed gz file: $cFilename.tar.gz" >> $cEmailBody
    fi
  else
    echo -e "\nERROR: Both files $cFilename.tar and $cFilename.tar.gz do not exist! Please investigate ASAP cos that's REALLY weird!"
    echo -e "\nERROR: Both files $cFilename.tar and $cFilename.tar.gz do not exist! Please investigate ASAP cos that's REALLY weird!" >> $cEmailBody
  fi
  echo -e "\nInvestigate this error, then run cplogarchive.sh again, TODAY!\n\n\n" >> $cEmailBody
  /bin/mailx -s "ERROR: $cHostname logs failed to archive on $cDate" $cErrorEmailList < $cEmailBody
  if [ $? -ne 0 ]; then
    # Note: I may try and generate an SNMP trap here in future
    echo -e "\nERROR: There was a problem sending the report with mailx, please investigate"
    echo -e "\nHere are the contents of the Email file:\n`cat $cEmailBody`"
  fi
  ;;
esac
 

###########################################################################
# Cleanup & Set exit code
#

rm $cEmailBody
rm $cTarList

exit $vErrorCode
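
To sanity-check the R55 script before letting cron run it, you can invoke it by hand and inspect the exit code; 0 means success, while 1-3 correspond to the error cases handled above:

/usr/local/sbin/cplogarchive.sh
echo "Exit code: $?"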

Feedback and Improvements:

We welcome your suggestions and feedback to enhance this script further. Share your thoughts, and stay tuned for updates including new features and compatibility checks for the latest Check Point versions.

Brett
