Write a Bash script that automates the process of analyzing log files and generating a daily summary report. The script should perform the following steps:
Input: The script should take the path to the log file as a command-line argument.
Error Count: Analyze the log file and count the number of error messages. An error message can be identified by a specific keyword (e.g., "ERROR" or "Failed"). Print the total error count.
Critical Events: Search for lines containing the keyword "CRITICAL" and print those lines along with the line number.
Top Error Messages: Identify the top 5 most common error messages and display them along with their occurrence count.
Summary Report: Generate a summary report in a separate text file. The report should include:
Date of analysis
Log file name
Total lines processed
Total error count
Top 5 error messages with their occurrence count
List of critical events with line numbers
Bash script:
#!/bin/bash

# Check that a log file path is provided
if [ "$#" -ne 1 ]; then
    echo "Usage: $0 /path/to/logfile"
    exit 1
fi

logfile="$1"
reportfile="summary_report_$(date +%F).txt"

# Check that the log file exists
if [ ! -f "$logfile" ]; then
    echo "Log file does not exist: $logfile"
    exit 1
fi

# Initialize counters and containers
total_lines=0
error_count=0
declare -A error_messages
critical_events=()

# Analyze the log file line by line
while IFS= read -r line; do
    total_lines=$((total_lines + 1))

    # Count errors (lines containing "ERROR" or "Failed");
    # bash pattern matching avoids spawning grep per line
    if [[ "$line" == *ERROR* || "$line" == *Failed* ]]; then
        error_count=$((error_count + 1))
        # Use the text from the keyword onward as the error message,
        # so identical messages are grouped and counted together
        error_msg=$(grep -oE '(ERROR|Failed).*' <<< "$line" | head -n 1)
        error_messages["$error_msg"]=$(( ${error_messages["$error_msg"]:-0} + 1 ))
    fi

    # Record critical events with their line numbers
    if [[ "$line" == *CRITICAL* ]]; then
        critical_events+=("$total_lines: $line")
    fi
done < "$logfile"

# Sort by count (descending) and keep the top 5 error messages.
# The count is printed first so that messages containing spaces
# do not break the numeric sort.
top_errors=$(for msg in "${!error_messages[@]}"; do
    printf '%d %s\n' "${error_messages[$msg]}" "$msg"
done | sort -rn | head -n 5)

# Generate summary report
{
    echo "Date of Analysis: $(date)"
    echo "Log File Name: $(basename "$logfile")"
    echo "Total Lines Processed: $total_lines"
    echo "Total Error Count: $error_count"
    echo
    echo "Top 5 Error Messages:"
    echo "$top_errors"
    echo
    echo "List of Critical Events:"
    for event in "${critical_events[@]}"; do
        echo "$event"
    done
} > "$reportfile"

echo "Summary report generated: $reportfile"

# Optional enhancement: archive the processed log file
archive_dir="/home/ubuntu/archive"
mkdir -p "$archive_dir"
mv "$logfile" "$archive_dir/"
echo "Log file moved to: $archive_dir/"
Explanation:
Input Validation:
- The script checks if the log file path is provided as an argument and if the file exists.
Variables Initialization:
- Initializes necessary variables and arrays for processing.
Log File Processing:
- Reads the log file line by line, counting total lines and errors, and storing error messages and critical events.
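The line-by-line processing can be sketched in isolation. This is a minimal standalone example; the sample log contents here are illustrative, not from the original:

```shell
#!/bin/bash
# Minimal sketch of the read loop: count total lines and lines
# containing ERROR or Failed in a temporary sample file.
tmplog=$(mktemp)
printf '%s\n' "INFO ok" "ERROR disk full" "Failed login" "INFO ok" > "$tmplog"

total=0
errors=0
while IFS= read -r line; do
    total=$((total + 1))
    if [[ "$line" == *ERROR* || "$line" == *Failed* ]]; then
        errors=$((errors + 1))
    fi
done < "$tmplog"

echo "total=$total errors=$errors"
rm -f "$tmplog"
```

`IFS= read -r` matters here: clearing IFS preserves leading whitespace in each line, and `-r` stops backslashes from being interpreted as escapes.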
Top Error Messages:
- Uses an associative array to count each distinct error message, then sorts by occurrence count to pick the five most common.
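The counting-and-sorting technique can be demonstrated on its own. This sketch uses a hard-coded list of messages purely for illustration:

```shell
#!/bin/bash
# Count occurrences of each message with an associative array,
# then sort by count (descending). Messages are illustrative.
declare -A counts
messages=("ERROR disk full" "ERROR disk full" "Failed login"
          "ERROR disk full" "Failed login" "ERROR timeout")

for msg in "${messages[@]}"; do
    counts["$msg"]=$(( ${counts["$msg"]:-0} + 1 ))
done

# Print the count first so messages containing spaces
# do not disturb the numeric sort
for msg in "${!counts[@]}"; do
    printf '%d %s\n' "${counts[$msg]}" "$msg"
done | sort -rn | head -n 5
```

Printing the count as the first field is the key design choice: `sort -rn` then orders by the leading number regardless of how many words the message itself contains.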
Summary Report Generation:
- Writes the summary report to a file, including the date, log file name, total lines processed, total error count, top 5 error messages, and critical events.
Archiving Log File:
- Moves the processed log file to an archive directory.
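For a quick end-to-end check, you can build a small sample log and run the script against it. The filename `analyze_logs.sh` is an assumed name for the script above, not something the original specifies:

```shell
#!/bin/bash
# Create a sample log and run the analyzer against it.
# "analyze_logs.sh" is an assumed filename for the script above.
cat > sample.log <<'EOF'
2024-05-01 10:00:01 INFO service started
2024-05-01 10:00:02 ERROR disk full
2024-05-01 10:00:03 CRITICAL out of memory
2024-05-01 10:00:04 Failed login for user alice
2024-05-01 10:00:05 ERROR disk full
EOF

# Run the analyzer if it is present and executable
[ -x ./analyze_logs.sh ] && ./analyze_logs.sh sample.log

# Inspect the generated report
[ -f "summary_report_$(date +%F).txt" ] && cat "summary_report_$(date +%F).txt"
```

Note that the script moves the analyzed log into the archive directory, so `sample.log` will no longer be in the working directory after a successful run.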
Happy Learning!