
Linux Logs Investigation: Tools, Scenarios, and Pro Tips for Cybersecurity Operators

In the world of cybersecurity, logs are the breadcrumbs left behind by systems, services, and applications. For Linux systems, these logs are the primary forensic evidence when something suspicious occurs. Whether you're investigating an intrusion, diagnosing a misconfiguration, or verifying compliance, log analysis is at the core of incident response.

Professional cybersecurity operators rely on a variety of tools and strategies to filter noise, detect anomalies, and trace malicious activity in Linux environments.


Common Log Locations in Linux

Most Linux distributions store logs in /var/log/ by default. Here are the key files:

| Log File | Purpose |
| --- | --- |
| /var/log/auth.log or /var/log/secure | Authentication events, sudo usage, SSH logins |
| /var/log/syslog or /var/log/messages | System-wide events |
| /var/log/kern.log | Kernel messages |
| /var/log/dmesg | Boot-time hardware messages |
| /var/log/apache2/access.log and error.log | Web server activity |
| /var/log/audit/audit.log | SELinux and audit framework logs |

Essential Tools for Log Investigation

1. journalctl

For systems using systemd, journalctl is the go-to tool.

# View logs since yesterday
sudo journalctl --since "yesterday"
 
# View only SSH-related logs
sudo journalctl -u ssh.service
 
# Follow logs in real-time
sudo journalctl -f

Pro Tip: Use --grep with regex to search specific patterns directly in the journal:

sudo journalctl --grep "Failed password"

Here's a journalctl cheatsheet with the most common use cases, designed for quick reference during Linux log investigations.

View All Logs

sudo journalctl

Shows logs from oldest to newest.

View Logs in Reverse (Newest First)

sudo journalctl -r

Follow Logs in Real-Time

sudo journalctl -f

Like tail -f for the system journal.

View Logs Since a Specific Time

sudo journalctl --since "2025-08-10 14:00"
sudo journalctl --since "yesterday"
sudo journalctl --since "2 hours ago"

View Logs Between Two Dates

sudo journalctl --since "2025-08-10 14:00" --until "2025-08-10 18:00"

Filter by Service (Unit)

sudo journalctl -u ssh.service
sudo journalctl -u nginx.service --since "today"

Filter by Process ID (PID)

sudo journalctl _PID=1234

Search for Keywords

sudo journalctl --grep "Failed password"
sudo journalctl --grep "error" --case-sensitive=false   # Force case-insensitive matching

Show Only Logs from Current Boot

sudo journalctl -b

Previous boot:

sudo journalctl -b -1

Limit Output to N Lines

sudo journalctl -n 50

Show Logs in JSON Format (for scripting)

sudo journalctl -o json-pretty

Show Kernel Logs Only

sudo journalctl -k
sudo journalctl -k --since "1 hour ago"

journalctl Pro Tips

  • Combine filters:

    sudo journalctl -u ssh.service --since "yesterday" --grep "Failed"
  • Use less to scroll:

    sudo journalctl | less -S
  • Always run with sudo to see all logs.


2. grep, egrep, and ripgrep (rg)

The classic method for quick filtering.

# Find failed SSH login attempts
grep "Failed password" /var/log/auth.log
 
# Fixed-string search for a suspicious IP address
# (-F stops "." from acting as a regex wildcard)
grep -F "192.168.10.15" /var/log/syslog
 
# Ripgrep for blazing speed
rg "error" /var/log/*

Pro Tip: Always use grep --color=auto for quick visual parsing.
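When a match needs context (what happened just before and after a failed login), grep's -B and -A flags print the surrounding lines. Here is a self-contained illustration using a small sample file with hypothetical entries; on a live system you would point it at /var/log/auth.log instead:

```shell
# Build a tiny sample log (hypothetical entries, documentation-range IP)
cat > /tmp/sample_auth.log <<'EOF'
Jan 10 12:00:00 host sshd[100]: Accepted password for alice from 192.168.1.2 port 40000 ssh2
Jan 10 12:00:01 host sshd[101]: Failed password for root from 203.0.113.5 port 40022 ssh2
Jan 10 12:00:02 host sshd[101]: Connection closed by 203.0.113.5 port 40022
EOF

# Show one line of context before and after each failed login
grep -B1 -A1 "Failed password" /tmp/sample_auth.log
```

The context lines often reveal whether a failure was followed by a successful login from the same source, which is a classic brute-force-then-success pattern.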


3. ausearch & aureport (Auditd Framework)

The Linux Audit framework is powerful for forensic logging.

# Search for failed user commands (e.g., denied sudo attempts)
sudo ausearch -m USER_CMD --success no
 
# Generate a report of all login attempts
sudo aureport --login --failed

Scenario: Detect unauthorized privilege escalation attempts.
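For auditd to record such events, watch rules must be in place. A minimal sketch of a rules file (the file name privilege.rules and the key sudoers_change are illustrative; rule file locations vary by distribution):

```
# /etc/audit/rules.d/privilege.rules (illustrative file name)
# Watch sudoers for writes and attribute changes, tagged with a key
# so events can be retrieved later with: sudo ausearch -k sudoers_change
-w /etc/sudoers -p wa -k sudoers_change
-w /etc/sudoers.d/ -p wa -k sudoers_change
```

After loading the rules (e.g., with augenrules --load), any modification to sudoers appears under that key.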


4. logwatch

A log summarizer for quick overviews.

sudo logwatch --detail high --range today

Scenario: Daily automated security review of system activity.
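The daily-review scenario pairs naturally with cron. A sketch of a root crontab entry (the recipient address is a placeholder; verify the logwatch binary path on your distribution):

```
# Email a high-detail summary of yesterday's logs at 06:00 every day
0 6 * * * /usr/sbin/logwatch --detail high --range yesterday --mailto soc@example.com
```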


5. GoAccess

A real-time log analyzer for HTTP access logs.

goaccess /var/log/apache2/access.log --log-format=COMBINED -o report.html

Scenario: Detect web scanning attempts or high-volume brute force attacks.


6. fail2ban

While primarily a prevention tool, it logs and tracks failed login attempts.

# View current jail status
sudo fail2ban-client status sshd

Scenario: Investigate repeated failed login attempts before IP banning.
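fail2ban's own log is also worth grepping during an investigation. A self-contained sketch using sample lines (hypothetical jail output, documentation-range IPs); on a live system, point the pipeline at /var/log/fail2ban.log instead:

```shell
# Sample fail2ban log lines (hypothetical) so the example runs anywhere
cat > /tmp/fail2ban_sample.log <<'EOF'
2025-08-10 14:01:02 fail2ban.actions [321]: NOTICE [sshd] Ban 203.0.113.5
2025-08-10 14:05:09 fail2ban.actions [321]: NOTICE [sshd] Ban 198.51.100.7
2025-08-10 15:00:00 fail2ban.actions [321]: NOTICE [sshd] Unban 203.0.113.5
EOF

# Count ban events (" Ban " with the leading space avoids matching "Unban")
grep -c " Ban " /tmp/fail2ban_sample.log
```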


7. ELK Stack (Elasticsearch, Logstash, Kibana)

For large-scale environments, professionals centralize logs and use visual dashboards.

Pro Tip: Integrate Filebeat to ship logs from multiple Linux servers into ELK for correlation.
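As a sketch, a minimal Filebeat configuration that ships authentication and syslog data to Logstash might look like the following (hostnames and paths are placeholders; adjust for your deployment and Filebeat version):

```yaml
# filebeat.yml (minimal sketch)
filebeat.inputs:
  - type: filestream        # use the "log" input type on older Filebeat versions
    id: linux-system-logs
    paths:
      - /var/log/auth.log
      - /var/log/syslog
output.logstash:
  hosts: ["logstash.example.internal:5044"]
```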


Real-World Investigation Scenarios

Scenario 1: Suspicious SSH Activity

grep "Failed password" /var/log/auth.log | awk '{print $(NF-3)}' | sort | uniq -c | sort -nr

Purpose: Identify the top offending IPs trying to brute force SSH.
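The same pipeline can be demonstrated, and extended with a noise threshold, against a small sample file (hypothetical entries, documentation-range IPs):

```shell
# Sample sshd failure lines so the pipeline can be tested anywhere
cat > /tmp/auth_demo.log <<'EOF'
Jan 10 12:00:01 host sshd[101]: Failed password for root from 203.0.113.5 port 40022 ssh2
Jan 10 12:00:02 host sshd[102]: Failed password for admin from 203.0.113.5 port 40023 ssh2
Jan 10 12:00:03 host sshd[103]: Failed password for root from 198.51.100.7 port 40024 ssh2
EOF

# Same pipeline, plus an awk filter to keep only IPs with 2 or more failures
grep "Failed password" /tmp/auth_demo.log \
  | awk '{print $(NF-3)}' | sort | uniq -c | sort -nr \
  | awk '$1 >= 2'
```

The threshold keeps one-off typos out of the report while surfacing sustained brute-force sources.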

Scenario 2: Detecting Web Attacks

grep -iE "UNION|SELECT|INSERT|DROP" /var/log/apache2/access.log

Purpose: Spot possible SQL injection attempts.
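A self-contained variant (sample entries, hypothetical IPs and paths); a case-insensitive match catches lowercase evasions of the same keywords:

```shell
# Sample access.log entries so the pattern can be exercised anywhere
cat > /tmp/access_demo.log <<'EOF'
203.0.113.5 - - [10/Aug/2025:14:00:01 +0000] "GET /index.php?id=1 HTTP/1.1" 200 512
203.0.113.5 - - [10/Aug/2025:14:00:02 +0000] "GET /index.php?id=1%20UNION%20SELECT%20password%20FROM%20users HTTP/1.1" 200 512
EOF

# -i catches "union select" as well as "UNION SELECT"
grep -iE "union|select|insert|drop" /tmp/access_demo.log
```

Note that attackers also URL-encode keywords themselves (e.g., %55NION); decoding the log before grepping closes that gap.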

Scenario 3: Finding Deleted File Access

If Auditd is enabled:

sudo ausearch -m PATH --key delete

Purpose: See when and by whom files were deleted.


Pro Tips from the Field

  1. Log Rotation Awareness: Don't assume logs go back forever; check /etc/logrotate.conf to ensure historical logs are archived before being rotated.

  2. Timestamp Synchronization: Always ensure NTP is enabled; mismatched clocks can ruin a forensic timeline.

  3. Combine Tools: Use grep for quick filtering, awk for extraction, and sort with uniq -c for counting: the UNIX way.

  4. Whitelist Known Noise: For repetitive benign events, maintain a personal whitelist file and filter it out with grep -v -f whitelist.txt.

  5. Automate Reports: Cron-based log summaries save you time during daily SOC duties.
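Tip 4 in practice: keep one benign pattern per line in a whitelist file and invert-match against it. A self-contained sketch (file paths, patterns, and log entries are illustrative):

```shell
# Hypothetical whitelist of known-benign patterns, one per line
cat > /tmp/whitelist.txt <<'EOF'
systemd-logind
CRON
EOF

# Hypothetical noisy log mixing benign and interesting events
cat > /tmp/noisy.log <<'EOF'
Jan 10 12:00:01 host CRON[200]: (root) CMD (run-parts /etc/cron.hourly)
Jan 10 12:00:02 host sshd[101]: Failed password for root from 203.0.113.5 port 40022 ssh2
Jan 10 12:00:03 host systemd-logind[90]: New session 4 of user alice.
EOF

# Drop every line matching a whitelist pattern, leaving only the interesting events
grep -v -f /tmp/whitelist.txt /tmp/noisy.log
```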


Summary

Linux log investigation is both an art and a science. The art lies in knowing where to look and recognizing patterns; the science is in methodically filtering, parsing, and correlating events. Whether using built-in tools like journalctl and grep or enterprise-grade stacks like ELK, the core principle is visibility — you can't defend what you can't see.

By combining sharp command-line skills with the right log management tools, cybersecurity operators can quickly detect and respond to threats, turning scattered data into actionable intelligence.


***
Note on Content Creation: This article was developed with the assistance of generative AI tools such as Gemini and ChatGPT. While these tools strive for accuracy and comprehensive coverage, all content is reviewed and edited by human experts at IsoSecu to ensure factual correctness, relevance, and adherence to our editorial standards.