📝 Terminal Simulator — Module 5

Text Manipulation, Pipelines & Redirection

A 50,000-line log file is useless on its own. But grep + pipes + wc -l turn it into one number your manager can act on.

Module Progress: 0/8 steps
STEP 1 / 8
grep -i

Find the Crash — Case-Insensitive Search

Real-World Scenario

At 3:47 AM, PagerDuty fired: "Service myapp-api is DOWN." You SSH into the server and need to find the EXACT timestamp and error message in application.log — a 50,000-line file. You know the error contains the word "critical" but not whether it's "Critical", "CRITICAL", or "critical". Using `grep -i` (case-insensitive) ensures you catch ALL variations without guessing.

Technical Breakdown

`grep` searches for pattern matches in files. `-i` makes the search case-insensitive — it matches "CRITICAL", "Critical", "critical", and any other case variant. Without `-i`, grep is case-sensitive by default (a common gotcha). `-n` adds line numbers so you can pinpoint the exact location. In production debugging, ALWAYS use `-i` unless you specifically need case-sensitive matching.

  • -i — Case-insensitive — matches regardless of upper/lower case.
  • -n — Show line numbers for each match.
  • -c — Count matching lines instead of showing them.
  • -r — Recursive — search all files in a directory.
  • -A N — Show N lines AFTER each match (context).
  • -B N — Show N lines BEFORE each match (context).
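The flags above are easiest to see side by side. Here is a quick sketch against a throwaway sample log — the file path and log contents are invented for illustration:

```shell
# Create a small sample log to play with (hypothetical contents).
cat > /tmp/application.log <<'EOF'
2024-01-15 03:47:01 INFO  request handled
2024-01-15 03:47:02 CRITICAL database connection lost
2024-01-15 03:47:03 Critical retry failed
2024-01-15 03:47:04 INFO  request handled
EOF

grep -in "critical" /tmp/application.log     # case-insensitive, with line numbers
grep -ic "critical" /tmp/application.log     # count only: 2
grep -i -A 1 "critical" /tmp/application.log # each match plus 1 line of context
```

Note that without -i, plain `grep "critical"` would miss both the CRITICAL and Critical lines here — exactly the 3 AM gotcha the scenario describes.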

Your Task

Find all "critical" errors (any case) in the log. Type: grep -in "critical" application.log

devops@prod-server-03 — bash
devops@prod-server-03:~$ grep -in "critical" application.log

Quick Guide: Text Processing

Understanding the basics in 30 seconds

How It Works

  • grep -i searches case-insensitive — catches ERROR, Error, error
  • grep -r searches recursively through all files in a directory
  • | (pipe) chains commands — output of left feeds input of right
  • > redirects output to a file — WARNING: overwrites existing content!
  • >> appends output to a file — safe, preserves existing content
  • wc -l counts lines — the simplest way to quantify anything
  • sort | uniq -c counts occurrences — find top offenders instantly
  • echo + redirect writes config without needing an editor
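The bullets above combine into one-liners. A minimal sketch, using an invented auth log (file name and contents are assumptions for the demo):

```shell
# Hypothetical auth log for demonstration.
cat > /tmp/auth.log <<'EOF'
Failed password for root from 203.0.113.9
Accepted password for deploy from 198.51.100.4
Failed password for admin from 203.0.113.9
Failed password for root from 192.0.2.77
EOF

# How many failed logins? grep filters, wc -l counts.
grep "Failed password" /tmp/auth.log | wc -l   # pipe: filter, then count
grep -c "Failed password" /tmp/auth.log        # same answer, no pipe needed

# Append the number to a report — >> preserves whatever is already there.
echo "failed logins: $(grep -c 'Failed password' /tmp/auth.log)" >> /tmp/report.txt
```

Both forms print 3 here; `grep -c` is shorter, but the pipe form generalizes to any filter you can express upstream.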

Key Benefits

  • Instant log analysis without reading thousands of lines
  • Safe configuration changes with >> (append)
  • Pipeline mastery for complex data analysis in one line
  • Quantify incidents for post-mortem reports
  • Find configuration strings across entire directory trees
  • Automate report generation with redirects

Real-World Uses

  • Finding crash timestamps in 50,000-line application logs
  • Counting failed login attempts for security reports
  • Finding where configuration values are set across a project
  • Saving process snapshots for post-mortem analysis
  • Adding environment variables without a text editor
  • Identifying top attacker IPs from auth logs

The Stream Processing Playbook

Pipes & Redirects — The Core of UNIX

The UNIX philosophy is: "Write programs that do one thing well. Write programs to work together." Pipes and redirects are HOW programs work together. Every DevOps engineer uses these dozens of times daily.

cmd1 | cmd2 → Pipe: stdout of cmd1 → stdin of cmd2
cmd > file → Overwrite file with output
cmd >> file → Append output to file
cmd 2> err.log → Redirect errors to file
cmd > out.log 2>&1 → Capture EVERYTHING to one file
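Each row of that table can be exercised with a tiny helper that writes to both streams (the `both` function is invented for this demo):

```shell
# A throwaway function: one line to stdout, one to stderr.
both() { echo "to stdout"; echo "to stderr" >&2; }

both >  /tmp/out.log 2> /tmp/err.log   # split: each stream gets its own file
both >  /tmp/all.log 2>&1              # combined: both streams land in one file
both 2>&1 > /tmp/wrong.log             # gotcha: order matters — here stderr
                                       # still goes to the terminal, because
                                       # 2>&1 ran BEFORE stdout was redirected
```

The last line is the classic mistake: `2>&1` must come after `> file`, or the error stream is duplicated to wherever stdout pointed at that moment.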

🔍 Search & Filter

  • grep -i — Case-insensitive search
  • grep -r — Recursive directory search
  • grep -c — Count matches
  • grep -v — Invert (exclude matches)
  • grep -A 3 — Show 3 lines after match

📊 Analyze & Count

  • wc -l — Count lines
  • sort | uniq -c — Count occurrences
  • sort -rn — Sort numerically (highest first)
  • awk '{print $N}' — Extract column N
  • head -N / tail -N — First/last N lines
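Chained together, these tools answer "who is the top offender?" in one line. A sketch against an invented SSH auth log (the column position of the IP is an assumption about this log format):

```shell
# Hypothetical failed-login log; the IP is field 6 in this format.
cat > /tmp/attack.log <<'EOF'
Failed password for root from 203.0.113.9
Failed password for admin from 203.0.113.9
Failed password for root from 192.0.2.77
Failed password for admin from 203.0.113.9
EOF

# awk extracts the IP column, sort | uniq -c tallies it,
# sort -rn ranks by count, head -1 keeps the worst offender.
awk '{print $6}' /tmp/attack.log | sort | uniq -c | sort -rn | head -1
```

Remember that `uniq` only collapses adjacent duplicates, which is why the first `sort` is mandatory before `uniq -c`.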

⚠️ The Deadly Difference: > vs >>

> OVERWRITES — the file is completely replaced. Old content is gone forever.

>> APPENDS — new content is added at the end. Old content is preserved.

💀 Horror story: echo "new_var=true" > .env deleted 47 production environment variables. The engineer meant to use >>. Always double-check before pressing Enter!
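You can make the horror story concrete safely with a scratch file (NOT a real .env — the path is a demo assumption):

```shell
echo "A=1" >  /tmp/demo.env    # > creates the file
echo "B=2" >> /tmp/demo.env    # >> appends — file now has 2 lines
echo "C=3" >  /tmp/demo.env    # > OVERWRITES — A=1 and B=2 are gone forever

cat /tmp/demo.env              # prints only: C=3
```

One character is the difference between "added a variable" and "deleted the config". When editing anything you care about, reach for `>>` by default and back the file up (`cp .env .env.bak`) before using `>`.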