Master the Linux shell and Bash basics. Learn command-line navigation, pipes, redirection, and the foundation for shell scripting.

In Episode 1, we learned that the shell is your interface to Linux. In Episode 2 and Episode 3, we explored the kernel and permissions.
Now it's time to master the shell itself—specifically Bash, the most common shell on Linux systems.
The shell is where you spend most of your time as a Linux user or engineer. Whether you're managing servers, deploying applications, debugging problems, or automating routine work, you're using the shell.
In this episode, we'll cover shell fundamentals, navigating and managing files, paths, redirection and pipes, wildcards, text-processing tools, command history, environment variables, aliases and functions, and how to run your first scripts.
By the end, you'll be comfortable on the command line and ready to move into shell scripting in Episode 5.
We covered the kernel in Episode 2. Let's clarify the relationship:
The shell is a program that reads your commands, parses them, and tells the kernel to execute them. It's the intermediary between you and the kernel.
You → Shell → Kernel → Hardware

When you type ls, the shell:

- Parses the command line
- Searches the directories in $PATH for the ls program
- Asks the kernel to execute it

Linux has several shells, each with different features:
| Shell | Full Name | Features | Default On |
|---|---|---|---|
| bash | Bourne Again Shell | Feature-rich, widely used, good for scripting | Most Linux distros |
| zsh | Z Shell | Modern, better defaults, powerful completion | macOS |
| fish | Friendly Interactive Shell | User-friendly, great for interactive use | Some distros |
| sh | POSIX Shell | Minimal, portable, used for system scripts | Embedded systems |
| ksh | Korn Shell | Powerful, used in enterprise | Some Unix systems |
| tcsh | TENEX C Shell | C-like syntax | Some BSD systems |
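Whichever shell you run, the lookup described above (the shell parses your command, finds the program, and asks the kernel to run it) can be inspected directly. A quick sketch in bash:

```shell
# See where the shell would find a command on $PATH
command -v ls      # e.g. /usr/bin/ls (exact path varies by distro)

# Builtins like cd are handled by the shell itself, with no PATH lookup
type cd            # prints: cd is a shell builtin
```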
You can check your current shell:
echo $SHELL
# Output: /bin/bash
# List available shells
cat /etc/shells

Bash is the industry standard for several reasons: it's the default on nearly every Linux distribution, most documentation and production scripts assume it, and its syntax closely follows the POSIX shell standard.
While other shells are great, learning Bash first is the right choice for IT careers.
You access the shell through a terminal emulator:
# Connect to a remote machine over SSH
ssh user@host

Once you have a terminal, you're in the shell.
The shell displays a prompt, typically:
alice@ubuntu:~$

Breaking this down:

- alice: Current user
- ubuntu: Hostname
- ~: Current directory (here, the home directory)
- $: Prompt character (regular user)
- #: Prompt character (root user)

You type commands after the prompt.
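The prompt itself is just the PS1 shell variable. As a sketch, a prompt like the one above can be reproduced with bash's standard prompt escapes:

```shell
# \u = username, \h = hostname, \w = working directory,
# \$ = '$' for regular users, '#' for root
PS1='\u@\h:\w\$ '
```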
Commands follow a pattern:
command [options] [arguments]

Examples:
# Simple command
ls
# Command with options
ls -la
# Command with options and arguments
cp -r source destination
# Multiple commands
ls; pwd; whoami

Options (flags) modify command behavior:

- -l (single dash, single letter)
- --long (double dash, full word)
- -la (multiple short options combined)

Arguments are what the command operates on (files, directories, etc.).
Shows your current location in the filesystem:
# Print current directory
pwd
# Output: /home/alice
# Useful in scripts to know where you are
CURRENT_DIR=$(pwd)
echo "Working in: $CURRENT_DIR"

Lists files and directories:
# List files in current directory
ls
# List with details (long format)
ls -l
# List all files (including hidden)
ls -a
# List with human-readable sizes
ls -lh
# List recursively
ls -R
# List sorted by modification time
ls -lt
# Combine options
ls -lah

Changes your current directory:
# Change to home directory
cd
# Change to specific directory
cd /var/log
# Change to parent directory
cd ..
# Change to previous directory
cd -
# Change to home directory (explicit)
cd ~
# Change to user's home directory
cd ~alice

Create and remove directories:
# Create a directory
mkdir mydir
# Create nested directories
mkdir -p /path/to/nested/dir
# Remove empty directory
rmdir mydir
# Remove directory and contents
rm -r mydir

Create and remove files:
# Create empty file
touch myfile.txt
# Update file timestamp
touch myfile.txt
# Remove file
rm myfile.txt
# Remove file with confirmation
rm -i myfile.txt
# Remove multiple files
rm file1.txt file2.txt file3.txt

# View entire file
cat file.txt
# View file with line numbers
cat -n file.txt
# View file page by page
less file.txt
# View first 10 lines
head file.txt
# View last 10 lines
tail file.txt
# View last 20 lines
tail -n 20 file.txt
# Follow file as it grows (useful for logs)
tail -f /var/log/syslog

# Copy file
cp source.txt destination.txt
# Copy directory recursively
cp -r source_dir dest_dir
# Copy with confirmation
cp -i source.txt destination.txt
# Move/rename file
mv old_name.txt new_name.txt
# Move file to directory
mv file.txt /path/to/directory/
# Move with confirmation
mv -i file.txt /path/to/directory/

# Find files by name
find /path -name "*.txt"
# Find files by type
find /path -type f
# Find directories
find /path -type d
# Find files modified in last 7 days
find /path -mtime -7
# Find files larger than 100MB
find /path -size +100M
# Find and execute command
find /path -name "*.log" -exec rm {} \;

Remember from Episode 3:
# View permissions
ls -l file.txt
# Change permissions
chmod 644 file.txt
# Change ownership
sudo chown user:group file.txt
# Make executable
chmod +x script.sh

Absolute paths start from the root (/) and specify the complete path:
# Absolute paths
/home/alice/documents/file.txt
/var/log/syslog
/usr/bin/python3
/etc/nginx/nginx.conf

Absolute paths work from anywhere in the filesystem.
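A quick way to confirm that an absolute path is independent of where you are (a sketch using /etc/passwd, which exists on every Linux system):

```shell
cd /tmp
wc -l /etc/passwd   # works from /tmp

cd /var
wc -l /etc/passwd   # works from /var too: same file, same count
```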
Relative paths are relative to your current directory:
# If you're in /home/alice
documents/file.txt # Same as /home/alice/documents/file.txt
./documents/file.txt # Explicit current directory
../bob/documents/file.txt # Up one level, then into bob's tree

Special directory references:

- `.` = Current directory
- `..` = Parent directory
- `~` = Home directory
- `-` = Previous directory

# Go to home directory
cd ~
# Go to specific user's home
cd ~alice
# Reference home directory in paths
cat ~/documents/file.txt
# Expand home directory
echo ~
# Output: /home/alice

The shell expands paths before executing commands:
# Tilde expansion
~/documents → /home/alice/documents
# Variable expansion
$HOME/documents → /home/alice/documents
# Wildcard expansion (covered next section)
*.txt → file1.txt file2.txt file3.txt

Every process has three standard streams:

- stdin (file descriptor 0): where the program reads input
- stdout (file descriptor 1): where normal output goes
- stderr (file descriptor 2): where error messages go

Redirection changes where these streams go.
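A minimal sketch showing that the streams are independent; the braces group two commands so each stream can be captured separately:

```shell
# One line to stdout (fd 1), one to stderr (fd 2)
{ echo "normal output"; echo "error output" >&2; } > out.txt 2> err.txt

cat out.txt   # normal output
cat err.txt   # error output
```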
Redirect stdout to a file:
# Redirect output to file (overwrite)
ls > files.txt
# Redirect output to file (append)
ls >> files.txt
# Redirect stderr to file
ls /nonexistent 2> error.txt
# Redirect both stdout and stderr
ls > output.txt 2>&1
# Redirect to /dev/null (discard)
ls > /dev/null

Redirect stdin from a file:
# Read input from file
sort < unsorted.txt
# Here document (multi-line input)
cat << EOF
This is line 1
This is line 2
EOF

Use >> to append instead of overwrite:
# First command creates file
echo "Line 1" > file.txt
# Second command appends
echo "Line 2" >> file.txt
# Result: file.txt contains both lines
cat file.txt
# Output:
# Line 1
# Line 2

Separate stdout and stderr:
# Redirect only errors
command 2> errors.txt
# Redirect only success output
command 1> output.txt
# Redirect both to different files
command > output.txt 2> errors.txt
# Redirect both to same file
command > output.txt 2>&1
# Discard errors, keep output
command 2> /dev/null

A pipe (|) connects the stdout of one command to the stdin of another. This allows you to chain commands together:

command1 | command2 | command3

The output of command1 becomes the input of command2, and so on.
# List files and count them
ls | wc -l
# Search for pattern and count matches
grep "error" /var/log/syslog | wc -l
# Sort and remove duplicates
cat file.txt | sort | uniq
# Chain multiple commands
ps aux | grep nginx | grep -v grep
# Complex pipeline
cat /var/log/syslog | grep "error" | cut -d: -f1 | sort | uniq -c

Chain commands with logical operators:
# AND operator (&&) - run next if previous succeeds
cd /var/www && ls
# OR operator (||) - run next if previous fails
cd /nonexistent || echo "Directory not found"
# Semicolon (;) - run regardless of success
cd /tmp; pwd; ls
# Combine operators
mkdir mydir && cd mydir && touch file.txt || echo "Failed"

You can combine pipes with redirection:
# Pipe to file
ps aux | grep nginx > nginx_processes.txt
# Pipe and append
ps aux | grep nginx >> nginx_processes.txt
# Pipe and redirect errors
cat /nonexistent | grep error 2> /dev/null
# Complex: pipe, redirect, and chain
grep "error" /var/log/syslog | wc -l > error_count.txt && echo "Done"

Wildcards allow you to match multiple files with a single pattern.
The * matches zero or more characters:
# Match all files
ls *
# Match all .txt files
ls *.txt
# Match files starting with 'test'
ls test*
# Match files ending with '.log'
ls *.log
# Remove all .tmp files
rm *.tmp
# Copy all .conf files
cp *.conf /etc/backup/

The ? matches exactly one character:
# Match file1.txt, file2.txt, etc.
ls file?.txt
# Match any three-character filename
ls ???
# Match test_a.log, test_b.log, etc.
ls test_?.log

Use [...] to match specific characters:
# Match file1.txt or file2.txt
ls file[12].txt
# Match any digit
ls file[0-9].txt
# Match any letter
ls file[a-z].txt
# Match anything except 'a'
ls file[!a].txt
# Match uppercase or lowercase
ls file[A-Za-z].txt

Use {...} to expand multiple patterns:
# Create multiple files
touch file{1,2,3}.txt
# Copy to multiple locations
cp myfile.txt {/tmp,/home,/var}/
# Create directory structure
mkdir -p project/{src,tests,docs}
# Expand ranges
echo {1..5}
# Output: 1 2 3 4 5
# Expand with prefix/suffix
echo file{1..3}.{txt,log}
# Output: file1.txt file1.log file2.txt file2.log file3.txt file3.log

Search for patterns in files:
# Search for pattern
grep "error" /var/log/syslog
# Case-insensitive search
grep -i "error" /var/log/syslog
# Show line numbers
grep -n "error" /var/log/syslog
# Invert match (lines NOT matching)
grep -v "error" /var/log/syslog
# Count matches
grep -c "error" /var/log/syslog
# Search recursively in directory
grep -r "error" /var/log/
# Show context (3 lines before and after)
grep -C 3 "error" /var/log/syslog

Edit text streams:
# Replace first occurrence on each line
sed 's/old/new/' file.txt
# Replace all occurrences
sed 's/old/new/g' file.txt
# Replace and save to file
sed -i 's/old/new/g' file.txt
# Delete lines matching pattern
sed '/pattern/d' file.txt
# Print only lines matching pattern
sed -n '/pattern/p' file.txt
# Delete specific line number
sed '5d' file.txt

Process structured text:
# Print specific column
awk '{print $1}' file.txt
# Print with condition
awk '$3 > 100 {print $1, $3}' file.txt
# Use different delimiter
awk -F: '{print $1}' /etc/passwd
# Sum column
awk '{sum += $1} END {print sum}' file.txt
# Count lines
awk 'END {print NR}' file.txt

Extract columns from text:
# Extract first column (space-delimited)
cut -d' ' -f1 file.txt
# Extract columns 1-3
cut -d: -f1-3 /etc/passwd
# Extract from character position 1-10
cut -c1-10 file.txt
# Extract everything except column 2
cut -d, -f1,3- file.csv

Sort and remove duplicates:
# Sort lines
sort file.txt
# Sort numerically
sort -n file.txt
# Sort in reverse
sort -r file.txt
# Remove duplicates
sort file.txt | uniq
# Count occurrences
sort file.txt | uniq -c
# Show only duplicates
sort file.txt | uniq -d

Count lines, words, and characters:
# Count lines
wc -l file.txt
# Count words
wc -w file.txt
# Count characters
wc -c file.txt
# Count all
wc file.txt
# Count lines in multiple files
wc -l *.txt

# View command history
history
# View last 20 commands
history 20
# Search history
history | grep "grep"
# View history file
cat ~/.bash_history
# Clear history
history -c

# Run previous command
!!
# Run command number 100
!100
# Run last command starting with 'ls'
!ls
# Run last command and replace text
^old^new^
# Run previous command with sudo
sudo !!

Essential Bash keyboard shortcuts:
| Shortcut | Action |
|---|---|
| Ctrl+A | Move to beginning of line |
| Ctrl+E | Move to end of line |
| Ctrl+L | Clear screen |
| Ctrl+R | Search history (reverse) |
| Ctrl+C | Cancel current command |
| Ctrl+D | Exit shell |
| Ctrl+U | Clear line |
| Ctrl+K | Delete from cursor to end |
| Tab | Auto-complete |
| Alt+B | Move back one word |
| Alt+F | Move forward one word |
Use ! to reference history:
# Last command
!!
# Last argument of previous command
!$
# All arguments of previous command
!*
# Command number 50
!50
# Last command starting with 'grep'
!grep

Environment variables are named values that store configuration and state information. Exported variables are inherited by child processes and can be referenced in commands and scripts.
# View all environment variables
env
# View specific variable
echo $HOME
# View all variables (including shell variables)
set
# Check if variable exists
echo ${MYVAR:-default}

# Set variable for current session
MYVAR="hello"
# Use variable
echo $MYVAR
# Set variable for single command
MYVAR="hello" mycommand
# Make variable permanent (add to ~/.bashrc)
echo 'export MYVAR="hello"' >> ~/.bashrc

| Variable | Purpose |
|---|---|
| $HOME | User's home directory |
| $USER | Current username |
| $PWD | Current working directory |
| $PATH | Directories to search for commands |
| $SHELL | Current shell |
| $HOSTNAME | System hostname |
| $LANG | Language/locale |
| $TERM | Terminal type |
| $PS1 | Primary prompt |
| $? | Exit status of last command |
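The $? entry above deserves a quick demo, since scripts lean on it constantly. A minimal sketch:

```shell
true          # a command that always succeeds
echo $?       # 0

false         # a command that always fails
echo $?       # 1

# Common pattern: branch on the exit status
grep -q root /etc/passwd && echo "found"
```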
Export makes variables available to child processes:
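Why export matters, in one sketch: a child process only inherits variables that were exported.

```shell
UNEXPORTED="hidden"
export EXPORTED="visible"

# The child bash sees only the exported variable
bash -c 'echo "unexported=[$UNEXPORTED] exported=[$EXPORTED]"'
# Output: unexported=[] exported=[visible]
```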
# Set and export
export MYVAR="hello"
# Or export existing variable
MYVAR="hello"
export MYVAR
# View exported variables
export -p

Aliases are shortcuts for commands:
# Create alias
alias ll='ls -lah'
# Use alias
ll
# View all aliases
alias
# Remove alias
unalias ll
# Make alias permanent (add to ~/.bashrc)
echo "alias ll='ls -lah'" >> ~/.bashrc

# Remove single alias
unalias ll
# Remove all aliases
unalias -a

Functions are more powerful than aliases:
# Define function
greet() {
  echo "Hello, $1!"
}
# Call function
greet Alice
# Function with multiple commands
backup() {
  tar -czf backup.tar.gz ~/documents
  echo "Backup complete"
}
# Make function permanent (add to ~/.bashrc)
echo 'greet() { echo "Hello, $1!"; }' >> ~/.bashrc

Functions can accept parameters:
# Function with parameters
add() {
  echo $(($1 + $2))
}
# Call with parameters
add 5 3
# Output: 8
# Function with default values
greet() {
  local name=${1:-"World"}
  echo "Hello, $name!"
}
greet
# Output: Hello, World!
greet Alice
# Output: Hello, Alice!

The shebang (#!) is the first line of a script that tells the system which interpreter to use:

#!/bin/bash

Breaking it down:

- #! = Shebang marker
- /bin/bash = Path to the interpreter

Other shebangs:
#!/bin/sh # POSIX shell
#!/usr/bin/python3 # Python
#!/usr/bin/perl # Perl
#!/usr/bin/env bash # Use bash from PATH (more portable)

The shebang must be the first line of the file.
# Make script executable
chmod +x myscript.sh
# Verify it's executable
ls -l myscript.sh
# Output: -rwxr-xr-x 1 alice alice 100 Feb 20 10:30 myscript.sh

# Run script from current directory
./myscript.sh
# Run script from anywhere (if in PATH)
myscript.sh
# Run with explicit interpreter
bash myscript.sh
# Run with arguments
./myscript.sh arg1 arg2

Different ways to run scripts:
# Method 1: Direct execution (requires shebang and executable permission)
./myscript.sh
# Method 2: Explicit interpreter
bash myscript.sh
# Method 3: Source the script (runs in current shell)
source myscript.sh
# Method 4: Using dot notation (same as source)
. myscript.sh

Difference: Direct execution and explicit interpreter run in a subshell. Sourcing runs in the current shell, so variables persist.
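You can verify the subshell-versus-current-shell difference yourself (a sketch; setvar.sh is a throwaway script name used for illustration):

```shell
# Create a tiny script that sets a variable
echo 'MYVAR="set by script"' > setvar.sh

bash setvar.sh                 # runs in a subshell
echo "${MYVAR:-not set}"       # not set: the subshell's variable died with it

source setvar.sh               # runs in the current shell
echo "${MYVAR:-not set}"       # set by script
```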
Mistake: Not quoting variables with spaces
# DON'T DO THIS
FILE=my file.txt
cat $FILE # Tries to open "my" and "file.txt" separately
# DO THIS
FILE="my file.txt"
cat "$FILE" # Opens "my file.txt"

Mistake: Redirecting to wrong stream
# DON'T DO THIS - error goes to terminal
command > output.txt # Only stdout redirected
# DO THIS - both stdout and stderr redirected
command > output.txt 2>&1

Mistake: Trying to pipe to commands that don't accept stdin
# DON'T DO THIS - cd doesn't read from stdin
ls | cd # Doesn't work
# DO THIS - use command substitution (cd is a builtin, so xargs won't help here)
cd "$(ls -d */ | head -n 1)" # cd into the first subdirectory

Mistake: Not expanding variables correctly
# DON'T DO THIS
echo 'The value is $MYVAR' # Single quotes prevent expansion
# DO THIS
echo "The value is $MYVAR" # Double quotes allow expansion

Use clear, descriptive names for files, variables, and functions:
# Good
backup_database.sh
BACKUP_DIR="/var/backups"
# Bad
script.sh
BD="/var/backups"

Always know what a command does before running it, especially with rm:
# Test before executing
ls *.log # See what will be deleted
# Then delete
rm *.log
# Or use confirmation
rm -i *.log

Test commands on non-critical data first:
# Test on a single file
sed 's/old/new/' test.txt
# Then apply to all files
sed -i 's/old/new/' *.txt

Add comments to complex one-liners:
# Find all .log files modified in last 7 days and compress them
find /var/log -name "*.log" -mtime -7 -exec gzip {} \;

Use the command line for simple, one-off tasks. Use scripts for complex, repetitive tasks:
Command line:
# One-time task
ls -la /var/log | grep error

Script:
# Repetitive task that needs logic
if [ -f /var/log/app.log ]; then
  grep error /var/log/app.log | mail -s "App errors" admin@example.com
fi

If you find yourself typing the same commands repeatedly, it's time to script:
# If you're doing this multiple times...
cd /var/www/project && git pull && npm install && npm run build
# ...create a script
#!/bin/bash
cd /var/www/project
git pull
npm install
npm run build

When you need to automate tasks (cron jobs, CI/CD, etc.), use scripts:
# Schedule with cron
0 2 * * * /usr/local/bin/backup.sh

This is where Episode 5 comes in: we'll learn to write production-grade bash scripts.
Key takeaways from this episode:

- File and navigation commands: pwd, ls, cd, cp, mv, rm
- Wildcards: *, ?, [...], {...}
- Text processing: grep, sed, awk, cut, sort, uniq
- Environment variables: $PATH, $HOME, $USER
- Daily drivers worth practicing until they're muscle memory: cd, ls, pwd

The command line is where you'll spend most of your time as a Linux user. The more comfortable you are here, the more productive you'll be.
Ready for the final episode? Continue with Episode 5: Bash Scripting Mastery to learn how to write powerful, production-grade bash scripts that automate real-world tasks.