Initial commit
Zhongwei Li, 2025-11-30 08:28:57 +08:00, commit 99185c0a7c
16 changed files with 10488 additions and 0 deletions

skills/bash-53-features.md
---
name: bash-53-features
description: Bash 5.3 new features and modern patterns (2025)
---
## 🚨 CRITICAL GUIDELINES
### Windows File Path Requirements
**MANDATORY: Always Use Backslashes on Windows for File Paths**
When using Edit or Write tools on Windows, you MUST use backslashes (`\`) in file paths, NOT forward slashes (`/`).
**Examples:**
- ❌ WRONG: `D:/repos/project/file.tsx`
- ✅ CORRECT: `D:\repos\project\file.tsx`
This applies to:
- Edit tool file_path parameter
- Write tool file_path parameter
- All file operations on Windows systems
### Documentation Guidelines
**NEVER create new documentation files unless explicitly requested by the user.**
- **Priority**: Update existing README.md files rather than creating new documentation
- **Repository cleanliness**: Keep repository root clean - only README.md unless user requests otherwise
- **Style**: Documentation should be concise, direct, and professional - avoid AI-generated tone
- **User preference**: Only create additional .md files when user specifically asks for documentation
---
# Bash 5.3 Features (2025)
## Overview
Bash 5.3 (released July 2025) introduces significant new features that improve performance, readability, and functionality.
## Key New Features
### 1. In-Shell Command Substitution
**New: ${ command; } syntax** - Runs the command in the current shell context instead of forking a subshell (the space after `${` and the `;` or newline before `}` are required):
```bash
# OLD way (Bash < 5.3) - Creates subshell
output=$(expensive_command)
# NEW way (Bash 5.3+) - Runs in current shell, faster
output=${ expensive_command; }
```
**Benefits:**
- No subshell overhead (faster)
- Preserves variable scope
- Better performance in loops
**Example:**
```bash
#!/usr/bin/env bash
# Traditional approach
count=0
for file in *.txt; do
lines=$(wc -l < "$file") # Subshell created
((count += lines))
done
# Bash 5.3 approach (faster)
count=0
for file in *.txt; do
lines=${ wc -l < "$file"; } # No subshell
((count += lines))
done
```
### 2. REPLY Variable Command Substitution
**New: ${| command; } syntax** - Runs the command in the current shell; whatever it assigns to `REPLY` becomes the value of the expansion (REPLY itself is saved and restored around the substitution):
```bash
# The command sets REPLY; its final value is the result
result=${| REPLY=$((42 * 2)); }
echo "Result: $result"   # Result: 84
# Multiple statements work - just assign REPLY before the closing brace
summary=${|
    tmp="processing"
    REPLY="$tmp: $((42 * 2))"
}
echo "Got: $summary"
```
**Use Cases:**
- Avoid variable naming conflicts
- Clean syntax for temporary values
- Standardized result variable
### 3. Enhanced `read` Builtin
**New: -E option** - Uses readline with programmable completion:
```bash
# Interactive input with tab completion
read -E -p "Enter filename: " filename
# User can now tab-complete file paths!
# With custom completion
read -E -p "Select environment: " env
# Enables full readline features (history, editing)
```
**Benefits:**
- Better UX for interactive scripts
- Built-in path completion
- Command history support
### 4. Enhanced `source` Builtin
**New: -p PATH option** - Search a custom path (instead of `$PATH`) when sourcing:
```bash
# OLD way - hard-coded absolute path
source /opt/myapp/lib/helpers.sh
# NEW way - search a custom path
source -p /opt/myapp/lib:/usr/local/lib helpers.sh
# Environment-specific search path
CUSTOM_PATH=/app/modules:/shared/lib
source -p "$CUSTOM_PATH" database.sh
```
**Benefits:**
- Modular library organization
- Avoid hard-coded paths
- Environment-specific sourcing
### 5. Enhanced `compgen` Builtin
**New: -V option** - Store generated completions in a named array variable:
```bash
# OLD way - capture stdout (spawns a subshell)
completions=$(compgen -f)
# NEW way - write matches directly into an array variable
compgen -V completions_var -f
# Results now in the array "${completions_var[@]}"
```
**Benefits:**
- Cleaner completion handling
- No extra subshells
- Better performance
### 6. GLOBSORT Variable
**New: Control glob sorting behavior**:
```bash
# Default: alphabetical sort
echo *.txt
# Sort by modification time (newest first)
GLOBSORT="-mtime"
echo *.txt
# Sort by size
GLOBSORT="size"
echo *.txt
# Reverse alphabetical
GLOBSORT="-name"
echo *.txt
```
**Options** (prefix any key with `-` to reverse the order):
- `name` - Alphabetical (default)
- `numeric` - Names compared numerically
- `size` - By file size
- `mtime` - By modification time (`atime`, `ctime`, `blocks` also work)
- `nosort` - Leave results in directory order
### 7. BASH_TRAPSIG Variable
**New: Signal number variable in traps**:
```bash
#!/usr/bin/env bash
set -euo pipefail
# BASH_TRAPSIG contains the signal number being handled
handle_signal() {
echo "Caught signal: $BASH_TRAPSIG" >&2
case "$BASH_TRAPSIG" in
15) echo "SIGTERM (15) received, shutting down gracefully" ;;
2) echo "SIGINT (2) received, cleaning up" ;;
*) echo "Signal $BASH_TRAPSIG received" ;;
esac
}
trap handle_signal SIGTERM SIGINT SIGHUP
```
**Benefits:**
- Reusable signal handlers
- Dynamic signal-specific behavior
- Better logging and debugging
### 8. Floating-Point Arithmetic
**New: `fltexpr` loadable builtin** (ships with bash's example loadables; install location varies by distro):
```bash
# Load the builtin (without a path, enable searches BASH_LOADABLES_PATH)
enable -f fltexpr fltexpr
# Quote the expression so '*' and spaces survive shell expansion
fltexpr 'result = 42.5 * 1.5'
echo "$result" # 63.75
# Complex expressions
fltexpr 'pi_area = 3.14159 * 5 * 5'
echo "Area: $pi_area"
```
**Use Cases:**
- Scientific calculations
- Financial computations
- Avoid external tools (bc, awk)
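The `fltexpr` loadable is not installed everywhere. Where it is missing, a pre-5.3 script still needs an external tool; a minimal fallback sketch, assuming `awk` is available (the `fcalc` helper name is ours, not part of bash):

```shell
#!/usr/bin/env bash
# Fallback float arithmetic when the fltexpr loadable is unavailable:
# hand the expression to awk and format the result.
# NOTE: the expression is interpolated into the awk program,
# so only pass trusted, script-controlled expressions.
fcalc() {
    awk "BEGIN { printf \"%.2f\", $1 }"
}
area=$(fcalc "3.14159 * 5 * 5")
echo "Area: $area"   # Area: 78.54
```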
## Performance Improvements
### Avoid Subshells
```bash
# ❌ OLD (Bash < 5.3) - Multiple subshells
for i in {1..1000}; do
result=$(echo "$i * 2" | bc)
process "$result"
done
# ✅ NEW (Bash 5.3+) - No subshells
for i in {1..1000}; do
result=${ echo $((i * 2)); }
process "$result"
done
```
**Performance Gain:** Avoiding the per-iteration fork (and the external `bc`) makes loops like this substantially faster; the exact speedup depends on platform and workload.
### Efficient File Processing
```bash
#!/usr/bin/env bash
# Process large file efficiently
process_log() {
    local line_count=0
    local error_count=0
    while IFS= read -r line; do
        ((line_count += 1))
        # No subshell at all: bash pattern matching instead of grep
        if [[ "$line" == *ERROR* ]]; then
            ((error_count += 1))
        fi
    done < "$1"
    echo "Processed $line_count lines, found $error_count errors"
}
process_log /var/log/app.log
```
## Migration Guide
### Check Bash Version
```bash
#!/usr/bin/env bash
# Require Bash 5.3+
if ((BASH_VERSINFO[0] < 5 || (BASH_VERSINFO[0] == 5 && BASH_VERSINFO[1] < 3))); then
echo "Error: Bash 5.3+ required (found $BASH_VERSION)" >&2
exit 1
fi
```
### Feature Detection
```bash
# Test for 5.3 features
has_bash_53_features() {
    # ${ ...; } is an error before 5.3; eval contains the failure
    eval 'probe=${ true; }' 2>/dev/null
}
if has_bash_53_features; then
echo "Bash 5.3 features available"
else
echo "Using legacy mode"
fi
```
### Gradual Adoption
```bash
#!/usr/bin/env bash
set -euo pipefail
# Support both old and new bash
if ((BASH_VERSINFO[0] > 5 || (BASH_VERSINFO[0] == 5 && BASH_VERSINFO[1] >= 3))); then
# Bash 5.3+ path
result=${ compute_value; }
else
# Legacy path
result=$(compute_value)
fi
```
## Best Practices (2025)
1. **Use ${ } for performance-critical loops**
```bash
for item in "${large_array[@]}"; do
processed=${ transform "$item"; }
done
```
2. **Use ${| } for clean temporary values**
```bash
status=${| [[ -s "$file" ]] && REPLY="valid" || REPLY="invalid"; }
if [[ "$status" == "valid" ]]; then
    echo "Valid"
fi
```
3. **Enable readline for interactive scripts**
```bash
read -E -p "Config file: " config
```
4. **Use source -p for modular libraries**
```bash
source -p "$LIB_PATH" database.sh logging.sh
```
5. **Document version requirements**
```bash
# Requires: Bash 5.3+ for performance features
```
## Compatibility Notes
### Bash 5.3 Availability (2025)
**Note:** Bash 5.3 (released July 2025) is the latest stable version. There is no Bash 5.4 as of October 2025.
- **Linux**: Ubuntu 24.04+, Fedora 40+, Arch (current)
- **macOS**: Homebrew (`brew install bash`)
- **Windows**: WSL2 with Ubuntu 24.04+
- **Containers**: `bash:5.3` official image
### C23 Conformance
Bash 5.3 updated to C23 language standard. **Note:** K&R style C compilers are no longer supported.
### Fallback Pattern
```bash
#!/usr/bin/env bash
set -euo pipefail
# Detect bash version
readonly BASH_53_PLUS=$((BASH_VERSINFO[0] > 5 || (BASH_VERSINFO[0] == 5 && BASH_VERSINFO[1] >= 3)))
process_items() {
local item
for item in "$@"; do
if ((BASH_53_PLUS)); then
result=${ transform "$item"; } # Fast path
else
result=$(transform "$item") # Compatible path
fi
echo "$result"
done
}
```
## Real-World Examples
### Fast Log Parser
```bash
#!/usr/bin/env bash
set -euo pipefail
# Parse log file (Bash 5.3 optimized)
parse_log() {
local file="$1"
local stats_errors=0
local stats_warnings=0
local stats_lines=0
    while IFS= read -r line; do
        ((stats_lines += 1))   # += 1 yields a nonzero value, so status 0 under set -e
        # Fast pattern matching (no subshell, no grep)
        [[ "$line" == *ERROR* ]] && ((stats_errors += 1))
        [[ "$line" == *WARN* ]] && ((stats_warnings += 1))
    done < "$file"
echo "Lines: $stats_lines, Errors: $stats_errors, Warnings: $stats_warnings"
}
parse_log /var/log/application.log
```
### Interactive Configuration
```bash
#!/usr/bin/env bash
set -euo pipefail
# Interactive setup with readline
setup_config() {
echo "Configuration Setup"
echo "==================="
# Tab completion for paths
read -E -p "Data directory: " data_dir
read -E -p "Config file: " config_file
    # Validate ($REPLY assigned inside ${| } becomes the expansion value)
    status=${|
        if [[ -d "$data_dir" ]]; then REPLY="valid"; else REPLY="invalid"; fi
    }
    if [[ "$status" == "valid" ]]; then
echo "DATA_DIR=$data_dir" > config.env
echo "CONFIG_FILE=$config_file" >> config.env
echo "✓ Configuration saved"
else
echo "✗ Invalid directory" >&2
return 1
fi
}
setup_config
```
## Resources
- [Bash 5.3 Release Notes](https://lists.gnu.org/archive/html/bash-announce/2025-07/msg00000.html)
- [Bash Manual - Command Substitution](https://www.gnu.org/software/bash/manual/html_node/Command-Substitution.html)
- [ShellCheck Bash 5.3 Support](https://github.com/koalaman/shellcheck/releases)
---
**Bash 5.3 provides significant performance and usability improvements. Adopt these features gradually while maintaining backwards compatibility for older systems.**

skills/bash-master/SKILL.md (diff too large; not shown)
# Common Bash Patterns and Anti-Patterns
Collection of proven patterns and common mistakes in bash scripting with explanations and solutions.
---
## Table of Contents
1. [Variable Handling](#variable-handling)
2. [Command Execution](#command-execution)
3. [File Operations](#file-operations)
4. [String Processing](#string-processing)
5. [Arrays and Loops](#arrays-and-loops)
6. [Conditionals and Tests](#conditionals-and-tests)
7. [Functions](#functions)
8. [Error Handling](#error-handling)
9. [Process Management](#process-management)
10. [Security Patterns](#security-patterns)
---
## Variable Handling
### Pattern: Safe Variable Expansion
```bash
# ✓ GOOD: Always quote variables
echo "$variable"
cp "$source" "$destination"
rm -rf "$directory"
# ✗ BAD: Unquoted variables
echo $variable # Word splitting and globbing
cp $source $destination # Breaks with spaces
rm -rf $directory # VERY DANGEROUS unquoted
```
**Why:** Unquoted variables undergo word splitting and pathname expansion, leading to unexpected behavior.
### Pattern: Default Values
```bash
# ✓ GOOD: Use parameter expansion for defaults
timeout="${TIMEOUT:-30}"
config="${CONFIG_FILE:-$HOME/.config/app.conf}"
# ✗ BAD: Manual check
if [ -z "$TIMEOUT" ]; then
timeout=30
else
timeout="$TIMEOUT"
fi
```
**Why:** Parameter expansion is concise, readable, and handles edge cases correctly.
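The `:` in `:-` matters: with it, an empty-but-set variable also gets the default. A quick sketch of the variants (`MY_TIMEOUT` is an illustrative name):

```shell
#!/usr/bin/env bash
# ${var:-def}  -> default if var is unset OR empty
# ${var-def}   -> default only if var is unset
# ${var:=def}  -> like :- but also assigns the default back to var
empty=""
unset notset MY_TIMEOUT
echo "${empty:-fallback}"    # fallback (empty counts as missing)
echo "[${empty-fallback}]"   # [] (set, so no substitution)
echo "${notset-fallback}"    # fallback (unset)
: "${MY_TIMEOUT:=30}"        # assigns 30 to MY_TIMEOUT
echo "$MY_TIMEOUT"           # 30
```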
### Anti-Pattern: Mixing Up `=` and `==` Semantics
```bash
# ✓ GOOD: In [[ ]], = and == are both equality tests
if [[ "$var" == "value" ]]; then
    echo "Match"
fi
# ✓ GOOD: POSIX-compliant - single = inside [ ]
if [ "$var" = "value" ]; then
    echo "Match"
fi
# ✗ BAD: == inside [ ] is a bashism, not portable
if [ "$var" == "value" ]; then
    echo "Match"
fi
```
**Why:** In `[[ ]]`, both `=` and `==` test equality. In `[ ]`, only `=` is POSIX-compliant; `==` happens to work in bash's `[` but fails in stricter shells such as dash.
### Anti-Pattern: Unset Variable Access
```bash
# ✗ BAD: Accessing undefined variables
echo "Value: $undefined_variable" # Silent error, prints "Value: "
# ✓ GOOD: Use set -u
set -u
echo "Value: $undefined_variable" # Error: undefined_variable: unbound variable
# ✓ GOOD: Provide default
echo "Value: ${undefined_variable:-default}"
```
**Why:** `set -u` catches typos and logic errors early.
---
## Command Execution
### Pattern: Check Command Existence
```bash
# ✓ GOOD: Use command -v
if command -v jq &> /dev/null; then
echo "jq is installed"
else
echo "jq is not installed" >&2
exit 1
fi
# ✗ BAD: Using which
if which jq; then # Deprecated, not POSIX
echo "jq is installed"
fi
# ✗ BAD: Using type
if type jq; then # Verbose output
echo "jq is installed"
fi
```
**Why:** `command -v` is POSIX-compliant, silent, and reliable.
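Because `command -v` is silent-friendly and returns a clean status, it composes into a small fail-fast helper; a sketch (the `require` name is ours):

```shell
#!/usr/bin/env bash
# Fail fast when required tools are missing
require() {
    local missing=0 tool
    for tool in "$@"; do
        if ! command -v "$tool" > /dev/null 2>&1; then
            echo "Missing required tool: $tool" >&2
            missing=1
        fi
    done
    return "$missing"
}
require sh printf && echo "all tools present"
```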
### Pattern: Command Substitution
```bash
# ✓ GOOD: Modern syntax with $()
result=$(command arg1 arg2)
timestamp=$(date +%s)
# ✗ BAD: Backticks (hard to nest)
result=`command arg1 arg2`
timestamp=`date +%s`
# ✓ GOOD: Nested substitution
result=$(echo "Outer: $(echo "Inner")")
# ✗ BAD: Nested backticks (requires escaping)
result=`echo "Outer: \`echo \"Inner\"\`"`
```
**Why:** `$()` is easier to read, nest, and maintain.
### Anti-Pattern: Useless Use of Cat
```bash
# ✗ BAD: UUOC (Useless Use of Cat)
cat file.txt | grep "pattern"
# ✓ GOOD: Direct input
grep "pattern" file.txt
# ✗ BAD: Multiple cats
cat file1 | grep pattern | cat | sort | cat
# ✓ GOOD: Direct pipeline
grep pattern file1 | sort
```
**Why:** Unnecessary `cat` wastes resources and adds extra processes.
### Anti-Pattern: Using ls in Scripts
```bash
# ✗ BAD: Parsing ls output
for file in $(ls *.txt); do
echo "$file"
done
# ✓ GOOD: Use globbing
for file in *.txt; do
[[ -f "$file" ]] || continue # Skip if no matches
echo "$file"
done
# ✗ BAD: Counting files with ls
count=$(ls -1 | wc -l)
# ✓ GOOD: Use array
files=(*)
count=${#files[@]}
```
**Why:** `ls` output is meant for humans, not scripts. Parsing it breaks with spaces, newlines, etc.
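The `[[ -f "$file" ]] || continue` guard above exists because an unmatched glob survives as the literal pattern. The `nullglob` option changes that behavior, as this sketch shows:

```shell
#!/usr/bin/env bash
# With nullglob set, an unmatched pattern expands to nothing
shopt -s nullglob
matches=(*.no-such-extension-xyz)
echo "nullglob count: ${#matches[@]}"   # 0
# Default behaviour: the literal pattern survives as one word
shopt -u nullglob
matches=(*.no-such-extension-xyz)
echo "default count: ${#matches[@]}"    # 1 (the pattern itself)
```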
---
## File Operations
### Pattern: Safe File Reading
```bash
# ✓ GOOD: Preserve leading/trailing whitespace and backslashes
while IFS= read -r line; do
echo "Line: $line"
done < file.txt
# ✗ BAD: Without IFS= (strips leading/trailing whitespace)
while read -r line; do
echo "Line: $line"
done < file.txt
# ✗ BAD: Without -r (interprets backslashes)
while IFS= read line; do
echo "Line: $line"
done < file.txt
```
**Why:** `IFS=` prevents trimming, `-r` prevents backslash interpretation.
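The effect of both flags is easy to see with a here-string containing padding and a literal backslash:

```shell
#!/usr/bin/env bash
# One input line with leading/trailing spaces and a literal backslash-n
input='  padded \n line  '
IFS= read -r keep <<< "$input"    # preserves everything
read mangled <<< "$input"         # trims whitespace, eats the backslash
echo "[$keep]"      # [  padded \n line  ]
echo "[$mangled]"   # [padded n line]
```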
### Pattern: Null-Delimited Files
```bash
# ✓ GOOD: For filenames with special characters
find . -name "*.txt" -print0 | while IFS= read -r -d '' file; do
echo "Processing: $file"
done
# Or with mapfile (bash 4.4+ for -d '')
mapfile -d '' -t files < <(find . -name "*.txt" -print0)
for file in "${files[@]}"; do
echo "Processing: $file"
done
# ✗ BAD: Newline-delimited (breaks with newlines in filenames)
find . -name "*.txt" | while IFS= read -r file; do
echo "Processing: $file"
done
```
**Why:** Filenames can contain any character except null and slash.
### Anti-Pattern: Testing File Existence Incorrectly
```bash
# ✗ BAD: Using ls to test existence
if ls file.txt &> /dev/null; then
echo "File exists"
fi
# ✓ GOOD: Use test operators
if [[ -f file.txt ]]; then
echo "File exists"
fi
# ✓ GOOD: Different tests
[[ -e path ]] # Exists (file or directory)
[[ -f file ]] # Regular file
[[ -d dir ]] # Directory
[[ -L link ]] # Symbolic link
[[ -r file ]] # Readable
[[ -w file ]] # Writable
[[ -x file ]] # Executable
```
**Why:** Test operators are the correct, efficient way to check file properties.
### Pattern: Temporary Files
```bash
# ✓ GOOD: Secure temporary file
temp_file=$(mktemp)
trap 'rm -f "$temp_file"' EXIT
# Use temp file
echo "data" > "$temp_file"
# ✗ BAD: Insecure temp file
temp_file="/tmp/myapp.$$"
echo "data" > "$temp_file"
# No cleanup!
# ✓ GOOD: Temporary directory
temp_dir=$(mktemp -d)
trap 'rm -rf "$temp_dir"' EXIT
```
**Why:** `mktemp` creates secure, unique files and prevents race conditions.
---
## String Processing
### Pattern: String Manipulation with Parameter Expansion
```bash
# ✓ GOOD: Use bash parameter expansion
filename="document.tar.gz"
basename="${filename%%.*}" # document
extension="${filename##*.}" # gz
name="${filename%.gz}" # document.tar
# ✗ BAD: Using external commands
basename=$(echo "$filename" | sed 's/\..*$//')
extension=$(echo "$filename" | awk -F. '{print $NF}')
```
**Why:** Parameter expansion is faster and doesn't spawn processes.
### Pattern: String Comparison
```bash
# ✓ GOOD: Use [[ ]] for strings
if [[ "$string1" == "$string2" ]]; then
echo "Equal"
fi
# ✓ GOOD: Pattern matching
if [[ "$filename" == *.txt ]]; then
echo "Text file"
fi
# ✓ GOOD: Regex matching
if [[ "$email" =~ ^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$ ]]; then
echo "Valid email"
fi
# ✗ BAD: Using grep for simple string check
if echo "$string" | grep -q "substring"; then
echo "Found"
fi
# ✓ GOOD: Use substring matching
if [[ "$string" == *"substring"* ]]; then
echo "Found"
fi
```
**Why:** `[[ ]]` is bash-native, faster, and more readable.
### Anti-Pattern: Word Splitting Issues
```bash
# ✗ BAD: Unquoted expansion with spaces
var="file1.txt file2.txt"
for file in $var; do # Splits on spaces!
echo "$file" # file1.txt, then file2.txt
done
# ✓ GOOD: Use array
files=("file1.txt" "file2.txt")
for file in "${files[@]}"; do
echo "$file"
done
# ✗ BAD: Word splitting in command arguments
file="my file.txt"
rm $file # Tries to remove "my" and "file.txt"!
# ✓ GOOD: Quote variables
rm "$file"
```
**Why:** Word splitting on spaces is a major source of bugs.
---
## Arrays and Loops
### Pattern: Array Declaration and Use
```bash
# ✓ GOOD: Array declaration
files=("file1.txt" "file2.txt" "file 3.txt")
# ✓ GOOD: Array expansion (each element quoted)
for file in "${files[@]}"; do
echo "$file"
done
# ✗ BAD: Unquoted array expansion
for file in ${files[@]}; do # Word splitting!
echo "$file"
done
# ✓ GOOD: Add to array
files+=("file4.txt")
# ✓ GOOD: Array length
echo "Count: ${#files[@]}"
# ✓ GOOD: Array indices
for i in "${!files[@]}"; do
echo "File $i: ${files[$i]}"
done
```
**Why:** Proper array handling prevents word splitting and globbing issues.
### Pattern: Reading Command Output into Array
```bash
# ✓ GOOD: mapfile/readarray (bash 4+)
mapfile -t lines < file.txt
# ✓ GOOD: With command substitution
mapfile -t files < <(find . -name "*.txt")
# ✗ BAD: Word splitting
files=($(find . -name "*.txt")) # Breaks with spaces in filenames!
# ✓ GOOD: Alternative (POSIX-compatible)
while IFS= read -r file; do
files+=("$file")
done < <(find . -name "*.txt")
```
**Why:** `mapfile` is efficient and handles special characters correctly.
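A small demonstration of why `mapfile` is safe where word splitting is not:

```shell
#!/usr/bin/env bash
# Each input line becomes exactly one element, spaces intact
mapfile -t lines < <(printf '%s\n' "one two" "three")
echo "elements: ${#lines[@]}"   # elements: 2
echo "first: ${lines[0]}"       # first: one two
# The word-splitting anti-pattern produces 3 elements instead
split=($(printf '%s\n' "one two" "three"))
echo "split: ${#split[@]}"      # split: 3
```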
### Anti-Pattern: C-Style For Loops for Arrays
```bash
# ✗ BAD: C-style loop for arrays
for ((i=0; i<${#files[@]}; i++)); do
echo "${files[$i]}"
done
# ✓ GOOD: For-in loop
for file in "${files[@]}"; do
echo "$file"
done
# ✓ ACCEPTABLE: When you need the index
for i in "${!files[@]}"; do
echo "Index $i: ${files[$i]}"
done
```
**Why:** For-in loops are simpler and less error-prone.
### Pattern: Loop over Range
```bash
# ✓ GOOD: Brace expansion
for i in {1..10}; do
echo "$i"
done
# ✓ GOOD: With variables (brace expansion cannot expand variables, so use seq)
start=1
end=10
for i in $(seq "$start" "$end"); do
    echo "$i"
done
# ✓ GOOD: C-style (arithmetic)
for ((i=1; i<=10; i++)); do
echo "$i"
done
# ✗ BAD: Using seq in a loop unnecessarily
for i in $(seq 1 1000000); do # Creates huge string in memory!
echo "$i"
done
# ✓ GOOD: Use C-style for large ranges
for ((i=1; i<=1000000; i++)); do
echo "$i"
done
```
**Why:** Choose the right loop construct based on the use case.
---
## Conditionals and Tests
### Pattern: File Tests
```bash
# ✓ GOOD: Use appropriate test
if [[ -f "$file" ]]; then # Regular file
if [[ -d "$dir" ]]; then # Directory
if [[ -e "$path" ]]; then # Exists (any type)
if [[ -L "$link" ]]; then # Symbolic link
if [[ -r "$file" ]]; then # Readable
if [[ -w "$file" ]]; then # Writable
if [[ -x "$file" ]]; then # Executable
if [[ -s "$file" ]]; then # Non-empty file
# ✗ BAD: Incorrect test
if [[ -e "$file" ]]; then # Exists, but could be directory!
cat "$file" # Fails if directory
fi
# ✓ GOOD: Specific test
if [[ -f "$file" ]]; then
cat "$file"
fi
```
**Why:** Use the most specific test for your use case.
### Pattern: Numeric Comparison
```bash
# ✓ GOOD: Arithmetic context
if (( num > 10 )); then
echo "Greater than 10"
fi
# ✓ GOOD: Test operator
if [[ $num -gt 10 ]]; then
echo "Greater than 10"
fi
# ✗ BAD: String comparison for numbers
if [[ "$num" > "10" ]]; then # Lexicographic comparison!
echo "Greater than 10" # "9" > "10" is true!
fi
```
**Why:** Use numeric comparison operators for numbers.
### Anti-Pattern: Testing Boolean Strings
```bash
# ✗ BAD: Comparing to string "true"
if [[ "$flag" == "true" ]]; then
do_something
fi
# ✓ GOOD: Use boolean variable directly
flag=false # or true
if $flag; then
do_something
fi
# ✓ BETTER: Use integers for flags
flag=0 # false
flag=1 # true
if (( flag )); then
do_something
fi
# ✓ GOOD: For command success/failure
if command; then
echo "Success"
fi
```
**Why:** Boolean strings are error-prone; use actual booleans or return codes.
### Pattern: Multiple Conditions
```bash
# ✓ GOOD: Logical operators
if [[ condition1 && condition2 ]]; then
echo "Both true"
fi
if [[ condition1 || condition2 ]]; then
echo "At least one true"
fi
if [[ ! condition ]]; then
echo "False"
fi
# ✗ BAD: Separate tests
if [ condition1 -a condition2 ]; then # Deprecated
echo "Both true"
fi
# ✗ BAD: Nested ifs for AND
if [[ condition1 ]]; then
if [[ condition2 ]]; then
echo "Both true"
fi
fi
```
**Why:** `&&` and `||` in `[[ ]]` are clearer and recommended.
---
## Functions
### Pattern: Function Return Values
```bash
# ✓ GOOD: Return status, output to stdout
get_value() {
local value="result"
if [[ -n "$value" ]]; then
echo "$value"
return 0
else
return 1
fi
}
# Usage
if result=$(get_value); then
echo "Got: $result"
else
echo "Failed"
fi
# ✗ BAD: Using return for data
get_value() {
return 42 # Can only return 0-255!
}
result=$? # Gets 42, but limited range
```
**Why:** `return` is for exit status (0-255), not data. Output to stdout for data.
### Pattern: Local Variables in Functions
```bash
# ✓ GOOD: Declare local variables
my_function() {
local arg="$1"
local result=""
result=$(process "$arg")
echo "$result"
}
# ✗ BAD: Global variables
my_function() {
arg="$1" # Pollutes global namespace!
result="" # Global variable!
result=$(process "$arg")
echo "$result"
}
```
**Why:** Local variables prevent unexpected side effects.
### Anti-Pattern: Capturing Local Command Failure
```bash
# ✗ BAD: Local declaration masks command failure
my_function() {
local result=$(command_that_fails) # $? is from 'local', not 'command'!
echo "$result"
}
# ✓ GOOD: Separate declaration and assignment
my_function() {
local result
result=$(command_that_fails) || return 1
echo "$result"
}
# ✓ GOOD: Check command separately
my_function() {
local result
if ! result=$(command_that_fails); then
return 1
fi
echo "$result"
}
```
**Why:** Combining `local` and command substitution hides command failure.
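The masking is easy to verify directly (function names are ours, for illustration):

```shell
#!/usr/bin/env bash
# Show how 'local var=$(cmd)' hides the command's exit status
masked() {
    local out=$(false)   # $? reflects 'local' (0), not 'false'
    echo "$?"
}
unmasked() {
    local out
    out=$(false)         # $? reflects 'false' (1)
    echo "$?"
}
echo "masked=$(masked) unmasked=$(unmasked)"   # masked=0 unmasked=1
```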
---
## Error Handling
### Pattern: Check Command Success
```bash
# ✓ GOOD: Direct check
if ! command; then
echo "Command failed" >&2
exit 1
fi
# ✓ GOOD: With logical operator
command || {
echo "Command failed" >&2
exit 1
}
# ✓ GOOD: Capture output and check
if ! output=$(command 2>&1); then
echo "Command failed: $output" >&2
exit 1
fi
# ✗ BAD: Not checking status
command # What if it fails?
next_command
```
**Why:** Always check if commands succeed unless failure is acceptable.
### Pattern: Error Messages to stderr
```bash
# ✓ GOOD: Errors to stderr
echo "Error: Invalid argument" >&2
# ✗ BAD: Errors to stdout
echo "Error: Invalid argument"
# ✓ GOOD: Error function
error() {
echo "ERROR: $*" >&2
}
error "Something went wrong"
```
**Why:** stderr is for errors, stdout is for data output.
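Keeping the two streams separate is what makes command substitution safe: captures see only the data. A sketch (the `emit` function is illustrative):

```shell
#!/usr/bin/env bash
# stdout carries data; stderr carries diagnostics - captures stay clean
emit() {
    echo "WARNING: running in demo mode" >&2   # diagnostic
    echo "42"                                  # data
}
value=$(emit 2>/dev/null)   # capture only stdout
echo "captured: $value"     # captured: 42
```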
### Pattern: Cleanup on Exit
```bash
# ✓ GOOD: Trap for cleanup
temp_file=$(mktemp)
cleanup() {
rm -f "$temp_file"
}
trap cleanup EXIT
# Do work with temp_file
# ✗ BAD: Manual cleanup (might not run)
temp_file=$(mktemp)
# Do work
rm -f "$temp_file" # Doesn't run if script exits early!
```
**Why:** Trap ensures cleanup runs on exit, even on errors.
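That guarantee is simple to check: run a child shell that exits early and confirm the trap still removed its file (a sketch):

```shell
#!/usr/bin/env bash
# Prove the EXIT trap fires even when the script aborts early
tmp=$(mktemp)
# Child shell installs the trap, then exits mid-script with status 7
bash -c "trap 'rm -f $tmp' EXIT; exit 7"
if [[ ! -e "$tmp" ]]; then
    echo "cleanup ran despite early exit"
fi
```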
### Anti-Pattern: Silencing Errors
```bash
# ✗ BAD: Silencing errors
command 2>/dev/null # What if it fails?
next_command
# ✓ GOOD: Check status even if silencing output
if ! command 2>/dev/null; then
echo "Command failed" >&2
exit 1
fi
# ✓ ACCEPTABLE: When failure is expected and acceptable
if command 2>/dev/null; then
echo "Command succeeded"
else
echo "Command failed (expected)"
fi
```
**Why:** Silencing errors without checking status leads to silent failures.
---
## Process Management
### Pattern: Background Jobs
```bash
# ✓ GOOD: Track background jobs
long_running_task &
pid=$!
# Wait for completion
if wait "$pid"; then
echo "Task completed successfully"
else
echo "Task failed" >&2
fi
# ✓ GOOD: Multiple background jobs
job1 &
pid1=$!
job2 &
pid2=$!
wait "$pid1" "$pid2"
```
**Why:** Proper job management prevents zombie processes.
### Pattern: Timeout for Commands
```bash
# ✓ GOOD: Use timeout command (if available)
if timeout 30 long_running_command; then
echo "Completed within timeout"
else
echo "Timed out or failed" >&2
fi
# ✓ GOOD: Manual timeout implementation
timeout_command() {
local timeout=$1
shift
"$@" &
local pid=$!
( sleep "$timeout"; kill "$pid" 2>/dev/null ) &
local killer=$!
if wait "$pid" 2>/dev/null; then
kill "$killer" 2>/dev/null
wait "$killer" 2>/dev/null
return 0
else
return 1
fi
}
timeout_command 30 long_running_command
```
**Why:** Prevents scripts from hanging indefinitely.
### Anti-Pattern: Killing Processes Unsafely
```bash
# ✗ BAD: kill -9 immediately
kill -9 "$pid"
# ✓ GOOD: Graceful shutdown first
kill -TERM "$pid"
sleep 2
if kill -0 "$pid" 2>/dev/null; then
echo "Process still running, forcing..." >&2
kill -KILL "$pid"
fi
# ✓ GOOD: With timeout
graceful_kill() {
local pid=$1
local timeout=${2:-10}
kill -TERM "$pid" 2>/dev/null || return 0
for ((i=0; i<timeout; i++)); do
if ! kill -0 "$pid" 2>/dev/null; then
return 0
fi
sleep 1
done
echo "Forcing kill of $pid" >&2
kill -KILL "$pid" 2>/dev/null
}
```
**Why:** SIGTERM allows graceful shutdown; SIGKILL should be last resort.
---
## Security Patterns
### Pattern: Input Validation
```bash
# ✓ GOOD: Whitelist validation
validate_action() {
local action=$1
case "$action" in
start|stop|restart|status)
return 0
;;
*)
echo "Error: Invalid action: $action" >&2
return 1
;;
esac
}
# ✗ BAD: No validation
action="$1"
systemctl "$action" myservice # User can pass arbitrary commands!
# ✓ GOOD: Validate first
if validate_action "$1"; then
systemctl "$1" myservice
else
exit 1
fi
```
**Why:** Whitelist validation prevents command injection.
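A self-contained version of the whitelist check, showing that an injection attempt is rejected before it ever reaches `systemctl`:

```shell
#!/usr/bin/env bash
# Mirrors the validate_action sketch above, runnable on its own
validate_action() {
    case "$1" in
        start|stop|restart|status) return 0 ;;
        *) return 1 ;;
    esac
}
validate_action "status" && echo "ok"
validate_action "status; rm -rf /" || echo "rejected injection attempt"
```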
### Pattern: Avoid eval
```bash
# ✗ BAD: eval with user input
eval "$user_command" # DANGEROUS!
# ✓ GOOD: Use arrays
command_args=("$arg1" "$arg2" "$arg3")
command "${command_args[@]}"
# ✗ BAD: Dynamic variable names
eval "var_$name=value"
# ✓ GOOD: Associative arrays (bash 4+)
declare -A vars
vars[$name]="value"
```
**Why:** `eval` with user input is a security vulnerability.
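Both eval-free alternatives in one runnable sketch; `printf -v` (bash 3.1+) writes directly to a named variable:

```shell
#!/usr/bin/env bash
# Dynamic names without eval: printf -v, or an associative array
name="port"
printf -v "cfg_$name" '%s' "8080"   # sets cfg_port safely
echo "$cfg_port"                    # 8080
declare -A vars
vars[$name]="8080"
echo "${vars[port]}"                # 8080
```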
### Pattern: Safe PATH
```bash
# ✓ GOOD: Set explicit PATH
export PATH="/usr/local/bin:/usr/bin:/bin"
# ✓ GOOD: Use absolute paths for critical commands
/usr/bin/rm -rf "$directory"
# ✗ BAD: Trusting user's PATH
rm -rf "$directory" # What if there's a malicious 'rm' in PATH?
```
**Why:** Prevents PATH injection attacks.
---
## Summary
**Most Critical Patterns:**
1. Always quote variable expansions: `"$var"`
2. Use `set -euo pipefail` for safety
3. Prefer `[[ ]]` over `[ ]` in bash
4. Use arrays for lists: `"${array[@]}"`
5. Check command success: `if ! command; then`
6. Use local variables in functions
7. Errors to stderr: `echo "Error" >&2`
8. Use `mktemp` for temporary files
9. Cleanup with traps: `trap cleanup EXIT`
10. Validate all user input
**Most Dangerous Anti-Patterns:**
1. Unquoted variables: `$var`
2. Parsing `ls` output
3. Using `eval` with user input
4. Silencing errors without checking
5. Not using `set -u` or defaults
6. Global variables in functions
7. Word splitting on filenames
8. Testing strings with `>` for numbers
9. `kill -9` without trying graceful shutdown
10. Trusting user PATH
Following these patterns and avoiding anti-patterns will result in robust, secure, and maintainable bash scripts.

# Bash Scripting Resources
Comprehensive directory of authoritative sources, tools, and learning resources for bash scripting.
---
## Table of Contents
1. [Official Documentation](#official-documentation)
2. [Style Guides and Standards](#style-guides-and-standards)
3. [Tools and Utilities](#tools-and-utilities)
4. [Learning Resources](#learning-resources)
5. [Community Resources](#community-resources)
6. [Books](#books)
7. [Cheat Sheets and Quick References](#cheat-sheets-and-quick-references)
8. [Testing and Quality](#testing-and-quality)
9. [Platform-Specific Resources](#platform-specific-resources)
10. [Advanced Topics](#advanced-topics)
---
## Official Documentation
### Bash Manual
**GNU Bash Reference Manual**
- **URL:** https://www.gnu.org/software/bash/manual/
- **Description:** The authoritative reference for bash features, syntax, and built-ins
- **Use for:** Detailed feature documentation, syntax clarification, version-specific features
**Bash Man Page**
```bash
man bash # Complete bash documentation
man bash-builtins # Built-in commands
```
- **Use for:** Quick reference on local system, offline documentation
### POSIX Standards
**POSIX Shell Command Language**
- **URL:** https://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3_chap02.html
- **Description:** IEEE/Open Group specification for portable shell scripting
- **Use for:** Writing portable scripts, understanding sh vs bash differences
**POSIX Utilities**
- **URL:** https://pubs.opengroup.org/onlinepubs/9699919799/idx/utilities.html
- **Description:** Standard utilities available in POSIX-compliant systems
- **Use for:** Portable command usage, cross-platform compatibility
### Command Documentation
**GNU Coreutils Manual**
- **URL:** https://www.gnu.org/software/coreutils/manual/
- **Description:** Documentation for core GNU utilities (ls, cat, grep, etc.)
- **Use for:** Understanding Linux command behavior, GNU-specific features
**Man Pages Online**
- **URL:** https://man7.org/linux/man-pages/
- **URL:** https://www.freebsd.org/cgi/man.cgi (BSD/macOS)
- **Description:** Online searchable man pages
- **Use for:** Quick online reference, comparing Linux vs BSD commands
---
## Style Guides and Standards
### Google Shell Style Guide
**URL:** https://google.github.io/styleguide/shellguide.html
**Key Points:**
- Industry-standard practices from Google
- Covers naming conventions, formatting, best practices
- When to use shell vs other languages
- Safety and portability guidelines
**Use for:** Professional code style, team standards, code reviews
### Defensive Bash Programming
**URL:** https://kfirlavi.herokuapp.com/blog/2012/11/14/defensive-bash-programming
**Key Points:**
- Writing robust bash scripts
- Error handling patterns
- Safe coding practices
- Code organization
**Use for:** Improving script reliability, avoiding common pitfalls
### Shell Style Guide (GitHub)
**URL:** https://github.com/bahamas10/bash-style-guide
**Key Points:**
- Community-driven style guidelines
- Practical examples
- Modern bash features
**Use for:** Alternative perspectives on style, community standards
---
## Tools and Utilities
### ShellCheck
**Website:** https://www.shellcheck.net/
**GitHub:** https://github.com/koalaman/shellcheck
**Online Tool:** https://www.shellcheck.net/ (paste code for instant feedback)
**Description:** Static analysis tool for shell scripts
**Installation:**
```bash
# Ubuntu/Debian
apt-get install shellcheck
# macOS
brew install shellcheck
# Windows (Scoop)
scoop install shellcheck
# Via Docker
docker run --rm -v "$PWD:/mnt" koalaman/shellcheck script.sh
```
**Usage:**
```bash
shellcheck script.sh # Check script
shellcheck -x script.sh # Follow source statements
shellcheck -f json script.sh # JSON output
shellcheck -e SC2086 script.sh # Exclude specific warnings
```
**ShellCheck Wiki:** https://www.shellcheck.net/wiki/
- Detailed explanations of every warning
- **Use for:** Understanding and fixing ShellCheck warnings
### shfmt
**GitHub:** https://github.com/mvdan/sh
**Description:** Shell script formatter
**Installation:**
```bash
# macOS
brew install shfmt
# Go
go install mvdan.cc/sh/v3/cmd/shfmt@latest
```
**Usage:**
```bash
shfmt -i 4 -w script.sh # Format with 4-space indent
shfmt -d script.sh # Show diff without modifying
shfmt -l script.sh # List files that would be changed
```
**Use for:** Consistent code formatting, automated formatting in CI
### BATS (Bash Automated Testing System)
**GitHub:** https://github.com/bats-core/bats-core
**Description:** Testing framework for bash scripts
**Installation:**
```bash
git clone https://github.com/bats-core/bats-core.git
cd bats-core
./install.sh /usr/local
```
**Usage:**
```bash
bats test/ # Run all tests
bats test/script.bats # Run specific test file
bats --tap test/ # TAP output format
```
**Documentation:** https://bats-core.readthedocs.io/
**Use for:** Unit testing bash scripts, CI/CD integration
### bashate
**GitHub:** https://github.com/openstack/bashate
**Description:** Style checker (used by OpenStack)
**Installation:**
```bash
pip install bashate
```
**Usage:**
```bash
bashate script.sh
bashate -i E006 script.sh # Ignore specific errors
```
**Use for:** Additional style checking beyond ShellCheck
### checkbashisms
**Package:** devscripts (Debian)
**Description:** Checks for bashisms in sh scripts
**Installation:**
```bash
apt-get install devscripts # Ubuntu/Debian
```
**Usage:**
```bash
checkbashisms script.sh
checkbashisms -f script.sh # Force check even if #!/bin/bash
```
**Use for:** Ensuring POSIX compliance, portable scripts
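A sketch of the kind of construct checkbashisms flags: `[[ $x == y* ]]` pattern matching is a bashism, while `case` is the portable equivalent:

```sh
#!/bin/sh
# Portable pattern match: case works in any POSIX shell,
# unlike the bash-only [[ $x == y* ]]
x="yellow"
case $x in
    (y*) echo "match" ;;
    (*)  echo "no match" ;;
esac
```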
---
## Learning Resources
### Interactive Tutorials
**Bash Academy**
- **URL:** https://www.bash.academy/
- **Description:** Modern, comprehensive bash tutorial
- **Topics:** Basics, scripting, advanced features
- **Use for:** Learning bash from scratch, structured learning path
**Learn Shell**
- **URL:** https://www.learnshell.org/
- **Description:** Interactive bash tutorial with exercises
- **Use for:** Hands-on practice, beginners
**Bash Scripting Tutorial**
- **URL:** https://linuxconfig.org/bash-scripting-tutorial
- **Description:** Comprehensive tutorial series
- **Use for:** Step-by-step learning, examples
### Guides and Documentation
**Bash Guide for Beginners**
- **URL:** https://tldp.org/LDP/Bash-Beginners-Guide/html/
- **Author:** The Linux Documentation Project
- **Description:** Comprehensive guide covering basics to intermediate
- **Use for:** Structured learning, reference material
**Advanced Bash-Scripting Guide**
- **URL:** https://tldp.org/LDP/abs/html/
- **Description:** In-depth coverage of advanced bash topics
- **Topics:** Complex scripting, text processing, system administration
- **Use for:** Advanced techniques, real-world examples
**Bash Hackers Wiki**
- **URL:** https://wiki.bash-hackers.org/
- **Alternative:** https://flokoe.github.io/bash-hackers-wiki/ (maintained mirror)
- **Description:** Community-driven bash documentation
- **Use for:** In-depth explanations, advanced topics, edge cases
**Greg's Wiki (Wooledge)**
- **URL:** https://mywiki.wooledge.org/
- **Key Pages:**
- https://mywiki.wooledge.org/BashFAQ
- https://mywiki.wooledge.org/BashPitfalls
- https://mywiki.wooledge.org/BashGuide
- **Description:** High-quality bash Q&A and guides
- **Use for:** Common questions, avoiding pitfalls, best practices
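One classic pitfall from that list in miniature: `[ $var = value ]` breaks when the variable is empty, because the test sees a missing operand:

```bash
#!/usr/bin/env bash
# Pitfall: [ $var = "x" ] with empty var expands to [ = "x" ]
# and fails with "unary operator expected"; quoting fixes it.
var=""
if [ "$var" = "x" ]; then
    echo "equal"
else
    echo "not equal"
fi
```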
### Video Courses
**Bash Scripting on Linux (Udemy)**
- **Description:** Comprehensive video course
- **Use for:** Visual learners
**Shell Scripting: Discover How to Automate Command Line Tasks (Udemy)**
- **Description:** Practical shell scripting course
- **Use for:** Automation-focused learning
**LinkedIn Learning - Learning Bash Scripting**
- **Description:** Professional development course
- **Use for:** Structured corporate training
---
## Community Resources
### Stack Overflow
**Bash Tag**
- **URL:** https://stackoverflow.com/questions/tagged/bash
- **Use for:** Specific problems, code review, troubleshooting
**Top Questions:**
- **URL:** https://stackoverflow.com/questions/tagged/bash?tab=Votes
- **Use for:** Common problems and solutions
### Unix & Linux Stack Exchange
**URL:** https://unix.stackexchange.com/
**Shell Tag:** https://unix.stackexchange.com/questions/tagged/shell
**Bash Tag:** https://unix.stackexchange.com/questions/tagged/bash
**Use for:** Unix/Linux-specific questions, system administration
### Reddit
**/r/bash**
- **URL:** https://www.reddit.com/r/bash/
- **Description:** Bash scripting community
- **Use for:** Discussions, learning resources, help
**/r/commandline**
- **URL:** https://www.reddit.com/r/commandline/
- **Description:** Command-line interface community
- **Use for:** CLI tips, tools, productivity
### IRC/Chat
**Libera.Chat #bash**
- **URL:** irc://irc.libera.chat/bash
- **Description:** Primary real-time bash help channel (the community moved from Freenode to Libera.Chat in 2021)
- **Use for:** Live help, quick questions
---
## Books
### "Classic Shell Scripting" by Arnold Robbins & Nelson Beebe
**Publisher:** O'Reilly
**ISBN:** 978-0596005955
**Topics:**
- Shell basics and portability
- Text processing and filters
- Shell programming patterns
**Use for:** Comprehensive reference, professional development
### "Learning the bash Shell" by Cameron Newham
**Publisher:** O'Reilly
**ISBN:** 978-0596009656
**Topics:**
- Bash basics
- Command-line editing
- Shell programming
**Use for:** Systematic learning, reference
### "Bash Cookbook" by Carl Albing & JP Vossen
**Publisher:** O'Reilly
**ISBN:** 978-1491975336
**Topics:**
- Solutions to common problems
- Recipes and patterns
- Real-world examples
**Use for:** Problem-solving, practical examples
### "Wicked Cool Shell Scripts" by Dave Taylor & Brandon Perry
**Publisher:** No Starch Press
**ISBN:** 978-1593276027
**Topics:**
- Creative shell scripting
- System administration
- Fun and practical scripts
**Use for:** Inspiration, practical applications
### "The Linux Command Line" by William Shotts
**Publisher:** No Starch Press
**ISBN:** 978-1593279523
**Free PDF:** https://linuxcommand.org/tlcl.php
**Topics:**
- Command-line basics
- Shell scripting fundamentals
- Linux system administration
**Use for:** Beginners, comprehensive introduction
---
## Cheat Sheets and Quick References
### Bash Cheat Sheet (DevHints)
**URL:** https://devhints.io/bash
**Content:**
- Quick syntax reference
- Common patterns
- Parameter expansion
- Conditionals and loops
**Use for:** Quick lookups, syntax reminders
### Bash Scripting Cheat Sheet (GitHub)
**URL:** https://github.com/LeCoupa/awesome-cheatsheets/blob/master/languages/bash.sh
**Content:**
- Comprehensive syntax guide
- Examples and explanations
- Best practices
**Use for:** Single-file reference
### explainshell.com
**URL:** https://explainshell.com/
**Description:** Interactive tool that explains shell commands
**Example:** Paste `tar -xzvf file.tar.gz` to get detailed explanation of each flag
**Use for:** Understanding complex commands, learning command options
### Command Line Fu
**URL:** https://www.commandlinefu.com/
**Description:** Community-contributed command-line snippets
**Use for:** One-liners, clever solutions, learning new commands
### tldr Pages
**URL:** https://tldr.sh/
**GitHub:** https://github.com/tldr-pages/tldr
**Description:** Simplified man pages with examples
**Installation:**
```bash
npm install -g tldr
# Or
brew install tldr
```
**Usage:**
```bash
tldr tar
tldr grep
tldr find
```
**Use for:** Quick command examples, practical usage
---
## Testing and Quality
### Testing Frameworks
**BATS (Bash Automated Testing System)**
- **URL:** https://github.com/bats-core/bats-core
- **Documentation:** https://bats-core.readthedocs.io/
- **Use for:** Unit testing
**shUnit2**
- **URL:** https://github.com/kward/shunit2
- **Description:** xUnit-based unit testing framework
- **Use for:** Alternative to BATS
**Bash Unit**
- **URL:** https://github.com/pgrange/bash_unit
- **Description:** Bash unit testing
- **Use for:** Lightweight testing
### CI/CD Integration
**GitHub Actions Example**
```yaml
name: Test
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Install ShellCheck
run: sudo apt-get install -y shellcheck
- name: Run ShellCheck
run: find . -name "*.sh" -exec shellcheck {} +
- name: Install BATS
run: |
git clone https://github.com/bats-core/bats-core.git
cd bats-core
sudo ./install.sh /usr/local
- name: Run Tests
run: bats test/
```
**GitLab CI Example**
```yaml
test:
image: koalaman/shellcheck-alpine
script:
- find . -name "*.sh" -exec shellcheck {} +
bats:
image: bats/bats
script:
- bats test/
```
### Code Coverage
**bashcov**
- **URL:** https://github.com/infertux/bashcov
- **Description:** Code coverage for bash
- **Installation:** `gem install bashcov`
- **Use for:** Measuring test coverage
---
## Platform-Specific Resources
### Linux
**Linux Man Pages**
- **URL:** https://man7.org/linux/man-pages/
- **Use for:** Linux-specific command documentation
**systemd Documentation**
- **URL:** https://www.freedesktop.org/software/systemd/man/
- **Use for:** systemd service management
### macOS
**macOS Man Pages**
- **URL:** https://www.freebsd.org/cgi/man.cgi
- **Description:** BSD-based commands (similar to macOS)
- **Use for:** macOS command differences
**Homebrew**
- **URL:** https://brew.sh/
- **Use for:** Installing GNU tools on macOS
### Windows
**Git for Windows**
- **URL:** https://gitforwindows.org/
- **Documentation:** https://github.com/git-for-windows/git/wiki
- **Use for:** Git Bash on Windows
**WSL Documentation**
- **URL:** https://docs.microsoft.com/en-us/windows/wsl/
- **Use for:** Windows Subsystem for Linux
**Cygwin**
- **URL:** https://www.cygwin.com/
- **Use for:** POSIX environment on Windows
### Containers
**Docker Bash Best Practices**
- **URL:** https://docs.docker.com/develop/develop-images/dockerfile_best-practices/
- **Use for:** Bash in containers
**Container Best Practices**
- **URL:** https://cloud.google.com/architecture/best-practices-for-building-containers
- **Use for:** Production container scripts
---
## Advanced Topics
### Process Substitution
**Greg's Wiki:**
- **URL:** https://mywiki.wooledge.org/ProcessSubstitution
- **Use for:** Understanding `<()` syntax
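A quick taste of the syntax: process substitution lets two command outputs be compared without temporary files.

```bash
#!/usr/bin/env bash
# <(cmd) exposes cmd's output under a file-like name;
# comm -12 prints only lines common to both (inputs must be sorted)
comm -12 <(printf 'alpha\nbeta\n') <(printf 'beta\ngamma\n')
# beta
```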
### Parameter Expansion
**Bash Hackers Wiki:**
- **URL:** https://wiki.bash-hackers.org/syntax/pe
- **Use for:** Complete parameter expansion reference
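Before reaching for the full reference, a small sampler of the expansions used most often:

```bash
#!/usr/bin/env bash
path="/var/log/app/error.log"

echo "${path##*/}"        # error.log      (strip longest prefix: basename)
echo "${path%/*}"         # /var/log/app   (strip shortest suffix: dirname)
echo "${path%.log}.txt"   # /var/log/app/error.txt (swap the extension)

unset missing
echo "${missing:-fallback}"  # fallback (default when unset or empty)
```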
### Regular Expressions
**Bash Regex:**
- **URL:** https://mywiki.wooledge.org/RegularExpression
- **Use for:** Regex in bash `[[ =~ ]]`
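In brief, `[[ string =~ ERE ]]` matches a POSIX extended regular expression and fills `BASH_REMATCH` with the captures:

```bash
#!/usr/bin/env bash
ver="release-2.14.3"
# Keep the pattern unquoted (or in a variable); quoting it forces
# a literal string comparison instead of a regex match
if [[ "$ver" =~ ^release-([0-9]+)\.([0-9]+)\.([0-9]+)$ ]]; then
    echo "major=${BASH_REMATCH[1]} minor=${BASH_REMATCH[2]} patch=${BASH_REMATCH[3]}"
fi
# major=2 minor=14 patch=3
```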
**PCRE vs POSIX:**
- **URL:** https://www.regular-expressions.info/posix.html
- **Use for:** Understanding regex flavors
### Parallel Processing
**GNU Parallel:**
- **URL:** https://www.gnu.org/software/parallel/
- **Tutorial:** https://www.gnu.org/software/parallel/parallel_tutorial.html
- **Use for:** Parallel command execution
### Job Control
**Bash Job Control:**
- **URL:** https://www.gnu.org/software/bash/manual/html_node/Job-Control.html
- **Use for:** Background jobs, job management
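The core of scripted job control is `&`, `$!`, and `wait` — a minimal sketch:

```bash
#!/usr/bin/env bash
# Start two background jobs, record their PIDs, and wait for each
sleep 0.2 & pid1=$!
sleep 0.1 & pid2=$!

wait "$pid1" && echo "job 1 done"
wait "$pid2" && echo "job 2 done"
# wait with no arguments would block until all children exit
```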
---
## Troubleshooting Resources
### Debugging Tools
**bashdb**
- **URL:** http://bashdb.sourceforge.net/
- **Description:** Bash debugger
- **Use for:** Step-by-step debugging
**xtrace**
```bash
set -x # Enable
set +x # Disable
```
- **Use for:** Trace command execution
**PS4 for Better Trace Output**
```bash
export PS4='+(${BASH_SOURCE}:${LINENO}): ${FUNCNAME[0]:+${FUNCNAME[0]}(): }'
set -x
```
### Common Issues
**Bash Pitfalls**
- **URL:** https://mywiki.wooledge.org/BashPitfalls
- **Description:** 50+ common mistakes in bash
- **Use for:** Avoiding and fixing common errors
**Bash FAQ**
- **URL:** https://mywiki.wooledge.org/BashFAQ
- **Description:** Frequently asked questions
- **Use for:** Quick answers to common questions
---
## Summary: Where to Find Information
| Question Type | Resource |
|---------------|----------|
| Syntax reference | Bash Manual, DevHints cheat sheet |
| Best practices | Google Shell Style Guide, ShellCheck |
| Portable scripting | POSIX specification, checkbashisms |
| Quick examples | tldr, explainshell.com |
| Common mistakes | Bash Pitfalls, ShellCheck Wiki |
| Advanced topics | Bash Hackers Wiki, Greg's Wiki |
| Testing | BATS documentation |
| Platform differences | Platform-specific docs, Stack Overflow |
| Troubleshooting | Stack Overflow, Unix & Linux SE |
| Learning path | Bash Academy, TLDP guides |
---
## Quick Resource Lookup
**When writing a new script:**
1. Start with template from Google Style Guide
2. Use ShellCheck while developing
3. Reference Bash Manual for specific features
4. Check Bash Pitfalls for common mistakes
**When debugging:**
1. Use `set -x` for tracing
2. Check ShellCheck warnings
3. Search Bash Pitfalls
4. Search Stack Overflow for specific error
**When learning:**
1. Start with Bash Academy or TLDP
2. Use explainshell.com for commands
3. Read Greg's Wiki for in-depth topics
4. Practice with BATS tests
**When ensuring quality:**
1. Run ShellCheck
2. Format with shfmt
3. Write BATS tests
4. Review against Google Style Guide
These resources provide authoritative, up-to-date information for all aspects of bash scripting.

# Windows Git Bash / MINGW Path Conversion & Shell Detection
**CRITICAL KNOWLEDGE FOR BASH SCRIPTING ON WINDOWS**
This reference provides comprehensive guidance for handling path conversion and shell detection in Git Bash/MINGW/MSYS2 environments on Windows - essential knowledge for cross-platform bash scripting.
---
## Table of Contents
1. [Path Conversion in Git Bash/MINGW](#path-conversion-in-git-bashmingw)
2. [Shell Detection Methods](#shell-detection-methods)
3. [Claude Code Specific Issues](#claude-code-specific-issues)
4. [Practical Solutions](#practical-solutions)
5. [Best Practices](#best-practices)
---
## Path Conversion in Git Bash/MINGW
### Automatic Conversion Behavior
Git Bash/MINGW automatically converts Unix-style paths to Windows paths when passing arguments to native Windows programs. Understanding this behavior is critical for writing portable scripts.
**Conversion Rules:**
```bash
# Unix → Windows path conversion (the prefix is the environment's install root)
/foo       → C:/Program Files/Git/usr/foo   (Git for Windows)

# Path lists (colon-separated → semicolon-separated)
/foo:/bar  → C:\msys64\foo;C:\msys64\bar    (MSYS2)

# Arguments with paths
--dir=/foo → --dir=C:/msys64/foo
```
### What Triggers Conversion
Automatic path conversion is triggered by:
```bash
# ✓ Leading forward slash (/) in arguments
command /c/Users/username/file.txt
# ✓ Colon-separated path lists
export PATH=/usr/bin:/usr/local/bin
# ✓ Arguments after - or , with path components
command --path=/tmp/data
```
### What's Exempt from Conversion
These patterns do NOT trigger automatic conversion:
```bash
# ✓ Arguments containing = (variable assignments)
VAR=/path/to/something
# ✓ Drive specifiers (C:)
C:/Windows/System32
# ✓ Arguments with ; (already Windows format)
PATH=C:\foo;C:\bar
# ✓ Arguments starting with // (Windows switches or UNC paths)
//server/share
command //e //s # Command-line switches
```
### Control Environment Variables
**MSYS_NO_PATHCONV** (Git for Windows only):
```bash
# Disable ALL path conversion
export MSYS_NO_PATHCONV=1
command /path/to/file
# Per-command usage (recommended)
MSYS_NO_PATHCONV=1 command /path/to/file
# Value doesn't matter, just needs to be defined
MSYS_NO_PATHCONV=0 # Still disables conversion
```
**MSYS2_ARG_CONV_EXCL** (MSYS2 only):
```bash
# Exclude everything
export MSYS2_ARG_CONV_EXCL="*"
# Exclude specific prefixes
export MSYS2_ARG_CONV_EXCL="--dir=;/test"
# Multiple patterns (semicolon-separated)
export MSYS2_ARG_CONV_EXCL="--path=;--config=;/tmp"
```
**MSYS2_ENV_CONV_EXCL**:
```bash
# Prevents environment variable conversion
# Same syntax as MSYS2_ARG_CONV_EXCL
export MSYS2_ENV_CONV_EXCL="MY_PATH;CONFIG_DIR"
```
### Manual Conversion with cygpath
The `cygpath` utility provides precise control over path conversion:
```bash
# Convert Windows → Unix format
unix_path=$(cygpath -u "C:\Users\username\file.txt")
# Result: /c/Users/username/file.txt
# Convert Unix → Windows format
windows_path=$(cygpath -w "/c/Users/username/file.txt")
# Result: C:\Users\username\file.txt
# Convert to mixed format (forward slashes, Windows drive)
mixed_path=$(cygpath -m "/c/Users/username/file.txt")
# Result: C:/Users/username/file.txt
# Convert absolute path
absolute_path=$(cygpath -a "relative/path")
# Convert multiple paths
cygpath -u "C:\path1" "C:\path2"
```
**Practical cygpath usage:**
```bash
#!/usr/bin/env bash
# Cross-platform path handling
get_native_path() {
local path="$1"
# Check if running on Windows (Git Bash/MINGW)
if [[ "$OSTYPE" == "msys" ]] || [[ "$OSTYPE" == "mingw"* ]]; then
# Convert to Windows format for native programs
cygpath -w "$path"
else
# Already Unix format on Linux/macOS
echo "$path"
fi
}
# Usage
native_path=$(get_native_path "/c/Users/data")
windows_program.exe "$native_path"
```
### Common Workarounds
When automatic conversion causes issues:
**1. Use double slashes:**
```bash
# Problem: /e gets converted to C:/Program Files/Git/e
command /e /s
# Solution: Use double slashes
command //e //s # Treated as switches, not paths
```
**2. Use dash notation:**
```bash
# Problem: /e flag converted to path
command /e /s
# Solution: Use dash notation
command -e -s
```
**3. Set MSYS_NO_PATHCONV temporarily:**
```bash
# Disable conversion for single command
MSYS_NO_PATHCONV=1 command /path/with/special/chars
# Or export for script section
export MSYS_NO_PATHCONV=1
command1 /path1
command2 /path2
unset MSYS_NO_PATHCONV
```
**4. Quote paths with spaces:**
```bash
# Always quote paths with spaces
command "/c/Program Files/App/file.txt"
# Or escape spaces
command /c/Program\ Files/App/file.txt
```
---
## Shell Detection Methods
### Method 1: $OSTYPE (Fastest, Bash-Only)
Best for: Quick platform detection in bash scripts
```bash
#!/usr/bin/env bash
case "$OSTYPE" in
linux-gnu*)
echo "Linux"
;;
darwin*)
echo "macOS"
;;
cygwin*)
echo "Cygwin"
;;
msys*)
echo "MSYS/Git Bash/MinGW"
# Most common in Git for Windows
;;
win*)
echo "Windows (native)"
;;
*)
echo "Unknown: $OSTYPE"
;;
esac
```
**Advantages:**
- Fast (shell variable, no external command)
- Reliable for bash
- No forking required
**Disadvantages:**
- Bash-specific (not available in POSIX sh)
- Less detailed than uname
### Method 2: uname -s (Most Portable)
Best for: Maximum portability and detailed information
```bash
#!/bin/sh
# Works in any POSIX shell
case "$(uname -s)" in
Darwin*)
echo "macOS"
;;
Linux*)
# Check for WSL
if grep -qi microsoft /proc/version 2>/dev/null; then
echo "Windows Subsystem for Linux (WSL)"
else
echo "Linux (native)"
fi
;;
CYGWIN*)
echo "Cygwin"
;;
MINGW64*)
echo "Git Bash 64-bit / MINGW64"
;;
MINGW32*)
echo "Git Bash 32-bit / MINGW32"
;;
MSYS_NT*)
echo "MSYS"
;;
*)
echo "Unknown: $(uname -s)"
;;
esac
```
**Common uname -s outputs:**
| Output | Platform | Description |
|--------|----------|-------------|
| `Darwin` | macOS | All macOS versions |
| `Linux` | Linux/WSL | Check `/proc/version` for WSL |
| `MINGW64_NT-10.0-*` | Git Bash | Git for Windows (64-bit) |
| `MINGW32_NT-10.0-*` | Git Bash | Git for Windows (32-bit) |
| `CYGWIN_NT-*` | Cygwin | Cygwin environment |
| `MSYS_NT-*` | MSYS | MSYS environment |
**Advantages:**
- Works in any POSIX shell
- Detailed system information
- Standard on all Unix-like systems
**Disadvantages:**
- Requires forking (slower than $OSTYPE)
- Output format varies by OS version
### Method 3: $MSYSTEM (MSYS2/Git Bash Specific)
Best for: Detecting MINGW subsystem type
```bash
#!/usr/bin/env bash
case "$MSYSTEM" in
MINGW64)
echo "Native Windows 64-bit environment"
# Build native Windows 64-bit applications
;;
MINGW32)
echo "Native Windows 32-bit environment"
# Build native Windows 32-bit applications
;;
MSYS)
echo "POSIX-compliant environment"
# Build POSIX applications (depend on msys-2.0.dll)
;;
"")
echo "Not running in MSYS2/Git Bash"
;;
*)
echo "Unknown MSYSTEM: $MSYSTEM"
;;
esac
```
**MSYSTEM Values:**
| Value | Purpose | Path Conversion | Libraries |
|-------|---------|-----------------|-----------|
| `MINGW64` | Native Windows 64-bit | Automatic | Windows native (mingw-w64) |
| `MINGW32` | Native Windows 32-bit | Automatic | Windows native (mingw) |
| `MSYS` | POSIX environment | Minimal | POSIX (msys-2.0.dll) |
**WARNING:** Never set `$MSYSTEM` manually outside of MSYS2/Git Bash shells! It's automatically set by the environment and changing it can break the system.
**Advantages:**
- Precise subsystem detection
- Important for build systems
- Fast (environment variable)
**Disadvantages:**
- Only available in MSYS2/Git Bash
- Not set on other platforms
### Comprehensive Detection Function
Combine all methods for robust detection:
```bash
#!/usr/bin/env bash
detect_platform() {
    local platform=""
    local details=""

    # Check MSYSTEM first (most specific for Git Bash)
    if [[ -n "${MSYSTEM:-}" ]]; then
        platform="gitbash"
        details="$MSYSTEM"
        echo "platform=$platform subsystem=$MSYSTEM"
        return 0
    fi

    # Check OSTYPE
    case "$OSTYPE" in
        linux-gnu*)
            # Distinguish WSL from native Linux
            if grep -qi microsoft /proc/version 2>/dev/null; then
                platform="wsl"
                if [[ -n "${WSL_DISTRO_NAME:-}" ]]; then
                    details="$WSL_DISTRO_NAME"
                fi
            else
                platform="linux"
            fi
            ;;
        darwin*)
            platform="macos"
            ;;
        msys*|mingw*)
            platform="gitbash"
            ;;
        cygwin*)
            platform="cygwin"
            ;;
        *)
            # Fallback to uname
            case "$(uname -s 2>/dev/null)" in
                MINGW*|MSYS*)
                    platform="gitbash"
                    ;;
                CYGWIN*)
                    platform="cygwin"
                    ;;
                *)
                    platform="unknown"
                    ;;
            esac
            ;;
    esac

    echo "platform=$platform${details:+ details=$details}"
}
# Usage
platform_info=$(detect_platform)
echo "$platform_info"
```
---
## Claude Code Specific Issues
### Issue #2602: Snapshot Path Conversion Failure
**Problem:**
```
/usr/bin/bash: line 1: C:UsersDavid...No such file
```
**Root Cause:**
- Node.js `os.tmpdir()` returns Windows paths (e.g., `C:\Users\...`)
- Git Bash expects Unix paths (e.g., `/c/Users/...`)
- Automatic conversion fails due to path format mismatch
**Solution (Claude Code v1.0.51+):**
Set environment variable before starting Claude Code:
```powershell
# PowerShell
$env:CLAUDE_CODE_GIT_BASH_PATH = "C:\Program Files\git\bin\bash.exe"
```
```cmd
# CMD
set CLAUDE_CODE_GIT_BASH_PATH=C:\Program Files\git\bin\bash.exe
```
```bash
# Git Bash (add to ~/.bashrc)
export CLAUDE_CODE_GIT_BASH_PATH="C:\\Program Files\\git\\bin\\bash.exe"
```
**Note:** Versions 1.0.72+ reportedly work without modifications, but setting the environment variable ensures compatibility.
### Other Known Issues
**Drive letter duplication:**
```bash
# Problem
cd D:\dev
pwd
# Output: D:\d\dev (incorrect)
# Solution: Use Unix-style path in Git Bash
cd /d/dev
pwd
# Output: /d/dev
```
**Spaces in paths:**
```bash
# Problem: Unquoted path with spaces
cd C:\Program Files\App # Fails
# Solution: Always quote paths with spaces
cd "C:\Program Files\App"
cd /c/Program\ Files/App
```
**VS Code extension Git Bash detection:**
VS Code may not auto-detect Git Bash. Configure manually in settings:
```json
{
"terminal.integrated.defaultProfile.windows": "Git Bash",
"terminal.integrated.profiles.windows": {
"Git Bash": {
"path": "C:\\Program Files\\Git\\bin\\bash.exe"
}
}
}
```
---
## Practical Solutions
### Cross-Platform Path Handling Function
```bash
#!/usr/bin/env bash
# Convert path to format appropriate for current platform
normalize_path_for_platform() {
local path="$1"
case "$OSTYPE" in
msys*|mingw*)
# On Git Bash, convert to Unix format if Windows format provided
if [[ "$path" =~ ^[A-Z]:\\ ]]; then
# Windows path detected, convert to Unix
path=$(cygpath -u "$path" 2>/dev/null || echo "$path")
fi
;;
*)
# On Linux/macOS, path is already correct
;;
esac
echo "$path"
}
# Convert path to native format for external programs
convert_to_native_path() {
local path="$1"
case "$OSTYPE" in
msys*|mingw*)
# Convert to Windows format for native Windows programs
cygpath -w "$path" 2>/dev/null || echo "$path"
;;
*)
# Already native on Linux/macOS
echo "$path"
;;
esac
}
# Example usage
input_path="/c/Users/username/file.txt"
normalized=$(normalize_path_for_platform "$input_path")
echo "Normalized: $normalized"
native=$(convert_to_native_path "$normalized")
echo "Native: $native"
```
### Script Template for Windows Compatibility
```bash
#!/usr/bin/env bash
set -euo pipefail
# Detect if running on Git Bash/MINGW
is_git_bash() {
[[ "$OSTYPE" == "msys" ]] || [[ "$OSTYPE" == "mingw"* ]]
}
# Handle path conversion based on platform
get_path() {
local path="$1"
if is_git_bash; then
# Ensure Unix format in Git Bash
if [[ "$path" =~ ^[A-Z]:\\ ]]; then
cygpath -u "$path"
else
echo "$path"
fi
else
echo "$path"
fi
}
# Call Windows program from Git Bash
call_windows_program() {
local program="$1"
shift
local args=("$@")
if is_git_bash; then
# Disable path conversion for complex arguments
MSYS_NO_PATHCONV=1 "$program" "${args[@]}"
else
"$program" "${args[@]}"
fi
}
# Main script logic
main() {
local file_path="$1"
# Normalize path
file_path=$(get_path "$file_path")
# Process file
echo "Processing: $file_path"
# Call Windows program if needed
if is_git_bash; then
local native_path
native_path=$(cygpath -w "$file_path")
call_windows_program notepad.exe "$native_path"
fi
}
main "$@"
```
### Handling Command-Line Arguments
```bash
#!/usr/bin/env bash
# Parse arguments that might contain paths.
# "mytool" below is a placeholder for the external program being wrapped.
parse_arguments() {
    while [[ $# -gt 0 ]]; do
        case "$1" in
            --path=*)
                local path="${1#*=}"
                # Disable conversion for this specific argument pattern
                MSYS2_ARG_CONV_EXCL="--path=" mytool --path="$path"
                shift
                ;;
            --dir)
                local dir="$2"
                # Use converted path
                local native_dir
                if command -v cygpath &>/dev/null; then
                    native_dir=$(cygpath -w "$dir")
                else
                    native_dir="$dir"
                fi
                mytool --dir "$native_dir"
                shift 2
                ;;
            *)
                shift
                ;;
        esac
    done
}
```
---
## Best Practices
### 1. Always Quote Paths
```bash
# ✗ WRONG - Breaks with spaces
cd $path
# ✓ CORRECT - Works with all paths
cd "$path"
```
### 2. Use cygpath for Reliable Conversion
```bash
# ✗ WRONG - Manual conversion is error-prone
path="${path//\\/\/}"
path="${path/C:/\/c}"
# ✓ CORRECT - Use cygpath
path=$(cygpath -u "$path")
```
### 3. Detect Platform Before Path Operations
```bash
# ✓ CORRECT - Platform-aware
if [[ "$OSTYPE" == "msys" ]] || [[ "$OSTYPE" == "mingw"* ]]; then
# Git Bash specific handling
path=$(cygpath -u "$windows_path")
else
# Linux/macOS handling
path="$unix_path"
fi
```
### 4. Use MSYS_NO_PATHCONV Sparingly
```bash
# ✗ WRONG - Disables all conversion globally
export MSYS_NO_PATHCONV=1
# ✓ CORRECT - Per-command when needed
MSYS_NO_PATHCONV=1 command --flag=/value
```
### 5. Test on Target Platform
Always test scripts on Windows with Git Bash if that's a target platform:
```bash
# Test script
bash -n script.sh # Syntax check
shellcheck script.sh # Static analysis
bash script.sh # Run on actual platform
```
### 6. Document Platform Requirements
```bash
#!/usr/bin/env bash
#
# Platform Support:
# - Linux: Full support
# - macOS: Full support
# - Windows Git Bash: Requires Git for Windows 2.x+
# - Windows WSL: Full support
#
# Known Issues:
# - Path conversion may occur when calling Windows programs from Git Bash
# - Use MSYS_NO_PATHCONV=1 if experiencing path-related errors
```
### 7. Use Forward Slashes in Git Bash
```bash
# ✓ PREFERRED - Works in all environments
cd /c/Users/username/project
# ✗ AVOID - Requires escaping or quoting
cd "C:\Users\username\project"
cd C:\\Users\\username\\project
```
### 8. Check for cygpath Availability
```bash
# Graceful fallback if cygpath not available
convert_path() {
    local path="$1"
    if command -v cygpath &>/dev/null; then
        cygpath -u "$path"
    else
        # Manual fallback: flip slashes, lowercase the drive letter
        # (note: \L is a GNU sed extension)
        echo "$path" | sed 's|\\|/|g' | sed 's|^\([A-Z]\):|/\L\1|'
    fi
}
```
---
## Quick Reference Card
### Path Conversion Control
| Variable | Scope | Effect |
|----------|-------|--------|
| `MSYS_NO_PATHCONV=1` | Git for Windows | Disables all conversion |
| `MSYS2_ARG_CONV_EXCL="pattern"` | MSYS2 | Excludes specific patterns |
| `MSYS2_ENV_CONV_EXCL="var"` | MSYS2 | Excludes environment variables |
### Shell Detection Variables
| Variable | Available | Purpose |
|----------|-----------|---------|
| `$OSTYPE` | Bash | Quick OS type detection |
| `$MSYSTEM` | MSYS2/Git Bash | Subsystem type (MINGW64/MINGW32/MSYS) |
| `$(uname -s)` | All POSIX | Detailed OS identification |
### cygpath Quick Reference
| Command | Purpose |
|---------|---------|
| `cygpath -u "C:\path"` | Windows → Unix format |
| `cygpath -w "/c/path"` | Unix → Windows format |
| `cygpath -m "/c/path"` | Unix → Mixed format (forward slashes) |
| `cygpath -a "path"` | Convert to absolute path |
### Common Issues & Solutions
| Problem | Solution |
|---------|----------|
| Path with spaces breaks | Quote the path: `"$path"` |
| Flag `/e` converted to path | Use `//e` or `-e` instead |
| Drive duplication `D:\d\` | Use Unix format: `/d/` |
| Windows program needs Windows path | Use `cygpath -w "$unix_path"` |
| Script fails in Claude Code | Set `CLAUDE_CODE_GIT_BASH_PATH` |
---
## Summary
Understanding Git Bash/MINGW path conversion is essential for writing robust cross-platform bash scripts that work on Windows. Key takeaways:
1. **Automatic conversion** happens for Unix-style paths in arguments
2. **Control conversion** using `MSYS_NO_PATHCONV` and `MSYS2_ARG_CONV_EXCL`
3. **Use cygpath** for reliable manual path conversion
4. **Detect platform** using `$OSTYPE`, `$MSYSTEM`, or `uname -s`
5. **Quote all paths** to handle spaces and special characters
6. **Test on target platforms** to catch platform-specific issues
7. **Document requirements** so users know what to expect
With this knowledge, you can write bash scripts that work seamlessly across Linux, macOS, Windows Git Bash, WSL, and other Unix-like environments.

---
name: debugging-troubleshooting-2025
description: Comprehensive bash script debugging and troubleshooting techniques for 2025
---
## 🚨 CRITICAL GUIDELINES
### Windows File Path Requirements
**MANDATORY: Always Use Backslashes on Windows for File Paths**
When using Edit or Write tools on Windows, you MUST use backslashes (`\`) in file paths, NOT forward slashes (`/`).
**Examples:**
- ❌ WRONG: `D:/repos/project/file.tsx`
- ✅ CORRECT: `D:\repos\project\file.tsx`
This applies to:
- Edit tool file_path parameter
- Write tool file_path parameter
- All file operations on Windows systems
### Documentation Guidelines
**NEVER create new documentation files unless explicitly requested by the user.**
- **Priority**: Update existing README.md files rather than creating new documentation
- **Repository cleanliness**: Keep repository root clean - only README.md unless user requests otherwise
- **Style**: Documentation should be concise, direct, and professional - avoid AI-generated tone
- **User preference**: Only create additional .md files when user specifically asks for documentation
---
# Bash Debugging & Troubleshooting (2025)
## Overview
Comprehensive debugging techniques and troubleshooting patterns for bash scripts following 2025 best practices.
## Debug Mode Techniques
### 1. Basic Debug Mode (set -x)
```bash
#!/usr/bin/env bash
set -euo pipefail
# Enable debug mode
set -x
# Your commands here
command1
command2
# Disable debug mode
set +x
# Continue without debug
command3
```
### 2. Enhanced Debug Output (PS4)
```bash
#!/usr/bin/env bash
set -euo pipefail
# Custom debug prompt with file:line:function
export PS4='+(${BASH_SOURCE}:${LINENO}): ${FUNCNAME[0]:+${FUNCNAME[0]}(): }'
set -x
my_function() {
local var="value"
echo "$var"
}
my_function
set +x
```
**Output:**
```
+(script.sh:10): my_function(): local var=value
+(script.sh:11): my_function(): echo value
value
```
### 3. Conditional Debugging
```bash
#!/usr/bin/env bash
set -euo pipefail
# Enable via environment variable
DEBUG="${DEBUG:-false}"
debug() {
if [[ "$DEBUG" == "true" ]]; then
echo "[DEBUG] $*" >&2
fi
}
# Usage
debug "Starting process"
process_data
debug "Process complete"
# Run: DEBUG=true ./script.sh
```
### 4. Debugging Specific Functions
```bash
#!/usr/bin/env bash
set -euo pipefail
# Debug wrapper
debug_function() {
local func_name="$1"
shift
echo "[TRACE] Calling: $func_name $*" >&2
set -x
    local exit_code=0
    "$func_name" "$@" || exit_code=$?   # capture failure without tripping set -e
set +x
echo "[TRACE] Exit code: $exit_code" >&2
return $exit_code
}
# Usage
my_complex_function() {
local arg1="$1"
# Complex logic
echo "Result: $arg1"
}
debug_function my_complex_function "test"
```
## Tracing and Profiling
### 1. Execution Time Profiling
```bash
#!/usr/bin/env bash
set -euo pipefail
# Profile function execution time
profile() {
local start_ns end_ns duration_ms
start_ns=$(date +%s%N)
    local exit_code=0
    "$@" || exit_code=$?   # capture failure without tripping set -e
end_ns=$(date +%s%N)
duration_ms=$(( (end_ns - start_ns) / 1000000 ))
echo "[PROFILE] '$*' took ${duration_ms}ms (exit: $exit_code)" >&2
return $exit_code
}
# Usage
profile slow_command arg1 arg2
```
### 2. Function Call Tracing
```bash
#!/usr/bin/env bash
set -euo pipefail
# Trace all function calls
trace_on() {
set -o functrace
trap 'echo "[TRACE] ${FUNCNAME[0]}() called from ${BASH_SOURCE[1]}:${BASH_LINENO[0]}" >&2' DEBUG
}
trace_off() {
set +o functrace
trap - DEBUG
}
# Usage
trace_on
function1
function2
trace_off
```
### 3. Variable Inspection
```bash
#!/usr/bin/env bash
set -euo pipefail
# Inspect all variables at any point
inspect_vars() {
echo "=== Variable Dump ===" >&2
declare -p | grep -v "^declare -[^ ]*r " | sort >&2
echo "===================" >&2
}
# Inspect specific variable
inspect_var() {
local var_name="$1"
echo "[INSPECT] $var_name = ${!var_name:-<unset>}" >&2
}
# Usage
my_var="test"
inspect_var my_var
inspect_vars
```
## Error Handling and Recovery
### 1. Trap-Based Error Handler
```bash
#!/usr/bin/env bash
set -euo pipefail
# Comprehensive error handler
error_handler() {
local exit_code=$?
local line_number=$1
echo "ERROR: Command failed with exit code $exit_code" >&2
echo " File: ${BASH_SOURCE[1]}" >&2
echo " Line: $line_number" >&2
echo " Function: ${FUNCNAME[1]:-main}" >&2
# Print stack trace
local frame=0
while caller $frame; do
        ((++frame))   # pre-increment: ((frame++)) returns status 1 when frame is 0, which would trip set -e
done >&2
exit "$exit_code"
}
trap 'error_handler $LINENO' ERR
# Your script logic
risky_command
```
### 2. Dry-Run Mode
```bash
#!/usr/bin/env bash
set -euo pipefail
DRY_RUN="${DRY_RUN:-false}"
# Safe execution wrapper
execute() {
if [[ "$DRY_RUN" == "true" ]]; then
echo "[DRY-RUN] Would execute: $*" >&2
return 0
else
"$@"
fi
}
# Usage
execute rm -rf /tmp/data
execute cp file.txt backup/
# Run: DRY_RUN=true ./script.sh
```
### 3. Rollback on Failure
```bash
#!/usr/bin/env bash
set -euo pipefail
OPERATIONS=()
# Track operations for rollback
track_operation() {
local rollback_cmd="$1"
OPERATIONS+=("$rollback_cmd")
}
# Execute rollback
rollback() {
echo "Rolling back operations..." >&2
for ((i=${#OPERATIONS[@]}-1; i>=0; i--)); do
echo " Executing: ${OPERATIONS[$i]}" >&2
eval "${OPERATIONS[$i]}" || true
done
}
trap rollback ERR   # roll back on failure only; an EXIT trap would also undo successful runs
# Example usage
mkdir /tmp/mydir
track_operation "rmdir /tmp/mydir"
touch /tmp/mydir/file.txt
track_operation "rm /tmp/mydir/file.txt"
# If script fails, rollback executes automatically
```
## Common Issues and Solutions
### 1. Script Works Interactively but Fails in Cron
**Problem:** Script runs fine manually but fails when scheduled.
**Solution:**
```bash
#!/usr/bin/env bash
set -euo pipefail
# Fix PATH for cron
export PATH="/usr/local/bin:/usr/bin:/bin"
# Set working directory
cd "$(dirname "$0")" || exit 1
# Log everything for debugging
exec 1>> /var/log/myscript.log 2>&1
echo "[$(date)] Script starting"
# Your commands here
echo "[$(date)] Script complete"
```
### 2. Whitespace in Filenames Breaking Script
**Problem:** Script fails when processing files with spaces.
**Debugging:**
```bash
#!/usr/bin/env bash
set -euo pipefail
# Show exactly what the script sees
debug_filename() {
local filename="$1"
echo "Filename: '$filename'" >&2
echo "Length: ${#filename}" >&2
hexdump -C <<< "$filename" >&2
}
# Proper handling
while IFS= read -r -d '' file; do
debug_filename "$file"
# Process "$file"
done < <(find . -name "*.txt" -print0)
```
### 3. Script Behaves Differently on Different Systems
**Problem:** Works on Linux but fails on macOS.
**Debugging:**
```bash
#!/usr/bin/env bash
set -euo pipefail
# Platform detection and debugging
detect_platform() {
echo "=== Platform Info ===" >&2
echo "OS: $OSTYPE" >&2
echo "Bash: $BASH_VERSION" >&2
echo "PATH: $PATH" >&2
# Check tool versions
for tool in sed awk grep; do
if command -v "$tool" &> /dev/null; then
echo "$tool: $($tool --version 2>&1 | head -1)" >&2
fi
done
echo "====================" >&2
}
detect_platform
# Use portable patterns
case "$OSTYPE" in
linux*) SED_CMD="sed" ;;
darwin*) SED_CMD=$(command -v gsed || echo sed) ;;
*) echo "Unknown platform" >&2; exit 1 ;;
esac
```
### 4. Variable Scope Issues
**Problem:** Variables not available where expected.
**Debugging:**
```bash
#!/usr/bin/env bash
set -euo pipefail
# Show variable scope
test_scope() {
local local_var="local"
global_var="global"
echo "Inside function:" >&2
echo " local_var=$local_var" >&2
echo " global_var=$global_var" >&2
}
test_scope
echo "Outside function:" >&2
echo " local_var=${local_var:-<not set>}" >&2
echo " global_var=${global_var:-<not set>}" >&2
# Subshell scope issue
echo "test" | (
read -r value
echo "In subshell: $value"
)
echo "After subshell: ${value:-<not set>}" # Empty!
```
## Interactive Debugging
### 1. Breakpoint Pattern
```bash
#!/usr/bin/env bash
set -euo pipefail
# Interactive breakpoint
breakpoint() {
local message="${1:-Breakpoint}"
echo "$message" >&2
echo "Variables:" >&2
declare -p | grep -v "^declare -[^ ]*r " >&2
read -rp "Press Enter to continue, 'i' for inspect: " choice
if [[ "$choice" == "i" ]]; then
bash # Drop into interactive shell
fi
}
# Usage
value=42
breakpoint "Before critical operation"
critical_operation "$value"
```
### 2. Watch Mode (Continuous Debugging)
```bash
#!/usr/bin/env bash
# Watch script execution in real-time
watch_script() {
local script="$1"
shift
while true; do
clear
echo "=== Running: $script $* ==="
echo "=== $(date) ==="
bash -x "$script" "$@" 2>&1 | tail -50
sleep 2
done
}
# Usage: watch_script myscript.sh arg1 arg2
```
## Logging Best Practices
### 1. Structured Logging
```bash
#!/usr/bin/env bash
set -euo pipefail
readonly LOG_FILE="${LOG_FILE:-/var/log/myscript.log}"
log() {
local level="$1"
shift
local timestamp
timestamp=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
echo "${timestamp} [${level}] $*" | tee -a "$LOG_FILE" >&2
}
log_info() { log "INFO" "$@"; }
log_warn() { log "WARN" "$@"; }
log_error() { log "ERROR" "$@"; }
log_debug() { if [[ "${DEBUG:-false}" == "true" ]]; then log "DEBUG" "$@"; fi; }   # the && form returns 1 when DEBUG is off, tripping set -e
# Usage
log_info "Starting process"
log_debug "Debug info"
log_error "Something failed"
```
### 2. Log Rotation Awareness
```bash
#!/usr/bin/env bash
set -euo pipefail
# Ensure log file exists and is writable
setup_logging() {
local log_file="${1:-/var/log/myscript.log}"
local log_dir
log_dir=$(dirname "$log_file")
if [[ ! -d "$log_dir" ]]; then
mkdir -p "$log_dir" || {
echo "Cannot create log directory: $log_dir" >&2
return 1
}
fi
if [[ ! -w "$log_dir" ]]; then
echo "Log directory not writable: $log_dir" >&2
return 1
fi
# Redirect all output to log
exec 1>> "$log_file"
exec 2>&1
}
setup_logging
```
## Performance Debugging
### 1. Identify Slow Commands
```bash
#!/usr/bin/env bash
set -euo pipefail
# Profile each command in script
profile_script() {
export PS4='+ $(date +%s.%N) ${BASH_SOURCE}:${LINENO}: '
set -x
# Your commands here
command1
command2
command3
set +x
}
# Analyze output:
# + 1698765432.123456 script.sh:10: command1 (fast)
# + 1698765437.654321 script.sh:11: command2 (5 seconds - slow!)
```
### 2. Memory Usage Tracking
```bash
#!/usr/bin/env bash
set -euo pipefail
# Track memory usage
check_memory() {
local pid=${1:-$$}
ps -o pid,vsz,rss,comm -p "$pid" | tail -1
}
# Monitor during execution
monitor_memory() {
while true; do
check_memory
sleep 1
done &
local monitor_pid=$!
# Your commands here
"$@"
kill "$monitor_pid" 2>/dev/null || true
wait "$monitor_pid" 2>/dev/null || true
}
monitor_memory ./memory_intensive_task.sh
```
## Testing Patterns
### 1. Unit Test Template
```bash
#!/usr/bin/env bash
# test_functions.sh
# Source the script to test
source ./functions.sh
# Test counter
TESTS_RUN=0
TESTS_PASSED=0
TESTS_FAILED=0
# Assert function
assert_equals() {
local expected="$1"
local actual="$2"
local test_name="${3:-Test}"
((TESTS_RUN++))
if [[ "$expected" == "$actual" ]]; then
        echo "✓ $test_name" >&2
((TESTS_PASSED++))
else
        echo "✗ $test_name" >&2
echo " Expected: $expected" >&2
echo " Actual: $actual" >&2
((TESTS_FAILED++))
fi
}
# Run tests
test_add_numbers() {
local result
result=$(add_numbers 2 3)
assert_equals "5" "$result" "add_numbers 2 3"
}
test_add_numbers
# Summary
echo "========================================" >&2
echo "Tests run: $TESTS_RUN" >&2
echo "Passed: $TESTS_PASSED" >&2
echo "Failed: $TESTS_FAILED" >&2
[[ $TESTS_FAILED -eq 0 ]]
```
## ShellCheck Integration
```bash
#!/usr/bin/env bash
set -euo pipefail
# Validate script with ShellCheck
validate_script() {
local script="$1"
if ! command -v shellcheck &> /dev/null; then
echo "ShellCheck not installed" >&2
return 1
fi
echo "Running ShellCheck on $script..." >&2
if shellcheck --severity=warning "$script"; then
echo "✓ ShellCheck passed" >&2
return 0
else
echo "✗ ShellCheck failed" >&2
return 1
fi
}
# Usage
validate_script myscript.sh
```
## Resources
- [Bash Hackers Wiki - Debugging](https://wiki.bash-hackers.org/scripting/debuggingtips)
- [ShellCheck](https://www.shellcheck.net/)
- [BATS Testing Framework](https://github.com/bats-core/bats-core)
- [Bash Reference Manual - Debugging](https://www.gnu.org/software/bash/manual/html_node/The-Set-Builtin.html)
---
**Effective debugging requires systematic approaches, comprehensive logging, and proper tooling. Master these techniques for production-ready bash scripts in 2025.**

---
name: modern-automation-patterns
description: Modern DevOps and CI/CD automation patterns with containers and cloud (2025)
---
## 🚨 CRITICAL GUIDELINES
### Windows File Path Requirements
**MANDATORY: Always Use Backslashes on Windows for File Paths**
When using Edit or Write tools on Windows, you MUST use backslashes (`\`) in file paths, NOT forward slashes (`/`).
**Examples:**
- ❌ WRONG: `D:/repos/project/file.tsx`
- ✅ CORRECT: `D:\repos\project\file.tsx`
This applies to:
- Edit tool file_path parameter
- Write tool file_path parameter
- All file operations on Windows systems
### Documentation Guidelines
**NEVER create new documentation files unless explicitly requested by the user.**
- **Priority**: Update existing README.md files rather than creating new documentation
- **Repository cleanliness**: Keep repository root clean - only README.md unless user requests otherwise
- **Style**: Documentation should be concise, direct, and professional - avoid AI-generated tone
- **User preference**: Only create additional .md files when user specifically asks for documentation
---
# Modern Automation Patterns (2025)
## Overview
Production-ready patterns for DevOps automation, CI/CD pipelines, and cloud-native operations following 2025 industry standards.
## Container-Aware Scripting
### Docker/Kubernetes Detection
```bash
#!/usr/bin/env bash
set -euo pipefail
# Detect container environment
detect_container() {
if [[ -f /.dockerenv ]]; then
echo "docker"
elif grep -q docker /proc/1/cgroup 2>/dev/null; then
echo "docker"
elif [[ -n "${KUBERNETES_SERVICE_HOST:-}" ]]; then
echo "kubernetes"
else
echo "host"
fi
}
# Container-aware configuration
readonly CONTAINER_ENV=$(detect_container)
case "$CONTAINER_ENV" in
docker|kubernetes)
# Container-specific paths
DATA_DIR="/data"
CONFIG_DIR="/config"
;;
host)
# Host-specific paths
DATA_DIR="/var/lib/app"
CONFIG_DIR="/etc/app"
;;
esac
```
### Minimal Container Scripts
```bash
#!/bin/sh
# Use /bin/sh for Alpine-based containers (no bash)
set -eu # Note: No pipefail in POSIX sh
# Check if running as PID 1
if [ $$ -eq 1 ]; then
# PID 1 must handle signals properly
trap 'kill -TERM $child 2>/dev/null' TERM INT
# Start main process in background
/app/main &
child=$!
# Wait for child
wait "$child"
else
# Not PID 1, run directly
exec /app/main
fi
```
### Health Check Scripts
```bash
#!/usr/bin/env bash
# healthcheck.sh - Container health probe
set -euo pipefail
# Quick health check (< 1 second)
check_health() {
local timeout=1
# Check process is running
if ! pgrep -f "myapp" > /dev/null; then
echo "Process not running" >&2
return 1
fi
# Check HTTP endpoint
if ! timeout "$timeout" curl -sf http://localhost:8080/health > /dev/null; then
echo "Health endpoint failed" >&2
return 1
fi
# Check critical files
if [[ ! -f /app/ready ]]; then
echo "Not ready" >&2
return 1
fi
return 0
}
check_health
```
## CI/CD Pipeline Patterns
### GitHub Actions Helper
```bash
#!/usr/bin/env bash
# ci-helper.sh - GitHub Actions utilities
set -euo pipefail
# Detect GitHub Actions
is_github_actions() {
[[ "${GITHUB_ACTIONS:-false}" == "true" ]]
}
# GitHub Actions output
gh_output() {
local name="$1"
local value="$2"
if is_github_actions; then
echo "${name}=${value}" >> "$GITHUB_OUTPUT"
fi
}
# GitHub Actions annotations
gh_error() {
if is_github_actions; then
echo "::error::$*"
else
echo "ERROR: $*" >&2
fi
}
gh_warning() {
if is_github_actions; then
echo "::warning::$*"
else
echo "WARN: $*" >&2
fi
}
gh_notice() {
if is_github_actions; then
echo "::notice::$*"
else
echo "INFO: $*"
fi
}
# Set job summary
gh_summary() {
    if is_github_actions; then
        printf '%b\n' "$*" >> "$GITHUB_STEP_SUMMARY"   # %b renders \n escapes; plain echo would print them literally
    fi
}
# Usage example
gh_notice "Starting build"
build_result=$(make build 2>&1)
gh_output "build_result" "$build_result"
gh_summary "## Build Complete\n\nStatus: Success"
```
### Azure DevOps Helper
```bash
#!/usr/bin/env bash
# azdo-helper.sh - Azure DevOps utilities
set -euo pipefail
# Detect Azure DevOps
is_azure_devops() {
[[ -n "${TF_BUILD:-}" ]]
}
# Azure DevOps output variable
azdo_output() {
local name="$1"
local value="$2"
if is_azure_devops; then
echo "##vso[task.setvariable variable=$name]$value"
fi
}
# Azure DevOps logging
azdo_error() {
if is_azure_devops; then
echo "##vso[task.logissue type=error]$*"
else
echo "ERROR: $*" >&2
fi
}
azdo_warning() {
if is_azure_devops; then
echo "##vso[task.logissue type=warning]$*"
else
echo "WARN: $*" >&2
fi
}
# Collapsible log grouping
azdo_section_start() {
    if is_azure_devops; then echo "##[group]$*"; fi
}
azdo_section_end() {
    if is_azure_devops; then echo "##[endgroup]"; fi
}
# Usage
azdo_section_start "Running Tests"
test_result=$(npm test)
azdo_output "test_result" "$test_result"
azdo_section_end
```
### Multi-Platform CI Detection
```bash
#!/usr/bin/env bash
set -euo pipefail
# Detect CI environment
detect_ci() {
if [[ "${GITHUB_ACTIONS:-false}" == "true" ]]; then
echo "github"
elif [[ -n "${TF_BUILD:-}" ]]; then
echo "azuredevops"
elif [[ -n "${GITLAB_CI:-}" ]]; then
echo "gitlab"
elif [[ -n "${CIRCLECI:-}" ]]; then
echo "circleci"
elif [[ -n "${JENKINS_URL:-}" ]]; then
echo "jenkins"
else
echo "local"
fi
}
# Universal output
ci_output() {
local name="$1"
local value="$2"
local ci_env
ci_env=$(detect_ci)
case "$ci_env" in
github)
echo "${name}=${value}" >> "$GITHUB_OUTPUT"
;;
azuredevops)
echo "##vso[task.setvariable variable=$name]$value"
;;
gitlab)
echo "${name}=${value}" >> ci_output.env
;;
*)
echo "export ${name}=\"${value}\""
;;
esac
}
# Universal error
ci_error() {
local ci_env
ci_env=$(detect_ci)
case "$ci_env" in
github)
echo "::error::$*"
;;
azuredevops)
echo "##vso[task.logissue type=error]$*"
;;
*)
echo "ERROR: $*" >&2
;;
esac
}
```
## Cloud Provider Patterns
### AWS Helper Functions
```bash
#!/usr/bin/env bash
set -euo pipefail
# Check AWS CLI availability
require_aws() {
if ! command -v aws &> /dev/null; then
echo "Error: AWS CLI not installed" >&2
exit 1
fi
}
# Get AWS account ID
get_aws_account_id() {
aws sts get-caller-identity --query Account --output text
}
# Get secret from AWS Secrets Manager
get_aws_secret() {
local secret_name="$1"
aws secretsmanager get-secret-value \
--secret-id "$secret_name" \
--query SecretString \
--output text
}
# Upload to S3 with retry
s3_upload_retry() {
local file="$1"
local s3_path="$2"
local max_attempts=3
local attempt=1
while ((attempt <= max_attempts)); do
if aws s3 cp "$file" "$s3_path"; then
return 0
fi
echo "Upload failed (attempt $attempt/$max_attempts)" >&2
((attempt++))
sleep $((attempt * 2))
done
return 1
}
# Assume IAM role
assume_role() {
local role_arn="$1"
local session_name="${2:-bash-script}"
local credentials
credentials=$(aws sts assume-role \
--role-arn "$role_arn" \
--role-session-name "$session_name" \
--query Credentials \
--output json)
    AWS_ACCESS_KEY_ID=$(echo "$credentials" | jq -r .AccessKeyId)
    AWS_SECRET_ACCESS_KEY=$(echo "$credentials" | jq -r .SecretAccessKey)
    AWS_SESSION_TOKEN=$(echo "$credentials" | jq -r .SessionToken)
    export AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY AWS_SESSION_TOKEN
}
```
### Azure Helper Functions
```bash
#!/usr/bin/env bash
set -euo pipefail
# Check Azure CLI
require_az() {
if ! command -v az &> /dev/null; then
echo "Error: Azure CLI not installed" >&2
exit 1
fi
}
# Get Azure subscription ID
get_az_subscription_id() {
az account show --query id --output tsv
}
# Get secret from Azure Key Vault
get_keyvault_secret() {
local vault_name="$1"
local secret_name="$2"
az keyvault secret show \
--vault-name "$vault_name" \
--name "$secret_name" \
--query value \
--output tsv
}
# Upload to Azure Blob Storage
az_blob_upload() {
local file="$1"
local container="$2"
local blob_name="${3:-$(basename "$file")}"
az storage blob upload \
--file "$file" \
--container-name "$container" \
--name "$blob_name" \
--overwrite
}
# Get managed identity token
get_managed_identity_token() {
local resource="${1:-https://management.azure.com/}"
curl -sf -H Metadata:true \
"http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=$resource" \
| jq -r .access_token
}
```
## Parallel Processing Patterns
### GNU Parallel
```bash
#!/usr/bin/env bash
set -euo pipefail
# Check for GNU parallel
if command -v parallel &> /dev/null; then
# Process files in parallel
export -f process_file
    find data/ -name "*.json" -print0 | parallel -0 -j 4 process_file {}
else
# Fallback to bash background jobs
max_jobs=4
job_count=0
for file in data/*.json; do
process_file "$file" &
((job_count++))
if ((job_count >= max_jobs)); then
wait -n
((job_count--))
fi
done
wait # Wait for remaining jobs
fi
```
### Job Pool Pattern
```bash
#!/usr/bin/env bash
set -euo pipefail
# Job pool implementation
job_pool_init() {
readonly MAX_JOBS="${1:-4}"
JOB_POOL_PIDS=()
}
job_pool_run() {
# Wait if pool is full
while ((${#JOB_POOL_PIDS[@]} >= MAX_JOBS)); do
job_pool_wait_any
done
# Start new job
"$@" &
JOB_POOL_PIDS+=($!)
}
job_pool_wait_any() {
# Wait for any job to finish
if ((${#JOB_POOL_PIDS[@]} > 0)); then
        wait -n
# Remove finished PIDs
for i in "${!JOB_POOL_PIDS[@]}"; do
if ! kill -0 "${JOB_POOL_PIDS[$i]}" 2>/dev/null; then
unset 'JOB_POOL_PIDS[$i]'
fi
done
JOB_POOL_PIDS=("${JOB_POOL_PIDS[@]}") # Reindex
fi
}
job_pool_wait_all() {
wait
JOB_POOL_PIDS=()
}
# Usage
job_pool_init 4
for file in data/*.txt; do
job_pool_run process_file "$file"
done
job_pool_wait_all
```
## Deployment Patterns
### Blue-Green Deployment
```bash
#!/usr/bin/env bash
set -euo pipefail
# Blue-green deployment helper
deploy_blue_green() {
local new_version="$1"
local health_check_url="$2"
echo "Starting blue-green deployment: $new_version"
# Deploy to green (inactive) environment
echo "Deploying to green environment..."
deploy_to_environment "green" "$new_version"
# Health check
echo "Running health checks..."
if ! check_health "$health_check_url"; then
echo "Health check failed, rolling back" >&2
return 1
fi
# Switch traffic to green
echo "Switching traffic to green..."
switch_traffic "green"
# Keep blue as backup for rollback
echo "Deployment complete. Blue environment kept for rollback."
}
check_health() {
local url="$1"
local max_attempts=30
local attempt=1
while ((attempt <= max_attempts)); do
if curl -sf "$url" > /dev/null; then
return 0
fi
echo "Health check attempt $attempt/$max_attempts failed"
sleep 2
((attempt++))
done
return 1
}
```
### Canary Deployment
```bash
#!/usr/bin/env bash
set -euo pipefail
# Canary deployment with gradual rollout
deploy_canary() {
local new_version="$1"
local stages=(5 10 25 50 100) # Percentage stages
echo "Starting canary deployment: $new_version"
for percentage in "${stages[@]}"; do
echo "Rolling out to $percentage% of traffic..."
# Update traffic split
update_traffic_split "$percentage" "$new_version"
# Monitor for issues
echo "Monitoring for 5 minutes..."
if ! monitor_metrics 300; then
echo "Issues detected, rolling back" >&2
update_traffic_split 0 "$new_version"
return 1
fi
echo "Stage $percentage% successful"
done
echo "Canary deployment complete"
}
monitor_metrics() {
local duration="$1"
local end_time=$((SECONDS + duration))
while ((SECONDS < end_time)); do
# Check error rate
        local error_rate
        error_rate=$(get_error_rate)   # assumed to return an integer percentage
if ((error_rate > 5)); then
echo "Error rate too high: $error_rate%" >&2
return 1
fi
sleep 10
done
return 0
}
```
## Logging and Monitoring
### Structured Logging
```bash
#!/usr/bin/env bash
set -euo pipefail
# JSON structured logging
log_json() {
local level="$1"
local message="$2"
shift 2
local timestamp
timestamp=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
# Build JSON
local json
json=$(jq -n \
--arg timestamp "$timestamp" \
--arg level "$level" \
--arg message "$message" \
        --arg script "${SCRIPT_NAME:-$0}" \
--argjson context "$(echo "$@" | jq -Rs .)" \
'{
timestamp: $timestamp,
level: $level,
message: $message,
script: $script,
context: $context
}')
echo "$json" >&2
}
log_info() { log_json "INFO" "$@"; }
log_error() { log_json "ERROR" "$@"; }
log_warn() { log_json "WARN" "$@"; }
# Usage
log_info "Processing file" "file=/data/input.txt" "size=1024"
```
## Resources
- [12-Factor App Methodology](https://12factor.net/)
- [Container Best Practices](https://cloud.google.com/architecture/best-practices-for-building-containers)
- [CI/CD Best Practices](https://www.atlassian.com/continuous-delivery/principles/continuous-integration-vs-delivery-vs-deployment)
---
**Modern automation requires cloud-native patterns, container awareness, and robust CI/CD integration. These patterns ensure production-ready deployments in 2025.**

---
name: security-first-2025
description: Security-first bash scripting patterns for 2025 (mandatory validation, zero-trust)
---
## 🚨 CRITICAL GUIDELINES
### Windows File Path Requirements
**MANDATORY: Always Use Backslashes on Windows for File Paths**
When using Edit or Write tools on Windows, you MUST use backslashes (`\`) in file paths, NOT forward slashes (`/`).
**Examples:**
- ❌ WRONG: `D:/repos/project/file.tsx`
- ✅ CORRECT: `D:\repos\project\file.tsx`
This applies to:
- Edit tool file_path parameter
- Write tool file_path parameter
- All file operations on Windows systems
### Documentation Guidelines
**NEVER create new documentation files unless explicitly requested by the user.**
- **Priority**: Update existing README.md files rather than creating new documentation
- **Repository cleanliness**: Keep repository root clean - only README.md unless user requests otherwise
- **Style**: Documentation should be concise, direct, and professional - avoid AI-generated tone
- **User preference**: Only create additional .md files when user specifically asks for documentation
---
# Security-First Bash Scripting (2025)
## Overview
2025 security assessments reveal **60%+ of exploited automation tools lacked adequate input sanitization**. This skill provides mandatory security patterns.
## Critical Security Patterns
### 1. Input Validation (Non-Negotiable)
**Every input MUST be validated before use:**
```bash
#!/usr/bin/env bash
set -euo pipefail
# ✅ REQUIRED: Validate all inputs
validate_input() {
local input="$1"
local pattern="$2"
local max_length="${3:-255}"
# Check empty
if [[ -z "$input" ]]; then
echo "Error: Input required" >&2
return 1
fi
# Check pattern
if [[ ! "$input" =~ $pattern ]]; then
echo "Error: Invalid format" >&2
return 1
fi
# Check length
if [[ ${#input} -gt $max_length ]]; then
echo "Error: Input too long (max $max_length)" >&2
return 1
fi
return 0
}
# Usage
read -r user_input
if validate_input "$user_input" '^[a-zA-Z0-9_-]+$' 50; then
process "$user_input"
else
exit 1
fi
```
### 2. Command Injection Prevention
**NEVER use eval or dynamic execution with user input:**
```bash
# ❌ DANGEROUS - Command injection vulnerability
user_input="$(cat user_file.txt)"
eval "$user_input" # NEVER DO THIS
# ❌ DANGEROUS - Indirect command injection
grep "$user_pattern" file.txt # If pattern is "-e /etc/passwd"
# ✅ SAFE - Use -- separator
grep -- "$user_pattern" file.txt
# ✅ SAFE - Use arrays
grep_args=("$user_pattern" "file.txt")
grep "${grep_args[@]}"
# ✅ SAFE - Validate before use
if [[ "$user_pattern" =~ ^[a-zA-Z0-9]+$ ]]; then
grep "$user_pattern" file.txt
fi
```
### 3. Path Traversal Prevention
**Sanitize and validate ALL file paths:**
```bash
#!/usr/bin/env bash
set -euo pipefail
# Sanitize path components
sanitize_path() {
local path="$1"
# Remove dangerous patterns
path="${path//..\/}" # Remove ../
path="${path//\/..\//}" # Remove /../
path="${path#/}" # Remove leading /
echo "$path"
}
# Validate path is within allowed directory
is_safe_path() {
local file_path="$1"
local base_dir="$2"
# Resolve to absolute paths
local real_path real_base
real_path=$(readlink -f "$file_path" 2>/dev/null) || return 1
real_base=$(readlink -f "$base_dir" 2>/dev/null) || return 1
# Check path starts with base
[[ "$real_path" == "$real_base"/* ]]
}
# Usage
user_file=$(sanitize_path "$user_input")
if is_safe_path "/var/app/uploads/$user_file" "/var/app/uploads"; then
cat "/var/app/uploads/$user_file"
else
echo "Error: Access denied" >&2
exit 1
fi
```
### 4. Secure Temporary Files
**Never use predictable temp file names:**
```bash
# ❌ DANGEROUS - Race condition vulnerability
temp_file="/tmp/myapp.tmp"
echo "data" > "$temp_file" # Can be symlinked by attacker
# ❌ DANGEROUS - Predictable name
temp_file="/tmp/myapp-$$.tmp" # PID can be guessed
# ✅ SAFE - Use mktemp
temp_file=$(mktemp)
chmod 600 "$temp_file" # Owner-only permissions
echo "data" > "$temp_file"
# ✅ SAFE - Automatic cleanup
readonly TEMP_FILE=$(mktemp)
trap 'rm -f "$TEMP_FILE"' EXIT INT TERM
# ✅ SAFE - Temp directory
readonly TEMP_DIR=$(mktemp -d)
trap 'rm -rf "$TEMP_DIR"' EXIT INT TERM
chmod 700 "$TEMP_DIR"
```
### 5. Secrets Management
**NEVER hardcode secrets or expose them:**
```bash
# ❌ DANGEROUS - Hardcoded secrets
DB_PASSWORD="supersecret123"
# ❌ DANGEROUS - Secrets in environment (visible in ps)
export DB_PASSWORD="supersecret123"
# ✅ SAFE - Read from secure file
if [[ -f /run/secrets/db_password ]]; then
    # /run/secrets mounts are typically read-only; just read, don't chmod
    DB_PASSWORD=$(< /run/secrets/db_password)
else
    echo "Error: Secret not found" >&2
    exit 1
fi
# ✅ SAFE - Use cloud secret managers
get_secret() {
local secret_name="$1"
# AWS Secrets Manager
aws secretsmanager get-secret-value \
--secret-id "$secret_name" \
--query SecretString \
--output text
}
DB_PASSWORD=$(get_secret "production/database/password")
# ✅ SAFE - Prompt for sensitive data (no echo)
read -rsp "Enter password: " password
echo # Newline after password
```
### 6. Privilege Management
**Follow least privilege principle:**
```bash
#!/usr/bin/env bash
set -euo pipefail
# Check not running as root
if [[ $EUID -eq 0 ]]; then
echo "Error: Do not run as root" >&2
exit 1
fi
# Drop privileges if started as root (forward the script's own arguments)
drop_privileges() {
    local target_user="$1"
    shift
    if [[ $EUID -eq 0 ]]; then
        echo "Dropping privileges to $target_user" >&2
        exec sudo -u "$target_user" "$0" "$@"
    fi
}
# Run specific command with minimal privileges
run_privileged() {
    local command="$1"
    shift
    # Use sudo with minimal scope
    sudo --non-interactive \
        --reset-timestamp \
        "$command" "$@"
}
# Usage
drop_privileges "appuser" "$@"
```
### 7. Environment Variable Sanitization
**Clean environment before executing:**
```bash
#!/usr/bin/env bash
set -euo pipefail
# Clean environment
clean_environment() {
# Unset dangerous variables
unset IFS
unset CDPATH
unset GLOBIGNORE
# Set safe PATH (absolute paths only)
export PATH="/usr/local/bin:/usr/bin:/bin"
# Set safe IFS
IFS=$'\n\t'
}
# Execute command in clean environment
exec_clean() {
env -i \
HOME="$HOME" \
USER="$USER" \
PATH="/usr/local/bin:/usr/bin:/bin" \
"$@"
}
# Usage
clean_environment
exec_clean /usr/local/bin/myapp
```
### 8. Absolute Path Usage (2025 Best Practice)
**Always use absolute paths to prevent PATH hijacking:**
```bash
#!/usr/bin/env bash
set -euo pipefail
# ❌ DANGEROUS - Vulnerable to PATH manipulation
curl https://example.com/data
jq '.items[]' data.json
# ✅ SAFE - Absolute paths
/usr/bin/curl https://example.com/data
/usr/bin/jq '.items[]' data.json
# ✅ SAFE - Verify command location
CURL=$(command -v curl) || { echo "curl not found" >&2; exit 1; }
"$CURL" https://example.com/data
```
**Why This Matters:**
- Prevents malicious binaries in user PATH
- Standard practice in enterprise environments
- Required for security-sensitive scripts
### 9. History File Protection (2025 Security)
**Disable history for credential operations:**
```bash
#!/usr/bin/env bash
set -euo pipefail
# Disable history for this session (matters when this file is sourced into an interactive shell)
HISTFILE=/dev/null
export HISTFILE
# Or disable specific commands
HISTIGNORE="*password*:*secret*:*token*"
export HISTIGNORE
# Handle sensitive operations
read -rsp "Enter database password: " db_password
echo
# Use password (not logged to history)
/usr/bin/mysql -p"$db_password" -e "SELECT 1"
# Clear variable
unset db_password
```
## Security Checklist (2025)
Every script MUST pass these checks:
### Input Validation
- [ ] All user inputs validated with regex patterns
- [ ] Maximum length enforced on all inputs
- [ ] Empty/null inputs rejected
- [ ] Special characters escaped or rejected
### Command Safety
- [ ] No eval with user input
- [ ] No dynamic variable names from user input
- [ ] All command arguments use -- separator
- [ ] Arrays used instead of string concatenation
### File Operations
- [ ] All paths validated against directory traversal
- [ ] Temp files created with mktemp
- [ ] File permissions set restrictively (600/700)
- [ ] Cleanup handlers registered (trap EXIT)
### Secrets
- [ ] No hardcoded passwords/keys/tokens
- [ ] Secrets read from secure storage
- [ ] Secrets never logged or printed
- [ ] Secrets cleared from memory when done
### Privileges
- [ ] Runs with minimum required privileges
- [ ] Root execution rejected unless necessary
- [ ] Privilege drops implemented where needed
- [ ] Sudo scope minimized
### Error Handling
- [ ] set -euo pipefail enabled
- [ ] All errors logged to stderr
- [ ] Sensitive data not exposed in errors
- [ ] Exit codes meaningful
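Several checklist items can be demonstrated in a few lines. The sketch below covers the `--` separator from "Command Safety" (the filename is a hypothetical example): without `--`, a file named `-rf` would be parsed by `rm` as option flags.

```shell
# Work in a throwaway directory
workdir=$(mktemp -d)
cd "$workdir"

# A hostile, option-looking filename
touch -- "-rf"

# "--" ends option parsing, so "-rf" is treated as a filename, not as flags
rm -- "-rf"

if [ -e "-rf" ]; then status="still-present"; else status="removed-safely"; fi
echo "$status"

cd / && rm -rf "$workdir"
```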
## Automated Security Scanning
### ShellCheck Integration
```bash
# .github/workflows/security.yml
name: Security Scan
on: [push, pull_request]
jobs:
shellcheck:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: ShellCheck
run: |
# Fail on security issues
find . -name "*.sh" -exec shellcheck \
--severity=error \
--enable=all \
{} +
```
### Custom Security Linting
```bash
#!/usr/bin/env bash
# security-lint.sh - Check scripts for security issues
set -euo pipefail
lint_script() {
local script="$1"
local issues=0
echo "Checking: $script"
# Check for eval (word match so e.g. "evaluate" does not trigger)
if grep -nw "eval" "$script"; then
echo " ❌ Found eval (command injection risk)"
((++issues))  # pre-increment: result is non-zero, so set -e is not tripped
fi
# Check for hardcoded secrets
if grep -nE "(password|secret|token|key)\s*=\s*['\"][^'\"]+['\"]" "$script"; then
echo " ❌ Found hardcoded secrets"
((++issues))
fi
# Check for predictable temp files
if grep -n "/tmp/[a-zA-Z0-9_-]*\\.tmp" "$script"; then
echo " ❌ Found predictable temp file"
((++issues))
fi
# Check for unquoted variables (rough heuristic; prefer ShellCheck SC2086)
if grep -nE '\$[A-Z_]+[^"]' "$script"; then
echo " ⚠️ Found unquoted variables"
((++issues))
fi
if ((issues == 0)); then
echo " ✓ No security issues found"
fi
return "$issues"
}
# Scan all scripts
total_issues=0
while IFS= read -r -d '' script; do
lint_script "$script" || ((++total_issues))  # pre-increment: ((total_issues++)) would return 1 on the first failure and abort under set -e
done < <(find . -name "*.sh" -type f -print0)
if ((total_issues > 0)); then
echo "❌ Found security issues in $total_issues scripts"
exit 1
else
echo "✓ All scripts passed security checks"
fi
```
## Real-World Secure Script Template
```bash
#!/usr/bin/env bash
#
# Secure Script Template (2025)
#
set -euo pipefail
IFS=$'\n\t'
# Declare and assign separately so command failures are not masked (SC2155)
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
readonly SCRIPT_DIR
SCRIPT_NAME="$(basename "${BASH_SOURCE[0]}")"
readonly SCRIPT_NAME
# Security: Reject root execution
if [[ $EUID -eq 0 ]]; then
echo "Error: Do not run as root" >&2
exit 1
fi
# Security: Clean environment
export PATH="/usr/local/bin:/usr/bin:/bin"
unset CDPATH GLOBIGNORE
# Security: Secure temp file
TEMP_FILE=$(mktemp)  # assign before readonly so mktemp failures are not masked (SC2155)
readonly TEMP_FILE
chmod 600 "$TEMP_FILE"
trap 'rm -f "$TEMP_FILE"' EXIT
trap 'exit 1' INT TERM  # convert signals to an exit so the EXIT trap runs
# Validate input
validate_input() {
local input="$1"
if [[ -z "$input" ]]; then
echo "Error: Input required" >&2
return 1
fi
if [[ ! "$input" =~ ^[a-zA-Z0-9_/-]+$ ]]; then
echo "Error: Invalid characters in input" >&2
return 1
fi
if [[ ${#input} -gt 255 ]]; then
echo "Error: Input too long" >&2
return 1
fi
return 0
}
# Sanitize file path
sanitize_path() {
local path="$1"
# Strip "../" repeatedly: a single pass can be bypassed (e.g. "....//" -> "../")
while [[ "$path" == *"../"* ]]; do
path="${path//..\/}"
done
path="${path#/}"
echo "$path"
}
# Main function
main() {
local user_input="${1:-}"
# Validate
if ! validate_input "$user_input"; then
exit 1
fi
# Sanitize
local safe_path
safe_path=$(sanitize_path "$user_input")
# Process safely
echo "Processing: $safe_path"
# ... your logic here ...
}
main "$@"
```
## Compliance Standards (2025)
### CIS Benchmarks
- Use ShellCheck for automated compliance
- Implement input validation on all user data
- Secure temporary file handling
- Least privilege execution
### NIST Guidelines
- Strong input validation (NIST SP 800-53)
- Secure coding practices
- Logging and monitoring
- Access control enforcement
### OWASP Top 10
- A01: Broken Access Control - Path validation
- A02: Cryptographic Failures - Secure secrets
- A03: Injection - Prevent command injection
## Resources
- [CIS Docker Benchmark](https://www.cisecurity.org/benchmark/docker)
- [OWASP Command Injection](https://owasp.org/www-community/attacks/Command_Injection)
- [ShellCheck Security Rules](https://www.shellcheck.net/wiki/)
- [NIST SP 800-53](https://csrc.nist.gov/publications/detail/sp/800-53/rev-5/final)
---
**Security-first development is non-negotiable in 2025. Every script must pass all security checks before deployment.**

---
name: shellcheck-cicd-2025
description: ShellCheck validation as non-negotiable 2025 workflow practice
---
# ShellCheck CI/CD Integration (2025)
## ShellCheck: Non-Negotiable in 2025
ShellCheck is now considered **mandatory** in modern bash workflows (2025 best practices):
### Latest Version: v0.11.0 (August 2025)
**What's New:**
- Full Bash 5.3 support (`${| cmd; }` and `source -p`)
- **New warnings**: SC2327/SC2328 (command substitutions whose output is redirected away)
- **POSIX.1-2024 compliance**: SC3013 removed (-ot/-nt/-ef now POSIX standard)
- Enhanced static analysis capabilities
- Improved performance and accuracy
### Why Mandatory?
- Catches subtle bugs before production
- Prevents common security vulnerabilities
- Enforces consistent code quality
- Required by most DevOps teams
- Standard in enterprise environments
- Supports latest POSIX.1-2024 standard
## Installation
```bash
# Ubuntu/Debian
apt-get install shellcheck
# macOS
brew install shellcheck
# Alpine (Docker)
apk add shellcheck
# Windows (WSL/Git Bash)
choco install shellcheck
# Or download binary
wget https://github.com/koalaman/shellcheck/releases/latest/download/shellcheck-stable.linux.x86_64.tar.xz
tar -xf shellcheck-stable.linux.x86_64.tar.xz
sudo cp shellcheck-stable/shellcheck /usr/local/bin/
```
## GitHub Actions Integration
### Mandatory Pre-Merge Check
```yaml
# .github/workflows/shellcheck.yml
name: ShellCheck
on:
pull_request:
paths:
- '**.sh'
- '**Dockerfile'
push:
branches: [main]
jobs:
shellcheck:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Run ShellCheck
uses: ludeeus/action-shellcheck@master
with:
severity: warning
format: gcc # or: tty, json, checkstyle
scandir: './scripts'
# Fail on any issues
ignore_paths: 'node_modules'
# Block merge on failures
- name: Annotate PR
if: failure()
uses: actions/github-script@v6
with:
script: |
github.rest.issues.createComment({
issue_number: context.issue.number,
owner: context.repo.owner,
repo: context.repo.repo,
body: '⛔ ShellCheck validation failed. Fix issues before merging.'
})
```
## Azure DevOps Integration
```yaml
# azure-pipelines.yml
trigger:
- main
pr:
- main
stages:
- stage: Validate
jobs:
- job: ShellCheck
pool:
vmImage: 'ubuntu-24.04'
steps:
- script: |
sudo apt-get install -y shellcheck
displayName: 'Install ShellCheck'
- script: |
find . -name "*.sh" -type f -print0 | xargs -0 shellcheck --format=gcc --severity=warning
displayName: 'Run ShellCheck'
failOnStderr: true
- task: PublishTestResults@2
condition: always()
inputs:
testResultsFormat: 'JUnit'
testResultsFiles: '**/shellcheck-results.xml'
failTaskOnFailedTests: true
```
## Git Hooks (Pre-Commit)
```bash
# .git/hooks/pre-commit
#!/usr/bin/env bash
set -o errexit
set -o nounset
set -o pipefail
# Find all staged .sh files
mapfile -t STAGED_SH < <(git diff --cached --name-only --diff-filter=ACMR | grep '\.sh$' || true)
if [ ${#STAGED_SH[@]} -eq 0 ]; then
exit 0
fi
echo "Running ShellCheck on staged files..."
# Run ShellCheck. Test the command directly: with errexit set, a separate
# `$? -ne 0` check after the command would never be reached on failure.
if ! shellcheck --format=gcc --severity=warning "${STAGED_SH[@]}"; then
echo "⛔ ShellCheck failed. Fix issues before committing."
exit 1
fi
echo "✅ ShellCheck passed"
exit 0
```
**Install Pre-Commit Hook:**
```bash
chmod +x .git/hooks/pre-commit
# Or use pre-commit framework
# .pre-commit-config.yaml
repos:
- repo: https://github.com/shellcheck-py/shellcheck-py
rev: v0.11.0.0
hooks:
- id: shellcheck
args: ['--severity=warning']
# Install
pip install pre-commit
pre-commit install
```
## VS Code Integration
```json
// .vscode/settings.json
{
"shellcheck.enable": true,
"shellcheck.run": "onType",
"shellcheck.executablePath": "/usr/local/bin/shellcheck",
"shellcheck.exclude": ["SC1090", "SC1091"], // Optional excludes
"shellcheck.customArgs": [
"-x", // Follow source files
"--severity=warning"
]
}
```
## Docker Build Integration
```dockerfile
# Dockerfile with ShellCheck validation
FROM alpine:3.19 AS builder
# Install ShellCheck
RUN apk add --no-cache shellcheck bash
# Copy scripts
COPY scripts/ /scripts/
# Validate all scripts before continuing
RUN find /scripts -name "*.sh" -type f -exec shellcheck --severity=warning {} +
# Final stage
FROM alpine:3.19
COPY --from=builder /scripts/ /scripts/
RUN chmod +x /scripts/*.sh
ENTRYPOINT ["/scripts/entrypoint.sh"]
```
## Common ShellCheck Rules (2025)
### Regex Capture Groups: Use BASH_REMATCH
```bash
# ❌ Bad - Capture groups may not work as expected
if [[ "$string" =~ ([0-9]+)\.([0-9]+) ]]; then
echo "$1" # Wrong: $1 is script arg, not capture group
fi
# ✅ Good - Use BASH_REMATCH array
if [[ "$string" =~ ([0-9]+)\.([0-9]+) ]]; then
echo "${BASH_REMATCH[1]}.${BASH_REMATCH[2]}"
fi
```
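SC2327/SC2328 themselves flag a different capture pitfall: a command substitution whose output is entirely redirected away captures nothing. A sketch of the pattern (`out_file` is a hypothetical name):

```shell
out_file=$(mktemp)

# ❌ Bad (SC2327): the redirection sends all output to the file,
# so the substitution captures an empty string
bad=$(date > "$out_file")

# ✅ Good: capture first, then write the file if you also need it
good=$(date)
printf '%s\n' "$good" > "$out_file"

echo "bad captured ${#bad} characters; good captured ${#good}"
rm -f "$out_file"
```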
### SC2294: eval Negates Array Benefits (New)
```bash
# ❌ Bad - eval defeats array safety
eval "command ${array[@]}"
# ✅ Good - Direct array usage
command "${array[@]}"
```
### SC2295: Quote Expansions Inside ${}
```bash
# ❌ Bad
echo "${var-$default}" # $default not quoted
# ✅ Good
echo "${var-"$default"}"
```
### SC2086: Quote Variables
```bash
# ❌ Bad
file=$1
cat $file # Fails if filename has spaces
# ✅ Good
file=$1
cat "$file"
```
### SC2046: Quote Command Substitution
```bash
# ❌ Bad
for file in $(find . -name "*.txt"); do
echo $file
done
# ✅ Good
find . -name "*.txt" -print0 | while IFS= read -r -d '' file; do
echo "$file"
done
```
### SC2155: Separate Declaration and Assignment
```bash
# ❌ Bad
local result=$(command) # Hides command exit code
# ✅ Good
local result
result=$(command)
```
### SC2164: Use cd || exit
```bash
# ❌ Bad
cd /some/directory
./script.sh # Runs in wrong dir if cd fails
# ✅ Good
cd /some/directory || exit 1
./script.sh
```
## Google Shell Style Guide (50-Line Limit)
2025 recommendation: Keep scripts under 50 lines:
```bash
# ❌ Bad: 500-line monolithic script
#!/usr/bin/env bash
# ... 500 lines of code ...
# ✅ Good: Modular scripts < 50 lines each
# lib/logging.sh (20 lines)
log_info() { echo "[INFO] $*"; }
log_error() { echo "[ERROR] $*" >&2; }
# lib/validation.sh (30 lines)
validate_input() { ... }
check_dependencies() { ... }
# main.sh (40 lines)
source "$(dirname "$0")/lib/logging.sh"
source "$(dirname "$0")/lib/validation.sh"
main() {
validate_input "$@"
check_dependencies
# ... core logic ...
}
main "$@"
```
## Enforce in CI/CD
### Fail Build on Issues
```yaml
# Strict enforcement
- name: ShellCheck (Strict)
run: |
shellcheck --severity=warning scripts/*.sh
# Exit code 1 fails the build
# Advisory only (warnings but don't fail)
- name: ShellCheck (Advisory)
run: |
shellcheck --severity=warning scripts/*.sh || true
# Logs warnings but doesn't fail
```
### Generate Reports
```bash
# JSON format for parsing
shellcheck --format=json scripts/*.sh > shellcheck-report.json
# GitHub annotations format
shellcheck --format=gcc scripts/*.sh
# Human-readable
shellcheck --format=tty scripts/*.sh
```
## Modern Error Handling Trio (2025)
Always use with ShellCheck validation:
```bash
#!/usr/bin/env bash
# Modern error handling (non-negotiable in 2025)
set -o errexit # Exit on command failure
set -o nounset # Exit on undefined variable
set -o pipefail # Exit on pipe failure
# ShellCheck approved
main() {
local config_file="${1:?Config file required}"
if [[ ! -f "$config_file" ]]; then
echo "Error: Config file not found: $config_file" >&2
return 1
fi
# Safe command execution
local result
result=$(process_config "$config_file")
echo "$result"
}
main "$@"
```
## Best Practices (2025)
1. **Run ShellCheck in CI/CD (mandatory)**
2. **Use pre-commit hooks** to catch issues early
3. **Keep scripts under 50 lines** (Google Style Guide)
4. **Use modern error handling trio** (errexit, nounset, pipefail)
5. **Fix all warnings** before merging
6. **Document any disabled rules** with reasoning
7. **Integrate with IDE** for real-time feedback
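Practice 6 (document any disabled rules) uses ShellCheck's inline directive syntax: a `# shellcheck disable=SCxxxx` comment on the line above a command suppresses that rule for that command only, which is preferable to file-wide suppression. A runnable sketch (the config file and `greeting` variable are hypothetical):

```shell
# Generate a config file to source at runtime
config_file=$(mktemp)
echo 'greeting="hello"' > "$config_file"

# shellcheck disable=SC1090  # dynamic path: ShellCheck cannot follow it, but the file is generated and controlled by this script
source "$config_file"

echo "$greeting"
rm -f "$config_file"
```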
## Resources
- [ShellCheck](https://www.shellcheck.net)
- [Google Shell Style Guide](https://google.github.io/styleguide/shellguide.html)
- [ShellCheck GitHub Action](https://github.com/ludeeus/action-shellcheck)