Effective process management and job control are crucial skills for writing efficient and powerful shell scripts. In this article, we'll explore techniques to manage multiple processes, control job execution, and implement parallel processing in Bash.
To run a command in the background, append an ampersand (&) to the command:
long_running_command &
echo "This will print immediately"
Use the jobs command to list background jobs along with their job numbers. To move a job between the foreground and background:
Use Ctrl+Z to suspend a running foreground job, then:
bg # Resume the job in the background
fg # Resume the job in the foreground
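As a hypothetical interactive session, suspending a job and moving it around might look like this (job numbers will vary):
sleep 300            # Start a long-running command, then press Ctrl+Z to suspend it
jobs                 # Lists jobs, e.g. [1]+ Stopped  sleep 300
bg %1                # Resume job 1 in the background
fg %1                # Bring job 1 back to the foreground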
To terminate a job:
kill %n # Kill job number n
kill -9 %n # Force kill job number n
Process substitution allows you to use the output of a command as a file:
diff <(ls dir1) <(ls dir2)
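Process substitution also lets you feed a while read loop without a pipeline, so variables modified inside the loop keep their values afterwards. A minimal sketch counting the entries in /etc:
count=0
while read -r entry; do
    count=$((count + 1))
done < <(ls /etc)              # Read the output of ls as if it were a file
echo "Found $count entries in /etc"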
GNU Parallel is a powerful tool for parallel execution:
# Install GNU Parallel
sudo apt-get install parallel
# Example usage
parallel echo ::: A B C
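As a slightly more realistic sketch (the logs/ directory and its .log files are assumed for illustration), parallel can run one job per input using the {} placeholder and a job limit:
# Compress every .log file, running at most 4 jobs at a time
parallel -j 4 gzip {} ::: logs/*.log
# {} is replaced by each input argument in turn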
Here's a simple way to run commands in parallel:
#!/bin/bash
process_file() {
    echo "Processing $1"
    sleep 2    # Simulate work
    echo "Finished processing $1"
}

for file in file1 file2 file3 file4; do
    process_file "$file" &
done

wait    # Wait for all background jobs to finish
echo "All files processed"
Use the trap command to handle signals:
#!/bin/bash
cleanup() {
    echo "Cleaning up..."
    # Add cleanup code here
}

trap cleanup EXIT SIGINT SIGTERM

# Your script commands here
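As a concrete example, a common use of trap is making sure a temporary file is removed no matter how the script ends. A minimal sketch (the temporary file comes from mktemp):
#!/bin/bash
tmpfile=$(mktemp)                    # Temporary working file
cleanup() {
    echo "Removing $tmpfile"
    rm -f "$tmpfile"
}
trap cleanup EXIT SIGINT SIGTERM     # Clean up on exit or interruption

echo "Working in $tmpfile"
date > "$tmpfile"
sleep 10                             # Press Ctrl+C here to watch the trap fire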
Use the kill command to send signals to processes:
kill -SIGTERM $PID # Send SIGTERM to process with PID
kill -9 $PID # Send SIGKILL (force quit) to process with PID
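A common pattern is to request a graceful shutdown first and only force-kill if the process is still alive after a grace period. A minimal sketch, assuming $PID holds the target process ID:
kill -SIGTERM "$PID"                 # Ask the process to exit cleanly
sleep 5                              # Grace period for shutdown
if kill -0 "$PID" 2>/dev/null; then  # kill -0 only checks whether the process still exists
    echo "Process $PID did not exit; sending SIGKILL"
    kill -9 "$PID"
fi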
Use commands like top, ps, and time to monitor process resources:
top -b -n 1 # Display top processes
ps aux | grep script_name # Find specific processes
time ./myscript.sh # Measure execution time
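To watch a single process from inside a script, you can poll ps for just the fields you need. A minimal sketch, assuming $PID is the process to observe:
while kill -0 "$PID" 2>/dev/null; do
    # Print CPU %, memory %, and elapsed time for this PID (trailing = suppresses headers)
    ps -o %cpu=,%mem=,etime= -p "$PID"
    sleep 2
done
echo "Process $PID has exited"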
Let's create a script that demonstrates parallel processing of multiple files:
#!/bin/bash

# Maximum number of parallel processes
MAX_PROCS=4

# Function to process a file
process_file() {
    local file=$1
    echo "Processing $file"
    sleep 2    # Simulate work
    wc -l "$file" > "$file.count"
    echo "Finished processing $file"
}

# Export the function so it's available to parallel
export -f process_file

# Create some sample files
for i in {1..10}; do
    echo "This is file $i" > "file$i.txt"
    echo "It has a second line" >> "file$i.txt"
done

# Process files in parallel; parallel blocks until all of its jobs have finished
find . -name "file*.txt" | parallel -j "$MAX_PROCS" process_file

echo "All files processed. Results:"
cat *.count

# Cleanup
rm file*.txt file*.count
echo "Script completed"
Mastering process management and job control allows you to create more efficient and powerful Bash tools. By leveraging background processes, parallel execution, and proper signal handling, you can significantly improve the performance and reliability of your scripts, especially when dealing with time-consuming tasks or multiple operations.
In our next article, we'll explore advanced text manipulation techniques in shell scripting, including awk and sed for complex data processing. Stay tuned for more insights into advanced Bash scripting!