Commander of Shell: Makefile

· 3 min read

In the world of software development and DevOps, Makefile is an underrated yet powerful tool. It not only automates the compilation process but also serves as a versatile shell task manager, handling testing, deployment, and even system administration. When you need to organize and execute a sequence of shell commands efficiently, Makefile acts as the commander, ensuring a structured and streamlined execution.

What is Makefile?

Originally designed for Unix systems to manage code compilation via the make command, Makefile has evolved far beyond its initial purpose. Today, it is widely used as a powerful automation tool in DevOps workflows.

Why Use Makefile?

  1. Organized Command Execution: Define and reuse a series of shell commands to avoid manual input.
  2. Incremental Execution: make manages dependencies and only executes necessary tasks, improving efficiency.
  3. Cross-Platform Compatibility: Works on macOS, Linux, and Windows (via WSL or MinGW).
  4. Team Collaboration: A unified Makefile enables team members to execute development, testing, and deployment processes effortlessly.

Basic Makefile Syntax

A Makefile consists of a target, dependencies, and commands, following this structure:

target: dependencies
	command

Example:

build:
	echo "Starting compilation..."
	gcc main.c -o main

Running make build triggers the echo and gcc commands.

Advanced Syntax

1. Suppressing Command Output with @

By default, Makefile prints executed commands. To suppress output, prefix commands with @:

echo_test:
	@echo "This is a hidden command"

Executing make echo_test prints only This is a hidden command, without echoing the command line itself.

2. Using Functions

Makefile includes built-in functions like shell, which executes shell commands and returns results:

CURRENT_DIR := $(shell pwd)

echo_dir:
	@echo "Current directory: $(CURRENT_DIR)"

3. Built-in Variables and Pattern Rules

  • $@: Represents the target name
  • $^: All dependencies
  • $<: The first dependency

Example:

%.o: %.c
	gcc -c $< -o $@

This rule tells make how to build any .o file from the corresponding .c file.

In this rule, the .c file is the dependency and the .o file is the target. Targets are normally file names: make checks whether the target file already exists and is newer than its dependencies, and skips the commands if nothing has changed.

.PHONY: build

That is why you may see a .PHONY declaration, which tells make that build is the name of a task rather than a file to be generated. Without it, a file named build in the directory would make the target appear up to date and cause make to skip it.
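Putting these pieces together, a minimal Makefile for a small C project might look like the sketch below. The source file names (main.c, util.c) are placeholders:

```makefile
.PHONY: all clean

# Link all object files into the final binary
all: main.o util.o
	gcc main.o util.o -o main

# Pattern rule: rebuild a .o only when its matching .c is newer
%.o: %.c
	gcc -c $< -o $@

clean:
	rm -f *.o main
```

Running make (or make all) rebuilds only the object files whose sources changed, then relinks the binary.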

Makefile in DevOps

1. Automating Development Setup

setup:
	apt update && apt install -y python3
	pip install -r requirements.txt

2. Running Tests & CI/CD

test:
	pytest tests/

3. Deployment & Version Management

deploy:
	scp main user@server:/app/
	ssh user@server "systemctl restart app"
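These targets can also be chained through dependencies, so a single command runs the whole pipeline. As a sketch (target names mirror the examples above):

```makefile
.PHONY: release test deploy

# `make release` runs the tests first, then deploys
release: test deploy

test:
	pytest tests/

deploy:
	scp main user@server:/app/
	ssh user@server "systemctl restart app"
```

If the test target fails, make stops before deploy ever runs, which gives you a lightweight gate for free.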

Conclusion

Makefile is one of the best tools for managing shell commands, making software development and DevOps workflows more structured and efficient. By utilizing @ for cleaner output, built-in functions for flexibility, and variables for automation, you can enhance readability and performance. If your project hasn’t adopted Makefile yet, start today and let it be your automation commander!

The Foundation of DevOps: Shell

· 3 min read

In the IT world, Shell is one of the first tools engineers encounter. Whether managing servers, executing batch tasks, or handling CI/CD workflows, Shell plays an irreplaceable role. For DevOps, Shell is not just an interface to the operating system but also the foundation for automation.

What is Shell?

Shell is the bridge between the operating system and the user, responsible for parsing commands and passing them to the system kernel for execution. In Linux and macOS, Bash (Bourne Again Shell) is the most commonly used Shell, while Windows offers PowerShell.

Why Does DevOps Need Shell?

  1. Automation Foundation: Whether configuring servers, managing networks, or handling daily tasks, Shell scripts are the most lightweight and efficient option.
  2. Cross-Platform Support: Shell can run across different systems and execute remotely via SSH without requiring additional installations.
  3. Integration with Toolchains: Whether Ansible, Docker, Kubernetes, or CI/CD tools like Jenkins and GitHub Actions, almost all can seamlessly integrate with Shell scripts.
  4. Efficient Batch Processing: Allows executing multiple commands in batch, improving management efficiency and reducing repetitive work.

Shell Basics

Commands and Pipelines: Shell supports command combinations such as grep, awk, and sed, which can be linked together using | for efficient data processing.

Variables and Parameters: Define variables like $HOME and use $1, $2, ... to retrieve parameters, enhancing script flexibility.
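As a brief sketch of positional parameters, the following script (the file name greet.sh is hypothetical) reads its first argument and falls back to a default when none is given:

```shell
#!/bin/bash
# greet.sh — $1 is the first argument passed to the script;
# ${1:-world} substitutes "world" when no argument is supplied
name="${1:-world}"
echo "Hello, $name!"
```

Invoked as ./greet.sh Alice it prints Hello, Alice!; invoked with no arguments it prints Hello, world!.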

Conditional Statements and Loops: Use if-else, for, and while for control flow, for example:

#!/bin/bash
if command -v python3 &> /dev/null; then
    echo "Python3 is installed"
else
    echo "Python3 is not installed"
fi
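Since for and while are mentioned above, here is a small sketch of both (the service names are placeholders):

```shell
#!/bin/bash
# for: iterate over a fixed list of items
for svc in nginx redis postgres; do
    echo "Checking $svc"
done

# while: repeat until a condition fails
n=3
while [ "$n" -gt 0 ]; do
    echo "n=$n"
    n=$((n - 1))
done
```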

Functions: Improve script readability and reusability, for example:

#!/bin/bash
function backup() {
    tar -czf backup.tar.gz /important/data
}
backup
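Functions also accept positional arguments ($1, $2, ...), just like scripts. A sketch, using a temporary directory so the example is self-contained (paths and the function name are placeholders):

```shell
#!/bin/bash
# backup_dir takes a source directory and an optional archive name
backup_dir() {
    local src="$1"                      # directory to archive
    local dest="${2:-backup.tar.gz}"    # archive name, with a default
    tar -czf "$dest" "$src"
    echo "Archived $src to $dest"
}

src=$(mktemp -d)                        # stand-in for /important/data
backup_dir "$src" /tmp/data.tar.gz
```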

Shell Applications in DevOps

1. Server Automation

Using Shell scripts to automate server updates and monitor system status:

#!/bin/bash
apt update && apt upgrade -y
systemctl restart nginx

2. CI/CD Pipeline Integration

Control testing and deployment processes in GitHub Actions or Jenkins using Shell:

#!/bin/bash
echo "Starting tests..."
npm test
echo "Tests completed, deploying..."
scp -r ./build user@server:/var/www/html

3. Monitoring and Alerts

Using Shell to monitor disk usage and send a notification when it crosses a threshold:

#!/bin/bash
usage=$(df -h | grep '/dev/sda1' | awk '{print $5}' | sed 's/%//')
if [ "$usage" -gt 90 ]; then
    echo "Disk usage exceeds 90%" | mail -s "Warning" admin@example.com
fi
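A similar sketch works for memory, parsing the output of Linux's free command (the 90% threshold and the alert address are placeholders):

```shell
#!/bin/bash
# Percentage of memory in use: used ($3) divided by total ($2) on the Mem: line
mem_used=$(free | awk '/^Mem:/ {printf "%d", $3/$2*100}')
if [ "$mem_used" -gt 90 ]; then
    echo "Memory usage exceeds 90%" | mail -s "Warning" admin@example.com
fi
```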

Conclusion

Shell is not just an entry-level tool for DevOps but a key to improving work efficiency. With Shell, we can quickly automate daily operations and enhance the reliability of IT systems. If you're not yet familiar with Shell, now is the time to start learning and make it your DevOps powerhouse!

DevOps Assistant: Ansible

· 3 min read

Have you ever worked with remote machines? In the past, IT teams would use SSH to access servers, enter Linux commands line by line, and rely on outdated shell scripts left by predecessors to execute tasks on each machine. This process was time-consuming and error-prone, especially when dealing with software updates or debugging shell scripts.

In modern software development, automation has become key to improving efficiency and reliability. Among many automation tools, Ansible stands out as an essential assistant for DevOps teams. This article introduces Ansible's core features, advantages, and how to integrate it into your daily workflow.

What is Ansible?

Ansible is an open-source automation tool designed to simplify IT infrastructure management and application deployment. Using a straightforward and intuitive YAML syntax, Ansible makes configuration easy to learn and implement.

Advantages

  1. Easy to Use: Ansible uses YAML for configuration, which is simple and structured, helping beginners quickly get started.

  2. Highly Extensible: With a rich library of modules, developers can also write custom modules for specific needs.

  3. Secure and Reliable: No extra ports or agent installations are needed, reducing potential security risks.

  4. Cross-Platform Support: Compatible with multiple operating systems (e.g., Linux, Windows), making it suitable for heterogeneous environments.

How to Use Ansible?