
Mastering Cost Optimisation with Shell Scripting: Automate Log Storage in S3 for Budget-Friendly Infrastructure

Published at: 1/13/2025
Categories: devops, aws, jenkins
Author: pravesh_sudha_3c2b0c2b5e0

Learn how to leverage Shell scripting to streamline Jenkins log management and reduce cloud storage costs with AWS S3

💡 Introduction

Welcome to the world of DevOps! Today, we are diving into shell scripting by creating a script that helps reduce infrastructure costs by storing Jenkins logs in AWS S3 rather than on the VMs themselves.

Imagine a company like Google, which runs thousands of microservices, each generating logs, metrics, and traces that are stored on servers. Popular observability stacks at such companies include ELK (Elasticsearch, Logstash, Kibana) and EFK (Elasticsearch, Fluentd, Kibana). These stacks collect different types of logs, such as:

  • Application Logs: High-priority logs used to troubleshoot applications.
  • Kubernetes Control-Plane Logs: High-priority logs for troubleshooting cluster issues.
  • Infrastructure Logs: Logs from tools like Jenkins or Terraform.

Infrastructure logs are typically not critical enough to justify long-term storage on servers. For example, if a Jenkins build fails, notifications are sent via email or Slack, enabling instant troubleshooting. However, retaining these logs for backup and restoration purposes is still important.

To address this, we’ve created a shell script that uploads Jenkins build logs to S3, ensuring cost optimisation while maintaining access to logs when needed.


💡 Prerequisites

Before starting the project, ensure the following requirements are met:

  • An AWS Account.
  • Basic understanding of Shell Scripting.
  • Basic understanding of AWS S3.
  • Jenkins Installed on your system.

💡 Setting Up the Environment

Before we create the script, we need:

  1. A Sample Jenkins Project:
    • Create a pipeline project in Jenkins named hello-world-project using a Hello World template.


  • Run the pipeline 3–4 times to generate 3–4 log files.


  2. An S3 Bucket:
    • Create an S3 bucket. For example: bucket-for-jenkins-logs (ensure the name is unique).

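If you prefer the terminal over the console, the bucket can also be created with the AWS CLI once it is configured (see the next step). A sketch, using the example bucket name from above — bucket names are globally unique, so yours will differ, and the region is an assumption:

```shell
# Hypothetical: create the logs bucket from the CLI
# (replace the name and region with your own).
aws s3 mb s3://bucket-for-jenkins-logs --region us-east-1
```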

  3. Configure AWS CLI:
    • Generate an access key from AWS IAM.
    • Use the command aws configure to set up your AWS CLI.

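After running aws configure, it is worth confirming the CLI can actually reach AWS with those credentials — a quick sanity check:

```shell
# Prints the account ID and ARN of the configured identity;
# if this fails, the access key or region setup needs fixing.
aws sts get-caller-identity
```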

  4. Locate the Jenkins Home Directory:
    • Go to Jenkins > Manage Jenkins > System. The Jenkins home directory path is displayed at the top.



💡 Writing the Shell Script

Now, let’s create the script that will automate the upload process:

The Script: s3upload.sh

#!/bin/bash

######################
## Author: Pravesh-Sudha
## Description: Shell script to upload Jenkins logs to S3 bucket
## Version: v1
######################

# Variables
JENKINS_HOME="/Users/praveshsudha/.jenkins"  # Replace with your Jenkins home directory
S3_BUCKET="s3://bucket-for-jenkins-logs"  # Replace with your S3 bucket name
DATE=$(date +%Y-%m-%d)  # Today's date

# Check if AWS CLI is installed
if ! command -v aws &> /dev/null; then
    echo "AWS CLI is not installed. Please install it to proceed."
    exit 1
fi

# Iterate through all job directories
for job_dir in "$JENKINS_HOME/jobs/"*/; do
    job_name=$(basename "$job_dir")

    # Iterate through build directories for the job
    for build_dir in "$job_dir/builds/"*/; do
        # Get build number and log file path
        build_number=$(basename "$build_dir")
        log_file="$build_dir/log"

        # Check if log file exists and was created today
        if [ -f "$log_file" ] && [ "$(date -r "$log_file" +%Y-%m-%d)" == "$DATE" ]; then
            # Upload log file to S3 with the build number as the filename
            aws s3 cp "$log_file" "$S3_BUCKET/$job_name-$build_number.log" --only-show-errors

            if [ $? -eq 0 ]; then
                echo "Uploaded: $job_name/$build_number to $S3_BUCKET/$job_name-$build_number.log"
            else
                echo "Failed to upload: $job_name/$build_number"
            fi
        fi
    done
done
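The upload is gated by the date check on each log file's modification time. That check can be sanity-tested in isolation — a minimal sketch, assuming GNU date, where date -r FILE prints the file's modification time:

```shell
#!/bin/bash
# Sketch: a freshly created file's modification date equals today's date,
# so the script's "created today" filter would select it for upload.
TODAY=$(date +%Y-%m-%d)
tmp=$(mktemp)   # mtime is "now"
if [ "$(date -r "$tmp" +%Y-%m-%d)" == "$TODAY" ]; then
    echo "fresh: would be uploaded"
else
    echo "stale: would be skipped"
fi
rm -f "$tmp"
```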

Running the Script

  1. Grant Execution Permissions:
   chmod +x s3upload.sh
  2. Execute the Script:
   ./s3upload.sh

The script will:

  • Iterate through Jenkins job and build directories.
  • Identify logs created on the current date.
  • Upload logs to your specified S3 bucket with appropriate naming.
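To make this genuinely hands-off, the script can be scheduled to run once a day — a sketch crontab entry, assuming a hypothetical path for the script and log file (adjust both to your system):

```shell
# Run the uploader daily at 23:55 so the whole day's builds are covered;
# add with `crontab -e` (path is a placeholder).
55 23 * * * /path/to/s3upload.sh >> /tmp/s3upload-cron.log 2>&1
```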



💡 Conclusion

Congratulations! 🎉 You’ve successfully created a shell script to automate the process of uploading Jenkins logs to S3, reducing costs and ensuring seamless log management.

But there’s more! AWS S3 supports Lifecycle Management, allowing you to define rules to automatically transition older or less important logs to cheaper storage classes like Glacier or Deep Archive. These options provide even greater cost savings for infrequently accessed logs.
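As a sketch of what such a rule could look like, here is a hypothetical lifecycle configuration applied via the AWS CLI — the bucket name comes from this article, and the 30/90-day thresholds are example values, not recommendations:

```shell
# Hypothetical lifecycle rule: transition logs to Glacier after 30 days
# and to Deep Archive after 90 days (thresholds are examples).
aws s3api put-bucket-lifecycle-configuration \
  --bucket bucket-for-jenkins-logs \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "archive-jenkins-logs",
      "Status": "Enabled",
      "Filter": {"Prefix": ""},
      "Transitions": [
        {"Days": 30, "StorageClass": "GLACIER"},
        {"Days": 90, "StorageClass": "DEEP_ARCHIVE"}
      ]
    }]
  }'
```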

This project demonstrates how a simple shell script, combined with cloud services, can solve real-world challenges in a DevOps workflow.

Keep experimenting, keep learning, and most importantly, keep scripting! 😊

🚀 For more informative blogs, follow me on Hashnode, X (Twitter), and LinkedIn.
