Using MinIO Server for Local Development: A Smarter Alternative to S3
File uploads are a staple of many applications, enabling users to share images, videos, documents, and more. While S3 is a popular choice for handling file storage, it's not always ideal for local development and testing. The need to make HTTP calls via the AWS SDK introduces latency, and relying on S3 during development can be slow and cumbersome. Mocking or spying during end-to-end testing might work, but it often feels like a compromise.
That's where MinIO Server comes in: a high-performance, S3-compatible object storage solution that's perfect for local development. MinIO offers the same S3 APIs, making it a seamless alternative to integrate into your existing workflows. In this article, we'll demonstrate how to set up and use MinIO with a Node.js application, while also highlighting its unique benefits.
Why MinIO?
- Lightning-fast local operations instead of internet-dependent API calls
- Identical S3 API compatibility: no code changes needed
- A sleek web interface for manual file management
- Cost savings on S3 requests during development
- Perfect isolation for testing environments
- Zero internet dependency for file operations
Setting Up MinIO with Docker
Getting started is straightforward. While you can download MinIO directly, Docker makes it even easier. Here's how to configure MinIO using Docker Compose.
- Install Docker: If you don't already have Docker installed, download it from the Docker website.
- Create a docker-compose.yml file and add the following configuration:
version: "3"
services:
  file-storage:
    restart: always
    image: minio/minio:RELEASE.2022-06-11T19-55-32Z
    container_name: s3-file-storage
    environment:
      MINIO_ROOT_USER: admin
      MINIO_ROOT_PASSWORD: Password1234
    command: server --console-address ":9001" /file-storage-volume
    volumes:
      - file-storage-volume:/file-storage-volume
    ports:
      - 9000:9000 # Files served via this port
      - 9001:9001 # Admin console accessible via this port
volumes:
  file-storage-volume:
- Run Docker Compose:
docker-compose up -d
Once running, you can access:
- API endpoint at http://localhost:9000
- Web console at http://localhost:9001 (log in with admin/Password1234)
Integrating MinIO with Your Node.js Application
Here's where the magic happens. MinIO seamlessly replaces S3 in your development environment while keeping production untouched. Here's how to set it up:
Install Dependencies
Make sure you have the AWS SDK installed, along with lodash (used below for its once helper):
npm install @aws-sdk/client-s3 lodash
Configure MinIO and S3
Use the following configuration to create a storage instance:
import { once } from "lodash";
import { S3 } from "@aws-sdk/client-s3";

const config = {
  awsRegion: "eu-west-2",
  minioBaseUrl: "http://localhost:9000",
  minioUsername: "admin",
  minioPassword: "Password1234",
};

const isDevelopmentOrTest = process.env.NODE_ENV !== "production";

const createOrGetStorageInstance = once(() => {
  if (isDevelopmentOrTest) {
    // Point the S3 client at the local MinIO server.
    // forcePathStyle is required: MinIO serves buckets in the URL path
    // (http://localhost:9000/<bucket>) rather than in the hostname.
    return new S3({
      endpoint: config.minioBaseUrl,
      forcePathStyle: true,
      region: config.awsRegion,
      credentials: {
        accessKeyId: config.minioUsername,
        secretAccessKey: config.minioPassword,
      },
    });
  }
  // In production, fall back to real S3 with the default credential chain.
  return new S3({ region: config.awsRegion });
});

export const s3 = createOrGetStorageInstance();
This setup uses MinIO during development and seamlessly falls back to S3 in production.
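The forcePathStyle flag matters because MinIO addresses buckets in the URL path, while S3 defaults to virtual-hosted addressing (bucket in the hostname). As a rough sketch of the difference when building public object URLs (the helper name and parameters here are illustrative, not part of the article's setup):

```javascript
// Sketch: build a public URL for an object in each environment.
// objectUrl and its parameters are illustrative names, not from the repo.
function objectUrl({ bucket, key, isLocal, region, minioBaseUrl }) {
  if (isLocal) {
    // Path-style addressing, which forcePathStyle enables:
    // the bucket lives in the URL path.
    return `${minioBaseUrl}/${bucket}/${key}`;
  }
  // Virtual-hosted-style addressing, S3's default:
  // the bucket lives in the hostname.
  return `https://${bucket}.s3.${region}.amazonaws.com/${key}`;
}

console.log(
  objectUrl({
    bucket: "my-bucket",
    key: "uploads/logo.png",
    isLocal: true,
    minioBaseUrl: "http://localhost:9000",
  })
); // http://localhost:9000/my-bucket/uploads/logo.png
```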
Putting It All Together: File Upload Example
Here's a practical example using Express and express-fileupload:
Install Dependencies
npm install express express-fileupload @aws-sdk/lib-storage
Implement File Upload
Here's a simple Express route for uploading files:
import express from "express";
import fileUpload from "express-fileupload";
import { Upload } from "@aws-sdk/lib-storage";
import { s3 } from "./storage"; // adjust the path to wherever you created the storage instance above

const app = express();
app.use(fileUpload());

app.post("/upload", async (req, res) => {
  if (!req.files || !req.files.files) {
    return res.status(400).send("No files were uploaded.");
  }
  // express-fileupload gives a single object for one file, an array for many.
  const uploadedFiles = Array.isArray(req.files.files)
    ? req.files.files
    : [req.files.files];
  const uploadedUrls = await Promise.all(
    uploadedFiles.map(async (file) => {
      const key = `uploads/${file.name}`;
      // The v3 SDK has no s3.upload().promise(); use Upload from
      // @aws-sdk/lib-storage instead, which also returns the object's Location.
      const uploadResult = await new Upload({
        client: s3,
        params: {
          Bucket: "my-bucket",
          Key: key,
          Body: file.data,
          ContentType: file.mimetype,
        },
      }).done();
      return uploadResult.Location;
    })
  );
  res.json({ message: "Files uploaded successfully", urls: uploadedUrls });
});

app.listen(3000, () => console.log("Server running on port 3000"));
Postman Example
Before testing, make sure the my-bucket bucket exists (you can create it in the MinIO web console at http://localhost:9001). Then, in Postman, send a multipart/form-data POST request to http://localhost:3000/upload with one or more files under the files field; the response lists the uploaded URLs.
Pro Tips for MinIO Usage
- Web Console Features: Take advantage of MinIO's web interface to:
  - Browse uploaded files
  - Create and manage buckets
  - Set up access policies
  - Monitor usage statistics
- Development Workflow:
  - Create separate buckets for different file types (images, documents, etc.)
  - Use the web console to verify uploads during development
  - Use environment variables to switch between MinIO and S3
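The separate-buckets tip can be sketched as a tiny lookup keyed on the file's MIME type. The bucket names here are made up for illustration:

```javascript
// Sketch: route files to per-type buckets (bucket names are illustrative).
const BUCKETS = {
  image: "app-images",
  application: "app-documents",
};

function bucketFor(mimetype) {
  const type = mimetype.split("/")[0]; // "image/png" -> "image"
  return BUCKETS[type] ?? "app-misc";
}

console.log(bucketFor("image/png")); // app-images
console.log(bucketFor("video/mp4")); // app-misc
```

In the upload route, you would then pass Bucket: bucketFor(file.mimetype) instead of a hard-coded bucket name.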
Conclusion
MinIO Server is a game-changer for local development. It provides the perfect balance of functionality and simplicity, making your development process faster and more reliable. With identical S3 API compatibility and a user-friendly web interface, it's the ideal solution for local development and testing environments.
Remember: your production environment still uses real S3, but your development environment gets all the benefits of local, lightning-fast file operations. It's the best of both worlds!
Try it out in your next project, and you'll wonder how you ever developed without it.
Source Code
GitHub Repository: https://github.com/Nedum84/minio-s3-server
Let's Connect
- Twitter: @thenelson_o
- LinkedIn: Nelson Odo
What Do You Think?
Share your thoughts! Did this post help you?
Have ideas or feedback to share? Let's continue the conversation in the comments below!