Posted by Kosal
When dealing with large files or high-traffic applications, upload handlers that buffer the entire request body in memory quickly become inefficient in Node.js. Streams offer a more scalable, memory-efficient way to handle file uploads by processing data in chunks rather than loading everything into memory at once. This article explores how to use streams to optimize file uploads in your Node.js applications.
Streams in Node.js are objects that let you read data from a source or write data to a destination in a continuous fashion. Instead of buffering the entire file in memory, streams allow you to process data as it arrives — making them ideal for handling large file uploads or high-volume traffic.
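For example, here is a minimal sketch (with placeholder file names) of piping a readable stream into a writable one, which copies a file chunk by chunk:

// copy-stream.js: a minimal sketch; file names are placeholders
import { createReadStream, createWriteStream } from 'node:fs';

// Each chunk flows from source to destination as it is read,
// so memory usage stays flat no matter how large the file is.
const source = createReadStream('large-input.bin');
const destination = createWriteStream('copy-output.bin');

source.pipe(destination);
destination.on('finish', () => console.log('Copy complete.'));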
Benefits of using streams for file uploads:
- Low memory footprint: only a small chunk of the file is held in memory at any time.
- Better scalability: the server can handle many concurrent uploads without exhausting RAM.
- Backpressure handling: Node.js pauses the source when the destination (disk or network) cannot keep up.
- Faster processing: data can be written to its destination while it is still arriving.
Here’s how you can set up a simple HTTP server that processes file uploads using streams. First, create a project and install formidable, a library that parses multipart form data and streams each uploaded file to its destination:
mkdir file-upload-streams
cd file-upload-streams
npm init -y
npm install formidable
Use the built-in http module and the formidable library to parse incoming form data:
// server.js
import http from 'node:http';
import fs from 'node:fs';
import path from 'node:path';
import { fileURLToPath } from 'node:url';
import formidable, { errors as formidableErrors } from 'formidable';
// Setup __dirname since we're using ESM
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
// Ensure the uploads directory exists
const uploadDir = path.join(__dirname, 'uploads');
if (!fs.existsSync(uploadDir)) {
  fs.mkdirSync(uploadDir);
}

const server = http.createServer(async (req, res) => {
  if (req.url === '/api/upload' && req.method.toLowerCase() === 'post') {
    const form = formidable({
      uploadDir,
      keepExtensions: true, // Preserve file extension
      maxFileSize: 200 * 1024 * 1024, // 200MB limit
      multiples: true,
    });

    try {
      // form.parse consumes the request stream and writes each file
      // to uploadDir chunk by chunk, so the whole file is never in memory.
      const [fields, files] = await form.parse(req);
      res.writeHead(200, { 'Content-Type': 'application/json' });
      res.end(JSON.stringify({ message: 'Files uploaded successfully', fields, files }, null, 2));
    } catch (err) {
      if (err.code === formidableErrors.maxFieldsExceeded) {
        console.error('Too many fields!');
      }
      console.error(err);
      res.writeHead(err.httpCode || 400, { 'Content-Type': 'text/plain' });
      res.end(String(err));
    }
    return;
  }

  // Display upload form
  res.writeHead(200, { 'Content-Type': 'text/html' });
  res.end(`
    <h2>Upload a File with Node.js + Formidable</h2>
    <form action="/api/upload" enctype="multipart/form-data" method="post">
      <div>Text field title: <input type="text" name="title" /></div>
      <div>Files: <input type="file" name="multipleFiles" multiple /></div>
      <input type="submit" value="Upload" />
    </form>
  `);
});

server.listen(3000, () => {
  console.log('Server listening on http://localhost:3000/');
});
Run the server with node server.js and open your browser at http://localhost:3000. Upload a file using the form; the server will stream the file directly to disk, avoiding large memory usage.
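You can also exercise the endpoint without a browser. Below is a small test-client sketch that assumes Node.js 18+ (for the global fetch, FormData, and Blob APIs) and uses a tiny in-memory blob in place of a real file; the field names match the form above:

// upload-client.js: a test-client sketch, assuming Node.js 18+
// for the global fetch, FormData, and Blob APIs (run as ESM).
const form = new FormData();
form.append('title', 'My upload');
// A small in-memory blob stands in for a real file in this sketch.
form.append('multipleFiles', new Blob(['hello from the client']), 'hello.txt');

const res = await fetch('http://localhost:3000/api/upload', {
  method: 'POST',
  body: form, // fetch sets the multipart boundary header automatically
});
console.log(await res.json());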
When working with streams for uploads, keep the following in mind (a sketch of the first point follows this list):
- Handle errors: attach a listener with .on('error', callback) to every stream you create, or use a helper such as stream.pipeline that wires up error propagation for you; an unhandled stream error can crash the process.
- Enforce limits: options like the maxFileSize setting used above keep a single oversized upload from filling the disk.
- Clean up after failures: if an upload aborts midway, remove any partially written file.
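For the error-handling point, a convenient approach is the pipeline helper from node:stream/promises, which forwards errors from every stream it wires together. A minimal sketch, with a placeholder source and destination:

// error-handling.js: a sketch of stream error handling with pipeline.
// The source file and output path are placeholders.
import { createReadStream, createWriteStream } from 'node:fs';
import { pipeline } from 'node:stream/promises';

try {
  // pipeline moves the data, forwards errors from either stream,
  // and destroys both on failure, so nothing is left dangling.
  await pipeline(
    createReadStream('large-input.bin'),
    createWriteStream('uploads/output.bin'),
  );
} catch (err) {
  // Equivalent to attaching .on('error', callback) to each stream by hand.
  console.error('Upload stream failed:', err);
}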
Node.js streams provide a robust and scalable solution for handling file uploads efficiently. By processing data in chunks, you can minimize memory usage, improve performance, and scale your application with ease. Whether saving files to local storage or uploading them directly to a cloud service like S3, streams are a powerful tool every Node.js developer should understand and leverage.
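For the S3 case, formidable lets you replace its disk destination with any writable stream via the fileWriteStreamHandler option. The following sketch assumes the AWS SDK v3 packages @aws-sdk/client-s3 and @aws-sdk/lib-storage are installed; the bucket name and region are placeholders:

// s3-upload.js: a sketch of streaming uploads straight to S3.
// Assumes @aws-sdk/client-s3 and @aws-sdk/lib-storage are installed;
// the bucket name and region below are placeholders.
import { PassThrough } from 'node:stream';
import { randomUUID } from 'node:crypto';
import formidable from 'formidable';
import { S3Client } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';

const s3 = new S3Client({ region: 'us-east-1' });

const form = formidable({
  // Return a writable stream per file; formidable pipes the uploaded
  // chunks into it instead of writing them to the local uploads dir.
  fileWriteStreamHandler: () => {
    const pass = new PassThrough();
    const upload = new Upload({
      client: s3,
      params: {
        Bucket: 'my-upload-bucket', // placeholder
        Key: `uploads/${randomUUID()}`,
        Body: pass, // S3 consumes the stream as chunks arrive
      },
    });
    // Propagate S3 failures to the stream so they are not swallowed.
    upload.done().catch((err) => pass.destroy(err));
    return pass;
  },
});
// Use form.parse(req) inside the request handler exactly as before.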