Node.js Performance Tips
Introduction
Node.js handles I/O-heavy workloads efficiently, but a poorly optimized application can still suffer from slow response times and high resource consumption. This guide covers essential techniques to optimize your Node.js applications for production environments.
1. Use the Cluster Module
Node.js executes your JavaScript on a single thread by default. Use the cluster module to spawn multiple worker processes that share the same server port, so the application can utilize all CPU cores.
const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
  const numCPUs = os.cpus().length;

  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }

  cluster.on('exit', (worker) => {
    console.log(`Worker ${worker.id} died`);
    cluster.fork(); // Restart worker
  });
} else {
  // Start your server here
  require('./app');
}
2. Implement Caching Strategies
Caching dramatically reduces database queries and API calls. Use Redis for distributed caching.
const redis = require('redis');

const client = redis.createClient();
client.on('error', (err) => console.error('Redis client error', err));
// node-redis v4+ requires an explicit connect before issuing commands
client.connect().catch((err) => console.error('Redis connect failed', err));

async function getUser(id) {
  const cacheKey = `user:${id}`;

  // Try cache first
  const cached = await client.get(cacheKey);
  if (cached) {
    return JSON.parse(cached);
  }

  // Cache miss: fetch from the database
  const user = await db.users.findById(id);

  // Cache for 1 hour
  await client.setEx(cacheKey, 3600, JSON.stringify(user));
  return user;
}
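Read-through caching alone leaves stale data behind when a record changes. A minimal invalidation sketch, assuming the same `client` as above and a hypothetical `db.users.update` call on your data layer: delete the key on every write so the next read repopulates it.
async function updateUser(id, changes) {
  // db.users.update is a placeholder for your actual write path
  const user = await db.users.update(id, changes);
  // Drop the stale cache entry; getUser() will repopulate it on the next read
  await client.del(`user:${id}`);
  return user;
}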
3. Use Streaming for Large Data
Streams process data in chunks, reducing memory consumption for large files or datasets.
const fs = require('fs');

// Bad: loads the entire file into memory
fs.readFile('large-file.txt', (err, data) => {
  res.send(data);
});

// Good: streams data in chunks
const readStream = fs.createReadStream('large-file.txt');
readStream.pipe(res);
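One caveat with bare `.pipe()`: errors on the read stream are not forwarded to the response. A small sketch using Node's built-in `stream.pipeline`, assuming `res` is the Express response from the snippet above, which destroys both streams and reports the failure:
const { pipeline } = require('stream');

const readStream = fs.createReadStream('large-file.txt');
pipeline(readStream, res, (err) => {
  if (err) {
    console.error('Streaming failed', err);
    // Only send a status if nothing has been written to the client yet
    if (!res.headersSent) res.sendStatus(500);
  }
});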
4. Optimize Database Queries
Database queries are often the bottleneck. Use indexes, connection pooling, and query optimization.
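Add Indexes for Frequent Lookups
A query that filters or sorts on an unindexed field forces a full scan. As a sketch, assuming a Mongoose-style `User` model like the one queried below (in SQL the equivalent is `CREATE INDEX`):
const mongoose = require('mongoose');

const userSchema = new mongoose.Schema({
  name: String,
  email: { type: String, unique: true }, // unique constraint also creates an index
  createdAt: { type: Date, default: Date.now }
});
// Secondary index for queries that sort or filter by newest users
userSchema.index({ createdAt: -1 });

const User = mongoose.model('User', userSchema);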
Use Connection Pooling
const { Pool } = require('pg'); // assuming node-postgres, whose Pool takes these options

const pool = new Pool({
  max: 20,
  idleTimeoutMillis: 30000,
  connectionTimeoutMillis: 2000,
});
Select Only Required Fields
// Bad: Fetches all columns
const users = await User.find();
// Good: Selects specific fields
const users = await User.find().select('name email');
5. Use Compression
Enable gzip/brotli compression to reduce response sizes significantly.
const compression = require('compression');
const express = require('express');
const app = express();
// Enable compression for all responses
app.use(compression());
6. Profile Your Application
Use profiling tools to identify bottlenecks and memory leaks.
node --prof app.js
node --prof-process isolate-*.log > processed.txt
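For memory leaks specifically, heap snapshots are often more useful than the CPU profile above. A minimal sketch using the built-in `v8` module (the SIGUSR2 trigger is just one convenient choice on POSIX systems); open the resulting `.heapsnapshot` files in Chrome DevTools and compare them over time:
const v8 = require('v8');

// Write a heap snapshot whenever the process receives SIGUSR2
// (for example: kill -USR2 <pid>), then diff successive snapshots in DevTools
process.on('SIGUSR2', () => {
  const file = v8.writeHeapSnapshot();
  console.log(`Heap snapshot written to ${file}`);
});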
7. Avoid Blocking the Event Loop
Keep the event loop free for I/O operations. Move CPU-intensive tasks to worker threads.
const { Worker } = require('worker_threads');

function runCPUIntensiveTask(data) {
  return new Promise((resolve, reject) => {
    const worker = new Worker('./cpu-task.js', {
      workerData: data
    });
    worker.on('message', resolve);
    worker.on('error', reject);
    worker.on('exit', (code) => {
      // Reject if the worker dies without ever posting a result
      if (code !== 0) reject(new Error(`Worker stopped with exit code ${code}`));
    });
  });
}
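The snippet above expects a `./cpu-task.js` worker script; a minimal sketch of that file, where `heavyComputation` is a stand-in for your real CPU-bound work:
// cpu-task.js
const { parentPort, workerData } = require('worker_threads');

// Stand-in for real CPU-bound work (hashing, parsing, image processing, ...)
function heavyComputation(data) {
  let total = 0;
  for (let i = 0; i < 1e8; i++) {
    total += i % 7;
  }
  return { input: data, total };
}

// Compute with the data passed from the main thread and send the result back
parentPort.postMessage(heavyComputation(workerData));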
8. Use Proper Logging
Excessive logging can impact performance. Use structured logging with appropriate log levels.
const pino = require('pino');

const logger = pino({
  level: process.env.LOG_LEVEL || 'info',
  // pino-pretty is a development convenience; omit the transport in production
  // so logs stay as fast newline-delimited JSON
  transport: {
    target: 'pino-pretty',
    options: { colorize: true }
  }
});

logger.info({ userId: 123 }, 'User logged in');
9. Implement Rate Limiting
Protect your API from abuse and ensure fair resource distribution.
const rateLimit = require('express-rate-limit');

const limiter = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100, // limit each IP to 100 requests per windowMs
  message: 'Too many requests from this IP'
});

app.use('/api/', limiter);
10. Monitor Memory Usage
Monitor and fix memory leaks to prevent application crashes.
// Monitor memory usage once a minute
setInterval(() => {
  const usage = process.memoryUsage();
  console.log({
    rss: `${Math.round(usage.rss / 1024 / 1024)}MB`,
    heapTotal: `${Math.round(usage.heapTotal / 1024 / 1024)}MB`,
    heapUsed: `${Math.round(usage.heapUsed / 1024 / 1024)}MB`
  });
}, 60000);
Performance Checklist
- ✓ Use clustering to utilize all CPU cores
- ✓ Implement caching with Redis
- ✓ Use streams for large data processing
- ✓ Enable compression (gzip/brotli)
- ✓ Optimize database queries and indexes
- ✓ Profile your application regularly
- ✓ Implement rate limiting
- ✓ Monitor memory usage and fix leaks
Conclusion
Optimizing Node.js applications requires a multi-faceted approach covering clustering, caching, database optimization, and careful monitoring. By implementing these techniques, you can build high-performance applications that scale effectively under load. Remember to profile regularly and optimize based on real-world data.