# Node.js Streams: Processing Large Files Without Running Out of Memory
## The Memory Problem

```js
// This will OOM on a 2 GB file
const data = await fs.readFile('huge-file.csv'); // reads the entire file into memory
const lines = data.toString().split('\n');
// crashes with: JavaScript heap out of memory
```

Streams process data in chunks. You never load the full file; you process pieces as they arrive.

## Reading Files With Streams

```ts
import { createReadStream } from 'fs';
import { createInterface } from 'readline';

async function processCSV(filePath: