Examples to read a file in Node.js, including:
- Read a file's entire contents (small file): fs.readFile(), asynchronous, and fs.readFileSync(), synchronous
- Performance considerations: fs.read(), with your control over read chunk size, position, etc.
- Read a file line by line (big file): readline, Node.js's native module, and line-reader, an open-source package you need to install
- A method you should avoid when reading a huge file's lines
- No need to check file existence before reading; handle errors in the callback instead
Read a file's entire contents at once (small file)
Methods:
- fs.readFile(): asynchronously reads the entire contents of a file.
- fs.readFileSync(): similar to fs.readFile() but synchronous.
Method 1 — fs.readFile() (asynchronous)
fs.readFile() asynchronously reads the entire contents of a file.
words.txt
spring 春
summer 夏
autumn 秋
winter 冬
index.js
const fs = require('fs');

// Asynchronously read the entire contents of a file.
function readFileAsync(file) {
  // If the file does not exist, err will be "no such file or directory" (ENOENT).
  fs.readFile(file, 'utf8', (err, data) => {
    if (err) {
      console.error(err);
      return;
    }
    console.log(data);
  });
}
readFileAsync('./words.txt');
Output:
spring 春
summer 夏
autumn 秋
winter 冬
Method 2 — fs.readFileSync()
fs.readFileSync() synchronously returns the entire contents of a file.
const fs = require('fs');

// Synchronously read the entire contents of a file.
function readFileSync(file) {
  try {
    const data = fs.readFileSync(file, 'utf8');
    console.log(data);
  } catch (err) {
    console.error(err);
  }
}

readFileSync('./words.txt');
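Recent Node.js versions also expose a promise-based variant of the same operation. Here is a minimal sketch (my addition, assuming Node.js 14+ for the fs/promises module and reusing words.txt):
const fsp = require('fs/promises');

// Promise-based variant: await the entire contents of the file.
async function readFilePromise(file) {
  try {
    const data = await fsp.readFile(file, 'utf8');
    console.log(data);
  } catch (err) {
    console.error(err);
  }
}

readFilePromise('./words.txt');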
Performance considerations
fs.readFile() and fs.readFileSync() read the entire contents of the file into memory before returning the data, which can have a significant memory cost. For a big file, it is preferable to use streaming via fs.createReadStream().
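For example, a minimal streaming sketch (my own illustration, reusing words.txt) that consumes the file chunk by chunk:
const fs = require('fs');

// Stream the file in chunks instead of buffering it all at once.
const stream = fs.createReadStream('./words.txt', { encoding: 'utf8' });

stream.on('data', (chunk) => {
  // Each chunk is at most highWaterMark bytes (64 KiB by default for fs streams).
  process.stdout.write(chunk);
});
stream.on('error', (err) => console.error(err));
stream.on('end', () => console.log('Done.'));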
fs.readFile():
The fs.readFile() function buffers the entire file. To minimize memory costs, when possible prefer streaming via fs.createReadStream().
Aborting an ongoing request does not abort individual operating system requests but rather the internal buffering fs.readFile performs.
Performance Considerations:
The fs.readFile() method asynchronously reads the contents of a file into memory one chunk at a time, allowing the event loop to turn between each chunk. This allows the read operation to have less impact on other activity that may be using the underlying libuv thread pool, but means that it will take longer to read a complete file into memory.
The additional read overhead can vary broadly on different systems and depends on the type of file being read. If the file type is not a regular file (a pipe for instance) and Node.js is unable to determine an actual file size, each read operation will load 64 KiB of data. For regular files, each read will process 512 KiB of data.
For applications that require as-fast-as-possible reading of file contents, it is better to use fs.read() directly and for application code to manage reading the full contents of the file itself.
Here is an example using fs.read():
const fs = require('fs');

function read(file) {
  fs.open(file, 'r', (err, fd) => {
    if (err) {
      console.error(err);
      return;
    }
    const buffer = Buffer.alloc(1024);
    // Call fs.read(fd, buffer, offset, length, position, callback).
    fs.read(fd,
      buffer,
      0,             // offset: the position in buffer to write the data to.
      buffer.length, // length: the number of bytes to read.
      0,             // position: where to begin reading from in the file.
      (err, bytes, buffer) => { // callback
        if (err) {
          console.error(err);
        }
        if (bytes) {
          console.log(buffer.slice(0, bytes).toString());
        }
        fs.close(fd, (err) => {
          if (err) console.error(err);
        });
      });
    // Or you can call fs.read(fd[, options], callback), which provides a default buffer:
    // fs.read(fd, (err, bytes, buffer) => { // Uses default options (buffer: Buffer.alloc(16384))
    //   if (err) {
    //     console.error(err);
    //   }
    //   if (bytes) {
    //     console.log(buffer.slice(0, bytes).toString());
    //   }
    //   fs.close(fd, (err) => {
    //     if (err) console.error(err);
    //   });
    // });
  });
}
read('./words.txt');
Output:
spring 春
summer 夏
autumn 秋
winter 冬
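To "manage reading the full contents of the file itself," as the docs put it, you can loop fs.read() until it reports 0 bytes read. Here is a sketch (my own illustration; the readWholeFile helper and the 64 KiB chunk size are assumptions, not part of the Node.js API):
const fs = require('fs');

// Read the whole file by looping fs.read() until EOF (0 bytes read).
function readWholeFile(file, callback) {
  fs.open(file, 'r', (err, fd) => {
    if (err) return callback(err);
    const chunks = [];
    const buffer = Buffer.alloc(64 * 1024); // 64 KiB per read (an arbitrary choice).
    function readChunk() {
      // position null: continue from the current file position.
      fs.read(fd, buffer, 0, buffer.length, null, (err, bytes) => {
        if (err) {
          fs.close(fd, () => {});
          return callback(err);
        }
        if (bytes === 0) { // EOF reached.
          fs.close(fd, (closeErr) => callback(closeErr, Buffer.concat(chunks)));
          return;
        }
        chunks.push(Buffer.from(buffer.slice(0, bytes))); // Copy out this chunk.
        readChunk();
      });
    }
    readChunk();
  });
}

readWholeFile('./words.txt', (err, data) => {
  if (err) return console.error(err);
  console.log(data.toString());
});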
Read a file line by line (big file)
Method 1 — Use the native readline module
Node.js's native readline module provides an interface for reading data from a Readable stream (such as process.stdin) one line at a time. Because it works on a stream, this method is well suited to reading big files (see the "Performance considerations" section above).
The examples below come from the Node.js readline docs, but are rewritten with require().
const fs = require('fs');
const readline = require('readline');

function readFileLines(file) {
  const rl = readline.createInterface({
    input: fs.createReadStream(file),
    // Setting crlfDelay to Infinity is reasonable if the file has \r\n line terminators.
    // The default value for crlfDelay is 100, which means that if the delay
    // between \r and \n exceeds 100 milliseconds, \r and \n will each be
    // treated as a separate end-of-line input.
    crlfDelay: Infinity, // Treat \r\n as a single line break.
  });
  rl.on('line', (line) => {
    console.log(line);
  });
}

readFileLines('./words.txt');
If you would like to use await rather than the 'line' event:
// Await version; it is a bit slower than the 'line' event version above.
async function readFileLines2(file) {
  const rl = readline.createInterface({
    input: fs.createReadStream(file),
    crlfDelay: Infinity, // Treat \r\n as a single line break.
  });
  for await (const line of rl) {
    // Each line of the file will be successively available here as `line`.
    console.log(`Line from file: ${line}`);
  }
}
Currently, the for await...of loop can be a bit slower. If the async/await flow and speed are both essential, a mixed approach can be applied:
const { once } = require('events');

async function readFileLines3(file) {
  try {
    const rl = readline.createInterface({
      input: fs.createReadStream(file),
      crlfDelay: Infinity,
    });
    rl.on('line', (line) => {
      // Process the line.
    });
    await once(rl, 'close');
    console.log('File processed.');
  } catch (err) {
    console.error(err);
  }
}
Method 2 — Use the third-party open-source line-reader module
line-reader is an open-source package, so you need to install it first (npm install line-reader).
const lreader = require('line-reader');

lreader.eachLine('./words.txt', (line, islast) => {
  // line: the line read.
  // islast: boolean, whether this line is the last line of the file.
  console.log(line);
});
Method you should avoid when reading a huge file's lines
Reading the entire contents of a huge file into memory at once is memory-intensive and can overload your system.
const fs = require('fs');

// Avoid this for huge files: it loads the whole file into memory at once.
fs.readFileSync('big-file.txt', 'utf-8')
  .split(/\r?\n/)
  .forEach((line) => console.log(line));
About fs.access()
fs.access() tests a user's permissions for the file or directory specified by path.
fs.access():
Do not use fs.access() to check for the accessibility of a file before calling fs.open(), fs.readFile(), or fs.writeFile(). Doing so introduces a race condition, since other processes may change the file's state between the two calls. Instead, user code should open/read/write the file directly and handle the error raised if the file is not accessible.
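For example, instead of probing with fs.access() first, attempt the read directly and branch on the error (a minimal sketch of the pattern the docs recommend, reusing words.txt):
const fs = require('fs');

// Recommended: attempt the read directly and handle any error.
fs.readFile('./words.txt', 'utf8', (err, data) => {
  if (err) {
    // err.code is e.g. 'ENOENT' (no such file) or 'EACCES' (permission denied).
    console.error(`Cannot read file: ${err.code}`);
    return;
  }
  console.log(data);
});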