How to use the createReadStream function from fs-extra

Find comprehensive JavaScript fs-extra.createReadStream code examples handpicked from public code repositories.

fs-extra.createReadStream is a function in the fs-extra library that creates a readable stream for reading data from a file. fs-extra re-exports Node's built-in fs.createReadStream, so it behaves exactly like the core fs version.
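
A minimal sketch of the basic call, assuming a hypothetical example.txt in the working directory:

javascript
const fs = require('fs-extra');

// Stream the file in chunks instead of loading it into memory at once
const stream = fs.createReadStream('example.txt'); // hypothetical file

stream.on('data', (chunk) => console.log(`read ${chunk.length} bytes`));
stream.on('end', () => console.log('done'));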

        // Resolve the promise
        return resolve(true);
    });

    // Pipe the "from" stream into the "to" stream
    fs.createReadStream(from).pipe(writeStream);
} catch (e) {
    // Failed
    return reject(e);
}

.map(async (url) => {
  const extension = path.extname(new URL(url).pathname);
  const sourceFile = s3data.find((d) => d.endsWith(extension));
  const sourceChecksum = await generateChecksumFromStream(
    'cksum',
    fs.createReadStream(require.resolve(sourceFile))
  );
  const file = files.find((f) => f.fileName.endsWith(extension));

  const filepath = `/${file.bucket}/${file.key}`;

How does fs-extra.createReadStream work?

fs-extra.createReadStream creates a readable stream to read data from a file. Here's how it works:

  1. First, you need to have the fs-extra library installed in your project. This can be done with a package manager such as npm or yarn.

  2. Once fs-extra is installed, you can import the module in your code and call fs-extra.createReadStream to create a readable stream.

  3. The fs-extra.createReadStream function takes the path of the file you want to read as its argument. It returns a readable stream that you can use to read the contents of the file.

  4. When you create a readable stream with fs-extra.createReadStream, you can optionally control the size of the chunks read from the file at a time via the highWaterMark option. This is useful if you want to process the data in chunks instead of all at once.

  5. Once you have a readable stream, you can read data from it using the read method, which pulls a specified number of bytes from the stream's internal buffer, or the pipe method, which forwards the data to a writable stream, optionally through a transform stream (see the sketch after this list).

  6. When you're finished with the stream, make sure its file descriptor is released. By default the stream closes itself once it ends (the autoClose option defaults to true); to release it early, call its destroy method.
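
As a sketch of the pipe approach from step 5, assuming a hypothetical source.txt exists and copy.txt is the destination:

javascript
const fs = require('fs-extra');

// Pipe the readable file stream into a writable one (a simple file copy)
const reader = fs.createReadStream('source.txt'); // hypothetical input path
const writer = fs.createWriteStream('copy.txt');  // hypothetical output path

reader.pipe(writer);

// 'finish' fires once the destination has flushed all piped data
writer.on('finish', () => console.log('Copy complete'));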

Here is an example of using fs-extra.createReadStream to read the contents of a file:

javascript
const fs = require('fs-extra');

// Create a readable stream to read data from a file
const stream = fs.createReadStream('/path/to/file.txt', { encoding: 'utf8' });

// Read data from the stream and log it to the console
stream.on('data', (chunk) => {
  console.log(chunk);
});

// Log a message once the stream has closed
stream.on('close', () => {
  console.log('Stream closed');
});

In this example, we create a readable stream using fs-extra.createReadStream and pass it the path to a file we want to read. We also specify that we want to read the file as a UTF-8 encoded string by passing { encoding: 'utf8' } as the second argument.

We then log the data we read from the stream to the console using the 'data' event, which is emitted each time the stream has data available. We also log a message to the console when the stream is closed using the 'close' event.

Note that in a real-world application, you would typically use fs-extra.createReadStream for files that are too large to load into memory at once, and you would handle errors and edge cases while reading data from the stream.
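
For instance, here is a minimal error-handling sketch built on Node's stream.pipeline helper; input.txt and output.txt are hypothetical paths:

javascript
const fs = require('fs-extra');
const { pipeline } = require('stream');

// pipeline wires the streams together and reports an error
// from either side through a single callback
pipeline(
  fs.createReadStream('input.txt'),   // hypothetical source
  fs.createWriteStream('output.txt'), // hypothetical destination
  (err) => {
    if (err) {
      console.error('Copy failed:', err);
    } else {
      console.log('Copy finished');
    }
  }
);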

  const directory = await unzipper.Open.file(req.files.actions[0].path)
  if (directory.files.length !== 1) return res.status(400).send('only accept zip archive with a single file inside')
  actionsMime = mime.lookup(directory.files[0].path)
  inputStream = directory.files[0].stream()
} else {
  inputStream = fs.createReadStream(req.files.actions[0].path)
  // handle .csv.gz file or other .gz files
  if (req.files.actions[0].originalname.endsWith('.gz')) {
    actionsMime = mime.lookup(req.files.actions[0].originalname.slice(0, -3))
    if (actionsMime) actionsMime += '+gzip'

- fs.appendFile(): appends data to a file; creates the file if it does not exist
- fs.chmod(): changes the permissions of the file specified by the passed filename. Related reading: fs.lchmod(), fs.fchmod()
- fs.chown(): changes the owner and group of the file specified by the passed filename. Related reading: fs.fchown(), fs.lchown()
- fs.close(): closes a file descriptor
- fs.copyFile(): copies a file
- fs.createReadStream(): creates a readable file stream
- fs.createWriteStream(): creates a writable file stream
- fs.link(): creates a new hard link to a file
- fs.mkdir(): creates a new folder
- fs.mkdtemp(): creates a temporary directory

AI Example

const fs = require("fs-extra");

// Create a readable stream to read data from a file
const stream = fs.createReadStream("/path/to/file.txt", {
  highWaterMark: 1024,
});

// Read data from the stream and log it to the console
stream.on("data", (chunk) => {
  console.log(chunk);
});

// Log a message once the stream has closed
stream.on("close", () => {
  console.log("Stream closed");
});

In this example, we create a readable stream using fs-extra.createReadStream and pass it the path to a file we want to read. We also specify that we want to read the file in chunks of 1024 bytes by passing { highWaterMark: 1024 } as the second argument.

We then log the data we read from the stream to the console using the 'data' event, which is emitted each time the stream has data available; because the stream reads the file in chunks, the 'data' event fires multiple times. We also log a message to the console when the stream is closed using the 'close' event.

Note that in this example, we're just logging the data we read from the stream. In a real-world application, you would typically use the data for some purpose, such as processing it or writing it to a file.
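
As a sketch of actually processing the chunks rather than just logging them, the following counts bytes and newlines while streaming; data.csv is a hypothetical input file:

javascript
const fs = require('fs-extra');

let bytes = 0;
let lines = 0;

const stream = fs.createReadStream('data.csv'); // hypothetical input

// With no encoding option, each chunk is a Buffer of raw bytes
stream.on('data', (chunk) => {
  bytes += chunk.length;
  for (const byte of chunk) {
    if (byte === 10) lines++; // count '\n' bytes
  }
});

stream.on('end', () => {
  console.log(`${bytes} bytes, ${lines} lines`);
});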


const extractZipNodeAsync = (zipFile, destPath, descript="") => {
  log(`Extract ${descript}`)
  return new Promise((resolve) => {
    fs.ensureDirSync(path.dirname(destPath))
    fs.createReadStream(zipFile)
      .pipe(unzip.Extract({ path: destPath }))
      .on('close', () => {
        log(`Extracting ${descript} finished`)
        return resolve()

    backdrop: "rgba(0,0,0, 0.4)",
  });

  ipcRenderer.send("track-event", "Error", `Download Template - ${templateItem}`);
} else {
  fs.createReadStream(templatePath).pipe(fs.createWriteStream(destinationPath));
  var emessage = `Successfully saved '${templateItem}' to ${destinationFolder}`;
  Swal.fire({
    icon: "success",
    title: "Download successful",

    const cacheFile = path.join(CACHE_PATH,`${cacheId}`);
    const {size} = fs.statSync(cacheFile);
    res.header('Content-Length', size);
    console.log(`File size: ${size}`);


    fs.createReadStream(cacheFile).pipe(res);
});


// Purge cache storage and DB
server.post('/_apis/artifactcache/clean', (req, res) => {

 * @returns {Promise<void>} The promise which will go fulfilled after done.
 * @private
 */
function copyFileContent(source, output, transforms) {
    return new Promise((resolve, reject) => {
        const reader = fs.createReadStream(source)
        const writer = fs.createWriteStream(output)
        const streams = [reader]


        /**


this.emit('start')

fs.ensureDirSync( path.dirname(this.output) )

const rs = fs.createReadStream(this.input)
const ws = fs.createWriteStream(this.output)

rs.on('data', ( chunk ) => {
  chunk = Buffer.isBuffer(chunk) ? chunk.toString('utf8') : chunk

for (let filename of filenames) {
    let bufs = []; // stores the iOS manifest file in plist format, extracted from the ipa

    // use promise to be able to wait for end of data operation
    await new Promise((resolve, reject) => {
        fs.createReadStream(filename)
        .pipe(unzip.ParseOne(/^Payload\/[^\/]+.app\/Info.plist$/))

        .on('data', (chunk) => {
            bufs.push(chunk);

    filter.extname.indexOf(path.extname(_src)) != -1
  ) {
    return
  }
  // Transfer the stream through a pipe
  fs.createReadStream(_src).pipe(fs.createWriteStream(_dst))
} else if (st.isDirectory()) {
  // If it is a directory, recursively call this function
  self.copyDir(_src, _dst, null, filter)
}

    this._extractJdkZip(a.to, dataDir, self)

} else {
    // Tar.gz
    let h = null
    fs.createReadStream(a.to)
        .on('error', err => AssetGuard.logger.error(err))
        .pipe(zlib.createGunzip())
        .on('error', err => AssetGuard.logger.error(err))
        .pipe(tar.extract(dataDir, {

const ACL = this.parseOptional(options.acl, 'string', undefined);
const s3 = this.getS3Provider(provider);

if (!s3) throw new Error(`S3 provider "${provider}" doesn't exist.`);

let Body = fs.createReadStream(path);

const result = await s3.putObject({ Bucket, ACL, Key, ContentType, ContentDisposition, Body });

try {

    }
  })
}

function copyFile (file, target) {
  var readStream = fs.createReadStream(file.name)
  var writeStream = fs.createWriteStream(target, { mode: file.mode })

  readStream.on('error', onError)
  writeStream.on('error', onError)

}

return S3.getInstance(this.context)
  .then((s3) => {
    const s3Params = {
      Body: fs.createReadStream(filePath),
      Key: cfnFile,
    };
    logger('updateResourceStack.s3.uploadFile', [{ Key: s3Params.Key }])();
    return s3.uploadFile(s3Params, false);

    }


    // name the output file
    res.attachment(req.params.importID + ".zip");


    fs.createReadStream(resultZip).pipe(res);
  } catch (e) {
    return next(new Error("Failed to download import"));
  }
});

    encoding: metadata.encoding,
    extension: metadata.extension,
});
const gfs = Grid(dbfs.db);

const inputStream = fs.createReadStream(filePath);
const outputStream = gfs.createWriteStream({
    filename: fsName,
});
inputStream.pipe(outputStream);

//                         const name = combined.substring(combined.lastIndexOf('/')+1)
//                         const fDir = path.join(dataDir, name)
//                         const jre = new Asset(name, null, parseInt(resp.headers['content-length']), opts, fDir)
//                         this.java = new DLTracker([jre], jre.size, (a, self) => {
//                             let h = null
//                             fs.createReadStream(a.to)
//                                 .on('error', err => console.log(err))
//                                 .pipe(zlib.createGunzip())
//                                 .on('error', err => console.log(err))
//                                 .pipe(tar.extract(dataDir, {

.then(obj => {
  let stream;
  if (/^http/.test(installPackage)) {
    stream = hyperquest(installPackage);
  } else {
    stream = fs.createReadStream(installPackage);
  }
  return extractStream(stream, obj.tmpdir).then(() => obj);
})
.then(obj => {

});

// Load the EEG data from the CSV file
let eegData = [];
let headers = [];
fs.createReadStream("./data.csv")
  .pipe(csv1())
  .on("headers", (row) => {
    headers = row;
  })

fs-extra.readFileSync is the most popular function in fs-extra (9724 examples)