How to use the createReadStream function from fs-extra
Find comprehensive JavaScript fs-extra.createReadStream code examples handpicked from public code repositories.
fs-extra.createReadStream is a function in the fs-extra library that creates a readable stream for reading data from a file; fs-extra re-exports it unchanged from Node's built-in fs module.
    // Resolve the promise
    return resolve(true);
  });
  // Pipe the "from" stream into the "to" stream
  fs.createReadStream(from).pipe(writeStream);
} catch (e) {
  // Failed
  return reject(e);
}
.map(async (url) => {
  const extension = path.extname(new URL(url).pathname);
  const sourceFile = s3data.find((d) => d.endsWith(extension));
  const sourceChecksum = await generateChecksumFromStream(
    'cksum',
    fs.createReadStream(require.resolve(sourceFile))
  );
  const file = files.find((f) => f.fileName.endsWith(extension));
  const filepath = `/${file.bucket}/${file.key}`;
+ 14 other calls in file
How does fs-extra.createReadStream work?
fs-extra.createReadStream is a function in the fs-extra library that creates a readable stream for reading data from a file. Here's how it works:

First, you need to have the fs-extra library installed in your project. This can be done with a package manager such as npm (npm install fs-extra) or yarn.

Once fs-extra is installed, you can import the fs-extra module in your code and call the fs-extra.createReadStream function to create a readable stream.

The fs-extra.createReadStream function takes the path of the file you want to read as its first argument. It returns a readable stream that you can use to read the contents of the file.

When you create a readable stream with fs-extra.createReadStream, you can optionally specify the size of the chunks that will be read from the file at a time (via the highWaterMark option). This can be useful if you want to process the data in chunks instead of all at once.

Once you have a readable stream, you can consume data from it with the read method, which pulls up to a specified number of bytes from the stream's internal buffer, or with the pipe method, which forwards the data to a writable stream (optionally through a transform stream), as sketched in the copy example below.

When you're finished with the stream, it should be closed so that the underlying file descriptor is released. File read streams do this automatically once they are fully consumed (the autoClose option defaults to true); to abandon a stream early, call its destroy method.
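As a minimal sketch of the pipe approach, copying one file into another could look like this (both file paths are placeholders):

const fs = require('fs-extra');

// Both paths are placeholders
const reader = fs.createReadStream('/path/to/source.txt');
const writer = fs.createWriteStream('/path/to/copy.txt');

// pipe() forwards each chunk to the writable stream and
// handles backpressure automatically
reader.pipe(writer);

writer.on('finish', () => {
  console.log('Copy complete');
});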
Here is an example of using fs-extra.createReadStream to read the contents of a file:

const fs = require('fs-extra');
// Create a readable stream to read data from a file
const stream = fs.createReadStream('/path/to/file.txt', { encoding: 'utf8' });
// Read data from the stream and log it to the console
stream.on('data', (chunk) => {
console.log(chunk);
});
// Close the stream when we're finished reading data
stream.on('close', () => {
console.log('Stream closed');
});
In this example, we create a readable stream using fs-extra.createReadStream and pass it the path of the file we want to read. We also specify that we want to read the file as a UTF-8 encoded string by passing { encoding: 'utf8' } as the second argument.
We then log the data we read from the stream to the console using the 'data' event, which is emitted each time the stream has data available. We also log a message to the console when the stream is closed, using the 'close' event.
Note that in a real-world application, you would typically use fs-extra.createReadStream to read large files, or files that are too big to fit in memory, and you would handle errors and edge cases when reading data from the stream.
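For instance, a minimal sketch of stream-level error handling might look like this (the file path is a placeholder):

const fs = require('fs-extra');

const stream = fs.createReadStream('/path/to/large-file.log', { encoding: 'utf8' });

stream.on('data', (chunk) => {
  // Handle each chunk as it arrives instead of buffering the whole file
  console.log(`received ${chunk.length} characters`);
});

stream.on('error', (err) => {
  // Emitted if the file cannot be opened or a read fails (e.g. ENOENT)
  console.error('Read failed:', err.message);
});

stream.on('end', () => {
  console.log('All data consumed');
});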
  const directory = await unzipper.Open.file(req.files.actions[0].path)
  if (directory.files.length !== 1) return res.status(400).send('only accept zip archive with a single file inside')
  actionsMime = mime.lookup(directory.files[0].path)
  inputStream = directory.files[0].stream()
} else {
  inputStream = fs.createReadStream(req.files.actions[0].path)
  // handle .csv.gz file or other .gz files
  if (req.files.actions[0].originalname.endsWith('.gz')) {
    actionsMime = mime.lookup(req.files.actions[0].originalname.slice(0, -3))
    if (actionsMime) actionsMime += '+gzip'
+ 4 other calls in file
GitHub: 5102a/My_Growth
- fs.appendFile(): appends data to a file; creates the file if it does not exist
- fs.chmod(): changes the permissions of the file specified by the given filename. Related: fs.lchmod(), fs.fchmod()
- fs.chown(): changes the owner and group of the file specified by the given filename. Related: fs.fchown(), fs.lchown()
- fs.close(): closes a file descriptor
- fs.copyFile(): copies a file
- fs.createReadStream(): creates a readable file stream
- fs.createWriteStream(): creates a writable file stream
- fs.link(): creates a new hard link to a file
- fs.mkdir(): creates a new directory
- fs.mkdtemp(): creates a temporary directory
+ 41 other calls in file
AI Example
const fs = require("fs-extra");

// Create a readable stream to read data from a file
const stream = fs.createReadStream("/path/to/file.txt", {
  highWaterMark: 1024,
});

// Read data from the stream and log it to the console
stream.on("data", (chunk) => {
  console.log(chunk);
});

// Close the stream when we're finished reading data
stream.on("close", () => {
  console.log("Stream closed");
});
In this example, we create a readable stream using fs-extra.createReadStream and pass it the path of the file we want to read. We also specify that we want to read the file in chunks of 1024 bytes by passing { highWaterMark: 1024 } as the second argument.

We then log the data we read from the stream to the console using the 'data' event, which is emitted each time the stream has data available; it fires multiple times as the stream reads the file in chunks. We also log a message to the console when the stream is closed, using the 'close' event.

Note that in this example we're just logging the data we read from the stream to the console. In a real-world application, you would typically use the data for some purpose, such as processing it or writing it to a file.
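One way to process data on your own schedule is to pull it manually with the read method instead of listening for 'data' events. A rough sketch, again with a placeholder path:

const fs = require('fs-extra');

const stream = fs.createReadStream('/path/to/file.txt');

stream.on('readable', () => {
  let chunk;
  // read(64) pulls up to 64 bytes from the internal buffer;
  // it returns null once the buffer is drained
  while ((chunk = stream.read(64)) !== null) {
    console.log(`read ${chunk.length} bytes`);
  }
});

stream.on('end', () => {
  console.log('No more data');
});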
GitHub: gpf0710/poi
const extractZipNodeAsync = (zipFile, destPath, descript = "") => {
  log(`Extract ${descript}`)
  return new Promise((resolve) => {
    fs.ensureDirSync(path.dirname(destPath))
    fs.createReadStream(zipFile)
      .pipe(unzip.Extract({ path: destPath }))
      .on('close', () => {
        log(`Extracting ${descript} finished`)
        return resolve()
+ 3 other calls in file
    backdrop: "rgba(0,0,0, 0.4)",
  });
  ipcRenderer.send("track-event", "Error", `Download Template - ${templateItem}`);
} else {
  fs.createReadStream(templatePath).pipe(fs.createWriteStream(destinationPath));
  var emessage = `Successfully saved '${templateItem}' to ${destinationFolder}`;
  Swal.fire({
    icon: "success",
    title: "Download successful",
+ 9 other calls in file
  const cacheFile = path.join(CACHE_PATH, `${cacheId}`);
  const { size } = fs.statSync(cacheFile);
  res.header('Content-Length', size);
  console.log(`File size: ${size}`);
  fs.createReadStream(cacheFile).pipe(res);
});

// Purge cache storage and DB
server.post('/_apis/artifactcache/clean', (req, res) => {
GitHub: bcomnes/cpx2
 * @returns {Promise<void>} The promise which will go fulfilled after done.
 * @private
 */
function copyFileContent(source, output, transforms) {
  return new Promise((resolve, reject) => {
    const reader = fs.createReadStream(source)
    const writer = fs.createWriteStream(output)
    const streams = [reader]
    /**
+ 8 other calls in file
GitHub: wk-js/starter-nanogl
this.emit('start')

fs.ensureDirSync( path.dirname(this.output) )

const rs = fs.createReadStream(this.input)
const ws = fs.createWriteStream(this.output)

rs.on('data', ( chunk ) => {
  chunk = Buffer.isBuffer(chunk) ? chunk.toString('utf8') : chunk
for (let filename of filenames) {
  let bufs = [];
  // stores the iOS manifest file in plist format, extracted from the ipa
  // use promise to be able to wait for end of data operation
  await new Promise((resolve, reject) => {
    fs.createReadStream(filename)
      .pipe(unzip.ParseOne(/^Payload\/[^\/]+.app\/Info.plist$/))
      .on('data', (chunk) => {
        bufs.push(chunk);
+ 9 other calls in file
GitHub: stevekeol/DApp-Hangzhou
    filter.extname.indexOf(path.extname(_src)) != -1
  ) {
    return
  }
  // Transfer the stream through a pipe
  fs.createReadStream(_src).pipe(fs.createWriteStream(_dst))
} else if (st.isDirectory()) {
  // If it is a directory, recursively call itself
  self.copyDir(_src, _dst, null, filter)
}
  this._extractJdkZip(a.to, dataDir, self)
} else {
  // Tar.gz
  let h = null
  fs.createReadStream(a.to)
    .on('error', err => AssetGuard.logger.error(err))
    .pipe(zlib.createGunzip())
    .on('error', err => AssetGuard.logger.error(err))
    .pipe(tar.extract(dataDir, {
+ 6 other calls in file
GitHub: impress-dev/HelloWorld
const ACL = this.parseOptional(options.acl, 'string', undefined);
const s3 = this.getS3Provider(provider);
if (!s3) throw new Error(`S3 provider "${provider}" doesn't exist.`);
let Body = fs.createReadStream(path);
const result = await s3.putObject({ Bucket, ACL, Key, ContentType, ContentDisposition, Body });
try {
    }
  })
}

function copyFile (file, target) {
  var readStream = fs.createReadStream(file.name)
  var writeStream = fs.createWriteStream(target, { mode: file.mode })

  readStream.on('error', onError)
  writeStream.on('error', onError)
+ 17 other calls in file
}
return S3.getInstance(this.context)
  .then((s3) => {
    const s3Params = {
      Body: fs.createReadStream(filePath),
      Key: cfnFile,
    };
    logger('updateResourceStack.s3.uploadFile', [{ Key: s3Params.cfnFile }])();
    return s3.uploadFile(s3Params, false);
+ 3 other calls in file
GitHub: davidmerfield/Blot
  }
  // name the output file
  res.attachment(req.params.importID + ".zip");
  fs.createReadStream(resultZip).pipe(res);
} catch (e) {
  return next(new Error("Failed to download import"));
}
});
+ 4 other calls in file
GitHub: Cloud-V/Backend
  encoding: metadata.encoding,
  extension: metadata.extension,
});
const gfs = Grid(dbfs.db);
const inputStream = fs.createReadStream(filePath);
const outputStream = gfs.createWriteStream({
  filename: fsName,
});
inputStream.pipe(outputStream);
+ 29 other calls in file
// const name = combined.substring(combined.lastIndexOf('/')+1)
// const fDir = path.join(dataDir, name)
// const jre = new Asset(name, null, parseInt(resp.headers['content-length']), opts, fDir)
// this.java = new DLTracker([jre], jre.size, (a, self) => {
//   let h = null
//   fs.createReadStream(a.to)
//     .on('error', err => console.log(err))
//     .pipe(zlib.createGunzip())
//     .on('error', err => console.log(err))
//     .pipe(tar.extract(dataDir, {
+ 7 other calls in file
.then(obj => {
  let stream;
  if (/^http/.test(installPackage)) {
    stream = hyperquest(installPackage);
  } else {
    stream = fs.createReadStream(installPackage);
  }
  return extractStream(stream, obj.tmpdir).then(() => obj);
})
.then(obj => {
+ 4 other calls in file
GitHub: LiveBacteria/eegBackend
});

// Load the EEG data from the CSV file
let eegData = [];
let headers = [];
fs.createReadStream("./data.csv")
  .pipe(csv1())
  .on("headers", (row) => {
    headers = row;
  })
+ 3 other calls in file
fs-extra.readFileSync is the most popular function in fs-extra (9724 examples)