To see how that can go wrong, add a console.log at the end of the method.
Things that can go wrong in general:
- Arbitrary order.
- printFiles can finish running before the files are printed.
- Poor performance.
These are not always wrong but frequently are in standard use cases.
Generally, using forEach results in all but the last. It calls each function without awaiting it, meaning it tells all of the functions to start and then finishes without waiting for any of them to finish.
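For illustration, here is a minimal sketch of that anti-pattern, assuming the same fs-promise readFile and the getFilePaths helper used in the rest of this answer:

import fs from 'fs-promise'

async function printFiles () {
  const files = await getFilePaths()
  files.forEach(async file => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
  // Runs before any file is printed: forEach ignores the promises
  // returned by the async callbacks, so nothing is awaited.
  console.log('Done')
}

The version below avoids those problems: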
import fs from 'fs-promise'

async function printFiles () {
  const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'))

  for (const file of files)
    console.log(await file)
}
printFiles()
This is an example in native JS that will preserve order, prevent the function from returning prematurely and in theory retain optimal performance.
This will:
- Initiate all of the file reads in parallel.
- Preserve order by using map to turn the file names into an ordered array of promises to await.
- Wait for each promise in the order defined by the array.
With this solution the first file will be shown as soon as it is available without having to wait for the others to be available first.
It will also load all files at the same time rather than waiting for the first to finish before the second read can start.
The only drawback of this and the original version is that if multiple reads are started at once, error handling is harder because more errors can happen at a time.
Versions that read one file at a time will stop on a failure without wasting time trying to read any more files. Even with an elaborate cancellation system it can be hard to avoid failing on the first file while having already read most of the other files as well.
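A sequential version, for comparison, can be sketched as follows (same assumed helpers); if any read rejects, the loop stops and no further reads are started:

import fs from 'fs-promise'

async function printFilesOneAtATime () {
  for (const file of await getFilePaths()) {
    // Each read starts only after the previous file has been printed,
    // so a rejection here aborts the loop before the next read begins.
    console.log(await fs.readFile(file, 'utf8'))
  }
}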
Performance is not always predictable. While many systems will be faster with parallel file reads, some will prefer sequential reads. Some are dynamic and may shift under load; optimisations that lower latency do not always yield good throughput under heavy contention.
There is also no error handling in that example. If a use case requires the files either all to be shown successfully or not at all, it won't do that.
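If all-or-nothing behaviour is needed, one option (a sketch under the same assumptions as above) is to wait for every read with Promise.all before printing anything; Promise.all rejects as soon as any read fails, so nothing is printed unless every read succeeded:

import fs from 'fs-promise'

async function printFilesAllOrNothing () {
  const paths = await getFilePaths()
  // Promise.all rejects on the first failed read, so the loop below
  // is only reached when every file was read successfully.
  const contents = await Promise.all(paths.map(file => fs.readFile(file, 'utf8')))

  for (const content of contents)
    console.log(content)
}

Note that reads already started still run to completion in the background; Promise.all only stops waiting for them.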
In-depth experimentation is recommended, with console.log at each stage and fake file-read implementations (a random delay instead of a real read). Although many solutions appear to do the same thing in simple cases, they all have subtle differences that take some extra scrutiny to squeeze out.
Use this mock to help tell the difference between solutions:
(async () => {
  const start = +new Date();

  const mock = () => {
    return {
      fs: {readFile: file => new Promise((resolve, reject) => {
        // Instead of this just make three files and try each timing arrangement.
        // IE, all same, [100, 200, 300], [300, 200, 100], [100, 300, 200], etc.
        const time = Math.round(100 + Math.random() * 4900);
        console.log(`Read of ${file} started at ${new Date() - start} and will take ${time}ms.`)
        setTimeout(() => {
          // Bonus material here if random reject instead.
          console.log(`Read of ${file} finished, resolving promise at ${new Date() - start}.`);
          resolve(file);
        }, time);
      })},
      console: {log: file => console.log(`Console Log of ${file} finished at ${new Date() - start}.`)},
      getFilePaths: () => ['A', 'B', 'C', 'D', 'E']
    };
  };

  const printFiles = (({fs, console, getFilePaths}) => {
    return async function() {
      const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'));

      for (const file of files)
        console.log(await file);
    };
  })(mock());

  console.log(`Running at ${new Date() - start}`);
  await printFiles();
  console.log(`Finished running at ${new Date() - start}`);
})();
Created by jgmjgm on 2020-03-07 13:22:17 +0000 UTC