Using async/await with a forEach loop

#11
Bergi's solution works nicely when `fs` is promise based.
You can use `bluebird`, `fs-extra` or `fs-promise` for this.

**However, a solution for** Node's native `fs` library is as follows:


```
const result = await Promise.all(filePaths.map(async filePath => {
  const fileContents = await getAssetFromCache(filePath, async function() {

    // 1. Wrap the callback API in a Promise
    // 2. Return the result of the Promise
    return await new Promise((res, rej) => {
      fs.readFile(filePath, 'utf8', function(err, data) {
        if (err) {
          rej(err); // reject, so a failed read is not silently swallowed
        } else {
          res(data);
        }
      });
    });
  });

  return fileContents;
}));
```

**Note:**
Node's native `fs.readFile` requires a callback function as its last argument; without one, it throws:

```
TypeError [ERR_INVALID_CALLBACK]: Callback must be a function
```
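As an aside, newer Node versions ship a promise-based API in core, which removes the need for the manual wrapper above. A minimal sketch, assuming Node 10+ where `fs.promises` is available:

```
// Minimal sketch assuming Node 10+, where fs.promises is in core.
const fsp = require('fs').promises;

async function readAll(filePaths) {
  // fsp.readFile already returns a Promise, so no manual new Promise(...) is needed.
  return Promise.all(filePaths.map(filePath => fsp.readFile(filePath, 'utf8')));
}
```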




#12
In addition to @Bergi’s answer, I’d like to offer a third alternative. It's very similar to @Bergi’s 2nd example, but instead of awaiting each `readFile` individually, you create an array of promises, each of which you await at the end.

```js
import fs from 'fs-promise';

async function printFiles () {
  const files = await getFilePaths();

  const promises = files.map((file) => fs.readFile(file, 'utf8'));

  const contents = await Promise.all(promises);

  contents.forEach(console.log);
}
```

Note that the function passed to `.map()` does not need to be `async`, since `fs.readFile` returns a Promise object anyway. Therefore `promises` is an array of Promise objects, which can be sent to `Promise.all()`.

In @Bergi’s answer, the console may log file contents in the order they’re read. For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes *after* the large file in the `files` array. However, in my method above, you are guaranteed the console will log the files in the same order as the provided array.
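A small illustration of that ordering guarantee, using `setTimeout` to fake uneven read times (the `delay` helper is made up for the demo):

```js
// Promise.all preserves input order even when the promises settle out of order.
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms));

(async () => {
  const results = await Promise.all([
    delay(300, 'large file'), // settles last
    delay(100, 'small file'), // settles first
  ]);
  console.log(results); // ['large file', 'small file'] – input order kept
})();
```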

#13
To see how that can go wrong, add a `console.log` at the end of the method.

Things that can go wrong in general:

* Arbitrary order.
* printFiles can finish running before printing files.
* Poor performance.

These are not always wrong but frequently are in standard use cases.

Generally, using forEach will result in all but the last. It calls each function without awaiting it, meaning it tells all of the functions to start and then returns without waiting for the functions to finish. A sketch of that broken pattern follows below.
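Here is roughly what that looks like, assuming the question's `getFilePaths` helper and a promise-based `fs`:

```js
// Broken: forEach fires the async callbacks and ignores the returned
// promises, so printFiles resolves before any file is logged.
import fs from 'fs-promise'

async function printFilesBroken () {
  const files = await getFilePaths()

  files.forEach(async file => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
  // nothing awaited here – the function returns immediately
}
```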

```js
import fs from 'fs-promise'

async function printFiles () {
  const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'))

  for (const file of files)
    console.log(await file)
}

printFiles()
```

This is an example in native JS that will preserve order, prevent the function from returning prematurely and in theory retain optimal performance.

This will:

* Initiate all of the file reads to happen in parallel.
* Preserve the order via the use of map to map file names to promises to wait for.
* Wait for each promise in the order defined by the array.

With this solution the first file will be shown as soon as it is available without having to wait for the others to be available first.

It will also be loading all files at the same time rather than having to wait for the first to finish before the second file read can be started.

The only drawback of this and the original version is that if multiple reads are started at once, error handling becomes more difficult, because more errors can happen at a time.

Versions that read one file at a time will stop on a failure without wasting time trying to read any more files. Even with an elaborate cancellation system it can be hard to avoid failing on the first file while having already read most of the other files as well.

Performance is not always predictable. While many systems will be faster with parallel file reads, some will prefer sequential. Some are dynamic and may shift under load; optimisations that reduce latency do not always yield good throughput under heavy contention.

There is also no error handling in that example. If something requires the files to either all be shown successfully or not at all, it won't do that.
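One way to surface every failure rather than stopping at the first is `Promise.allSettled` (Node 12.9+); a sketch, reusing the same `getFilePaths` and promise-based `fs` assumptions:

```js
// Sketch: collect every outcome instead of failing fast.
// Promise.allSettled never rejects; it reports each result individually.
async function printFilesSettled () {
  const files = await getFilePaths()
  const results = await Promise.allSettled(
    files.map(file => fs.readFile(file, 'utf8'))
  )

  results.forEach((result, i) => {
    if (result.status === 'fulfilled') console.log(result.value)
    else console.error(`Failed to read ${files[i]}:`, result.reason)
  })
}
```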

In-depth experimentation is recommended, with `console.log` at each stage and fake file-read solutions (random delay instead). Although many solutions appear to do the same thing in simple cases, all have subtle differences that take some extra scrutiny to squeeze out.

Use this mock to help tell the difference between solutions:

```js
(async () => {
  const start = +new Date();
  const mock = () => {
    return {
      fs: {readFile: file => new Promise((resolve, reject) => {
        // Instead of this just make three files and try each timing arrangement.
        // IE, all same, [100, 200, 300], [300, 200, 100], [100, 300, 200], etc.
        const time = Math.round(100 + Math.random() * 4900);
        console.log(`Read of ${file} started at ${new Date() - start} and will take ${time}ms.`);
        setTimeout(() => {
          // Bonus material here if random reject instead.
          console.log(`Read of ${file} finished, resolving promise at ${new Date() - start}.`);
          resolve(file);
        }, time);
      })},
      console: {log: file => console.log(`Console Log of ${file} finished at ${new Date() - start}.`)},
      getFilePaths: () => ['A', 'B', 'C', 'D', 'E']
    };
  };

  const printFiles = (({fs, console, getFilePaths}) => {
    return async function() {
      const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'));

      for (const file of files)
        console.log(await file);
    };
  })(mock());

  console.log(`Running at ${new Date() - start}`);
  await printFiles();
  console.log(`Finished running at ${new Date() - start}`);
})();
```

#14
The `p-iteration` module on npm implements the Array iteration methods so they can be used in a very straightforward way with async/await.

An example with your case:

```js
const { forEach } = require('p-iteration');
const fs = require('fs-promise');

(async function printFiles () {
  const files = await getFilePaths();

  await forEach(files, async (file) => {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  });
})();
```
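If memory serves, the module also exports series variants such as `forEachSeries` for strictly sequential iteration, though that's worth double-checking against the module's own docs. A sketch:

```js
// Hedged sketch: forEachSeries (if available in p-iteration) awaits
// each callback before starting the next, preserving strict order.
const { forEachSeries } = require('p-iteration');
const fs = require('fs-promise');

(async function printFilesInOrder () {
  const files = await getFilePaths();

  await forEachSeries(files, async (file) => {
    console.log(await fs.readFile(file, 'utf8'));
  });
})();
```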




#15
Today I came across multiple solutions for running async/await functions in a forEach loop. By building a wrapper around it, we can make this happen.

The native forEach does not await the async callback it invokes, which is why `await` inside it does not behave as expected; the methods below work around that in different ways.

There are multiple ways it can be done; they are as follows.

Method 1 : Using the wrapper.

```js
await (() => {
  return new Promise((resolve, reject) => {
    // Guard: with an empty array the forEach callback never runs,
    // so resolve immediately instead of hanging forever.
    if (items.length === 0) return resolve('Done');

    items.forEach(async (item, index) => {
      try {
        await someAPICall();
      } catch (e) {
        console.log(e);
      }
      count++;
      // Note: this resolves when the last-indexed callback finishes,
      // which may happen before earlier, slower callbacks complete.
      if (index === items.length - 1) {
        resolve('Done');
      }
    });
  });
})();
```

Method 2: Using the same as a generic function of Array.prototype

Array.prototype.forEachAsync.js

```js
if (!Array.prototype.forEachAsync) {
  Array.prototype.forEachAsync = function (fn) {
    return new Promise((resolve, reject) => {
      // Resolve immediately for an empty array, since the
      // forEach callback below would otherwise never run.
      if (this.length === 0) return resolve('done');

      this.forEach(async (item, index, array) => {
        await fn(item, index, array);
        if (index === array.length - 1) {
          resolve('done');
        }
      });
    });
  };
}
```

Usage :

```js
require('./Array.prototype.forEachAsync');

let count = 0;

let hello = async (items) => {

  // Method 2 - Using the Array.prototype.forEachAsync helper

  await items.forEachAsync(async () => {
    try {
      await someAPICall();
    } catch (e) {
      console.log(e);
    }
    count++;
  });

  console.log("count = " + count);
};

const someAPICall = () => {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      resolve("done"); // or reject('error')
    }, 100);
  });
};

hello(['', '', '', '']); // hello([]) – an empty array is also handled by default
```

Method 3 : Using Promise.all

```js
await Promise.all(items.map(async (item) => {
  await someAPICall();
  count++;
}));

console.log("count = " + count);
```

Method 4 : Traditional for loop or modern for...of loop

```js
// Method 4 - using a for loop directly

// 1. Using the modern for...of loop
// (for...in iterates keys rather than values, so for...of is the right choice here)
for (const item of items) {
  await someAPICall();
  count++;
}

// 2. Using the traditional for loop
for (let i = 0; i < items.length; i++) {
  await someAPICall();
  count++;
}

console.log("count = " + count);
```





#16
Just adding to the original answer:

- The parallel-reading syntax in the original answer can be confusing and difficult to read, so here it is written in a different way

```
async function printFiles() {
  const files = await getFilePaths();
  const fileReadPromises = [];

  const readAndLogFile = async filePath => {
    const contents = await fs.readFile(filePath, "utf8");
    console.log(contents);
    return contents;
  };

  files.forEach(file => {
    fileReadPromises.push(readAndLogFile(file));
  });

  await Promise.all(fileReadPromises);
}
```

- For sequential operation, not just **for...of**; a normal for loop will also work

```
async function printFiles() {
  const files = await getFilePaths();

  for (let i = 0; i < files.length; i++) {
    const file = files[i];
    const contents = await fs.readFile(file, "utf8");
    console.log(contents);
  }
}
```

#17
Sure the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, but the `printFiles` function returns immediately after that.

### Reading in sequence

If you want to read the files in sequence, **you cannot use `forEach`** indeed. Just use a modern `for … of` loop instead, in which `await` will work as expected:

```js
async function printFiles () {
  const files = await getFilePaths();

  for (const file of files) {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  }
}
```

### Reading in parallel

If you want to read the files in parallel, **you cannot use `forEach`** indeed. Each of the `async` callback function calls returns a promise, but you're throwing them away instead of awaiting them. Just use `map` instead, and you can await the array of promises that you'll get with `Promise.all`:

```js
async function printFiles () {
  const files = await getFilePaths();

  await Promise.all(files.map(async (file) => {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  }));
}
```

#18
Like @Bergi's response, but with one difference.

`Promise.all` rejects as a whole as soon as one of its promises rejects.

So, use recursion.

```js
const readFilesQueue = async (files, index = 0) => {
    // Stop recursing once every file has been read.
    if (index >= files.length) return files

    const contents = await fs.readFile(files[index], 'utf8')
    console.log(contents)

    return readFilesQueue(files, index + 1)
}

const printFiles = async () => {
    const files = await getFilePaths();
    const printContents = await readFilesQueue(files)

    return printContents
}

printFiles()
```

**PS**

`readFilesQueue` is outside of `printFiles` because of the side effect* introduced by `console.log`; it's better to mock, test, and/or spy, so it's not cool to have a function that returns the content (sidenote).

Therefore, the code can simply be designed that way: three separate functions that are "pure"** and introduce no side effects, process the entire list, and can easily be modified to handle failed cases.

```js
const files = await getFilesPath()

const printFile = async (file) => {
    const content = await fs.readFile(file, 'utf8')
    console.log(content)
}

const readFiles = async (files, index = 0) => {
    // Stop recursing once every file has been printed.
    if (index >= files.length) return files

    await printFile(files[index])

    return readFiles(files, index + 1)
}

readFiles(files)
```

**Future edit/current state**

Node supports top-level await (it doesn't need a plugin and can be enabled via harmony flags); it's cool but doesn't solve one problem (strategically I work only on LTS versions). How to get the files?

Using composition. Given the code, I get the sensation that this is inside a module, so it should have a function to do it. If not, you should use an IIFE to wrap the code into an async function, creating a simple module that does it all for you; or you can go the right way: composition.

```js
// more complex version with an IIFE to make a single module
(async (files) => readFiles(await files()))(getFilesPath)
```

Note that the name of the variable changes due to semantics. You pass a functor (a function that can be invoked by another function) and receive a pointer to memory that contains the initial block of logic of the application.
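For completeness: once top-level await is available (unflagged in ES modules since Node 14.8), the IIFE can go away entirely. A minimal sketch, reusing `getFilesPath` and `readFiles` from above:

```js
// Sketch assuming an ES module on Node 14.8+, where top-level await
// works without flags; getFilesPath/readFiles are defined above.
const files = await getFilesPath()
await readFiles(files)
```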

But what if it's not a module and you need to export the logic?

Wrap the functions in an async function.

```js
export const readFilesQueue = async () => {
    // ... the code goes here
}
```

Or change the names of variables, whatever...


------------

`*` by side effect I mean any collateral effect of the application that can change the state/behaviour or introduce bugs in the application, like IO.

`**` "pure" is in quotes since the functions are not actually pure; the code can be converged to a pure version when there's no console output, only data manipulations.

Aside from this, to be pure you'll need to work with monads that handle the side effects that are error prone, and that treat errors separately from the application.

#19
You can use `Array.prototype.forEach`, but it is not very compatible with async/await. This is because the promise returned from an async callback expects to be handled, but `Array.prototype.forEach` does not do anything with the promises returned by its callback. So you can use forEach, but you'll have to handle the promise resolution yourself.

Here is a way to read and print each file in series using `Array.prototype.forEach`
```javascript
async function printFilesInSeries () {
  const files = await getFilePaths()

  let promiseChain = Promise.resolve()
  files.forEach((file) => {
    promiseChain = promiseChain.then(() => {
      // The read promise must be returned; otherwise the chain
      // would not actually wait for each file before the next.
      return fs.readFile(file, 'utf8').then((contents) => {
        console.log(contents)
      })
    })
  })
  await promiseChain
}
```
Here is a way (still using `Array.prototype.forEach`) to print the contents of files in parallel
```javascript
async function printFilesInParallel () {
  const files = await getFilePaths()

  const promises = []
  files.forEach((file) => {
    promises.push(
      fs.readFile(file, 'utf8').then((contents) => {
        console.log(contents)
      })
    )
  })
  await Promise.all(promises)
}
```

#20
If you'd like to iterate over all elements concurrently:

```js
async function asyncForEach(arr, fn) {
  await Promise.all(arr.map(fn));
}
```

If you'd like to iterate over all elements non-concurrently (e.g. when your mapping function has side effects, or when running the mapper over all array elements at once would be too resource-costly):

Option A: Promises

```js
function asyncForEachStrict(arr, fn) {
  return new Promise((resolve) => {
    arr.reduce(
      (promise, cur, idx) => promise
        .then(() => fn(cur, idx, arr)),
      Promise.resolve(),
    ).then(() => resolve());
  });
}
```

Option B: async/await

```js
async function asyncForEachStrict(arr, fn) {
  for (let idx = 0; idx < arr.length; idx += 1) {
    const cur = arr[idx];

    await fn(cur, idx, arr);
  }
}
```
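Either strict variant can be called the same way; a usage sketch (the `delay` helper is made up for illustration):

```js
// Both implementations await each element in turn.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

asyncForEachStrict([1, 2, 3], async (n) => {
  await delay(100);
  console.log(n); // logs 1, 2, 3 in order, ~100 ms apart
}).then(() => console.log('done'));
```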


