Hello everyone, I am Tiago and I am developing an interactive visualization of DNA sequences using vivagraph.js. Basically I have two lists (one for nodes and another for links) that freeze my browser (Firefox, but Chrome also becomes slow) when I add the nodes and links to the graph. So I have started playing around with concurrency and queues to handle this. My idea was to limit the number of nodes and links being added simultaneously. First I started with something very simple:
const addAllNodes = (array, callback) => {
  // adds nodes using vivagraph functions, then calls callback()
}

const limit = 10
let running = 0

const scheduler = () => {
  while (running < limit && json.nodes.length > 0) {
    const array = json.nodes.shift()
    console.log(array)
    addAllNodes(array, () => {
      running--
      if (json.nodes.length > 0) {
        scheduler()
      }
    })
    running++
  }
}

scheduler()
However, this often results in "too much recursion", which is expected since I am calling the scheduler function inside itself.
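(A minimal sketch, not from this thread: one common way to avoid the unbounded recursion, assuming addAllNodes calls its callback synchronously, is to defer the recursive call so the stack unwinds and the browser can process events between batches.)

const scheduler = () => {
  while (running < limit && json.nodes.length > 0) {
    const array = json.nodes.shift()
    addAllNodes(array, () => {
      running--
      if (json.nodes.length > 0) {
        // defer instead of recursing synchronously, so the call stack
        // unwinds and the browser gets a chance to render
        setTimeout(scheduler, 0)
      }
    })
    running++
  }
}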
Then I found async. I have managed to use async.queue, but it still freezes my browser and performance is pretty much the same as my implementation with no concurrency handling.

var queue = async.queue(addAllNodes, 10)
queue.drain = function () {
  // after adding all nodes, set up another queue for all the links
  var queue2 = async.queue(addAllLinks, 10)
  queue2.drain = function () {
    renderGraph()
  }
  // queue json.links, the links to be added to the graph AFTER the nodes
  queue2.push(json.links)
}
// queue json.nodes, the nodes I want to add to the graph first
queue.push(json.nodes)
Hi @tiagofilipe12. To prevent the browser from freezing, make sure you aren't blocking the event loop. For example, if the vivagraph functions are synchronous, make sure you delay your calls to callback using async.setImmediate or a similar function.
Also, as a side note, for an application like this async.forEachLimit might be a better choice:
async.forEachLimit(json.nodes, 10, (node, callback) => {
  // function for adding a single node
  async.setImmediate(callback);
}, (err) => {
  if (err) {
    // error handling
  }
  // similar function for links
  // the final callback for links should call renderGraph
});
I'm new to node/async, and I don't understand the following behavior:
function echo(x) {
  console.log(x)
  return x
}

list = ['a', 'b', 'c', 'd', 'e']
async.concatLimit(list, 2, echo, x => console.log(x));
Result:

-> node test.js
a
b

Shouldn't it wait instead of discarding the other elements? Also, the callback didn't happen as I expected.
@sayanarijit the iteratee, in this case echo, is passed a callback that it needs to invoke before async starts processing more items:
function echo(x, callback) {
  console.log(x)
  callback(null, x)
}

list = ['a', 'b', 'c', 'd', 'e']
async.concatLimit(list, 2, echo, (err, results) => console.log(results))
See the concatLimit docs for more info.
Hi, I'm new to async and have been trying to figure out how the filter method works. Here's a sample for filtering fields whose value is an odd number:
var async = require('async');

async.filter({a: 1, b: 2, c: 3}, function(num, callback) {
  callback(null, num & 1);
}, function(err, results) {
  console.log(results, typeof results);
});
It returns undefined 'undefined'. I am not sure what I'm doing wrong here. Would someone help me out? Thanks!
The iteratee expects a boolean in the callback.
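A minimal sketch of that suggestion, assuming the only change needed is coercing the bitwise result to an explicit boolean:

async.filter({a: 1, b: 2, c: 3}, function (num, callback) {
  // pass a real boolean instead of 0/1
  callback(null, (num & 1) === 1);
}, function (err, results) {
  console.log(results); // expected: the odd values, e.g. [1, 3]
});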
const p = new Promise((resolve, reject) => {
  setTimeout(() => {
    resolve(1);
  }, 20000);
});

async.map(configs, async item => {
  const res = await p;
  console.log(item);
  return res;
}, (err, result) => {
  console.log('^^^^', err, result);
});
Why is the async.map callback never called in this code? It seems like it does not wait for the p promise?
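For comparison, a minimal self-contained sketch (with a shorter timeout and a hypothetical items array standing in for configs) where async.map with an async-function iteratee does reach the final callback, assuming a version of async that wraps async functions automatically (v2.3+ / v3):

const async = require('async');

const p = new Promise((resolve) => {
  setTimeout(() => resolve(1), 200);
});

const items = ['a', 'b', 'c']; // hypothetical stand-in for configs

async.map(items, async (item) => {
  const res = await p; // each iteratee awaits the same promise
  return `${item}:${res}`;
}, (err, results) => {
  // expected after ~200ms: null [ 'a:1', 'b:1', 'c:1' ]
  console.log(err, results);
});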
I am using currying to pass the callback. Is that a good practice, or should I use call?
fetch(...)
  .then((results) => {
    let list = ...
    async.mapLimit(list, async (item) => ... );
    return ... ?
    /// ???
  });
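A minimal sketch of one way to wire this up without currying, assuming async v3 (where mapLimit returns a promise when no final callback is passed) and a hypothetical processItem helper and endpoint:

fetch('https://example.com/data') // hypothetical endpoint
  .then((response) => response.json())
  .then((list) => {
    // with no final callback, async v3's mapLimit returns a promise,
    // so it can simply be returned to keep the chain flat
    return async.mapLimit(list, 5, async (item) => processItem(item));
  })
  .then((results) => console.log(results))
  .catch((err) => console.error(err));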
Hello, how do I use async.queue in a sequential manner? I've read that for a static array of tasks sequential execution can be achieved through async.series, and that if one requires push/pop one should use queue instead. I've set up my queue as

const RequestQueue = queue(() => {}, 1);

but the 2nd item (a fetch request) always executes before the 1st item (also a fetch request) finishes.
Sample Task:
retrieveConnectionsTask = () => {
  RequestWrapper.get("/myEndpoint")
    .then(response => response.json())
    .then(data => {
      // doSomethingWithData
      console.log("retrieveConnectionsTask Returned");
      console.dir(this.state.connections);
    })
    .catch(error => console.log(error));
};
would appreciate any advice.
Thank you!
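A minimal sketch of one way this could be wired up, assuming the tasks pushed onto the queue are functions that return promises (retrieveConnectionsTask would need to return its promise chain for this to work): the worker must only call its callback once the promise settles, otherwise a concurrency of 1 does not serialize anything.

const RequestQueue = queue((task, callback) => {
  // task is assumed to be a function returning a promise
  task()
    .then(() => callback())
    .catch(err => callback(err));
}, 1);

// usage: push promise-returning task functions
RequestQueue.push(retrieveConnectionsTask);
RequestQueue.push(someOtherRequestTask); // hypothetical second task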
Hello everyone.
You might be interested in an open-source project I recently started, which is basically a rewrite of Async.js but aimed at exclusive usage of async/await and promises.
The code is here: https://github.com/nicolas-van/modern-async
I would welcome any comment.
I also have a library that overlaps with async. Mine is iter-tools, and it only focuses on iterables, but I think it has significant advantages. It ensures resources are released (by calling iterator.return() if it is present), it has a better approach to parallelization, and it offers more performant implementations of methods over sync iterators. In 8.0 I plan to change my methods around a bit so that import { asyncMap } from 'iter-tools' becomes import { map } from 'iter-tools/async', which I think will be more familiar to users of this library.