These are chat archives for arenanet/api-cdi

27th
Feb 2016
smiley
@codemasher
Feb 27 2016 00:17
does this even still exist in php7?
a reason why i always used fsockopen back in the day when i made http://worldofpadman.net
we had no curl server-side but Suhosin installed :D
so it was not only pretty hard to get into the server, but also out
I dunno, I think we're still using php5 for the production stuff I run on the side
smiley
@codemasher
Feb 27 2016 00:27
but i also think the curl_multi along with throttling is not much of a problem, unless you basically disable throttling by allowing an insane amount of concurrent requests
which causes those neat green spikes on /v2/items in your fancy log
:D
does curl_multi have a request limit by default?
smiley
@codemasher
Feb 27 2016 00:28
doesn't seem so
I was under the impression that it ran all of them concurrently unless you asked it not to
smiley
@codemasher
Feb 27 2016 00:29
(although i haven't dug that deep into it because i had throttling planned from the beginning)
default is no limit, it seems
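(For reference, a rough sketch, not from the chat, of one way to throttle curl_multi: keep a fixed window of transfers in flight and refill it as handles finish. fetchThrottled and the callback are made-up names:)

```php
<?php
// Hypothetical sketch of a throttled curl_multi loop: keep at most
// $maxConcurrent transfers in flight instead of firing everything at once.
function fetchThrottled(array $urls, int $maxConcurrent, callable $onResponse): void
{
    $mh       = curl_multi_init();
    $inFlight = 0;

    $addNext = function () use (&$urls, &$inFlight, $mh) {
        if (!$urls) {
            return;
        }
        $ch = curl_init(array_shift($urls));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $inFlight++;
    };

    // prime the window
    for ($i = 0; $i < $maxConcurrent; $i++) {
        $addNext();
    }

    while ($inFlight > 0) {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh, 1.0);

        // hand off each finished transfer, then refill the window
        while ($done = curl_multi_info_read($mh)) {
            $ch = $done['handle'];
            $onResponse(curl_multi_getcontent($ch));
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
            $inFlight--;
            $addNext();
        }
    }

    curl_multi_close($mh);
}
```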
smiley
@codemasher
Feb 27 2016 00:31
i wonder why all the "RollingCurl" implementations don't use this
almost sure there has to be a reason :D
Pat Cavit
@tivac
Feb 27 2016 00:32
probably because running EVERYTHING ALL AT ONCE is usually a bad idea
smiley
@codemasher
Feb 27 2016 00:33
heh
but that's what people nowadays want!
those neat green spikes
smiley
@codemasher
Feb 27 2016 00:35
yass
so, that's all out of your 100Mbit line? :D
pretty sure we've got GbE links between the servers
but that's not necessarily network traffic, it's just bytes fetched from the cache
smiley
@codemasher
Feb 27 2016 00:37
12.5 MB/s is pretty much 100Mbit (12.5 × 8 = 100)
(which may or may not be running locally)
smiley
@codemasher
Feb 27 2016 00:38
ah^
meh, will go to bed again - brain bubbling with ideas but body too sleepy :zzz:
David Reeß
@queicherius
Feb 27 2016 16:29
Anyone here have a tip for serializing & unserializing a JS object to a string (for saving in redis) without blocking? (Or at least something more performant than JSON.parse) :sob:
idivait
@idivait
Feb 27 2016 17:01
Is there any way you can chunk the data before parsing? If not, this is async, but slower overall: http://azimi.me/2015/07/30/non-blocking-async-json-parse.html?utm_source=javascriptweekly&utm_medium=email
Basically it uses the non-blocking nature of the Fetch API.
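(The article's trick in a few lines, for reference. Browser-only, since it leans on the Fetch API's Response object:)

```js
// Wrap the string in a Response and let .json() hand back a promise
// instead of parsing synchronously on the spot. Browser-only; Node
// had no Fetch API at the time.
function parseAsync(jsonString) {
  return new Response(jsonString).json();
}

parseAsync('{"hello": "world"}').then(obj => console.log(obj.hello));
```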
David Reeß
@queicherius
Feb 27 2016 17:08
That's really interesting. Sadly I am on node - which doesn't have the fetch API :( And the polyfills for it just use the blocking parsing call - basically the same as wrapping the parse in a setTimeout. It still blocks, just later. :/
idivait
@idivait
Feb 27 2016 17:17
You might check out this: https://www.npmjs.com/package/json-parse-stream . But from what I've read, you sacrifice performance to get it streaming.
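(I'm not sure of json-parse-stream's exact API, so here's the same streaming idea sketched with the better-known JSONStream package instead: the parser emits matched sub-objects as the bytes arrive, rather than doing one big blocking parse:)

```js
// Streaming JSON parsing with JSONStream (a stand-in for
// json-parse-stream; check that package's README for its actual API).
const stream = require('stream');
const JSONStream = require('JSONStream'); // npm install JSONStream

const bigJsonString = '{"items":[{"id":1},{"id":2},{"id":3}]}'; // stand-in payload

// turn the in-memory string into a readable stream
const source = new stream.Readable({ read() {} });
source.push(bigJsonString);
source.push(null); // signal end of data

source
  .pipe(JSONStream.parse('items.*'))       // emit each element of "items"
  .on('data', item => console.log(item.id))
  .on('end', () => console.log('done'));
```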
David Reeß
@queicherius
Feb 27 2016 17:18
Huh. I'll try it out in a second. Wonder how that didn't come up for me after all the stuff I googled. :D
idivait
@idivait
Feb 27 2016 17:20
I got to it from an issue on node requesting a native streaming parser.
David Reeß
@queicherius
Feb 27 2016 17:33
Hmm, the overhead of converting the string I have into a stream and then parsing it with that is pretty high. Especially because in the end I want a complete object again, and with this I would have to build it myself...
I really hope it makes it into core at some point
queicherius @queicherius going to try and build the full object out of the streaming parser again. zzz
David Reeß
@queicherius
Feb 27 2016 18:01
NVM that. Just parsing a 100 MB object takes about a second with native JSON.parse and more than a few minutes with that library. :/
idivait
@idivait
Feb 27 2016 18:05
Blech. Yeah, was worried bout that.
David Reeß
@queicherius
Feb 27 2016 18:30
https://github.com/queicherius/playground/blob/master/javascript/serializing-big-objects.md - if anyone has more ideas, let me know. For now I guess I'll live with the blocking call. :zzz:
Thanks for the library links tho @idivait :)
Should write a node-module which does deserialization off the JS heap on a separate thread, then provides a proxy object which allocates onto the JS heap on-demand
I'd be kinda surprised if such a thing didn't already exist, parsing large JSON blobs seems like a common concern
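(A sketch of that idea using Node's worker_threads module, which didn't exist yet in 2016; child_process was the option back then. Note the catch: postMessage structured-clones the parsed object back onto the main heap, which is exactly the cost the proxy-object idea tries to avoid:)

```js
// Parse off the main thread with worker_threads (Node >= 12).
// Caveat: postMessage still copies the result onto the main heap,
// the very allocation the proxy-object idea above tries to dodge.
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

if (isMainThread) {
  const raw = JSON.stringify({ hello: 'world' }); // stand-in for the 100 MB string
  new Worker(__filename, { workerData: raw })
    .on('message', obj => console.log('parsed off-thread:', obj.hello));
} else {
  // worker: parse the raw string and ship the object back
  parentPort.postMessage(JSON.parse(workerData));
}
```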
yessss, https://www.gw2pvp.de/web/ has been updated to use /v2/pvp/standings
windwarrior
@windwarrior
Feb 27 2016 21:16
Can't you like use a module written in C, and something like node-ffi?
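(For reference, this is the shape node-ffi bindings take, using the package's stock libm example; a real binding would declare a C JSON parser's functions the same way:)

```js
// node-ffi's canonical example: bind libm's ceil() from JS.
const ffi = require('ffi'); // npm install ffi

const libm = ffi.Library('libm', {
  ceil: ['double', ['double']], // return type, argument types
});

console.log(libm.ceil(1.5)); // 2
```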
David Reeß
@queicherius
Feb 27 2016 21:25
Gonna add that to the list of "things to try", but it feels like overkill for a problem as simple as JSON parsing...
windwarrior
@windwarrior
Feb 27 2016 22:37
well yeah might be
but "make it fast" sometimes means porting code to a faster (or more controllable) enviroment