Peter Müller
@Munter
That's the addition of a cache layer in front of Prismic for our website
Andreas Lind
@papandreou
:heart_eyes_cat:
Gustav Nikolaj
@gustavnikolaj
Hi guys :) Thought I'd break the silence here before we get to the two full calendar months of no action on the channel... What S3 client should one use on node.js these days? I haven't had much luck with the official one, but all the other popular ones I find seem to be abandoned and broken on recent node versions :)
Sune Simonsen
@sunesimonsen
I think Zendesk is using the official @aws-sdk/client-s3, but I also remember it being pretty bad.
Gustav Nikolaj
@gustavnikolaj
The newer one (the scoped package named one) is much better than the old one (package name aws-sdk). Unfortunately that newer one doesn't seem to work with our non-aws s3 implementation from our storage vendor :(
The old one is like 60 megabytes of actual javascript code, and it adds 5 seconds of startup time to my server when run on my development machine :D
Sune Simonsen
@sunesimonsen
haha :joy:
There goes the micro service :sweat:
If there is even such a thing.
Andreas Lind
@papandreou
I think it depends on whether you're planning to use advanced features such as pre-signed POST urls, request signing etc.
Gustav Nikolaj
@gustavnikolaj
I need to have multipart uploads, and that's pretty much it :) It's just for uploading a heapdump once in a while
Peter Müller
@Munter
Don't dump Heap ;)
Gustav Nikolaj
@gustavnikolaj
hi :)
Andreas Lind
@papandreou
:smirk_cat:
Then a super minimal client should be fine, maybe just an HTTP client :)
Gustav Nikolaj
@gustavnikolaj
I found this alternative client, from another s3 compatible storage vendor, which seems to work fine :) https://www.npmjs.com/package/minio
@papandreou yeah, I was considering doing that, but the files are big enough that I'm going to need to split them into multiple requests, so it's a bit more involved than just using an http agent :)
Andreas Lind
@papandreou
Are they really too big to be streamed?
IDK if the HTTP-based api supports PUT with Content-Range
Gustav Nikolaj
@gustavnikolaj

The largest object that can be uploaded in a single PUT is 5 gigabytes. For objects larger than 100 megabytes, customers should consider using the Multipart Upload capability.
https://aws.amazon.com/s3/faqs/

We have an average memory consumption of a couple hundred megabytes, but it will typically be used to debug containers nearing the memory limit. So I'd expect sizes in the interval from ~200 MB to 4 GB. Should be enough with a single PUT then

All the docs I've read so far only mentioned the recommendation of > 100 megs
But it's internal network traffic all of it, so it should be alright :)
Andreas Lind
@papandreou
Watch out for all the PII, private keys etc. that such a heap snapshot might contain :)
Gustav Nikolaj
@gustavnikolaj
Thanks. That's good advice. It's definitely not something you want to be too liberal about
Gustav Nikolaj
@gustavnikolaj
How do you prove, in an objective way, that a node.js server is doing something blocking? :D
Gustav Nikolaj
@gustavnikolaj
My best idea so far is to graph the number of processes running and the throughput in requests per second, and then see if we ever have higher throughput than the number of processes
Sune Simonsen
@sunesimonsen
@Munter @alexjeffburke do you know a good way to add TypeScript definitions to a dynamic library, where I want to expose types for the public API?
Is type annotations in comments an option?
Peter Müller
@Munter
@sunesimonsen I remember a couple of people on twitter talking about being able to write js with jsdoc, and having typescript types extracted from that. The problem is that jsdoc is pretty limited when it comes to generics, if you might want to use those. And when I tried this approach in the very early days of that tooling, the output wasn't super useful
You don't want to write .d.ts files by hand?
Sune Simonsen
@sunesimonsen
I have a setup where I can generate the .d.ts from JSDoc, but I think I'll just write the .d.ts by hand. Then I avoid any new tooling.
It's a pretty limited amount of typing I want to expose.
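The JSDoc route discussed above can be sketched like this (a hypothetical function, not from any real library; `tsc` with `--allowJs --declaration --emitDeclarationOnly` can emit a `.d.ts` from such annotations):

```javascript
/**
 * Hypothetical public API function. The JSDoc annotations carry the types,
 * so the source stays plain JavaScript at runtime and TypeScript only
 * reads the comments when generating declarations.
 * @param {string} input - value to transform
 * @param {{ uppercase?: boolean }} [options] - optional behaviour flags
 * @returns {string} the transformed value
 */
function transform(input, options = {}) {
  return options.uppercase ? input.toUpperCase() : input.toLowerCase();
}

module.exports = { transform };
```

As Peter notes, this works less well once generics are involved, which is one reason hand-written `.d.ts` files can be the simpler choice.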
Peter Müller
@Munter
If it's for unexpected, i have some type definitions i have copied around for a few years now, which take care of most of the API besides assertions
Sune Simonsen
@sunesimonsen
it is for work, I want as few types as I can get away with 😂
Andreas Lind
@papandreou
There’s a library called blocked that uses a setInterval or similar to guesstimate how many milliseconds the event loop has been blocked for, and fires an event if it crosses a threshold you’ve specified.
It has a drop-in replacement called blocked-at that uses async_hooks to also provide a stack trace of where the blocking operation was initiated. Unfortunately it has a big perf overhead, so it’s not suitable for production.
@gustavnikolaj :point_up:
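The event-loop-lag technique described above can be sketched in a few lines. This is a simplified illustration of the idea, not the actual code of `blocked`: a repeating timer measures how late it fires, and any excess over the interval is time the event loop spent blocked.

```javascript
// Sketch of event-loop blocking detection: a timer that should fire every
// intervalMs measures how late it actually fired; the excess is the lag.
function detectBlocking(thresholdMs, onBlocked) {
  const intervalMs = 100;
  let last = Date.now();
  const timer = setInterval(() => {
    const now = Date.now();
    const lag = now - last - intervalMs; // how late did the timer fire?
    last = now;
    if (lag > thresholdMs) onBlocked(lag);
  }, intervalMs);
  timer.unref(); // the monitor alone shouldn't keep the process alive
  return () => clearInterval(timer); // call to stop monitoring
}
```

Unlike `blocked-at`, this tells you *that* the loop was blocked but not *where*; getting a stack trace is what requires the expensive async_hooks machinery.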
Sune Simonsen
@sunesimonsen
[attachment: Screenshot 2021-10-07 at 21.14.28.png]
Scripting with https://github.com/sunesimonsen/transformation is pretty fun and useful.
Gustav Nikolaj
@gustavnikolaj
I am debugging a memory leak in a node.js application. I observe that my rss is way bigger than my heapTotal and external memory segments combined when I inspect process.memoryUsage(). I would expect that heapTotal + external gives me all the memory used by js objects and c++ objects referenced from js. Am I missing something? Wouldn't a runaway rss number be an indication that there's some native code leaking? I am observing up to 70 to 80 percent of memory use being outside of heap and external...
Thanks for the suggestions on the event loop blocking @papandreou. I managed to convince the teams that they actually had a block without having to resort to scientific proofs :) Unfortunately we were sick all week so I never got back to you :(
Andreas Lind
@papandreou
@gustavnikolaj, external will only include memory that is allocated outside the JS heap and correctly reported using Isolate::AdjustAmountOfExternalAllocatedMemory: https://v8docs.nodesource.com/node-4.8/d5/dda/classv8_1_1_isolate.html#ae1a59cac60409d3922582c4af675473e
... so it's not an exact science. V8 doesn't automatically know that malloc gets called.
Maybe that explains the gap, depending on which native modules are involved?
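A quick way to see what *does* get counted in `external` (a sketch, assuming a recent Node version): `Buffer` allocations live outside the V8 heap but are reported to V8, so they show up in `process.memoryUsage().external`, whereas memory a native module plainly `malloc`s only shows up in rss.

```javascript
// Buffers are backed by memory outside the V8 heap, and Node reports that
// memory to V8, so it appears in process.memoryUsage().external. Memory a
// native addon malloc()s without reporting it is invisible here and only
// grows rss -- which is exactly the gap being discussed.
const before = process.memoryUsage().external;

const buf = Buffer.alloc(50 * 1024 * 1024); // 50 MB outside the JS heap

const after = process.memoryUsage().external;
console.log(`external grew by ~${Math.round((after - before) / 1024 / 1024)} MB`);
```

So a large rss-minus-(heapTotal + external) gap is consistent with a native module allocating memory it never reports.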
Gustav Nikolaj
@gustavnikolaj
That could certainly be an explanation
The modules using nan as a dependency (the quickest way I know to check for native modules) are: iconv, sharp, genx and node-expat :)
I'd by default suspect the latter two most, but it actually seems to happen regardless of whether I engage the XML-related bit of the server :D
Andreas Lind
@papandreou
sharp, for sure :)
Gustav Nikolaj
@gustavnikolaj
We saw some very unfortunate interactions between sharp and Ubuntu 18.04. We went from stable memory use to rampant leaks, pretty much making the apps unusable due to OOM-killing... But the leaks seemed mostly solved by enabling jemalloc.
But I'll take a look at sharp :) Thanks for the pointer!
Andreas Lind
@papandreou
At least the symptoms you list are consistent with what I've seen with sharp before.