Kitson Kelly
@kitsonk
I guess I am missing how "useful" a lockfile in the cache directory is above what is already there. And how it would behave in real life. It needs to be easy to tightly couple it to a specific workload; does a "global" lockfile really help anyone? How do you distribute it so that it can be leveraged easily?
Andy Hayden
@hayd
It seems like generating a lock file and executing against a lock file are two things you want to do explicitly, without filename sniffing/searching.
Ryan Dahl
@ry
deno --lock-write=./deno.lock https://deno.land/std/examples/cat.ts
deno --lock-check=./deno.lock https://deno.land/std/examples/cat.ts
Kitson Kelly
@kitsonk
ok, +1 for that...
and deno --lock-check=https://example.com/deno.lock https://deno.land/std/examples/cat.ts would work as well?
Qwerasd
@qwerasd205
Working on a utf-8 decoder to (hopefully) replace the horribly slow one Deno has for TextDecoder, currently decoding utf-8 text at ~80 MB/s avg. Hopefully error handling doesn't slow that down too much.
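For context, the core of a UTF-8 decoder is a loop that classifies each lead byte and merges its continuation bytes. This is a minimal sketch of that general technique (not the author's gist); it assumes well-formed input and omits the error handling and BOM logic discussed later in the thread:

```typescript
// Sketch of a simple UTF-8 decoder: classify the lead byte, then merge
// continuation bytes. Assumes well-formed input (no validation), which is
// exactly the kind of check that can cost throughput in a real decoder.
function decodeUtf8(bytes: Uint8Array): string {
  const codePoints: number[] = [];
  let i = 0;
  while (i < bytes.length) {
    const b = bytes[i];
    if (b < 0x80) {                   // 1-byte sequence (ASCII)
      codePoints.push(b);
      i += 1;
    } else if ((b & 0xe0) === 0xc0) { // 2-byte sequence
      codePoints.push(((b & 0x1f) << 6) | (bytes[i + 1] & 0x3f));
      i += 2;
    } else if ((b & 0xf0) === 0xe0) { // 3-byte sequence
      codePoints.push(
        ((b & 0x0f) << 12) | ((bytes[i + 1] & 0x3f) << 6) |
          (bytes[i + 2] & 0x3f),
      );
      i += 3;
    } else {                          // 4-byte sequence
      codePoints.push(
        ((b & 0x07) << 18) | ((bytes[i + 1] & 0x3f) << 12) |
          ((bytes[i + 2] & 0x3f) << 6) | (bytes[i + 3] & 0x3f),
      );
      i += 4;
    }
  }
  return String.fromCodePoint(...codePoints);
}
```

A real implementation would also validate continuation bytes and batch the `String.fromCodePoint` calls, since spreading a very large array hits engine-specific argument limits.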
Ryan Dahl
@ry
@kitsonk remote lockfile sounds good!
Qwerasd
@qwerasd205
(And hopefully I can add some more optimizations to speed it up further)
Ryan Dahl
@ry
@qwerasd205 cool. Let's get a benchmark up before you land, so we can watch the improvement.
Do you have a good test script?

If you could add it to this:
https://github.com/denoland/deno/blob/master/tools/benchmark.py#L22

we will get memory and execution speed information automatically in https://deno.land/benchmarks.html

Qwerasd
@qwerasd205
Currently I'm just timing some decodes of (rather artificial, about to go grab some actual real world examples) large text.
Ryan Dahl
@ry
(when you add the benchmark, don't make it too crazy - it has to run in CI - and that time is important to optimize - no more than 30 seconds runtime I'd say - preferably less)
Qwerasd
@qwerasd205
I really quite dislike dealing with git, to be honest, and don't already have a Deno repo set up on my machine, so it's a pain to deal with 😅 -- In the issue thread about the poor text decoder perf, @kitsonk offered to get the utf8 decoder I'm writing working in Deno.
Kitson Kelly
@kitsonk
yeah, I can also try to cover off the benchmark too. I feel guilty because I think I originally did the text decoder implementation :blush:
Qwerasd
@qwerasd205
Well, from the comment in it, it seems like you based it on another implementation that wasn't performance-focused.
Current state of my decode function (if you have questions about the test files just ask)
Deno's TextDecoder:
ENGLISHBIBLE.txt [4.20MB] Decode finished in 0.978s -- 4.29 MB/s
JAPANESEBIBLE.txt [5.35MB] Decode finished in 0.924s -- 5.79 MB/s
printableUTF8.txt [0.22MB] Decode finished in 0.028s -- 7.85 MB/s
My function:
ENGLISHBIBLE.txt [4.20MB] Decode finished in 0.062s -- 67.72 MB/s
JAPANESEBIBLE.txt [5.35MB] Decode finished in 0.054s -- 99.05 MB/s
printableUTF8.txt [0.22MB] Decode finished in 0.002s -- 109.89 MB/s
Kitson Kelly
@kitsonk
@qwerasd205 awesome!
Qwerasd
@qwerasd205
Thanks! ☺️ I still want to try to make it just a bit faster, though.
Daniel Buckmaster
@crabmusket

it seems like currently, npm lock files solve 2 problems:

  1. making sure all devs are working with the same dependencies (including transitive ones), to avoid weird discrepancies
  2. providing security using SRI

the second can indeed be solved by vendoring, but a lot of devs find that distasteful
the first problem is harder, especially if you want to e.g. depend on a major version, instead of a specific commit hash or full x.y.z version

Qwerasd
@qwerasd205
Sweet, up to 80-115 MB/s by changing the char code buffer size.
Daniel Buckmaster
@crabmusket

however, deno actually has a solution to problem 1 built in already: import maps

NPM's lockfile basically says "hey, you asked for version ^2.0 of somelib - so I've installed version 2.2.3". in an import map, that would be described as a mapping from deno.land/x/somelib@^2.0 to deno.land/x/somelib@e5cf671a4. when the import map is checked in, all devs will import the same exact commit, while the code file can still specify the semantically-correct, nice-looking version

add onto that a kind of go.sum-like file containing the SRI of deno.land/x/somelib@e5cf671a4, and both needs are taken care of?
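A sketch of the two pieces being proposed (the module name, commit hash, and digest value here are illustrative, taken from the example above, not a real registry entry): the import map pins a semver-style specifier to one exact revision, and a go.sum-like entry would carry an SRI digest of the pinned file's bytes.

```typescript
import { createHash } from "node:crypto";

// Hypothetical import map, written as a TS literal for illustration:
// source code imports "somelib@^2.0"; the map pins it to one exact commit,
// so every dev who checks out the map resolves the same bytes.
const importMap = {
  imports: {
    "https://deno.land/x/somelib@^2.0/":
      "https://deno.land/x/somelib@e5cf671a4/",
  },
};

// A go.sum-style integrity entry could pair the pinned URL with an
// SRI digest: "sha256-" + base64 of the SHA-256 hash of the fetched bytes.
function sriSha256(data: Uint8Array): string {
  return "sha256-" + createHash("sha256").update(data).digest("base64");
}
```

With both checked in, the import map answers "which version do we all get?" and the digest answers "did we all get the same bytes?".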

Qwerasd
@qwerasd205
Sweet, I was doing a modulo operation where it wasn't required. Removing it sped the function up significantly.
ENGLISHBIBLE.txt [4.20MB] Decode finished in 0.044s -- 95.42 MB/s
JAPANESEBIBLE.txt [5.35MB] Decode finished in 0.042s -- 127.35 MB/s
printableUTF8.txt [0.22MB] Decode finished in 0.002s -- 109.89 MB/s
I'm satisfied with that.
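The kind of win described above is typical of hot decode loops: integer `%` is comparatively expensive, so it pays either to remove it where it isn't needed or, when a buffer size is a power of two, to replace it with a bitwise AND. A hypothetical sketch of the equivalence (not the author's actual change):

```typescript
// Illustrative only: wrapping a non-negative index into a buffer whose
// size is a power of two. Both functions return the same value, but the
// mask avoids an integer division in the hot path.
const BUF_SIZE = 4096; // must be a power of two for the mask trick

function wrapMod(i: number): number {
  return i % BUF_SIZE;       // general, but costs a division
}

function wrapMask(i: number): number {
  return i & (BUF_SIZE - 1); // same result for i >= 0 when BUF_SIZE is 2^n
}
```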
Qwerasd
@qwerasd205
@ry If you've already started working my function into Deno, please note that I've updated the gist with another performance improvement. (I removed excess data from the inline data tables, which has the added benefit of removing an addition operation from one of the access calculations)
ENGLISHBIBLE.txt [4.20MB] Decode finished in 0.042s -- 99.96 MB/s
JAPANESEBIBLE.txt [5.35MB] Decode finished in 0.040s -- 133.72 MB/s
printableUTF8.txt [0.22MB] Decode finished in 0.002s -- 109.89 MB/s
^-^
Kevin (Kun) "Kassimo" Qian
@kevinkassimo
It seems that running cargo build after modifying libdeno contents does not trigger a rebuild?
Qwerasd
@qwerasd205
Oh wow oops there's a bug in the version of my decoder I gisted
Qwerasd
@qwerasd205
@ry are you planning on working on getting it into Deno after the benchmark is merged?
Ryan Dahl
@ry
I am doing maintenance on deno.land right now... there may be problems (especially with TLS certs)
@qwerasd205 sure
Qwerasd
@qwerasd205
Alright.
Siddhant N Trivedi
@sidntrivedi012
@ry Any work on the Deno_website2 where I can help ?
Ryan Dahl
@ry
@sidntrivedi012 you can make the table of contents work in /manual - that would be helpful
also in /style_guide
Siddhant N Trivedi
@sidntrivedi012
@ry Sure, Will start working on it. Thanks.
Qwerasd
@qwerasd205
Deno's TextDecoder:
ENGLISHBIBLE.txt [4.20MB] Decode finished 5 times in 4.892s -- 4.29 MB/s
JAPANESEBIBLE.txt [5.35MB] Decode finished 5 times in 4.672s -- 5.72 MB/s
printableUTF8.txt [0.22MB] Decode finished 5 times in 0.126s -- 8.72 MB/s
My function:
ENGLISHBIBLE.txt [4.20MB] Decode finished 100 times in 4.198s -- 100.01 MB/s
JAPANESEBIBLE.txt [5.35MB] Decode finished 100 times in 3.410s -- 156.86 MB/s
printableUTF8.txt [0.22MB] Decode finished 100 times in 0.156s -- 140.89 MB/s
I am quite happy with the performance. It does favor text with more multi-byte characters, just because of how it's written, and it favors text with a consistent distribution of byte lengths per character (likely due to branch prediction).
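The MB/s figures in this thread are just total bytes divided by elapsed time over repeated decodes. A minimal harness in that style (a sketch, not the author's actual benchmark script; it treats 1 MB as 1024*1024 bytes):

```typescript
// Minimal throughput harness (sketch): run `decode` over `bytes`
// `iterations` times and report average throughput in MB/s.
function benchDecode(
  decode: (b: Uint8Array) => string,
  bytes: Uint8Array,
  iterations: number,
): number {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) decode(bytes);
  const seconds = (performance.now() - start) / 1000;
  const totalMB = (bytes.length * iterations) / (1024 * 1024);
  return totalMB / seconds;
}
```

Timing many iterations, as the numbers above do (5 or 100 passes), amortizes timer resolution and JIT warm-up, which matters for short inputs like the 0.22 MB file.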
Ryan Dahl
@ry
@qwerasd205 nice!
Qwerasd
@qwerasd205
Thanks ^-^
Qwerasd
@qwerasd205
Made a few more micro-optimizations which increase the speed by a few MB/s
Ryan Dahl
@ry
submit a patch
cli/js/text_encoder.ts
Qwerasd
@qwerasd205
I really hate dealing with git, and I don't have an environment set up to make and test changes to Deno, so writing this function is the best I'm gonna be able to do for you 😅 -- I did look at that and noticed I needed to add support for throwing on encoding errors and for ignoring the BOM at the start of the stream (if present), so I added support for those to the function and updated the gist.
It seems like it would be relatively simple to remove the older decoder and insert my function, but I don't have the wherewithal right now to set up an environment to make and test changes to Deno. I'm actually quite tired and about to go to bed now. Good night. 😴
Yusuke Sakurai
@keroxp
@ry can you access this? https://deno.land/std@v0.20.0/http/server.ts deno.land isn't serving previous deno_std releases now.
Nayeem Rahman
@nayeemrmn
  1. making sure all devs are working with the same dependencies (including transitive ones), to avoid weird discrepancies
@crabmusket Deno doesn't do anything about peer dependencies, so this is "solved" already.
Luca Casonato
@lucacasonato
@keroxp This is probably related to the move from denoland/deno_std to denoland/deno/std for the standard library. @ry tried to make this work for pre-0.21.0 versions, but that doesn't seem to work; for now you will have to update to v0.21.0. denoland/registry@7fd99c2
Nayeem Rahman
@nayeemrmn
The directory listing is all wrong, too. https://deno.land/std/ points to the root of denoland/deno. Only the directory listing, though; the version tag thing is different... I think it was already an issue.
Daniel Buckmaster
@crabmusket

@nayeemrmn but I can get a different version of a library depending on when I hit a URL. for example,

  1. colleague A imports foobar@v1 which currently resolves to the equivalent of v1.2.3
  2. foobar author publishes foobar v1.2.4, fixing a small bug
  3. colleague B clones colleague A's new code, and deno downloads foobar@v1, which resolves to the equivalent of v1.2.4

now colleague A observes the small bug, while colleague B doesn't, and the reason why isn't obvious, since they have the same code, but different dependency versions
am I misrepresenting this example?