
- May 20 05:04
ocramz on gh-pages

Add `sampling` (compare)

- May 19 09:03
ocramz on gh-pages

Add kdt, Supervised Learning se… (compare)

- Apr 14 01:32 tonyday567 removed as member
- Jan 30 07:37
ocramz on gh-pages

Add arrayfire (compare)

- Jan 02 12:51
ocramz on gh-pages

add inliterate (compare)

- Jan 02 12:43
ocramz on gh-pages

update hvega entry (compare)

- Jul 01 2019 09:43 dmvianna added as member
- Jun 15 2019 04:55
ocramz on gh-pages

Add pcg-random (compare)

- Jun 14 2019 16:08 ocramz labeled #42
- Jun 14 2019 16:08 ocramz opened #42
- Jun 06 2019 18:21
ocramz on gh-pages

Fix graphite link Merge pull request #41 from alx… (compare)

- Jun 06 2019 18:21 ocramz closed #41
- Jun 06 2019 17:32 alx741 opened #41

This seems like a crowd that could potentially benefit from information in this blog post: https://www.reddit.com/r/haskell/comments/edr9n4/random_benchmarks/

is anyone else experiencing long compilation times with Frames?

Hi!

I'm proposing a new lens-based API for statistics: bos/statistics#162 What do you think about it?

TL;DR example of use: `meanOf (each . filtered (>0) . to log)` will compute the mean of the logarithm of every positive number.
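For a sense of how such a lens-based API could work, here is a hypothetical sketch (not the actual code from bos/statistics#162) of a `meanOf` built on `foldlOf'` from the `lens` package, accumulating a running (sum, count) pair in one pass over any Fold:

```haskell
import Control.Lens
import Data.Monoid (Endo)

-- Hypothetical sketch of a lens-based `meanOf`: accumulate (sum, count)
-- with a strict left fold over whatever the optic focuses on.
meanOf :: Fractional a => Getting (Endo (Endo (a, Int))) s a -> s -> a
meanOf l = finish . foldlOf' l step (0, 0)
  where
    step (s, n) x = (s + x, n + 1)
    finish (s, n) = s / fromIntegral n

main :: IO ()
main = print (meanOf (folded . filtered (> 0)) [1, -2, 3 :: Double])  -- 2.0
```

The unwieldy `Getting (Endo (Endo …))` constraint is just the concrete type `foldlOf'` demands; the proposal's actual signatures may differ.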

Also, various workarounds are possible, e.g. splitting the package into core algorithms with a less nice API, plus nice API wrappers.

@Shimuuar that's a cool idea :) I've never strayed away from the blessed-yet-horrific `lens` myself (even if quick ways to convert exist, considering another lib for the sake of error messages could be a good idea), but in the meantime it's safe to stick to `lens` due to its widespread use anyway.

You'd expose prisms if some statistics couldn't be computed for some input data.

same for stddev: multiply each sample by `σ'/σ`

There are also notions of “optics” that might be a different/useful fit here. I’m thinking of another Chris Penner post: https://chrispenner.ca/posts/algebraic, about “Algebraic Lenses”, which he characterizes thus: “an Algebraic lens allows us to run some aggregation over a collection of substates of our input, then use the result of the aggregation to pick some result to return.” Maybe something to keep an eye on. I’m thinking about how those optics and “Kaleidoscopes” fit in for map/reduce-type operations.

Anyway, the lens thing is cool! I wonder how easy/hard it would be to have conversions between the Control.Foldl versions of these things. The Applicative instance of Folds is useful for combining one-pass operations on the same data. Is there some equally straightforward way to combine `meanOf folded` and `sumOf folded` so that the fold only happens once? That might go back to the idea of “meanOf” as the optic, and then maybe there’s a way to compose the optics?
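For reference, the Control.Foldl side of this is already one-pass out of the box: the Applicative instance of `Fold` fuses the traversals. A minimal sketch using the `foldl` package:

```haskell
import qualified Control.Foldl as L

-- Combining two statistics via the Applicative instance of Fold:
-- both are computed in a single traversal of the input list.
meanAndSum :: L.Fold Double (Double, Double)
meanAndSum = (,) <$> L.mean <*> L.sum

main :: IO ()
main = print (L.fold meanAndSum [1, 2, 3, 4])  -- (2.5,10.0)
```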
As for foldl-like functionality, I'm not sure; it seems difficult, since the accumulator type leaks into the type signatures. But it could be possible.

Another problem is that one frequently *needs* multiple passes over the data to avoid precision loss. Numerically stable computation of variance requires computing the mean first.
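To make that concrete, here is a naive illustration of the two-pass approach (base Haskell only; the name `variance2` is just for this sketch):

```haskell
-- Two-pass variance: pass 1 computes the mean, pass 2 averages squared
-- deviations from it. This avoids the catastrophic cancellation of the
-- one-pass textbook formula E[x^2] - (E[x])^2.
variance2 :: [Double] -> Double
variance2 xs = sum [(x - m) ^ (2 :: Int) | x <- xs] / n
  where
    n = fromIntegral (length xs)
    m = sum xs / n

main :: IO ()
main = print (variance2 [1, 2, 3, 4])  -- population variance: 1.25
```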

Foldl dodges that accumulator-type-leaking problem by quantifying over it and then adding an extra bit to the data type so you can return something else, thus separating the accumulator type from the return type. Then it is an Applicative in that returned type rather than the accumulator type.
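That trick can be sketched in a few lines (simplified, following the shape of the `foldl` package's `Fold` type):

```haskell
{-# LANGUAGE ExistentialQuantification #-}

import Data.List (foldl')

-- Simplified shape of Control.Foldl's Fold: the accumulator type x is
-- existentially quantified, so it never appears in the signature, and
-- the final function maps it to the separate result type b.
data Fold a b = forall x. Fold (x -> a -> x) x (x -> b)

runFold :: Fold a b -> [a] -> b
runFold (Fold step begin done) = done . foldl' step begin

-- A mean fold whose (Double, Int) accumulator stays hidden.
meanFold :: Fold Double Double
meanFold = Fold (\(s, n) x -> (s + x, n + 1)) (0, 0 :: Int)
                (\(s, n) -> s / fromIntegral n)

main :: IO ()
main = print (runFold meanFold [1, 2, 3])  -- 2.0
```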

Not sure how the multi-pass part could work, but it’d be very cool if two multi-pass computations could be composed such that they need only do the max of the individual number of passes each needs. I wonder if there is a way to stack them, as a type-level list, like a list of effect handlers?

@Shimuuar Yep, I think what @adamConnerSax is mentioning is realizable via `foldOver` (http://hackage.haskell.org/package/foldl-1.4.5/docs/Control-Foldl.html#v:foldOver), to which you give any lens/prism/traversal, and a Fold from http://hackage.haskell.org/package/foldl-statistics
(cc @MMesch didn't you use that pattern in the past?)
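A minimal sketch of that pattern, assuming the `lens` and `foldl` packages (the name `positiveMean` is just for illustration):

```haskell
import qualified Control.Foldl as L
import Control.Lens (folded, filtered)

-- foldOver runs a Control.Foldl Fold over whatever a lens-style optic
-- focuses on: here, the mean of only the positive elements, in one pass.
positiveMean :: [Double] -> Double
positiveMean = L.foldOver (folded . filtered (> 0)) L.mean

main :: IO ()
main = print (positiveMean [1, -2, 3])  -- 2.0
```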

I think that’s all there is for an exact median, but there are approximate methods, e.g. https://github.com/tonyday567/online/blob/master/online/src/Online/Medians.hs#L23

hi, Hugging Face has released a new Rust implementation of their byte-pair tokenizers: https://github.com/huggingface/tokenizers

they are looking for language-binding contributions. Haskell bindings would be more than welcome; they would fill a gap we currently have with hasktorch, where we are working towards pre-trained BERT and GPT-2 models people can use in their projects

last time I looked at hasktorch it was missing things (or at least that was my impression). I would prefer to migrate to hasktorch in the next months/years, provided my supervisor is ok with that & I can train people on the job with it ;)

But then there may be more contributions from my side ..

I'm steadily working towards what I informally call "haskformers"...

watch it happen here: hasktorch/hasktorch#269

this is just a transformer language model (GPT-2 style) implementation.

later we want to be able to load Hugging Face transformers models and also use their tokenizers, hence the call for help