Activity
  • Jun 16 17:16 NickSeagull on master: Remove unused email (compare)
  • Jun 16 17:16 NickSeagull closed #2
  • Jun 16 17:16 NickSeagull commented #2
  • Mar 06 02:25 dmvianna closed #33
  • Feb 04 2021 22:49 flak153 removed as member
  • May 20 2020 05:04 ocramz on gh-pages: Add `sampling` (compare)
  • May 19 2020 09:03 ocramz on gh-pages: Add kdt, Supervised Learning se… (compare)
  • Apr 14 2020 01:32 tonyday567 removed as member
  • Jan 30 2020 07:37 ocramz on gh-pages: Add arrayfire (compare)
  • Jan 02 2020 12:51 ocramz on gh-pages: add inliterate (compare)
  • Jan 02 2020 12:43 ocramz on gh-pages: update hvega entry (compare)
  • Jul 01 2019 09:43 dmvianna added as member
  • Jun 15 2019 04:55 ocramz on gh-pages: Add pcg-random (compare)
  • Jun 14 2019 16:08 ocramz labeled #42
Yves Parès
@YPares
Hey guys, I'm looking for simple models in Haskell to generate random unique names (as in first names), just to identify stuff (as random UUIDs would, but in a pronounceable manner). Grenade seems to be able to train models that could do that, but maybe it's overkill, and also it doesn't seem really maintained. Any idea? I'd like something light that generates serializable models so it can easily be embedded in a Haskell application
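(An illustrative aside: the kind of lightweight, serialisable model being asked about could be as simple as a character-bigram Markov chain trained on a list of example names. The sketch below is hypothetical and not taken from Grenade or any other library mentioned here; the model is just a Map, so it can be persisted with any serialisation library.)

```haskell
import qualified Data.Map.Strict as M
import System.Random (randomRIO)

-- For each character, a weighted table of characters that may follow it.
-- '^' marks the start of a name, '$' the end.
type Model = M.Map Char (M.Map Char Int)

-- Count character bigrams over the training names.
train :: [String] -> Model
train names = M.fromListWith (M.unionWith (+))
  [ (a, M.singleton b 1)
  | name <- names
  , (a, b) <- zip ('^' : name) (name ++ "$")
  ]

-- Draw one character from a weighted table.
pick :: M.Map Char Int -> IO Char
pick table = do
  r <- randomRIO (1, sum (M.elems table))
  let go acc ((c, w) : rest)
        | acc + w >= r = c
        | otherwise    = go (acc + w) rest
      go _ []          = '$'
  pure (go 0 (M.toList table))

-- Walk the chain from the start marker until the end marker is drawn.
generate :: Model -> IO String
generate model = go '^'
  where
    go c = case M.lookup c model of
      Nothing    -> pure ""
      Just table -> do
        next <- pick table
        if next == '$' then pure "" else (next :) <$> go next
```

Trained on a few hundred first names, `generate` tends to produce short, name-like strings; uniqueness would still need a retry-on-collision loop or a numeric suffix.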
Man of Letters
@man_of_letters:mozilla.org
[m]
@YPares: hi! what kind of neural networks are these (in newbie terms: fully connected, convolutional, recurrent, etc.)? Do you have in mind some classic model described somewhere in detail?
also, is the pronounceability learned by imitation and not verified, or is there a loss function that really enforces that during the training? (pardon me if that's trivial to google)
Samuel Schlesinger
@SamuelSchlesinger

chreekat: python is great for doing science because it runs your code no matter what. No matter what they say about pre-registering experiments, scientists just love to tinker and make up hypotheses after the fact

I must say, I've had some horrible experiments where I run Python code for hours and then finally it prints out:

Traceback (most recent call last):
  File "boop.py", line 1, in <module>
    print(x)
NameError: name 'x' is not defined
Though that's completely consistent with what you said :P
Man of Letters
@man_of_letters:mozilla.org
[m]
and they say Haskell compilation takes a long time...
anton_5
@anton_5:matrix.org
[m]
Looks like a really nice library! https://backprop.jle.im/
Marco Zocca
@ocramz_:matrix.org
[m]
anton_5: backprop is very handy. I've used it a few times
1 reply
Man of Letters
@man_of_letters:mozilla.org
[m]
anton_5: what are the use cases you are thinking of?
1 reply
Man of Letters
@man_of_letters:mozilla.org
[m]
oh, cool; for ML, as in neural nets, backprop should be great; at least, regarding performance on CPU and if your nets have large tensors in them, as opposed to very many smaller ones (it uses blas/lapack under the hood)
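(For readers skimming later: a minimal sketch of what using backprop looks like, following the style of its documentation at https://backprop.jle.im/; the quadratic function being differentiated is just an illustration.)

```haskell
import Numeric.Backprop

main :: IO ()
main = do
  -- f(x) = x^2 + 3x, evaluated and differentiated at x = 5
  print (evalBP (\x -> x ^ (2 :: Int) + 3 * x) (5 :: Double))  -- 40.0
  print (gradBP (\x -> x ^ (2 :: Int) + 3 * x) (5 :: Double))  -- 13.0
```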
anton_5
@anton_5:matrix.org
[m]
Oh, nice, good to know, thanks! I wonder if it can work with accelerate as well.
Man of Letters
@man_of_letters:mozilla.org
[m]
I'd love to see somebody do that
anton_5
@anton_5:matrix.org
[m]
Looks like accelerate has bindings to cublas, so in theory it should be possible to substitute cublas for the regular blas in backprop.
Or use the arrayfire bindings.
The goal library (https://gitlab.com/sacha-sokoloski/goal), used in https://elifesciences.org/articles/64615, is probably of interest to some datahaskellers :)
Kevin Brubeck Unhammer
@unhammer
How does that compare to http://repa.ouroborus.net/ ?
Ananthakrishna Gopal
@ananthakrishnagopal
Hi! I come from a pure math background with an interest in data. I recently stumbled upon Haskell and found it really cool. I feel that a really nice way for me to get familiar with and learn more about Haskell would be to work on some cool stuff and contribute to the community. How do I go about all this? Thanks :)
I have experience coding with C/C++/OpenMP/MPI/CUDA/Python and some basic ML libraries
1 reply
Man of Letters
@man_of_letters:mozilla.org
[m]
Welcome! So, would you like to contribute to a Haskell data-related project? Right now, or after learning Haskell in a conventional way? Haskell does take some adjusting, though the challenging experience is described as beneficial by the survivors.
Kevin Brubeck Unhammer
@unhammer
I'd say the best thing one can do when starting out is to use haskell to solve a real problem. There will be bumps in the road. Maybe you will fix some of them along the way, which would be great
Ananthakrishna Gopal
@ananthakrishnagopal
@man_of_letters:mozilla.org I think I would like to contribute to a project right now and pick up things I don't know along the way. Thanks!
@unhammer Yup, I think so too :)
Man of Letters
@man_of_letters:mozilla.org
[m]
cool! does any active project strike your fancy?
Kevin Brubeck Unhammer
@unhammer
anton_5
@anton_5:matrix.org
[m]
Yep
Ananthakrishna Gopal
@ananthakrishnagopal
@unhammer I don't think I'll have access to a CUDA-enabled system as of now, at least for a few more weeks. @man_of_letters:mozilla.org I was thinking along the lines of something related to computational geometry, or something of that sort?
Ananthakrishna Gopal
@ananthakrishnagopal
hmm
Ananthakrishna Gopal
@ananthakrishnagopal
Just a question: if my company says that any work I generate arising out of or in connection with my association with the company counts as confidential, would I be able to contribute to these kinds of projects, or would I need to talk things over with my company first? I'm new to all of this, so please don't mind the basic questions.
Man of Letters
@man_of_letters:mozilla.org
[m]
re geometry, this is not computational geometry, I think, but a bit related: https://gitlab.com/sacha-sokoloski/goal
re your company, IANAL, but if you are not doing it on company time, they should not have a say, and if you are, they probably should
Ananthakrishna Gopal
@ananthakrishnagopal
thanks! will take a look
chreekat
@b:chreekat.net
[m]
Some companies do claim to have rights to any and all work regardless of when it's done, but then, those claims may not be enforceable, so you probably want specific legal advice in any case
*may not be enforceable depending on local laws
1 reply
Tony Day
@tonyday567
Anyone interested in a reboot of data haskell? I'm doing some small scale numerics and wouldn't mind seeing how others are developing, what libraries are active and just to say hello to fellow travellers.
1 reply
Adam Conner-Sax
@adamcs:matrix.org
[m]
Sure! I’ve been working on a Haskell->Stan (the Bayesian modeling language) interface. Someone else had done a bunch of the runtime stuff (hs-cmdstan), and I’ve been building things around writing the Stan code from Haskell, as well as integrated data handling, etc. It’s kind of a mess right now but it does the job. Thinking about integration with Monad-Bayes.
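(For context on the runtime side: driving CmdStan from Haskell ultimately means invoking a compiled model binary. hs-cmdstan wraps that; the sketch below shows only the bare process call with hypothetical file names, not hs-cmdstan's actual API.)

```haskell
import System.Process (callProcess)

-- Invoke a compiled Stan model the way the CmdStan command line expects:
--   ./my_model sample data file=my_data.json output file=samples.csv
runStanModel :: IO ()
runStanModel =
  callProcess "./my_model"
    [ "sample"
    , "data",   "file=my_data.json"
    , "output", "file=samples.csv"
    ]
```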
Man of Letters
@man_of_letters:mozilla.org
[m]
Hi! I'm implementing yet another automatic differentiation library with a bunch of co-conspirators and a major struggle in that is wrapping my head around some data science concepts, e.g., data normalization (Mikolaj/horde-ad#42).
4 replies
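(Since data normalisation is mentioned: a tiny illustrative sketch of the usual zero-mean, unit-variance standardisation, on plain lists for clarity; nothing here is taken from horde-ad.)

```haskell
-- Standardise a feature column to zero mean and unit variance (z-scores).
normalize :: [Double] -> [Double]
normalize xs = map (\x -> (x - mu) / sigma) xs
  where
    n     = fromIntegral (length xs)
    mu    = sum xs / n
    -- guard against constant features to avoid division by zero
    sigma = max 1e-12 (sqrt (sum [ (x - mu) ^ (2 :: Int) | x <- xs ] / n))
```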
Kevin C
@dataopt
@adamcs:matrix.org I am looking into Monad-Bayes too and I'm trying to see if it can get traction. I haven't used Stan but have used Turing.jl. I would love to see how monad-bayes could fit into dataHaskell.
adamcs
@adamcs:matrix.org
[m]
@dataopt: I’m intrigued by monad-Bayes as well. I doubt it will be fast enough for my purposes but it also opens up some algorithmic possibilities that might balance that out. Plus I’d rather keep all of it in Haskell as it would simplify the data-pipeline.
Separately, I continue to think that the difficulty of managing flexible row types is a key issue. I’ve opted for Frames mostly, but I often end up with monstrous sets of constraints when writing anything polymorphic in the row content.
1 reply
chreekat
@b:chreekat.net
[m]
Why is that?
Adam Conner-Sax
@adamConnerSax
Not sure about the randomness part, but for the Hamiltonian Monte Carlo bit it’s more that Stan is intensely optimized. So it would, I think, take a lot of work for another implementation to catch up. But there are interesting variations on HMC that might be easier to implement on a fresh Haskell library than on a large scala/c++ codebase.
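(For the curious, the inner loop Stan optimises so heavily is essentially the leapfrog integrator; a bare sketch below, assuming a unit mass matrix and using illustrative names, nothing Stan-specific.)

```haskell
-- One leapfrog step of Hamiltonian Monte Carlo: half-step the momentum,
-- full-step the position, half-step the momentum again.
leapfrog
  :: ([Double] -> [Double])    -- gradient of the potential energy (negative log-density)
  -> Double                    -- step size
  -> ([Double], [Double])      -- (position, momentum)
  -> ([Double], [Double])
leapfrog gradU eps (q, p) =
  let pHalf = zipWith (\pj gj -> pj - 0.5 * eps * gj) p (gradU q)
      q'    = zipWith (\qj pj -> qj + eps * pj) q pHalf
      p'    = zipWith (\pj gj -> pj - 0.5 * eps * gj) pHalf (gradU q')
  in (q', p')
```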
Tony Day
@tonyday567
The Prob category is getting a lot of love in category theory circles as well, and monad-bayes is a good hunting ground for ideas.
gitter is old tech now right? where would the new kids want to hang out now?
1 reply
We will never beat the blas bindings of hmatrix in performance, and I think stan bindings are in the same category - will remain useful for a good while.
Man of Letters
@man_of_letters:mozilla.org
[m]
gitter has been somehow ported or bridged to Matrix and matrix (directly, not via gitter) is where I post from right now: https://matrix.to/#/!JSHTXUPHaIJhjEINXX:matrix.org?via=m.topoi.dev&via=gitter.im&via=matrix.org
tonyday567
@tonyday567:matrix.org
[m]
Part of my motivation is that I wanted to do a bit of a survey of the state of play for data haskell: what's out there right now and what might be on the radar. data haskell might not be the right name for it though. More like practical machine learning.
Kevin Brubeck Unhammer
@unhammer
I use gitter via IRC ¯\_(ツ)_/¯