Sam Stites
@stites
type-driven, on-the-fly generation of lenses without template haskell that is as fast as handwritten instances (with inspection regression testing of Core)
(since backprop works well with lenses, I thought I might share)
Justin Le
@mstksg
ah yeah, i've seen it :) it's great
this space is really awesome
Sam Stites
@stites
: D
I went to a few of the FHPC talks too, they seem awesome
Justin Le
@mstksg
afaik i tried something similar with backprop's HKD interface, but i wonder if there is any more i can do to incorporate
Sam Stites
@stites
HKD?
Justin Le
@mstksg
were those this weekend?
Sam Stites
@stites
yup
Justin Le
@mstksg
higher-kinded data
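For context, a minimal sketch of the higher-kinded data pattern (the names here are illustrative, not from backprop's actual API):

```haskell
{-# LANGUAGE DeriveGeneric #-}

import Data.Functor.Identity (Identity)
import GHC.Generics (Generic)

-- One declaration, many shapes: each field is wrapped in a
-- type-level parameter f.
data Net f = Net
  { weights :: f [Double]
  , bias    :: f Double
  } deriving Generic

-- f ~ Identity gives the plain record; f ~ Maybe gives a
-- partial record, e.g. for incremental construction.
type PlainNet   = Net Identity
type PartialNet = Net Maybe
```

The idea is that a single record type can serve as the plain value, a record of gradients, a record of updates, etc., just by varying f.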
Sam Stites
@stites
oh!
Justin Le
@mstksg
sad i missed out on the event now ;_;
Sam Stites
@stites
there will be more next year!
NPFL is also sort of a data-haskell project
Numerical programming in functional languages
and I think it's going to need submissions! Dominic (idontgetoutmuch) said it got a bit stressful trying to rally people to submit
Sam Stites
@stites
say! I am running up against a bit of an API change that I want to incorporate into hasktorch, and that I think you have good insight into
Primarily, I am realizing that a mutable API is necessary and I'm wondering if there are good options which exist. I think backprop uses ST to mutate gradients in place, is that right?
Justin Le
@mstksg
i'm going to try to submit! also heard HCAR is coming up too
hm, do you mean other libraries that manage mutation and stuff?
yeah, backprop uses ST
but most of the code is polymorphic over PrimMonad
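A rough sketch of what PrimMonad-polymorphic mutation looks like (illustrative, not backprop's actual internals):

```haskell
import Control.Monad.Primitive (PrimMonad, PrimState)
import Control.Monad.ST (runST)
import qualified Data.Vector.Unboxed.Mutable as MV

-- Works in both ST and IO, because it only demands PrimMonad:
-- accumulate a value into slot 0 of a mutable vector in place.
accumInPlace :: PrimMonad m => MV.MVector (PrimState m) Double -> Double -> m ()
accumInPlace v x = MV.modify v (+ x) 0

-- Specialized to ST via runST, so the mutation stays pure
-- from the caller's point of view.
runAccum :: Double -> Double
runAccum x = runST $ do
  v <- MV.replicate 1 0
  accumInPlace v x
  MV.read v 0
```

Writing against PrimMonad rather than ST directly means the same code can be reused in IO when needed.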
Sam Stites
@stites
good to know! I guess I wasn't thinking of PrimMonad
Tony Day
@tonyday567
@stites I've been using generic-lens for a while now. Highly addictive.
Sam Stites
@stites
good to know!
Justin Le
@mstksg
n.b. issues with upstream deps have been resolved and backprop is back on nightlies
Justin Le
@mstksg
interestingly enough vector-sized, which a lot of my math packages rely on, is blocked on haskell/hackage-server#788
Bogdan Penkovsky
@masterdezign
Hello!
I decided to give backprop a try, but I am really struggling with a basic matrix multiplication test.
Bogdan Penkovsky
@masterdezign

I guess, there was no need to wrap data in BVar. This seems to solve the problem:

testMul :: H.L 2 2
testMul = evalBP2 (BP.<>) mk mk

mk :: H.L 2 2
mk = H.build (\a b -> a + b)

Bogdan Penkovsky
@masterdezign
I have another question: how do we count the number of matrix columns using Numeric.LinearAlgebra.Static? For instance, Xavier initialization depends on the number of inputs. Thank you
Bogdan Penkovsky
@masterdezign
Hi! I have tried to better understand how backprop works and have written a short post http://penkovsky.com/post/neural-networks-3/
Bogdan Penkovsky
@masterdezign
To answer my previous question, I have written a not very elegant function getDim:
testMul :: H.L 2 2
testMul = evalBP2 (BP.<>) mk (H.tr mk)

mk :: H.L 2 2
mk = H.build (\a b -> a + b)

getDim :: H.Sized Double s LA.Matrix => s -> (Int, Int)
getDim s =
  let m = H.extract s :: LA.Matrix Double
  in (LA.rows m, LA.cols m)

main = do
  print $ getDim testMul
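An alternative sketch: since H.L carries its dimensions at the type level, they can also be read off with KnownNat, without extracting the underlying matrix (illustrative; `dims` is a made-up name):

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TypeApplications #-}

import Data.Proxy (Proxy (..))
import GHC.TypeLits (KnownNat, natVal)
import qualified Numeric.LinearAlgebra.Static as H

-- The value is never inspected; the dimensions come from the type,
-- so this also works before any data exists (e.g. for Xavier init).
dims :: forall m n. (KnownNat m, KnownNat n) => H.L m n -> (Int, Int)
dims _ = (fromIntegral (natVal (Proxy @m)), fromIntegral (natVal (Proxy @n)))
```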
Justin Le
@mstksg
@masterdezign sorry, I have been absent from this room for a while, I hope you have been able to resolve your issues :)
if not, feel free to ask them again
Bogdan Penkovsky
@masterdezign
@mstksg No worries, those are resolved :)
Justin Le
@mstksg
good to hear :)
Bogdan Penkovsky
@masterdezign
Hello! I am playing with Accelerate LLVM PTX backend + backprop. I have a conceptual question: how do I run all the backprop computations to obtain several weight matrices? Shall I create a data structure that can be "lifted" to an equivalent of a tuple of Acc Arrays?
Bogdan Penkovsky
@masterdezign
To be more precise, I can express all neural network (fully-connected) layers as a list.
Then, the network is:
network :: Reifies s W
        => BVar s (Acc (Matrix Float))  -- ^ Inputs
        -> BVar s [Acc (Matrix Float)]  -- ^ Weights
        -> BVar s (Acc (Matrix Float))  -- ^ Outputs
Applying something like gradBP network is expected to produce [Acc (Matrix Float)], a list of gradients. Now, in a gradient-descent-like scenario, one would iteratively subtract those gradients from the initial weights, thus obtaining a new list of weights (or equivalent). In terms of Accelerate, a GPU algorithm is constructed. To obtain the result of an Acc a, a function run :: Arrays a => Acc a -> a is applied. How do I represent [Acc (Matrix Float)] as Arrays a => a so that I can take advantage of the backprop library?
Bogdan Penkovsky
@masterdezign
This is related to issue #8
Justin Le
@mstksg
hi @masterdezign , sorry, I think I must have missed your messages earlier
i'll look at this deeper, but i think roughly the idea is to give Acc an appropriate Backprop instance
where add would be "GPU add", to provide another Acc
As for running, you might have to figure out a way to put an [Acc (Matrix Float)] as an instance of Arrays
i feel like this should be a part of the GPU library
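Concretely, the suggested instance might look something like this sketch (untested; it assumes Accelerate's standard map/zipWith combinators and backprop's zero/add/one methods):

```haskell
import qualified Data.Array.Accelerate as A
import Numeric.Backprop (Backprop (..))

-- "GPU add": combining gradients stays inside Acc, so it is fused
-- into the compiled device program rather than run on the host.
instance Backprop (A.Acc (A.Matrix Float)) where
  zero = A.map (const 0)
  add  = A.zipWith (+)
  one  = A.map (const 1)
```

The remaining question from above, packaging [Acc (Matrix Float)] as a single Arrays instance for run, is separate from this instance and would still need to be solved on the Accelerate side.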
Justin Le
@mstksg
nd how would this list/b 3
sorry, wrong window :)