Sam Stites
@stites
type-driven, on-the-fly generation of lenses without template haskell that is as fast as handwritten instances (with inspection regression testing of Core)
(since backprop works well with lenses, I thought I might share)
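A minimal sketch of that style of Template-Haskell-free, type-driven lens access, assuming the generic-lens package (which comes up again below); the Net record and its fields are made up for illustration:

{-# LANGUAGE DataKinds, DeriveGeneric, TypeApplications #-}

import Control.Lens ((&), (.~), (^.))
import Data.Generics.Product (field)
import GHC.Generics (Generic)

-- A toy record; lenses for its fields are derived from the Generic
-- representation at compile time, with no Template Haskell.
data Net = Net { weight :: Double, bias :: Double }
  deriving (Generic, Show)

main :: IO ()
main = do
  let n = Net 1.0 0.5
  print (n ^. field @"weight")       -- read a field through a derived lens
  print (n & field @"bias" .~ 0.0)   -- update a field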
Justin Le
@mstksg
ah yeah, i've seen it :) it's great
this space is really awesome
Sam Stites
@stites
: D
I went to a few of the FHPC talks too, they seem awesome
Justin Le
@mstksg
afaik i tried something similar with backprop's HKD interface, but i wonder if there is any more i can do to incorporate it
Sam Stites
@stites
HKD?
Justin Le
@mstksg
were those this weekend?
Sam Stites
@stites
yup
Justin Le
@mstksg
higher-kinded data
Sam Stites
@stites
oh!
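For readers unfamiliar with the term, a minimal sketch of higher-kinded data; the Net type here is made up for illustration and is not backprop's actual interface:

{-# LANGUAGE DeriveGeneric #-}

import Data.Functor.Identity (Identity)
import GHC.Generics (Generic)

-- A record parameterised over a wrapper f: the same declaration can stand
-- for plain values (f ~ Identity), optional values (f ~ Maybe), and so on.
data Net f = Net
  { netWeight :: f Double
  , netBias   :: f Double
  } deriving (Generic)

type PlainNet   = Net Identity  -- every field present
type PartialNet = Net Maybe     -- every field possibly missing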
Justin Le
@mstksg
sad i missed out on the event now ;_;
Sam Stites
@stites
there will be more next year!
NPFL is also sort of a data-haskell project
Numerical programming in functional languages
and I think it's going to need submissions! Dominic (idontgetoutmuch) said it got a bit stressful trying to rally people to submit
Sam Stites
@stites
say! I am running up against a bit of an API change that I want to incorporate into hasktorch, and I think you have good insight into it
Primarily, I am realizing that a mutable API is necessary and I'm wondering if there are good options that already exist. I think backprop uses ST to mutate gradients in place, is that right?
Justin Le
@mstksg
i'm going to try to submit! also heard HCAR is coming up too
hm, do you mean other libraries that manage mutation and stuff?
yeah, backprop uses ST
but most of the code is polymorphic over PrimMonad
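A small sketch of what writing against PrimMonad looks like, so the same in-place update runs in ST or IO; addInPlace and sumGrads are illustrative names, not backprop internals:

import Control.Monad.Primitive (PrimMonad, PrimState)
import Control.Monad.ST (runST)
import qualified Data.Vector as V
import qualified Data.Vector.Mutable as MV

-- Accumulate a vector into a mutable buffer in place; polymorphic over
-- PrimMonad, so callers can run it in ST (purely) or in IO.
addInPlace :: PrimMonad m => MV.MVector (PrimState m) Double -> V.Vector Double -> m ()
addInPlace acc = V.imapM_ (\i x -> MV.modify acc (+ x) i)

-- Sum a list of gradient vectors with one mutable accumulator, then freeze.
sumGrads :: [V.Vector Double] -> V.Vector Double
sumGrads []    = V.empty
sumGrads grads = runST $ do
  acc <- MV.replicate (V.length (head grads)) 0
  mapM_ (addInPlace acc) grads
  V.freeze acc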
Sam Stites
@stites
good to know! I guess I wasn't thinking of PrimMonad
Tony Day
@tonyday567
@stites I've been using generic-lens for a while now. Highly addictive.
Sam Stites
@stites
good to know!
Justin Le
@mstksg
n.b. issues with upstream deps have been resolved and backprop is back on nightlies
Justin Le
@mstksg
interestingly enough, vector-sized, which a lot of my math packages rely on, is blocked on haskell/hackage-server#788
Bogdan Penkovsky
@masterdezign
Hello!
I decided to give backprop a try, but I am really struggling with a basic matrix multiplication test.
Bogdan Penkovsky
@masterdezign

I guess there was no need to wrap the data in BVar. This seems to solve the problem:

testMul :: H.L 2 2
testMul = evalBP2 (BP.<>) mk mk

mk :: H.L 2 2
mk = H.build (\a b -> a + b)

Bogdan Penkovsky
@masterdezign
I have another question: how do we count the number of matrix columns using Numeric.LinearAlgebra.Static? For instance, Xavier initialization depends on the number of inputs. Thank you
Bogdan Penkovsky
@masterdezign
Hi! I have tried to better understand how backprop works and have written a short post http://penkovsky.com/post/neural-networks-3/
Bogdan Penkovsky
@masterdezign
To answer my previous question, I have written a not very elegant function getDim:
{-# LANGUAGE FlexibleContexts #-}

-- assumed imports for this snippet: H and LA from hmatrix, the
-- BVar-lifted (<>) from hmatrix-backprop, evalBP2 from backprop
import qualified Numeric.LinearAlgebra as LA
import qualified Numeric.LinearAlgebra.Static as H
import qualified Numeric.LinearAlgebra.Static.Backprop as BP
import Numeric.Backprop (evalBP2)

testMul :: H.L 2 2
testMul = evalBP2 (BP.<>) mk (H.tr mk)

mk :: H.L 2 2
mk = H.build (\a b -> a + b)

getDim :: H.Sized Double s LA.Matrix => s -> (Int, Int)
getDim s =
  let m = H.extract s :: LA.Matrix Double
  in (LA.rows m, LA.cols m)

main :: IO ()
main = do
  print $ getDim testMul
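A possible alternative sketch (not from the chat): since the dimensions of an H.L m n live at the type level, they can be read back with KnownNat instead of extracting the matrix; dims is an illustrative name:

{-# LANGUAGE DataKinds, ScopedTypeVariables, TypeApplications #-}

import Data.Proxy (Proxy (..))
import GHC.TypeLits (KnownNat, natVal)
import qualified Numeric.LinearAlgebra.Static as H

-- Read the row/column counts straight from the type-level naturals,
-- without touching the runtime representation.
dims :: forall m n. (KnownNat m, KnownNat n) => H.L m n -> (Int, Int)
dims _ = ( fromIntegral (natVal (Proxy @m))
         , fromIntegral (natVal (Proxy @n)) )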
Justin Le
@mstksg
@masterdezign sorry, I have been absent from this room for a while, I hope you have been able to resolve your issues :)
if not, feel free to ask them again
Bogdan Penkovsky
@masterdezign
@mstksg No worries, those are resolved :)
Justin Le
@mstksg
good to hear :)
Bogdan Penkovsky
@masterdezign
Hello! I am playing with Accelerate LLVM PTX backend + backprop. I have a conceptual question: how do I run all the backprop computations to obtain several weight matrices? Shall I create a data structure that can be "lifted" to an equivalent of a tuple of Acc Arrays?
Bogdan Penkovsky
@masterdezign
  • To be more precise, I can express all neural network (fully-connected) layers as a list.
    Then, the network is:
network :: Reifies s W
  => BVar s (Acc (Matrix Float))   -- ^ Inputs
  -> BVar s [Acc (Matrix Float)]   -- ^ Weights
  -> BVar s (Acc (Matrix Float))   -- ^ Outputs
Applying something like gradBP network is expected to produce [Acc (Matrix Float)], a list of gradients. Now, in a gradient descent-like scenario, one would iteratively subtract those gradients from the initial weights, thus obtaining a new list of weights (or equivalent). In terms of Accelerate, a GPU algorithm is constructed; to obtain the result of an Acc a, the function run :: Arrays a => Acc a -> a is applied. How do I represent [Acc (Matrix Float)] as Arrays a => a so that I can take advantage of the backprop library?
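For the update step itself (leaving the Arrays question aside), a minimal sketch of the per-matrix subtraction described above, staying inside Acc; sgdStep is an illustrative name:

import Data.Array.Accelerate (Acc, Matrix)
import qualified Data.Array.Accelerate as A

-- One gradient-descent step: subtract rate * gradient from each weight
-- matrix, pairing weights and gradients positionally.
sgdStep :: Float -> [Acc (Matrix Float)] -> [Acc (Matrix Float)] -> [Acc (Matrix Float)]
sgdStep rate = zipWith (\w g -> A.zipWith (\wi gi -> wi - A.constant rate * gi) w g)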
Bogdan Penkovsky
@masterdezign
This is related to the issue #8
Justin Le
@mstksg
hi @masterdezign , sorry, I think I must have missed your messages earlier
i'll look at this deeper, but i think roughly the idea is to give Acc an appropriate Backprop instance
where add would be "GPU add", to provide another Acc
As for running, you might have to figure out a way to put an [Acc (Matrix Float)] as an instance of Arrays
i feel like this should be a part of the GPU library
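A rough sketch of that idea, assuming backprop's Backprop class (zero/add/one) and plain Accelerate combinators; this is an orphan instance written only for illustration, not a tested one:

{-# LANGUAGE FlexibleInstances #-}

import Data.Array.Accelerate (Acc, Matrix)
import qualified Data.Array.Accelerate as A
import Numeric.Backprop.Class (Backprop (..))

-- Each method is itself part of the staged GPU program: "add" is
-- element-wise addition on the device rather than on the host.
instance Backprop (Acc (Matrix Float)) where
  zero = A.map (const 0)
  add  = A.zipWith (+)
  one  = A.map (const 1)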
Justin Le
@mstksg
nd how would this list/b 3
sorry, wrong window :)