Sam Stites
@stites
good to know! Thanks for being on top of this, @mstksg !
Justin Le
@mstksg
no problem @stites! yeah, finally got a chance to upgrade all my libs today to new ghc. vector-sized also has some issues stemming from a dep as well.
Sam Stites
@stites
cool cool! heads-up: I emailed Simon PJ your purely-typed models series (and an intro to backprop) at ICFP
Justin Le
@mstksg
ah wow, thanks for the heads up, i'm honored :)
Sam Stites
@stites
: D
I guess this means he doesn't reddit : P
Sam Stites
@stites
also, ICFP debuted this library (perhaps again): https://hackage.haskell.org/package/generic-lens
type-driven, on-the-fly generation of lenses without template haskell that is as fast as handwritten instances (with inspection regression testing of Core)
(since backprop works well with lenses, I thought I might share)
Justin Le
@mstksg
ah yeah, i've seen it :) it's great
this space is really awesome
Sam Stites
@stites
: D
I went to a few of the FHPC talks too, they seem awesome
Justin Le
@mstksg
afaik i tried something similar with backprop's HKD interface, but i wonder if there is any more i can do to incorporate
Sam Stites
@stites
HKD?
Justin Le
@mstksg
were those this weekend?
Sam Stites
@stites
yup
Justin Le
@mstksg
higher-kinded data
Sam Stites
@stites
oh!
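For readers unfamiliar with the term: "higher-kinded data" means parameterizing a record over a type constructor `f`, so one declaration serves both the plain record and decorated variants. A minimal base-only sketch (`Layer` and its fields are made-up names for illustration, not backprop's actual HKD interface):

```haskell
import Data.Functor.Identity (Identity (..))

-- One record, parameterized over the wrapper f applied to every field.
data Layer f = Layer
  { layerWeight :: f Double
  , layerBias   :: f Double
  }

-- The "plain" record: every field wrapped in Identity.
type PlainLayer = Layer Identity

-- A partially-specified record: any field may be missing.
type PartialLayer = Layer Maybe

plain :: PlainLayer
plain = Layer (Identity 0.5) (Identity 0.1)

partial :: PartialLayer
partial = Layer (Just 0.5) Nothing

main :: IO ()
main = do
  print (runIdentity (layerWeight plain))  -- 0.5
  print (layerBias partial)                -- Nothing
```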
Justin Le
@mstksg
sad i missed out on the event now ;_;
Sam Stites
@stites
there will be more next year!
NPFL is also sort of a data-haskell project
Numerical programming in functional languages
and I think it's going to need submissions! Dominic (idontgetoutmuch) said it got a bit stressful trying to rally people to submit
Sam Stites
@stites
say! I am running up against a bit of an API change that I want to incorporate into hasktorch, and I think you have good insight into it
Primarily, I am realizing that a mutable API is necessary, and I'm wondering if there are good options which exist. I think backprop uses ST to mutate gradients in place, is that right?
Justin Le
@mstksg
i'm going to try to submit! also heard HCAR is coming up too
hm, do you mean other libraries that manage mutation and stuff?
yeah, backprop uses ST
but most of the code is polymorphic over PrimMonad
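The ST-style in-place mutation described above can be sketched with base alone; `accumGrad` and its `STRef` cell are illustrative stand-ins, not backprop's actual internals (which mutate vectors and are polymorphic over `PrimMonad`):

```haskell
import Control.Monad.ST (runST)
import Data.STRef (modifySTRef', newSTRef, readSTRef)

-- Accumulate per-sample gradient contributions into one mutable cell
-- inside ST; runST lets the final value escape as a pure result.
accumGrad :: [Double] -> Double
accumGrad contribs = runST $ do
  cell <- newSTRef 0                           -- mutable gradient cell
  mapM_ (\g -> modifySTRef' cell (+ g)) contribs
  readSTRef cell                               -- read back the total

main :: IO ()
main = print (accumGrad [1, 2, 3])  -- 6.0
```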
Sam Stites
@stites
good to know! I guess I wasn't thinking of PrimMonad
Tony Day
@tonyday567
@stites I've been using generic-lens for a while now. Highly addictive.
Sam Stites
@stites
good to know!
Justin Le
@mstksg
n.b. issues with upstream deps have been resolved and backprop is back on nightlies
Justin Le
@mstksg
interestingly enough vector-sized, which a lot of my math packages rely on, is blocked on haskell/hackage-server#788
Bogdan Penkovsky
@masterdezign
Hello!
I decided to give backprop a try, but I am really struggling with a basic matrix multiplication test.
Bogdan Penkovsky
@masterdezign

I guess there was no need to wrap the data in BVar. This seems to solve the problem:

testMul :: H.L 2 2
testMul = evalBP2 (BP.<>) mk mk

mk :: H.L 2 2
mk = H.build (\a b -> a + b)

Bogdan Penkovsky
@masterdezign
I have another question: how do we count the number of matrix columns using Numeric.LinearAlgebra.Static? For instance, Xavier initialization depends on the number of inputs. Thank you
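One type-level way to answer this: in Numeric.LinearAlgebra.Static the matrix dimensions are type-level naturals, so they can be read back with `natVal` without extracting the data at all. A base-only sketch, where `StaticMat` is a hypothetical stand-in for hmatrix's `H.L m n`:

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE KindSignatures #-}
{-# LANGUAGE ScopedTypeVariables #-}

import Data.Proxy (Proxy (..))
import GHC.TypeLits (KnownNat, Nat, natVal)

-- Hypothetical stand-in for a statically-sized matrix: the dimensions
-- live only in the type, as type-level naturals.
newtype StaticMat (m :: Nat) (n :: Nat) = StaticMat [[Double]]

-- Recover both dimensions from the type, ignoring the payload entirely.
dims :: forall m n. (KnownNat m, KnownNat n)
     => StaticMat m n -> (Integer, Integer)
dims _ = (natVal (Proxy :: Proxy m), natVal (Proxy :: Proxy n))

main :: IO ()
main = print (dims (StaticMat [] :: StaticMat 3 4))  -- (3,4)
```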
Bogdan Penkovsky
@masterdezign
Hi! I have tried to better understand how backprop works and have written a short post http://penkovsky.com/post/neural-networks-3/
Bogdan Penkovsky
@masterdezign
To answer my previous question, I have written a not very elegant function getDim:
testMul :: H.L 2 2
testMul = evalBP2 (BP.<>) mk (H.tr mk)

mk :: H.L 2 2
mk = H.build (\a b -> a + b)

getDim :: H.Sized Double s LA.Matrix => s -> (Int, Int)
getDim s =
  let m = H.extract s :: LA.Matrix Double
  in (LA.rows m, LA.cols m)

main = do
  print $ getDim testMul
Justin Le
@mstksg
@masterdezign sorry, I have been absent from this room for a while, I hope you have been able to resolve your issues :)
if not, feel free to ask them again
Bogdan Penkovsky
@masterdezign
@mstksg No worries, those are resolved :)
Justin Le
@mstksg
good to hear :)
Bogdan Penkovsky
@masterdezign
Hello! I am playing with Accelerate LLVM PTX backend + backprop. I have a conceptual question: how do I run all the backprop computations to obtain several weight matrices? Shall I create a data structure that can be "lifted" to an equivalent of a tuple of Acc Arrays?
Bogdan Penkovsky
@masterdezign
To be more precise, I can express all neural network (fully-connected) layers as a list. Then, the network is:

network :: Reifies s W
        => BVar s (Acc (Matrix Float))  -- ^ Inputs
        -> BVar s [Acc (Matrix Float)]  -- ^ Weights
        -> BVar s (Acc (Matrix Float))  -- ^ Outputs

Applying something like gradBP network is expected to produce [Acc (Matrix Float)], a list of gradients. Now, in a gradient-descent-like scenario, one would iteratively subtract those gradients from the initial weights, thus obtaining a new list of weights (or equivalent). In terms of Accelerate, a GPU algorithm is constructed; to obtain the result of an Acc a, a function run :: Arrays a => Acc a -> a is applied. How do I represent [Acc (Matrix Float)] as Arrays a => a so that I can take advantage of the backprop library?
Bogdan Penkovsky
@masterdezign
This is related to issue #8