Dru Jensen
@drujensen
Looks like the CNN doesn't have the ability to save to a file yet
so I'm kinda stuck. :-(
Bar Hofesh
@bararchy
Yeah, still working on that. @ArtLinkov is cleaning up the last bugs and working on small adjustments, then I'll implement save/load for the CNN
Dru Jensen
@drujensen
Ok, np. I will still test the CNN training part
ArtLinkov
@ArtLinkov
@drujensen Regarding your question, each channel is stored within a filter (which has one x,y matrix of neurons). After we pass the input layer, the feature maps created by the conv layer become the "channels".
ArtLinkov
@ArtLinkov
Oh, and regarding the right format for input data: for training you need an array of data points: [input, expected_output]
input = array(array(array(Num))), which is Channel < Row < Column (Neuron)
expected_output = array(Num)
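A minimal sketch of that layout in Crystal (values and variable names are purely illustrative, not the SHAInet API):

```crystal
# One data point in the [input, expected_output] format described above.
# input is Channel < Row < Column: here, one channel holding a 2x2 "image".
input = [
  [             # channel 1
    [0.1, 0.2], # row 1
    [0.3, 0.4], # row 2
  ],
]
expected_output = [0.0, 1.0] # e.g. a one-hot label for a 2-class problem
training_data = [[input, expected_output]]
```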
Eric Londo
@londoed
Is anything moving along with this project?
Bar Hofesh
@bararchy
Hi @londoed, we are right now deep into improving the CNN module, you can check the rolling PR. We're also working behind the scenes on GPU or multi-thread support for specific operations
Have you tried our library?
I see you have begun to work on your own machine learning lib for Crystal https://github.com/londoed/entropx
it sounds interesting :)
@ArtLinkov is our main guy working on SHAInet
ArtLinkov
@ArtLinkov
Hey, welcome aboard @londoed
As @bararchy mentioned, we are working on it
Would love to hear your opinion on our humble project :)
Eric Londo
@londoed
Yeah, coming from a deep C++/Python data science background, I thought trying to write some ML to learn Crystal would be a good idea. Just checking in to see what's out there at the moment, I'll probably run through some of your libraries in my free time! I wasn't planning on creating anything serious, just kind of a fun project to learn Crystal... it's been a while since I've used Ruby, so I'm getting used to the syntax again. Though Crystal looks very promising!
Bar Hofesh
@bararchy
@londoed If you see something you like or have any suggestions, let us know; if you want to add more stuff to the shard, we accept PRs :)
ArtLinkov
@ArtLinkov
@londoed feel free to scrutinize my math (or anything else in our project) :)
Our goal here is to make SHAInet a useful ML tool for the Crystal community, and we happily accept any help.
I am self-taught in ML, and although I have a decent math background, I'm by no means an expert. So another set of eyes to keep us honest is more than welcome
Bar Hofesh
@bararchy
Later today, expect a release of SHAInet with a usable CNN :) save/load from file, and examples on how to use it
Dru Jensen
@drujensen
:sparkles: :clap:
Bar Hofesh
@bararchy
Sorry for the delay @drujensen I'll finish it ASAP
Dru Jensen
@drujensen
@bararchy no rush
Bar Hofesh
@bararchy
@drujensen your example on heart-disease prediction using SHAInet is cool :)
ArtLinkov
@ArtLinkov
@hugoabonizio we added you to contributors, welcome aboard :)
Bar Hofesh
@bararchy
README reference :), thanks for all the help
also @drujensen
Bar Hofesh
@bararchy
v2.1.0 of SHAInet is out, check out the CNN examples in the README
Hugo Abonizio
@hugoabonizio
@ArtLinkov thank you guys for this great lib! :smile:
Bar Hofesh
@bararchy
Anyone interested, keep an eye on NeuraLegion/shainet#73, @ArtLinkov is adding a cool surprise to the network!
Eric Londo
@londoed
Sounds cool, I'll keep my eyes peeled!
Bar Hofesh
@bararchy
Thanks @londoed :) we're almost finished
ArtLinkov
@ArtLinkov
Hey all, we finished working on the cool new evolutionary optimizer for SHAInet! :D It doesn't use any back-propagation and is very promising. If you want more info on the method check this paper: https://blog.openai.com/evolution-strategies/
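The linked ES post boils down to a simple loop: perturb the parameters with noise, score each perturbation, and move toward the ones that did well. A minimal, hypothetical sketch in Crystal (this is NOT the SHAInet implementation; `es_step` and `gauss` are names invented for this example):

```crystal
# Sample from a standard normal distribution (Box-Muller transform).
def gauss
  Math.sqrt(-2.0 * Math.log(1.0 - rand)) * Math.cos(2.0 * Math::PI * rand)
end

# One evolution-strategies update step; fitness is "higher is better".
# No back-propagation: only forward evaluations of the fitness function.
def es_step(params, sigma, lr, pop, &fitness : Array(Float64) -> Float64)
  # Gaussian perturbations, one per population member.
  noises = Array.new(pop) { Array.new(params.size) { gauss } }
  # Score each perturbed parameter vector.
  rewards = noises.map do |n|
    candidate = params.map_with_index { |p, i| p + sigma * n[i] }
    fitness.call(candidate)
  end
  # Move parameters toward high-reward perturbations (reward-weighted noise).
  params.map_with_index do |p, i|
    step = 0.0
    pop.times { |j| step += rewards[j] * noises[j][i] }
    p + lr * step / (pop * sigma)
  end
end

# Toy usage: pull params toward [1.0, 2.0] by maximizing negative squared distance.
target = [1.0, 2.0]
params = [0.0, 0.0]
300.times do
  params = es_step(params, 0.1, 0.02, 50) do |p|
    -p.map_with_index { |x, i| (x - target[i]) ** 2 }.sum
  end
end
```

Production ES implementations usually add reward normalization and mirrored (antithetic) sampling on top of this basic loop.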
Bar Hofesh
@bararchy
:tada:
Bar Hofesh
@bararchy
it's really cool and blazing fast
Dru Jensen
@drujensen
what? no back propagation! wow! yes, I will. awesome stuff. :100:
TIL that synapses only send messages in one direction and that back propagation is not a good model for learning. @ArtLinkov do you think the evolution strategy better mimics how our brain actually functions?
Bar Hofesh
@bararchy
The most biologically accurate neural networks are spiking neural networks (SNNs)
Which I hope I can convince @ArtLinkov to add ;)
Dru Jensen
@drujensen
Interesting! I will check it out.
ArtLinkov
@ArtLinkov
@drujensen maybe there is something more brain-like in the way an ES strategy trains the network in a "noisy" way. After all, the way biology works is that there is huge variability in the signals each tissue/cell/protein experiences to make "decisions". There are myriads of forces that "tug" on them toward decisions from various angles, and what is chosen eventually is a "sum" of them (roughly speaking).
We are still a long way from modeling real neurons, because right now all we are changing is basically one-dimensional information within the neuron, while the brain stores and transmits far more information simultaneously.
Also, another problem is that computers must perform their operations one at a time, which the brain doesn't do
Dru Jensen
@drujensen
Speaking of parallelism, it looks like the evolution strategy is better suited for threads and multicore CPUs. Crystal really needs support for threads. I am hoping they are still working on that.
Bar Hofesh
@bararchy
They are, but until full support and memory sharing is achieved we cannot take advantage of that for now
Eric Londo
@londoed
Hey guys, just reading through some of the new source code...looking pretty good, congrats! Just found a tiny typo in one of the comments on line 189 of math/functions.cr
It says "vector element-by-element multiplication", but that was done in the previous proc. "Multiplication" should be changed to "summation" or "addition"
Bar Hofesh
@bararchy
@londoed Thanks for helping with this, @ArtLinkov can you take a look?
ArtLinkov
@ArtLinkov
Will do, thanks @londoed
ArtLinkov
@ArtLinkov
Guys check out the latest version, we added an autosave feature to the training :smile:
Bar Hofesh
@bararchy
:tada:
Dru Jensen
@drujensen
:clap: