Sheng Lundquist
@slundqui
Currently, there is one system test in mlearning/tests, which tests various models (such as MLP, MLP with softmax, convolutions, ReLU, etc.) for correctness in gradients. Currently, I have a system test for BatchNorm that tests the feedforward pass, and I am planning to write a test for the correctness of the gradient in the backwards pass. I had hoped to keep these two tests together, but under your suggestion we would either have to break the feedforward test and the backward test of BatchNorm into two separate tests (with the backward test grouped with the rest of the gradient checks), or we would have to copy GradientCheckConn into two separate places
There actually is a lot of repeated code in many of the tests, such as probes that check for zeros in a specific layer
Perhaps the solution is to make shared custom layers and connections for system tests
(and probes)
peteschultz
@peteschultz
There's a RequireAllZeroActivityProbe in pv-core that sets a flag if anything in the layer it targets becomes nonzero. I'm not sure if you mean that there are multiple tests that use that probe, or multiple tests with probes that duplicate the probe's functionality.
Also, I'm not following why the tests that were together in mlearning/tests need to be broken in two if they're moved to PVSystemTests.
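The core of a probe like that is small, which is probably why it keeps getting reimplemented. A minimal standalone sketch of the scan-and-latch behavior (class and member names here are illustrative, not the actual pv-core code):

```cpp
// Hypothetical sketch of a RequireAllZeroActivityProbe-style check: scan the
// target layer's activity buffer each timestep and latch a flag the first
// time any value is nonzero. Illustrative only, not the pv-core API.
#include <cstddef>

class AllZeroActivityCheck {
  public:
   // Returns true if every activity value is zero; otherwise latches the flag.
   bool update(const float *activity, std::size_t count) {
      for (std::size_t k = 0; k < count; ++k) {
         if (activity[k] != 0.0f) {
            mNonzeroFound = true; // latched; stays set for the rest of the run
            break;
         }
      }
      return !mNonzeroFound;
   }

   bool nonzeroFound() const { return mNonzeroFound; }

  private:
   bool mNonzeroFound = false;
};
```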
Sheng Lundquist
@slundqui
There is code that duplicates the functionality of the probe (most likely my fault, sorry!) in various system tests. I was using that as an example, although I guess GradientCheckConn is likely to be an anomaly.
when the code was in auxlib/mlearning, GradientCheckConn was defined in the auxlib itself
so the 2 tests both linked to the auxlib src, which contained GradientCheckConn among various other files
but now that we're moving away from the auxlib structure, the only way 2 tests can link to the same code is if the code is in pv-core
William Shainin
@wshainin
I'm a little confused. Are you just asking where these tests should go? Like should the grad check for batch norm go in a batch norm test dir, instead of the bin-o-gradient-tests dir?
I do think that maybe the AlexNetTest should live in a projects directory. I think if the mlearning tests are going to be integrated into OpenPV, then they should all be as simple as possible (e.g., gradient through ReLU or pooling individually)
If a test is needed to make sure all the parts are working together, should this go in the tests directory? If there's a bug when the gradient gets un-pooled, AlexNetTest would fail, but it might be hard to tell why.
William Shainin
@wshainin
I think it's ok to put a gradient check in pv-core
But I'm probably not the person to make that decision
Also, the refactor should make tests more modular, make sharing code easier, and necessitate a real gradient-check (or undo-checker) connection in the core
William Shainin
@wshainin
Is the gradient check conn just taking the numerical derivative?
Sheng Lundquist
@slundqui
yes
it adds a delta to the input, recomputes the loss, and checks the difference in the loss against the gradient calculated by backprop
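Concretely, that check is just a finite-difference comparison. A minimal standalone sketch with a toy quadratic loss (GradientCheckConn itself operates on layers and connections, so everything here is illustrative):

```cpp
// Finite-difference gradient check: perturb each input by delta, recompute
// the loss, and compare the numerical derivative to the analytic gradient.
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

// Toy loss L(x) = 0.5 * sum(x_i^2), so the analytic gradient is dL/dx_i = x_i.
static double loss(const std::vector<double> &x) {
   double sum = 0.0;
   for (double v : x) { sum += 0.5 * v * v; }
   return sum;
}

int main() {
   std::vector<double> x = {0.3, -1.2, 2.0};
   const double delta = 1e-6; // perturbation size
   const double tol   = 1e-4; // allowed mismatch

   double base = loss(x);
   for (std::size_t i = 0; i < x.size(); ++i) {
      double analytic = x[i]; // what "backprop" reports for the toy loss

      x[i] += delta;                                 // add a delta to the input
      double numerical = (loss(x) - base) / delta;   // recompute loss, take difference
      x[i] -= delta;

      if (std::fabs(numerical - analytic) > tol) {
         std::printf("FAILED at %zu: numerical %g vs analytic %g\n",
                     i, numerical, analytic);
         return 1;
      }
   }
   std::printf("gradient check passed\n");
   return 0;
}
```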
peteschultz
@peteschultz
After talking to Will, I think that there should be one test that uses GradientCheckConn, whose purpose is to check the gradients of various transfer functions. That test could have different params files, one for sigmoid, one for relu, etc., but its only purpose should be to verify the gradient computation against numerical derivatives. All other tests should then be able to assume that the gradient is being taken correctly. Will tells me that your plan for BatchNormTest might be to take numerical gradients as a check on the answer, but that seems to me to be mixing too much functionality into one test. BatchNormTest should be written to check the result of normalizing across the batch, on the assumption that everything else can be regarded as correct because it is being verified by its own test.
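As a sketch of that structure, the single test amounts to looping one finite-difference check over a table of transfer functions; in PetaVision the selection would come from the separate params files rather than a hard-coded table, and all names below are illustrative:

```cpp
// One gradient-check test driven by a table of transfer functions.
// Each case supplies a forward function and its analytic derivative.
#include <cmath>
#include <cstdio>
#include <functional>
#include <string>
#include <vector>

struct TransferFunction {
   std::string name;
   std::function<double(double)> f;    // forward pass
   std::function<double(double)> dfdx; // analytic derivative
};

int main() {
   std::vector<TransferFunction> cases = {
      {"sigmoid",
       [](double x) { return 1.0 / (1.0 + std::exp(-x)); },
       [](double x) { double s = 1.0 / (1.0 + std::exp(-x)); return s * (1.0 - s); }},
      {"relu",
       [](double x) { return x > 0.0 ? x : 0.0; },
       [](double x) { return x > 0.0 ? 1.0 : 0.0; }},
   };

   const double delta = 1e-6, tol = 1e-4;
   for (const auto &tc : cases) {
      for (double x : {-0.9, 0.4, 1.7}) { // sample points away from kinks
         double numerical = (tc.f(x + delta) - tc.f(x)) / delta;
         if (std::fabs(numerical - tc.dfdx(x)) > tol) {
            std::printf("%s failed at x=%g\n", tc.name.c_str(), x);
            return 1;
         }
      }
      std::printf("%s passed\n", tc.name.c_str());
   }
   return 0;
}
```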
Sheng Lundquist
@slundqui
so what you're saying is that the one test that uses GradientCheckConn should include the check for BackwardsBatchConn?
in other words, there should be a "CheckGradientsTest" that checks for all gradients for any and all different types of transfer functions (such as sigmoid, relu, batchnorm, etc)?
peteschultz
@peteschultz
That's what I had in mind, although it isn't clear to me that BatchNorm is a transfer function (I haven't looked at the code for it, though). I've been thinking in terms of having the test verify individual layers that do these transfer functions. But I'm now thinking that what might be driving this is that you also want to verify that an entire hierarchy is moving the overall error function in the direction of gradient descent. I missed that earlier.
Sheng Lundquist
@slundqui
sure, that's also true. But I think it's sufficient to test each layer individually and assume it will work when you add them all together
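For reference, the property that a BatchNorm feedforward test would verify is that normalizing across the batch leaves each feature with mean ~0 and variance ~1. A minimal sketch of that check (plain arrays and an assumed epsilon, not the actual test code):

```cpp
// Verify the batch-normalization identity y_i = (x_i - mean) / sqrt(var + eps)
// by checking the normalized output has mean ~0 and variance ~1.
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
   std::vector<double> batch = {1.0, 2.0, 3.0, 4.0}; // one feature, batch of 4
   const double eps = 1e-5;                          // assumed epsilon

   double mean = 0.0;
   for (double v : batch) { mean += v; }
   mean /= batch.size();

   double var = 0.0;
   for (double v : batch) { var += (v - mean) * (v - mean); }
   var /= batch.size();

   // Normalize, then measure the output statistics.
   std::vector<double> out(batch.size());
   double outMean = 0.0, outVar = 0.0;
   for (std::size_t i = 0; i < batch.size(); ++i) {
      out[i] = (batch[i] - mean) / std::sqrt(var + eps);
      outMean += out[i];
   }
   outMean /= out.size();
   for (double y : out) { outVar += (y - outMean) * (y - outMean); }
   outVar /= out.size();

   std::printf("output mean = %g, output variance = %g\n", outMean, outVar);
   return (std::fabs(outMean) < 1e-6 && std::fabs(outVar - 1.0) < 1e-3) ? 0 : 1;
}
```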