
Sheng Lundquist
@slundqui
maybe have error take "this"?
Error(this) << "Oh no, error!"
Jeff Bowles
@riskybacon
We're thinking the same thing.
Sheng Lundquist
@slundqui
well it looks like if you have a reference to the object
at least for layers
we have getKeyword() and getName()
as for the wiki entries
I think we should separate user wikis (e.g. how do I write this params file) and developer wikis (e.g. how do I print an error in my custom layer)
although I guess writing custom layers is part of being a user
Jeff Bowles
@riskybacon
Regardless, I think you're making a good point about separating out this information.
peteschultz
@peteschultz
If you're inside a BaseObject the getName and getKeyword methods exist; otherwise they generally don't. Could we give BaseObject methods called Info, Error, and Debug, and also have there be Info, Error, and Debug functions defined in the PV namespace but outside of any class? I believe that in that case, inside a BaseObject class you'd get the method but outside you'd get the PV:: function, which seems like what Sheng is asking for. That does mean that they wouldn't be defined as macros, though.
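A minimal sketch of the lookup behavior described above, with hypothetical string-returning signatures (the class and method names here are only illustrative; the real OpenPV Error is a stream-style interface): inside a BaseObject member, an unqualified call to Error finds the member; outside, it finds the PV:: free function.

```cpp
#include <string>

namespace PV {

// Free function: what an unqualified Error(...) call resolves to
// outside of any class in namespace PV.
std::string Error(const std::string &msg) {
   return "Error: " + msg;
}

class BaseObject {
  public:
   explicit BaseObject(const std::string &name) : mName(name) {}
   std::string getName() const { return mName; }

   // Member function: inside BaseObject, class scope is searched before
   // the enclosing namespace, so this shadows PV::Error.
   std::string Error(const std::string &msg) const {
      return "Error <" + mName + ">: " + msg;
   }

   std::string report() const {
      // Unqualified call: resolves to the member above, not PV::Error.
      return Error("oh no");
   }

  private:
   std::string mName;
};

} // namespace PV
```

The tradeoff Pete notes is real: functions and member functions can be shadowed this way, but macros cannot, since the preprocessor ignores scope.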
Sheng Lundquist
@slundqui
So I guess I also found a few other things useful for reducing boilerplate code
like PVAssert and PVAlloc
what do they do differently?
Sheng Lundquist
@slundqui
also, including the header files via <utils/PV*> gives the following error
[ 94%] Building CXX object /home/sheng/workspace/Projects/mlearning/CMakeFiles/mlearning.dir/src/columns/MLPRegisterKeywords.cpp.o
In file included from /usr/include/c++/4.8/type_traits:35:0,
from /home/sheng/workspace/OpenPV/pv-core/src/utils/PVLog.hpp:6,
from /home/sheng/workspace/Projects/mlearning/src/columns/../layers/BackwardsBatchNorm.hpp:13,
from /home/sheng/workspace/Projects/mlearning/src/columns/MLPRegisterKeywords.cpp:7:
/usr/include/c++/4.8/bits/c++0x_warning.h:32:2: error: #error This file requires compiler and library support for the ISO C++ 2011 standard. This support is currently experimental, and must be enabled with the -std=c++11 or -std=gnu++11 compiler options.
#error This file requires compiler and library support for the \
^
make[2]: *** [/home/sheng/workspace/Projects/mlearning/CMakeFiles/mlearning.dir/src/columns/MLPRegisterKeywords.cpp.o] Error 1
make[1]: *** [/home/sheng/workspace/Projects/mlearning/CMakeFiles/mlearning.dir/all] Error 2
make: *** [all] Error 2
peteschultz
@peteschultz
PVAssert and PVAlloc are wrappers around the standard assert and malloc calls but give more information in the error messages when they fail.
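As an illustration of what such wrappers typically add (a hedged sketch, not OpenPV's actual PVAssert/PVAlloc; the real macros may print and abort rather than throw): the assert wrapper captures the failing expression, file, and line, and the allocation wrapper reports the requested size instead of silently returning nullptr.

```cpp
#include <cstdio>
#include <cstdlib>
#include <stdexcept>
#include <string>

// Sketch of an assert wrapper: on failure, report the expression text,
// file, and line. (Throwing keeps the example testable; a real
// implementation might print and abort instead.)
#define PV_ASSERT_SKETCH(expr)                                            \
   do {                                                                   \
      if (!(expr)) {                                                      \
         throw std::runtime_error(std::string("assert failed: ") + #expr  \
                                  + " at " + __FILE__ + ":"               \
                                  + std::to_string(__LINE__));            \
      }                                                                   \
   } while (0)

// Sketch of a malloc wrapper: on failure, report how many bytes were
// requested before giving up, instead of silently returning nullptr.
inline void *pvAllocSketch(std::size_t size) {
   void *ptr = std::malloc(size);
   if (ptr == nullptr) {
      std::fprintf(stderr, "allocation of %zu bytes failed\n", size);
      std::abort();
   }
   return ptr;
}
```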
Sheng Lundquist
@slundqui
Okay, thanks. I figured out the error as well; it looks like I needed to add C++11 flags to my project's CMake configuration.
Is there a CMakeLists template for various projects?
peteschultz
@peteschultz
The pv_config_project macro defined in cmake/PVConfigProject.cmake sets the C++11 flag. Right now the idea is that pv_config_project is called at the beginning of OpenPV/CMakeLists.txt, but that's changing. Perhaps someone has written a new project CMakeLists.txt, but I don't think that's happened yet.
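For reference, the flag itself is the usual one; a standalone project CMakeLists would need something along these lines (a hypothetical minimal fragment, not the actual contents of pv_config_project):

```cmake
# Hypothetical minimal fragment; the real pv_config_project macro in
# cmake/PVConfigProject.cmake may set this differently.
cmake_minimum_required(VERSION 3.1)
project(MyPVProject CXX)

# Require C++11, fixing the "-std=c++11" error from the build log above.
set(CMAKE_CXX_STANDARD 11)
set(CMAKE_CXX_STANDARD_REQUIRED ON)

# On CMake older than 3.1, the equivalent is:
# set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11")
```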
Sheng Lundquist
@slundqui
Ah, okay. I think I'm using an old CMakeLists that used to be in auxlib; I'll have to look into the CMake changes.
So, another question: there are system tests that used to be in the old auxlib directory that make extensive use of GradientCheckConn, which used to be a custom class under the mlearning auxlib
I know @garkenyon has told me to add BatchNorm and BackwardsBatchNorm layers into pv-core (in the devel branch), but I want system tests that check the gradients of BackwardsBatchNorm
So now that we no longer have an auxlib directory, where do you think a custom connection like GradientCheckConn belongs?
peteschultz
@peteschultz
If I'm following, the system test in auxlib/mlearning is being moved to PVSystemTests, and that system test is the only place that GradientCheckConn would be used, correct? If that's right, then it seems that GradientCheckConn belongs in the test in PVSystemTests.
Sheng Lundquist
@slundqui
Previously, there was one system test in mlearning/tests, which checked various models (such as MLP, MLP with softmax, convolutions, ReLU, etc.) for correctness of gradients. Currently, I have a system test for BatchNorm that tests the feedforward pass, and I am planning to write a test for the correctness of the gradient in the backwards pass. I had hoped to keep these two tests together, but under your suggestion, we would either have to break the feedforward test and the backward test of BatchNorm up into two separate tests (with the backward test grouped with the rest of the gradient checks), or we would have to copy GradientCheckConn into two separate places
There actually is a lot of repeated code in many of the tests, such as probes that check for zeros in a specific layer
Perhaps the solution is to make shared custom layers and connections for system tests
(and probes)
peteschultz
@peteschultz
There's a RequireAllZeroActivityProbe in pv-core that sets a flag if anything in the layer it targets becomes nonzero. I'm not sure if you mean that there are multiple tests that use that probe, or multiple tests with probes that duplicate the probe's functionality.
Also, I'm not following why the tests that were together in mlearning/tests need to be broken in two if they're moved to PVSystemTests.
Sheng Lundquist
@slundqui
There is code that duplicates the functionality of the probe (most likely my fault, sorry!) in various system tests. I was using that as an example, although I guess GradientCheckConn is likely to be an anomaly.
when the code was in auxlib/mlearning, GradientCheckConn was defined in the auxlib itself
so the 2 tests both linked to the auxlib src, which contained GradientCheckConn among various other files
but now that we're moving away from the auxlib structure, the only way 2 tests can link to the same code is if the code is in pv-core
William Shainin
@wshainin
I'm a little confused. Are you just asking where these tests should go? Like should the grad check for batch norm go in a batch norm test dir, instead of the bin-o-gradient-tests dir?
I do think that maybe the AlexNetTest should live in a projects directory. I think if the mlearning tests are going to be integrated into OpenPV, then they should all be as simple as possible (e.g. gradient through ReLU and through pooling, individually)
If a test is needed to make sure all the parts are working together, should this go in the tests directory? If there's a bug when the gradient gets un-pooled, AlexNetTest would fail, but it might be hard to tell why.
William Shainin
@wshainin
I think it's ok to put a gradient check in pv-core
But I'm probably not the person to make that decision
Also, the refactor should make tests more modular, and make sharing code easier, and also necessitate a real gradient check (or undo checker) connection in the core
William Shainin
@wshainin
Is the gradient check conn just taking the numerical derivative?
Sheng Lundquist
@slundqui
yes
it adds a delta to the input, recomputes the loss, and compares the resulting finite difference against the gradient calculated by backprop
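That procedure can be sketched as a central-difference check (illustrative names and signature, not GradientCheckConn's actual interface): perturb each input component by ±delta, recompute the loss, and compare the numerical slope to the analytic gradient.

```cpp
#include <cmath>
#include <cstddef>
#include <functional>
#include <vector>

// Sketch of a numerical gradient check. For each input component,
// perturb by +/-delta, recompute the loss, and compare the central
// difference against the gradient computed by backprop.
bool checkGradient(
      const std::function<double(const std::vector<double> &)> &loss,
      std::vector<double> input,
      const std::vector<double> &analyticGrad,
      double delta     = 1e-6,
      double tolerance = 1e-4) {
   for (std::size_t i = 0; i < input.size(); ++i) {
      double original = input[i];
      input[i]        = original + delta;
      double lossPlus = loss(input);
      input[i]         = original - delta;
      double lossMinus = loss(input);
      input[i]         = original; // restore before the next component

      // Central-difference approximation of dLoss/dInput[i].
      double numericGrad = (lossPlus - lossMinus) / (2.0 * delta);
      if (std::fabs(numericGrad - analyticGrad[i]) > tolerance) {
         return false;
      }
   }
   return true;
}
```

For a quadratic loss like sum(x_i^2), the analytic gradient is 2*x, so the check passes against that and fails against a perturbed gradient.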
peteschultz
@peteschultz
After talking to Will, I think that there should be one test that uses GradientCheckConn, whose purpose is to check the gradients of various transfer functions. That test could have different params files, one for sigmoid, one for relu, etc, but its only purpose should be to verify taking the gradient using numerical derivatives. All other tests should then be able to assume that the gradient is being taken correctly. Will tells me that your plan for BatchNormTest might be to take numerical gradients as a check on the answer, but that seems to me to be mixing too much functionality into one test. BatchNormTest should be written to check the result of normalizing across the batch, on the assumption that everything else can be regarded as correct because it is being verified by its own test.
Sheng Lundquist
@slundqui
so what you're saying is that the one test that uses GradientCheckConn should include the check for BackwardsBatchNorm?
in other words, there should be a "CheckGradientsTest" that checks the gradients of any and all types of transfer functions (such as sigmoid, ReLU, batchnorm, etc.)?
peteschultz
@peteschultz
That's what I had in mind, although it isn't clear to me that BatchNorm is a transfer function (I haven't looked at the code for it, though). I've been thinking in terms of having the test verify the individual layers that implement these transfer functions. But I'm now thinking that what might be driving this is that you also want to verify that an entire hierarchy moves the overall error function in the direction of gradient descent. I missed that earlier.
Sheng Lundquist
@slundqui
sure, that's also true. But I think it's sufficient to test each layer individually and assume it will work when you add them all together