    Adam Gayoso
    @adamgayoso
    First message! Hi everyone, we will try gitter as a platform for asking development-related questions. This hopefully reduces the formality of making issues, discourse posts, etc, and enables faster prototyping of models.
    Eduardo Beltrame
    @Munfred
    W00t!!!
    Vitalii Kleshchevnikov
    @vitkl

    Hi Adam and everyone,

    This is quite cool!

    I've been wanting to ask you these three things for a while now (any thoughts would be appreciated):

    1. Do you have any tips about making the encoder NN work well? For example, which data transformations for input to the encoder NN work best? How many layers are a good starting point (e.g. for complex datasets with >50 cell populations)? Anything else relevant?
    2. What is the purpose of the KL warmup step? What is it trying to achieve?
    3. If you have any experience with this, which transformation works best for mapping the encoder NN output to positive variables (exp, softplus, softplus with a different beta, ...)? E.g. the variance of the posterior distribution, or positive latent variables (thinking of cell2location and other models I am working on).
    Adam Gayoso
    @adamgayoso
    1. Do you have any tips about making the encoder NN work well? For example, which data transformations for input to the encoder NN work best? How many layers are a good starting point (e.g. for complex datasets with >50 cell populations)? Anything else relevant?
    For expression data we typically just log1p the data as a first step before encoding, and for most of our models, one hidden layer is good. If you take a look at our basic Encoder class, it's probably useful for your situation.
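A minimal sketch of that setup, with one hidden layer and log1p scaling of the input (class and argument names here are illustrative, not the actual scvi-tools Encoder API):

```python
import torch
from torch import nn


# Minimal sketch of a one-hidden-layer variational encoder with log1p
# input scaling. Names are illustrative, not the scvi-tools Encoder API.
class SimpleEncoder(nn.Module):
    def __init__(self, n_input: int, n_latent: int, n_hidden: int = 128):
        super().__init__()
        self.hidden = nn.Sequential(
            nn.Linear(n_input, n_hidden),
            nn.BatchNorm1d(n_hidden),
            nn.ReLU(),
        )
        self.mean = nn.Linear(n_hidden, n_latent)
        self.log_var = nn.Linear(n_hidden, n_latent)

    def forward(self, x: torch.Tensor):
        # log1p the raw counts before the first layer
        h = self.hidden(torch.log1p(x))
        mu = self.mean(h)
        var = torch.exp(self.log_var(h))  # exponentiate to keep the variance positive
        return mu, var
```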

    What is the purpose of the KL warmup step? What is it trying to achieve?

    https://github.com/YosefLab/scvi-tools/issues/735#issuecomment-681201770
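In short, the warmup linearly anneals the weight on the KL term from 0 to 1 over the first part of training, so the model learns to reconstruct before the posterior is pulled toward the prior. A minimal sketch, with illustrative names rather than the actual scvi-tools arguments:

```python
# Minimal sketch of linear KL warmup: the KL weight ramps from 0 to 1
# over the first `warmup_epochs` epochs (names are illustrative).
def kl_weight(epoch: int, warmup_epochs: int = 400) -> float:
    return min(1.0, epoch / warmup_epochs)


# Inside the training step, the weighted objective would then look like:
# loss = reconstruction_loss + kl_weight(epoch) * kl_divergence
```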

    If you have any experience with this, which transformation works best for mapping the encoder NN output to positive variables (exp, softplus, softplus with a different beta, ...)? E.g. the variance of the posterior distribution, or positive latent variables (thinking of cell2location and other models I am working on).

    We typically just use exp, though I've heard good things about softplus.
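For concreteness, the two transforms side by side (a toy comparison, not scvi-tools code):

```python
import torch
import torch.nn.functional as F

# Toy comparison of the two positivity transforms discussed above.
raw = torch.randn(5)  # unconstrained network output

positive_exp = torch.exp(raw)                    # exp: simple, but can overflow for large inputs
positive_softplus = F.softplus(raw)              # softplus: log(1 + exp(x)), linear for large x
positive_softplus_b = F.softplus(raw, beta=0.5)  # beta rescales how sharply it bends near 0
```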

    Vitalii Kleshchevnikov
    @vitkl
    I see. Thanks!
    We typically just use exp, though I've heard good things about softplus.
    Vitalii Kleshchevnikov
    @vitkl
    Softplus improves the stability of cell2location training and also improves accuracy on simulated data. By adding softplus to numpyro I was able to get the same accuracy as pymc3. It is a bit surprising that pytorch doesn't have a softplus transform class (a sketch of what one could look like is below). Actually, the port of cell2location to scVI-pyro is taking a while because things like that need to be implemented. I am quite curious whether it helps for the main scVI model.
    Would it be fair to say that KL warmup is essentially training the encoder NN to predict the prior distribution?
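For illustration, a softplus bijector along the lines discussed above could be written as a torch.distributions Transform; this is only a sketch, not the actual cell2location or scvi-tools implementation:

```python
import torch
import torch.nn.functional as F
from torch.distributions import constraints
from torch.distributions.transforms import Transform


# Sketch of a softplus bijector for torch.distributions. Illustrative only;
# not the actual cell2location or scvi-tools implementation.
class SoftplusTransform(Transform):
    domain = constraints.real
    codomain = constraints.positive
    bijective = True
    sign = +1

    def __eq__(self, other):
        return isinstance(other, SoftplusTransform)

    def _call(self, x):
        return F.softplus(x)

    def _inverse(self, y):
        # inverse of softplus: x = y + log(1 - exp(-y))
        return y + torch.log(-torch.expm1(-y))

    def log_abs_det_jacobian(self, x, y):
        # d/dx softplus(x) = sigmoid(x), and log sigmoid(x) = -softplus(-x)
        return -F.softplus(-x)
```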
    Vitalii Kleshchevnikov
    @vitkl
    I am wondering if anyone has evaluated its use for informative priors (or for informed initialisation, e.g. initialising a factor analysis model with PCA loadings).
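One concrete version of that idea would be to seed a factor-analysis-style loading matrix with PCA components; a rough sketch on stand-in data (this is not an existing scvi-tools feature):

```python
import numpy as np
from sklearn.decomposition import PCA

# Rough sketch of informed initialisation: seed the loadings of a
# factor-analysis-style decoder with PCA components. Stand-in data only.
X = np.log1p(np.random.poisson(1.0, size=(1000, 200)))  # cells x genes
pca = PCA(n_components=10).fit(X)
init_loadings = pca.components_.T  # genes x factors, used to initialise the decoder weights
```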
    Adam Gayoso
    @adamgayoso

    It is a bit surprising that pytorch doesn't have a softplus transform class.

    Why can't you use this or this?

    Would it be fair to say that KL warmup is essentially training the encoder NN to predict the prior distribution?

    I'm not sure I would characterize it this way, but I'm not 100% sure what you mean.

    Valentine Svensson
    @vals
    What is the preferred way to install scVI from the repo? When I use python setup.py develop it doesn't seem to install the dependencies.
    Ah, never mind, it seems they do get installed when using the pip install -e .[dev] method.
    13 replies
    njbernstein
    @njbernstein
    Are the core trainers deprecated, e.g. scvi.core.trainers.ClassifierTrainer?
    What's the relationship between them and the lightning trainers if any?
    4 replies
    njbernstein
    @njbernstein
    The development guide link on GitHub 404s: https://scvi-tools.org/en/stable/development.html
    3 replies
    njbernstein
    @njbernstein
    @adamgayoso when training SOLO I get the following warning: "you defined a validation_step but have no val_dataloader. Skipping validation loop". The other TrainingPlans don't seem to have a val_dataloader, so I'm not quite sure what the issue is.
    63 replies
    munfred
    @munfred:munfred.com
    [m]

    Hi folks, just wanted to show the live deploy of a single-page app for scVI DE I made using Flask: https://cengen-de.textpressolab.com/

    I made this deploy for wormbase.org to allow people to explore a new dataset with 100k cells and 65k C. elegans neurons that just came out, but you can easily deploy it with your own data if you have the trained model.

    1 reply
    njbernstein
    @njbernstein
    When I run pytest I get a few errors saying: ModuleNotFoundError: No module named 'rich'
    7 replies
    njbernstein
    @njbernstein
    Also, when is the next scvi-tools release planned?
    Adam Gayoso
    @adamgayoso
    By March.
    Adam Gayoso
    @adamgayoso
    @njbernstein I can release an alpha or beta version sooner, though, that you can use as a dependency through pip.
    Valentine Svensson
    @vals
    Hey, is it possible to track more things than the 4 quantities defined in LossRecorder over training?
    5 replies