Jonas Eschle
@mayou36
Hey, we're still polishing a good implementation of it. For the time being, there are two options: either use the zfit-physics package, where you can access a kernel density estimator via zfit_physics.unstable.pdf.GaussianKDE (this would be the preferred one), or alternatively use the following self-made histogram PDF. With it you can also plug in e.g. the scikit-learn KDE instead:
import scipy.stats
import tensorflow as tf
import zfit
from zfit import z


class HistPDF(zfit.pdf.BasePDF):

    def __init__(self, hist_args, hist_bins, obs, name='HistPDF'):
        super().__init__(obs=obs, name=name)
        # rv_histogram expects the (counts, bin_edges) tuple as returned by np.histogram
        self.rv_hist = scipy.stats.rv_histogram((hist_args, hist_bins))

    def _unnormalized_pdf(self, x):
        x = z.unstack_x(x)
        # wrap the scipy pdf so it can be called from within the TensorFlow graph
        probs = z.py_function(func=self.rv_hist.pdf, inp=[x], Tout=tf.float64)
        probs.set_shape(x.shape)
        return probs
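For illustration, a minimal usage sketch (the data, the binning and the observable range here are just made up):

import numpy as np

data = np.random.normal(size=10_000)
counts, bin_edges = np.histogram(data, bins=50)  # exactly the (hist_args, hist_bins) pair used above

obs = zfit.Space('x', limits=(-5, 5))
hist_pdf = HistPDF(hist_args=counts, hist_bins=bin_edges, obs=obs)
probs = hist_pdf.pdf(np.linspace(-3, 3, 11))  # evaluate the normalized pdf at a few points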
greennerve
@greennerve
thank you
Jonas Eschle
@mayou36
Just write to me directly in case you have any trouble with it
Colm Harold Murphy
@chm-ipmu

I would like to implement a Gaussian constraint for a series of correlated parameters. I see in the documentation that the 'uncertainty' argument of the GaussianConstraint constructor can take a covariance matrix, which sounds like what I need.

The central values that I want to be constrained are measured first in a separate fit. So let's say I run this fit and measure the values of a, b, c, and their uncertainties calculated from FitResult.error(), and the covariance matrix from FitResult.covariance(). I then store them to disk and read them in during the fit in which I want to utilize the Gaussian constraints.

I should then read in those values (central values, errors, and covariance matrix), and create something like:

covariance = np.fromfile("previous_fit_result_covariance.np")
constraints = zfit.constraint.GaussianConstraint(params=[a, b, c], observation=[a_measured, b_measured, c_measured], uncertainty=covariance)

Where covariance is the old fit result's .covariance() array (a square array in the order a, b, c)?

Thank you!

Jonas Eschle
@mayou36
Unfortunately, I cannot test this atm, but to my understanding, this should work. Or did you encounter any problems?
Colm Harold Murphy
@chm-ipmu
That's not a problem, it's a kind of pre-emptive question. At the moment I am assuming no correlations and just using a simple constraint for each parameter in turn, but I would eventually like to switch to the approach that uses the covariance matrix.
By the way - is the new release available on conda? Thanks for all the hard work
Matthieu Marinangeli
@marinang
This should work, but you have to make sure that the columns and rows of the covariance matrix are ordered the same way as params = [a, b, c].
To make sure of that you can call fitresult.covariance(params=[a, b, c]).
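Putting the pieces together, a minimal sketch of that workflow could look like this (result_first, model, data and the file names are placeholders, and the keyword names of GaussianConstraint may differ slightly between zfit versions):

import numpy as np
import zfit

# after the first fit (result_first is its FitResult): store values and covariance,
# with rows/columns explicitly ordered as [a, b, c]
cov = result_first.covariance(params=[a, b, c])
vals = [result_first.params[p]['value'] for p in (a, b, c)]
np.save("fit1_covariance.npy", np.asarray(cov))
np.save("fit1_values.npy", np.asarray(vals))

# later, in the second fit: read everything back and build the constraint in the same order
covariance = np.load("fit1_covariance.npy")
a_measured, b_measured, c_measured = np.load("fit1_values.npy")
constraint = zfit.constraint.GaussianConstraint(
    params=[a, b, c],
    observation=[a_measured, b_measured, c_measured],
    uncertainty=covariance,
)
nll = zfit.loss.UnbinnedNLL(model=model, data=data, constraints=constraint)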
Jonas Eschle
@mayou36

By the way - is the new release available on conda? Thanks for all the hard work

We had a problem with a dependency, so we are pushing out a 0.5.1 and will also announce it here; it should be out in no time.

Jonas Eschle
@mayou36
zfit version 0.5 has been released with many new features and improvements, such as:
  • stable minimization
  • fine-grained tracing of functions and option to clear cache
  • flexible spaces that allow arbitrary functions as limits
  • ... and more. You can check all changes in the changelog
Ilya Komarov
@mozgit

Dear all,
I'm trying to use the zfit_physics package
import zfit_physics as zphys
but I get this crash:

ImportError                               Traceback (most recent call last)
<ipython-input-5-51e107d35723> in <module>
----> 1 import zfit_physics

/usr/local/Cellar/root/6.20.00_1/lib/root/ROOT.py in _importhook(name, *args, **kwds)
    520       except Exception:
    521          pass
--> 522    return _orig_ihook( name, *args, **kwds )
    523 
    524 __builtin__.__import__ = _importhook

~/Env_For_All/lib/python3.7/site-packages/zfit_physics/__init__.py in <module>
     21 
     22 from . import pdf
---> 23 from . import unstable

/usr/local/Cellar/root/6.20.00_1/lib/root/ROOT.py in _importhook(name, *args, **kwds)
    520       except Exception:
    521          pass
--> 522    return _orig_ihook( name, *args, **kwds )
    523 
    524 __builtin__.__import__ = _importhook

~/Env_For_All/lib/python3.7/site-packages/zfit_physics/unstable/__init__.py in <module>
----> 1 from . import pdf

/usr/local/Cellar/root/6.20.00_1/lib/root/ROOT.py in _importhook(name, *args, **kwds)
    520       except Exception:
    521          pass
--> 522    return _orig_ihook( name, *args, **kwds )
    523 
    524 __builtin__.__import__ = _importhook

~/Env_For_All/lib/python3.7/site-packages/zfit_physics/unstable/pdf.py in <module>
----> 1 from ..models.pdf_conv import ConvPDF
      2 from ..models.pdf_kde import GaussianKDE

/usr/local/Cellar/root/6.20.00_1/lib/root/ROOT.py in _importhook(name, *args, **kwds)
    520       except Exception:
    521          pass
--> 522    return _orig_ihook( name, *args, **kwds )
    523 
    524 __builtin__.__import__ = _importhook

~/Env_For_All/lib/python3.7/site-packages/zfit_physics/models/pdf_conv.py in <module>
      8 from zfit.util import ztyping
      9 from zfit.util import exception
---> 10 from zfit.util.exception import DueToLazynessNotImplementedError
     11 
     12 import zfit.models.functor

ImportError: cannot import name 'DueToLazynessNotImplementedError' from 'zfit.util.exception' (/Users/ilya/Env_For_All/lib/python3.7/site-packages/zfit/util/exception.py)

I've just updated to the latest zfit version (zfit-0.5.1) and just installed zfit_physics. Did something break in between the releases, or am I missing something before running zfit_physics?

Ilya Komarov
@mozgit
(I installed zfit_physics with pip)
Jonas Eschle
@mayou36
Hi, yes, currently zfit_physics should be installed from git, with pip install git+https://github.com/zfit/zfit-physics. We'll fix that soon
Ilya Komarov
@mozgit
Thanks!
Also, is there any example showing how to convolve my signal peak pdf with a Gaussian-like resolution? I can see some snippets in the tests of zfit_physics, but I'm not sure if this is what I need
Jonas Eschle
@mayou36
Since it is still somewhat under development, there aren't really public snippets around. It depends on what you need, but we can discuss in a private chat
Jonas Eschle
@mayou36

A new version of zfit (0.5.2) and of zfit-physics is out, featuring Python 3.8 and TF 2.2 support (currently only the pip version supports this).
A bug was fixed that messed with toy studies using the sampler.
Debugging is simpler and can now be done as:

zfit.run.set_graph_mode(False)
# stuff here runs like numpy

Release and all changes
zfit-physics now features an ARGUS PDF

secholak
@secholak
Hi all. I'm performing several fits with the same routine and recreating my zfit parameters every time (with new ranges/dependencies/etc.). To avoid the naming conflict I was using zfit.run.create_session(reset_graph=True), as suggested by @marinang. Now it seems that zfit.run.clear_graph_cache() has to be used instead, but it doesn't do the same job. Am I missing something? (PS: in principle I could recreate each parameter every time with a new name, but I'm wondering if there's a more sophisticated solution.)
Jonas Eschle
@mayou36
Hey, why exactly are you recreating the parameters and how are you running different fits? I mean what is the goal of it, toy sensitivity studies?
Colm Harold Murphy
@chm-ipmu
Can I ask, what is the progress with the convolution of a resolution function? Is there a rough timeline for its implementation? Cheers
Jonas Eschle
@mayou36
Hey, a resolution function with parameters that depend on your data? We actually have a working (private) example of it. Would you be willing to "test" it? (It's actually working, we're just trying to think of a nice API.)
Colm Harold Murphy
@chm-ipmu
Yes, it does. I'd be happy to "test" it, thanks! I will only need it in about a month's time though
Jonas Eschle
@mayou36
Okay, feel free to ask again whenever you need it and we can provide you with the most up-to-date version
Oliver Lantwin
@olantwin
@chm-ipmu I'm currently looking into this with Jonas. At the moment it's pretty tailored to my analysis (based on a prototype in zphys), but I will try to generalise it for public use.
Colm Harold Murphy
@chm-ipmu
Will do, cheers.
gpinnaan
@gpinnaan_gitlab
Dear all,
I've been trying to add a double Crystal Ball and a Chebyshev polynomial with the SumPDF method, but when I try to use the pdf method on the sum I get this runtime error: ValueError: Shapes (1,) and () are not compatible. It happens only if the Chebyshev has degree > 1. I also notice that I don't have the same problem with a Chebyshev2 or Legendre polynomial.
Jonas Eschle
@mayou36
Hi, many thanks for reporting this. I've opened an issue, maybe you want to follow. We'll take a look at it ASAP
zfit/zfit#250
CasaisA
@CasaisA

Hi, quick question. I'm trying to use zfit to do a simultaneous fit to two double Crystal balls that I construct out of single Crystal Balls (the CB definition is taken from zfit itself). When I do it on the CPU it doesn't seem to quite converge but if I do it on the GPU it complains in this way:

2020-07-02 17:55:20.111312: E tensorflow/core/kernels/check_numerics_op.cc:289] abnormal_detected_host @0x7ef97c619e00 = {0, 1} Check if pdf output contains any NaNs of Infs

I cross checked that this fit can be done with the same parameter configuration in RooFit, so I'm a bit lost on what might be going wrong... Thanks !

Jonas Eschle
@mayou36
Hi, this looks as if NaNs are produced during the fit, which most often comes from zeros or negative values in the pdf. But I may need to check the full error stack; you can PM me.
CasaisA
@CasaisA

Hello again. Another quick question. Is it possible to do something like

pdf = zfit.pdf.ProductPDF(pdfs=[pdf1,pdf2])
pdfExtended = pdf.create_extended( [n1,n2] )

Thanks !

actually I would like to do something more like:
pdfProd1 =zfit.pdf.ProductPDF(pdfs=[pdf1,pdf2])
pdfProd2 =zfit.pdf.ProductPDF(pdfs=[pdf3,pdf4])

pdfProd1Extended = pdf.create_extended( [n1,n2] )
pdfProd2Extended = pdf.create_extended( [n3,n4] )

pdf = zfit.pdf.SumPDF(pdfs=[pdfProd1Extended,pdfProd2Extended])

I also tried creating

pdfProd1Extended = pdfProd1.create_extended(n1)

...

and then doing the products and summing - but then zfit doesn't recognize those prodPDFs as extended anymore ...

Jonas Eschle
@mayou36
Hey, so in zfit an extended PDF is a normal PDF with an additional attribute, a yield. If a model is given to a loss together with a data object, an ExtendedUnbinnedNLL will create an UnbinnedNLL with an extra term: a Poisson term with the yield and the number of events as parameters. The yield is thereby taken from the model. Now, if a submodel has a yield, this can sometimes be propagated and, effectively, a ComposedParameter is built (e.g. for the Sum). This is, however, done conservatively and is not the case for the product. It can be done by hand: just create a composed parameter that is e.g. the product of the two parameters you call yields, and set this as the yield of the model given to the NLL. A sketch of this is below.
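A minimal sketch of that by-hand approach (pdf1, pdf2, data and the yield values are placeholders; note that the ComposedParameter signature has changed between zfit versions, older ones expect a no-argument function and a dependents= keyword):

import zfit

# two yield-like parameters (names and ranges are illustrative)
n1 = zfit.Parameter('n1', 1000, 0, 100000)
n2 = zfit.Parameter('n2', 500, 0, 100000)

# combined yield for the product, built by hand as a composed parameter
n_prod = zfit.ComposedParameter('n_prod', lambda p1, p2: p1 * p2, params=[n1, n2])

pdf_prod = zfit.pdf.ProductPDF(pdfs=[pdf1, pdf2])
pdf_prod_ext = pdf_prod.create_extended(n_prod)  # the product now carries a yield

nll = zfit.loss.ExtendedUnbinnedNLL(model=pdf_prod_ext, data=data)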
horace-cl
@horace-cl

Hi everyone!
Maybe this is not the place to ask, but if anyone can point me in the right direction that would be very nice.

I am trying to fit a 5-parameter (a, b, c, d, e) model, where one of the parameters is constrained by another, let's say:
0 < d < 1
e < |d|
I have only created the zfit.Parameters and put the limits such that the ranges accessible to them are valid, again, let's say:

d = zfit.Parameter('d', 0.5, 0.3, 1.0, 0.01)
e = zfit.Parameter('e', 0.1, 0.0, 0.3, 0.01)

It has been working well so far, but I think it is not the right way to do it.
So my question is, what is the correct way to deal with this kind of constraint?
Cheers

Jonas Eschle
@mayou36

Hey, on asking: if you can, it is preferred to ask on StackOverflow (maybe you can even post this question there and I'll add my answer, just to make it more easily accessible for others).

I would use these limits with caution, as they block the variables; ideally, they should be far from the final value. There are two ways. You can either impose the constraint "mathematically" as a logical consequence, i.e. define one parameter from another using a composed parameter (which is a function of other parameters). If possible, this should be the preferred way.
Another option is to impose these restrictions in the likelihood with an additional term. This, however, can have repercussions, as you modify the likelihood: the minimizer will find a minimum, but maybe not the minimum you were looking for. What you can use are SimpleConstraints, adding a penalty term to the likelihood if any of the above is violated (e.g. tf.cast(tf.greater(d, 1), tf.float64) * 100.). Also make sure that minuit is run with use_minuit_grad. A sketch of both options is below.
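A minimal sketch of both options (model and data are placeholders; the exact ComposedParameter and SimpleConstraint signatures have changed between zfit versions, so treat this as a sketch rather than copy-paste code):

import tensorflow as tf
import zfit

d = zfit.Parameter('d', 0.5, 0.3, 1.0, step_size=0.01)

# Option 1 (preferred): enforce e < |d| by construction, e.g. e = f * |d| with a free 0 < f < 1
f = zfit.Parameter('f', 0.3, 0.0, 1.0, step_size=0.01)
e = zfit.ComposedParameter('e', lambda d_, f_: f_ * tf.abs(d_), params=[d, f])

# Option 2: keep e free and add a penalty to the likelihood when a condition is violated
e_free = zfit.Parameter('e_free', 0.1, 0.0, 0.3, step_size=0.01)

def penalty():
    # zero while the conditions hold, adds 100 to the loss per violated condition
    return (tf.cast(tf.greater(d, 1.), tf.float64)
            + tf.cast(tf.greater(e_free, tf.abs(d)), tf.float64)) * 100.

constraint = zfit.constraint.SimpleConstraint(penalty)
nll = zfit.loss.UnbinnedNLL(model=model, data=data, constraints=constraint)
minimizer = zfit.minimize.Minuit(use_minuit_grad=True)
result = minimizer.minimize(nll)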

horace-cl
@horace-cl
Thanks! and I will ask this on StackOverflow
donalrinho
@donalrinho
Hi all! Does anyone have an example of how to plot the 1D projections of a 3D function? I have used partial_integrate before with a 2D PDF, but I don't know the syntax for the case where I need to integrate over two other dimensions. Any help would be great! Donal
Jonas Eschle
@mayou36

Hey, the syntax is basically the same. The only difference is that you now need to integrate out two other dimensions, so instead of providing 1D limits, you need 2D limits:

# assuming obs{1,2,3}
obs1 = zfit.Space('obs1', (-1, 1))
obs2 = zfit.Space('obs2', (-10, 3))
obs12 = obs1 * obs2
pdf.partial_integrate(...)   # same as 1D case, just with a 2d space as the integration limits

In case this doesn't work, you can also ping me directly
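For illustration, a slightly fuller sketch for the 3D case (the obs3 range, the evaluation points and the pdf itself are placeholders, and the partial_integrate argument names may differ slightly between versions):

import numpy as np
import zfit

obs1 = zfit.Space('obs1', (-1, 1))
obs2 = zfit.Space('obs2', (-10, 3))
obs3 = zfit.Space('obs3', (0, 5))

obs12 = obs1 * obs2  # 2D space to integrate out

# points along the remaining dimension (obs3) at which to evaluate the projection
x_obs3 = zfit.Data.from_numpy(obs=obs3, array=np.linspace(0, 5, 200)[:, None])

projection = pdf.partial_integrate(x=x_obs3, limits=obs12)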

donalrinho
@donalrinho
Thanks Jonas, this did the trick :)
donalrinho
@donalrinho

Hi folks. I'm trying to run an unbinned 3D angular fit in zfit, where the input data is a sample with per-event sWeights assigned from a separate mass peak fit. I think I'm running into issues of negatively weighted events in some regions of the phase space, as zfit gives the error:

Traceback (most recent call last):
  File "python/fitting/unbinned_angular_fit.py", line 443, in <module>
    main()
  File "python/fitting/unbinned_angular_fit.py", line 440, in main
    run_fit(args.Syst, args.Toy)
  File "python/fitting/unbinned_angular_fit.py", line 349, in run_fit
    result = minimizer.minimize(nll)
  File "/home/dhill/miniconda/envs/ana_env/lib/python3.7/site-packages/zfit/minimizers/baseminimizer.py", line 265, in minimize
    return self._hook_minimize(loss=loss, params=params)
  File "/home/dhill/miniconda/envs/ana_env/lib/python3.7/site-packages/zfit/minimizers/baseminimizer.py", line 274, in _hook_minimize
    return self._call_minimize(loss=loss, params=params)
  File "/home/dhill/miniconda/envs/ana_env/lib/python3.7/site-packages/zfit/minimizers/baseminimizer.py", line 278, in _call_minimize
    return self._minimize(loss=loss, params=params)
  File "/home/dhill/miniconda/envs/ana_env/lib/python3.7/site-packages/zfit/minimizers/minimizer_minuit.py", line 179, in _minimize
    result = minimizer.migrad(**minimize_options)
  File "src/iminuit/_libiminuit.pyx", line 859, in iminuit._libiminuit.Minuit.migrad
RuntimeError: exception was raised in user function
User function arguments:
    Hm_amp = +nan
    Hm_phi = +0.06
    Hp_phi = -0.07
Original python exception in user function:
RuntimeError: Loss starts already with NaN, cannot minimize.
  File "/home/dhill/miniconda/envs/ana_env/lib/python3.7/site-packages/zfit/minimizers/minimizer_minuit.py", line 121, in func
    values=info_values)
  File "/home/dhill/miniconda/envs/ana_env/lib/python3.7/site-packages/zfit/minimizers/baseminimizer.py", line 47, in minimize_nan
    return self._minimize_nan(loss=loss, params=params, minimizer=minimizer, values=values)
  File "/home/dhill/miniconda/envs/ana_env/lib/python3.7/site-packages/zfit/minimizers/baseminimizer.py", line 107, in _minimize_nan
    raise RuntimeError("Loss starts already with NaN, cannot minimize.")

I can avoid this error by restricting one of the angle observable ranges slightly, to avoid the region with small numbers of data events where some data is weighted negatively (signal is being over-subtracted by the sWeights). But I wondered if there is another way around this in zfit? Perhaps the UnbinnedNLL method explicitly requires positive events, so the negatively weighted data points could be set to zero or a small positive value instead? Any help would be great! Thanks, Donal

Johannes Lade
@SebastianJL
Hey @donalrinho
Consider asking your question on StackOverflow. This way other people can benefit from the answer as well.
You can find a link that automatically tags the question correctly on our website: https://zfit.readthedocs.io/en/latest/ask_a_question.html
donalrinho
@donalrinho
Johannes Lade
@SebastianJL
Thanks a lot.
@mayou36 can you help out here?
donalrinho
@donalrinho
Hi all, sorry to bump the above question again. Any input would be very helpful. Cheers, Donal
Jonas Eschle
@mayou36
Hey, we've all been a little on vacation and are taking a look at it, sorry for the delay
But, in principle, this should work without a problem, so we'll be checking that