    Tai Sakuma
    @TaiSakuma
    ok. thank you
    Charles Escott
    @EscottC
    Hi
    Jim Pivarski
    @jpivarski
    @EscottC Hi!
    Tai Sakuma
    @TaiSakuma
    Hi
    Tai Sakuma
    @TaiSakuma
    how would you modify __repr__ of a function? would you do this?
    Jonas Eschle
    @jonas-eschle
    This seems the way to go, since repr is implemented as type(func).__repr__(func), with func being your function. Therefore, some kind of wrapping is necessary: either with a function wrapper as described in your link or, depending on your context (you may even need to change more), by creating a class instead of a function and implementing __call__.
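    A minimal sketch (with a hypothetical f, not part of the original message) of why assigning __repr__ directly on the function object has no effect, which is why some kind of wrapping is needed:

    def f(x):
        return x ** 2

    # Instance attributes are ignored by repr(): repr(f) calls type(f).__repr__(f),
    # so the only options are wrapping f or replacing it with a custom class.
    f.__repr__ = lambda: "my custom repr"
    print(repr(f))  # still prints <function f at 0x...>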
    Tai Sakuma
    @TaiSakuma
    Thanks. I have been trying. repr() works. But the problem is the decorated function is not picklable.
    Jonas Eschle
    @jonas-eschle

    Did you use functools.wraps? It should be used anyway for any wrapping, like:

    import functools

    def decorator(func):
        @functools.wraps(func)
        def new_func(*args, **kwargs):
            print(f"Wrapped {func}")
            return func(*args, **kwargs)
        return new_func

    the problem arises from a name clash, e.g. your wrapped function has the same name as the unwrapped one. functools.wraps solves that. Explanation e.g. here
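    A brief sketch (not from the original message) of what functools.wraps preserves, applying the decorator above to a hypothetical square function:

    @decorator
    def square(x):
        "Squares a number."
        return x ** 2

    # functools.wraps copied __name__, __doc__, __module__, etc. onto new_func:
    print(square.__name__)  # 'square', not 'new_func'
    print(square.__doc__)   # 'Squares a number.'
    square(3)               # prints "Wrapped <function square at 0x...>" and returns 9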

    Tai Sakuma
    @TaiSakuma
    Yes. That is what I have been trying. I'm not sure where to put wraps.
    The example wraps a function with another function. I need to wrap a function with a class (with __call__()).
    Henry Schreiner
    @henryiii
    This should get you pretty close, but does not copy the signature; you’ll need something more powerful for that, like wrapt or decorator.
    In [11]: def nice_repr(func):
        ...:     class NiceFunction:
        ...:         def __repr__(self):
        ...:             return f"Nice repr function of {func.__name__}"
        ...:         @functools.wraps(func)
        ...:         def __call__(self, *args, **kargs):
        ...:             return func(*args, **kargs)
        ...:     return NiceFunction()
    
    In [17]: @nice_repr
        ...: def f(x: float):
        ...:     'Squares a float'
        ...:     return x**2
    
    In [18]: f
    Out[18]: Nice repr function of f
    
    In [19]: f(2)
    Out[19]: 4
    
    In [20]: f?
    Signature:      f()
    Type:           NiceFunction
    String form:    Nice repr function of f
    Docstring:      <no docstring>
    Call docstring: Squares a float
    Tai Sakuma
    @TaiSakuma
    Thank you.
    I cannot pickle f
    I get AttributeError: Can't pickle local object 'nice_repr.<locals>.NiceFunction'
    This is the code I have now. The pickle at L25 doesn't work.
    Tai Sakuma
    @TaiSakuma
    It appears to be impossible.
    Chris Burr
    @chrisburr
    If it's important enough to justify adding a dependency, I think cloudpickle can be used
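    For illustration, a rough sketch of how that might look (assuming the cloudpickle package is installed and the nice_repr decorator from above):

    import cloudpickle

    @nice_repr                      # the locally defined NiceFunction wrapper from above
    def f(x: float):
        'Squares a float'
        return x ** 2

    blob = cloudpickle.dumps(f)     # cloudpickle serializes locally defined classes by value
    g = cloudpickle.loads(blob)
    print(g(2))                     # 4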
    Tai Sakuma
    @TaiSakuma
    I have been keeping my code running on both batch systems and multiprocessing. I can use cloudpickle for batch systems, but I don't think that I get to choose a serializer for multiprocessing.
    Henry Schreiner
    @henryiii
    I think you can pull the class outside the decorator. I don’t think there’s any reason it has to be inside. Just add an init where you pass in the function you want to store, and make it a member.
    Nevermind, not quite that easy. Will think about it again on Tuesday.
    Henry Schreiner
    @henryiii
    It’s easy without a decorator, though.
    class NiceFunction:
        def __init__(self, function):
            self.func = function
        def __repr__(self):
            return f"Nice repr function of {self.func.__name__}"
        def __call__(self, *args, **kargs):
            return self.func(*args, **kargs)
    def nice_repr(func):
        return NiceFunction(func)
    def f(x: float):
        'Squares a float'
        return x**2
    ff = nice_repr(f)
    f(3)
    9
    f
    <function __main__.f(x: float)>
    import pickle
    pickle.dumps(ff)
    b'\x80\x03c__main__\nNiceFunction\nq\x00)\x81q\x01}q\x02X\x04\x00\x00\x00funcq\x03c__main__\nf\nq\x04sb.'
    Tai Sakuma
    @TaiSakuma
    Yes. That is true. Because this is possible, I thought it should be possible to do with a decorator, which is just equivalent to doing f = nice_repr(f).
    But that seems to be actually impossible.
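    A short sketch of why the decorator form fails with plain pickle even with NiceFunction defined at module level as above: pickle stores functions by reference to their module-level name, and after decoration that name is bound to the wrapper instead of the original function.

    import pickle

    @nice_repr              # rebinds the name f to a NiceFunction instance
    def f(x: float):
        return x ** 2

    # Pickling the wrapper tries to pickle self.func by reference as __main__.f,
    # but __main__.f is now the NiceFunction instance, so pickle raises
    # PicklingError: Can't pickle <function f at 0x...>: it's not the same object as __main__.f
    pickle.dumps(f)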
    Luke Kreczko
    @kreczko

    OK, it might be simple, but I cannot see it atm.

    I am looking at a dictionary of generators and want to unpack the values. The straightforward way is to unpack the generators first and then match them up against the keys (AFAIK the order is preserved). However, this uses two for-loops and an additional dictionary.

    Is there an easy way to shorten this?

    import six

    generators = dict(
        t1 = range(0, 20, 2),
        t2 = range(10),
        t3 = range(0, 100, 10),
    )

    for g in six.moves.zip(*six.itervalues(generators)):
        data = {}
        for name, value in six.moves.zip(generators, g):
            data[name] = value
        print(data)

    # desired output per iteration:
    # {'t1': 0, 't2': 0, 't3': 0}
    # {'t1': 2, 't2': 1, 't3': 10}
    # ...
    Luke Kreczko
    @kreczko
    In the real example the generator is quite I/O-heavy, so I do not want to have the full range at once
    Luke Kreczko
    @kreczko
    thx @benkrikler - having an intermediate function as a generator (i.e. yield {name: value}) does the job without much additional time
    Henry Schreiner
    @henryiii
    You can make it look a little shorter, but the main way to reduce the output would be to have a generator in the middle:
    def iter_dict(gen):
        for g in six.moves.zip(*six.itervalues(gen)):
            data = {name:value for name, value in six.moves.zip(gen, g)}
            yield data
    
    for item in iter_dict(generators):
        print(item)
    Luke Kreczko
    @kreczko
    thx @henryiii !
    benkrikler
    @benkrikler

    (Copying here from our other communication, @kreczko): I think a generator comprehension in modern python could also work nicely:

    import six
    from six.moves import zip as s_zip

    out_gen = (dict(s_zip(generators, g)) for g in s_zip(*six.itervalues(generators)))

    Not likely faster than Henry's code, but a bit less code.
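    For comparison, on Python 3 only, the same idea without six might look like this (a sketch, assuming the generators dict from above):

    out_gen = (dict(zip(generators, g)) for g in zip(*generators.values()))
    for item in out_gen:
        print(item)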

    Henry Schreiner
    @henryiii
    Fun quote from the “What will not change in Python 3” PEP from many years ago: "Simple is better than complex. This idea extends to the parser. Restricting Python's grammar to an LL(1) parser is a blessing, not a curse. It puts us in handcuffs that prevent us from going overboard and ending up with funky grammar rules like some other dynamic languages that will go unnamed, such as Perl."
    The same document claims "There will be no alternative binding operators such as :=." - as you may know, that's no longer true: Python 3.8 will have := (along with shared memory multiprocessing and positional-only parameters)
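    A tiny, self-contained sketch of what := (the assignment expression) allows on Python 3.8:

    data = list(range(15))

    # bind and test in a single expression
    if (n := len(data)) > 10:
        print(f"data has {n} entries")      # data has 15 entries

    # reuse a computed value inside a comprehension
    halves = [h for x in data if (h := x / 2) > 5]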
    And f-strings with = sound very useful! print(f"{1+2=}") prints 1+2=3
    Henry Schreiner
    @henryiii

    Example of what I’d use that for:

    >>> x = 23
    >>> print(f"{x=}")
    x=23

    This imitates the similar features in Matlab or CMake that I miss in Python.

    Luke Kreczko
    @kreczko
    Things are getting better, and I prefer old rules being broken in order to introduce useful change to adamantly refusing changes
    But HEP projects still have to make the move to Python 3, let alone Py >= 3.7
    Henry Schreiner
    @henryiii
    I’m hoping since we’ve waited so long to move in HEP, we’ll at least move to no less than Python 3.6. That’s a really fantastic version of Python.
    Luke Kreczko
    @kreczko
    Yes, most projects I've seen are aiming for 3.6.5 - I guess because there is nothing newer on /cvmfs/sft.cern.ch/lcg/releases/Python (3.5.2, 3.6.3 & 3.6.5)
    Henry Schreiner
    @henryiii
    3.7 is a bit harder to support on older systems, due to the minimum OpenSSL requirement (I think); that's probably why adoption has been a little slow - Travis on Trusty doesn't support 3.7, for example (PSA: Trusty is being retired as the Travis default this month due to EOL)
    Luke Kreczko
    @kreczko
    I see. So things should start to improve now as many in HEP are moving to CentOS 7
    Henry Schreiner
    @henryiii
    Or CentOS 8 - have no idea how long the building process will take, but am quite excited to have reasonably modern GCC :)
    Avoiding Python 2.6 is a good reason to get off of SLC 6, though. It’s becoming really hard to support in Plumbum.
    Luke Kreczko
    @kreczko
    @henryiii I did not realise you are the #1 maintainer of plumbum (https://github.com/tomerfiliba/plumbum/graphs/contributors)! Thx a lot! It is a very useful library
    BenGalewsky
    @BenGalewsky
    Does anyone have experience using the arrow serialization in awkward array? I have what should be a simple example that I cannot get to work
    Albert Puig
    @apuignav

    Dear all,
    as part of our work in zfit, we have released a standalone package for multibody phasespace generation à la TGenPhaseSpace. It's pure Python, based on TensorFlow. On top of simple phasespace generation, it can build more complex decay chains, handle resonances, etc., all in a very simple manner.

    The package is called phasespace (https://github.com/zfit/phasespace), it's well documented and fairly easy to use, so no excuses for not trying :-)

    Let us know what you think; we highly appreciate any feedback and suggestions from the software community here

    Jonas+Albert

    Hans Dembinski
    @HDembinski
    Good, I guess.
    For those who are forced to use Windows
    Matthieu Marinangeli
    @marinang
    Hi, do any of you know if the "moment morphing method" described in this paper https://www.sciencedirect.com/science/article/pii/S0168900214011814 is implemented in Python and outside of ROOT? It is used to interpolate pdf shapes when you want to do scans, for instance, in my case, an LLP search with different masses and lifetimes. Or maybe there is a better technique nowadays?
    Hans Dembinski
    @HDembinski
    @marinang I don't know this method and I haven't read the paper yet, but at first glance it seems inferior to Alexander Read's interpolation https://inspirehep.net/record/501018/