Activity
  • Nov 28 19:16  Suor on 1.14
  • Nov 28 19:16  Suor on master: Use proper set literals in coll…; Specify travis distro; Up to 1.14
  • Nov 28 18:58  Suor on master: Add filter_errors param to @ret…
  • Nov 28 16:11  Suor on master: Fix travis pypy3
  • Nov 28 14:17  Suor on master: Fix wrap_prop() test in Python 2; Use default distro on travis
  • Nov 28 13:42  Suor closed #81
  • Nov 28 13:42  Suor on master: Add @wrap_with() to cheatsheat; Fix cheatsheat hints; Drop Python 2.6, test Python 3.… and 3 more
  • Nov 28 12:13  Suor commented #81
  • Nov 28 11:46  pared opened #81
  • Nov 16 18:09  Suor commented #79
  • Nov 16 18:08  Suor closed #79
  • Nov 11 10:07  Suor closed #80
  • Nov 11 10:07  Suor commented #80
  • Nov 06 17:22  thalelinh opened #80
  • Oct 29 08:51  Suor commented #79
  • Oct 29 08:51  Suor commented #79
  • Oct 23 20:34  mitchelllisle opened #79
  • Aug 04 11:51  Suor on 1.13
  • Aug 04 11:50  Suor on master: Up to 1.13
  • Aug 04 05:05  Suor closed #78
Ronny Pfannschmidt
@RonnyPfannschmidt
maybe it makes sense to have a general parallel execution manager, and push work to it from functions that are part of a chain
Alexander Schepanovski
@Suor
Still, defining the transducer separately from the processor is beneficial. And semantics like ordering should be defined in the transducer, not in the processor anyway
The problem I see with rcompose() in the context of parallelization is that it just returns a function and you can't really introspect it
So you can either parallelize the whole thing or use parallelized versions of the functions
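To see the introspection point concretely, here is a tiny sketch (step1 and step2 are made-up stages): rcompose() hands back a plain closure, so there is no list of stages a scheduler could split across cores.

from funcy import rcompose

def step1(xs):                       # made-up pipeline stages
    return [x + 1 for x in xs]

def step2(xs):
    return [x * 2 for x in xs]

pipeline = rcompose(step1, step2)    # applies step1 first, then step2
pipeline([1, 2, 3])                  # -> [4, 6, 8]
# pipeline is just a nested closure; step1/step2 are not exposed for introspection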
Ronny Pfannschmidt
@RonnyPfannschmidt
maybe compose is not exactly the right primitive to think about the problem
after all in the context of making things parallel one thinks of data processors and streams of data
Alexander Schepanovski
@Suor
Anyway, parallelization is definitely out of funcy's scope
Ronny Pfannschmidt
@RonnyPfannschmidt
however what compose does is chain a number of function calls
Alexander Schepanovski
@Suor
if streams are iterators, then composing imap() and ifilter() sounds like what you need )
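A rough sketch of that composition, using funcy 1.x lazy helpers (is_valid and normalize are made-up stage functions):

from functools import partial
from funcy import rcompose, imap, ifilter   # lazy variants in funcy 1.x

def is_valid(row):                          # made-up predicate
    return row is not None

def normalize(row):                         # made-up transform
    return row.strip().lower()

pipeline = rcompose(
    partial(ifilter, is_valid),
    partial(imap, normalize),
)
list(pipeline([' A ', None, 'b']))          # -> ['a', 'b'], lazy until list()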
Ronny Pfannschmidt
@RonnyPfannschmidt
well of course, but those are not exactly aware of parallel execution
Alexander Schepanovski
@Suor
they shouldn't be and they don't need to be
that's the beauty
Ronny Pfannschmidt
@RonnyPfannschmidt
in practice you need some sort of inversion of control to distribute the work efficiently among cores
Alexander Schepanovski
@Suor
why?
Ronny Pfannschmidt
@RonnyPfannschmidt
just think of filter: the default one is order-preserving, and you can't properly spread the execution of the filtering to n cores
Alexander Schepanovski
@Suor
why can't we? we chunk the input sequence, pass the chunks to cores, filter each chunk, then reassemble
filter is the same, we are just wrapping it
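The chunk-filter-reassemble idea could look roughly like this; it is only a sketch (is_even, the chunk size and pool size are arbitrary), not something funcy itself provides:

from multiprocessing import Pool
from funcy import cat, chunks

def is_even(x):                      # stand-in predicate
    return x % 2 == 0

def filter_chunk(chunk):             # runs in a worker process
    return [x for x in chunk if is_even(x)]

if __name__ == '__main__':
    data = range(1000)
    with Pool(4) as pool:
        # Pool.map preserves chunk order, so the reassembled result stays ordered
        result = list(cat(pool.map(filter_chunk, chunks(100, data))))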
Ronny Pfannschmidt
@RonnyPfannschmidt
however that makes presumptions about the input structure, and enforces some limitations on the output structure
Alexander Schepanovski
@Suor
presumptions like it's iterable? and returns an iterator or a list?
this is what transducers are for
Ronny Pfannschmidt
@RonnyPfannschmidt
hmm, my point of view might be a bit focused on giving too many options for making things efficiently parallel
M. Floering
@hangtwenty
very interesting conversation here. my one chime-in: the TransducerChain being similar to LINQ is nice. i agree that it's better not to pass the data in the first place, but i don't see why to avoid it ... just to avoid adding new abstractions
from funcy import rcompose
from funcy import transducers as t   # hypothetical module, sketched for discussion

process = rcompose(
    t.remove(is_useless),            # is_useless, process_row, write_chunk_to_db: made-up helpers
    t.map(process_row),
    t.chunks(100),
    t.each(write_chunk_to_db),
)

process(data)
feels clunky
the TransducerChain example given by @RonnyPfannschmidt isn't very abstract. it's a useful small amount of abstraction. it seems like a small price to pay for good usability, more declarative code
Alexander Schepanovski
@Suor
The thing described by Ronny is definitely out of funcy's scope, while mine could be in.
Anyway, I decided not to implement any of this. I don't really have a need for transducers; simple functions and their composition take care of everything I need. And since the first rule of open source is "you can't build something you don't use", I will let someone else write it.
I also think that transducers have little value in Python, they were designed to solve problems in Clojure that Python never had.
M. Floering
@hangtwenty
true true
I thought about this a little more and I also see the reason not to do classy transducers... why do method resolution you don't need to do
"first rule of open source -- you can't build something you don't use" is a great guideline haha
I haven't heard it actually.
Alexander Schepanovski
@Suor
Cause I made that up ;)
M. Floering
@hangtwenty
haha oh. right.
Alexander Schepanovski
@Suor
Thanks to Marcus McCurdy and Swaroop for pull requests
Alexander Schepanovski
@Suor
Added print_iter_durations() (and its log_* counterpart) several days ago - http://funcy.readthedocs.org/en/latest/debug.html#print_iter_durations
Going to release 1.6 soon. If you want something in, tell me now ).
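A quick usage sketch of print_iter_durations() from the linked docs (load_rows is a made-up data source):

import time
from funcy import print_iter_durations

def load_rows():                     # made-up data source
    for i in range(3):
        time.sleep(0.1)
        yield i

# Prints the duration of each iteration step, labelled 'rows'
for row in print_iter_durations(load_rows(), label='rows'):
    pass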
Mike Panciera
@averagehat
It looks like 1.7 introduced a possibly unintentional breaking change to map.
  File "/home/travis/build/VDBWRAIR/bio_bits/bio_bits/plot_muts.py", line 75, in get_relative_info

    muts = map(get_mutations, seqs)

  File "/home/travis/virtualenv/python3.4.2/lib/python3.4/site-packages/funcy/seqs.py", line 99, in map

    return _map(make_func(f, builtin=PY2), *seqs)

  File "/home/travis/virtualenv/python3.4.2/lib/python3.4/site-packages/funcy/cross.py", line 16, in map

    return list(_map(f, seq))

  File "/home/travis/virtualenv/python3.4.2/lib/python3.4/site-packages/funcy/funcs.py", line 15, in <lambda>

    pair = lambda f, g: lambda *a, **kw: f(g(*a, **kw))

  File "/home/travis/virtualenv/python3.4.2/lib/python3.4/site-packages/funcy/seqs.py", line 99, in map

    return _map(make_func(f, builtin=PY2), *seqs)

TypeError: map() takes 2 positional arguments but 3 were given
python 3.4
Mike Panciera
@averagehat
I haven't opened an issue because I haven't been able to reproduce it, which is odd because get_mutations is just a partially applied function
Alexander Schepanovski
@Suor
Should be fixed with Suor/funcy@cf2750b
Alexander Schepanovski
@Suor
Made a cheatsheet for funcy, an easier way to find whatever you need at the moment - http://funcy.readthedocs.io/en/latest/cheatsheet.html
Alexander Schepanovski
@Suor
A question has been brought up about a funcy.curried namespace that would contain autocurried versions of everything - Suor/funcy#48
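For context, funcy's existing autocurry() already gives roughly that style at a single call site; whether a whole funcy.curried namespace ships is what Suor/funcy#48 is about. A small sketch:

from funcy import autocurry, select_keys

# autocurry() lets arguments be supplied one at a time
pick_public = autocurry(select_keys)(lambda k: not k.startswith('_'))
pick_public({'_id': 1, 'name': 'x'})   # -> {'name': 'x'}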
William Mayner
@wmayner
Hi! I just came across funcy and was wondering if there's a comparison anywhere to toolz?
Alexander Schepanovski
@Suor
No comparison as far as I know. I consider funcy to be more practical, while toolz is more pure and academic.
Alexander Schepanovski
@Suor
@/all A question to everyone: how often do you use select_keys() and select_values()? How useful would a shortcut for select_keys(complement(pred), coll) be to you?
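For reference, the shortcut in question would amount to something like this (the name reject_keys is made up, not a funcy API):

from funcy import complement, select_keys

def reject_keys(pred, coll):
    # keep only the keys that do NOT satisfy pred
    return select_keys(complement(pred), coll)

reject_keys(lambda k: k.startswith('tmp_'), {'tmp_x': 1, 'name': 'x'})   # -> {'name': 'x'}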
Marcus Feitoza
@mfeitoza
Hi folks, can I use tree_nodes and tree_leaves with a dict?
Thank you.
Alexander Schepanovski
@Suor
@mfeitoza tree_nodes(dicts, follow=is_mapping, children=itervalues) like this?
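A minimal check of that suggestion on a made-up nested dict (tree_leaves works the same way but yields only the scalar values):

from funcy import tree_nodes, tree_leaves, is_mapping, itervalues

data = {'a': {'b': 1, 'c': {'d': 2}}, 'e': 3}

# Walks values of nested dicts; yields every node, sub-dicts as well as scalars
list(tree_nodes(data, follow=is_mapping, children=itervalues))
# Yields only the scalar leaves: 1, 2 and 3 (traversal order may vary)
list(tree_leaves(data, follow=is_mapping, children=itervalues))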