Tobias Pflug
@gilligan
basically just run mocha to run the individual file
@scott-christopher also just look at the output of the ramda repl
@scott-christopher it evaluates to false even though the test expects equality to return true
Scott Christopher
@scott-christopher
Yeah, the repl is running v0.18
Tobias Pflug
@gilligan
ah, d'oh now i get you
Scott Christopher
@scott-christopher
try running npm run build beforehand if you're running mocha directly
or npm run pretest && mocha test/equals
Tobias Pflug
@gilligan
let me try
that works
and I don't understand why
can you enlighten me ? :)
Scott Christopher
@scott-christopher
the tests use dist/ramda.js, so you need to make sure that has the latest build from the current source
The dist/ramda.js that is committed is the build from the most recent tagged release.
so when you ran the test without running npm run build prior, it was using ramda v0.18, which didn't yet have the equals implementation for Error
running npm run test implicitly runs npm run pretest by convention of npm scripts prefixed with pre (and likewise for post).
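For reference, a minimal package.json sketch of the pre/post convention being described (the script bodies are hypothetical, not Ramda's actual build setup):

```json
{
  "scripts": {
    "pretest": "npm run build",
    "test": "mocha",
    "build": "node scripts/build.js"
  }
}
```

With this layout, `npm run test` runs `pretest` (and thus the build) first automatically, but invoking `mocha` directly skips it.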
Tobias Pflug
@gilligan
actually it all boils down to the fact that i hadn't rebased my branch for too long ;}
Scott Christopher
@scott-christopher
or that :D
Tobias Pflug
@gilligan
and at the point where the repl result did not make sense, I didn't think of it simply not being up to date with master
Tobias Pflug
@gilligan
@scott-christopher either way - thanks for the quick heads up ;)
Scott Christopher
@scott-christopher
no worries :)
Tobias Pflug
@gilligan
Meh, just realized again that I have no idea/intuition about transducers - i need to fix that
Aldwin Vlasblom
@Avaq

@gilligan You can view it as a way to compose a computation, where every step takes the value(s) from the previous step, and emits zero or more new values for the next step. This allows the final computation to be used in any kind of reduction (map, filter, etc) of any kind of list (array, stream, generator, etc). The internal mechanism also has a built-in way to deal with short-termination (when doing a take(), for example).

Since you're extending the "step" function you end up with a means to "stream" your input list through the entire computation value by value, rather than doing .map().filter().take() and building up intermediate lists in between steps. So once you've coded a transducer it is more efficient, and more reusable than when you code a chain of operations.

I found watching this video over and over again quite helpful in gaining an understanding: https://www.youtube.com/watch?v=6mTbuzafcII
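As a rough sketch of the idea in plain JavaScript (hypothetical helper names, not Ramda's API): each transducer wraps the step function, so a composed transform processes one value at a time with no intermediate lists.

```javascript
// A transducer is a function from step function to step function.
const mapT = f => step => (acc, x) => step(acc, f(x));
const filterT = pred => step => (acc, x) => (pred(x) ? step(acc, x) : acc);

// Right-to-left composition, like R.compose
const compose = (...fns) => x => fns.reduceRight((v, f) => f(v), x);

// Run a transducer against a plain array with a given step and seed
const transduce = (xf, step, init, xs) => xs.reduce(xf(step), init);

const append = (acc, x) => (acc.push(x), acc);

// Doubles each value, then keeps only those > 4 — one pass, no intermediate array
const xf = compose(mapT(x => x * 2), filterT(x => x > 4));
const result = transduce(xf, append, [], [1, 2, 3, 4]); // → [6, 8]
```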

Tobias Pflug
@gilligan
Thanks, on the surface all of that makes perfect sense. I just need to identify appropriate opportunities to use transducers
@Avaq i think it boils down to me having to play around with transducers ;)
Aldwin Vlasblom
@Avaq
@gilligan Just use them any time you have a sequence of steps, if only for the performance gain. Like instead of going pipe(map(f), filter(g))(arr), go into([], compose(map(f), filter(g)), arr). I really like using them for processing object streams in Node with transduce-stream, because it allows me to work using my normal Ramda workflow on Streams:
import throughx from 'transduce-stream';
stream.pipe(throughx(compose(
  map(f),
  filter(g),
  take(5)
), {objectMode: true}))
Tobias Pflug
@gilligan
@Avaq currently working in a big React project with ramda. Suppose I will just have a look at the various scenarios where I apply a transformation pipeline via R.pipe
Keith Alexander
@kwijibo
@gilligan this blog post is what helped me grok transducers http://phuu.net/2014/08/31/csp-and-transducers.html
boxofrox
@boxofrox
@Avaq, your pipe and compose from :point_up: December 12, 2015 10:11 AM have the same order of operations. I take it transducers do something to reverse the order of operations?
Raine Virta
@raine
they don't exactly do anything, it's inherent to the way they compose
it took me a while to understand why it happens
boxofrox
@boxofrox
Thanks, @raine. I'll dig around and see if can figure it out, too. I vaguely recall Rich Hickey's talk covered that composition bit. Think I'll start there.
Tobias Pflug
@gilligan
Can someone merge ramda/ramda#1541 ? master is broken right now
boxofrox
@boxofrox
Note to self, don't use Ramda functions in undocumented ways. In 0.17, invoker(1, 'getResponseHeader', 'content-type') worked fine, but was documented to be used as invoker(1, 'getResponseHeader')('content-type'). I suspect the currying of pipe and compose had something to do with that. Upgrade to 0.18 and I run into errors with functions further down the pipe because 'content-type' was effectively dropped. Doh.
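For illustration, here is a plain-JS sketch of what the documented, curried form does (invoker1 and fakeXhr are hypothetical stand-ins, not Ramda's implementation or the real XHR API):

```javascript
// Roughly what R.invoker(1, name) yields when used as documented:
// first the method argument, then the target object.
const invoker1 = name => arg => obj => obj[name](arg);

const getHeader = invoker1('getResponseHeader');

// hypothetical stand-in object, just for demonstration
const fakeXhr = {
  getResponseHeader: k => (k === 'content-type' ? 'text/html' : null)
};

const contentType = getHeader('content-type')(fakeXhr); // → 'text/html'
```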
Aldwin Vlasblom
@Avaq
@boxofrox Imagine you're building up your "step" function by wrapping the previous step functions. That's what happens when you compose a transducer: compose(a, b, c) = (from right to left) "c is wrapped by b is wrapped by a", so you end up with a transformation that applies a first; a then calls b, and b then calls c.
So you compose the transducer right-to-left, but end up with a step function that executes "left to right", so to say.
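A sketch of the contrast in plain JS (hypothetical names): composing the same two transforms as ordinary array functions runs right-to-left, while composing them as step-function wrappers makes the leftmost transform touch each value first.

```javascript
const compose = (...fns) => x => fns.reduceRight((v, f) => f(v), x);

// Ordinary function composition over whole arrays: rightmost runs first.
const incAll = xs => xs.map(x => x + 1);
const keepEven = xs => xs.filter(x => x % 2 === 0);
const viaArrays = compose(incAll, keepEven)([1, 2, 3]); // filter first: [2] → [3]

// The same shapes as step-function wrappers: leftmost transform runs first.
const mapT = f => step => (acc, x) => step(acc, f(x));
const filterT = p => step => (acc, x) => (p(x) ? step(acc, x) : acc);
const append = (acc, x) => (acc.push(x), acc);

const viaTransducer = [1, 2, 3].reduce(
  compose(mapT(x => x + 1), filterT(x => x % 2 === 0))(append),
  []
); // map first: 2, 3, 4 → evens → [2, 4]
```

Same compose call, different effective order — which is the asymmetry being asked about.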
Aldwin Vlasblom
@Avaq

With a transducer you're really just extending a step function. Let's take into as an example. Without a transformation over the step function, all into([], identity, list) does is iterate list and append every item to []. So the step function here is append. However, I could choose to extend the step function, by not passing identity, but (a,b -> b) -> (a,b -> b) (note how the functions have the same signature as append). So the function I'm passing to into([], f, list) takes the step function as a first argument, and returns a new step which wraps the old one in order to apply some custom logic (like transforming its argument, in the case of map). In the case of a composition of these "step transformers" that means we're just taking this step function, and threading it through the pipeline, decorating it along the way. In the end we're left with append, but wrapped many times in order to do all sorts of things with its argument before it's finally called (or not called, if you're using filter as one of its "decorators").

Now, that's a bit simplified, in reality our step function is actually three functions (contained in an object), one init (like "setup") one step and one result (like "teardown"), but the idea remains the same.
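A sketch of that fuller shape (hypothetical names, simplified from the real transducer protocol): the "step function" is really an object with init/step/result, and a transducer maps one such object to another.

```javascript
// Simplified transducer protocol: { init, step, result }
const arrayXf = {
  init: () => [],                         // "setup": fresh accumulator
  step: (acc, x) => (acc.push(x), acc),   // append one value
  result: acc => acc                      // "teardown": finalize
};

// A map "decorator" over such an object: transforms the value, delegates the rest
const mapXf = f => xf => ({
  init: xf.init,
  step: (acc, x) => xf.step(acc, f(x)),
  result: xf.result
});

// An into-like runner
const transduceInto = (init, wrap, list) => {
  const xf = wrap(arrayXf);
  return xf.result(list.reduce(xf.step, init));
};

const out = transduceInto([], mapXf(x => x * 2), [1, 2, 3]); // → [2, 4, 6]
```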

Tobias Pflug
@gilligan
@Avaq so taking some random example, can I write https://gist.github.com/gilligan/75f1b2277f92c751a476 with a transducer instead ?
boxofrox
@boxofrox
@Avaq, thanks for the explanation.
Tobias Pflug
@gilligan
@Avaq guess not. quite simply because R.toPairs is not a transducer
Aldwin Vlasblom
@Avaq
@gilligan I believe the mapKeys function from your example takes in an object as its argument (http://goo.gl/qGer0u). Transducers are really only useful if you're transforming multiple iterations of data the same way. There is a function inside your example which does just that (adjustKeyPrefixes), so you could rewrite that to be a transducer. However, it wouldn't add much since it's only a single step: map. :)
Tobias Pflug
@gilligan
@Avaq yeah i suppose it's interesting when you are streaming data in some way - not really doing that at the moment hehe. I am still interested to learn about it though. will spend some time reading the (readily available) material
Aldwin Vlasblom
@Avaq
It's certainly nice for streaming, because transducers act upon single values rather than entire collections at once, so you don't need anything buffered up to use a transducer on it. However, you could just as well use it on buffered collections like Arrays. And I can't imagine many useful programs that don't transform values in lists at some point: Lists of users, lists of posts, lists of click-events (if you're doing reactive), etc. I'm sure you'll run into a list that you have to map over or filter, or slice. And when you're doing multiple of those kinds of operations in sequence on the same list, a transducer is probably better at it. :)
Tobias Pflug
@gilligan
sure, plenty ;}
Tobias Pflug
@gilligan
hm, I am agonising about how this could be written more efficiently: https://gist.github.com/gilligan/dc1409dee420798c73dc ( @Avaq you see I extracted that one function from there). I have to modify keys and values of an object, and currently this happens in two successive traversals, which I already don't like but don't see right now how to avoid. Any ideas anyone ? ;-)
Aldwin Vlasblom
@Avaq
I did a small review sweep of the code; http://goo.gl/uyFGIF, didn't really solve the two-traversals thing, it's difficult since one traversal is recursive and the other isn't. But my comments might be helpful.
Off to bed now, see you.
Tobias Pflug
@gilligan
@Avaq thanks!
Aldwin Vlasblom
@Avaq
I want to learn about type transformers, like ReaderT. Can somebody give me a brief explanation, example or resource so I can get started? What is their purpose? Is there a benefit to using a ReaderTFuture over a Reader[Future]? I played a bit and can achieve the same with both (though the ReaderT is more convenient): http://goo.gl/J2J9lS. Am I even using it right? Is it merely a convenience?
Scott Sauyet
@CrossEye
Sorry, Avaq, still haven't spent any time with them.

Thinking about #1543, I became puzzled by something, and I was wondering if anyone has any insight:

Because it's easy to work with, many of our examples of lift use add, so we might define mAdd = lift(add). Well, add is a straightforward function, and it happens to be commutative: add(a, b) ≡ add(b, a). It simply seemed obvious that mAdd would also be commutative. In simple cases, it is:

mAdd(Maybe.Just(10), Maybe.Just(32)); //=> Maybe.Just(42)
mAdd(Maybe.Just(32), Maybe.Just(10)); //=> Maybe.Just(42)

But it's definitely not:

R.lift(R.add)(Either.Right(10), Either.Left('oops')); //=> Either.Left('oops')
R.lift(R.add)(Either.Left('oops'), Either.Right(10)); //=> Either.Right(10)

My question is: Is my intuition leading me astray? Or should we expect that if fn is commutative, then lift(fn) is also commutative? If so, is there something wrong with lift or one of its dependencies? Or is there something the matter with Either?
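One way to probe the question is with a minimal, conventional Either applicative in plain JS (a hypothetical sketch, not necessarily the implementation the snippets above were run against). Under the usual semantics, lifting sequences the effects left-to-right, so with two different Lefts the order matters even though add itself is commutative:

```javascript
// Minimal conventional Either: Left short-circuits map and ap.
const Left = v => ({ isLeft: true, value: v, map: _ => Left(v), ap: _ => Left(v) });
// For Right, `v` is a curried function when ap is called.
const Right = v => ({ isLeft: false, value: v, map: f => Right(f(v)), ap: m => m.map(v) });

const add = (x, y) => x + y;
// lift2(f)(a, b): map the curried f over a, then apply to b — effects run a-then-b.
const lift2 = f => (a, b) => a.map(x => y => f(x, y)).ap(b);

lift2(add)(Right(10), Right(32)).value; // 42
lift2(add)(Left('a'), Left('b')).value; // 'a' — first effect wins
lift2(add)(Left('b'), Left('a')).value; // 'b' — so lift2(add) is order-dependent
```

Under these semantics lift(fn) is generally not commutative even when fn is, because the applicative fixes an effect order; both orders above should at least agree in returning the *first* Left, which the R.lift output quoted earlier does not.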