Raitis Veinbahs
@siers
is there a shortcut for Some(x).filter(pred)?
Fabio Labella
@SystemFw
not that I know of, there are a couple of roughly equivalent things
like Option.when(f(x))(x) in Scala 2.13, or f(x).guard[Option].as(x), but overall I think x.some.filter(f) is the nicest
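For reference, the three equivalent spellings side by side (a minimal sketch assuming cats syntax imports; x and f are made up for illustration):

import cats.syntax.all._

val x = 42
val f: Int => Boolean = _ > 0

Option.when(f(x))(x)     // Some(42): Scala 2.13 stdlib
f(x).guard[Option].as(x) // Some(42): cats Alternative syntax
x.some.filter(f)         // Some(42)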
ritschwumm
@ritschwumm
this could be generalized to any Alternative, right?
import cats.Alternative

implicit class Ops[T](value: T) {
  def when[F[_]](pred: T => Boolean)(implicit F: Alternative[F]): F[T] =
    if (pred(value)) F.pure(value)
    else F.empty[T]
}

x.when[Option](pred)
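A quick usage sketch of the generalized version, assuming the fixed Ops above and cats' Alternative instances are in scope (the values and predicate are made up):

val pred: Int => Boolean = _ > 0

5.when[Option](pred) // Some(5)
5.when[List](pred)   // List(5)
0.when[Option](pred) // None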
Drew Boardman
@drewboardman
what's a good purely functional config library?
just for storing and passing configuration values?
sinanspd
@sinanspd
@drewboardman I would highly recommend ciris, assuming you are dealing with key-value-pair style configs
Drew Boardman
@drewboardman
ok, and this is better than pureconfig?
sinanspd
@sinanspd
Depends on what you want to do. If you want to work with config files, pureconfig is the way to go; I don't know whether ciris actually works with those. If you are going to use environment variables and system properties, ciris is the way to go. It has more features as far as I'm concerned, including third-party integrations for AWS and Kubernetes.
There is a discussion in the industry about config files and whether storing passwords and the like in them is actually good practice. That's sort of where ciris comes in: it has a notion of sensitive data and containers like Secret to give you more control over how and where they are accessed
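To give a sense of what that looks like, here is a minimal ciris sketch, assuming ciris 2.x with cats-effect; the environment variable names and the AppConfig case class are made up for illustration:

import cats.effect.IO
import cats.syntax.all._
import ciris._

final case class AppConfig(host: String, port: Int, apiKey: Secret[String])

val appConfig: ConfigValue[Effect, AppConfig] =
  (
    env("APP_HOST").default("localhost"),
    env("APP_PORT").as[Int].default(8080),
    env("APP_API_KEY").secret // Secret redacts the value when shown or logged
  ).parMapN(AppConfig.apply)

val load: IO[AppConfig] = appConfig.load[IO]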
Drew Boardman
@drewboardman
ok that sounds like what I'm looking for
thanks mate
Raitis Veinbahs
@siers
@SystemFw x.some.filter is nice indeed, thanks
nabih.nebbache
@ThegreatShible
Hi guys, I am new here and I need a data type that implements reduce without reduceLeft or reduceRight. I am doing this to generalize over sequential and parallel collections. Thank you
Rob Norris
@tpolecat
Can you say a little more? You want to abstract over reducible things?
Gavin Bisesi
@Daenyth

generalize over sequential and parallel collections

This sounds suspicious

If you want parallel processing then I'd recommend avoiding the stdlib par collections and instead using something like parTraverse or fs2
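For instance, a tiny parTraverse sketch with cats-effect (the process function is just a made-up stand-in for real work):

import cats.effect.IO
import cats.syntax.all._

// Hypothetical unit of work, standing in for real processing.
def process(i: Int): IO[Int] = IO(i * 2)

// Runs the effects in parallel and collects results in input order.
val results: IO[List[Int]] = List(1, 2, 3, 4).parTraverse(process)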
nabih.nebbache
@ThegreatShible
I just want a data type that implements reduce without reduceLeft and reduceRight, because we find reduce in all collections, but reduceLeft and reduceRight are not present in parallel collections. Imagine that I want to reduce over a collection without caring whether it's a List, a Spark RDD, or a parallel collection from the stdlib
Gavin Bisesi
@Daenyth
I mean
that boils down to "I want to reduce a collection without caring about the performance"
because those have very different characteristics
but you want a version of foldMap that takes CommutativeMonoid as a constraint
so that's the method the data type would be built around
you mention Spark so maybe frameless can help here
or scio
Gavin Bisesi
@Daenyth
import cats.kernel.CommutativeMonoid

trait MapReduce[F[_]] {
  def mapReduce[A, B: CommutativeMonoid](fa: F[A])(f: A => B): B
}
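To make the shape concrete, a hypothetical instance for List, assuming the MapReduce trait above (a real parallel-collection or RDD instance would combine per-partition results the same way):

import cats.kernel.CommutativeMonoid

val listMapReduce: MapReduce[List] = new MapReduce[List] {
  def mapReduce[A, B: CommutativeMonoid](fa: List[A])(f: A => B): B = {
    val M = CommutativeMonoid[B]
    // Sequential here; a parallel implementation would fold each chunk
    // and then combine the per-chunk results.
    fa.foldLeft(M.empty)((acc, a) => M.combine(acc, f(a)))
  }
}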
Rob Norris
@tpolecat
Generic reduce should pick right/left if one is faster so it should never be worse.
nabih.nebbache
@ThegreatShible
I did that
trait Reduce[F[_]] {
  def reduce[A](fa: F[A])(f: (A, A) => A): A
}
Rob Norris
@tpolecat
Yeah.
nabih.nebbache
@ThegreatShible
With that I can reduce over any collection, but I was wondering whether there is an already existing data type like that in cats
Rob Norris
@tpolecat
If you're using Cats then Reducible will do what you want but I suspect you can't define instances for RDD and so on.
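For completeness, Reducible in action on a structure that does have an instance, NonEmptyList (the values are made up):

import cats.data.NonEmptyList
import cats.implicits._

// reduce needs a Semigroup for the element type; here it's Int addition.
val total: Int = NonEmptyList.of(1, 2, 3, 4).reduce // 10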
nabih.nebbache
@ThegreatShible
yes, that's the problem
Reducible contains reduceLeft and reduceRight, which don't exist for RDDs (or any parallel collection)
Rob Norris
@tpolecat
Oh I see.
nabih.nebbache
@ThegreatShible
So I think I am going with my own trait
Rob Norris
@tpolecat
Yeah.
Gavin Bisesi
@Daenyth
Also if you want it to be appropriate for RDD/par collections you need that CommutativeMonoid (or at least CommutativeSemigroup) constraint
Rob Norris
@tpolecat
It doesn't need to be commutative unless reduce scrambles the elements.
It does need to be associative though.
Gavin Bisesi
@Daenyth
the par one would, no?
and RDD?
nabih.nebbache
@ThegreatShible
No only associative, no need for commutative
Rob Norris
@tpolecat
As long as parallel reduce combines the chunks in order it only needs to be associative. If it jumbles them up then yeah commutative.
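A concrete way to see the distinction: string concatenation is associative but not commutative, so it is only safe if the parallel reduce combines chunks in their original order (values made up for illustration):

// Any bracketing of an in-order combination agrees:
("a" + "b") + ("c" + "d") // "abcd"
"a" + (("b" + "c") + "d") // "abcd"

// But if the runtime reorders the chunks, the result changes:
("c" + "d") + ("a" + "b") // "cdab"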
Ethan
@esuntag
The Spark documentation (at least for the latest version) specifies that reduce needs to be commutative and associative
I think reduce is allowed to force an arbitrary shuffle
Rob Norris
@tpolecat
Yeesh.
Ethan
@esuntag
I might be misremembering about the shuffle; it might just be reduceByKey that forces it. Either way, I don't think the elements in an RDD are guaranteed to be in any specific order
You can maybe write a custom reader/partitioner that tries to provide those guarantees, but you'll be much better off just asking for a CommutativeMonoid