These are chat archives for dry-rb/chat

4th Dec 2018
Jules Ivanic
@guizmaii
Dec 04 2018 15:43

Hi everyone,

First, I want to thank all of the people working on this project! It’s an awesome project! :clap:

Second, is there a .sequence method provided in dry-rb?

The signature of sequence is: F[G[A]] -> G[F[A]]
Nikita Shilnikov
@flash-gordon
Dec 04 2018 15:46
@guizmaii called w/o a block, traverse acts as sequence https://github.com/dry-rb/dry-monads/blob/master/lib/dry/monads/list.rb#L263-L286
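For illustration, a minimal sketch of traverse acting as sequence, assuming dry-monads 1.x and the Result monad (the values are only illustrative):

require "dry/monads"
require "dry/monads/list"

include Dry::Monads[:result]

# Without a block, traverse flips List[Result[A]] into Result[List[A]],
# i.e. it behaves as sequence.
Dry::Monads::List::Result[Success(1), Success(2), Success(3)].traverse
# => Success(List[1, 2, 3])

# A single Failure short-circuits the whole sequence.
Dry::Monads::List::Result[Success(1), Failure(:boom)].traverse
# => Failure(:boom)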
Jules Ivanic
@guizmaii
Dec 04 2018 15:46
ok thanks :)
Jules Ivanic
@guizmaii
Dec 04 2018 16:00
does it work on a Ruby Array?
Nikita Shilnikov
@flash-gordon
Dec 04 2018 16:02
nope, we don't monkey patch core classes, it's against our philosophy :)
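To make that concrete: you wrap the plain Array instead of patching it. A hedged sketch, assuming dry-monads 1.x and the Result monad:

require "dry/monads"
require "dry/monads/list"

include Dry::Monads[:result]

ruby_array = [Success(1), Success(2)]

# The Array itself stays untouched; List.coerce wraps it, and typed tells
# traverse which monad to combine the elements with.
Dry::Monads::List.coerce(ruby_array).typed(Dry::Monads::Result).traverse
# => Success(List[1, 2])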
Jules Ivanic
@guizmaii
Dec 04 2018 16:03
+1
awesome philosophy!
Jules Ivanic
@guizmaii
Dec 04 2018 16:12
Do you think this makes sense: atomic_rows = Concurrent::AtomicReference.new(Dry::Monads::List.[])?
I need to update this list, but my computation runs in parallel
I’m very new to concurrency in Ruby :/

Hmm, maybe this is better:

future_rows = Dry::Monads::List.[]
requests.find_in_batches { |data|
  future_rows += Task[:io] { compute_rows(data) } 
}

future_rows.traverse { |rows|
  …
}

WDYT?

Nikita Shilnikov
@flash-gordon
Dec 04 2018 16:21
future_rows = Concurrent::Array.new
requests.find_in_batches { |data|
  future_rows << Task[:io] { compute_rows(data) } 
}

Dry::Monads::List.coerce(future_rows.to_a).traverse { |rows|
  …
}
@guizmaii I think this one will work better,
since Monads::List isn't meant to be mutable
and, actually, you can use future_rows = [],
because it's only mutated from the same thread
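Putting the pieces together, a sketch of the whole flow under those assumptions (dry-monads 1.x, ActiveRecord's find_in_batches; requests and compute_rows are the placeholders from the snippets above):

require "dry/monads"
require "dry/monads/list"
require "dry/monads/task"

include Dry::Monads[:task]

tasks = []

# find_in_batches yields in the calling thread, so a plain Array is safe;
# only the Tasks themselves run on the :io thread pool.
requests.find_in_batches do |data|
  tasks << Task[:io] { compute_rows(data) }
end

# Typing the list as Task lets traverse turn List[Task[rows]] into
# Task[List[rows]]: one Task that resolves once every batch is computed.
all_rows = Dry::Monads::List.coerce(tasks).typed(Dry::Monads::Task).traverse

# Assuming compute_rows returns an Array of rows per batch, flatten them.
all_rows.fmap { |batches| batches.to_a.flatten(1) }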
Jules Ivanic
@guizmaii
Dec 04 2018 16:23
Hmm, you’re right! Thanks :)
is it costly to coerce?
if the array contains a lot of elements (up to 100,000)
Nikita Shilnikov
@flash-gordon
Dec 04 2018 16:36
not really
O(n) max
you can do List.new, it's O(1)
I bet traversing will be way more expensive tbh, it's not optimized for long lists
otoh, it may not be that bad
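A quick hedged sketch of that cheaper construction (future_rows being the array of Tasks from above; a type is still needed before traversing):

# coerce goes through to_ary; List.new just wraps the array it is given.
Dry::Monads::List.new(future_rows).typed(Dry::Monads::Task).traverse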
Nikita Shilnikov
@flash-gordon
Dec 04 2018 16:42
btw, if you use find_in_batches then future_rows.size will be equal to total_rows / batch_size (e.g. ~100 tasks for 100,000 rows with the default batch size of 1,000)
which is better
I think you should give it a chance and see if something can be improved later
Jules Ivanic
@guizmaii
Dec 04 2018 16:44
ok, thanks for your precise answers! :)
Jules Ivanic
@guizmaii
Dec 04 2018 17:40
Thanks for your help @flash-gordon ! 🙂
Nikita Shilnikov
@flash-gordon
Dec 04 2018 20:07
sure, np