Fabio Labella
@SystemFw
mm actually maybe that's good enough
Gavin Bisesi
@Daenyth
I think it should be
he wants a strict zip ordering
Fabio Labella
@SystemFw
it's not as parallel as it could be but it might be enough
Gavin Bisesi
@Daenyth
right
it's a hell of a lot easier to implement
the full concurrent approach is likely a full page of code that you then need extensive tests for
Fabio Labella
@SystemFw
yeah, I agree
Paulius Imbrasas
@CremboC
I think I'll just stick to what I have :P thanks a lot anyway
my IO and parMapN works fine already and is much simpler... the only downside is it's eager and not lazy like Stream
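A minimal sketch of the IO-and-parMapN approach described above, assuming cats-effect 2; `ioA`, `ioB`, and `f` are hypothetical stand-ins for the effects and the combining function mentioned in the chat:

```scala
import scala.concurrent.ExecutionContext
import cats.effect.{ContextShift, IO}
import cats.syntax.all._

// Parallel[IO] in cats-effect 2 needs a ContextShift in scope
implicit val cs: ContextShift[IO] = IO.contextShift(ExecutionContext.global)

// Run both effects in parallel and apply f once both full results are available
// (eager: everything completes before f sees anything, unlike a Stream)
def combineOnce[A, B, C](ioA: IO[A], ioB: IO[B])(f: (A, B) => C): IO[C] =
  (ioA, ioB).parMapN(f)
```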
Gavin Bisesi
@Daenyth
I'd try my prefetch one, it's pretty simple and I think it would get you there
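One possible reading of the prefetch idea, as a hedged sketch against the fs2 1.x API (the name `zipPrefetched` is made up, and this may not be exactly the suggestion referred to): each side pre-pulls its next element on a background fiber, so the two pulls overlap while `zip` keeps its strict pairing.

```scala
import cats.effect.Concurrent
import fs2.Stream

// Hypothetical helper: prefetch buffers one element per side on its own fiber,
// then zip pairs elements strictly in order, halting when either side halts.
def zipPrefetched[F[_]: Concurrent, A, B](
    left: Stream[F, A],
    right: Stream[F, B]
): Stream[F, (A, B)] =
  left.prefetch.zip(right.prefetch)
```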
Fabio Labella
@SystemFw
with prefetch you also lose the "laziness"
you get a much more push-based behaviour
Paulius Imbrasas
@CremboC
I just looked at my f, and while it's cheap, it's non-trivial to convert it to support streams
Fabio Labella
@SystemFw
by definition of pre-fetch
but yeah, basically even just from your description it feels like, the way it's currently structured, your problem is more about IO: you get a result and finish
IO as in a single-element effect, vs multiple
Paulius Imbrasas
@CremboC
yeah, and my f needs the whole result of both IOs...
Fabio Labella
@SystemFw
just imagine a scenario where the two streams emit a different number of elements: what do you do? discard all leftovers? or use an old value for one side to combine with all the leftovers?
Paulius Imbrasas
@CremboC
just throw new IllegalStateException
:D
Fabio Labella
@SystemFw
we do have a nondeterministic zip, but it uses the latest, it doesn't wait for two new values as you wish
Gavin Bisesi
@Daenyth
It does feel like there could be a place for a concurrentZip combinator
that has the same semantics as zip, except that it allows the two to be concurrent rather than sequential
Fabio Labella
@SystemFw
there is one
but it uses the latest
Gavin Bisesi
@Daenyth
right, I mean by same semantics that it will concurrently evaluate both streams as far as it needs to get the next pair
Fabio Labella
@SystemFw
ah, do you want one that discards leftovers?
Gavin Bisesi
@Daenyth
no, strict order zipping like zip, as he says
Fabio Labella
@SystemFw
what do you do when the two streams emit a different number of elements?
Gavin Bisesi
@Daenyth
same as zip
Fabio Labella
@SystemFw
discard leftovers or hang?
Gavin Bisesi
@Daenyth
If neither stream terminates, then hang
if one terminates, terminate
All the same behavior as zip, just running the left pull and right pull concurrently
Fabio Labella
@SystemFw
> if one terminates, terminate

so, discard leftovers
Gavin Bisesi
@Daenyth
sure
Fabio Labella
@SystemFw
doable, but it strikes me as quite marginal, and probably needs to act on single elements and not chunks
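A hedged sketch of what an element-wise concurrentZip with the semantics agreed above (strict pairing, halt when either side halts, leftovers discarded) might look like; the name `concurrentZip` and the queue-based shape are illustrative choices, not an fs2 combinator, assuming the fs2 1.x `fs2.concurrent.Queue` API:

```scala
import cats.effect.Concurrent
import fs2.Stream
import fs2.concurrent.Queue

// Hypothetical combinator, element-wise rather than chunk-aware:
// the right stream is pulled on a background fiber into a synchronous queue,
// so its next element is produced while the left pull runs; zip keeps strict
// ordering, halts when either side halts, and discards any leftovers.
def concurrentZip[F[_]: Concurrent, A, B](
    left: Stream[F, A],
    right: Stream[F, B]
): Stream[F, (A, B)] =
  Stream.eval(Queue.synchronousNoneTerminated[F, B]).flatMap { q =>
    left
      .zip(q.dequeue)
      .concurrently(right.noneTerminate.through(q.enqueue))
  }
```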
Andrey
@404-
> we do have a nondeterministic zip, but it uses the latest, it doesn't wait for two new values as you wish

what is it named?
Fabio Labella
@SystemFw
you need to go through Signal
so, get signals from stream using hold, combine the signals applicatively with mapN, then get a Stream back with discrete
Andrey
@404-
just to confirm (it does seem to work):
// assumes fs2 1.x, cats.implicits._, and implicit ContextShift[IO] / Timer[IO] in scope (for hold and metered)
val nums  = Stream.range[IO](1, 10).metered(1.second)
val chars = Stream.range[IO]('a', 'z' + 1).map(_.toChar).metered(500.millis)
val zipped = for {
  a <- nums.hold(0)
  b <- chars.hold('\u0000')
  c <- (a, b).tupled.discrete
} yield c
Peter Aaser
@PeterAaser
Add scala after the backticks for proper formatting
:+1:
Andrey
@404-
@PeterAaser done - thank you for the hint, will definitely use it from now on
Peter Aaser
@PeterAaser
no worries :)
Andrey
@404-
hmm the above does seem to "hang", so something isn't getting closed/cleaned up. ideas?
"hang" after both streams have finished
Peter Aaser
@PeterAaser
Not looked very closely at what you're doing, but you might need noneTerminate
Since you can no longer tell for sure if the stream is terminated or not
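A hedged sketch of how the noneTerminate suggestion could be applied to the snippet above (fs2 1.x / cats-effect 2 assumed; the sentinel initial values and the takeWhile/collect step are illustrative choices, not the only way): wrapping each element in Some and ending with a final None makes termination visible through hold, so the combined discrete stream can stop once either side has finished instead of hanging.

```scala
import scala.concurrent.duration._
import cats.effect.{ExitCode, IO, IOApp}
import cats.syntax.all._
import fs2.Stream

object NoneTerminatedExample extends IOApp {
  // IOApp supplies the implicit ContextShift[IO] and Timer[IO] that hold/metered need
  val nums  = Stream.range[IO](1, 10).metered(1.second)
  val chars = Stream.range[IO]('a', 'z' + 1).map(_.toChar).metered(500.millis)

  val zipped: Stream[IO, (Int, Char)] = for {
    a <- nums.noneTerminate.hold(Option(0))         // Some(element), then None when nums ends
    b <- chars.noneTerminate.hold(Option('\u0000')) // note: the initial values produce one synthetic first pair, as in the original snippet
    c <- (a, b).tupled.discrete
           .takeWhile { case (n, ch) => n.isDefined && ch.isDefined } // stop once either side is done
           .collect { case (Some(n), Some(ch)) => (n, ch) }
  } yield c

  def run(args: List[String]): IO[ExitCode] =
    zipped.evalMap(p => IO(println(p))).compile.drain.as(ExitCode.Success)
}
```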