Christopher Davenport
@ChristopherDavenport
So basically I'd need 3 parsers, 1 for Content-Length based, 1 for Chunked Encoding, and 1 for Multipart.
Adam Chlupacek
@AdamChlupacek
exactly, since they all behave differently in terms of how the body is presented.
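An illustrative sketch of that dispatch, with placeholder parsers rather than fs2-http's or http4s' actual API:

    import fs2.Pipe

    // placeholders standing in for the three body parsers; not a real library API
    def fixedLength[F[_]](n: Long): Pipe[F, Byte, Byte] = _.take(n)
    def chunked[F[_]]: Pipe[F, Byte, Byte] = s => s                          // would decode chunked framing
    def multipart[F[_]](contentType: String): Pipe[F, Byte, Byte] = s => s   // would split on the boundary

    // pick a parser based on how the body is framed
    def bodyParser[F[_]](headers: Map[String, String]): Pipe[F, Byte, Byte] =
      if (headers.get("Transfer-Encoding").contains("chunked")) chunked[F]
      else headers.get("Content-Length") match {
        case Some(len) => fixedLength[F](len.toLong)
        case None      => multipart[F](headers.getOrElse("Content-Type", ""))
      }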
Christopher Davenport
@ChristopherDavenport
Alright. I now think I have connected the dots. Thank you so much @AdamChlupacek I was staring at that wall of infinite and coming up short.
Adam Chlupacek
@AdamChlupacek
No problem, sorry I took so long to answer
Christopher Davenport
@ChristopherDavenport
As a question: why did you need readWithTimeout in fs2-http when the read calls already have timeouts in them? So you recursively shorten the timeout across the read calls?
Adam Chlupacek
@AdamChlupacek
I am not sure which part you are referring to now, but yeah, we do recursively shorten the time for the read calls, so that the timeout provided by the user is really a timeout of the whole operation, not just one chunk of data.
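A minimal sketch of that recursive shortening, assuming fs2 1.x's fs2.io.tcp.Socket; fs2-http's actual readWithTimeout differs in its exact signature:

    import java.util.concurrent.{TimeUnit, TimeoutException}
    import scala.concurrent.duration._
    import cats.effect.{Clock, Sync}
    import fs2.Stream
    import fs2.io.tcp.Socket

    // the user-supplied `total` bounds the whole read, not each individual chunk
    def readWithTimeout[F[_]: Sync: Clock](socket: Socket[F], total: FiniteDuration, chunkSize: Int): Stream[F, Byte] = {
      def go(remaining: FiniteDuration): Stream[F, Byte] =
        if (remaining <= Duration.Zero) Stream.raiseError[F](new TimeoutException("read timed out"))
        else
          Stream.eval(Clock[F].monotonic(TimeUnit.MILLISECONDS)).flatMap { start =>
            Stream.eval(socket.read(chunkSize, Some(remaining))).flatMap {
              case None => Stream.empty                       // end of input
              case Some(bytes) =>
                Stream.eval(Clock[F].monotonic(TimeUnit.MILLISECONDS)).flatMap { end =>
                  // shorten the budget by however long this read took
                  Stream.chunk(bytes) ++ go(remaining - (end - start).millis)
                }
            }
          }
      go(total)
    }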
Christopher Davenport
@ChristopherDavenport
We're talking about the same calls. Very cool. I finally think the knowledge is starting to fall into place.
Justin Heyes-Jones
@justinhj
Are there answers somewhere to the exercises in the fs2 docs? I like to compare what I've done with the recommended way to do things
Fabio Labella
@SystemFw

@justinhj are you referring to:

Implement repeat, which repeats a stream indefinitely, drain, which strips all output from a stream, eval_, which runs an effect and ignores its output, and attempt, which catches any errors produced by a stream:

?

You can just look at the code to see how they are implemented
that applies also to the exercises in the subsequent sections
they are all operations already in the library
Justin Heyes-Jones
@justinhj
Ah I see. Yes I can do that. As a new user to the library I found some of the implementations a bit intimidating because, for example, eval_ is implemented using eval, which uses the internal private types FreeC and Algebra. I thought the intent would be to implement them more from the library user's point of view.
Fabio Labella
@SystemFw
yeah, that's the point
you can treat eval as a primitive
and in fact, it is a primitive, that's why it goes to Algebra
so how do you implement eval_ using eval and the other things you have learned so far?
you are not supposed to implement eval_ from scratch :)
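For anyone comparing notes, rough sketches of those exercises in terms of public combinators (not the library's exact source):

    import fs2.Stream

    def repeat[F[_], O](s: Stream[F, O]): Stream[F, O] =
      s ++ repeat(s)                                // restart the stream once it finishes

    def drain[F[_], O](s: Stream[F, O]): Stream[F, Nothing] =
      s.flatMap(_ => Stream.empty)                  // discard every output element

    def eval_[F[_], A](fa: F[A]): Stream[F, Nothing] =
      Stream.eval(fa).flatMap(_ => Stream.empty)    // run the effect, ignore its result

    def attempt[F[_], O](s: Stream[F, O]): Stream[F, Either[Throwable, O]] =
      s.map(o => (Right(o): Either[Throwable, O])).handleErrorWith(e => Stream.emit(Left(e)))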
Justin Heyes-Jones
@justinhj
ah that makes sense. I did figure it out in the end I just wasn't sure whether I'd done it right, but I had.
Taylor Brown
@tbrown1979

Hi everyone, so I have a Stream[F, Byte] and I'd like to create a stream that emits at 128 kbps, while keeping the type Stream[F, Byte]. To do this I figured I could make each Chunk 128 kb and have one Chunk processed every second.

Something like this:

      val perSecond = scheduler.awakeEvery(500 millis)

      val stream: Stream[F, Byte] =
        inputStream
          .chunks
          .zip(perSecond)
          .map(_._1)
          .flatMap(Stream.chunk(_).covary[F])
          .repeat

This works nicely (if there is a better way than this please let me know, I am an fs2 noob). The issue I'm having right now is making each Chunk the proper size. I want each chunk to contain 16000 bytes; however, when I read in my InputStream it comes in at ~8000 bytes per chunk. Is there a way I can condense these chunks so that they're all roughly 16k bytes? I've looked through the Stream code and I see a lot of things about limiting chunk size, but that's just not what I want. Is this a completely dumb way of getting the result I want? Any help is appreciated. Thanks

Christopher Davenport
@ChristopherDavenport
Take a look at rechunk or rechunkN; I'm not near a computer, but I believe that gets you what you are looking for.
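A rough sketch of regrouping plus pacing, assuming the fs2 1.x API where chunkN regroups output into fixed-size chunks (0.10 exposes scheduler.awakeEvery instead of Stream.awakeEvery):

    import scala.concurrent.duration._
    import cats.effect.{Concurrent, Timer}
    import fs2.Stream

    // regroup into ~bytesPerTick-sized chunks and emit one chunk per tick
    def throttled[F[_]: Concurrent: Timer](in: Stream[F, Byte], bytesPerTick: Int, tick: FiniteDuration): Stream[F, Byte] =
      in.chunkN(bytesPerTick)
        .zip(Stream.awakeEvery[F](tick))
        .map(_._1)
        .flatMap(c => Stream.chunk(c))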
Justin Heyes-Jones
@justinhj
Another newb question. I'm trying to write a Pull function that's similar to take(n) but that will take stream elements until it has seen n elements that match a predicate. For example, given a stream of Char, I would like to take the stream elements until I have seen 5 'a's. I'm thinking it would be similar to the tk function in the samples using scanSegmentsOpt: take elements one at a time, incrementing the count when it sees an 'a'. The problem I'm having is how to look at the next element. It looks like peek1 may be what I want but I can’t seem to use it from a segment or find an example of its use.
Jose C
@jmcardon
@justinhj uncons1
Justin Heyes-Jones
@justinhj
haha short and sweet, thanks
Jose C
@jmcardon
@justinhj this is just for the sake of an exercise correct?
because the actual implementation of take is in terms of uncons
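Roughly, against the fs2 1.x Chunk-based API (not the exact library source):

    import fs2.{Pull, Stream}

    def take[F[_], O](s: Stream[F, O], n: Long): Stream[F, O] = {
      def go(s: Stream[F, O], n: Long): Pull[F, O, Unit] =
        if (n <= 0) Pull.done
        else s.pull.uncons.flatMap {
          case None => Pull.done
          case Some((hd, tl)) =>
            if (hd.size <= n) Pull.output(hd) >> go(tl, n - hd.size)
            else Pull.output(hd.take(n.toInt))    // emit only the prefix that still fits
        }
      go(s, n).stream
    }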
Justin Heyes-Jones
@justinhj
I'm just extending the exercise in a different way for the learning experience. I was taking the first implementation of tk (that uses scanSegmentsOpt) as a starting point rather than the second one that uses uncons
Jose C
@jmcardon
:+1: cool. yeah that's a good way to learn
Denis Mikhaylov
@notxcain
Hi! How do I change the effect type? I have a FunctionK[F, G] and a Stream[F, A], and I need Stream[G, A]
epifania villamor
@fgalaban_twitter
ok
Denis Mikhaylov
@notxcain
Ah, nevermind, found a way around
Fabio Labella
@SystemFw
@notxcain generally you can do so with translate, or in the case of Pure ~> F, simply use covary
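A minimal sketch; the FunctionK and the stream here are illustrative:

    import cats.~>
    import cats.effect.IO
    import fs2.{Pure, Stream}

    // translate swaps the effect type through a natural transformation
    def changeEffect[F[_], G[_], A](s: Stream[F, A], toG: F ~> G): Stream[G, A] =
      s.translate(toG)

    // for a Pure stream, covary is enough
    val pure: Stream[Pure, Int] = Stream(1, 2, 3)
    val inIO: Stream[IO, Int]   = pure.covary[IO]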
Denis Mikhaylov
@notxcain
@SystemFw nice, thank you!
Paulius Imbrasas
@CremboC
hey @SystemFw, we're running a service reading from Google PubSub, and we occasionally get this beautiful stack trace: https://gist.github.com/CremboC/6f3f158bd32720890e47519211a1a71d (sorry it's in one line, but you'll see a pattern quickly!). Any ideas?
Fabio Labella
@SystemFw
@CremboC what version are you on?
this was fixed in 0.10.3
Paulius Imbrasas
@CremboC
ah, on 0.10.1
I assume they're backwards compatible?
Fabio Labella
@SystemFw
yep
Paulius Imbrasas
@CremboC
coolio, will update, danke!
Justin Heyes-Jones
@justinhj
I've been trying to write a function that streams until it has seen N objects equal to a provided instance of O. The last piece of the puzzle is how to emit the stream. I think I need to pull output but somehow need to join that to the tail of the stream in the recursive call? https://gist.github.com/justinhj/10fd8dc2a364a1fdbade36352b15abe0
Peter Aaser
@PeterAaser
I'm not sure what you are trying to do here
maintaining a list, but with Pipe[F,O,O] as the type, seems incompatible
Pull.output1 might be what you want to use
typically you compose Pulls with >>
how about
    def go(s: Stream[F, Char], n: Int): Pull[F, Char, Unit] =
      if (n <= 0) Pull.done
      else
        s.pull.uncons1.flatMap {
          case Some((o, tl)) =>
            if (o == thing)                       // `thing` is the element you are counting
              Pull.output1(o) >> go(tl, n - 1)
            else
              Pull.output1(o) >> go(tl, n)
          case None =>
            Pull.done
        }

does that seem like what you are after @justinhj?
Peter Aaser
@PeterAaser
Pull forms a monad in R btw, so >> is just myPull.flatMap{ _ => ... } if that helps any (but this is in no way necessary to understand for using pulls)
Jose C
@jmcardon
@PeterAaser sort of useful when you need to evaluate effects in a Pull though