Lucas Satabin
@gnieh:matrix.org
[m]
:)
then I'm happy I could help somehow :D
Sait Sami Kocataş
@Deliganli

hi everyone, after yesterday, even though I still couldn't solve the problem, I found out this is a standard, described below:
https://doc.akka.io/docs/akka-http/10.0/sse-support.html

I am not sure whether it is in the scope of this library

Lucas Satabin
@gnieh:matrix.org
[m]
Hi thanks for sharing
I think it's a bit outside of scope, and I would expect this to happen upstream, before using fs2-data
however if you make your data go through text.lines after decoding, it should be easy to handle with a regex
I will look a bit more into the standard but it really looks like a line oriented format, with the keyword at the beginning of the line
so something like "^tagname:(.*)" should then capture properly after you ensured your data arrives in lines in the pipe
yes, according to the grammar given in section 6, the stream contains events separated by lines
Lucas Satabin
@gnieh:matrix.org
[m]
you also need to drop lines starting with a colon, they are comments
Lucas Satabin
@gnieh:matrix.org
[m]
then you can pipe this through fs2-data
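The line handling Lucas describes above can be sketched in plain Scala before wiring it into an fs2 pipe. This is only an illustration: the `SseLines` object and the exact field regex are made up here, following the "^tagname:(.*)" idea and the drop-comment-lines rule from the discussion, not taken from fs2-data itself.

```scala
// Sketch of SSE line handling as discussed above: comment lines
// (starting with ':') are dropped, other lines are split into a
// field name and value at the first colon, per the SSE grammar.
object SseLines {
  // In the spirit of "^tagname:(.*)": capture field name and value,
  // allowing the optional single space after the colon.
  private val Field = """^([^:]+):\s?(.*)$""".r

  def parseLine(line: String): Option[(String, String)] =
    line match {
      case l if l.startsWith(":") => None // comment line, drop it
      case Field(name, value)     => Some(name -> value)
      case _                      => None // blank lines separate events
    }
}
```

Lifting `parseLine` into a pipe is then a matter of `_.through(fs2.text.lines).map(parseLine).unNone`, after which the `data` field values can be fed to fs2-data.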
Sait Sami Kocataş
@Deliganli
thanks a lot @gnieh:matrix.org , I also stole a bit of code from https://github.com/Spinoco/fs2-http, would have used it directly but no scala 2.13 binaries published
Damien O'Reilly
@DamienOReilly
Hi, I would like to specify a custom date-time formatter to parse particular dates in some CSV files that do not conform to ISO-8601. I see that fs2.data.csv.CellDecoder#javaTimeDecoder is private. Apart from implementing something similar to javaTimeDecoder myself, is there another way to achieve this with this library?
Lucas Satabin
@gnieh:matrix.org
[m]
hi
I have to check this part of the code again, maybe ybasket knows this off the top of his head?
ybasket
@ybasket:matrix.org
[m]
fs2.data.csv.CellDecoder#javaTimeDecoder is private as it's not safe given its (current) signature. But I totally see your use case and will create an issue to expose something appropriate. Until then, you'll have to do it yourself, sorry!
CellDecoder[String].emap(s => Either.catchOnly[DateTimeParseException](doYourParsing(s)).leftMap(new DecoderError("Couldn't parse my format", _)))
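Expanded into a self-contained form, ybasket's one-liner might look like the following. This is a sketch only: the `dd/MM/yyyy` pattern and the `CustomDateDecoder` name are invented for illustration, and the `DecoderError` constructor usage just mirrors the snippet above (against fs2-data-csv 0.x).

```scala
import java.time.LocalDate
import java.time.format.{DateTimeFormatter, DateTimeParseException}

import cats.syntax.either._
import fs2.data.csv.{CellDecoder, DecoderError}

object CustomDateDecoder {
  // Hypothetical non-ISO-8601 format, e.g. "31/12/2021".
  val fmt: DateTimeFormatter = DateTimeFormatter.ofPattern("dd/MM/yyyy")

  // Parse with the custom formatter, turning any parse failure
  // into a DecoderError, as in the snippet above.
  implicit val localDateDecoder: CellDecoder[LocalDate] =
    CellDecoder[String].emap { s =>
      Either
        .catchOnly[DateTimeParseException](LocalDate.parse(s, fmt))
        .leftMap(new DecoderError(s"couldn't parse date '$s'", _))
    }
}
```

With the implicit in scope, any row decoder deriving a `LocalDate` cell picks up the custom format.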
Damien O'Reilly
@DamienOReilly
No worries, thanks for your response.
Lucas Satabin
@gnieh:matrix.org
[m]
@DamienOReilly: generally speaking the CSV API will get a complete overhaul for release 1.0. It is historically the first one and has grown into something too complicated. Any feedback is also welcome on it to improve it. So if you have a pain point with it, don't hesitate to tell us
Damien O'Reilly
@DamienOReilly
Thanks @Lucas, will do

Is there anything in the lib to work with strings with regard to parsing to rows? For example, I am currently crunching CSV files from S3, using the fs2-aws-s3 lib, and have something like:

        s3Client.readFileMultipart(BucketName(bucket), FileKey(fileKey), partSizeMB)
          .through(fs2.text.utf8Decode)
          .through(fs2.text.lines)
          .through(rows[IO]())
          .through(headers[IO, String])

However, rows expects a Stream[F, Char], not a Stream[F, String].

Damien O'Reilly
@DamienOReilly
I can pipe it through .through(_.flatMap(str => fs2.Stream.emits(str.toCharArray))) before the rows pipe maybe.
Lucas Satabin
@gnieh:matrix.org
[m]
yes this is the common pattern with version 0.x
you don't need the lines pipe
rows handles it, including corner cases (when a newline occurs within a row)
the flatMap(Stream.emits(_)) won't be required anymore in version 1.x onward
so just replace .through(fs2.text.lines) with .flatMap(fs2.Stream.emits(_)) and it should work out of the box
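Putting that advice together with the snippet above, the 0.x-style pipeline might look like this. A sketch only, assuming fs2-data-csv 0.x: the inline CSV literal stands in for the S3 multipart byte stream from fs2-aws-s3.

```scala
import cats.effect.IO
import fs2.Stream
import fs2.data.csv.{headers, rows}

object CsvPipeline {
  // Inline CSV standing in for the S3 multipart download.
  val bytes: Stream[IO, Byte] =
    Stream.emits("name,age\nalice,42\n".getBytes("UTF-8")).covary[IO]

  val parsed =
    bytes
      .through(fs2.text.utf8Decode)
      .flatMap(str => Stream.emits(str)) // String => Char; no fs2.text.lines needed
      .through(rows[IO]())               // rows handles newlines, even inside quoted fields
      .through(headers[IO, String])      // first row becomes the header
}
```

The `Stream.emits(str)` relies on Scala's implicit String-to-Seq[Char] conversion, which is the 0.x pattern Lucas mentions.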
Damien O'Reilly
@DamienOReilly
Nice one, thanks again
Lucas Satabin
@gnieh:matrix.org
[m]
👍️
(I really am not used to the way completion works in Element...)
Lucas Satabin
@gnieh:matrix.org
[m]
Hi there! I released fs2-data 0.10.0 with support for CBOR and configurable java.time codecs. https://twitter.com/lucassatabin/status/1373402923154161665 From now on, development will focus on the 1.0.0 release, which will bring new and better CSV pipes, as well as easier support for textual inputs (and much more in CBOR, XML, ...). The main branch will become the one targeting 1.0.0 soon(-ish) and the 0.x series will be renamed maintenance and will only accept bug fixes. I wish you all a nice Sunday.
Lucas Satabin
@gnieh:matrix.org
[m]
I just promoted the main branch as the default one, which is the former 1.x branch. master was renamed to 0.x and should only be targeted for bugfix PRs from now on. I will update the CI configuration and documentation accordingly now. This is the official start of 1.x development!
ybasket
@ybasket:matrix.org
[m]
Hey, I just released version 1.0.0-RC1 of fs2-data, featuring CE3 support, more supported input types and new CSV pipes and data types. The changes are described in detail in the release notes, feedback is welcome as usual!
Big thanks to everyone involved!
Dylan Halperin
@dylemma
Just popping in to say hi. Earlier last year I mentioned doing some overhauls in xml-spac to support cats-effect and allow for different "event" providers. It's finally reached a point where I'm just about to start adding fs2-data as a provider! dylemma/xml-spac#28 - this'll allow for easy creation of Pipe[F, XmlEvent, A] for xml streams and Pipe[F, Token, A] for json streams
ybasket
@ybasket:matrix.org
[m]
Sounds cool! Feel free to ask questions here any time 🙂
Lucas Satabin: WDYT of having some "related projects" section in the fs2-data docs?
Lucas Satabin
@gnieh:matrix.org
[m]
Hi @dylemma this sounds awesome! Thanks for letting us know! I will add a section to the website this weekend.
Dylan Halperin
@dylemma
just checking, there's no way to get a Pipe[F, Byte, XmlEvent], right? As far as I can tell it's all about Char, not Byte.
(aside from manually attaching a Stream[F, Byte] to a Pipe[F, Byte, Char] beforehand)
Dylan Halperin
@dylemma
anyway, it seems to be working out well! https://gist.github.com/dylemma/6db1223c0eaf72297018c65f90c46664
ybasket
@ybasket:matrix.org
[m]
@dylemma: The 0.10 series of fs2-data works exclusively on Char, but the new 1.0 (RC1 released just a few days ago) has some support for using non-Char streams directly via the CharLikeChunks abstraction. IIRC we don't have documentation on that one yet, but if you look at the modified pipes and this: https://github.com/satabin/fs2-data/blob/main/text/shared/src/main/scala/fs2/data/text/package.scala I hope you can figure it out easily (in the end, it boils down to choosing the correct import). Let us know if you get stuck there!
Lucas Satabin
@gnieh:matrix.org
[m]
@dylemma: hi, for reading a textual format from a byte stream, you'll need to tell fs2-data how to decode it. I will deploy the new website tonight (and try to finally make it work automatically from the CI) so that you can have a bit of doc. Besides what ybasket pointed you to, there is documentation about CharLikeChunks and the various supported encodings here: https://github.com/satabin/fs2-data/blob/main/documentation/docs/index.md
this will be available on the website tonight (Berlin time)
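For reference, the 1.x approach described above boils down to choosing a decoding import. This is a sketch against 1.0.0-RC2 as discussed here; the exact import path and the `tokens` pipe signature are my reading of the linked text package, so the linked docs are authoritative if they differ.

```scala
import cats.effect.IO
import fs2.Stream
import fs2.data.json.{tokens, Token}
import fs2.data.text.utf8._ // chooses UTF-8 decoding via CharLikeChunks[IO, Byte]

object ByteInput {
  // A raw byte stream: no manual Byte => Char conversion step.
  val bytes: Stream[IO, Byte] =
    Stream.emits("""{"a": 1}""".getBytes("UTF-8")).covary[IO]

  // The implicit CharLikeChunks instance lets the parser consume bytes directly.
  val tokenStream: Stream[IO, Token] =
    bytes.through(tokens[IO, Byte])
}
```

Swapping the import (e.g. for latin-1 or UTF-16 variants in the same package) changes the decoding without touching the pipeline.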
Lucas Satabin
@gnieh:matrix.org
[m]
I released fs2-data 1.0.0-RC2 and updated the documentation on the website to explain the CharLikeChunks concept and how to parse byte streams. @dylemma you can have a look here: https://fs2-data.gnieh.org/documentation/#reading-text-inputs-from-a-file
Dylan Halperin
@dylemma
Oh cool, thanks for the links. I was using the 1.0-RC1 and found the CharLikeChunks. I was using that abstraction to make my helper implicits, but I didn't realize that text package was there. That should work out nicely since an end user can just import the one they want from that package, and my implicits that work in terms of CharLikeChunks should just work. I'll poke at this a bit more later tonight.
Dylan Halperin
@dylemma
https://github.com/dylemma/xml-spac 0.9.1 is released, including "parser backend" modules for fs2-data-xml and fs2-data-json!
It's currently built against your 1.0.0-RC2, but I can put out an 0.9.2 when you release 1.0.0
Lucas Satabin
@gnieh:matrix.org
[m]
@dylemma: this is awesome thanks! We just released 1.0.0-RC3 with support for scala 3 if you want to give it a try :)
Dylan Halperin
@dylemma
Nice, I used that to release 0.9.2 of my library which also supports scala 3
Lucas Satabin
@gnieh:matrix.org
[m]
We just released version 1.0.0 final with support for Scala 3, CE3, and fs2 3! This release also includes refactoring of CSV pipes and a more flexible way to read textual inputs. Enjoy! https://github.com/satabin/fs2-data/releases/tag/v1.0.0