Taleb Zeghmi
@talebzeghmi

got help in cats room:
@emilypi

because for comprehensions are implemented poorly.
every time you have a predicate on your pattern, like : Type or if foo, it calls withFilter

Emily Pillmore
@emilypi
I was pinged in here :P
@tewf what kind of effect are you dealing with that doesn’t have withFilter?
Dermot Haughey
@hderms
@raulraja nah I agree I think in retrospect destructive updates don't make sense with it
Patrick Curran
@patrickthebold_twitter
I just started looking at fetch, and I might have a hard time formulating this question.
I like the deduplication feature, but when I run a fetch I'd like to be able to get the data as it becomes available. (Imagine I have a websocket and I want to push data through it as it becomes available.) My issue is that if I use:
(fetchA, fetchB).tupled, even if the two fetches ultimately execute in parallel, I have to wait until they both complete to get the full tuple.
I think I want to be able to run the fetch and get an F[(F[A],F[B])] or something like that.
Patrick Curran
@patrickthebold_twitter
So somehow I'd like to get separate ConcurrentEffects that still share the same underlying batching/caching logic.
Alejandro Gómez
@purrgrammer
hey Patrick, your use case is interesting but not something the library supports at the moment
one of the requirements of the library is that data should fit in memory, so it is fine for constructing HTTP responses, but not a good fit for streaming through websockets or server-sent events
Patrick Curran
@patrickthebold_twitter
I'm not sure this changes anything, but in my case memory is not an issue. I could wait until I have everything and then send to my UI, but I'd like to send some data from the first round of calls, render what I can, and then send subsequent data.
Alejandro Gómez
@purrgrammer

I could wait until I have everything and then send to my UI, but I'd like to send some data from the first round of calls, render what I can, and then send subsequent data.

the use case makes sense, although I'm not sure Fetch could support that at the moment. I've been talking with a colleague about how we could support it but I need to spend some more cycles on it to have an answer

i used Fetch for reading data for a UI, although it was pre-rendered in the server
it then runs the Fetch in the client too in case something has been updated
Patrick Curran
@patrickthebold_twitter

thanks, I just wanted to be clear it wasn't a memory/infinite stream thing I was asking for. Thinking out loud: How about attaching an IO action/callback to a Fetch. Something like:

def subFlatMap[A, B](fa: Fetch[F, A])(f: A => F[B]): Fetch[F, B] =
  Unfetch(fa.run.flatMap {
    case Done(v)  => f(v).map(Done(_))
    case Throw(e) => Applicative[F].pure(Throw[F, B](e))
    case Blocked(br, cont) =>
      Applicative[F].pure(Blocked(br, subFlatMap(cont)(f)))
  })

Then you could do:
```
def sendToClient(a: A): F[A] = for {
  _ <- sendData(a)
} yield a
def sendData(a: A): F[Unit] = ???
```

Anyway I'll keep thinking about it...(And trying to figure out cats-effect) thanks again!
Ryan Tomczik
@Tomczik76
I have a question about how the Cache works. Does it have a time to live?
Alejandro Gómez
@purrgrammer
@Tomczik76 glad you asked. The default implementation of the cache uses an in-memory cache with no TTL, so all the data in a fetch is kept in the cache. Note that using a cache with a TTL could cause different values to be returned for the same identity. Since we want to keep the guarantee that once you read an identity it will be the same for the whole Fetch execution, we use no TTL in the cache.
Alejandro Gómez
@purrgrammer
see https://github.com/47deg/fetch/blob/master/examples/src/test/scala/JedisExample.scala#L156-L187 for an example of a Redis cache implementation that you can use when running a Fetch. Cache is pluggable so you can write your own with a custom TTL
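Since the cache is pluggable, a TTL variant is straightforward to sketch. This is only an illustration assuming Fetch 1.x's DataCache trait (lookup and insert); the key scheme based on data.name and the injected clock are assumptions of this sketch, not the library's own InMemoryCache:

```scala
import scala.concurrent.duration._
import cats.Applicative
import fetch.{Data, DataCache}

// Illustrative only: an immutable cache whose entries expire after `ttl`.
// Keying by (data.name, identity) and injecting the clock are assumptions
// of this sketch, not how the library's built-in cache works.
final case class TtlCache[F[_]: Applicative](
    ttl: FiniteDuration,
    now: () => Long, // millisecond clock, injected so it can be controlled in tests
    entries: Map[(String, Any), (Any, Long)] = Map.empty[(String, Any), (Any, Long)]
) extends DataCache[F] {

  def lookup[I, A](i: I, data: Data[I, A]): F[Option[A]] =
    Applicative[F].pure(
      entries.get((data.name, i)).collect {
        // treat expired entries as misses so they are fetched again
        case (v, insertedAt) if now() - insertedAt < ttl.toMillis =>
          v.asInstanceOf[A]
      }
    )

  def insert[I, A](i: I, v: A, data: Data[I, A]): F[DataCache[F]] =
    Applicative[F].pure(copy(entries = entries + ((data.name, i) -> ((v, now())))))
}
```

passing something like this to a run would serve stale entries as misses, at the cost of the same-identity consistency guarantee described above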
Bijan Chokoufe Nejad
@bijancn
Hi all. It seems to me that when composing together many Fetches, I can only flatMap or traverse, i.e. there is only support for the happy path: either all Fetches succeed, or one fails and then all fail. However, in some cases I have appropriate business logic for what to do if the data isn't there, or would even return the user an empty result. The only way I see of doing this is to bring it back to the IO context and handle the error there with something like Fetch.run(myFetch).attempt. But then I would like to get back into the Fetch[IO, _] monad so it plays together with all the other requests that have to be composed together at the higher level
Bijan Chokoufe Nejad
@bijancn
after looking at the sources, I guess I could do this but now I will have to carry around everywhere a ConcurrentEffect, ContextShift and Timer
  private def fetchOrNone(f: Fetch[IO, ExternalId])(
      implicit C: cats.effect.ConcurrentEffect[IO],
      CS: cats.effect.ContextShift[IO],
      T: cats.effect.Timer[IO]
  ): Fetch[IO, Option[ExternalId]] =
    Unfetch[IO, Option[ExternalId]](
      Fetch
        .run(f)
        .map(Some(_))
        .handleError(_ => Option.empty[ExternalId])
        .map(Done(_))
    )
Omer Zach
@omerzach
Hey! Loving what i've seen so far of fetch and experimenting with using it at my company. a bit stuck on one specific question and one more general one and hoping to get a bit of help if someone has a sec.
  1. I'm going through the docs (http://47deg.github.io/fetch/docs.html) in a scala REPL but getting issues calling .tupled on a pair of Fetch[F, Foo] instances:
scala> def fetchProduct[F[_] : ConcurrentEffect]: Fetch[F, (User, User)] =
     |   (getUser(1), getUser(2)).tupled
<console>:200: error: value tupled is not a member of (fetch.Fetch[F,User], fetch.Fetch[F,User])
         (getUser(1), getUser(2)).tupled
                                  ^
am i just missing some imports?
and now more generally, is it possible to use Fetch with a fixed concurrent effect type? our codebase uses cats.effect.IO throughout rather than more generic cats effect typeclasses. can i just fix type FetchIO[T] = Fetch[IO, T] (not sure if that's actually proper syntax for higher kinded programming in scala) and use FetchIO instead of Fetch throughout? we may generalize down the line, but looking for a quicker win first.
Alejandro Gómez
@purrgrammer
hi @omerzach, regarding the use of .tupled, make sure you import the Applicative syntax cats.syntax.applicative._ so you can use it, see https://github.com/typelevel/cats/blob/master/docs/src/main/tut/typeclasses/applicative.md#syntax
regarding whether it's possible to use Fetch with a fixed IO type, you can just do type FetchIO[A] = Fetch[IO, A] as you mentioned :+1:
hey @bijancn, regarding the "happy path", one assumption that Fetch makes is that your identities will be there, and it will short-circuit when one is missing. However, for optional identities, you can construct Fetch instances using Fetch#optional. It will yield a Fetch[F, Option[A]], and won't fail if the identity is missing. Let me know if it helps!
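for example, something like this sketch (User and Users.source are hypothetical stand-ins for a Data/DataSource pair you already have):

```scala
import cats.effect.ConcurrentEffect
import fetch._

// Sketch: `User` and `Users.source` stand in for an existing Data/DataSource
// pair. Fetch.optional builds a Fetch[F, Option[User]] that completes with
// None instead of short-circuiting when the identity is missing.
def maybeUser[F[_]: ConcurrentEffect](id: Int): Fetch[F, Option[User]] =
  Fetch.optional(id, Users.source)
```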
Alejandro Gómez
@purrgrammer
@omerzach another option would be using Fetch parameterised to F[_] : ConcurrentEffect, and use ConcurrentEffect[F].liftIO in the data sources. this way you can use your IO-returning functions in the data source implementation, and keep your fetch code generic.
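a rough sketch of that approach (the Users data object, User type and fetchUserIO are hypothetical; the DataSource members mirror the ones shown earlier in this room):

```scala
import cats.effect.{ConcurrentEffect, IO}
import fetch._

// Hypothetical stand-in for an existing IO-returning function in the codebase.
def fetchUserIO(id: Int): IO[Option[User]] = ???

// A data source generic in F that reuses the IO code via ConcurrentEffect#liftIO,
// keeping the rest of the fetch code polymorphic in the effect type.
def usersSource[F[_]](implicit CE: ConcurrentEffect[F]): DataSource[F, Int, User] =
  new DataSource[F, Int, User] {
    override def data: Data[Int, User] = Users // hypothetical Data object
    override def CF: ConcurrentEffect[F] = CE
    override def fetch(id: Int): F[Option[User]] =
      CE.liftIO(fetchUserIO(id))
  }
```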
Alejandro Gómez
@purrgrammer
just cut the 1.0.0 release of Fetch, for the curious: https://github.com/47deg/fetch/releases/tag/v1.0.0
Pepe García
@pepegar
:clap: :clap: :clap:
Omer Zach
@omerzach

@purrgrammer

scala> import cats.syntax.applicative._
import cats.syntax.applicative._

scala> def fetchProduct[F[_] : ConcurrentEffect]: Fetch[F, (User, User)] =
     |   (getUser(1), getUser(2)).tupled
<console>:181: error: value tupled is not a member of (fetch.Fetch[F,User], fetch.Fetch[F,User])
         (getUser(1), getUser(2)).tupled
                                  ^

after copy-pasting everything in http://47deg.github.io/fetch/docs.html#syntax-6-companion-object-0 up to the def getUser

Omer Zach
@omerzach
looks like adding "-Ypartial-unification" there fixes it. it didn’t work in my environment, but that means the issue is on my end.
Alejandro Gómez
@purrgrammer
cool @omerzach, I overlooked the compiler flag, maybe I should recommend it in the docs
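for anyone else hitting the same error, the flag goes in build.sbt (it is only needed on Scala 2.12 and earlier; 2.13 unifies type constructors by default):

```scala
// build.sbt -- enables partial unification of type constructors,
// which the implicit resolution behind .tupled relies on under Scala 2.12
scalacOptions += "-Ypartial-unification"
```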
Maureen Elsberry
@MaureenElsberry
New article on using Fetch for optimizing requests to GitHub's API from @purrgrammer: https://www.47deg.com/blog/optimizing-github-api-access-with-fetch/
Omer Zach
@omerzach
yeah, would be helpful in the docs i think :)
making a ton of progress refactoring our codebase to work with Fetch now that that’s resolved
got another question though
is there any way to lift an IO[T] to a Fetch[IO, T] without defining a Data/DataSource for it? i have certain code that’s un-batchable anyway and it would make a lot of my code more straightforward if i could just lift it right into Fetch.
Omer Zach
@omerzach
something along these lines:
  def liftT[T](io: IO[T]): Fetch[IO, T] = {
    object Lift extends Data[IO[T], T] {
      val name = "Lifts an IO[T] to a Fetch[IO, T]"

      def source: DataSource[IO, IO[T], T] = new DataSource[IO, IO[T], T] {
        override def data = Lift

        override def CF: ConcurrentEffect[IO] = ConcurrentEffect[IO]

        override def fetch(id: IO[T]): IO[Option[T]] = (
          id.map(Some(_))
        )
      }
    }

    Fetch(io, Lift.source)
  }
Alejandro Gómez
@purrgrammer

is there any way to lift an IO[T] to a Fetch[IO, T] without defining a Data/DataSource for it? i have certain code that’s un-batchable anyway and it would make a lot of my code more straightforward if i could just lift it right into Fetch.

that's something that I hadn't thought about, right now Fetch assumes you'll use a Data/DataSource for performing requests. If I understood correctly, you want a way to lift IO actions to Fetch without an associated Data/DataSource, knowing that those requests can't be optimized?

Fetch doesn't support it currently but I'm going to open an issue to address this, thanks for bringing this up Omer
Omer Zach
@omerzach
yep exactly what i meant, thanks!
Bijan Chokoufe Nejad
@bijancn

Fetch doesn't support it currently but I'm going to open an issue to address this, thanks for bringing this up Omer

Yeah I would have liked that as well :+1:

hey @bijancn, regarding the "happy path", one assumption that Fetch makes is that your identities will be there, and it will short-circuit when one is missing. However, for optional identities, you can construct Fetch instances using Fetch#optional. It will yield a Fetch[F, Option[A]], and won't fail if the identity is missing. Let me know if it helps!

Yeah thanks I found that as well at some point and it addresses my problem fully

One thing I am kind of missing is to have custom caches for certain Fetch queries or mark them as uncachable
Bijan Chokoufe Nejad
@bijancn
In any case, refactoring my service to use Fetch has made it really fast, and I love the fact that I can do batching, parallel requests and caching all with a similar API and not rewrite according to what has to be done first etc
Alejandro Gómez
@purrgrammer
it makes me really happy to read that Bijan, I've opened an issue to address marking fetches as not to be served from the cache
Omer Zach
@omerzach
@purrgrammer another request after liftIO is some sort of “clearCache” command we could use. not sure implementation-wise if that’s much trickier. we want it for this sort of use case:
for {
  user <- Fetch(userId, UsersById.source)
  organization <- Fetch(organizationId, OrganizationsById.source)

  _ <- Fetch.liftIO(addUserToOrganization(user, organization)) // This may modify an organization object in the DB

  _ <- Fetch.clearCache()

  organizationJson <- Fetch(organizationId, OrganizationJsonById.source) // This uses the OrganizationsById DataSource but needs to get a new one, not the cached one!
} yield {
  organizationJson
}

right now we're doing something like this:

for {
  (user, organization) <- Fetch.run[IO]((Fetch(userId, UsersById.source), Fetch(organizationId, OrganizationsById.source)).tupled)

  _ <- addUserToOrganization(user, organization) // This may modify an organization object in the DB

  organizationJson <- Fetch.run[IO](Fetch(organizationId, OrganizationJsonById.source)) // This uses the OrganizationsById DataSource but needs to get a new one, not the cached one!
} yield {
  organizationJson
}

which isn’t too bad, but you can see how this can get out of hand as we have more and more complex cases