Dermot Haughey
@hderms
one of the big obstacles for us getting off finagle filters is we want to have a filter which sets a request id on the request object before any application code is run
then we want to configure a logger that has the request ID as a parameter
the way I accomplish that currently is with a finagle filter that runs before everything and sets a random request ID on finagle Request
then I have an endpoint based on root which can pull out that request ID, configure the logger and return the logger. I call it withRequestLogger
and then I use it like
post("foo" :: withRequestLogger) { logger: Logger =>
this does everything we want but it has the undesirable side effect of putting logger in lots of type signatures (which is debatably a positive)
and in addition requires understanding of finagle filters
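A rough sketch of what such an endpoint could look like (the header name, the Logger type, and the RequestLogger factory below are illustrative placeholders, not from the discussion):

import cats.effect.IO
import com.twitter.finagle.http.Request
import io.finch._
import io.finch.catsEffect._

// Hypothetical: the upstream Finagle filter is assumed to have stored a
// random request id under this header/field before any endpoint runs.
val withRequestLogger: Endpoint[IO, Logger] =
  root.map { req: Request =>
    val requestId = req.headerMap.getOrElse("X-Request-Id", "unknown")
    RequestLogger.forId(requestId) // hypothetical factory returning a Logger
  }

// Used as described above:
// post("foo" :: withRequestLogger) { logger: Logger => ... }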
Sergey Kolbasov
@sergeykolbasov

@hderms you just hit the most relevant topic for me right now, so I have tons of answers :)

If I understand correctly, you create a logger per request (due to the unique request context) and pass it around. If so, then I strongly advise using the famous Reader monad!
You might ask why, and the answer is: because you can pass around a logger, a context, or whatever else you fancy without polluting your interfaces' API.

The way we solve it at Zalando is to use tagless final + cats-mtl to extract the context whenever you need it (actually, that's a topic for a blog post or even a tech talk). Nevertheless, you can fix on a specific monad (Reader) and go with it for a while.

Then, to compile Finch endpoints into a Finagle service, you're required to have an Effect (or ConcurrentEffect) instance for your monad. Reader doesn't have one out of the box, and the reason is simple: what should the initial environment be?
You have two options here:

  • mapK over Endpoint.Compiled and, in the natural transformation, define the initial environment for your reader (say, a NoopLogger?). Then in the next Kleisli redefine it based on the request, instantiating the logger you need and using local to propagate this environment down to the next element in the chain.
  • provide your own implicit Effect for Reader[F[_] : Effect, YourEnv, *] that will run this monad with some initial YourEnv, so Finch would pick it up to convert Compiled to a Service

Voilà, you don't have these logger endpoints everywhere, with loggers as parameters being passed here and there.
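A minimal sketch of the idea (assuming kind-projector; Ctx, Logger, and RequestLogger.forRequest are hypothetical placeholders): build the per-request context from the incoming request and run the ReaderT-compiled endpoint with it.

import cats.data.ReaderT
import cats.effect.IO
import com.twitter.finagle.http.Request
import io.finch.Endpoint

final case class Ctx(logger: Logger) // hypothetical per-request environment

// Endpoints compiled in ReaderT[IO, Ctx, *]
def compiledR: Endpoint.Compiled[ReaderT[IO, Ctx, *]] = ???

// Wrap them into a plain IO-compiled endpoint: build the context from the
// request, then run the ReaderT with it (no NoopLogger needed).
val compiled: Endpoint.Compiled[IO] =
  Endpoint.Compiled[IO] { req: Request =>
    val ctx = Ctx(RequestLogger.forRequest(req)) // hypothetical logger factory
    compiledR(req).run(ctx)
  }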

And it's not over yet! Just this night I've published the first version of Odin:
https://github.com/valskalla/odin

It's a fast & functional logger that isn't as feature-rich as log4j yet, but it has the basic options available, with something special on top. One of them: context is a first-class citizen, so you don't need to mess around with ThreadLocal MDC and/or create loggers per context. It even has a contextual logger that can pick up your context from any ApplicativeAsk (that is, Reader) and embed it in the log for you.

It's not that I suggest you pick it up right away and use it in production today; we still have to battle-test it in production this month, and some features might be missing. But you might be interested in subscribing to it and following along, and who knows, maybe one day you can start using it in your project :)

Dermot Haughey
@hderms
@sergeykolbasov thanks for the help. I'll try that approach.
Odin looks nice but one thing we've come to rely on is Izumi logger's ability to convert logs to JSON https://izumi.7mind.io/latest/release/doc/logstage/
in particular I think a lot of Scala logging libraries should be going for structured logging as a first approach
maybe building a circe integration for Odin would be helpful
Georgi Krastev
@joroKr21
If you use Monix, TaskLocal is a great alternative to ReaderT (I think FiberLocal if you use ZIO). You can actually define ApplicativeAsk[Task, Logger] based on a TaskLocal and profit.
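A rough sketch of that idea against cats-mtl 0.x and Monix (Logger and defaultLogger are hypothetical placeholders; note that TaskLocal needs local context propagation enabled when the Task is run):

import cats.Applicative
import cats.mtl.ApplicativeAsk
import monix.eval.{Task, TaskLocal}

// Build an ApplicativeAsk[Task, Logger] backed by a TaskLocal.
def loggerAsk(local: TaskLocal[Logger]): ApplicativeAsk[Task, Logger] =
  new ApplicativeAsk[Task, Logger] {
    val applicative: Applicative[Task] = Applicative[Task]
    def ask: Task[Logger] = local.read
    def reader[A](f: Logger => A): Task[A] = local.read.map(f)
  }

// Somewhere near the edge of the application:
// TaskLocal(defaultLogger).map(loggerAsk).flatMap { implicit ask => ... }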
Sergey Kolbasov
@sergeykolbasov
@hderms there is already one :) called odin-json
I'll spend some time in the next days working on proper documentation
@joroKr21 I'm not a huge fan of *Local things personally. It's even worse than implicits if you think about it. You have to trust that someone somewhere put the required data into the magical box of *Local before the moment you're going to use it
Georgi Krastev
@joroKr21
It's not so difficult to arrange as long as you don't have too many ends of the world. Besides, how is it different from providing a default NoopLogger to ReaderT?
You also need to make sure that someone is calling local with a new tracing logger.
Sergey Kolbasov
@sergeykolbasov
well, at least it's an explicit requirement to have one
you might as well have no NoopLogger at all and just run the Endpoint.Compiled[ReaderT[F[_], Ctx, *]] inside an Endpoint.Compiled[F[_]], where you have access to the request and can build a proper logger right away
Pavel Borobov
@blvp

Hello everyone.
I faced one little problem using Finch to build a JSON REST API application.
In my application I have two entities, User and Pet, and they both have CRUD-like operations.
Both API groups have their own encoders and decoders (I'm using circe).

class UserResources[F[_]](userRepo: UserRepo[F]) extends Endpoint.Module[F] with UserCodecs {
   val create: Endpoint[F, User] = post("user" :: jsonBody[User]) { user: User => userRepo.save(user).map(Ok(_)) }
   val getById: Endpoint[F, User] = get("user" :: path[Long]) { userId: Long => userRepo.findById(userId).map(Ok(_)) }
   ...
   val endpoints = (create :+: getById)
}
class PetResources[F[_]](petRepo: PetRepo[F]) extends Endpoint.Module[F] with PetCodecs {
   val create: Endpoint[F, Pet] = post("pet" :: jsonBody[Pet]) { pet: Pet => petRepo.save(pet).map(Ok(_)) }
   val getById: Endpoint[F, Pet] = get("pet" :: path[Long]) { petId: Long => petRepo.findById(petId).map(Ok(_)) }
   ...
   val endpoints = (create :+: getById)
}

Then I use them in this fashion:

val allEndpoints = (new UserResources(repo).endpoints :+: new PetResources(repo2).endpoints)
val api  = Bootstrap.serve[Application.Json](allEndpoints).toService

But this call requires the same Encoder/Decoder instances, which are defined in the *Codecs traits, for the .toService call to materialise the service. I understand why we should have the instances in both situations.

Could you please suggest how I can better organise the code in a similar fashion, but without duplicating the codec instances?

Sergey Kolbasov
@sergeykolbasov

Hi @blvp

Best practice in Scala is to put type class instances for specific types into companion objects whenever possible

The compiler picks them up from there on its own, without any imports.

If you need to define instances for types outside of your application (like library types), you can keep them inside a package object (or just an object) and import those implicits from there

Pavel Borobov
@blvp
Yeah, but that requires importing them first inside your Resource class and also in the component that combines several resources into a service. So in my example it would require both the User and Pet instances in the code where I call .toService
Sergey Kolbasov
@sergeykolbasov
You don't need to import anything if you put implicit encoders and decoders into corresponding companion objects (User and Pet in your example)
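For example, a sketch along these lines with semi-automatic circe derivation (field names are illustrative):

import io.circe.{Decoder, Encoder}
import io.circe.generic.semiauto.{deriveDecoder, deriveEncoder}

final case class User(id: Long, name: String)

object User {
  // Instances in the companion object are found implicitly everywhere,
  // both in UserResources and at the .toService call site, with no imports.
  implicit val encoder: Encoder[User] = deriveEncoder[User]
  implicit val decoder: Decoder[User] = deriveDecoder[User]
}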
Kevin Pei
@kpei
Hi guys, new to Finch. If I have an Endpoint[A], how do I go about getting its defined path? E.g. val endpoint = get("foo" :: "bar") { //something } — how do I extract the "foo/bar" from the endpoint?
Ryan Plessner
@rpless
Hi @kpei, one approach would be to use root to access the raw Finagle request, something like val endpoint = get(root :: "foo" :: "bar")((request) => ...). request.path should give you the full path
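A small sketch of that approach, assuming the IO-specialized constructors from io.finch.catsEffect are in scope:

import cats.effect.IO
import com.twitter.finagle.http.Request
import io.finch._
import io.finch.catsEffect._

// `root` exposes the raw Finagle request, so its path is available inside the mapper.
val endpoint: Endpoint[IO, String] =
  get(root :: "foo" :: "bar") { request: Request =>
    Ok(request.path) // "/foo/bar" for a matching request
  }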
Guillaume Balaine
@Igosuki
Hi there
I'm migrating from http4s to Finch. Has anyone ever used a bracket that signals IO[ExitCode] upon Finagle server termination?
I want to join a Finagle server concurrently with other fs2.Streams I have running in the background
Sergey Kolbasov
@sergeykolbasov

Hi @Igosuki I experimented with it once, don't remember the way but it's possible for certain

But to clarify, what do you mean exactly by upon Finagle server termination? Because usually you shut it down together with the whole application, and Http.serve returns a forever-running Twitter Future

Guillaume Balaine
@Igosuki
I should have mentioned that, in addition to mounting the server (so I can just wrap the server in a resource and close it as it gets discarded), I have a REPL workflow where I restart it
But I solved it, no worries
There’s actually an example in the repo that looked very similar to what I had
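For reference, a rough sketch (assuming cats-effect 2) of wrapping a Finagle server in a Resource so it starts and shuts down with the rest of the application; the Twitter Future conversion here is a minimal hand-rolled one:

import cats.effect.{IO, Resource}
import com.twitter.finagle.{Http, ListeningServer, Service}
import com.twitter.finagle.http.{Request, Response}
import com.twitter.util.{Return, Throw, Future => TFuture}

// Minimal Twitter Future -> IO conversion for the shutdown step.
def toIO[A](tf: => TFuture[A]): IO[A] =
  IO.async { cb =>
    tf.respond {
      case Return(a) => cb(Right(a))
      case Throw(e)  => cb(Left(e))
    }
    ()
  }

// The server is started on acquire and closed when the Resource is released,
// which composes with other fs2/cats-effect resources in the application.
def server(service: Service[Request, Response]): Resource[IO, ListeningServer] =
  Resource.make(IO(Http.server.serve(":8081", service)))(s => toIO(s.close()))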
Another question: is throwing exceptions the idiomatic way of handling errors in Finch? My entire codebase returns errors, so I guess I have to write my own error-based Response creators
Guillaume Balaine
@Igosuki
What's the best practice if endpoints are in multiple files?
ComFreek
@ComFreek
import io.circe.generic.auto._ gives me an error about the subpackage generic not being found. What SBT dependencies do I need for this?
Currently I have libraryDependencies ++= Seq( "com.github.finagle" %% "finchx-circe" % "0.31.0", "com.github.finagle" %% "finchx-generic" % "0.31.0" )
ComFreek
@ComFreek
Got it, opened #1194 to improve docs.
Sergey Kolbasov
@sergeykolbasov
@Igosuki we just keep our endpoints grouped in separate classes per domain, instantiate them in DI and then manually build a coproduct of all the endpoints
classes instead of objects due to the polymorphic F and dependency injection in the class constructor
Richard Gomes
@frgomes
Hello, is there a better way to write this endpoint below?
In a nutshell, I'm transforming an IO[Output[Something]] into a Future by calling unsafeToFuture. Since I already had an IO[...], I suppose this call to unsafeToFuture is not necessary, if not undesirable.
Is there a better way to do it, please?
  val productsQuery: Endpoint[IO, ProductListResponse] =
    post( "products" :: jsonBody[SimpleRequest] ) { req: SimpleRequest =>
          val origin = "austria"
          val products: IO[Output[ProductListResponse]] =
            productCache
              .flatMap { cache => cache.ref.get(origin) }
              .map { r => Ok(ProductListResponse(ctx.api.version, None, r.get.contents)) }
          products.unsafeToFuture
    } handle {
      case e: IllegalArgumentException => handleProductList.BadRequest(e)
      case e: Exception                => handleProductList.InternalServerError(e)
    }
Sergey Kolbasov
@sergeykolbasov
@frgomes just return IO[Output[Something]] from your endpoint. At the moment you have a round-trip of IO -> Future -> IO
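Following that advice, the endpoint above can return the IO directly, i.e. roughly:

  val productsQuery: Endpoint[IO, ProductListResponse] =
    post( "products" :: jsonBody[SimpleRequest] ) { req: SimpleRequest =>
          val origin = "austria"
          // The endpoint's effect is already IO, so no .unsafeToFuture round-trip is needed.
          productCache
            .flatMap { cache => cache.ref.get(origin) }
            .map { r => Ok(ProductListResponse(ctx.api.version, None, r.get.contents)) }
    } handle {
      case e: IllegalArgumentException => handleProductList.BadRequest(e)
      case e: Exception                => handleProductList.InternalServerError(e)
    }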
Richard Gomes
@frgomes
@sergeykolbasov : yeah... I suppose I'm missing some import of implicit conversions or something else, since the compiler complains that a Future is required.
[error] /home/rgomes/workspace/guided-repair-api/service/src/main/scala/Endpoints.scala:143:17: type mismatch;
[error]  found   : cats.effect.IO[io.finch.Output[api.model.ProductListResponse]]
[error]  required: scala.concurrent.Future[?]
[error]                 products //XXX .unsafeToFuture //FIXME: investigate if there's a way to avoid this call
I'm using these imports below:
  // These are magic imports which must survive IntelliJ attempts to "help us".
  // @formatter: off
  import cats.effect._
  import cats.implicits._
  import cats.syntax.apply._
  import io.circe.generic.auto._
  import io.finch._
  import io.finch.circe._
  // @formatter: on
Richard Gomes
@frgomes
My mistake. A for comprehension around the error line was imposing a Future. Fixing that fixes the entire endpoint.
@sergeykolbasov Thanks a lot :-)
Richard Gomes
@frgomes
Hello, I'm using filters in my endpoints.
I would like to "inject" a certain MyApp object into the Request when the request is authorized.
At the moment, the auth filter only blocks unauthorized requests, but does not pass MyApp to compiled(req).
Any idea how this could be done, please?
trait Filters extends Whiteboard with StrictLogging {
  import io.finch._
  import cats.effect.IO
  import cats.implicits._
  import com.twitter.finagle.http.Status
  import com.twitter.finagle.http.Response

  def authorized(authorization: Option[String]): IO[MyApp] = tokenValidation.authorized(authorization)

  val auth: Endpoint.Compiled[IO] => Endpoint.Compiled[IO] =
    compiled => {
      Endpoint.Compiled[IO] { req =>
        authorized(req.authorization)
          .redeemWith(
            _ => (Trace.empty -> Right(Response(Status.Unauthorized))).pure[IO],
            //FIXME: should pass MyApp object into the request
            // See examples 2 and 3 of: https://finagle.github.io/finch/cookbook.html#defining-custom-endpoints
            myapp => IO(println(myapp)) *> compiled(req))
      }
    }
...
}