    Swoorup Joshi
    @Swoorup
    ?
    it appears to work, not sure if this can be simplified
    Rob Norris
    @tpolecat
    If you want to execute each chunk in its own transaction then you can do s.groupWithin(1000, 5.seconds).evalMap { chunk => ... } where ... is chunk => F[Whatever].
    Since this looks like a long-running thing that's how I would do it.
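    A minimal sketch of that pattern, assuming fs2 3 (where groupWithin needs a Temporal constraint); the element type A and insertChunk are placeholders for the caller's own data and per-chunk transactional action:

    import cats.effect.Temporal
    import fs2.{Chunk, Pipe}
    import scala.concurrent.duration._

    // Batch up to 1000 elements, or whatever has arrived after 5 seconds,
    // and run each batch as its own effect (e.g. its own transaction).
    def batchedSink[F[_]: Temporal, A](insertChunk: Chunk[A] => F[Unit]): Pipe[F, A, Unit] =
      _.groupWithin(1000, 5.seconds).evalMap(insertChunk)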
    Swoorup Joshi
    @Swoorup
    yep
    Swoorup Joshi
    @Swoorup
    woot
    TooManyParametersException
    [info] 🔥
    [info] 🔥 Problem: Statement has more than 32767 parameters.
    [info] 🔥 Hint: Postgres can't handle this many parameters. Execute multiple
    [info] 🔥 statements instead.
    [info] 🔥
    why can’t it handle more parameters?
    Swoorup Joshi
    @Swoorup
    it's an array of elements, each consisting of 6 parameters
    about 1000 elements
    Rob Norris
    @tpolecat
    It expands to a statement with more than 32k parameters. That’s the most PG can handle.
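    For illustration only (the table and columns here are made up): a batch insert built with an encoder's .values.list(n) expands every row into its own placeholders, so rows × columns must stay under 32767.

    import skunk._
    import skunk.implicits._
    import skunk.codec.all._

    // A two-column row repeated n times yields 2 * n parameters; n = 2 expands to
    //   INSERT INTO t (a, b) VALUES ($1, $2), ($3, $4)
    // so a 9-column row batched ~4000 deep already exceeds the 32767 limit.
    def insertMany(n: Int): Command[List[(String, Int)]] =
      sql"INSERT INTO t (a, b) VALUES ${(varchar ~ int4).values.list(n)}".command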
    Swoorup Joshi
    @Swoorup
    i see
    Swoorup Joshi
    @Swoorup
    so the library is pretty barebones :smile:
    Swoorup Joshi
    @Swoorup
    still getting the exception
    val sink: Pipe[F, MarketData, Unit] = s =>
      s.groupWithin(PostgresConstant.MARKET_DATA_INSERT_CHUNK_SIZE, 5.seconds)
        .map(_.toList.map(_.convert[List[RawMarketDataRecord]]).flatten)
        .covary[F]
        .evalMap { chunk =>
          pool.use {
            _.prepare(insertMarketData(chunk.length))
              .use(_.execute(chunk))
              .void
          }.adaptError { case ex =>
            println(s"Exception: ${ex}")
            ex
          }
        }
    Rob Norris
    @tpolecat
    How big is the chunk?
    And how many fields does each element in the chunk have? This product can’t be more than 32k
    Swoorup Joshi
    @Swoorup
    max of 256
    yeah it does seem weird
    val marketDataRecord: SkunkCodec[RawMarketDataRecord] =
        (varchar(32)
        *: varchar(32)
        *: timestamptz 
        *: varchar(64)
        *: bool 
        *: bool 
        *: bool.opt 
        *: numeric(30,6)
        *: numeric(18,6))
        .pimap[RawMarketDataRecord]
    256 * 9 = 2304 max
    never mind, ugh
    the groupWithin should come after the flatten step
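    A hedged sketch of that reordering, reusing the helpers from the snippet above (convert, insertMarketData, pool and the constant are the caller's): flatten each MarketData into raw records first, then group, so each chunk really is bounded by the chunk size:

    val sink: Pipe[F, MarketData, Unit] = s =>
      s.flatMap(md => Stream.emits(md.convert[List[RawMarketDataRecord]]))
        .groupWithin(PostgresConstant.MARKET_DATA_INSERT_CHUNK_SIZE, 5.seconds)
        .evalMap { chunk =>
          pool.use { session =>
            session.prepare(insertMarketData(chunk.size)).use(_.execute(chunk.toList)).void
          }
        }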
    Rob Norris
    @tpolecat
    Print out the length of the chunk just to be sure.
    Ok.
    Aditya
    @aditya-K93
    Pardon me for being a total beginner. After going through the example docs I'm able to make my service work with Skunk (a service-based tagless-final approach, passing the session pool resource). However, I'm not able to get queries to run in sequence, e.g. getIDQuery and then use its result in doAnotherQuery. How do I run queries in sequence inside the context F? My output type is F[Option[Item]], I'm using a Query type, and I can't get it to compile.
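    A hedged sketch of one way to sequence two queries (getIdQuery, getItemQuery and Item are hypothetical, and prepare returning a Resource matches the older Skunk API used elsewhere in this log): borrow one session and chain the calls in a for-comprehension.

    import cats.effect.{IO, Resource}
    import cats.syntax.all._
    import skunk.{Query, Session}

    def findItem[Item](
        pool: Resource[IO, Session[IO]],
        getIdQuery: Query[String, Long],
        getItemQuery: Query[Long, Item],
        name: String
    ): IO[Option[Item]] =
      pool.use { s =>
        for {
          idOpt <- s.prepare(getIdQuery).use(_.option(name))                            // first query
          item  <- idOpt.flatTraverse(id => s.prepare(getItemQuery).use(_.option(id)))  // second query, only if the first matched
        } yield item
      }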
    Fabio Epifani
    @epifab

    Hello! I'm apparently having an issue decoding a field. In particular, I have a query like this:
    SELECT 2::float8 + 3::numeric
    I'd expect the result to be numeric, but I get a

    [info] 🔥  The actual and asserted output columns are
    [info] 🔥  
    [info] 🔥    ?column?  float8  ->  numeric  ── type mismatch

    I take it this is how Postgres works, and it has nothing to do with Skunk specifically?
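    That matches how Postgres resolves the operator: the numeric operand is promoted and the expression comes back as float8. A hedged sketch (with skunk.codec.all._ and skunk.implicits._ in scope): either decode what Postgres actually returns, or cast the expression back.

    val asFloat8  = sql"SELECT 2::float8 + 3::numeric".query(float8)              // Query[Void, Double]
    val asNumeric = sql"SELECT (2::float8 + 3::numeric)::numeric".query(numeric)  // Query[Void, BigDecimal]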

    Fabio Epifani
    @epifab

    hi, me again. I've got this:

    val codec = varchar(16)
    sql"SELECT $codec".query(codec)

    if I run it I get a ?column? varchar -> varchar(16) type mismatch. is this expected?

    Swapnil S.
    @Iamswapnil619
    Hi, is there any module for testing Skunk queries, like we do for doobie using doobie.scalatest.IOChecker?
    Swapnil S.
    @Iamswapnil619

    Hi,

    I have the case class below:

    case class StepRun(job_run_id: String, step_name: String, properties: Json)

    where properties is a json type. How do I write a codec for this?

    I have written the codec below, but it fails on the jsonb type. Error: Not found: jsonb

    def stepRunCodec: Codec[StepRun] = (varchar ~ varchar ~ jsonb).gimap[StepRun]

    Any help?

    Fabio Epifani
    @epifab
    if you're using circe you can import "org.tpolecat" %% "skunk-circe" % "0.2.0"
    then you have jsonb in skunk.circe.codec.all
    I think you can decode it directly into the final type rather than a generic json
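    A hedged sketch of both options, following the case class above:

    import io.circe.Json
    import skunk.Codec
    import skunk.codec.all._
    import skunk.circe.codec.all._   // provides the jsonb codec

    case class StepRun(job_run_id: String, step_name: String, properties: Json)

    // Keep the raw Json...
    val stepRunCodec: Codec[StepRun] = (varchar ~ varchar ~ jsonb).gimap[StepRun]

    // ...or, if the column has circe Encoder/Decoder instances for a concrete type,
    // decode straight into it with jsonb[MyProps] (MyProps being a hypothetical type).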
    Swapnil S.
    @Iamswapnil619
    @epifab Got the issue. Thanks for the help
    Swapnil S.
    @Iamswapnil619

    Hi, is there a way to process dynamic queries in Skunk? All the examples I have seen define the queries and their codecs at compile time, but is there any provision in Skunk for executing dynamic queries directly, without knowing the codecs?

    Like in doobie we can do this:

    def executeQuery(query: String): IO[DBException, Unit] =
      Fragment.const(query)
        .update
        .run
        .transact(transactor)
        .unit

    Any help?
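    A hedged sketch of one way to do this in Skunk, using the literal #$ interpolation (the interpolated text is not checked, so the caller must ensure it is a valid, parameter-free statement); session is assumed to be an open Session[IO]:

    import cats.effect.IO
    import skunk._
    import skunk.implicits._

    def executeRaw(session: Session[IO], statement: String): IO[Unit] =
      session.execute(sql"#$statement".command).void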

    Rob Norris
    @tpolecat
    I'm no longer monitoring this channel. Please switch to Typelevel Discord at https://sca.la/typeleveldiscord
    Michal Lacko
    @visox
    Hi, I don't quite get how to deal with Postgres arrays. I see Arr, but I don't see how to get a List out of an Arr?
    basicbang0x
    @basicbang0x

    Hi everyone, I have a session pool, poolResource: Resource[IO, Resource[IO, Session[IO]]]. I use Session.pooled(...).use to get the actual pool, pool: Resource[IO, Session[IO]], and all my Postgres services use this pool. However, I get an error that says the following:

    🔥  
    🔥  EofException
    🔥  
    🔥    Problem: EOF was reached on the network socket.
    🔥     Detail: Attempt to read 5 byte(s) failed after 0 bytes(s) were read, because
    🔥             the connection had closed.
    🔥       Hint: Discard this session and retry with a new one.
    🔥

    When I use pool.use, wouldn't that make a new session for me?

    How can I resolve this issue? Does it involve a certain config?
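    A hedged sketch of the usual shape, in case the outer resource is being released too early: the outer Resource owns the pool and should be used once for the application's lifetime, with only the inner Resource used per unit of work (connection parameters are placeholders, and tracing is a no-op just for the sketch):

    import cats.effect.{ExitCode, IO, IOApp, Resource}
    import natchez.Trace.Implicits.noop
    import skunk._
    import skunk.implicits._
    import skunk.codec.all._

    object PoolDemo extends IOApp {

      val poolResource: Resource[IO, Resource[IO, Session[IO]]] =
        Session.pooled[IO](
          host     = "localhost",
          user     = "user",
          database = "db",
          password = Some("pw"),
          max      = 10
        )

      def run(args: List[String]): IO[ExitCode] =
        poolResource.use { pool =>                          // pool stays open for the whole app
          pool.use(_.unique(sql"SELECT 1".query(int4)))     // borrow a session per operation
            .as(ExitCode.Success)
        }
    }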

    Alexandr Oshlakov
    @shadowsmind
    Hi everyone. Does anyone know how to work with a serial id in Skunk, when you need to insert an object without an id and get it back with the id?
    Alexandr Oshlakov
    @shadowsmind
    example:
    CREATE TABLE users(
      id        SERIAL NOT NULL,
      email     VARCHAR(255) NOT NULL,
      age       SMALLINT DEFAULT NULL,
      is_banned BOOLEAN NOT NULL DEFAULT FALSE
    );
    
    case class UserInsert(email: String, age: Option[Int])
    case class UserEntity(id: Long, email: String, age: Option[Int], isBanned: Boolean)
    
    trait UserRepository[F[_]] {
      def save(data: UserInsert): F[UserEntity]
    }
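    A hedged sketch of the usual answer: let Postgres assign the id and read the row back with INSERT ... RETURNING, exposed as a Query (not a Command) because it returns a row. Column types follow the DDL above (SERIAL is int4, SMALLINT is int2), so the decoder widens them to the Long/Int fields of UserEntity:

    import skunk._
    import skunk.implicits._
    import skunk.codec.all._

    // Encode only the caller-supplied fields; decode the full row Postgres hands back.
    val userInsertEnc: Encoder[UserInsert] =
      (varchar(255) ~ int2.opt).contramap[UserInsert](u => (u.email, u.age.map(_.toShort)))

    val userEntityDec: Decoder[UserEntity] =
      (int4 ~ varchar(255) ~ int2.opt ~ bool).map { case id ~ email ~ age ~ banned =>
        UserEntity(id.toLong, email, age.map(_.toInt), banned)
      }

    val insertUser: Query[UserInsert, UserEntity] =
      sql"""
        INSERT INTO users (email, age)
        VALUES ($userInsertEnc)
        RETURNING id, email, age, is_banned
      """.query(userEntityDec)

    // Usage, given a Session[F]:
    // s.prepare(insertUser).use(_.unique(UserInsert("someone@example.com", Some(30))))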
    Gökmen Nişancı
    @nisancigokmen_gitlab
    Hi Everyone ! :beers:
    Russel
    @arussel
    When you want to use UPDATE *** RETURNING ***, should you use a command or a query?
    Felix Bjært Hargreaves
    @hejfelix
    In doobie, there was a way to typecheck your queries against the database -- does skunk offer something similar?
    Harry Chen
    @chenharryhua
    Hi there, I'm trying to find an example of writing an upsert command using Postgres's ON CONFLICT statement. Can anyone help?
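    A hedged sketch (the table and columns are made up): ON CONFLICT is plain Postgres SQL, so the upsert is just a Command whose text includes it:

    import skunk._
    import skunk.implicits._
    import skunk.codec.all._

    val upsertPrice: Command[String ~ BigDecimal] =
      sql"""
        INSERT INTO prices (symbol, price)
        VALUES ($varchar, $numeric)
        ON CONFLICT (symbol) DO UPDATE SET price = EXCLUDED.price
      """.command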
    Truedrog
    @Truedrog
    Hi! In doobie I was able to construct a fragment from a string (read from a file).
    But how can I do it in Skunk?
    The Fragment constructor now takes 3 arguments; what should I pass as the third?
    Oliver Winks
    @ohuu
    @Truedrog I run sql files to initialise the db before tests
    val sqlStr = Source.fromResource(path).getLines.mkString("\n")
    val sql    = sql"#$sqlStr".command
    I do it like that :point_up_2:
    if there are multiple statements in your sql file you have to wrap them in a DO block: https://www.postgresql.org/docs/current/sql-do.html
    Oliver Winks
    @ohuu
    I'm a bit confused about tracing/logging in Skunk. I'm not used to tracing. When I execute queries I get super detailed (and very useful) logs in the terminal about each query and result. This is awesome, but during tests it gets super noisy. Is there a way to silence (or at least be selective about) the output?
    HansG
    @HansG
    @tpolecat I have tried to refactor and update the commented-out Http4sExample in "modules/example/src/main/scala-2/Http4s.scala" and came up with two versions in a gist, whereby the second one, Http4sExample2.scala, seemed to me to be the more "Skunk-idiomatic". Unfortunately only the first version, Http4sExample1.scala, works; the second throws a skunk.exception.PostgresErrorException ("..prepared statement doesn't exist"). Can you please give me a hint as to what's wrong with the second version?