    The reason is that in my integration tests I need to keep multiple queries from multiple data accessors in one transaction, which is not possible if we create a new session for each object.
    Or am I wrong?
    Kazuhiro Sera
    @pauljamescleary Sorry for my late reply; I haven't checked here for a long time. If you need one big insert statement, you need to build such a query dynamically on your own. For instance,
    // e.g. group: Seq[Seq[Any]] -- one inner Seq of column values per row
    val valuesPart: SQLSyntax = sqls.csv(group.map { params => sqls"(${sqls.csv(params: _*)})" }: _*)
    sql"INSERT INTO table (a, b, c) VALUES ${valuesPart}".execute.apply()
    @Sindu__twitter It's possible to bind values from a ResultSet to a nested data structure. See also: http://scalikejdbc.org/documentation/one-to-x.html However, mapping a one-to-many-to-many structure with a single SQL query can be inefficient in most cases. You can do such mapping manually, but the one-to-x API doesn't support that case.
    Paul Cleary
    @seratch it actually does work with rewriteBatchedStatements set to true :) Verified with the general log
    Andrii Lytvynenko

    Hi, people. I'm trying to connect ScalikeJDBC with SQLite and getting
    java.sql.SQLException: Cannot change read-only flag after establishing a connection. Use SQLiteConfig#setReadOnly and SQLiteConfig.createConnection().
    all the time.

    Is there any way to connect the lib with SQLite?
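    Not an official fix, but a workaround sketch, assuming the error comes from ScalikeJDBC calling setReadOnly on read-only sessions: stick to autoCommit sessions, which don't toggle the flag (the URL and query below are placeholders):

    ```scala
    import scalikejdbc._

    // assumption: sqlite-jdbc is on the classpath
    Class.forName("org.sqlite.JDBC")
    ConnectionPool.singleton("jdbc:sqlite:sample.db", "", "")

    // DB.readOnly would call Connection#setReadOnly and trigger the error;
    // DB.autoCommit leaves the flag alone
    val count: Option[Int] = DB.autoCommit { implicit session =>
      sql"select count(*) from sqlite_master".map(_.int(1)).single.apply()
    }
    ```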

    Mark Harris
    Apologies if this is a FAQ and I've just missed it somewhere. I've got a data set on S3 I'm looking to query from a lambda. The data's stored in parquet, so Athena seems like a good choice. Last time I looked, it wasn't possible to use the athena connector and scalike because it's based entirely on prepared statements, which the connector doesn't support.
    Is that still the case?
    If so, has anyone successfully used https://github.com/zaneli/scalikejdbc-athena or some other way to query Athena at a higher level than hand-rolling a lot of AWS SDK cruft?
    Jem Mawson
    Hi. I'm trying to do an UPDATE ... RETURNING statement with an AutoSession, but it fails because SQLToResult enforces a readonly txn. Can I circumvent this? https://stackoverflow.com/questions/55213167/update-returning-queries-in-scalikejdbc
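    A hedged workaround sketch, assuming the read-only enforcement only kicks in under AutoSession: run the statement inside an explicit writable transaction (table and columns here are illustrative):

    ```scala
    import scalikejdbc._

    // inside DB.localTx the session is transactional, so the
    // read-only handling that applies to AutoSession doesn't
    val ids: List[Long] = DB.localTx { implicit session =>
      sql"update payments set status = 'done' where status = 'pending' returning id"
        .map(_.long("id")).list.apply()
    }
    ```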
    Jem Mawson

    Another question… I have a method that works in prod:

    DB.localTx { implicit s =>

    But when run in a test it does not see the data set up in the fixture via scalikejdbc.specs2.mutable.AutoRollback. That’s because the spec has its own transaction.
    Instead of DB.localTx, what should I use that would join the existing transaction if necessary (spec) or start a new transaction (prod)? I have access to session.tx: Option[Tx], which is None in prod and Some in test.

    Jem Mawson
    I solved it by matching on the session type. If it’s AutoSession, I know I’m inside Play; otherwise I’m probably in a spec and need to run in the same session. This minimises the difference between prod and test code in the prod class. Not perfect, but good enough for me. Open to other suggestions:
    def statement(implicit session: DBSession): Iterator[Payment] =
      // reconstructed sketch: the original paste lost the sql"""...""" wrapper
      // and the mapping; Payment(_) is assumed to map a WrappedResultSet
      sql"""
        update payments
        set status = 'submitted', submitted = $now
        where status = 'pending'
        and scheduled <= $now
        returning $paymentFields
      """.map(Payment(_)).list.apply().iterator

    session match {
      case AutoSession => DB.localTx(statement(_))
      case _           => statement(session)
    }
    Kenneth Lu
    Hi all, I want to create a scala function that lets you create a table with some input name
    I have something like this:

        def createTable(name: String): Unit =
          sql"""create table $name (id serial not null primary key, name varchar(64), created_at timestamp not null)"""
            .execute().apply()

    but if I'm understanding this right, ScalikeJDBC treats $name as a bind parameter, hence this doesn't work.
    Any suggestions?
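    Identifiers indeed can't be bound as parameters, so something like the sketch below may help: SQLSyntax.createUnsafely splices the name in as raw SQL, which is why it's validated first (the whitelist regex is my own assumption, not a library feature):

    ```scala
    import scalikejdbc._

    def createTable(name: String)(implicit session: DBSession): Unit = {
      // createUnsafely bypasses binding, so reject anything but a plain identifier
      require(name.matches("[A-Za-z_][A-Za-z0-9_]*"), s"invalid table name: $name")
      val table = SQLSyntax.createUnsafely(name)
      sql"""create table $table (
              id serial not null primary key,
              name varchar(64),
              created_at timestamp not null
            )""".execute.apply()
    }
    ```

    ScalikeJDBC also offers `#$name` raw interpolation as a shorter alternative, with the same injection caveat.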
    Vladimir Morozov
    Hi all, how do I correctly read and write a Seq[T] stored as JSONB in PostgreSQL?
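    Not a built-in feature as far as I know; one sketch is to serialize with a JSON library and bind through PGobject (json4s, the helper names, and the column handling are all assumptions):

    ```scala
    import java.sql.PreparedStatement
    import scalikejdbc._
    import org.json4s.native.Serialization
    import org.postgresql.util.PGobject

    implicit val formats: org.json4s.Formats = org.json4s.DefaultFormats

    // write: Seq[T] -> jsonb via a custom ParameterBinder
    def jsonbParam[T <: AnyRef](xs: Seq[T]): ParameterBinder =
      ParameterBinder(value = xs, binder = (stmt: PreparedStatement, idx: Int) => {
        val obj = new PGobject
        obj.setType("jsonb")
        obj.setValue(Serialization.write(xs))
        stmt.setObject(idx, obj)
      })

    // read: jsonb text -> Seq[T]
    def readSeq[T: Manifest](rs: WrappedResultSet, column: String): Seq[T] =
      Serialization.read[Seq[T]](rs.string(column))
    ```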
    Jonas Natten
    This isn't really a scalikejdbc issue, but I'm using json4s to serialize and deserialize data to jsonb. And after upgrading json4s from 3.5.5 to 3.6.x something goes wrong.
    I just use read[LearningPath](jsonString) in my constructor to deserialize and the LearningPath type is defined like this: http://scalikejdbc.org/documentation/one-to-x.html
    stacktrace -> https://hastebin.com/naxadomano.scala
    Any suggestions?

    Hey, I was wondering how I would call a stored procedure with scalike.

    CREATE OR REPLACE FUNCTION truncate_tables() RETURNS void AS $$
    DECLARE
        statements CURSOR FOR
            SELECT tablename FROM pg_tables
            WHERE schemaname = 'public';
    BEGIN
        FOR stmt IN statements LOOP
            EXECUTE 'TRUNCATE TABLE ' || quote_ident(stmt.tablename) || ' CASCADE;';
        END LOOP;
    END;
    $$ LANGUAGE plpgsql;

    This is the stored procedure I'm using. I was hoping I could do something like this:

    val sql = sqls"select truncate_tables()"

    Is there a way to do this?
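    As far as I know there is no dedicated stored-procedure API here, but since truncate_tables() is callable via SELECT, a plain statement should work; a sketch, not verified against this schema:

    ```scala
    import scalikejdbc._

    DB.autoCommit { implicit session =>
      // note sql"...", not sqls"...": sqls builds a reusable fragment,
      // while sql builds an executable query
      sql"select truncate_tables()".execute.apply()
    }
    ```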

    No one watching this channel anymore?
    Kenneth Lu
    @larstoc so, I'm not a contributor to scalike, but I think the best way to get your question answered might be to post a github issue:
    Hello, is anyone using scalikejdbcGen with Presto? When I tried it, I got a java.sql.SQLFeatureNotSupportedException: primary keys not supported, since primary keys are not supported by the Presto JDBC library.
    Hello, also interested in knowing if anyone here has used ScalikeJDBC with Snowflake on AWS. I pretty much have it working for inserts/deletes, but select statements fail because ScalikeJDBC appears to use a readOnly DB session and attempts to set the readOnly flag on the underlying JDBC driver, which fails in snowflake-jdbc because setting the flag is not supported. Has anyone run into this before or know a workaround?
    Perhaps let me ask the question a different way: can I run a select query via ScalikeJDBC query interpolation in a non-readOnly way?
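    Regarding that last question, a hedged sketch: selects don't require a read-only session, and an autoCommit session should avoid the setReadOnly call that snowflake-jdbc rejects (table and column are placeholders):

    ```scala
    import scalikejdbc._

    // DB.autoCommit never calls Connection#setReadOnly, unlike DB.readOnly
    val ids: List[Long] = DB.autoCommit { implicit session =>
      sql"select id from orders".map(_.long("id")).list.apply()
    }
    ```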
    Kazuhiro Sera
    @larstoc Thanks for submitting an issue - as I already answered there, it's not supported yet scalikejdbc/scalikejdbc#1062
    @amirhac Hmm, I've never tried to run the command with Presto. If fixing the incompatibility needs only a little effort, it may be possible to address it.
    @uneewk will respond to your GH issue later
    Hello, has anyone read Typesafe Config from an HDFS path? We use a different config path for each environment.
    Han Ju
    Hi. I just found out that when I'm using sql interpolation with java.time.LocalDateTime, scalikejdbc actually changes the time according to the default time zone. Is there a way to deactivate this behavior? Thanks!
    What we want is just to write the LocalDateTime as is into DB, regardless of the timezone on the machine.
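    One possible direction, assuming the shift happens when the value is converted through java.sql.Timestamp with the JVM default zone: override the binding with a custom ParameterBinderFactory (the formatter choice is mine):

    ```scala
    import java.time.LocalDateTime
    import java.time.format.DateTimeFormatter
    import scalikejdbc._

    // bind LocalDateTime as a zone-free string so the driver writes it verbatim,
    // instead of the default Timestamp conversion in the JVM's time zone
    implicit val localDateTimeBinder: ParameterBinderFactory[LocalDateTime] =
      ParameterBinderFactory[LocalDateTime] { value => (stmt, idx) =>
        stmt.setString(idx, value.format(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")))
      }
    ```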
    Ilya Kachalsky
    Hi everyone, does anyone know where to find the changelog for new versions?
    Dragutin Marjanović


    Is there an option to use .returning in an update when .where is mixed in? I've found an example in the tests, but there's no .where there.


    But, when I try something like:

    update(A).set(ac.name -> "value").where.eq(ac.id, 1).returning(ac.*)

    it doesn't work.

    I found a workaround and use .append(sqls"returning ${ac.*}").

    I would create a PR for this if currently there's no way to do it.

    Please let me know. Thanks in advance.

    Roberto Leibman
    Hello! So... I'm working on this gnarly query that has a bunch of CTEs. I need to construct my query dynamically, and inside the dynamic part there are variables... paraphrasing my real query:
      val strings = List("A", "B", "C")
      val queryStr = s"""WITH cte AS (${strings.map(s => s"select '${s}' as MyCol").mkString(" UNION ALL ")}) select * from cte"""

    I then do:


    In order to get a scalikejdbc query that I can use.

    But this is open to a sql injection!
    How would you handle this?
    Przemek Sokół
    Hi there! I've been wondering whether it is possible to build SQL queries with ScalikeJDBC without connecting to the database and without side effects. I guess that ScalikeJDBC can connect to the DB to figure out table schemas. That's cool, but I would rather provide the table definition myself (as in Slick), even if the query is going to fail at runtime (due to a misspelled column name etc.). I simply want to get rid of SQL query strings in my codebase in favor of the ScalikeJDBC DSL. Is it doable, or is a valid connection pool absolutely necessary for ScalikeJDBC to construct SQL query strings?
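    For what it's worth, the QueryDSL can render SQL without any connection if you supply the column list yourself, since column metadata is otherwise looked up lazily from the pool; a sketch with an invented table:

    ```scala
    import scalikejdbc._

    case class Account(id: Long, name: String)
    object Account extends SQLSyntaxSupport[Account] {
      override val tableName = "accounts"
      // overriding `columns` avoids the JDBC metadata lookup entirely
      override val columns = Seq("id", "name")
    }

    val a = Account.syntax("a")
    val q = withSQL { select.from(Account as a).where.eq(a.id, 1) }
    println(q.statement) // SQL string produced with no DB connection
    ```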
    Ilya Kachalsky


    How would you handle this?

    I'm doing so using SQLSyntax:

      val strings = List("A", "B", "C")
      // build each piece as SQLSyntax so the values stay bound parameters;
      // mkString on plain strings would collapse everything into one bind parameter
      val union: SQLSyntax = sqls.join(strings.map(s => sqls"select $s as MyCol"), sqls"union all")
      val queryStr: SQLSyntax = sqls"WITH cte AS ($union) select * from cte"
    And then you can simply do sql"${queryStr}"
    I also have a question:
    Is there a way to register out parameters for queries like {? = call function(?)}
    Hello, I just started to play around with this awesome library today. I have some questions. Is the real connection to the DB lazily established? If so, is there a way to see when the actual connection is established?
    Also, is there a way to see which connection from the pool was used to run a query?
    Hello, I would like to have access to the statement (or prepared statement) object before it is closed. I believe it is encapsulated in the StatementExecutor case class. Is it possible to get access to the StatementExecutor object?
    Matt Dziuban
    Hi all, I'm seeing that if I run multiple queries in parallel using withinTx, all of the queries take as long as the longest one; with AutoSession, however, each query completes as soon as it finishes executing. Is this expected behavior?
    Anu Zachariah
    Hi, how can I increase the maxPoolSize and maxQueueSize of the AsyncConnectionPool? I keep getting "the queue is full" errors.
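    A sketch, assuming scalikejdbc-async's AsyncConnectionPoolSettings exposes these fields (the URL and sizes are placeholders):

    ```scala
    import scalikejdbc.async._

    // pass pool settings when creating the singleton pool
    AsyncConnectionPool.singleton(
      "jdbc:mysql://localhost:3306/app", "user", "pass",
      AsyncConnectionPoolSettings(maxPoolSize = 32, maxQueueSize = 256)
    )
    ```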

    How can I establish an SSL connection to the database?
    I didn't find an answer in the documentation.
    I need to send additional properties, like in the example below, but I didn't find how to do it.

    val options = Map(
      "javax.net.ssl.trustStore"           -> "truststore_superadmin.jks",
      "javax.net.ssl.trustStorePassword"   -> "welcome123",
      "javax.net.ssl.trustStoreType"       -> "JKS",
      "javax.net.ssl.keyStore"             -> "keystore_superadmin.jks",
      "javax.net.ssl.keyStorePassword"     -> "welcome123",
      "javax.net.ssl.keyStoreType"         -> "JKS",
      "oracle.net.authentication_services" -> "(TCPS)",
      "oracle.net.ssl_server_dn_match"     -> "false",
      "oracle.net.ssl_cipher_suites"       -> "(TLS_RSA_WITH_AES_256_CBC_SHA256)"
    )

    If somebody has any example would be great if you could share it
    Thank you
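    One route I'd try (unverified): configure the driver-level properties on a DataSource yourself and hand it to ScalikeJDBC through DataSourceConnectionPool. The Oracle classes and setters here are assumptions about the Oracle driver, not ScalikeJDBC API; the URL is made up:

    ```scala
    import java.util.Properties
    import scalikejdbc._
    import oracle.jdbc.pool.OracleDataSource // assumption: ojdbc on the classpath

    val props = new Properties()
    props.setProperty("javax.net.ssl.trustStore", "truststore_superadmin.jks")
    props.setProperty("oracle.net.authentication_services", "(TCPS)")
    // ... remaining SSL properties from the map above

    val ds = new OracleDataSource()
    ds.setURL("jdbc:oracle:thin:@//db.example.com:2484/SVC")
    ds.setConnectionProperties(props)

    // ScalikeJDBC side: wrap the pre-configured DataSource
    ConnectionPool.singleton(new DataSourceConnectionPool(ds))
    ```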

    Hello, everyone. I need some help with ScalikeJDBC, hence this question for its users.
    My project has these tables: Posts, Users, Votes. Post is one-to-many with Votes and many-to-one with Users. I want to get all posts with votes and users, ordered by vote count. The first problem was an incorrect limit, but I fixed that with sub-queries. But how can I sort posts by vote count?
    I know how to do it in SQL. How can I order by count(v.voteId) in the DSL? .orderBy(x(count(v.voteId))) is not working.
    In this gist you can find my DSL function and SQL script: https://gist.github.com/afanasevek/82fb6b0f09d7f74565b55adf9879d582
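    A QueryDSL sketch for the ordering part, using sqls.count; the Post/Vote names mirror the gist, but the mapping details are assumed:

    ```scala
    import scalikejdbc._

    val p = Post.syntax("p")
    val v = Vote.syntax("v")

    val posts: List[Post] = withSQL {
      select(p.result.*)
        .from(Post as p)
        .leftJoin(Vote as v).on(p.id, v.postId)
        .groupBy(p.id)
        // pass an aggregate SQLSyntax to orderBy instead of a column reference
        .orderBy(sqls.count(v.voteId)).desc
    }.map(Post(p.resultName)).list.apply()
    ```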
    Ramakrishna Hande

    Hi, I am using ScalikeJDBC in my project and I have an SQL query like this:

          val str = "select name, age from employee"
          val selectQueryString = sqls"$str"
          val like = "%" + term + "%"
          val emp: List[Employee] = db.localTx { implicit session =>
            sql"#$selectQueryString where name like ${like} order by age"
              .map(rs => Employee(rs.string("name"), rs.int("age")))
              .list.apply()
          }

    But when I run the program I get an error like this. Can someone please help?

                Failed preparing the statement (Reason: Parameter index out of range (1 > number of parameters, which is 0

    My requirement is: I already have a safe SQL query as a string, and I want to add conditions to it so that the query stays safe from SQL injection. Thanks in advance.
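    A sketch of a fix, assuming the failure comes from sqls"$str" turning the whole query into a bind parameter: embed the trusted prefix raw with #$ and keep only the user input bound (Employee, term, and db come from the message above):

    ```scala
    import scalikejdbc._

    val str = "select name, age from employee" // trusted, not user input
    val like = "%" + term + "%"

    val emp: List[Employee] = db.localTx { implicit session =>
      // #$ splices `str` in as raw SQL; ${like} stays a safe bound parameter
      sql"#$str where name like ${like} order by age"
        .map(rs => Employee(rs.string("name"), rs.int("age")))
        .list.apply()
    }
    ```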

    Ramakrishna Hande


    I was using scalikejdbc (sync) and now I want to use scalikejdbc-async.


    I have added

              "org.scalikejdbc"       %% "scalikejdbc-async" % "0.14.+",
              "com.github.jasync-sql" %  "jasync-mysql"      % "1.1.+"

    to my dependencies

    I am trying to setup a connection pool using

             val ds: HikariDataSource = {
               // read from config file
               ...
             }
             AsyncConnectionPool.singleton(ds.getJdbcUrl, ds.getUsername, ds.getPassword)

             val query = SQLSyntax.createUnsafely(filteredQuery)
             val result = AsyncDB.localTx { implicit tx =>
               ... // e.g. .map(rs => rs.long(1))
             }

    but I get

                      Message: Error 1059 - #42000 - Identifier name is too long

    The identifier is nothing but my JDBC URL.
    Is this a bug in the library ?

    thanks in advance

    Hey everyone, I just want to know the reason for using the loan pattern with structural types. It looks like the standard Java AutoCloseable is suitable for this. Just curiosity.
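    A guess at the rationale: a structural type is broader than java.lang.AutoCloseable, since it accepts anything with a close() method whether or not the class implements the interface (older JDBC drivers, for instance). A minimal sketch of the pattern:

    ```scala
    import scala.language.reflectiveCalls

    // loan pattern over a structural type: any value with close(): Unit qualifies
    def using[R <: { def close(): Unit }, A](resource: R)(f: R => A): A =
      try f(resource) finally resource.close()

    // usage with a class that never implements AutoCloseable
    class Handle { var closed = false; def close(): Unit = closed = true }
    val h = new Handle
    val answer = using(h)(_ => 42)
    // after the call, h.closed is true and answer == 42
    ```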