    Antoine Doeraene
    @sherpal

    The question is why you want to keep that import statement in as few files as possible. If it's because your project is the bare bones for building other projects, then you can always store the profile you want to use somewhere in a global object. Something like

    object DatabaseGlobals {
      def profile = slick.jdbc.PostgresProfile
    }

    and then you can import that instead, via my.package.DatabaseGlobals.profile.api._.
    If the reason is for mocks in tests, then I think you can put your table definitions inside a trait which requires an abstract JDBC profile, and feed that profile in by extending the trait (I think that works).

    Margus Sipria
    @margussipria
    Is there an easy way to make an extension to Query that will allow adding NOWAIT to select queries for Postgres?
    nafg
    @nafg
    Yeah there's some method to modify the SQL, I don't remember offhand though
    Margus Sipria
    @margussipria
    Yeah, I have tried to find one, but it's probably not well documented; I haven't found anything.
    nafg
    @naftoligug:matrix.org
    [m]
    Margus Sipria
    @margussipria
    this seems to work
        val action = query.result.headOption
        action.overrideStatements(action.statements.map(_ + " NOWAIT"))
    I'm just worried whether parameter binding will happen correctly in every case. I was wondering if there could be some way to make this work with WrappingQuery (like forUpdate):
      implicit protected class AddNoWait[E, U, C[_]](val q: Query[E, U, C]) {
        def nowait: Query[E, U, C] = {
          ???
          new WrappingQuery[E, U, C](q.toNode, q.shaped)
        }
      }
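
    A minimal sketch of an action-level alternative, assuming Slick 3.x where overrideStatements is available on SqlAction, and assuming the generated statement already ends in FOR UPDATE (the class and method names below are illustrative):

    import slick.dbio.{Effect, NoStream}
    import slick.sql.SqlAction

    // Re-emit the same action with " NOWAIT" appended to each generated statement;
    // overrideStatements only changes the SQL text, so the bind parameters stay untouched.
    implicit class NoWaitAction[R, S <: NoStream, E <: Effect](action: SqlAction[R, S, E]) {
      def nowait: SqlAction[R, S, E] =
        action.overrideStatements(action.statements.map(_ + " NOWAIT"))
    }

    // usage, mirroring the snippet above: db.run(query.result.headOption.nowait)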
    nafg
    @naftoligug:matrix.org
    [m]
    No idea sorry
    Anil Singh
    @asjadoun
    Is there a way to connect to a Sybase database? Any help or example is appreciated.
    Antoine Doeraene
    @sherpal
    There is a JDBC driver, so in principle that means you can connect. The difficulty is that you need to teach Slick the syntax via a profile, unless the syntax is the same as another existing db.
    Naveen
    @dexter2305

    @sherpal - The intention is to use an in-memory database for testing.
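
    A minimal sketch of that abstract-profile pattern, assuming an in-memory H2 database for tests (table and object names are illustrative):

    import slick.jdbc.JdbcProfile

    // Table definitions only know about an abstract JdbcProfile.
    trait UserTables {
      val profile: JdbcProfile
      import profile.api._

      class Users(tag: Tag) extends Table[(Long, String)](tag, "users") {
        def id   = column[Long]("id", O.PrimaryKey, O.AutoInc)
        def name = column[String]("name")
        def *    = (id, name)
      }
      lazy val users = TableQuery[Users]
    }

    // Production wiring uses Postgres...
    object PostgresTables extends UserTables {
      val profile: JdbcProfile = slick.jdbc.PostgresProfile
    }

    // ...while tests plug in the in-memory H2 profile instead.
    object H2Tables extends UserTables {
      val profile: JdbcProfile = slick.jdbc.H2Profile
    }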

    nafg
    @nafg
    I plan to do some maintenance tasks in say half an hour live on https://discord.gg/afDsPzmU, if anyone wants to join
    nafg
    @nafg
    On now
    Debasish Ghosh
    @debasishg

    Here is something that I am trying to do with Slick.

    • I have a column in a table which is NOT a primary key, but I want to have a global ordering on it. My initial thinking was to make it an AUTO_INC column
    • On insert things are fine, with AUTO_INC doing its part in assigning an incremented value automatically
    • Now my use case demands that when a row gets updated, I change the value of that column to (the max value of the column + 1), something like update table set global_offset = (select max(global_offset) + 1 from table)

    Besides the fact that this might have performance implications, the underlying sequence of the AUTO_INC column does not get updated. Hence the same value can also be assigned to another row through a later insert statement.

    Any help on how I can do this with Slick? Thanks.

    nafg
    @nafg
    @debasishg what RDBMS?
    I think you have a few different questions. I would suggest first figuring out how to do it in SQL in your RDBMS, then breaking down the different Slick questions that remain
    In general, Slick does not have built-in support for update expressions, AFAIK (only literal values)
    Of course you can use the sql string interpolator, or construct DBIOs with raw JDBC interaction (I forgot the exact way to do that offhand)
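
    A minimal sketch of the sql-interpolator route, assuming Postgres and illustrative table/column names (state, global_offset, id):

    import slick.jdbc.PostgresProfile.api._

    // Bump global_offset for one row to (current max + 1) entirely in SQL,
    // since lifted updates only accept literal values. Note this is still racy
    // under concurrent writers unless rows are locked or the transaction is serializable.
    def bumpOffset(id: Long): DBIO[Int] =
      sqlu"""
        UPDATE state
        SET global_offset = (SELECT COALESCE(MAX(global_offset), 0) + 1 FROM state)
        WHERE id = $id
      """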
    Debasish Ghosh
    @debasishg
    @nafg Can I do the following in Slick? I know AutoInc is a better option, but I want a handle on the sequence, as I would like to use it when updating this table.
    CREATE TABLE state (
      offset integer NOT NULL DEFAULT nextval('state_ordering_seq'),
      ..
    );
    nafg
    @nafg
    @debasishg I'm not sure if O.Default takes an expression or raw SQL. But in general I don't recommend using Slick to generate your DDL
    It's nice that you can write Slick table definitions and get some CREATE TABLE statements, but then how do you do migrations? IMO it's convenient for getting started but not practical for production code
    Especially if you codegen your table definitions from the database
    If you draw a dependency diagram between the database and your code, the arrows shouldn't reveal a cycle
    I use Flyway. Write your DDL in Flyway migrations, and then worry about how your code talks to the database
    Jeremiah Malina
    @jjmalina

    Can someone tell me if this config looks correct for mysql using Slick 3.3.1 on Scala 2.12?

    nc {
      mysql {
        profile = "slick.jdbc.MySQLProfile$"
        dataSourceClass = "slick.jdbc.DatabaseUrlDataSource"
        properties = {
          driver = "com.mysql.cj.jdbc.Driver"
          databaseName = "analytics_cache_schema"
          serverName = "localhost"
          portNumber = 3306
          user = "analytics_cache"
          password = "qwe90qwe"
          characterEncoding = "utf8"
          useUnicode = true
        }
        numThreads = 10
        keepAliveConnection = true
      }
    }

    I initialize with Database.forConfig("nc.mysql") but I'm getting this error: java.lang.ClassNotFoundException: slick.jdbc.DatabaseUrlDataSource
    It only happens in EMR 6.0.0 with Spark 2.4.4, and not in my tests run by sbt
    If I inspect my jar, I can see that slick/jdbc/DatabaseUrlDataSource.class is included

    nafg
    @naftoligug:matrix.org
    [m]
    Can you post the full output
    Naveen
    @dexter2305
    @jjmalina I used a similar configuration with mysql. I have build.sbt with "mysql" % "mysql-connector-java" % "6.0.6". Can you check whether your mysql driver library dependency is defined in your build.sbt?
    nafg
    @naftoligug:matrix.org
    [m]
    He said it works in sbt, just not in Spark
    Akinmolayan Olushola
    @osleonard
    @jjmalina Slick and Spark won't work together. If what you are interested in is reading and writing to MySQL, you should use the Structured Streaming APIs, which support many types of data storage
    Andy Czerwonka
    @andyczerwonka

    I have a simple insert that looks like:

    records += Record(c1, c2, c3)

    I'd like to change it to make that insert conditional. I don't want to proceed with the insert if a record exists where all the columns match but c3 is null.

    I'd like to have Slick generate the following:

    insert into mytable (c1, c2, c3)
    select 'one', 'two', 'three'
    where not exists (select 1 from mytable where c1 = 'one' and c2 = 'two' and c3 is null);
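
    A minimal sketch of one way to get Slick to emit an insert guarded by NOT EXISTS, using forceInsertQuery; the table mapping and column names below are illustrative assumptions:

    import slick.jdbc.PostgresProfile.api._

    class Records(tag: Tag) extends Table[(String, String, Option[String])](tag, "mytable") {
      def c1 = column[String]("c1")
      def c2 = column[String]("c2")
      def c3 = column[Option[String]]("c3")
      def *  = (c1, c2, c3)
    }
    val records = TableQuery[Records]

    // Insert (v1, v2, v3) only if no row exists with the same c1/c2 and a NULL c3.
    def insertUnlessNullC3(v1: String, v2: String, v3: String): DBIO[Int] =
      records.forceInsertQuery {
        val clash = records
          .filter(r => r.c1 === v1 && r.c2 === v2 && r.c3.isEmpty)
          .exists
        Query((v1, v2, Option(v3))).filterNot(_ => clash)
      }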
    amahonee
    @amahonee

    Hi everyone, does anyone have experience using Slick with the Squants library? I am trying to add two columns of Rep[Length] together. What I'm running into looks like an implicit string conversion being applied, which messes things up. The first value (bow) is lifted and read as Rep[Length]; the second (stern), however, has the anyOptionLift(primitiveShape(stringColumnType)) implicit applied to it. Both bow and stern in the for comprehension are read as Rep[Length], but the yield uses the any2stringadd implicit, causing a "String expected" error.

    val length: Rep[Option[Length]] = for {
          bow   <- nearbyVessel.distanceToBow
          stern <- nearbyVessel.distanceToStern
        } yield bow + stern

    Any suggestions on how I can solve this by using explicit extension methods or writing my own are greatly appreciated, cheers.
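
    A minimal sketch of the explicit-extension-method route, assuming an implicit column type for Length (e.g. a MappedColumnType[Length, Double]) is already in scope; the class name is illustrative:

    import slick.jdbc.PostgresProfile.api._
    import squants.space.Length

    // An explicit "+" for Rep[Length] that always renders "(a + b)" in SQL,
    // so the any2stringadd implicit never gets a chance to apply.
    implicit class LengthColumnOps(private val self: Rep[Length]) {
      def +(other: Rep[Length]): Rep[Length] =
        slick.lifted.SimpleExpression.binary[Length, Length, Length] { (a, b, qb) =>
          qb.sqlBuilder += "("
          qb.expr(a, true)
          qb.sqlBuilder += " + "
          qb.expr(b, true)
          qb.sqlBuilder += ")"
        }.apply(self, other)
    }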

    MrFincher
    @MrFincher
    Hi,
    I have a situation where I have a complex query; for parts of it I already have lifted Slick queries, and the rest is (still) plain SQL at the moment.
    In the end I want to use the query's results for an update which, as far as I can tell, is not currently possible with Slick without loading the results from the db into my service's memory and sending them back to the db again, correct?
    But I would like to use lifted Slick as much as possible.
    Based on this, the two solutions I see are:
    • running the complex query using Slick and storing the results in a temporary table, then doing the update in plain SQL
    • getting the SQL statement generated from my lifted Slick query as a string and using it in a plain SQL update statement (via string concatenation), as in the sketch below
      The first issue I encountered while trying out the second option is that the generated SQL contains generated column names like x1, x2, etc. I managed to work around that by wrapping the query in some plain SQL that renames the columns based on their position (e.g. using a WITH clause/CTE), but all this gives me the impression that it is bad practice. I would be grateful for some thoughts on this.
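
    A minimal sketch of the second option, assuming Postgres and illustrative names (target, processed); mapping the lifted query down to the single column that is needed means the generated x1/x2 aliases stop mattering:

    import slick.jdbc.PostgresProfile.api._

    // Splice the SQL Slick generates for a lifted query of ids into a plain-SQL UPDATE.
    // Caveat: any bind parameters inside the lifted query show up here as "?",
    // so this only works cleanly for queries without them.
    def updateFromLifted(ids: Query[Rep[Long], Long, Seq]): DBIO[Int] = {
      val idsSql: String = ids.result.statements.head
      sqlu"""UPDATE target SET processed = true WHERE id IN (#$idsSql)"""
    }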
    Jeremiah Malina
    @jjmalina
    So I'm now in spark-shell trying to debug java.lang.ClassNotFoundException: slick.jdbc.DatabaseUrlDataSource
    when I run Database.forConfig("nc.mysql") (see earlier message)
    What's odd is that I can instantiate DatabaseUrlDataSource directly in the shell
    scala> import slick.jdbc.DatabaseUrlDataSource
    import slick.jdbc.DatabaseUrlDataSource
    scala> new DatabaseUrlDataSource()
    res7: slick.jdbc.DatabaseUrlDataSource = slick.jdbc.DatabaseUrlDataSource@15ff0f79
    Jeremiah Malina
    @jjmalina
    So what I think is happening is that I have slick-hikaricp 3.3.1 which depends on HikariCP 3.2.0 but in Spark on EMR HikariCP 2.4.12 is installed
    Jeremiah Malina
    @jjmalina
    Disabling connection pooling solved the ClassNotFoundException during database initialization, but now I'm facing a different error
    java.lang.NullPointerException
        at slick.jdbc.DriverDataSource.getConnection(DriverDataSource.scala:101)
        at slick.jdbc.DataSourceJdbcDataSource.createConnection(JdbcDataSource.scala:68)
        at slick.jdbc.JdbcBackend$BaseSession.<init>(JdbcBackend.scala:494)
        at slick.jdbc.JdbcBackend$DatabaseDef.createSession(JdbcBackend.scala:46)
        at slick.jdbc.JdbcBackend$DatabaseDef.createSession(JdbcBackend.scala:37)
        at slick.basic.BasicBackend$DatabaseDef.acquireSession(BasicBackend.scala:250)
        at slick.basic.BasicBackend$DatabaseDef.acquireSession$(BasicBackend.scala:249)
        at slick.jdbc.JdbcBackend$DatabaseDef.acquireSession(JdbcBackend.scala:37)
        at slick.basic.BasicBackend$DatabaseDef$$anon$3.run(BasicBackend.scala:275)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
    Jeremiah Malina
    @jjmalina
    Decided to try using 3.2.3 instead of 3.3.3, because that uses an older version of HikariCP that hopefully won't conflict, but now the codegen in my project is not working. It runs without error but doesn't generate classes for the tables. Switching back to 3.3.3 solves that
    Jeremiah Malina
    @jjmalina
    Managed to fix the ClassNotFoundException by using shading in sbt. Now my issue is that when I initialize my database with Database.forURL and try to write to the db, the MySQL driver connects using a different database URL and I get an error saying the table doesn't exist. Has anyone experienced something like this?
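
    A minimal sketch of what that shading might look like with sbt-assembly, assuming the conflict is with the HikariCP version already on the Spark/EMR classpath (the rename target is illustrative):

    // build.sbt (sbt-assembly): rename HikariCP classes in the fat jar so they cannot
    // collide with the HikariCP version that ships with the cluster.
    assembly / assemblyShadeRules := Seq(
      ShadeRule.rename("com.zaxxer.hikari.**" -> "shaded.com.zaxxer.hikari.@1").inAll
    )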
    ramakrishna-hande
    @ramakrishna-hande

    Hi, I have a main program that queries data from the DB using Slick. It constructs JSON from the records it gets from the DB and then publishes them to some Kafka topics. Almost at the end of the program, I am trying to call a stored procedure that updates the records.

    My code looks like this :

    implicit val system: ActorSystem = ActorSystem()
    implicit val mat: ActorMaterializer = ActorMaterializer()
    implicit val ec = system.dispatcher
    
    implicit val session: SlickSession = SlickSession.forConfig("my-mysql")

    then

     for {
          record1 <- selectQuery1
          record2 <- selectQuery2 
         // create JSON and publish to kafka, this is done using akka streams to get list of Future[Done]
         topicsExecuted: List[Done] <- topicsExecutedF
         procedureExecuted: Int <- session.db.run(myProcedure)
       } 
        {
           session.close()
           system.terminate()
        }

    procedureExecuted prints some value and confirms that the procedure is executed; however, the main program does not end.

    If I remove the procedure call, the program ends fine. I also converted the procedure call to a JDBC statement, which is synchronous, and that also works fine. But when I try to call the procedure using async code (something that returns a Future), the program somehow does not end.

    These are my library specifications:

      Manifest-Version: 1.0
      Implementation-Title: akka-stream-alpakka-slick
      Automatic-Module-Name: akka.stream.alpakka.slick
      Implementation-Version: 1.0-M1
      Specification-Vendor: Lightbend Inc.
      Specification-Title: akka-stream-alpakka-slick
      Implementation-Vendor-Id: com.lightbend.akka
      Specification-Version: 1.0-M1
      Implementation-URL: https://github.com/akka/alpakka
      Implementation-Vendor: Lightbend Inc.

    Thanks in advance for any help

    Andrea Turli
    @andreaturli
    Hi folks, I wonder if anyone has a suggestion on how to fix this problem:
    [E]      Adaptation of argument list by inserting () is deprecated: this is unlikely to be what you want.
    [E]              signature: SQLInterpolation.sql[P](param: P)(implicit pconv: scala.slick.jdbc.SetParameter[P]): scala.slick.jdbc.SQLInterpolationResult[P]
    [E]        given arguments: <none>
    [E]       after adaptation: SQLInterpolation.sql((): Unit)
    [E]      L1006:         sql"""select now()""".as[DateTime].first
    thanks in advance!
    nafg
    @nafg
    Anyone interested in watching me or joining me to work on Slick soon?
    KiranKumar BS
    @kirankbs

    Hello Folks,
    I am facing a problem while adding a timestamp. I am using Slick's SQL interpolation.

    The original SQL statement, which works:

    (date_trunc('seconds', ('2021-06-24T12:30Z' - timestamp '2021-06-24T12:30Z') / 300) * 300 + timestamp '2021-06-24T12:30Z')

    Slick is failing at:

    ((date_trunc('seconds', (bucket - ${start.floor(FiveMinutes)}) / ${interval.duration.toSeconds}) * ${interval.duration.toSeconds} + ${start.floor(FiveMinutes)}))

    I am getting invalid input syntax for type interval: "2021-06-24 12:30:00+00".

    Any help?

    Antoine Doeraene
    @sherpal
    @kirankbs I think you should put # in front of your $ when doing this kind of string interpolation with Slick.
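
    A minimal illustration of the difference (table and column names are assumptions):

    import slick.jdbc.PostgresProfile.api._

    val limit = 10
    // $limit becomes a JDBC bind parameter; #$limit is spliced verbatim into the SQL text,
    // so the database never has to guess a parameter type (e.g. interval) for it.
    val bound  = sql"SELECT name FROM users LIMIT $limit".as[String]
    val inline = sql"SELECT name FROM users LIMIT #$limit".as[String]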
    Austin Steady
    @purpleladydragons

    Hi everyone, could someone please tell me how I can produce where 'str literal' = 'str literal' with Slick's filter method? I've tried using

    .filter(_ => SimpleLiteral[String]("str literal") === SimpleLiteral[String]("str literal"))

    but it errors with Unknown column 'str literal' in 'where clause'

    In case I'm trying to solve my underlying problem the wrong way, the context is that I'd like to be able to "label" my queries so that it's easier to trace them from AWS Performance Insights back to their original source code, but I don't think Slick allows me to add comments directly, and I can't rewrite all my code in plain SQL

    Austin Steady
    @purpleladydragons

    I think I solved my immediate issue: instead of "str literal" I needed to use "'str literal'", with the single quotes inside

    That said, if anyone has an opinion on how I could accomplish my real goal in a better, less hacky way, I would appreciate it. Thank you!

    Richard Dallaway
    @d6y
    Interesting problem. What's the AWS comment format you need, @purpleladydragons?
    Richard Dallaway
    @d6y
    There is a method on an action called named, but I've never actually got it to print out anything in any log. If you can get that working, I would be interested to see how. It's probably something dumb I'm doing.
    The other thought for commenting is to make use of result.statements and overrideStatements to append a comment to the end of the SQL.
    Richard Dallaway
    @d6y
    ^^ here's a proof of concept of the kind of thing I mean: https://github.com/d6y/slick-comment-query/blob/main/src/main/scala/main.scala#L40-L46 <- hacking the SQL string is always dangerous, in my opinion
    It's really an action label rather than a query label, but hey-ho.