def addEvents(entryEvents: Seq[EntryEvent]) = {
  val now = LocalDateTime.now()
  val newEvents = entryEvents.map(e => Event(0, now, e.employeeId, e.teamId, e.projectId, e.companyId, e.notes))
  val task = tables.db.run(tables.events ++= newEvents)
  (newEvents, task)
}
Question:
Insert into a Table from Raw SQL Select
val rawSql: DBIO[Vector[(String, String)]] = sql"SELECT id, name FROM SomeTable".as[(String, String)]
val myTable: TableQuery[MyClass] // with columns id, name and some other columns
Is there a way to use the forceInsert functions to insert data from a select into the table? If not, is there a way to generate a SQL string by using forceInsertStatements?
Something like:
db.run {
  myTable.map(t => (t.id, t.name)).forceInsert????(rawSql)
}
Thanks for the help.
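One possibility worth trying (a sketch, not tested against this schema): forceInsertQuery takes a lifted Query rather than a DBIO, so the select has to be expressed as a query. Here someTable is an assumed TableQuery mapped over SomeTable:

```scala
// forceInsertQuery accepts a Query, not a DBIO, so express the SELECT
// as a lifted query instead of raw SQL (someTable is an assumption here)
val insertFromSelect = myTable
  .map(t => (t.id, t.name))
  .forceInsertQuery(someTable.map(s => (s.id, s.name)))

db.run(insertFromSelect)
```

If the source really must stay raw SQL, the other option is to do the whole thing as plain SQL, e.g. an `INSERT INTO ... SELECT ...` via sqlu.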
forceInsertQuery seems to do this for inserts, and there are DBIO actions which apparently let me update values based on a computation. But is there a forceUpdateQuery, or do I indeed use DBIO actions for this purpose?
I have a case class User and a Users table, and I want to define a Shape[FlatShapeLevel, Users, User, Users] outside of class Users. So, how do I write this line:
implicit def userShape: Shape[FlatShapeLevel, Users, User, Users] = ...
? The problem is that I'm outside of the Users table, and so cannot access the columns within it.
override def getPaymentErrorsByBookingIds(bookingIds: Seq[UUID]): DBIO[Seq[(UUID, Seq[PaymentError])]] =
paymentErrors
.filter(b => b.bookingId inSet bookingIds)
.groupBy(_.bookingId)
.map {
case (bookingId, errors) => bookingId -> errors
}
.result
trait PaymentErrorTable {
class PaymentErrors(tag: Tag) extends Table[PaymentError](tag, "payment_error") {
def newIdWhenInserting: UUID = UUID.randomUUID()
def currentWhenInserting = new Timestamp((new Date).getTime)
def id = column[UUID]("id", O.Default(newIdWhenInserting), O.PrimaryKey)
def bookingId = column[UUID]("booking_id")
def paymentMethodId = column[String]("payment_method_id")
def error = column[String]("error")
def amount = column[BigDecimal]("amount")
def currency = column[Currency]("currency")
def createdAt = column[Timestamp]("created_at", O.Default(currentWhenInserting))
def * =
(
id,
bookingId,
paymentMethodId,
error,
amount,
currency,
createdAt
) <> ((PaymentError.apply _).tupled, PaymentError.unapply)
}
protected val paymentErrors = TableQuery[PaymentErrors]
}
No matching Shape found.
Slick does not know how to map the given types.
Possible causes: T in Table[T] does not match your * projection,
you use an unsupported type in a Query (e.g. scala List),
or you forgot to import a driver api into scope.
Required level: slick.lifted.FlatShapeLevel
Source type: (slick.lifted.Rep[java.util.UUID], slick.lifted.Query[JdbcPaymentErrorStorage.this.PaymentErrors,booking.PaymentError,[+A]Seq[A]])
Unpacked type: T
Packed type: G
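The usual workaround (a sketch, assuming an implicit ExecutionContext is in scope): SQL's GROUP BY can only produce aggregates per group, never a nested collection of rows, which is why Slick rejects the (bookingId, errors) projection. Fetch the flat rows and group them in Scala instead:

```scala
// fetch the matching rows flat, then group client-side; SQL GROUP BY
// cannot return a Seq of rows per group
override def getPaymentErrorsByBookingIds(bookingIds: Seq[UUID]): DBIO[Seq[(UUID, Seq[PaymentError])]] =
  paymentErrors
    .filter(_.bookingId inSet bookingIds)
    .result
    .map(_.groupBy(_.bookingId).toSeq)
```

This trades one round trip of flat rows for in-memory grouping, which is usually fine when the bookingIds set is bounded.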
Hi everyone, I'm using plain SQL Slick queries and I'm running into the following issue:
Caused by: org.postgresql.util.PSQLException: ERROR: column "expiry_date" is of type timestamp without time zone but expression is of type character varying
The column in question is of type timestamp with time zone.
Here's how I build my query:
val query = sqlu"""INSERT INTO "offers"
VALUES (
${c.userId.toInt},
${c.categoryId.toInt},
${new Timestamp(Instant.now.toEpochMilli)},
${new Timestamp(c.expiryDate)},
${c.description},
${c.address},
${c.latitude},
${c.longitude},
${new Timestamp(c.startAt)},
${new Timestamp(c.endAt)},
${c.compensation},
${c.imageUrl},
${c.requirements},
${c.title}
)"""
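One thing worth checking (a sketch; the column names below are guesses, not the real schema): without an explicit column list, the VALUES are matched to columns positionally, so a string parameter can end up aimed at expiry_date. Naming the columns, and optionally casting the parameter, avoids that:

```scala
// sketch: name the target columns so parameters cannot shift position;
// "?::timestamp" is PostgreSQL cast syntax applied to the bind parameter
val query = sqlu"""INSERT INTO "offers" (user_id, category_id, expiry_date, description)
  VALUES (${c.userId.toInt}, ${c.categoryId.toInt},
          ${new Timestamp(c.expiryDate)}::timestamp, ${c.description})"""
```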
Case.If(aquery.length>0) Then bquery else cquery
It seems that the following does not work, what is the suggested way to perform different queries based on the length of the previous one?
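Case.If builds a SQL CASE expression over column values; it cannot choose between whole queries. To branch on the result of a previous query, compose DBIO actions instead (a sketch reusing the question's names, assuming an implicit ExecutionContext and that bquery and cquery have the same result type):

```scala
// run aquery's count first, then decide which query to run based on it
val action = aquery.length.result.flatMap { n =>
  if (n > 0) bquery.result else cquery.result
}

db.run(action.transactionally)
```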
hi all! :) I'm having a problem with values interpolation in raw SQL. This works on PostgreSQL CLI:
select file_name from (values ('file1'), ('file2'), ('file3')) AS t (file_name) except select file_name from processed_s3_files;
I was expecting this to work:
val fileNamesAsValues = fileNames.map(fn => s"('$fn')").mkString(", ")
sql"""select "file_name" from (values $fileNamesAsValues) AS t ("file_name") except select "file_name" from "processed_s3_files"""".as[String]
but I get org.postgresql.util.PSQLException: ERROR: syntax error at or near "$1"
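The sql interpolator turns every $-interpolation into a single bind parameter, which is why the server sees "$1" where a VALUES list was expected. Slick's #$ splices the string literally into the statement instead (note this bypasses parameter binding, so the file names must be trusted or escaped):

```scala
// #$ splices raw SQL text rather than binding a parameter -- beware injection
val fileNamesAsValues = fileNames.map(fn => s"('$fn')").mkString(", ")
sql"""select file_name from (values #$fileNamesAsValues) AS t (file_name)
      except select file_name from processed_s3_files""".as[String]
```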
Hi, I am having problems using Instant with the newest H2 version, 1.4.200.
I am getting the following error:
org.h2.jdbc.JdbcSQLDataException: Cannot parse "TIMESTAMP" constant "aced0005740014323039392d30312d30315430383a30303a30305a"; SQL statement:
With previous versions I used a custom column mapper to work around errors with H2.
protected implicit val H2InstantMapper: BaseColumnType[Instant] = MappedColumnType.base[Instant, Timestamp](
w => Timestamp.from(w),
r => r.toInstant
)
But this does not work anymore with the newest version.
I also needed to use the PostgresProfile instead of the H2Profile because the BLOB data type was not working anymore.
Can anyone help me figure out how to use Instant both with H2 (which I use in tests) and Postgres (which I use in prod)? I've been trying for hours now and I'm stuck.
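One workaround that stays on public API and sidesteps the profile's Instant handling entirely (a sketch; Payment/Payments are hypothetical names, not from the question): declare the column as java.sql.Timestamp, which both H2 and PostgreSQL handle natively, and convert to Instant at the projection boundary:

```scala
// sketch: keep Timestamp in the schema, expose Instant in the case class
case class Payment(id: Long, createdAt: Instant)

class Payments(tag: Tag) extends Table[Payment](tag, "payments") {
  def id        = column[Long]("id", O.PrimaryKey)
  def createdAt = column[Timestamp]("created_at") // not column[Instant]

  def * = (id, createdAt) <> (
    { t: (Long, Timestamp) => Payment(t._1, t._2.toInstant) },
    { p: Payment => Some((p.id, Timestamp.from(p.createdAt))) }
  )
}
```

Because the column type is Timestamp, neither profile's built-in Instant JdbcType (the source of the serialized "aced0005..." constant on H2) is ever consulted.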
Hi folks,
This has been covered a good bit in StackOverflow questions & in other places, but I have yet to find a solution that I'm happy with... I want to create an UPDATE
statement with a variable number of columns. Something like
coffees.filter(_.id === id)
.map(t => change.cols(t)) // cols(t) returns Seq[Rep[_]]
.update(change.values) // values returns a Seq[_]
I feel like there could be a way to do this with a custom Shape (like TupleShape but for Seq), but I was hoping some more experienced folks could tell me whether this is a futile endeavor or a viable approach.
There's also this gist but I'd rather not have to create a custom driver or rely on regexes to manipulate the query.
I'd avoid Seq, as I'd imagine a lot of type information (for each column) will get lost to Any. I'd be tempted to use something like slick.collection.heterogeneous.HList, and there would be some fun/challenging type-level methods to get the types to exactly line up for the cols and values that change is based on.
Thanks, Richard. In this case, I'm okay with missing out on some type safety. For the moment, I've ended up with something along these lines:
SQLActionBuilder(
  Seq(s"UPDATE $tableName SET ",
    cols.map(_.getDumpInfo.mainInfo + " = ?").mkString(", "),
    " WHERE ID = ?",
    " LIMIT 1"
  ),
  (_: Unit, pp: PositionedParameters) => {
    (values :+ id).foreach {
      case v: Boolean => pp.setBoolean(v)
      case v: Byte => pp.setByte(v)
      case v: Date => pp.setDate(v)
      case v: Double => pp.setDouble(v)
      case v: Float => pp.setFloat(v)
      case v: Int => pp.setInt(v)
      case v: Long => pp.setLong(v)
      case v: Short => pp.setShort(v)
      case v: String => pp.setString(v)
      case v: Time => pp.setTime(v)
      case v: Timestamp => pp.setTimestamp(v)
      case v => throw new SQLException("update doesn't know how to handle parameter (" + v + ")")
    }
  }
).asUpdate
It's not quite complete because it doesn't handle NULLs, but it works for my current needs.
Given the following for PostgreSQL
CREATE TABLE IF NOT EXISTS X(id INT PRIMARY KEY, name VARCHAR not null);
select * from X x where (x.id, x.name) in ((1, 'foo'), (2, 'bar'));
how do I do this with Slick 3.3.3? The compiler never seems to agree with me
inSet doesn't seem to support tuples. We might be able to teach Slick how to do that, but the workaround would be to transform it into (id === 1 && name === "foo") || (id === 2 && name === "bar"). Sounds horrible, but the code isn't too much. Old example: https://github.com/d6y/list-of-tuples/blob/eaa2903332d7797ac38d8807fa598fd2a5aad344/src/main/scala/or-example.scala#L52-L55
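Spelled out, that workaround folds the pairs into a single OR-ed predicate (a sketch, assuming xs is the TableQuery for X):

```scala
// builds (id === 1 && name === "foo") || (id === 2 && name === "bar")
// from a plain Scala collection of tuples
val pairs = Seq((1, "foo"), (2, "bar"))

val query = xs.filter { x =>
  pairs
    .map { case (i, n) => x.id === i && x.name === n }
    .reduceLeft(_ || _)
}
```

reduceLeft requires pairs to be non-empty; guard for the empty case if the list comes from user input.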