Reputation: 653
I have a method with this signature:
def fn1[A1, P, Q, E, U, C[_]](
fn: A1 => Query[E, U, C],
sorts: (E => Rep[_], String)*
)(implicit
aShape: Shape[ColumnsShapeLevel, A1, P, A1],
pShape: Shape[ColumnsShapeLevel, P, P, _]
) = ???
and in my class I have a Slick query defined as:
protected def base(id: Rep[Long]): Query[(entity1Table, entity2Table), (Entity1, Entity2), Seq] = ???
Now, I want to do something like this:
fn1(base, (_._1.name, "name"))
or, at least
fn1(base, (x => x._1.name, "name"))
but even with the second form, Scala cannot infer the type of x, which is (entity1Table, entity2Table), so to make it compile I have to annotate the type of x explicitly:
fn1(base, ((x: (entity1Table, entity2Table)) => x._1.name, "name"))
Why can't Scala infer the type, and what can I do to make it infer the type automatically?
Upvotes: 1
Views: 519
Reputation: 13985
This is because the type parameters of your sorts are derived from the first parameter, fn. But if you pass both of them in the same parameter list, the compiler does not yet know anything about fn while it type-checks sorts, and hence fails to derive the types.
You need to curry your fn1 so that the compiler can first work on fn and then use the derived types to make sense of sorts.
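The inference behaviour is easy to reproduce without Slick. A minimal sketch (applyOnce and applyCurried are just illustrative names, not part of any library); in Scala 2 the uncurried call fails exactly like your fn1:

```scala
// Single parameter list: all arguments are type-checked together, so the
// lambda's parameter type is still unknown when the lambda is examined.
def applyOnce[A, B](a: A, f: A => B): B = f(a)

// Curried: A is fixed by the first argument list before the second list
// is type-checked, so the lambda's parameter type is already known.
def applyCurried[A, B](a: A)(f: A => B): B = f(a)

// applyOnce((1, "x"), _._2.length)         // error: missing parameter type
val n = applyCurried((1, "x"))(_._2.length) // ok: A = (Int, String)
```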
I am a bit confused by your use of the terms entity1Table etc., so I drew up a mock schema to go with the example.
import slick.driver.H2Driver
import H2Driver.api._
case class Entity1(i: Int, s: String)
case class Entity2(i: Int, s: String)
class Entity1T(tag: Tag) extends Table[Entity1](tag, "Entity1s") {
def id = column[Int]("id", O.PrimaryKey) // This is the primary key column
def name = column[String]("name")
def * = (id, name) <> (Entity1.tupled, Entity1.unapply)
}
val entity1Table = TableQuery[Entity1T]
class Entity2T(tag: Tag) extends Table[Entity2](tag, "Entity2s") {
def id = column[Int]("id", O.PrimaryKey) // This is the primary key column
def name = column[String]("name")
def * = (id, name) <> (Entity2.tupled, Entity2.unapply)
}
val entity2Table = TableQuery[Entity2T]
Now, I am not sure which one you want. This one:
def fn1[A1, P, Q, E, U, C[_]](
fn: A1 => Query[E, U, C]
)(
sort: (U => Rep[_], String)*
)(implicit
aShape: Shape[ColumnsShapeLevel, A1, P, A1],
pShape: Shape[ColumnsShapeLevel, P, P, _]
) = ???
protected def base1(id: Rep[Long]): Query[(TableQuery[Entity1T], TableQuery[Entity2T]), (Entity1T, Entity2T), Seq] = ???
val x1 = fn1(base1)((etq => etq._1.name, "name"))
Or this one:
def fn2[A1, P, Q, E, U, C[_]](
fn: A1 => Query[E, U, C]
)(
sort: (E => Rep[_], String)*
)(implicit
aShape: Shape[ColumnsShapeLevel, A1, P, A1],
pShape: Shape[ColumnsShapeLevel, P, P, _]
) = ???
protected def base2(id: Rep[Long]): Query[(Entity1T, Entity2T), (Entity1, Entity2), Seq] = ???
val x2 = fn2(base2)((etq => etq._1.name, "name"))
From what I can see, both versions are able to derive the types.
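The same currying trick works for any varargs parameter whose element type depends on an earlier argument. A Slick-free sketch (sortedBy is a hypothetical helper, not part of Slick):

```scala
// The element type E of the varargs list is fixed by the first parameter
// list, so each sort-key lambda's parameter type is inferred from rows.
def sortedBy[E](rows: Seq[E])(sorts: (E => String, String)*): Seq[E] =
  sorts.foldRight(rows) { case ((key, _), acc) => acc.sortBy(key) }

val people = Seq(("bob", 2), ("alice", 1))
val byName = sortedBy(people)((_._1, "name")) // tuple type of _ is inferred
```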
Upvotes: 1
Reputation: 40510
I think this is because functions are contravariant in their argument type: _._1.name could be (entity1Table, entity2Table) => String or it could be Any => String, and either would satisfy the type constraint, because the latter is a subtype of the former.
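A minimal, Slick-free illustration of that contravariance:

```scala
// Function arguments are contravariant: a function accepting Any can stand
// in wherever a function accepting a more specific tuple type is expected.
val general: Any => String = _.toString
val specific: ((Int, String)) => String = general // compiles: Any => String
                                                  // is a subtype here
```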
Upvotes: 0