Martijn Hoekstra
I particularly wanted to see where things would lead, and what kinds of operations are easy/hard in what style, but a stack based language might indeed be better for discovering that.
there are such obvious optimizations, that I was really wondering how that would work, and what that would look like
Fabio Labella
I think writing "compilers" is skewed towards an initial representation
it can be more intuitive to write transformations and stuff
final can give you advantages like fusion though, but it's harder to figure out imho
the opposite is true in user code, like for your standard "interface"
where final wins due to less boilerplate and ease of composition
Martijn Hoekstra
I was led to believe that Free was ideal for such things
Fabio Labella
yeah, that's what I mean by initial representation
an AST, basically
whereas a final representation is functions (the eliminators)
and in fact, you are using Free now
your Free is List
and it's Free wrt Monoid, which is all you need here
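A quick sketch of what "your Free is List, Free wrt Monoid" means, assuming the standard `foldMap` shape (the names here are illustrative, not from the conversation): `List[A]` is the free monoid on `A`, and interpreting it into any concrete monoid is just a fold with that monoid's operations.

```scala
// Sketch: List[A] as the free monoid on A.
// "Interpretation" = eliminating the free structure into a target monoid.
object FreeMonoidDemo {
  trait Monoid[M] {
    def empty: M
    def combine(x: M, y: M): M
  }

  // The interpreter: collapse the list using the target monoid.
  def foldMap[A, M](xs: List[A])(f: A => M)(implicit M: Monoid[M]): M =
    xs.foldLeft(M.empty)((acc, a) => M.combine(acc, f(a)))

  // One possible target monoid: Int under addition.
  implicit val intSum: Monoid[Int] = new Monoid[Int] {
    def empty = 0
    def combine(x: Int, y: Int) = x + y
  }

  def main(args: Array[String]): Unit =
    println(foldMap(List(1, 2, 3))(identity)) // 6
}
```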
Martijn Hoekstra
I don't know what initial, final, or eliminator really means
Fabio Labella
ok, I can explain the terminology to bring some clarity :)
so, the domain all this originated in is EDSLs (embedded domain specific languages)
the most intuitive way of writing one is by writing a data structure that represents your EDSL, then interpreting it into something else
e.g. data Op = Add Int Int | Const Int
plus a function from Op to whatever you want to interpret this to (e.g. an Int for evaluation, or a String for pretty printing)
this is an example of an initial encoding, also referred to as a deep embedding
what you had there is therefore initial
an AST basically
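Spelling out that initial/deep encoding as a self-contained sketch (using the `Op` type from above, with two hypothetical interpreters): one data structure, interpreted separately to an `Int` for evaluation and to a `String` for pretty printing.

```scala
// Sketch of the initial (deep) encoding: a data structure plus
// separate interpreters written with pattern matching.
object InitialDemo {
  sealed trait Op
  case class Add(l: Int, r: Int) extends Op
  case class Const(i: Int) extends Op

  // interpret to Int (evaluation)
  def eval(op: Op): Int = op match {
    case Add(l, r) => l + r
    case Const(i)  => i
  }

  // interpret to String (pretty printing)
  def pretty(op: Op): String = op match {
    case Add(l, r) => s"$l + $r"
    case Const(i)  => i.toString
  }

  def main(args: Array[String]): Unit = {
    println(eval(Add(1, 2)))   // 3
    println(pretty(Add(1, 2))) // 1 + 2
  }
}
```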
Martijn Hoekstra
so this translates to abstract class Op; case class Add(l: Int, r: Int) extends Op; case class Const(value: Int) extends Op right?
Fabio Labella
The Free monad is another example of this: your EDSL is "monadicity", and you represent that by a concrete data structure, which you then interpret to something else
there's a million other things to talk about on this, but I'll skip
a final encoding, or shallow embedding, is when you represent your mini language using functions, i.e. directly in terms of what are going to be the interpreters (the eliminators of your language/edsl/algebra)
for example
actually wait, I got the other embedding wrong (too easy)
let's say the initial one is:
data Exp = Add Exp Exp | Lit Int
sealed trait Exp
case class Add(l: Exp, r: Exp) extends Exp
case class Lit(i: Int) extends Exp
Fabio Labella
so that we can nest expressions like Add(Lit(1), Add(Lit(3), Lit(3)))
interpretation for this style is just a function that uses pattern matching and recursion
Martijn Hoekstra
yeah, SeemsGood
Fabio Labella
the same thing in final style would be
class Exp t where
    lit :: Int -> t
    add :: t -> t -> t
trait Exp[T] {
  def lit(i: Int): T
  def add(l: T, r: T): T
}
so we're using typeclasses and, more generally, functions, to represent our expressions
what do these functions represent? they represent the interpreters for each instruction
for example, let's see how to evaluate both styles to an Int
Martijn Hoekstra
top one would be def rec(exp: Exp): Int = exp match { case Add(l, r) => rec(l) + rec(r); case Lit(i) => i }
bottom one new Exp[Int] { def lit(i: Int): Int = i; def add(l: Int, r: Int): Int = l + r }
Fabio Labella
ah, you type faster than me ;)
anyway yes, that's it pretty much
so when you write an expression def expr[A: Exp]: A = add(lit(1), add(lit(3), lit(5)))
you are already calling the evaluator, so to speak
btw, that trait (at least in Haskell) is normally encoded as a type class
and instead of Exp[Int], you create a newtype over Int, and use that as your target
so in tagless style interpretation is the identity function in Haskell (since the typeclass dictionary can't be touched)
it's kinda in between in scala, depends on whether you pass the instance implicitly or not
but that's the gist of what the two approaches are
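Putting the final/tagless pieces of the conversation together in one runnable sketch (passing the instance explicitly rather than via a context bound, which is one of the two Scala options mentioned): the `Exp` trait, two interpretations as instances, and an expression polymorphic in its target, so writing the expression really is "already calling the evaluator".

```scala
// Sketch of the final (shallow) encoding: instructions are functions,
// and each instance of the trait is an interpreter.
object FinalDemo {
  trait Exp[T] {
    def lit(i: Int): T
    def add(l: T, r: T): T
  }

  // interpret to Int (evaluation)
  implicit val evalInt: Exp[Int] = new Exp[Int] {
    def lit(i: Int): Int = i
    def add(l: Int, r: Int): Int = l + r
  }

  // interpret to String (pretty printing)
  implicit val prettyStr: Exp[String] = new Exp[String] {
    def lit(i: Int): String = i.toString
    def add(l: String, r: String): String = s"($l + $r)"
  }

  // The expression is written directly against the interpreters:
  // choosing T chooses the interpretation.
  def expr[T](implicit E: Exp[T]): T =
    E.add(E.lit(1), E.add(E.lit(3), E.lit(5)))

  def main(args: Array[String]): Unit = {
    println(expr[Int])    // 9
    println(expr[String]) // (1 + (3 + 5))
  }
}
```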