Goals for today:
What is an implemented composition system?
A composition system consists of a set of composition operations defined over Composables; given Composables as input, it does a (brute-force) search over possible valid combinations. Many composition operations that have been proposed in the literature can be represented as combinators.
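The brute-force idea can be sketched in plain Python. Everything below is illustrative (hypothetical names, simplified types), not lamb's actual implementation: a denotation is just a dict pairing a type with a metalanguage string, and the system tries every operation on both orderings of the daughters.

```python
# Illustrative sketch of a brute-force composition search; all names and
# representations here are hypothetical, not lamb's actual implementation.
# A denotation is a dict with a type ('e', 't', or a (domain, range)
# pair for functional types) and a metalanguage string.

def fa(f, a):
    """Function application: f's domain must match a's type."""
    try:
        dom, ran = f['type']      # fails if f's type is not functional
    except ValueError:
        return None
    if a['type'] != dom:
        return None
    return {'type': ran, 'den': f"{f['den']}({a['den']})"}

def pm(f, g):
    """Predicate modification: both daughters must be type <e,t>."""
    et = ('e', 't')
    if f['type'] == et and g['type'] == et:
        return {'type': et, 'den': f"λx. {f['den']}(x) & {g['den']}(x)"}
    return None

OPS = [fa, pm]

def compose(a, b):
    """Try every operation on the daughters, in both orders, and
    collect everything that type-checks."""
    results = []
    for op in OPS:
        for x, y in ((a, b), (b, a)):
            r = op(x, y)
            if r is not None:
                results.append(r)
    return results

kaline = {'type': 'e', 'den': 'Kaline'}
cat = {'type': ('e', 't'), 'den': 'Cat'}
gray = {'type': ('e', 't'), 'den': 'Gray'}
print(compose(kaline, cat))   # one result, via FA: Cat(Kaline)
print(compose(gray, cat))     # PM type-checks in both orders
```

The search makes no attempt to be clever: every operation is tried on every ordering, and anything that fails to type-check is silently discarded.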
The initial core Heim & Kratzer system is essentially the system determined by:
(However -- types for the first two!)
Switch to notebook for this:
%%lamb
||cat|| = L x_e: Cat(x)
||gray|| = L x_e: Gray(x)
||kaline|| = Kaline_e
||julius|| = Julius_e
||inP|| = L x_e : L y_e : In(y, x) # `in` is a reserved word in python
||texas|| = Texas_e
||isV|| = L p_<e,t> : p # `is` is a reserved word in python
||fond|| = L x_e : L y_e : Fond(y, x)
of = lang.Item("of", content=None)
a = lang.Item("a", content=None)
binder = lang.Binder(5)
t5 = lang.Trace(5)
display(of, a, binder, t5)
kaline * (isV * (a * (gray * cat)))
kaline * (isV * ((a * (gray * cat)) * (inP * texas)))
kaline * (isV * (a * ((gray * cat) * (inP * texas)
* (binder * (t5 * (fond * (of * julius)))))))
Depending on time, let's look at the Neo-Davidsonian fragment
All metalanguage objects are subclasses of the class TypedExpr. Subclasses add on:
There are essentially three main kinds of metalanguage objects: terms, operators, and binding operators. Functions are a special case of binding operators.
Terms come in two kinds: variables and constants.
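As a rough picture of this class structure, here is a minimal sketch of the op/type/parts interface (illustrative only; lamb's real class definitions are much richer):

```python
# Minimal sketch of the TypedExpr interface described above; this is
# illustrative only, not lamb's actual code.

class TypedExpr:
    def __init__(self, op, *parts, typ=None):
        self.op = op            # operator symbol (or a term's name)
        self.parts = list(parts)
        self.type = typ
    def __iter__(self):         # iterating yields the subexpressions
        return iter(self.parts)
    def __len__(self):
        return len(self.parts)

class TypedTerm(TypedExpr):
    """A term (variable or constant): name stored in op, no parts."""
    def __init__(self, name, typ):
        super().__init__(name, typ=typ)

class BinaryAndExpr(TypedExpr):
    """Conjunction: op '&', two parts, type t."""
    def __init__(self, left, right):
        super().__init__('&', left, right, typ='t')

p = TypedTerm('p', 't')
q = TypedTerm('q', 't')
conj = BinaryAndExpr(p, q)
print(conj.op, conj.type, len(conj))   # & t 2
print(p.op, p.type, len(p))            # p t 0
```

This mirrors the introspection we do in the next cells: complex expressions have an operator, a type, and parts; terms put their name in the op field and have no parts.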
We looked yesterday at an example like this:
formula = %te p_t & (q_t | ~r_t)
formula
display(formula.op) # typed expressions (may) have an operator
display(formula.type) # typed expressions have a type
display(list(formula)) # typed expressions have parts
display(formula.__class__) # the python type of `formula`
'&'
[p_t, (q_t | ~r_t)]
lamb.meta.BinaryAndExpr
Let's look at a part:
formula[0]
display(formula[0].op) # Terms put the variable name in the `op` field
display(formula[0].type)
display(len(formula[0])) # no parts
display(formula[0].__class__)
'p'
0
lamb.meta.TypedTerm
Concept from programming language design: an AST is a representation of the abstract syntactic structure of a formal language.
def collect_ast(x):
    # build a nested-list representation of x's AST, for svgling.draw_tree:
    # the node's op, followed by the recursively collected subparts
    result = [x.op] + [collect_ast(sub) for sub in list(x)]
    if len(result) > 1:
        # complex expression: unlabeled root, with the op as a daughter
        result = [""] + result
    return result
display(formula)
svgling.draw_tree(collect_ast(formula))
f2 = %te L x_e : L y_e : P(x) & Q(x)
display(f2)
svgling.draw_tree(collect_ast(f2))
INFO (meta): Coerced guessed type for 'P_t' into <e,t>, to match argument 'x_e'
INFO (meta): Coerced guessed type for 'Q_t' into <e,t>, to match argument 'x_e'
f3 = %te L x_e : Forall y_e : y <=> x
display(f3)
svgling.draw_tree(collect_ast(f3))
Some basic logical inference, nothing sophisticated (contributions welcome!):
More simplification/reduction examples:
f4 = %te True & False
f4.simplify()
f5 = %te x << (Set y: Cat(y))
f5
INFO (meta): Coerced guessed type for 'Cat_t' into <e,t>, to match argument 'y_e'
f5.reduce()
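A simplify() step like the one on True & False amounts to constant folding. A toy version over tuple-encoded formulas (purely illustrative; lamb's simplification machinery is more general):

```python
# Toy constant folding for conjunction, in the spirit of simplify()
# above (illustrative sketch, not lamb's implementation). Formulas are
# nested tuples ('&', left, right); leaves are True/False or names.

def simplify(t):
    if isinstance(t, tuple) and t and t[0] == '&':
        left, right = simplify(t[1]), simplify(t[2])
        if left is False or right is False:
            return False               # p & False folds to False
        if left is True:
            return right               # True & p folds to p
        if right is True:
            return left
        return ('&', left, right)      # nothing to fold
    return t

print(simplify(('&', True, False)))              # False
print(simplify(('&', True, ('&', 'p', True))))   # folds to 'p'
print(simplify(('&', 'p', 'q')))                 # unchanged
```

Note that folding is purely syntactic: nothing sophisticated like equivalence checking is attempted, matching the "nothing sophisticated" caveat above.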
The most involved part of the metalanguage implementation is type inference. Plan for discussing:
On Thursday, we will revisit this topic, and talk about polymorphic type inference.
Useful references
Desiderata for implementation:
(I won't discuss the latter, but just to tantalize you: in the untyped lambda calculus, it is undecidable whether two lambda expressions are equivalent -- arguably the first undecidable problem, discovered by Church.)
An example (untyped): $(\lambda x . y)((\lambda x . x(x))(\lambda x . x(x)))$
Reduction 1 (converges): contract the outer redex first; the argument is discarded and the whole term reduces to $y$ in one step.
Reduction 2 (does not converge): contract the inner redex first; $(\lambda x . x(x))(\lambda x . x(x))$ reduces to itself, over and over.
The untyped lambda calculus is not strongly normalizing. No guarantee that reduction will converge at all!
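The order-sensitivity can be seen concretely with a tiny reducer over tuple-encoded terms. This is an illustrative sketch with naive (non-capture-avoiding) substitution; the example terms are chosen so that variable capture cannot arise:

```python
# A tiny untyped lambda-calculus reducer, illustrating that reduction
# order matters. Terms are tuples: ('var', n), ('lam', n, body),
# ('app', f, a). Substitution is naive (no capture avoidance).

def subst(t, name, val):
    tag = t[0]
    if tag == 'var':
        return val if t[1] == name else t
    if tag == 'lam':
        if t[1] == name:            # bound variable shadows `name`
            return t
        return ('lam', t[1], subst(t[2], name, val))
    return ('app', subst(t[1], name, val), subst(t[2], name, val))

def step(t, order):
    """One beta-step under 'normal' (leftmost-outermost) or
    'applicative' (argument-first) order; None if t is a normal form."""
    if t[0] == 'app':
        f, a = t[1], t[2]
        if order == 'applicative':
            s = step(a, order)
            if s is not None:
                return ('app', f, s)
        if f[0] == 'lam':
            return subst(f[2], f[1], a)     # contract the redex
        s = step(f, order)
        if s is not None:
            return ('app', s, a)
        s = step(a, order)
        if s is not None:
            return ('app', f, s)
    if t[0] == 'lam':
        s = step(t[2], order)
        if s is not None:
            return ('lam', t[1], s)
    return None

def reduce_capped(t, order, cap=50):
    """Reduce until normal form or until `cap` steps; return (term, steps)."""
    for i in range(cap):
        s = step(t, order)
        if s is None:
            return t, i
        t = s
    return t, cap    # gave up: (probably) diverges

omega = ('app', ('lam', 'x', ('app', ('var', 'x'), ('var', 'x'))),
                ('lam', 'x', ('app', ('var', 'x'), ('var', 'x'))))
term = ('app', ('lam', 'x', ('var', 'y')), omega)   # (λx.y)(Ω)

print(reduce_capped(term, 'normal'))       # converges: (('var', 'y'), 1)
print(reduce_capped(term, 'applicative'))  # hits the cap: Ω loops forever
```

Normal order throws the diverging argument away immediately; applicative order insists on normalizing it first and never finishes.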
$\alpha$-reduction: $\lambda v_\tau . \alpha \Rightarrow_\alpha \lambda v'_\tau . \alpha[v_\tau := v'_\tau]$
if $v'_\tau$ is not free in $\alpha$, and $v'_\tau$ is free for substitution for $v_\tau$ in $\alpha$.
$\beta$-reduction: $(\lambda v_\tau . \alpha)(\beta) \Rightarrow_\beta \alpha{}[v_\tau := \beta]$
if $\beta$ is free for substitution for $v_\tau$ in $\alpha$ and $\beta \in \mathbf{Term}_\tau$.
Strong normalization: a lambda calculus (viewed as a rewrite system) is strongly normalizing if every reduction sequence terminates, i.e. every term reaches a $\beta$-normal form regardless of reduction order.
Confluence (Church-Rosser): if $\alpha \Rightarrow \beta$ and $\alpha \Rightarrow \gamma$, then there is some $\delta$ such that $\beta \Rightarrow \delta$ and $\gamma \Rightarrow \delta$.
All this to say:
Back to the Carpenter definition: if $\alpha$ is a term of type $\langle \sigma,\tau \rangle$, and $\beta$ is a term of type $\sigma$, then $(\alpha(\beta))$ is a term of type $\tau$.
Implementation looks pretty straightforward. Given some LFun f and TypedExpr a: if f.type[0] == a.type, then return ApplicationExpr(f, a) (of type f.type[1]); otherwise, raise a TypeMismatchError.
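Schematically, the check looks like this (a sketch following the description above, not lamb's actual code; function types here are simple (domain, range) pairs):

```python
# Schematic type-checking for function application; the names mirror
# the text above, but the implementation is an illustrative sketch.

class TypeMismatchError(Exception):
    pass

def application_type(f_type, a_type):
    """Return the type of f(a) if well-typed, else raise."""
    dom, ran = f_type          # a functional type: (domain, range)
    if a_type != dom:
        raise TypeMismatchError(f"argument type {a_type} != domain {dom}")
    return ran

print(application_type(('e', 't'), 'e'))   # t
```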
Important detail: we also need to ensure that variables are used consistently!
%te L x_t : P_<e,t>(x_e)
ERROR (parsing): Parsing of typed expression failed with exception: ERROR (parsing): Binding operator expression has unparsable body, in string 'L x_t : P_<e,t>(x_e)' (Type mismatch: 'x_e'/e and type t conflict (Failed to unify types across distinct instances of term))
%te p_t & Q_<e,t>(p_e)
ERROR (parsing): Parsing of typed expression failed with exception: ERROR (parsing): Type mismatch: 'p_t'/t and type e conflict (Failed to unify types across distinct instances of term)
Type inference: given two types $a,b$, what (if any) single type $c$ is equivalent to $a$ and $b$?
In the simply-typed lambda calculus, type inference is the same thing as type checking. If $a$ and $b$ are equal, they can be unified as $a$ (or $b$), otherwise, they cannot be unified.
Given two terms ${t_1}_\alpha$ and ${t_2}_\beta$, a unification is a valid substitution that produces a single ${t_3}_\gamma$ equivalent to both $t_1$ and $t_2$.
Reduction as unification: reduction involves unifying a variable of some type $\alpha$ (indicated by the $\lambda$ term) with an argument of some type $\beta$, and substituting the result for the variable in the scope of the lambda term.
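In the simply typed setting, this unification amounts to a structural equality check that recurses through functional types. A minimal sketch (atomic types as strings, functional types as (domain, range) pairs):

```python
# Simple-type unification, as described above: in the simply typed
# lambda calculus, unifying is just checking structural equality.
# (Illustrative sketch; polymorphic inference, revisited Thursday,
# requires real substitution-based unification.)

def unify(a, b):
    """Return the unified type, or None if a and b cannot be unified."""
    if isinstance(a, tuple) and isinstance(b, tuple):
        dom, ran = unify(a[0], b[0]), unify(a[1], b[1])
        if dom is None or ran is None:
            return None
        return (dom, ran)
    return a if a == b else None

print(unify(('e', 't'), ('e', 't')))   # ('e', 't')
print(unify(('e', 't'), 'e'))          # None: cannot be unified
```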
%te (L x_e : Cat_<e,t>(x))(Joanna_e)
%te (L f_<e,t> : Forall x_e : f(x))(L x_e : Cat_<e,t>(x))
Translate some lambda calculus formulas to metalanguage objects with %te
Combinations:
Translate a formula from your work to the metalanguage (if it can handle it!)