Currently we infer the kinds of type variables just by looking at where they are used in the signature: if a type variable is used in a row position, we infer its kind as `row`; otherwise we infer it as `star`. For type declarations, we infer all kinds as `star`.
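Roughly, the current scheme amounts to the following purely syntactic scan (a minimal Haskell sketch over a hypothetical AST, not the compiler's actual code): a variable counts as `row` only when it literally occurs as a row tail, so a variable that only appears as a type-constructor argument comes out as `star`.

```haskell
import Data.Maybe (maybeToList)

data Kind = Star | Row deriving (Eq, Show)

data Ty
  = TyVar String                            -- type variable
  | TyApp String [Ty]                       -- applied type constructor, e.g. Test[r]
  | TyVariant [(String, Ty)] (Maybe String) -- variant row with an optional tail variable

-- A variable's kind is `Row` only if it occurs as a row tail somewhere
-- in the signature; every other occurrence defaults to `Star`.
inferVarKind :: String -> Ty -> Kind
inferVarKind v ty = if v `elem` rowTails ty then Row else Star
  where
    rowTails (TyVar _)              = []
    rowTails (TyApp _ args)         = concatMap rowTails args
    rowTails (TyVariant alts tail_) =
      maybeToList tail_ ++ concatMap (rowTails . snd) alts
```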
This is not right. Example:
# `r` is currently `*`, but it should be `row`.
type Test[r]:
    variant: [A, B, ..r]

# `r` is currently `*`, but it should be `row`.
f1[r](t: Test[r])
    match t.variant:
        ~A: printStr("A")
        ~B: printStr("B")
        other: printStr("Other")
The reason this doesn't cause issues right now is that we also allow non-kind-preserving unification: e.g. we can link a variable of kind `*` to a row type.
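As a sketch of what "non-kind-preserving" means here (a hypothetical unifier, not the actual compiler code): the `link` step below binds a unification variable without comparing its kind against the kind of the type it is bound to, so a `*`-kinded variable silently ends up pointing at a row.

```haskell
import Data.IORef

data Kind = Star | Row deriving (Eq, Show)

data Ty
  = TyRow [String]                     -- a closed row, kind `row`
  | TyUnifyVar Kind (IORef (Maybe Ty)) -- unification variable carrying its kind

-- Binds the variable without checking kinds. A kind-preserving unifier
-- would first check that the kind of `ty` matches `k` and fail otherwise.
link :: Kind -> IORef (Maybe Ty) -> Ty -> IO ()
link _k ref ty = writeIORef ref (Just ty)

demo :: IO ()
demo = do
  ref <- newIORef Nothing
  -- A variable of kind `*` gets linked to a row type; no error is raised.
  link Star ref (TyRow ["A", "B"])
```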
Haskell 98 kind inference should work.
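A rough sketch of what that looks like for this language's two kinds (Haskell 98 also needs arrow kinds for higher-kinded constructors, omitted here; all names are hypothetical): give every type variable a fresh kind metavariable, emit an equality constraint at every use site, solve by unification, and default any still-unconstrained metavariable to `*`.

```haskell
data Kind = KStar | KRow | KMeta Int deriving (Eq, Show)

type Subst = [(Int, Kind)]

apply :: Subst -> Kind -> Kind
apply s (KMeta m) = maybe (KMeta m) (apply s) (lookup m s)
apply _ k         = k

-- First-order unification on kinds; fails on e.g. `*` vs `row`.
-- No occurs check is needed since these kinds have no structure.
unify :: Kind -> Kind -> Subst -> Maybe Subst
unify k1 k2 s = case (apply s k1, apply s k2) of
  (a, b) | a == b -> Just s
  (KMeta m, k)    -> Just ((m, k) : s)
  (k, KMeta m)    -> Just ((m, k) : s)
  _               -> Nothing

-- Haskell 98 defaulting: metas left unconstrained after solving become `*`.
zonk :: Subst -> Kind -> Kind
zonk s k = case apply s k of
  KMeta _ -> KStar
  k'      -> k'
```

With this in place, the example above works out: the declaration of `Test` constrains its parameter to `row` (so `Test` gets kind `row -> *`), and the use `Test[r]` in `f1`'s signature then forces `r` to `row`, so the `..r` tail kind-checks without relying on non-kind-preserving unification.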
I started working on this in the `kind_inference` branch.