On the 28th day of February of the year 2026, I was attempting to start writing an FP scripting language for fun.
So I wrote up an Ast enum for a basic calculator to start:
pub enum Ast {
    Binary {
        left: Box<Ast>,
        operator: Token,
        right: Box<Ast>,
    },
    Unary {
        operator: Token,
        right: Box<Ast>,
    },
    Value {
        value: Value,
    },
}
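To make that concrete, here's a sketch of the calculator in action, with hypothetical stand-ins for the Token and Value types (the real ones aren't shown in this post):

```rust
// Hypothetical stand-ins for the Token and Value types used by the Ast
#[derive(Debug, PartialEq)]
enum Token {
    Plus,
}
type Value = i64;

#[allow(dead_code)]
enum Ast {
    Binary { left: Box<Ast>, operator: Token, right: Box<Ast> },
    Unary { operator: Token, right: Box<Ast> },
    Value { value: Value },
}

fn eval(ast: &Ast) -> Value {
    match ast {
        Ast::Binary { left, operator: Token::Plus, right } => eval(left) + eval(right),
        // unary plus is the only operator in this sketch, so treat it as identity
        Ast::Unary { right, .. } => eval(right),
        Ast::Value { value } => *value,
    }
}

fn main() {
    // "1 + 1"
    let expr = Ast::Binary {
        left: Box::new(Ast::Value { value: 1 }),
        operator: Token::Plus,
        right: Box::new(Ast::Value { value: 1 }),
    };
    assert_eq!(eval(&expr), 2);
}
```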
I realized that I wanted to add a way to include spans when I wanted or not. So here’s something I cooked up:
pub struct Spanned<T> {
    inner: T,
    span: std::ops::Range<usize>,
}

pub struct Simple<T> {
    inner: T,
}
Of course, since this is my project and I wanted to be a little bit silly, I went ahead and abstracted this like a Java Enterprise Software Engineer, but in an alternative universe where everything is Haskell.
Higher kinded types, or HKTs, is the notion that generics can have generics. That is, in something like struct Foo<T>(T<i32>);, T would be any specific type that hypothetically accepts one generic type. In that sense, T is a type constructor that has an arity of one that accepts a generic. In this case, we construct a new type T<i32> by passing i32 as an argument to the T type constructor.
…
Okay let’s take a step back.
Type systems usually live in their own universe, with analogues to the universe we live in. If we have values, the universe above us has types. If we have functions, the universe above us has type constructors. So T is a “function” that accepts a type and returns a new, different type.
Okay you got that? Uh? What’s the “kinded” part mean? What’s a universe?
Uhhhh.
Okay. Let’s think about types. Generic types, that is.
Contrary to some people’s belief, Vec is not a type.
So Vec<i32> is a type. And i32 is also a type.†† The proof is left as an exercise to the reader. But not Vec. What is it exactly? In the parlance of type systems, Vec is a type constructor. You can think of it as fn(type) -> type, where it accepts a type, i.e. i32, and spits out a new type Vec<i32>. Let’s consult the comparison table:
|          | Values                             | Types                      |
|----------|------------------------------------|----------------------------|
| function | fn(bool) -> bool, e.g. unary not ! | fn(type) -> type, e.g. Vec |
| argument | a bool, e.g. true                  | a type, e.g. i32           |
| output   | a bool, e.g. false                 | a type, e.g. Vec<i32>      |
Ah, so ! has a type which is effectively fn(bool) -> bool,
†† People would write it as bool -> bool ala Haskell style. Starting now, let’s write it like that for simplicity and brevity.
and !true has the type bool.
bool and i32 are what people consider base types, which are primitive to the language itself and cannot be formed from other types.
These base types can be rearranged to form different new types, like bool -> bool and i32 -> String.
An example of an i32 -> String value is i32::to_string.
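For instance, we can pass that value around like any other function value:

```rust
fn main() {
    // i32::to_string is a value whose type is (morally) i32 -> String;
    // here we hand it to map like any other function value
    let strings: Vec<String> = [1, 2, 3].iter().map(i32::to_string).collect();
    assert_eq!(strings, vec!["1", "2", "3"]);
}
```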
Both bool -> bool and bool are types.
In other words, bool -> bool and bool both have the kind type.
If there are base types, then are there base kinds?
Yes!
Most languages only have one base kind, the type kind.
Rust has two: the type kind and the lifetime kind.
Although in this post, we are going to mostly ignore the lifetime kind.
Next, can we form new kinds from rearranging base kinds?
Yes!
If we had a bool -> bool type earlier, then we could have a type -> type kind.
Well, according to the comparison table, we do have something with the kind type -> type: it’s Vec!
Of course, higher kinded terms like these don’t stop at type -> type.
Result is type, type -> type,†† Be quiet, Haskellers, I’m not writing actual Haskell here. and Cow<'a, T> is lifetime, type -> type.
Finally we can define some terms. Things such as type -> type are called type operators
and type operators that evaluate to a type are called type constructors.
There are even higher order type operators, which are operators that take in type operators.
Like (type -> type) -> type or fn(fn(type) -> type) -> type.
These are what we consider as higher kinded types,
since these abstract over type constructors.
Does rust have higher kinded types?
No.†† because Rust is a bad language 😔
Well.
Sorta kinda.
The explanation earlier is not that original;
I copied it from the Generic Associated Types RFC.
And GATs are kinda discount HKTs, so uh close enough?
The problem we have with the nonexistence of HKTs is that we don’t have a way to pass in type constructors to generics.
Take this syntactically invalid example we introduced earlier.
struct Foo<T> {
    inner: T<i32>
}
Theoretically we could pass in a Vec as a type constructor:
type Bar = Foo<Vec>;
Which when applied gives us effectively a:
struct Bar {
    inner: Vec<i32>
}
But generics only accept types, not type constructors.
We established as much in our bulletproof comparison table earlier.
So what’s the workaround for this?
Enter GATs.
Let’s make up a new trait:
trait Wrap {
    type Wrapper<T>;
}
Prior to 1.65, the <T> part in type Wrapper<T> isn’t possible.
Now, given this newfound power, we can define a dummy marker struct like so:
struct VecAsTypeConstructor;

impl Wrap for VecAsTypeConstructor {
    type Wrapper<T> = Vec<T>;
}
Now, since VecAsTypeConstructor is a type and not a type constructor,
we can now rewrite Foo and pass the VecAsTypeConstructor as a concrete type to the generic.
struct Foo<T: Wrap> {
    inner: T::Wrapper<i32>,
}

type Bar = Foo<VecAsTypeConstructor>;
Let’s check if Bar’s inner field is actually of type Vec<i32>.
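Here's the whole sketch assembled into a runnable check (needs Rust 1.65+ for GATs); if inner weren't a Vec<i32>, the annotated binding below wouldn't typecheck:

```rust
// A GAT standing in for a type constructor
trait Wrap {
    type Wrapper<T>;
}

struct VecAsTypeConstructor;

impl Wrap for VecAsTypeConstructor {
    type Wrapper<T> = Vec<T>;
}

struct Foo<T: Wrap> {
    inner: T::Wrapper<i32>,
}

type Bar = Foo<VecAsTypeConstructor>;

fn main() {
    // If inner weren't a Vec<i32>, this explicit annotation would be a type error
    let bar: Bar = Foo { inner: vec![1, 2, 3] };
    let v: Vec<i32> = bar.inner;
    assert_eq!(v, vec![1, 2, 3]);
}
```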
Okay, let’s go back and remind you of our initial problem.
pub enum Ast<W: ???> {
    Binary {
        left: Box<W<Ast>>,
        operator: Token,
        right: Box<W<Ast>>,
    },
    Unary {
        operator: Token,
        right: Box<W<Ast>>,
    },
    Value {
        value: Value,
    },
}
I wanted to make a wrapper type that abstracts over Simple and Spanned, remember?
Why?
Because type-fu is fun!
Anyways, given the newfound Wrap superpower, let’s rewrite this. On second thought, wrapping every field in W::Wrapper is a stupid idea. Should have been just the Ast.
pub enum Ast<W: Wrap> {
    Binary {
        left: Box<W::Wrapper<Ast<W>>>,
        operator: W::Wrapper<Token>,
        right: Box<W::Wrapper<Ast<W>>>,
    },
    Unary {
        operator: W::Wrapper<Token>,
        right: Box<W::Wrapper<Ast<W>>>,
    },
    Value {
        value: W::Wrapper<Value>,
    },
}
Uhhhh, W::Wrapper<Ast<W>>?
That looks recursive.
What’s 1 + 1? A trait requirement evaluation overflow, of course
Okay let’s make a simple test.
"1 + 1" should parse to effectively Binary(Value(1), Plus, Value(1)) and evaluate to 2.
A simple calculator!
We derive Debug, PartialEq on all relevant types.
I posted this snippet
to the RPLCS Discord Server.
Fortunately, Helix†† theemathas on GitHub, notable for finding a lot of bugs in rustc.
helped me with debugging this snippet.
After some back and forth with him, the horrible truth set upon me.
Conversation between me and Helix on Discord. Below is the transcription:
A Comprehensive Guide to Understanding Coinduction Principles in Mathematical Logic
Sarah Lee (AI generated) (Llama-4-Maverick-17B-128E-Instruct-FP8)
Okay I’m not reading that.
I’m not gonna even link it here.
Uhhhhh.
Okay found another one.
Oh my god it’s an academic article hell yeah.
…
Okay still a bit technical for me.
Let’s revisit what induction means first.
Mathematical induction is a method of proving a statement is true
for all natural numbers (natural numbers here meaning n ≥ 1).
Proof by induction works by splitting the proof into two:
the base case and the inductive step.
Giving an example, let’s have this problem from Math Is Fun:
Is it true that 3^n − 1 is a multiple of 2? In other words,
3^n − 1 ≡ 0 (mod 2)
The base case proves the statement when n=1.
No other stuff required.
So let’s test it out!
Substituting n = 1, we get
3^1 − 1 = 3 − 1 = 2, and 2 mod 2 = 0
QED or whatever.
Next, we assume that for n = k, the statement is true.
Okay.
3^k − 1 ≡ 0 (mod 2) is true.
I assume it is true.
Therefore it is true.
Have you assumed it yet?
Yeah you should assume it now.
Did you assume it yet?
If so, then you have assumed it.
Therefore it is true.
Because you assumed it.
Then, the meatiest part of this method is the inductive step.
Given that n=k is true†† Of course, we have just assumed that. ,
then n=k+1 is true.
This is the “step” part of inductive step;
the process of stepping through every number is enough to prove the statement
as long as we have a base case.
Okay let’s substitute n = k + 1 into 3^n − 1.
3^(k+1) − 1 = 3 × 3^k − 1 = 2 × 3^k + 3^k − 1
Remember that 3x can be rewritten as 2x + x,
and so if we let x = 3^k,
3 × 3^k can be rewritten as 2 × 3^k + 1 × 3^k.
We omit the 1× for brevity.
Since 2 × 3^k is a multiple of 2,
and 3^k − 1 is also a multiple of 2, by assumption,
then
3^(k+1) − 1 ≡ 0 (mod 2)
is true!
Okay so this is the part where we marry the base case and the inductive step together.†† In a gay way of course. I don’t believe in heterosexuality.
Given the base case n=1, which we proved to be true in equation 1,
then we can try n=1+1=2.
We could manually evaluate it.
But since we did prove for n=k+1 to be true,
we can substitute k=1 to get n=2
and have 3^2 − 1 ≡ 0 (mod 2) be true
without bothering to evaluate it wholesale.
Our “hole” then lies in our assumption that n = k is always true.
However, given that we have our base case n = 1
and inductive step n = k + 1,
we can eventually prove the statement true
by recursively applying n=k+1 forever.
Since the n=1 case is true, then n=1+1=2 is true, by applying the inductive step.
Since the n=2 case is true, then n=2+1=3 is true, by applying it again.
And again and again, to the point where the statement is true for every number ever.
Okay that’s nice, but what does that have to do with whatever we were doing earlier?
Well okay, we need to generalize the concept of mathematical induction
to work beyond natural numbers.
How do we use induction in stuff that aren’t integers?
Researching about this topic led me to structural induction,
which could be related, so I’ll try talking about it here.
Structural induction is a way to prove some property holds
for recursively defined structures like lists and strings.
Just like normal induction, structural induction has a base case and an inductive step.
We can emulate the creation of such structures
by recursively defining them.
Natural numbers could be defined recursively as follows:
Base case: 1 ∈ N
Recursive step: if k ∈ N, then k + 1 ∈ N
Integers:
Base case: 0 ∈ Z
Recursive step: if x ∈ Z, then x + 1 ∈ Z and x − 1 ∈ Z
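As a sketch, the recursive definition of the naturals maps directly onto a recursive Rust enum:

```rust
// Base case: One is a natural. Recursive step: Succ(k) is a natural if k is.
enum Nat {
    One,
    Succ(Box<Nat>),
}

// Fold the recursive structure back down into a machine integer
fn to_u32(n: &Nat) -> u32 {
    match n {
        Nat::One => 1,
        Nat::Succ(k) => to_u32(k) + 1,
    }
}

fn main() {
    // Succ(Succ(One)) represents the numeral 3
    let three = Nat::Succ(Box::new(Nat::Succ(Box::new(Nat::One))));
    assert_eq!(to_u32(&three), 3);
}
```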
Since these structures are defined inductively,
we could prove some properties about them via induction too.
Let’s use this example from a random pdf I found on the internet:
Let S be a set where:
Base case: 6 ∈ S and 15 ∈ S
Recursive step: if x, y ∈ S, then x + y ∈ S
Now, let’s say we want to prove the property that
all the elements in the set S are divisible by 3.
How do we do that?
By induction of course!
As is tradition, let’s consider the base case first.
Is 6 divisible by 3? Yeah.
Is 15 divisible by 3? Yup.
Do we need to elaborate on that? No?
Thank god.
I thought I’d have to prove 1 + 1 = 2 or something.
Anyways.
Let’s consider the inductive step:
For all numbers x and y,
we assume that 3∣x and 3∣y.†† That’s notation for “3 divides x”. Personally I think it looks backwards.
Because of that, we can represent x = 3n for some integer n,
and same for the other with y = 3m for some integer m.
Therefore, applying the recursive step gives us x + y = 3n + 3m = 3(n + m).
Wait, since x + y takes the form 3p where p = n + m,
then it must be that 3 ∣ x + y.
By the power of induction,
we have proved that all the elements of set S are divisible by 3.
…
Okay but what does all of that have to do with anyth—
That moment when Curry and Howard did a joint yaoi slay and caused a motherquake measuring 9.9 on the cunter scale
damn curry and howard corresponded all over that thang
Why are we talking about proofs in the first place?
Well.
Haskell Curry†† yes that haskell and that curry in FP parlance
and William Alvin Howard realized something and that is
there is a connection between mathematical proofs and computer programs.
To be more specific, both of them are isomorphic to each other.
In fact, there is a direct correspondence between
logic terminologies and programming terminologies.
See the table below, taken from Wikipedia:
| Logic side                              | Programming side                  |
|-----------------------------------------|-----------------------------------|
| formula                                 | type                              |
| proof                                   | term                              |
| formula is true                         | type has an element               |
| formula is false                        | type does not have an element     |
| logical constant ⊤ (truth)              | unit type                         |
| logical constant ⊥ (falsehood)          | empty type                        |
| implication                             | function type                     |
| conjunction                             | product type                      |
| disjunction                             | sum type                          |
| universal quantification                | dependent product type            |
| existential quantification              | dependent sum type                |
| Hilbert-style deduction system          | type system for combinatory logic |
| natural deduction                       | type system for lambda calculus   |
| hypotheses                              | free variables                    |
| implication elimination (modus ponens)  | application                       |
| implication introduction                | abstraction                       |
Unfortunately, teaching about this in a detour section of an article about HKTs is kinda uhhhh.
So I’ll let you watch this video first.
Remember to come back!
So this part is me trying out Lean 4 for the first time, for the following reasons:
I want to have an excuse to try out Lean 4.
lol
Since Curry-Howard Isomorphism states that proofs == programs,
I want to try rewriting the proof above as a program in Lean 4.
I tried writing the proof in Rust, but then I got bitten by the conflicting trait implementation error.
Curse you Rust!
So let’s start!
First, we have to define the set S in Lean.
How do we do that?
We make a type, using the inductive keyword:
inductive S where
  | six : S
  | fifteen : S
  | step : S → S → S
six and fifteen are our base cases.
Each is a constructor that takes nothing and returns an S.
step is our recursive step,
where it is a constructor that takes two instances of S and return an S.
Surprisingly, there’s a rust analogue for this: the enum!
You can rewrite this in rust as follows:
enum S {
    Six,
    Fifteen,
    Step(Box<S>, Box<S>),
}
Next step, we define a way to convert S to a natural number.
We use natural numbers because uhhhhh I did the natural numbers game
and that’s the only number set I know.
def toNat (s : S) : Nat := match s with
  | .six => 6
  | .fifteen => 15
  | .step s1 s2 => toNat s1 + toNat s2
(s : S) is our argument list and : Nat is the return part of the definition.
It’s similar to this:
fn to_nat(s: S) -> u32 {
    match s {
        S::Six => 6,
        S::Fifteen => 15,
        S::Step(s1, s2) => to_nat(*s1) + to_nat(*s2),
    }
}
Then, we define a proposition.
This is the part where theorem provers and programming languages diverge a bit.
Well not really.
But sorta kinda.
What am I saying.
Propositions are types.
In Lean, you can define the “divisible by 3” proposition as so:
def Threeven (i : Nat) := ∃ k, i = 3 * k
Let’s translate this.
We define a new proposition Threeven
which takes in a natural number i.
We state that given a number i that is Threeven,
there exists a number k (that’s what we mean by ∃ k)
where i = 3 * k.
If we wanna prove that i is threeven,
then we just need to provide a value for k
such that 3 * k equals i.
That seems easy!
So let’s try proving the base cases using this.
We use the theorem keyword to, well, introduce a theorem.
theorem three_divides_six : Threeven 6 :=
  have h : 6 = 3 * 2 := by simp
  Exists.intro 2 h
We introduce a hypothesis h that 6 = 3 * 2,
then we simplify it to 6 = 6, which is trivially true.
This is the part where I’m not sure what is happening,
but my intuition tells me that Exists.intro 2 h
applies with k := 2 and h := i = 3 * k.
Since they match up together when i := 6,
then the goals have been solved and the program typechecks.
We repeat this for the other base case.
theorem three_divides_fifteen : Threeven 15 :=
  have h : 15 = 3 * 5 := by simp
  Exists.intro 5 h
Now this is the part where I got stuck.
I didn’t know how to prove the recursive step of the theorem:
theorem three_divides_threeven_plus_threeven
    (a : Nat) (b : Nat)
    (ha : Threeven a) (hb : Threeven b) : Threeven (a + b) :=
Translating this, we say that:
The theorem three_divides_threeven_plus_threeven
accepts two natural numbers a and b,
and two hypotheses ha where a is threeven,
and hb where b is also threeven,
where we must prove that the sum of a and b is also a threeven.
I do have experience in the natural numbers game,
but I don’t have experience in proving existentials.
Thankfully, pitust and Sabrina were kind enough to help me with this.
First, I could’ve rewritten the above theorems as:
theorem three_divides_six : Threeven 6 := by exists 2
theorem three_divides_fifteen : Threeven 15 := by exists 5
Thank you tactics mode.
Why didn’t I think of that.
Then, pitust gave the answer to three_divides_threeven_plus_threeven with this:
theorem three_divides_threeven_plus_threeven
    (a : Nat) (b : Nat)
    (ha : Threeven a) (hb : Threeven b) : Threeven (a + b) := by
  obtain ⟨a', ha⟩ := ha
  obtain ⟨b', hb⟩ := hb
  exists a' + b'
  rw [ha, hb, Nat.mul_add]
This is the first time I’ve seen obtain,
so hopefully this explanation sounds good enough.
First, by entering tactics mode, we get a tactic state with a goal.
a b : Nat
ha : Threeven a
hb : Threeven b
⊢ Threeven (a + b)
Our goal is that given a and b as natural numbers
and ha and hb being the hypotheses that the two numbers are threeven,
then we must meet the goal that a + b is also threeven.
(This tactic state appears when you put your text cursor after the word by)
After that, we use obtain,
which is kinda like unpacking the Threeven a hypothesis
into its parts.
What are its parts, exactly?
Remember the definition: def Threeven (i : Nat) := ∃ k, i = 3 * k
The two parts here are the natural number k
and the proposition i = 3 * k.
So obtain splits Threeven a into its parts: the witness k becomes a', and the proposition a = 3 * a' becomes ha.
Our proof state now looks like:
a b : Nat
hb : Threeven b
a' : Nat
ha : a = 3 * a'
⊢ Threeven (a + b)
Let’s do the same for hb.
Now we get:
a b a' : Nat
ha : a = 3 * a'
b' : Nat
hb : b = 3 * b'
⊢ Threeven (a + b)
Since the goal state of our theorem is an existential,
we can use exists to substitute k with anything we want.
Here, we substitute a' + b' into the goal.
a b a' : Nat
ha : a = 3 * a'
b' : Nat
hb : b = 3 * b'
⊢ a + b = 3 * (a' + b')
Now it’s a matter of rewriting the goal.
Yes that’s what rw stands for.
We can rewrite a to 3 * a' and b to 3 * b' with rw [ha, hb]
to get this goal state
a b a' : Nat
ha : a = 3 * a'
b' : Nat
hb : b = 3 * b'
⊢ 3 * a' + 3 * b' = 3 * (a' + b')
Of course, the last step would be us expanding 3 * (a' + b')
via the distributive property of numbers.
This is spelled out as rw [Nat.mul_add].
Why is it called that?
Because a * (b + c) has a * first then + next.
mul_add.
Yup. That’s creative naming.
After the last statement, we get:
a b a' : Nat
ha : a = 3 * a'
b' : Nat
hb : b = 3 * b'
⊢ 3 * a' + 3 * b' = 3 * a' + 3 * b'
Notice how both sides are the exact same.
As such the goal is met and the theorem typechecks.
Hell yeah!
Now here is the fun stuff.
We can use these theorems to prove that the statement
“all elements of S are divisible by 3”.
Let’s define this theorem right now.
theorem S.is_threeven (s : S) : Threeven s.toNat :=
pitust helpfully told me I could name it S.is_threeven for ease.
I also redefined toNat to be defined on S.
Anyways.
This theorem states that all elements of S are threeven.
I’m not sure how that theorem implies that as spelled out,
but apparently it is a feature of the calculus of constructions.
I’m not really sure what that means. pitust says:
if you have a function which maps a value of type S into a proof that something is true for that value, then by definition that thing is true for all values of that type
I don’t know what that means.
Anyways.
This is what the first part of this section has been leading up to.
Induction.
We do proof by induction on s:
theorem S.is_threeven (s : S) : Threeven s.toNat := by
  induction s with
and then we pattern match on s.
theorem S.is_threeven (s : S) : Threeven s.toNat := by
  induction s with
  | six => apply three_divides_six
  | fifteen => apply three_divides_fifteen
  | step a b ha hb => ...
Both six and fifteen base cases are self-explanatory,
we just proved them earlier.
We just have to apply them here.
step is a bit cute.
With induction, we not only pattern match on its variables,
but also its propositions. What are these propositions, you ask?
That a and b are threevens.
This is The Big Assumption.
That we know is True.
By Assumption.
Starting with this branch we get the state:
case step
a b : S
ha : Threeven a.toNat
hb : Threeven b.toNat
⊢ Threeven (a.step b).toNat
First we rewrite (a.step b).toNat with its definition via rw [S.toNat],
giving us:
a b : S
ha : Threeven a.toNat
hb : Threeven b.toNat
⊢ Threeven (a.toNat + b.toNat)
Wow, that looks an awful lot like our three_divides_threeven_plus_threeven theorem.
Let’s apply it.
| step a b ha hb =>
  rw [S.toNat]
  apply three_divides_threeven_plus_threeven
Gives us:
case step.ha
a b : S
ha : Threeven a.toNat
hb : Threeven b.toNat
⊢ Threeven a.toNat
case step.hb
a b : S
ha : Threeven a.toNat
hb : Threeven b.toNat
⊢ Threeven b.toNat
What the hell?
At first, I found this confusing, but thankfully Sabrina explained this to me:
apply matches up the type of the result of the proposition with the goal, and then looks at what hypotheses you need to prove to get there
it creates new goals for each of those hypotheses
So since the goal matches with the result of the sum of threevens theorem,
then it’s up to us to prove the hypotheses we defined instead of the result itself.
That’s nice.
Because our hypotheses are much simpler than our original goal,
we can exploit those instead.
We got two cases: step.ha and step.hb.
What do you know, the goal is the exact same as ha.
Let’s use exact ha.
And same for exact hb.
The final theorem:
theorem S.is_threeven (s : S) : Threeven s.toNat := by
  induction s with
  | six => apply three_divides_six
  | fifteen => apply three_divides_fifteen
  | step a b ha hb =>
    rw [S.toNat]
    apply three_divides_threeven_plus_threeven
    · exact ha
    · exact hb
Nice!
I don’t know what the dot is for.
Here’s the full program if you are wondering.
inductive S where
  | six : S
  | fifteen : S
  | step : S → S → S

def S.toNat (s : S) : Nat := match s with
  | six => 6
  | fifteen => 15
  | step s1 s2 => toNat s1 + toNat s2

def Threeven (i : Nat) := ∃ k, i = 3 * k

theorem three_divides_six : Threeven 6 := by exists 2

theorem three_divides_fifteen : Threeven 15 := by exists 5

theorem three_divides_threeven_plus_threeven
    (a : Nat) (b : Nat)
    (ha : Threeven a) (hb : Threeven b) : Threeven (a + b) := by
  obtain ⟨a', ha⟩ := ha
  obtain ⟨b', hb⟩ := hb
  exists a' + b'
  rw [ha, hb, Nat.mul_add]

theorem S.is_threeven (s : S) : Threeven s.toNat := by
  induction s with
  | six => apply three_divides_six
  | fifteen => apply three_divides_fifteen
  | step a b ha hb =>
    rw [S.toNat]
    apply three_divides_threeven_plus_threeven
    · exact ha
    · exact hb
This proof is mathematically sound and valid and correct or whatever
because the Lean compiler stopped yelling at me for “incomplete goals”.
So uh.
QED.
We now return to your most programming language ever
Right.
Where was I?
Rust?
Oh yeah.
Let’s circle back on that.
The reason we talked about logic is that we are going to talk about the trait solver.
Specifically, both the chalk-based trait resolver
and the next-generation trait solver, which is based on the previous chalk-based one.
Here’s a quote from the rustc-dev guide
on chalk-based trait resolution:
The key observation here is that the Rust trait system is basically a kind of logic,
and it can be mapped onto standard logical inference rules. We can then look for solutions
to those inference rules in a very similar fashion to how e.g. a Prolog solver works. It
turns out that we can’t quite use Prolog rules (also called Horn clauses) but rather need a
somewhat more expressive variant.
Prolog??? I should’ve learned prolog instead of lean!!!
:clueless:
Looking at the example on the next-gen solver, they have:
fn uses_vec_clone<T: Clone>(x: &Vec<T>) -> Vec<T> {
    x.clone()
}
In this code, to be able to clone Vec<T>, T must implement Clone.
Since this function has T bound by Clone, we can safely assume
that Vec<T> is also Clone.
The statement “prove Vec<T> is Clone assuming T is Clone” is called a Goal in next-gen parlance.
Assuming is fine here because we defer the bound to the caller of the function.
The bounds Vec<T>: Clone and T: Clone are called Predicates.
Wow this looks a bit familiar.
So apparently to prove Vec<T>: Clone we check its implementation,
which is impl<T: Clone> Clone for Vec<T>.
However, to use this impl, we also have to check if T: Clone also holds.
Since we have the assumption that T: Clone from uses_vec_clone,
and there are no nested goals from that,
therefore Vec<T>: Clone holds.
Here’s the part where we return to that Discord convo and explain what inductive cycles are
It is important to note that the default Rust trait solving is inductive.
According to this page,
proving something is true by induction must happen without any cycles.
So to inductively prove a goal there should be a finite proof tree.
For example, for the bound Vec<Vec<Vec<Vec<i32>>>>: Debug to hold,
the tree would look like as follows:
Vec<Vec<Vec<Vec<i32>>>>: Debug is provable
Vec<Vec<Vec<i32>>>: Debug is provable
Vec<Vec<i32>>: Debug is provable
Vec<i32>: Debug is provable
i32: Debug is provable (which it is, we have an impl for that)
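We can even watch the compiler walk that finite tree, as a quick sanity check:

```rust
use std::fmt::Debug;

// Compiles only if the trait solver can prove T: Debug
fn assert_debug<T: Debug>() {}

fn main() {
    // Each layer peels off one node of the proof tree, ending at i32: Debug
    assert_debug::<Vec<Vec<Vec<Vec<i32>>>>>();
    println!("proved");
}
```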
Let’s check if Vec<SomeRandomStruct>: Debug holds,
assuming SomeRandomStruct doesn’t implement any trait.
Vec<SomeRandomStruct>: Debug is provable
SomeRandomStruct: Debug is provable (which it isn’t)
The trait resolver can return a success or an error depending on the outcome of proving a bound holds.
However, let’s give an example of a recursive type.
struct List<T> {
    value: T,
    next: Option<Box<List<T>>>,
}
}
To prove that List<T>: Send, we need to construct a proof tree first,
checking each part:
List<T>: Send is provable
T: Send is provable
Option<Box<List<T>>>: Send is provable
Box<List<T>>: Send is provable
List<T>: Send is prova—wait huh
Wait.
This tree is infinitely large.
It’s not well-founded.
We can have a proof tree that is theoretically infinite.
To note, recursive does not mean infinite.
Sometimes recursive things eventually end if there’s a base case.
So this isn’t reliant on induction. This requires something else.
And that’s
coinduction.
According to the rustc-dev guide:
To coinductively prove a goal the provided proof tree may be infinite.
So since infinite proof trees are fine,
a cycle in the proof tree is sufficient for coinductive traits,
with no other requirements needed.
Given List<i32>, we have the proof tree as follows:
List<i32>: Send is provable
i32: Send is provable (yes, i32 is Send)
Option<Box<List<i32>>>: Send is provable
Box<List<i32>>: Send is provable
List<i32>: Send is provable (uh, that’s a cycle)
Since the two terminal branches here are i32: Send and List<i32>: Send,
then we can say that List<i32> is Send as Send is a coinductive trait.
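A quick sanity check that the compiler really does accept the cyclic Send proof:

```rust
// The same recursive List from above
#[allow(dead_code)]
struct List<T> {
    value: T,
    next: Option<Box<List<T>>>,
}

// Compiles only if the trait solver can prove T: Send
fn assert_send<T: Send>() {}

fn main() {
    // The proof tree contains the List<i32>: Send cycle,
    // which is fine because Send is coinductive
    assert_send::<List<i32>>();
    println!("proved");
}
```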
According to the rustc-dev guide,
only auto-traits, Sized,
and WF-goals†† Not sure what this means. Does it mean “well-formed”? are coinductive.
But that’s crazy.
I was that close to borking the compiler.
Anyways.
Helix did add a reply on his original issue,
so all is well in the end.
I mean.
I still can’t use HKTs tho.
…
Goodbye Wrap trait. I will miss you!
This gave me hope though:
However, we’re currently also working on strenghening/formalizing our cycle semantics, and my current expectation is that the instantiated impl should soundly hold:
Rust is a dogshit language that is so void of features
that it is basically borderline useless because
I can’t jerk off to use higher kinded types in my code.
The fact that you can’t have monads
makes this an unserious language
and nobody should use it for anything serious ever.
The fact that this bug will only get fixed
after the next-gen solver lands means the rustc devs
are lazy ass bums and should pull themselves up
by their bootstraps.
But then, attempting to use higher kinded types
made me learn logic and Lean 4 and the word coinduction
which is complicated and therefore if the rustc devs
added support for custom coinductive traits then
everyone will have to learn what an inductive cycle means
which makes Rust a complicated language
and therefore it becomes a bloated and complex mess
and Rust becomes a dogshit language.
Because they’re adding woke to the language.
Like monads.
Lol.
I guess this article is really an excuse for me to write more
and basically make me learn about a bunch of things about type systems.
Which is very interesting, don’t get me wrong!
But wow this shit is complicated.
I wish there was like a collection of resources about these kind of stuff,
but everything is gatekept behind a bunch of pdfs of presentations from lectures in different universities.
So my knowledge of induction is basically from my weird advanced math tutor classes
and the logic books I’ve read before.
And then coinduction is basically from Wikipedia and a bunch of random people on Discord.
My fault for trying to learn an impenetrable topic :clueless:
I guess I should read more books.
And finish that ebook about category theory for programmers.
Wait why am I reading that shit?
I’m a chemist for goddamn sake!
I should read a book about abstract algebra for chemists
so I can understand what the fuck is happening in inorganic chemistry!