TOE means Theory of Everything.
I don't have the answers, but I have some observations…
Code boils down to two operations: breathing in and breathing out.
Breathing in usually involves pattern matching.
Breathing out usually means rearranging the stuff that got breathed in.
Pattern matching: use PEG.
PEG is "better" than REGEXPs in that PEG can match nested patterns.
The theoretical name for pattern matching is parsing, and the underlying machine is the push-down automaton (PDA). Theory gave us YACC. PEG is not YACC. PEG is easier than YACC.
The theory behind REGEXPs is called scanning, and the underlying machine is the finite-state automaton. Theory gave us LEX. PEG makes LEX obsolete.
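To make the nesting point concrete, here is a minimal sketch of a PEG-style recursive-descent matcher for balanced parentheses (the function names are mine, not from any PEG library). A classic REGEXP cannot match this language, because it has no recursion; a PEG rule can call itself:

```python
# PEG-style grammar (illustrative): Group <- '(' Group* ')'
# A recursive rule like this is exactly what REGEXPs cannot express.

def match_group(text, pos=0):
    """Try to match one '( ... )' group starting at pos.
    Return the position just past the group, or None on failure."""
    if pos >= len(text) or text[pos] != '(':
        return None
    pos += 1
    # Zero or more nested groups.
    while pos < len(text) and text[pos] == '(':
        inner = match_group(text, pos)
        if inner is None:
            return None
        pos = inner
    if pos < len(text) and text[pos] == ')':
        return pos + 1
    return None

def is_balanced(text):
    """True when the whole string is one balanced group."""
    return match_group(text) == len(text)
```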
FP languages use pattern matching.
Data boils down to one universal data structure - the triple.
All other data structures are ways to rearrange and optimize triples.
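A sketch of that claim, with made-up data: flatten records into (entity, attribute, value) triples, then rebuild the "optimized" nested structure mechanically from the triples:

```python
# Everything-is-a-triple: each fact is (entity, attribute, value).
# The entities and attributes here are illustrative, not from the post.

triples = [
    ("bob", "age", 42),
    ("bob", "city", "Austin"),
    ("alice", "age", 37),
]

def to_records(ts):
    """Rebuild a nested-dict 'record' structure from flat triples."""
    records = {}
    for entity, attr, value in ts:
        records.setdefault(entity, {})[attr] = value
    return records
```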
In 1950s-era thinking, it was a good idea to optimize data structures.
I'm not so sure that it's a good idea to optimize data structures in the 2020s. Certainly not at "compile time".
We used to believe that optimizing memory and CPU usage was critical.
I believe that optimizing human-thought-effort is more critical, now.
My further thoughts on this subject are:
Optimizations thwart efforts to automate processes.
We saw this problem happen in the early days of C and Pascal. Assembler programmers thought that they could optimize better than compilers could. Then GCC came along. Now, we all use HLLs (high-level languages) and don't imagine writing code in assembler.
Make it boring.
Make it repetitive.
Then write code that automates and optimizes the boring and repetitive parts. We already know how to do that, e.g. peephole optimizers, compiler technologies, etc.
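A toy peephole optimizer over triple-shaped instructions, to show that the boring, repetitive form is exactly what is easy to automate (the instruction set here is invented for illustration):

```python
# Peephole optimization sketch: instructions are triples (op, src, dst).
# One classic peephole rule: a push immediately followed by a pop
# is just a move. The instruction names are hypothetical.

def peephole(code):
    out = []
    for instr in code:
        # Fuse ("push", x, _) followed by ("pop", _, y) into ("move", x, y).
        if out and out[-1][0] == "push" and instr[0] == "pop":
            src = out.pop()[1]
            out.append(("move", src, instr[2]))
        else:
            out.append(instr)
    return out
```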
How do you make data boring? Deconstruct it into triples.
How do you make code boring? Deconstruct it into triples.
(a, b, c) is a triple. And
[a, b, c] is a triple,
and PROLOG can write triples.
a(b, c). is a triple in PROLOG-ese.
(a, b, c, d) is not a triple.
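One hypothetical way to deconstruct code itself into triples: write every expression as (operator, left, right), then evaluate the triples mechanically. Everything below is illustrative:

```python
# Code-as-triples sketch: an expression tree where every node is
# (operator, operand1, operand2) and operands may themselves be triples.

import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def eval_triple(t):
    """Recursively evaluate an (op, a, b) triple."""
    op, a, b = t
    a = eval_triple(a) if isinstance(a, tuple) else a
    b = eval_triple(b) if isinstance(b, tuple) else b
    return OPS[op](a, b)
```

So (1 + 2) * 4 becomes the triple ("*", ("+", 1, 2), 4).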
Don't use triples if you think that your optimizer needs more edge-cases.
How do you get more-interesting data structures if everything is reduced to triples?