A definition of simplicity is perhaps easiest to give by contrast: complexity is the opposite of simplicity.
Complexity occurs in a system when a solution requires too many variables to be juggled at the same time.
To simplify something does not necessarily mean to delete variables from the system.
One can simplify a problem by subdividing it. Divide and Conquer.
I remember being told that humans can handle 7±2 items at a time.
This would imply that one needs to apply Divide-and-Conquer until the number of variables in a subdivision is 7±2.
The number of variables in a system might be more than 7±2, say 100, but no part (subdivision) of the system should have more than 7±2 variables in it.
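As a sketch of this idea (the domain and function names below are hypothetical, invented for illustration), a many-variable computation can be subdivided so that no single function juggles more than 7±2 variables at once:

```python
# Hypothetical illustration: a system subdivided so that no single
# function sees more than 7 +/- 2 variables at a time.

def thermal_load(outdoor_temp, indoor_temp, wall_area, insulation_r):
    # 4 variables: within the 7 +/- 2 limit.
    return wall_area * (outdoor_temp - indoor_temp) / insulation_r

def heating_cost(load, fuel_price, furnace_efficiency):
    # 3 variables: within the limit.
    return load * fuel_price / furnace_efficiency

def monthly_bill(outdoor_temp, indoor_temp, wall_area, insulation_r,
                 fuel_price, furnace_efficiency):
    # The system as a whole has 6 variables here (a real one might
    # have 100), but each subdivision handles only a handful.
    load = thermal_load(outdoor_temp, indoor_temp, wall_area, insulation_r)
    return heating_cost(load, fuel_price, furnace_efficiency)
```

Each subdivision can, in turn, be subdivided further if it grows past the 7±2 limit.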
Physicists simplify problems using approximations.
When the effect of a variable is much less than (≪) the effect of some other variable, physicists make simplifying assumptions — they drop (elide) the former variable from their equations, dealing only with the latter variable(s).
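A minimal numeric sketch of this, using the classic small-angle pendulum approximation: the higher-order terms of sin(θ) are ≪ θ when θ is small, so they are dropped.

```python
import math

# Sketch: the small-angle pendulum approximation. The dropped terms
# are << the leading term when theta is small, so physicists elide them.

def pendulum_period_corrected(length, g, theta):
    # Period including the first correction term of the series expansion.
    t0 = 2 * math.pi * math.sqrt(length / g)
    return t0 * (1 + theta**2 / 16)

def pendulum_period_approx(length, g):
    # Small-angle approximation: the theta-dependence dropped entirely.
    return 2 * math.pi * math.sqrt(length / g)
```

At θ = 0.1 rad the approximation is off by well under 0.1%; at θ = 1 rad the error grows past 5%, which previews the next point: an approximation is only valid within its domain.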
This does not mean that one approximation is valid in every situation.
For example, in his book “Order Out of Chaos”, Nobel Laureate Ilya Prigogine decries the misuse of approximations outside the situations in which they are valid.
Often, people over-use an approximation, or a notation, once it has worked in some domain.
I see this happening in programming, where the question is often “how do I re-cast this problem in a strongly-typed, functional manner?” instead of “what is the problem and how can it best be solved?” and “what approximations and notations are useful in this case?”.
In programming, we see this played out when synchronous notation is misused to solve asynchronous problems (e.g. multitasking and the accidental complexity it brings along) and in “features” like JavaScript’s callbacks.
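To make the callback problem concrete, here is a sketch (with hypothetical names) of the same computation written in direct, synchronous style and in callback style; the callback version turns the control flow inside-out, an accidental complexity imposed by the notation rather than by the problem:

```python
# Hypothetical example: one computation, two notations.

def fetch(key, table):
    return table[key]

# Direct style: reads top to bottom.
def total_direct(table):
    a = fetch("a", table)
    b = fetch("b", table)
    return a + b

# Callback style: each step must be threaded through a continuation,
# nesting the control flow inside-out.
def fetch_cb(key, table, callback):
    callback(table[key])

def total_callbacks(table, callback):
    def got_a(a):
        def got_b(b):
            callback(a + b)
        fetch_cb("b", table, got_b)
    fetch_cb("a", table, got_a)
```

Both produce the same answer; only the notational burden differs.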
Errors in one domain might not be errors in another domain.
For example, a timeout is an error when building a ballistics calculator, but an expected occurrence when building a blockchain node.
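A sketch of the same event interpreted per-domain (the handler names are hypothetical):

```python
# Hypothetical handlers: one timeout event, two domain interpretations.

def handle_timeout_ballistics():
    # In a ballistics calculator, a timeout means the result is
    # unusable: treat it as an error.
    raise RuntimeError("solver timed out: result unusable")

def handle_timeout_blockchain(retries_so_far):
    # In a blockchain node, peers time out routinely: treat it as an
    # expected occurrence and schedule a retry.
    return retries_so_far + 1
```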
The “tells” of misuse of notations and approximations are:
Successful use of divide and conquer reduces not only the number of variables at the input of a subdivision, but also the number of its outputs.
To be able to apply divide-and-conquer through arbitrarily many levels of subdivision, a notation must support reducing both the number of input ports and the number of output ports of each subdivision.
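A minimal sketch of such a notation (the design and names are assumptions, not an existing system): each subdivision may contain many internal wires, but exposes only a small number of named ports to its parent.

```python
# Hypothetical component notation: subdivisions expose few external
# ports, regardless of how many wires they contain internally.

from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    inputs: list                 # names of input ports
    outputs: list                # names of output ports
    children: list = field(default_factory=list)

    def external_port_count(self):
        # Only the external ports count against the 7 +/- 2 budget;
        # the children's ports are hidden inside the subdivision.
        return len(self.inputs) + len(self.outputs)

adder = Component("adder", ["x", "y"], ["sum"])
scaler = Component("scaler", ["sum", "k"], ["result"])
system = Component("pipeline", ["x", "y", "k"], ["result"],
                   children=[adder, scaler])
```

Here `system` exposes 4 external ports even though its children hold more between them, so the subdivision can itself be nested inside a larger one without growing the parent's port count.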