IoT means Internet of Things.
Basically, this means that each device has its own CPU.
CPUs are cheap now, so designs that use many CPUs become practical. This idea could not have been imagined, let alone realized, in the 1950s, when CPUs were expensive.
What Has Changed?
Which old-fashioned assumptions are no longer true in IoT?
CPUs are cheap now.
Each device can have its own CPU.
It is no longer necessary to slice CPU time using time-sharing.
Memory is cheap now.
Each device can have its own memory.
It is no longer necessary to share memory.
[Some applications need to transfer large amounts of data through memory, and such applications still need memory-sharing. Memory-sharing should be the exception, not the rule.]
Yet, we continue to put full-blown operating systems - with full preemption and memory-sharing - onto small CPUs, like Raspberry Pis.