Hot Take: Nothing in computing is truly asynchronous at the hardware and instruction level.
Everything is synchronous, until a runtime fakes it.
People talk about async like it's a magical property of the universe.
But at the lowest level, every system is ultimately synchronous.
- CPUs appear to execute instructions in order (even though they reorder internally)
- Syscalls block unless the OS schedules around them
- Hardware is event-driven but still bound by clocks
- “Async” functions don’t run in parallel; the runtime schedules them (see the sketch below)
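A minimal Python sketch of that last point (asyncio, made-up function names, purely illustrative): declaring a function async doesn’t make its body run in parallel. A CPU-bound loop with no await holds the event loop’s single thread, and nothing else runs until it finishes.

```python
import asyncio
import time

async def cpu_bound() -> int:
    # Declared async, but the body is one big synchronous chunk with no `await`:
    # once it starts, nothing else on the event loop can run until it returns.
    total = 0
    for i in range(10_000_000):
        total += i
    return total

async def ticker() -> None:
    for _ in range(3):
        print(f"tick at {time.perf_counter():.2f}s")
        await asyncio.sleep(0.1)

async def main() -> None:
    # Both run on the same single-threaded event loop; every tick appears only
    # after cpu_bound() has finished its loop.
    await asyncio.gather(cpu_bound(), ticker())

asyncio.run(main())
```

Run it and every “tick” prints only after the heavy loop is done.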
So what’s actually happening?
Async is an illusion built by runtimes:
- JavaScript: event loop + callback/microtask queue
- Rust: executor + waker + poll-based futures
- Python: event loop + coroutines (see the sketch after this list)
- Go: scheduler multiplexing goroutines over OS threads
- OS kernels: interrupts + preemptive scheduling
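To make one of those concrete, a small Python/asyncio sketch (the worker name is mine): two coroutines interleave on a single event loop, and both report the same OS thread id, so the concurrency comes from scheduling, not from extra threads.

```python
import asyncio
import threading

async def worker(name: str, steps: int) -> None:
    for i in range(steps):
        # Each iteration is a synchronous step; `await` hands control back to the
        # event loop, which runs the other coroutine's next step on the same thread.
        print(f"{name} step {i} on thread {threading.get_ident()}")
        await asyncio.sleep(0)

async def main() -> None:
    await asyncio.gather(worker("A", 3), worker("B", 3))

asyncio.run(main())
# Steps from A and B interleave, all on the same thread id:
# concurrency from scheduling, not parallelism.
```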
Under the hood, it’s all synchronous execution broken into chunks and interleaved by a scheduler.
- Async ≠ “parallel”
- Async ≠ “multithreaded”
- Async = “scheduled synchronous steps that look concurrent”
The runtime just decides when your synchronous fragments run.
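You can see the whole trick without any async keyword at all. Here’s a toy round-robin scheduler over plain Python generators (a sketch of the idea, not how any real runtime is built): each yield ends one synchronous chunk, and the scheduler decides when the next chunk runs.

```python
from collections import deque

def task(name: str, steps: int):
    # A plain generator: each `yield` marks the end of one synchronous chunk.
    for i in range(steps):
        print(f"{name}: chunk {i}")
        yield  # suspend here; the scheduler decides when the next chunk runs

def run(tasks) -> None:
    # Round-robin scheduler: resume one task for one chunk, then rotate.
    queue = deque(tasks)
    while queue:
        current = queue.popleft()
        try:
            next(current)          # run exactly one synchronous chunk
        except StopIteration:
            continue               # task finished; drop it
        queue.append(current)      # not finished; give the other tasks a turn first

run([task("A", 2), task("B", 2)])
# A: chunk 0, B: chunk 0, A: chunk 1, B: chunk 1 -- interleaved, never parallel.
```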
In summary:
Every system is synchronous at its core. Async is a runtime abstraction that slices sync operations into interleaved steps that feel concurrent.
That’s the core idea behind asynchronous programming.
I’m always open to feedback, because it’s how I learn the most.
I think you see async in a very narrow sense. Formally, a function f: T -> U is async if it guarantees to eventually produce a U when called. f is sync if it additionally guarantees to produce the U before the call returns. And that’s it. No CPU, nothing. Just maths.
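One way to write that definition down (my notation, not the replier’s): let t_done(f, x) be when the result in U becomes available and t_ret(f, x) be when the call itself returns.

```latex
% Sketch of the reply's definitions in my own notation (not the replier's).
% t_done(f, x): time the result u = f(x) in U becomes available.
% t_ret(f, x):  time the call f(x) returns to its caller.
\[
\mathrm{async}(f) \iff \forall x \in T,\; t_{\mathrm{done}}(f, x) < \infty
\]
\[
\mathrm{sync}(f) \iff \mathrm{async}(f) \,\wedge\, \forall x \in T,\; t_{\mathrm{done}}(f, x) \le t_{\mathrm{ret}}(f, x)
\]
```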

