Try C#. It will help you unravel a lot of your misconceptions about async.
Half the languages you picked have fake async, yes, but to go truly concurrent you need to start dealing with threads and locks, and prove (via benchmark) that you're running on separate cores.
Your insinuation that high-level langs (or even low-level ones like C) merely simulate async (or concurrency, I think you meant) is inaccurate - we've been running concurrent code forever.
It's just been up to language designers how well threads are abstracted (or not) and how processes are terminated (or not).
For example, you can run concurrent Bash if you put '&' between scripts or commands.
Is that an 'abstraction'? No. It's telling the OS "Hey, run this as a separate process, please", and the scheduler is free to put it on another core.
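To make that concrete, here's a rough C# sketch of what `cmd1 & cmd2` does in Bash: ask the OS to start independent processes and let the scheduler place them wherever it likes. This assumes a Unix-like system with a `sleep` command standing in for real work; the names are just for illustration.

```csharp
using System;
using System.Diagnostics;

class BackgroundJobs
{
    static void Main()
    {
        // Rough analogue of Bash's `sleep 2 & sleep 2 &`:
        // start two independent OS processes without waiting on either.
        // (Assumes a Unix-like system where a `sleep` binary exists.)
        var job1 = Process.Start("sleep", "2");
        var job2 = Process.Start("sleep", "2");

        // Both processes now exist; the OS scheduler decides where they run.
        Console.WriteLine($"Started PIDs {job1.Id} and {job2.Id}");

        // Unlike a fire-and-forget `&`, here we choose to wait for them.
        job1.WaitForExit();
        job2.WaitForExit();
        Console.WriteLine("Both jobs finished");
    }
}
```

Run it and both sleeps finish in about 2 seconds total, not 4 - the OS really did run them at the same time.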
Now, where you're somewhat correct that there's simulation is when two processes land on the *same* core and have to share memory and CPU time. THEN the scheduler steps in to time-slice the CPU and decide whose instructions run when (and the OS arbitrates memory access if it's available/unlocked).
This has nothing to do with any high-level language's async syntax or implementation; it's a native OS mechanism.
Therefore, saying these languages (or the OS itself) are simulating async threads or even full-on concurrency is inaccurate.
If you want full concurrency, C# has async Tasks that you can run and they will be truly concurrent, but only if you take the time to learn to do it right.
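A minimal sketch of that, and of the benchmark-style proof mentioned earlier: put CPU-bound work on thread-pool Tasks and compare the wall-clock time against doing the same work sequentially. On a multi-core machine the parallel version finishes in roughly half the time. The `Spin` helper and the iteration count are made up for illustration.

```csharp
using System;
using System.Diagnostics;
using System.Threading.Tasks;

class TrueConcurrency
{
    // Purely CPU-bound work: no I/O, nothing to "await around".
    static long Spin(long iterations)
    {
        long acc = 0;
        for (long i = 0; i < iterations; i++) acc += i % 7;
        Console.WriteLine($"Worker ran on thread {Environment.CurrentManagedThreadId}");
        return acc;
    }

    static async Task Main()
    {
        const long n = 500_000_000; // illustrative; tune for your machine
        var sw = Stopwatch.StartNew();

        // Task.Run hands the work to thread-pool threads, which the OS
        // is free to schedule on separate cores.
        Task<long> a = Task.Run(() => Spin(n));
        Task<long> b = Task.Run(() => Spin(n));
        await Task.WhenAll(a, b);
        Console.WriteLine($"Two workers in parallel: {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        Spin(n);
        Spin(n);
        Console.WriteLine($"Same work sequentially:  {sw.ElapsedMilliseconds} ms");
    }
}
```

If the parallel run takes about half the sequential time and the workers report different thread IDs, you've got your proof of separate cores.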
Same with Bash, though you risk one or more background processes dying before completion, or running out of memory, with little control over them.
So, "nothing in computing is truly asynchronous at the hardware and instruction level" is not only a "hot take", but one that falls apart the moment you examine software like the OS.
You might say, "I meant strictly hardware". Well, you're wrong there too. If hardware couldn't process any task asynchronously or concurrently, then IBM would have been out of business long ago.
I think you're trying to fuse two ideas together without properly explaining the components.
Yes, engines like the one in the Chrome browser fit the bill for what you're describing - fake async. 100% agree.
But not everything, and not your full list (I'm sure a Golang dev can step in and defend their concurrency model as truly async, which it is from what I've heard).
I could be wrong about these things, but from what I was taught in school and from what I've observed, I believe there are errors here.
Hot Take: Nothing in computing is truly asynchronous at the hardware and instruction level.
Everything is synchronous, until a runtime fakes it.
People talk about async like it's a magical property of the universe.
But at the lowest level, every system is ultimately synchronous.
-CPUs appear to execute instructions in order (even though they reorder internally)
-Syscalls block unless the OS schedules around them
-Hardware is event-driven but still bound by clocks
-“Async” functions don’t run in parallel; the runtime schedules them
So what’s actually happening?
Async is an illusion built by runtimes:
-JavaScript: event loop + callback/microtask queue
-Rust: executor + waker + poll-based futures
-Python: event loop + coroutines
-Go: scheduler multiplexing goroutines over OS threads
-OS kernels: interrupts + preemptive scheduling
Under the hood, it’s all synchronous execution broken into chunks and interleaved by a scheduler.
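A small sketch of that chunking, in C# to stay consistent with the examples above (the same pattern holds for each runtime listed): an async method is plain synchronous code until its first `await`, at which point the runtime parks the remainder as a continuation and hands control back. `Demo` is a hypothetical method, purely for illustration.

```csharp
using System;
using System.Threading.Tasks;

class ScheduledSteps
{
    static async Task Demo()
    {
        // This line runs synchronously, on the caller's thread.
        Console.WriteLine($"Demo (before await) on thread {Environment.CurrentManagedThreadId}");

        await Task.Delay(100); // the runtime suspends here and schedules the rest

        // This "synchronous fragment" runs later, wherever the scheduler puts it.
        Console.WriteLine($"Demo (after await)  on thread {Environment.CurrentManagedThreadId}");
    }

    static async Task Main()
    {
        Console.WriteLine($"Main on thread {Environment.CurrentManagedThreadId}");

        Task t = Demo();  // executes Demo inline, up to the first await
        Console.WriteLine("Control is back in Main before Demo finished");

        await t;          // now wait for the scheduled remainder
    }
}
```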
-Async ≠ “parallel”
-Async ≠ “multithreaded”
-Async = “scheduled synchronous steps that look concurrent”
The runtime just decides when your synchronous fragments run.
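One more sketch of that idea: two loops whose iterations are synchronous chunks, with `await Task.Yield()` as the seam where the scheduler may switch between them. Neither needs its own dedicated thread to look concurrent. (`Worker` is an illustrative name; exact output order isn't guaranteed.)

```csharp
using System;
using System.Threading.Tasks;

class Interleaving
{
    // Each iteration is a synchronous chunk; Task.Yield is the seam
    // where the scheduler can switch to the other task.
    static async Task Worker(string name)
    {
        for (int i = 0; i < 3; i++)
        {
            Console.WriteLine($"{name}: step {i}");
            await Task.Yield();
        }
    }

    static async Task Main()
    {
        // The two workers' chunks are interleaved by the scheduler,
        // even though each individual step runs synchronously to completion.
        await Task.WhenAll(Worker("A"), Worker("B"));
    }
}
```

Typical output alternates A's and B's steps - concurrency in appearance, scheduling in fact.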
In summary:
Every system is synchronous at its core. Async is a runtime abstraction that slices sync operations into interleaved steps that feel concurrent.
That's the core concept of asynchrony.
I'm always open to feedback, because through your feedback I learn a lot.