Synchronous, asynchronous

From Helpful
Revision as of 18:44, 15 January 2024 by Helpful
Some fragmented programming-related notes, not meant as introduction or tutorial



This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.


In programming

sync/async execution

synchronous execution is typically used in the sense of "there is one main thread, and it makes an active decision about what happens next"
asynchronous execution refers to cases where the reason that code gets executed is less predefined/centralized
whether that's because things can happen independently of the main program flow
e.g. threading, where there are independent flows of execution
e.g. an event loop setup, which has tasks that yield to each other
e.g. interrupts, where a hardware line (or CPU opcode) basically forces a reaction ASAP
or because other things may ask for / queue tasks, signal that they have completed, etc.
async also relates to the mental models that help you think about this, and the practical mechanisms that make it work (think polling, callbacks, event loops, iterators, coroutines, etc.)
async also refers to a keyword in some languages that models this sort of execution - see e.g. async and await
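As a concrete sketch of that async/await style - using Python's asyncio here purely as an example language, not because this page is tied to it - two tasks are handed to an event loop, which decides when each one runs:

```python
# Minimal cooperative async execution with Python's asyncio (stdlib only).
import asyncio

async def worker(name, delay):
    # await yields control back to the event loop until the sleep completes
    await asyncio.sleep(delay)
    return name

async def main():
    # both workers are scheduled; the event loop interleaves them
    return await asyncio.gather(worker("a", 0.02), worker("b", 0.01))

print(asyncio.run(main()))  # ['a', 'b'] - gather preserves argument order
```

Note that neither worker decides when it runs; it only marks (with await) the points where it is willing to hand control back.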


sync/async IO

synchronous IO, also known as blocking IO, usually refers to an IO call that can/will make you wait until it's done - the execution of the calling program stops until the function says it's done.

(like any function call, really, but pointing out that the function is waiting on IO to complete, rather than on the CPU doing calculation)
This is the simplest way of safely doing IO, both for the underlying library and OS, and for a program
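For illustration (Python is an arbitrary language choice for this sketch): a blocking read simply does not return until data is available.

```python
# A blocking read on a pipe: os.read() does not return until there is
# data to hand back (or EOF). Here data is written first, so it returns
# immediately - but nothing about the call itself guarantees that.
import os

r, w = os.pipe()
os.write(w, b"hello")
data = os.read(r, 5)   # would block here if the pipe were still empty
os.close(r)
os.close(w)
print(data)  # b'hello'
```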


asynchronous IO describes any situation where IO doesn't block execution.

This is useful when your code could be doing other work in the meantime (other CPU work, other IO)

usually these are requests you fire off and check back on later
often about the practical question: "could we have gotten other things done in the program, instead of completely halting?"

It usually indicates some amount of concurrency -- which is often abstracted away for you, at least a little.
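That "fire off and check back later" pattern can be sketched with the standard-library selectors module (Python again, as an arbitrary choice; the socketpair stands in for what would really be a network connection, and real code would need error handling):

```python
# Register a non-blocking socket with a selector, then only read from it
# once the selector reports it is actually ready - rather than blocking.
import selectors
import socket

a, b = socket.socketpair()
a.setblocking(False)
b.setblocking(False)

sel = selectors.DefaultSelector()
sel.register(a, selectors.EVENT_READ)

b.send(b"ping")                   # the "other end" produces some data
for key, events in sel.select(timeout=1):
    data = key.fileobj.recv(4)    # read only now that we know it's ready
    print(data)                   # b'ping'

sel.unregister(a)
a.close()
b.close()
```

Between calls to select(), the program is free to do other work; that freedom is the whole point.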


Notes:

  • there is often a useful distinction between whether a program does sync or async (sys)calls towards the kernel/OS
...because a multitasking kernel/OS's IO subsystem is almost necessarily asynchronous
...and that is intentionally a design requirement for its parts
...and ideally its task scheduler is aware of IO, because that makes it easy to not waste time scheduling tasks that are currently blocked on IO anyway
in some ways we're just moving the blocking to another place (specific parts of the OS's IO subsystem), yet this has made a lot of sense since multitasking became a thing: the OS is nicely parallelized, and it is also the one scheduling you, meaning it can be clever in ways you no longer have to be.
  • relatedly, if a program does a blocking (sync) call towards the OS, the OS may have set up a smallish buffer that accepts some data immediately (for it to get to soon, and separately), which means writes do not always actually block the calling program (and may do so only rarely, particularly for smaller or rarer writes)
  • if you consider IO and CPU different subsystems, then sync is using them sequentially, and async is using them concurrently.
so async can be more efficient (under certain assumptions)
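That sequential-versus-concurrent difference can be made concrete with a small timing sketch (again in Python's asyncio, as an assumption of this page's examples): two simulated 50 ms IO waits take roughly 100 ms back to back, but roughly 50 ms when overlapped.

```python
# Compare waiting on two "IO operations" sequentially vs concurrently.
import asyncio
import time

async def fake_io():
    await asyncio.sleep(0.05)   # stands in for a 50 ms IO wait

async def sequential():
    await fake_io()
    await fake_io()

async def concurrent():
    await asyncio.gather(fake_io(), fake_io())

t0 = time.perf_counter()
asyncio.run(sequential())
seq = time.perf_counter() - t0

t0 = time.perf_counter()
asyncio.run(concurrent())
conc = time.perf_counter() - t0

print(f"sequential: {seq:.3f}s, concurrent: {conc:.3f}s")
```

The "certain assumptions" matter: the gain only appears when the waits can actually overlap, i.e. when the work is IO-bound rather than CPU-bound.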



See also:

sync/async servers

sometimes: delayed loading

'async' sometimes just means some variant of "when there's nothing else on my TODO list", or lazy evaluation

For example, putting async on a JS script tag means "whenever you get to it, I've got no dependencies to worry about"

In electronic signalling