Python usage notes/joblib


This article/section is a stub — probably a pile of half-sorted notes, is not well-checked so may have incorrect bits. (Feel free to ignore, fix, or tell me)


Joblib is a library that serializes jobs, executes them on demand and in parallel, and offers memoization to avoid doing the same work twice.


It has a few options that let you bolt it onto existing code, rather than requiring you to write code towards a specific API.

It's numpy-aware - and should deal okayish with large arrays, compressing them where that's easy.



Memoization

joblib.Memory is disk-backed memoization, which you

could use to decorate functions
could wrap in more explicitly, to do the occasional checkpoint
could wrap into every Parallel call
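A minimal sketch of the decorator style (the cache directory location and the toy function are arbitrary choices here):

```python
import time
from joblib import Memory

# cache results on disk under ./joblib_cache (arbitrary location)
memory = Memory("./joblib_cache", verbose=0)

@memory.cache
def slow_square(x):
    time.sleep(0.1)   # stand-in for real work
    return x * x

slow_square(12)   # computed, then the result is written to disk
slow_square(12)   # same arguments, so fetched from the on-disk cache
```

The cache key is based on the function's arguments (and its source), so changing either triggers a recompute.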


Parallel execution

joblib.Parallel is like multiprocessing with a few extra nice details, and supports more than one backend:

a thread-based backend - lower overhead, but not always faster (consider GIL stuff)
process-based backends (loky, multiprocessing) - more overhead, but can be safer, and sidestep the GIL for CPU-bound work
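A minimal sketch of choosing a backend (n_jobs=2 is an arbitrary choice):

```python
from math import sqrt
from joblib import Parallel, delayed

# process-based backend (loky, the default): more startup overhead,
# but sidesteps the GIL for CPU-bound work
results = Parallel(n_jobs=2)(delayed(sqrt)(i) for i in range(5))

# thread-based backend: lower overhead, fine when the work
# releases the GIL (I/O, much of numpy)
results_t = Parallel(n_jobs=2, backend="threading")(
    delayed(sqrt)(i) for i in range(5)
)
```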


joblib.dump() and joblib.load() help serialize numpy data (and in general handle more types than plain pickle does).

This is however not a portable format: it is pickle-based, and only guaranteed to load in the exact same version of Python it was saved with, not between versions. (verify)
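For example (the file path is arbitrary; compress is optional and trades CPU time for smaller files):

```python
import os
import tempfile

import numpy as np
import joblib

arr = np.arange(10)

# dump to a temporary file; compress=3 trades CPU for smaller files on disk
path = os.path.join(tempfile.mkdtemp(), "arr.joblib")
joblib.dump(arr, path, compress=3)

restored = joblib.load(path)
```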



delayed() is basically a clean way to hand a function and its arguments to Parallel, without accidentally calling that function and doing its work in the main interpreter.

For example, the example from [2] parallelizes

[sqrt(i ** 2)  for i in range(10)]

which works out as something like

Parallel( n_jobs=2 )( delayed(sqrt)(i ** 2)  for i in range(10) )

(Note that i ** 2 is still computed in the main process, before the work is handed off(verify))
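Under the hood, delayed(f)(args) does not call f at all; it just packs up the call for a worker to execute later. A sketch of that:

```python
from math import sqrt
from joblib import delayed

# delayed(sqrt) returns a wrapper; calling that wrapper returns a
# (function, args, kwargs) tuple instead of running sqrt
packed = delayed(sqrt)(3 ** 2)   # note: 3 ** 2 is evaluated right here

func, args, kwargs = packed      # what the worker will eventually call
```

This is also why argument expressions (like the i ** 2 above) are still evaluated in the caller.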




https://joblib.readthedocs.io/en/latest/