Python usage notes - import related stuff


Import related notes

Specifying import fallbacks

This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.

You've probably seen fallback import tricks (these particular examples are from the Python 2 era) like:

import StringIO
try:
    import cStringIO as StringIO
except ImportError:
    pass

or

try:
    import cElementTree as ElementTree
except ImportError:
    import ElementTree

or

try:
    set
except NameError:
    from sets import Set as set

(For ElementTree you may want something fancier; see Python_notes_-_XML#ET_and_lxml_-_Lineage,_variants,_and_importing.)
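
The same pattern is still common in Python 3, e.g. preferring an optional, faster third-party module over a stdlib one. A minimal sketch (the choice of modules here is just an example):

try:
    import lxml.etree as ET                # optional third-party parser, often faster
except ImportError:
    import xml.etree.ElementTree as ET     # stdlib fallback with a mostly-compatible API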

A reference to the module you're coding in

There are a few ways of getting a reference to the current module object (which is rarely truly necessary, and note that if you need only the names of the members, you can use dir() without arguments).


Apparently the generally preferred way is to evaluate sys.modules[__name__], because this needs no knowledge of where you put that code, so it can be copy-pasted directly. (The variable __name__ is defined in each module and package; it will be '__main__' if the python file itself is run as a script, or if you are running python interactively.)
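
For example (a sketch; list_public_names is just an illustrative helper):

import sys

def list_public_names():
    # sys.modules[__name__] is the module object this code lives in
    this_module = sys.modules[__name__]
    return [name for name in dir(this_module) if not name.startswith('_')]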


Another way is to import the current module by its own name, which (because by then the module is already loaded) has the net effect of just returning that reference and binding it to a name (sketch below the list).

There are a few details to this, including:

  • you shouldn't do this at module-global scope(verify), since the module isn't fully loaded at that point
  • it will work for packages, both by the package's name and by __init__, but there is a difference between those two (possible confusion you may want to avoid): the former only binds the already-loaded module, while the latter is a new name and so may cause a fresh load (possibly of the pyc file Python created), so while it should be the same code it may not be id()-identical (...in case that matters to your use)
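
A sketch of that self-import style, with a hypothetical module mypackage/tools.py:

# mypackage/tools.py  (hypothetical)

def current_module():
    # at call time, mypackage.tools is already in sys.modules,
    # so this import just binds the existing module object
    import mypackage.tools
    return mypackage.tools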

Importing and binding, runtime-wise

In general, importing may include:

  • explicit module imports: you typing import something in your code
  • implicit module imports:
      site-specific details
      anything imported by modules you import
      package-specific details (see the 'importing *, and __all__' section below)
  • binding the module, or some part of it, as a local name


Module imports are recorded in sys.modules, which allows Python to import everything only once.

All later imports fetch the reference to the module from that cache and only bind it in the importing scope.
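
You can see that caching at work, e.g.:

import sys
import json

first = sys.modules['json']
import json as j2          # json is already cached, so this only binds another name
print(j2 is first)         # True: both names refer to the same module object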



Binding specific names from a module

You always import a whole module at a time.

Optionally, you can then bind specific things from within that module, under names of your choosing. This is mostly personal taste - it does not change what gets evaluated during the import.


Say you are interested in the function comma() from format.lists (package format, module lists).

You can do:

import format as fm             # people seem to like this for things like numpy, because it saves typing for something you reference a lot
# binds 'fm'; note fm.lists only works if lists was already imported (e.g. by format/__init__.py)
fm.lists.comma()


import format.lists
# binds 'format', so usable like:
format.lists.comma()


from format import lists
# binds lists (and not format), so:
lists.comma()


from format import lists as L
# locally binds lists as L (and neither format nor lists), so:
L.comma()
 

import format.lists as L
# same as the last
L.comma()


from format.lists import *
# binds all public names from lists, so:
comma()      # and who knows what else is now in your scope


from format.lists import comma
# binds only a specific member
comma()


from format.lists import comma as C
# like the last, but binds to an alias you give it
C()

Packages

For context, a module is a python file that can be imported, and almost any file can be one. The requirements amount to

  • the filename (minus the extension) must be something you can mention in code without it being a syntax error
  • it needs a .py extension

(the import system is more complex and you can get other behaviour, but the default behaviour mostly looks for .py)


Packages are a little extra structure on top of modules, an optional way to organize modules.

A package is a directory with an __init__.py file; that directory will typically contain the module files it is organizing.

A package doesn't have much special status (it's mostly just a module with a funny name) except in how the import system handles further importing and name resolution.
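
So the format/lists example used further down amounts to a layout like:

format/
    __init__.py      # makes the directory a package
    lists.py         # a module inside it, importable as format.lists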



What should be in __init__.py


If you are only using packages to collect modules in a namespacing, tree-structured sort of way, you can leave __init__.py empty

however, sometimes this only means more typing without more structure. A lot of people like keeping things as flat as sensible

(you might still like to have a docstring, an __author__, maybe __version__ (a quasi-standard) in there)


If you want to run some code when the package is first imported (usually some initial setup), that can go in __init__.py

however, as this amounts to module state, it is a sort of shared-global situation that most libraries intentionally avoid


You could put all code in __init__.py -- but that's just an awkward way to make what amounts to a module, with a weird internal name.


If you want to use a package as an API to more complex code, and/or for selective loading of package contents, read on.

You can drag in some PEPs about clean style, but in the end these are mostly just suggestions.

The most convincing argument I've seen centers on "think about what API(s) you are providing".


  • If you leave __init__.py empty, users have to explicitly import the modules they want.
      upsides
          very selective loading
      downsides
          lots of typing
          users have to know the module names already, as the package won't tell them

  • __init__.py itself imports the modules' contents into the package scope
      upsides
          doesn't require users to import parts before use
          lets that package be an API of sorts, cleaner in that it shields some private code users don't care about
          for the same reason, makes it easier to reorganize without changing that API
      downsides
          doesn't allow users to load only a small part

  • __init__.py imports selected parts from submodules
      upsides
          can be cleaner, e.g. letting you mask 'private' parts from the package scope
      downsides
          can be messier, in that it's less obvious where a given name actually comes from

  • __init__.py uses __all__
      see below, but roughly:
      upsides: lets users have some control over whether to load submodules or not
      downsides: a little black-box-magical before you understand the details
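
As a sketch of the 'package as an API' style (the package and module names here are hypothetical):

# mypkg/__init__.py
"""A package exposing a small, stable API."""
__version__ = '0.1'

# pull the public names up into the package scope...
from .formatting import comma, bullets
from .parsing import parse

# ...and declare what   from mypkg import *   should hand out
__all__ = ['comma', 'bullets', 'parse']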


Relative imports
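
Within a package, Python 3 resolves plain import statements absolutely (Python 2's implicit relative imports are gone, see PEP 328). To refer to siblings or parents you use the explicit dotted from-import form, which only works inside a package, not in a script run directly. A sketch, with hypothetical names:

# mypkg/sub/util.py   (hypothetical layout: mypkg/config.py, mypkg/sub/helpers.py, mypkg/sub/util.py)
from . import helpers           # sibling module, i.e. mypkg.sub.helpers
from .helpers import indent     # a specific name from that sibling
from .. import config           # module from the parent package, i.e. mypkg.config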

importing *, and __all__

This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.


Using from something import * asks Python to import all public names from a module or package, and bind them in the scope you said that in.


Within your own code, importing * is often considered bad style, mostly because you cannot easily know what names you are importing into your scope (they are not mentioned in your code, and may change externally), so in the long run you won't know what name collisions that may lead to. Such collisions are not actually common, but that itself is why they can lead to very confusing, well-hidden bugs when they do happen (particularly as same-named things will often do similar things).


It should be much less of an issue for a package to import * from its own modules, because the package's programmer should know exactly what names that implies.

Side note: the package's code doesn't tell you what names will be there at runtime. This can be annoying when users are trying to e.g. find a particular function's code.




import * around modules

  • if there is no __all__ member in a module, all the module-global names that do not start with an underscore (_) are bound
      which means that basic modules can use underscore-prefixed names for global-ish things that won't pollute importers

  • if there is an __all__ member in a module, only the names listed in it are bound
      this lets a programmer minimize namespace cluttering from their own modules. If people want to throw a lot of names onto a pile, they have to work for it
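
For example, a module like this (hypothetical name mod.py):

# mod.py
__all__ = ['public_func']

_cache = {}                # underscore name: never picked up by import *

def public_func():
    return 'hi'

def helper():              # no underscore, but not in __all__, so import * skips it
    return 'still reachable as mod.helper, just not via *'

from mod import * binds only public_func; _cache and helper stay out of your namespace (though import mod; mod.helper() still works).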


import * around packages

  • if there is no __all__ member, it only picks up things already bound in the package's scope (except underscore names)
  • if there is an __all__ member, it seems to go through that list and
      if a name is not already bound as an attribute of the package, tries to import it as a submodule
      if a name is already bound as an attribute of the package, does nothing extra
      and binds only the names in that list

You can mix and match, if you want to confuse yourself.

For example, consider a package like:

# the package's __init__.py
__all__ = ['sub1', 'myvar']
__version__ = '0.1'
import sys
myvar    = 'foo'
othervar = 'bar'
import sub2      # Python 2 implicit relative import; in Python 3 this would be   from . import sub2

...will

  • always import sub2 (it's module-global code, run whenever the package is imported)
  • import sub1 only when you do a from-this-package import *, not when you merely import the package
  • on import *, bind sub1 and myvar, but not othervar or sub2 or sys
  • if __all__ includes a name that is neither bound nor importable as a submodule, you get an AttributeError


This ties into the "what API(s) are you providing" discussion above, in that if you want this selectiveness you probably don't want explicit imports of those submodules in __init__.py as well.




importing from packages
This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.

The examples below assume a package named format with a module in it called lists.

To experiment yourself to see when things happen, try:

mkdir format
echo 'print("format, __init__")'  > format/__init__.py
echo 'print("format, lists")'     > format/lists.py


In an import statement, everything up to the last dot has to be a package/subpackage, and the last part can be a module or a subpackage.

The package itself can also be imported, because an __init__.py file is a module that gets imported when you import the package (or something from it), aliased as the directory name. With the test modules from the last section:

>>> import format.lists
format, __init__
format, lists

The import above bound 'format' at local scope, within which a member 'lists' was also bound:

>>> format
<module 'format' from 'format/__init__.py'>
>>> dir(format)
['__builtins__', '__doc__', '__file__', '__name__', '__path__', 'lists']
>>> format.lists
<module 'format.lists' from 'format/lists.py'>


Modules in packages are not imported unless you (or the package's __init__ module) explicitly do so, so:

>>> import format
format, __init__
>>> dir(format)
['__builtins__', '__doc__', '__file__', '__name__', '__path__']

...which did not import lists.


Note that in Python 2, when you create subpackages, inter-package references are resolved first in the context of the importing package's directory, and only if that fails from the top package (implicit relative imports); Python 3 dropped this, and you use explicit relative or absolute imports instead (see the relative imports section above).

Where import looks

This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.


import looks in all of sys.path.


Which is only a partial answer, because

  • sys.path is altered during interpreter startup, by a few different things, before your code gets control.
  • And your code might add some more,
      e.g. via sys.path.append() (docs: "A program is free to modify this list for its own purposes."[1])
      ...but this is often too late to do some things cleanly - and frankly you rarely get exact control of the order in which everything you import gets imported
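
For example, to see what you actually got, and to add to it (the appended directory is just an example):

import sys

print(sys.path[0])          # usually the directory of the script being run ('' when interactive)
for entry in sys.path:
    print(entry)

sys.path.append('/opt/mylibs')    # hypothetical; only affects imports done after this point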


The parts before you get control are mostly:

  • it tries to put the directory of the script being run at the front of sys.path (i.e. sys.path[0]) so that it takes precedence
      otherwise (interactive use and the like) it is left as '', which signifies the current directory(verify)
      both meaning you can put supporting modules alongside your script


  • it adds the directories listed in the PYTHONPATH environment variable (if set)
      this seems to come after the above, before anything else (verify)
      intended for private libraries, when you have reason to not install them into a specific python installation (...or you can't), or perhaps to override with a specific version(verify)
      avoid using this to switch between python installations - that is typically better achieved by calling the specific python executable (to get its configured overall path config)
      avoid using this to import specific modules from other python installations (just installing into both is likely to be cleaner and lead to less confusion)


  • it does an import site[2], which (unless -s or -S blocks things) leads to roughly four things:
      add site packages (via site.addsitedir)
          combines the compiled-in values of sys.prefix and sys.exec_prefix (often both /usr/local) with a few suffixes, roughly ending up at lib/pythonversion/site-packages (verify) (except on debian/ubuntu, see the note below; there may be other distros that customise site.py)
          setting PYTHONHOME overrides that compiled-in value with the given value.[3]
              fully isolated (see virtualenv) or embedded[4] pythons want to do this. Most other people do not, in part because of how it does (not) carry over to subprocess calls
      add user site packages (via site.addsitedir) (see also PEP 370, since roughly py2.6(verify), introduced to allow installation into home directories without requiring virtualenv)
          similar, but looking within ~/.local (platform-specific, actually), so ending up with something like ~/.local/lib/pythonversion/site-packages
          which is added only if it exists and has an appropriate owner
          that user base (~/.local) can be overridden with PYTHONUSERBASE
      import sitecustomize
          e.g. for site-wide development tools like profiling, coverage, and such
          note that 'site' here still means a specific python installation
      import usercustomize (after sitecustomize)
          e.g. for user-specific development tools like profiling, coverage, and such
          can be disabled (e.g. for security); e.g. the -s parameter avoids adding the user site directory to sys.path (and also sets site.ENABLE_USER_SITE to False)
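
If you want to check what your particular interpreter ended up with, the site module can report most of this (getsitepackages/getusersitepackages may be absent in some older virtualenv setups):

import site, sys

print(sys.prefix)                     # base of this python installation (or of the venv)
print(site.getsitepackages())         # system site-packages directories
print(site.getusersitepackages())     # per-user site-packages, e.g. ~/.local/lib/pythonX.Y/site-packages
print(site.ENABLE_USER_SITE)          # whether the user site directory gets added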



Notes:

  • If you customize some of this, you need to think hard about whether scripts run via subprocess will or won't get the same alterations.


  • pip installs into
    • if a virtualenv is enabled: your virtualenv's site-packages
    • if using --user: your user site-packages
    • otherwise (i.e. default): the system site-packages
        if run without sudo, it should fail and suggest --user


  • site.addsitedir() amounts to
    • sys.path.append() plus
    • looking for and handling *.pth files


  • .pth files are intended to list further directories to add to sys.path (allows namespace-ish things)
      lines that start with 'import' are instead executed, apparently intended for slightly more intelligent search-path alteration where necessary
      ...but since such lines allow semicolons, you can abuse this for varied inline code
      There is a move to design this away - once all the useful things it is used for (package version selection, virtual environment chaining, and some others) exist properly
      e.g. take a peek at your site-packages's .pth files

  • dist-packages is not standard python, it is a debian/ubuntu convention
      these distros say that python packages installed via apt go into dist-packages, and the site-packages directory that would normally be the target is not used
      pip will be installed that way, and will also install there
      this is intended so that things the admin installs with apt or pip go to a specific system directory (dist-packages), while e.g. your own custom-compiled python won't know about this and will install into its own site-packages
      which is considered better isolated, but not necessarily very clear to understand
      dist-packages is baked into the system python's site.py(verify)





Packaging python packages

Probably see Python packaging, which is also contextualized around virtual environments.