The new policy is designed to reduce the load on people packaging Python modules and, in doing so, to ease the transitions that occur as new Python versions are introduced, old ones are removed, and the default version of Python changes, with minimal impact on the target system. As far as possible, Python version transitions are handled by automated processes; where the process cannot be fully automated, only a minimal-effort recompilation should be required.
The need to support more than one version of the Python runtime, or different implementations, was recognized. It takes a while for applications to support new versions of Python, and supporting multiple versions of Python is essential for a smooth transition period.
The old scheme of using pythonX.Y-foo packages lands packages in the NEW queue whenever support for another Python runtime is added to a package. Since this requires manual intervention, supporting a new version of Python entailed manual processing and often delayed support for the new version of Python.
Having pythonX.Y-foo packages mentioned in the control file disallows binary NMUs in situations where a Python runtime is dropped or added, since the control file needs to be regenerated.
Putting extension modules for more than one Python version into a single package eases the transition of these packages to the testing distribution, provided that the package supports both the default Python version in testing and the default Python version in unstable. [1]
Ease the manual intervention required when any of the following events occurs:
Most pure Python modules with no restrictions on the versions of Python supported, and those pure Python modules that only have a lower bound on the versions supported (for example, ">= 2.3", or "all"), require no upload; they are merely recompiled using the rtupdate hooks, or by utilities like python-central and python-support that hook themselves into rtupdate.
A number of public extension modules that do not have restrictions on the version of Python supported can just be recompiled. These include packages like:
Packages using $(shell pyversions -s) to determine which Python versions to build for at build time, and which build-depend upon python-all-dev, would work seamlessly.
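For illustration, such a debian/rules fragment might look like the following sketch (the build-stamp target and the use of setup.py here are hypothetical; the point is only that the version list comes from pyversions rather than being hard-coded):

```makefile
# Sketch of a debian/rules fragment: build once per supported
# Python version. PYVERS is the list reported by pyversions -s
# (the versions supported by both the package and the archive).
PYVERS := $(shell pyversions -s)

build-stamp:
	# $$py expands to e.g. python2.3, python2.4, ...
	for py in $(PYVERS); do \
		$$py setup.py build; \
	done
	touch $@
```

Because the list is computed at build time, adding or dropping a Python version in the archive changes what gets built without any edit to debian/rules.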
Some packages using CDBS will also work out of the box.
Packages using the distutils build system should also work.
Private modules are only built for one Python version, usually the default (pyversions -d). Private modules that do not have restrictions on the version of Python required, or which intelligently use $(shell pyversions -s) together with their internal restrictions to discover which version of Python to build for, would also merely need to be recompiled.
The only packages that need manual upgrades are those that require a set of Python versions not containing the Python version current at upload time, plus a couple of other cases. This should reduce the number of packages that need a non-automated action from the maintainer (such as a new source upload), and so make changing the default Python version, or adding or dropping versions of Python, much less painful than under the old policy.
The new policy aims to reduce the pressure on packagers when the default Python version (what /usr/bin/python points to) changes. In that case, it tries to:
Any package that does not need to change need not be rebuilt or uploaded. Pure Python modules, whether public or private, should be byte-compiled automatically on the target machine using the rtupdate mechanism.
Packages that already have an extension installed for the new version of Python do not need to change either, since they continue to work.
The new policy also aims to reduce the pressure when an old version of Python is dropped, since only packages directly dependent on that version will need a manual upgrade (this includes, for example, any package with scripts that use /usr/bin/pythonX.Y).
The new policy also reduces the number of packages in the archive by supporting multiple versions of Python in the same binary package (at the cost of increasing that one package's size, though this should still result in an overall space saving). [2]
This presupposes that the package build mechanism uses the pyversions utility to get information about the supported Python versions, the currently installed Python executables, and the default Python runtime, instead of hard-coding values in the debian/rules script. The same utility can also parse the value of the XS-Python-Version field in debian/control, or the debian/pyversions file.
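To illustrate the kind of computation involved, the following Python sketch (a hypothetical helper, not the actual pyversions implementation) intersects a debian/pyversions-style restriction such as "2.3-" or "2.3-2.4" with the versions installed on the system:

```python
# Sketch (not the real pyversions code): intersect a
# debian/pyversions-style restriction with installed versions.

def parse_restriction(spec):
    """Parse a spec like '2.3', '2.3-', '-2.4' or '2.3-2.4' into
    (lower_bound, upper_bound); None means unbounded on that side."""
    spec = spec.strip()
    if "-" in spec:
        lo, hi = spec.split("-", 1)
        return (lo or None, hi or None)
    return (spec, spec)          # exact version

def version_key(v):
    # '2.10' must sort after '2.9', so compare numerically
    return tuple(int(part) for part in v.split("."))

def supported(spec, installed):
    """Return the installed versions that satisfy the restriction."""
    lo, hi = parse_restriction(spec)
    result = []
    for v in installed:
        if lo is not None and version_key(v) < version_key(lo):
            continue
        if hi is not None and version_key(v) > version_key(hi):
            continue
        result.append(v)
    return result

print(supported("2.3-", ["2.2", "2.3", "2.4"]))  # → ['2.3', '2.4']
```

A package whose restriction yields an empty intersection with the archive's versions is exactly the kind of package that still needs a manual upload.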
Another consequence of the current design: the default Python version has to be installed; other supported versions can be installed in addition, not as a replacement.
[1] | The older policy required an extra upload of every package containing an extension, adding new dependencies on new shared libraries present in unstable but not yet in testing. It also tended to require all packages that demanded a version of Python strictly less than a specific version to move into testing at the same time, creating long periods during which packages were blocked. |
[2] | The two cases where this happens are
|