Your beloved FTP Team had a meeting in the Essener LinuxHotel during the last week of October. A lot was achieved during this meeting and we would like to take this opportunity to thank our hosts from the LinuxHotel for the nice environment, as well as every Debian Developer, Maintainer and other interested party for their patience during the time of reduced archive service.
During this meeting more than half of our codebase was changed and
multiple outstanding and intrusive patches were merged. We also discussed
various outstanding topics, a few of which we can report on already;
for a few others we still have to gather more information. That
process, which involves asking our lawyers as well as various other
people, has already begun.
dpkg v3 source format, compression
As many have already noticed, our archive now additionally supports the
3.0 (quilt) and 3.0 (native) source package formats. You can use either
gzip as usual or bzip2 for the compression within binary packages -
and now also for the source files. We do not support lzma as a
compressor, as that format is already dead again. After squeeze we will
probably add support for its successor, xz.
Currently there are no plans to support the 3.0 git or bzr formats.
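As a minimal sketch of how a package opts into the new format: dpkg-source decides which format to use from the `debian/source/format` file, and the compressor for the generated tarballs can be selected with its `-Z` option. The package name below is purely illustrative.

```shell
# Work in a scratch directory (hypothetical package name "hello-demo").
mkdir -p hello-demo/debian/source
cd hello-demo

# dpkg-source reads this file to pick the source format.
echo "3.0 (quilt)" > debian/source/format
cat debian/source/format

# Building the source package with bzip2-compressed tarballs instead
# of gzip (needs a complete debian/ tree, so this line is only
# illustrative here):
#   dpkg-source -Zbzip2 -b .
```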
incoming.debian.org
The public view of incoming.debian.org has changed. Due to internal
changes in dak, queue/accepted no longer exists so we generate a
public view in the same way as the hidden buildd queues were always
generated. This change means that packages will stay visible on
incoming.debian.org for a short time after they enter the pool
and are pushed to the mirrors (which removes the old problem whereby
during dinstall packages would vanish from incoming but not yet
be visible in the pool). As a couple of people have commented, the
changes files are no longer part of this view. It’s possible that
in the future we will add them back, but the same information is
available on firstname.lastname@example.org and we would
need to make some small architectural changes to add the .changes
files back to this view.
Another side-effect of this work is that it will now be possible
to autobuild suites other than unstable from ‘accepted’. This will
require work and discussion with the buildd team as to how they
want to handle it.
#debian-ftp
The channel #debian-ftp is now open to the public as a way of
contacting members of the ftp-team, but not for random discussions.
Please, help us keep the channel well focused and on-topic.
Source-only uploads
After some discussion, there are two opinions within the
ftp-team on this matter. Given that other distros’ experience has
shown that allowing source only uploads results in a huge loss of
quality checks and an increased load on the buildds from packages
FTBFSing everywhere, some members of the team believe that source+binary
uploads should happen as currently, but that the maintainer built
binaries should be rebuilt by the buildds (i.e. be thrown away at accept
time). Other members of the team think that we should allow source-only
uploads and that if some people keep uploading packages which FTBFS
everywhere (showing a lack of basic testing), this should be dealt with
by the project in other ways which are out of the scope of the ftp-team.
The current “winning” opinion is to go with the source+throw away
binaries route. We are close to being able to achieve this; it is
simply that it has not yet been enabled. Before any version of this
can be enabled, buildd autosigning needs to be implemented in order
that dak can differentiate buildd uploads vs maintainer uploads.
Provisions have been made in dak for things such as bootstrapping a
new architecture where binary uploads from porters may be necessary
in order to get going.
Lintian autorejects
These are now turned on. The lintian tags used for autorejection are
available in YAML format from ftp-master.debian.org.
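The published tags file is a simple YAML document. A hypothetical excerpt might look like the following - the two-level fatal/nonfatal split and the specific tag names here are illustrative assumptions, not a copy of the real list:

```yaml
lintian:
  nonfatal:
    - wrong-file-owner-uid-or-gid
  fatal:
    - binary-in-etc
```

Uploads triggering a "fatal" tag would be rejected outright, while "nonfatal" tags can be overridden by the maintainer.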
“Inline” signed Release files
We now have a second Release file on our mirrors, called InRelease. The
only difference to Release is that the signature is not detached, but
within the file. This is a first step towards getting rid of race
conditions when updating Packages/Sources files and mirror updates
running. After squeeze we will change to have more than one generation
of Packages/Sources files on the mirror (likely the last 3) and have
them all included. This will require discussion with the apt
(and other relevant) maintainers about how this should be implemented.
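Schematically, an InRelease file wraps the same content as Release in an OpenPGP clearsigned envelope, so a client fetches one self-verifying file instead of the Release/Release.gpg pair. The field values below are illustrative, not taken from a real archive:

```
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA256

Origin: Debian
Suite: unstable
Codename: sid
Date: Sat, 06 Nov 2010 12:00:00 UTC
...
-----BEGIN PGP SIGNATURE-----
[ASCII-armoured signature data]
-----END PGP SIGNATURE-----
```

Because the signature travels inside the file, there is no window in which a mirror serves a new Release next to a stale Release.gpg.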
gzip --rsyncable
We enabled all gzip content to be generated using --rsyncable
(Packages, Sources, Contents, indices/). This gives a slight increase
in size but a huge decrease in the bandwidth needed by the mirror
network.
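A small sketch of the effect, assuming a gzip that supports --rsyncable (Debian's gzip does, as does upstream GNU gzip since 1.7): both variants decompress to identical content, but the rsyncable one resets its compression state periodically, so a small change to the input changes only a small part of the output and rsync transfers far less on mirror updates.

```shell
# Create some sample "index" data.
seq 1 100000 > Packages

# Compress with and without --rsyncable.
gzip -9 -c Packages > plain.gz
gzip -9 --rsyncable -c Packages > rsyncable.gz

# Both round-trip to the same content; the rsyncable file is
# slightly larger, which is the trade-off mentioned above.
gunzip -c rsyncable.gz | cmp - Packages && echo same-content
```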
The extra source case
This issue is the one traditionally known as the linux-modules-extra
problem but also exists for some compiler packages and in the past
existed for things such as apache2-mpm-itk and so is a more general
problem. It exists where a package needs to use source from another
package in order to build. This is normally done by creating a -source
binary package and Build-Depending on it (this is of course ugly in its
own right, but let’s fix one thing at a time). The problem is that dak
then has no idea that we need to keep the extra source(s) around on the
mirror whilst that binary package is still available. We intend to fix
this by introducing a way for packages to declare that they were
Built-Using a certain source package and version, and then tracking
that to ensure that the sources are kept around properly. The changes in dak
which are needed are now relatively small and a proposal will be posted
to debian-devel soon which will outline how we hope that this will work.
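A binary package stanza carrying such a declaration might look like this - the field name follows the proposal above, while the package name and version are illustrative assumptions:

```
Package: crash
Architecture: amd64
Built-Using: linux (= 2.6.32-1)
Depends: ${shlibs:Depends}
```

dak would then refuse to drop the linux 2.6.32-1 source from the pool while any binary declaring Built-Using on it is still present.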
Tracking arch all packages
#246992 asked us not to delete arch all packages before the
corresponding arch any packages (if any) are available for all
architectures. Example: whenever a new source package for emacs23
gets uploaded the installation of the metapackage emacs_*_all.deb
breaks on most architectures until the needed Architecture: any
packages like emacs23 get built by the buildds. That happens because
dak removes all arch: all packages but the newest one.
While this behaviour is easily documented and one can easily devise a
fix (“just keep the arch all until the any is there, stupid”), the
actual implementation of it contains several nasty corner cases
which is why it took so long to fix.
Thankfully Torsten Werner took on this task during the meeting and
we are now in a position where we can merge his work. We’ll email
before turning on this feature so that people can watch out for any
strange cases which could cause problems.
NEW processing
Due to the massive changes in the archive, NEW (and also Byhand) had to
be disabled. Certain assumptions made by the processing tools no longer
applied. The last week was used to work on this issue and we think this
will be fixed today, so NEW processing will return to its normal speed.
– bye, Joerg, for the FTP Team.