Neon/Builder
Neon uses a Jenkins continuous integration system to build its packages.
The Setup
One of KDE's many servers is the master of Neon. It runs a Jenkins instance, a continuous integration service at build.neon.kde.org, with many jobs that build the packages and perform other functions, either on demand or at pre-scheduled intervals.
The code behind build.neon comes from pangea-tooling, which also powers DCI (Debian CI), KCI (Kubuntu CI) and MCI (Mobile neon Plasma CI).
The Jenkins jobs farm the hard build work off to a number of DigitalOcean slave servers. Most jobs run inside a Docker container to give a fresh build environment.
After checking out pangea-tooling, add the submodule for the CI config with git submodule update --init. This adds https://github.com/blue-systems/pangea-conf-projects.git which contains the files that list the jobs to be created.
To use the scripts that access Jenkins you will need to set up ~/.config/pangea-jenkins.json using the API key available inside Jenkins to administrators: User (top right menu) -> Configure -> API Key.
https://github.com/blue-systems/pangea-tooling/wiki/Jenkins-Config
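The exact schema is described on the wiki page above; as a rough sketch, the file pairs the Jenkins server with your user and API key. The field names below (server, username, api_key) are assumptions for illustration, not confirmed by this page:

```json
{
    "server": "https://build.neon.kde.org",
    "username": "your-jenkins-user",
    "api_key": "the API key from User -> Configure -> API Key"
}
```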
The setup of the various machines, which are provided by Blue Systems, is maintained in pangea-kitchen, which uses Chef to set up the servers with all software configured.
For more information see pangea-tooling/Getting-Started
The Packaging
Our packaging is kept in Git repositories at packaging.neon.kde.org; see Neon/Git.
The packaging is for .deb packages and each Git repo contains a single debian/ directory which defines how the .deb is made. We try to keep the packaging in sync with the Debian pkg-kde team's Git repositories and keep the diff with them as small as possible.
Neon/unstable
is for Developer Edition Unstable Branches; its packaging is combined with the master branches of the KDE projects.
Neon/stable
is for Developer Edition Stable Branches; its packaging is combined with the stable branches of the KDE projects, which are defined in overrides/base.yaml. Stable branches also include branches released as Beta (so the name is not quite logical).
When a project makes a new (non-bugfix) release you should merge Neon/unstable into Neon/stable and update the stable branch in overrides.
Neon/release
is for User Edition; the code gets built from released tars.
Neon/release-lts
is for User LTS Edition; the code gets built from released tars, except Plasma which uses the LTS tars.
Neon/mobile
is used by the mobile CI and is not available in all repos. This branch has patches applied which are required only for Plasma Mobile.
When moving files between packages in the same source package you can use the variable (<< ${source:Version}~ciBuild) in your Breaks/Replaces, where ~ciBuild gets replaced on merge into Neon/release.
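As a sketch, assuming a hypothetical binary package foo-data that has taken over files from foo, the debian/control stanza might read:

```
Package: foo-data
Breaks: foo (<< ${source:Version}~ciBuild)
Replaces: foo (<< ${source:Version}~ciBuild)
```

Because ~ sorts before the empty string in Debian version comparison, the ~ciBuild suffix keeps the constraint satisfied by CI builds until the merge into Neon/release rewrites it.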
See [New Repositories] for new packages.
The [build overrides] file is used to define jobs which need a particular branch or tar to build from.
We don't use debian/changelog files; they just add merge conflicts and we already log changes in the Git history.
The repos are kept in sub-directories matching those the Debian pkg-kde team uses. The ones we add are neon-packaging/ for stuff we package but don't expect Debian to use, neon/ for distro-specific packages such as neon-settings, and forks/ for repos packaged elsewhere that we want to base on.
The Jobs
The Jenkins jobs are created by running the pangea-tooling script jenkins_jobs_update_nci.rb. This creates some manual jobs specified in the script, such as the ISO jobs, but mostly uses factories to create batches of jobs based on the archives. Use NO_UPDATE=1 to speed it up by not updating the Git checkouts. As with other scripts it needs the Gem versions provided by Bundler, so run it with bundle exec jenkins_jobs_update_nci.rb.
The YAML files in pangea-conf-projects define what jobs get created.
For each package there is a parent MultiJob which runs some sub jobs.
parent job
this is set to check out the relevant repository from KDE Git as source/ (for Developer Editions), then check out the relevant repository from KDE neon Git as packaging/. It then runs a number of child jobs...
src
will create the source package. For User Edition this means running uscan, which uses the debian/watch file to download the relevant tar; for Dev Editions it uses the source the parent job checked out. It then builds the source package.
If the epoch has changed it will fail here. Log onto the build server (charlotte) and under /home/neon/data/jobs remove the last_version files from all the build jobs, e.g.
find *pulseaudio-qt* -name last_version
If the version has been downgraded you will need a pin file in /etc/apt/preferences.d in the neon-settings package, such as the one at 97fdd3e7818a7bf00e60f5e2094798390de232dd. For version and epoch downgrades you will need to delete the existing packages in the archive first.
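A minimal sketch of such a pin file, for a hypothetical package foo being held at the downgraded version (the real example is in the neon-settings commit referenced above). A Pin-Priority above 1000 is what allows apt to downgrade:

```
Package: foo
Pin: version 1.2.3-0neon*
Pin-Priority: 1001
```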
bin
job will extract the source, install the build dependencies and compile the package. It finishes by checking the output from lintian and fails on any errors; you can override errors with lintian-overrides files in the normal .deb packaging way (see dh_lintian). It also checks for any list-missing files and fails if there are any; override these by adding a debian/not-installed file.
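As a sketch, a lintian-overrides file installed via dh_lintian lists one "package: tag" line per override; the package name foo and the tag below are chosen purely for illustration:

```
# debian/foo.lintian-overrides
foo: embedded-library
```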
It also fails if cmake reports missing build dependencies; override with debian/meta/cmake-ignore. cmake-ignore can be a plain list of the missing dependencies as output by cmake, or a YAML list which allows setting the ignore only for specific releases, e.g.:
- QCH, API documentation in QCH format (for e.g. Qt Assistant, Qt Creator & KDevelop):
    series: xenial
KCrash Validator adds a test to executables to ensure they link to KCrash. It fails the build if they do not explicitly link to KCrash. If you come across this problem add KCrash::initialize() in the same place as the KApplication is set up.
adt
job runs Debian's test framework autopkgtest; see the Ubuntu guide for some details. It runs adt-run on the binaries, which installs them and runs the relevant test suite as defined in debian/tests/. It doesn't fail the build if tests fail.
pub
job will upload to aptly, see The Archive below.
lintqml
job will scan for QML dependencies which have not been satisfied by the package dependencies and print a JSON report of any missing QML modules. The packager should add these to the packaging manually and rebuild. Any false positives can be overridden; see Kubuntu/CI/QMLIgnore.
lintcmake
job will install the built packages (plus dependencies), scan for CMake config files and then get cmake to try to use them. This will show if any dependencies are missing. Using the cmake file in isolation may also show problems in the cmake file itself, such as missing includes like CMakeFindDependencyMacro.
snap
job will package it up as a Snappy Snap package. This is experimental; you can see the output at distribute.kde.org.
Other Jobs
watcher
jobs are made for packages in User Edition. They use debian/watch files to check for new releases; if one is found they add a new changelog entry, merge from Neon/stable, then run the release build job. See man uscan for info on watch files.
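A debian/watch file is a version line plus a URL pattern that uscan matches against upstream tarballs. A sketch for a hypothetical package foo released on download.kde.org (the URL path and pattern are illustrative only, not taken from this page):

```
version=4
https://download.kde.org/stable/frameworks/([\d.]+)/foo-([\d.]+)\.tar\.xz
```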
It will fail if it finds an "unstable" line in the watch file, as we don't include these in User Edition.
It will mangle the watch file to use the [1] we have running on the Jenkins master; this bridges HTTP to SFTP to expose the contents of download.kde.org even when they are hidden from the web server, so we can get previews of tars before they are released. When building unreleased packages make sure not to snapshot them into User until they get released. Consider disabling the snapshot job to avoid mistakes.
mgmt
jobs run various management tasks.
mgmt_appstream-generator_bionic
and friends use asgen (appstream-generator) to generate appstream data from the data in the repos. It gets run after the snapshot.
mgmt_appstream-health
checks the above has run correctly and sends an e-mail if not.
mgmt_aptly
sets up the aptly archives.
mgmt_build_bionic_release
and friends is the nightly job that makes all the build jobs poll for changes in KDE Git and run if there is new code.
mgmt_daily_promotion_bionic_release
et al. gets run by the snapshot jobs to make sure the current 'user' archive can install everything, then that it can upgrade to the new 'release' archive successfully, and then that it can purge all the packages. If it fails then the snapshot does not run.
mgmt_digital_ocean
updates the VM images used to create cloud servers on DigitalOcean. This needs to be run manually after updating pangea-tooling and after mgmt_tooling. You will need to wait for all the existing DO servers to die before it actually has an effect, so don't run lots of jobs.
mgmt_digital-ocean_dangler
removes old DigitalOcean droplets (cloud servers) that for some reason have not removed themselves.
mgmt_docker
used to be run by mgmt_tooling and updates and pushes the Docker images used for builds. It has now moved to https://build.plasma-mobile.org/view/mgmt/ which runs it automatically instead.
mgmt_docker_hub_check
checks all the neon images on hub.docker.com are built correctly.
mgmt_docker_hub_rebuild
runs daily to ping hub.docker.com to rebuild the neon Docker images.
mgmt_germinate
updates the Neon/release branch in our seed package.
mgmt_git-semaphore
pushes out updates to git-semaphore, our wrapper around Git which limits simultaneous connections to servers.
mgmt_jenkins_archive
archives old builds onto the slower but larger disk.
mgmt_jenkins_prune_parameter-files
removes old parameter files used to pass status between sub-jobs.
mgmt_job-updater
runs jenkins_jobs_update_nci which updates or adds all the build jobs according to the pangea-conf-projects settings.
mgmt_merger
runs all the merger jobs each night.
mgmt_merger_debian-frameworks
merges Debian branches into the Neon/unstable branches for Frameworks.
mgmt_pause_integration
can be run manually and just blocks jobs from starting; remember to kill it when you're done.
mgmt_progenitor
runs the mgmt_build jobs each night.
mgmt_repo_cleanup
removes old snapshots of User Edition (we keep the most recent 4).
mgmt_repo_divert_stable_bionic
is used when a new Qt is built, to allow temporary copies of the dev-stable and dev-unstable repos for testing and rebuilding bits.
mgmt_repo_test_versions_release-lts_bionic
checks all the packages in our archive have larger version numbers than the Ubuntu archive.
mgmt_repo_undo_divert_stable_bionic
undoes mgmt_repo_divert_stable_bionic.
mgmt_snapshot_bionic_user
snapshots the release repo to the user repo.
mgmt_tooling
is run whenever a commit is made to pangea-tooling, to update the tooling on the Jenkins master. It fails if the Ruby tests fail.
mgmt_workspace_cleaner
cleans the build workspace on the build servers.
iso
jobs build the installable ISOs. See Neon/InstallableImages. They are run weekly and should also be run manually after significant updates such as a new Plasma release.
The Archive
archive.neon.kde.org is our .deb package archive. For your sources.list you need one of the following lines.
deb http://archive.neon.kde.org/unstable xenial main
deb http://archive.neon.kde.org/testing xenial main
deb http://archive.neon.kde.org/user xenial main
deb http://archive.neon.kde.org/user/lts xenial main
It runs on the KDE server racnoss and mirrors its packages with cdn77. It is an aptly instance and may be running the Blue Systems Aptly fork.
Admins can access it using the repo console from pangea-tooling:
./ci-tooling/nci/repo_console.rb
Repo.list
repo = Repo.get("unstable_xenial")
repo.packages()
This makes available the aptly-api code via the Ruby Gem written by Harald and Rohan: https://github.com/KDEJewellers/aptly-api/
pangea-tooling/ci-tooling/nci/repo_cleanup.rb
can be run to delete all packages other than the latest one and save some disk space on racnoss.
User Repo
To allow for extra QA the packages built for User Edition are uploaded to the secret release repo:
deb http://archive.neon.kde.org/release xenial main
You can test this manually and, when happy, run mgmt_snapshot to copy the packages to the user repo. This will first run mgmt_daily_promotion_xenial_release (slow, takes ~30 mins) which installs the existing packages and attempts to upgrade them to the new packages; if there are any problems it will stop the snapshot. It also runs mgmt_appstream-generator which creates the AppStream data files used by the archive.