Building Packages From Source When Backports Are Not Available

Obtaining Source Code When Prebuilt Packages Are Unavailable

When the software version you need is not available as a prebuilt package in your Linux distribution’s repositories, obtaining the source code directly from the project’s website is often necessary. The source code will typically be provided as a tarball file (compressed archive) containing the source files and build scripts needed to compile the software.

Locating and Downloading Source Code Tarballs

Most open source software projects make official releases available for download from their websites or code hosting platforms like GitHub. The downloads page will typically offer source code tarballs for stable releases and possibly unstable development versions. When deciding which tarball to download, consider whether you need the latest features or a more tried-and-tested stable release.

The naming convention of tarballs usually identifies the software name, version number, and compression format, such as .tar.gz or .tar.xz. Verify you have selected the intended release version before downloading to avoid compiling an incompatible codebase. The file size should also match what is listed on the downloads page, confirming the full archive downloaded correctly.
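
For example, a typical download and unpack sequence looks like the following, where the project name, version, and URL are placeholders rather than a real release:

wget https://example.org/releases/foo-1.2.3.tar.xz
tar -xf foo-1.2.3.tar.xz
cd foo-1.2.3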

Verifying Checksums to Ensure a Valid Download

To verify the integrity and authenticity of downloaded source tarballs, projects provide checksums generated from the exact files using algorithms like SHA256. After downloading on Linux, run the ‘sha256sum’ or ‘shasum’ command on the file and compare the output to the checksum listed by the project.
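
A minimal verification looks like the following, assuming the project publishes the checksum in a SHA256SUMS file alongside the tarball (the filenames here are placeholders):

sha256sum foo-1.2.3.tar.xz

Compare the printed hash against the published value, or let the tool compare automatically if a checksum file was downloaded:

sha256sum --ignore-missing -c SHA256SUMS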

Matching checksums give confidence that no errors occurred during download and that the file has not been tampered with. A mismatch indicates a corrupted download or a tampered tarball, and the file should be discarded. Only compile code that has been verified against checksums obtained over trusted channels such as the project’s official website.

Configuring and Compiling From Source

Once trusted source code has been obtained, compiling it into usable software requires setting up build dependencies, configuring compile options, invoking the build scripts, and addressing any errors encountered. This flexible but manual process allows tailoring the build to your specific system.

Installing Build Dependencies

Most complex software depends on external libraries and headers to compile successfully. Common build tools like autoconf generate configure scripts that inspect your environment and report missing prerequisites. On Linux distributions, consult your package manager to install the development (‘dev’) packages providing the required dependencies.

For example, on Debian/Ubuntu, running:
sudo apt install build-essential libfoo-dev libbar-dev
would install the basic compiler toolchain and the development packages for the hypothetical ‘foo’ and ‘bar’ libraries needed to build the target software.

Configuring Source with Optimal Compile Options

Advanced compile-time options can toggle optional features, enable optimizations for specific CPU architectures, set output directories, and apply other customizations. Most source tarballs ship with a ‘configure’ script that processes command-line arguments, or with make-based configuration files such as the Makefile, to handle this.

Inspect the available options with ‘./configure --help’ and enable relevant ones based on your environment and needs through additional flags:
./configure --enable-feature --disable-debug --prefix=/usr/local

Pay close attention to the prefix option, which controls where the finished binaries will ultimately be installed; choosing a location such as /usr/local avoids potentially overwriting existing OS files.

Invoking Make to Compile Code

The canonical build automation tool ‘make’ parses the dependencies between source files and invokes the compiler appropriately. After configuring the options, simply run:

make

This multi-stage process can take significant time to complete depending on hardware resources. Multiple processor cores can be leveraged with options like ‘make -j8’, which tells make to run up to 8 compile jobs in parallel, substantially reducing build times.
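
Rather than hard-coding a job count, the number of available cores can be queried with the ‘nproc’ utility from GNU coreutils:

make -j"$(nproc)"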

Potential Compile Errors and Solutions

Particularly in large and complex projects, compile issues frequently occur due to unmet dependencies, outdated build systems, or portability problems when moving source code between environments. Carefully reading the emitted compiler messages allows you to diagnose and address the specific problems encountered.

Common solutions include installing missing prerequisite development packages, obtaining compatibility libraries or headers not available on your distribution, modifying problematic code segments, or patching the build system. With iterative troubleshooting, most compile failures can be resolved without drastically modifying the source code.
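
As a sketch of that kind of iteration on Debian/Ubuntu (the header name foo.h is a placeholder, and the ‘apt-file’ tool must be installed separately):

grep -n "error:" config.log | head

If the compiler reports a missing header such as foo.h, search for the distribution package that provides it:

apt-file search foo.h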

Installing and Testing Custom Builds

After software has been compiled from source, it still needs to be correctly installed and integrated into the operating system before use. Automating installation, validating file permissions, and testing functionality reduce the potential issues arising from manual custom builds.

Using Checkinstall for Cleaner System Integration

The ‘checkinstall’ tool, designed for source-based installations on Linux, creates proper OS software packages from build output. Instead of directly running ‘make install’, invoke checkinstall to generate a distributable ‘.deb’ package on Debian-based or ‘.rpm’ package on Red Hat-based distributions:

sudo checkinstall make install

Managing the resulting package through your distribution’s standard package manager instead of manually tracking loose files provides easier uninstallation, upgrades, and integration with other tools expecting standard system packages.
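
For example, a package produced by checkinstall can later be removed cleanly through the package manager, where the package name below is a placeholder chosen during the checkinstall run:

sudo dpkg -r foo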

Setting Correct File Permissions

The Unix security model relies heavily on strict user- and group-based file permissions defining what actions are allowed on resources. Files created during a source install may lack the expected access modes or end up fully writable by all users.

Review permissions on installed binaries with ‘ls -l /path/to/files’ and on directories with ‘ls -ld /path/to/dirs’. Reference standards like the Filesystem Hierarchy Standard (FHS) to determine the expected modes based on location. Use commands like chmod, chown, and chgrp to bring permissions and ownership in line with system defaults.
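
As a sketch for a system-wide binary, where the path is a placeholder and the modes should be checked against your system’s conventions first:

ls -l /usr/local/bin/foo
sudo chown root:root /usr/local/bin/foo
sudo chmod 755 /usr/local/bin/foo

This leaves the binary owned by root, world-executable, and not writable by ordinary users, matching the typical defaults for /usr/local/bin.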

Verifying Functionality of Custom Package

Even after installing successfully, compiled software can still fail or exhibit unexpected behavior without rigorous testing. Verifying that key functionality matches expectations before relying on custom-built components can greatly reduce potential disruptions.

Use built-in help output, unit tests if available, or simply exercise major workflows to baseline normal behavior. Monitor system resources during testing to catch performance anomalies, and confirm output is formatted as expected. Ideally, compare current functionality against an earlier stable version to verify nothing has regressed.
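
As a minimal smoke test, where the binary name is a placeholder:

/usr/local/bin/foo --version
/usr/local/bin/foo --help

If the project ships a test suite, it can usually be run from the build tree as well:

make check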

Updating and Maintaining Custom Builds

The work of building from source does not end after the initial installation. Codebases continue evolving so maintaining revision control, scripting automation, and migrating back to distribution packages where possible reduces long-term maintenance burdens.

Tracking Upstream Changes for Updates

Software projects constantly add new features, fix bugs, and release updated versions over time. To pick up these improvements without re-downloading release tarballs, track revisions in the project’s source control repository if it is available on a collaborative platform like GitHub.

Upgrading locally is then often a simple matter of running ‘git pull’ to download changes, followed by re-running the build scripts, such as ‘./configure && make’. Review the incoming changes first to catch potentially breaking updates that need coordination.
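
A sketch of that update cycle, assuming the project was cloned with git and uses an autotools-style build:

git pull
./configure && make
sudo checkinstall make install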

Scripting Build Process for Simplified Maintenance

Manually running through all of the compile and installation steps increases the chance of mistakes creeping in on subsequent updates. Instead, capture the build sequence in an executable script checked into revision control, and allow it to accept parameters.

This allows standardizing on desired options and simply invoking ‘build.sh’ without needing to remember finicky command sequences. Scripted builds also facilitate automated execution as part of continuous integration pipelines.
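
A minimal sketch of such a script, assuming a standard autotools project; the default prefix and configure invocation are placeholders to adapt:

#!/bin/sh
# build.sh - configure, compile, and package a source tree in one step
# Usage: ./build.sh [prefix]
set -e

PREFIX="${1:-/usr/local}"

./configure --prefix="$PREFIX"
make -j"$(nproc)"
sudo checkinstall make install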

Strategies for Migrating to Distribution Packages

Ideally, opt for distribution-provided packages over custom builds whenever they are available, to minimize maintenance. Checking back at major version upgrades or yearly releases catches when the desired functionality is merged into the official repositories.

When a migration opportunity exists, plan the transition accounting for configuration format changes, data migrations, and any code refactoring needed to achieve feature parity. Phased rollouts combined with extensive testing reduce disruption risks during such migrations.
