This is a huge achievement for Debian and the free software world.
It took a while, though, until this was understood. In 2007, when I pointed out on debian-devel that this was needed, I was still told what a huge waste of time this would be. And indeed it took a huge amount of work by many people to get there, but it is well worth it.
There was no bug or attack on Debian since 2007 that reproducible packages would prevent.
"Well worth it" is not correct. It just raises the contribution barrier to Debian even higher. I have already heard a lot of people complaining that contributing to Debian is hard, and while in the past I defended it with "they need all the checks and balances to make sure packages play with each other nicely", this is just a step that makes it hard for no reason and little benefit.
> If you are wondering why we are doing this at all, then hopefully the Reproducible Builds website will explain why this is useful.
https://reproducible-builds.org/
Could you perhaps respond to the argumentation here?
Reproducible builds reduce the need for trusted parties.
Have many organizations produce the binaries independently and post the artifacts.
Once n of m parties agree on the artifact hash, take that as the trusted build.
If every party reaches a different hash then we cannot build consensus.
Reproducible builds are applicable not only to respond to ‘attacks’, a subject you seem to be bikeshedding, but also for other reasons too.
Anyone having to maintain a code base or a distributed fleet of devices will gain from this decision, immensely, as their operational periods come and go.
Reproducible builds are about longevity as much as they are about security.
Please don’t make bold claims about ‘no reason and little benefit’ while demonstrating ignorance of this hard fact: reproducible builds should have been the norm, in computing, from the get-go.
I think longevity is harmed, though. Your certs need to expire in a few years, and by then your toolchain may no longer be downloadable.
Those problems need to be solved as well.
I don't think they do, actually. Longevity sounds good, but in reality anything that's old probably has critical security holes and so you shouldn't use it anyway.
It makes shipping backdoors a whole lot harder, yes.
Hmm, it prevents Trojan binaries, which are a small subset of backdoors, IMHO.
Defense in depth is obviously a good thing.
There was perhaps no detected bug or attack. There have most likely been bugs or attacks that reproducible builds would have prevented.
And you base that on what, exactly? It's "just" making sure the build process is always ordered.
If anything, it will make an attacker's job easier, as an Ubuntu package will have the same files structured in exactly the same way as a Debian one.
https://wiki.debian.org/ReproducibleBuilds has some more info; some of it is outdated, but it also has a chart showing how many packages are built in the CI, and how many of those build reproducibly.
(Orange = FTBR = "failed to build reproducibly")
I'm not good at reading numbers from charts, but I'd guess it's a few percent (4-5ish?).
all I get is this:
> Forbidden
> <p>You are not allowed to access this!</p>
(yes, with HTML tags on display) :)
EDIT: I also found a "I Challenge Thee" page in history. did I just get blocked by antibot measures? why???
Do you have JavaScript disabled? They put one of those anti-scraper things on it.
nope, it's enabled. I can pass Cloudflare, reCaptcha, whatever Microsoft is doing, and Annubis, but Debian caught me off-guard
A great milestone, congrats Debian on taking a stance and holding high standards for yourself, especially in the current era.
Good thing. NetBSD has had fully reproducible builds since 2017. https://blog.netbsd.org/tnf/entry/netbsd_fully_reproducible_...
As pointed out in your link, NetBSD achieved this with some help from Debian. If I understand correctly, it's not that NetBSD tried harder; it's that their problem was easier: fewer packages which change less (they still use CVS; "stability" is an understatement!).
BTW, most Debian packages already have reproducible builds. Those which do not (I'd say 5%) are shown in orange in the graph there: https://wiki.debian.org/ReproducibleBuilds
Also, the *BSDs are structured somewhat differently from a Linux distro.
It's not like the Linux world, where you have distinct projects like the kernel, GNU, and OpenSSL, and it's the distribution's job to assemble everything.
In the BSD projects, the scope is developing and distributing an entire base system, i.e., the kernel but also the libc, the shell and all POSIX utilities, and a few third parties like OpenSSH (which are usually "soft-forked").
It's quite visible in the sources, it's a lot more than just a kernel: https://github.com/NetBSD/src
Additional packages you can get from pkgin/pkgsrc (NetBSD), pkg/ports (FreeBSD), or pkg_add (OpenBSD) are clearly distinct from the base system, installed in a dedicated subtree (/usr/pkg on NetBSD, /usr/local on OpenBSD/FreeBSD), and provided on a best-effort basis.
The reproducible-build target was almost certainly only for the base system, which is a few percent of what Debian is trying to achieve, and over which NetBSD has tighter control (developer + distributor, instead of downstream assembler + distributor).
A reproducible base system is useful, but given how quickly you typically need to install packages from pkgsrc, it's not quite enough.
While we are bragging: stagex was the first to hit 100% full-source bootstrapped, deterministic, and hermetic builds last year, and the first to make multiple signed reproductions by different maintainers on their own hardware mandatory for every release.
Debian has come a long way, but when Debian says reproducible, they mean they grab third-party binaries to build theirs. When we say reproducible, we mean 100% bootstrapped from source code all the way through the entire software supply chain.
We think that distinction matters.
https://stagex.tools
That distro has a smaller codebase than the Debian Installer.
I am always surprised that Debian is leading this and not the commercial vendors. You'd think big organisations paying for RHEL and Ubuntu would be beating down the door for verifiable binaries.
If a competitor can prove that their packages are bit-for-bit identical to what a big organization is shipping, that allows the competitor to benefit from the security assurances of the big org. This is great for software freedom, not so great for wannabe monopolists.
Reproducible builds exist to reduce the need for trust, while commercial vendors are in the business of selling trust.
I wonder why this is a thing nowadays. I use Yocto for embedded devices, and it was almost a no-brainer to implement reproducible builds there. I can also easily enable Debian package management, so everything is already available.
What do you mean why is it a thing nowadays?
Reproducible builds are an essential method in industrial computing. Debian isn't at the forefront of this; it is merely adopting industry-wide techniques also applied to other operating systems used in long-term and safety-related applications.
Certainly, a lot of the hard work of the Yocto and Debian developers is already in your hands.
What is interesting is that this is now being applied as a more forward-focused policy by the Debian developers: it will now be the norm rather than an option.
Forbidden
You don't have permission to access this resource. Apache Server at lists.debian.org Port 443
:/
I can see it just fine; maybe an overzealous firewall thinks you're a bot? At any rate, the Wayback Machine has it: https://web.archive.org/web/20260510074120/https://lists.deb...
Unfortunately, many of these "protections" don't know what is a bot or a human. Many clueless websites are often just blocking huge swaths of legitimate readers and customers.
Why the fuck does that site break the back button? DO NOT do that.
Has anyone fought Microsoft Visual Studio successfully to produce reproducible builds of C++ programs? From what I have heard, it is one of the worst contexts to do it.
Probably the easiest way is to use Bazel, to leverage the effort that has gone in there.
Well, you can't build MSVS yourself, reproducibly or otherwise, so this is a less appealing endeavor I would think.
A small step for Debian,
a giant leap for mankind.
As someone who recently spent a lot of time on making a large C++ program entirely reproducible on 4 different OS’es, one cannot understate just how many tiny details matter here.
"overstate"
Whoops, yes. Well I hope the point came across anyway.
It's funny that, as a non-native speaker, I have to check with Gemini about how "cannot overstate" is used.
I also asked Gemini whether we express ourselves that way in my mother tongue (Mandarin), and yes, we do, but it came off as too formal a way of speaking. We don't normally use it (I'm not from China/Taiwan, though).
So much time has been wasted on reproducible builds which could have been better spent on securing more important parts of Debian. Practically minor changes, like a build timestamp being different, are not an issue.
Yes, making sure build timestamps are reproducible isn't a security win.
What is a win is that two independent parties can run the same build, and get the same binaries.
This is important because it removes trust from builders: anyone can verify their output.
It just so happens that unimportant things like build versions impede that.
It allows verifying that the binaries actually match the source, which is extremely valuable.
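For timestamps specifically, the usual fix is the Reproducible Builds project's SOURCE_DATE_EPOCH convention: when that environment variable is set, a build embeds it instead of the wall clock, so two independent builds of the same source agree. A minimal Python sketch:

```python
import os
import time

def build_timestamp() -> str:
    """Return the timestamp to embed in build artifacts.

    Per the Reproducible Builds SOURCE_DATE_EPOCH convention:
    if the variable is set, use it instead of the current time,
    so independent builds embed the same date."""
    epoch = int(os.environ.get("SOURCE_DATE_EPOCH", time.time()))
    return time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(epoch))

# Pinning the epoch makes the embedded date deterministic.
os.environ["SOURCE_DATE_EPOCH"] = "1700000000"
print(build_timestamp())  # 2023-11-14 22:13:20
```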
Debian must ship packages without the hard dependence on systemd.
Debian, like any other legacy distro, must become declarative, because the '80s model of manual deploys and the absurd pain of d-i and preseed must end.
I've been 100% on NixOS for many years, but it's Debian that really drove this project.
They're still a pragmatic choice for many usecases.
In the end, Nix is just a thin veneer on this stuff.
Given how many quick & dirty sed patches or exec commands I've seen in the few Nix packages/modules I've read, I would not exactly bet my life on it being completely idempotent and reproducible.
bootcrew have bootc Containerfiles for Debian, Ubuntu, Arch, and openSUSE:
https://github.com/bootcrew/mono
Zero improvement on end-user experience. It does not solve supply-chain issues; a Debian package will reproducibly contain the malware from upstream.
> zero improvement on end-user experience.
Maybe not by itself, but it does allow for the ecosystem to be audited, in a way that ultimately benefits the end-user. It really is an important part of a healthy supply chain.
No problem in Debian since the start of the effort would have been solved by reproducible builds.
This is a nice pat-yourself-on-the-back achievement for people who prefer security theatre and checking boxes to doing something actually useful, and they wasted thousands of man-hours of poor victims who had to implement it.
This is some of the best news I've heard recently when it comes to figuring out how to produce high quality Software Bills of Materials for the upcoming EU Cyber Resilience Act, for what it's worth. Reproducible packages are actually worth a great deal when you are selling products with digital elements. Much easier to scan through, audit, etc. with confidence.
Debian has had a better "software supply chain" posture than any other player in the ecosystem since before the turn of the century. While we all face the risk of malware from upstream, Debian is the least at risk of being affected by it. See for example the stream of issues from npm et al. None of it has affected Debian.
You do remember the xz-utils backdoor was found in Sid, right?
https://en.wikipedia.org/wiki/XZ_Utils_backdoor
> for example the stream of issues from npm et al.
Curious: which distros were affected by npm supply-chain attacks?
It's npm that's affected; therefore it's not even considered when choosing a language/ecosystem for writing distro tools. You'll find no sane distro writing a package manager in JavaScript, precisely to avoid this joke of a supply chain.
ECMA-262 doesn't require the use of NPM or NodeJS. (In fact, they are at odds, even 10+ years after modules were standardized in ES6.)
I quite like the OpenBSD approach to Go and Rust projects in ports. They store all the dependencies and their hashes in the build recipe, not trusting the project's own. And they're more readable.
Here is jujutsu's list of dependencies[0] and their hashes[1]. As an aside, that's why I don't like those package managers: something like Python's numpy or libcurl gets sliced into atomic portions.
[0]: https://github.com/openbsd/ports/blob/master/devel/jujutsu/c...
[1]: https://github.com/openbsd/ports/blob/master/devel/jujutsu/d...
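The pinning described above amounts to refusing any fetched distfile whose digest differs from the one recorded in the recipe. A rough Python sketch of a distinfo-style check (illustrative names, not the actual ports machinery):

```python
import hashlib

def check_distfile(data: bytes, name: str, pinned: dict[str, str]) -> None:
    """Refuse a fetched dependency tarball whose SHA-256 does not
    match the digest pinned in the build recipe."""
    digest = hashlib.sha256(data).hexdigest()
    if pinned.get(name) != digest:
        raise ValueError(f"checksum mismatch for {name}")

# Hypothetical distfile; the recipe pins its digest ahead of time.
tarball = b"pretend this is clap-4.5.0.tar.gz"
pinned = {"clap-4.5.0.tar.gz": hashlib.sha256(tarball).hexdigest()}
check_distfile(tarball, "clap-4.5.0.tar.gz", pinned)  # passes silently
```

A tampered or substituted upstream tarball then fails the build instead of silently entering the package.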
It does not solve all supply-chain issues, but it does solve some supply-chain issues.
Not being able to see whether the source code shipped is the same as what was used to create the binary is scary.
Has there been a single publicly known attack that would have been prevented by this?
Why should it only be valuable if the effects were to be publicly known?
There are plenty of places in industrial computing where reproducible builds have prevented subterfuge within the organizations themselves. Injecting binaries to do infiltration or exfiltration is a long-standing industrial espionage activity, and preventing it is of immense value to all users of the operating system, not just consumer users.
Zero in Debian. They have enough other procedures to catch it.
Less diligent projects have had it, but there are easier ways to fix it.
Several, actually. PyPI is regularly targeted in this way.
Hasn't happened in Debian
“Hasn’t happened” is quite naive. It happens internally: putting unscrupulous code in a company's distro before torching the place is a surprisingly regular occurrence in places which have long since adopted Debian as a platform host. IT departments around the globe will benefit from this immensely.
But how many of those attackers also had the ability to publish a GitHub commit but didn't, in order to remain more stealthy?
This question is meaningless. Attackers will pick the best attack they have at their disposal. The fact that they didn't push a commit shows it was better not to. So closing off that attack is good.
Who is this mythical end user? Reproducible builds are good for everyone, not just the average Joe.
If you find yourself holding opinions of the kind "if it can't be made perfect, it shouldn't be changed at all", you may want to consider that most things that work well today were incrementally improved.
Reproducible builds do not solve all issues, as you rightly observed, but they can be a stepping stone (or even a precondition) for further measures.
That's not what reproducible builds aim to prevent, and no one claims that. When upstream pushes bad code, that's on upstream.
The thing reproducible builds aim to prevent is Debian, or individual developers and system administrators with access rights to binary uploads and signing keys, being forced by attackers to sign and upload malicious binary packages, be these governments (with or without court orders) or criminal organizations.
As of now, if I were an administrator of Debian's CI infrastructure, technically there would be nothing preventing me from running an "extra" job on the CI infrastructure that builds a package for openssh with a knock-knock backdoor, properly signs it, and uploads it to the repository. For someone to spot the attack, they'd have to notice that there is a package in the repository with no corresponding build logs, or that is suspicious in some other way.
But with reproducible builds, anyone can set up infrastructure to automatically rebuild Debian packages from source and, if there is a mismatch with what is in Debian's repository, raise alarm bells.
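At its core, that monitoring loop is just comparing digests of the archive's package against your own rebuild. A sketch (hypothetical file paths, assuming the local rebuild already ran):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Hex SHA-256 of a file, read in chunks to handle large .debs."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_rebuild(official_deb: str, rebuilt_deb: str) -> bool:
    """True if the locally rebuilt package is bit-for-bit identical
    to the one served from the archive; a mismatch is the alarm bell."""
    return sha256_of(official_deb) == sha256_of(rebuilt_deb)
```

Without reproducibility, this comparison is useless: every rebuild would differ, and a genuine compromise would be indistinguishable from noise.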
Reproducible builds show that, within a specific configuration, the code produced the binary, regardless of who signed or published it.
Indeed, this could mitigate an attacker replacing the binary with something that's not produced from the code, but it does not mitigate the toolchain or the code itself containing the exploit, creating a malicious binary.
Well, reproducible also means a code guarantee. It may not improve the end-user experience directly, but you get an extra quality-control step, as a guarantee, here. I think reproducibility is great; if we can achieve it, it should be achieved. See also NixOS: it can guarantee that snapshot xyz works, not just for one user, but for ALL users. I see it as hopping from guarantee to guarantee. That's actually a good thing in the long run. You just have to think about it differently.
> zero improvement on end-user experience
The end-user experience is that you can now host Debian binaries in caches and CDNs without worrying about supply-chain attackers.
You can verify that file hashes match the ones on Debian's website and sleep much better at night.
If you don't trust Debian's website then you can rebuild yourself and check if Debian has been compromised.
You could already do that since Debian cryptographically signs all its package indexes, and the indexes contain the hash of all packages. The additional guarantee that reproducible builds bring is that you can re-build the packages in your own controlled environment and verify that the resulting package is bit-for-bit identical to what Debian offers.
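A sketch of that index-based check in Python, using a heavily simplified Packages-stanza format (real indexes carry many more fields, and the indexes themselves are what Debian signs):

```python
import hashlib

def parse_packages_index(text: str) -> dict[str, str]:
    """Extract Filename -> SHA256 pairs from a (simplified)
    Debian Packages index: stanzas separated by blank lines."""
    entries: dict[str, str] = {}
    fname = digest = None
    for line in text.splitlines() + [""]:  # sentinel flushes last stanza
        if line.startswith("Filename: "):
            fname = line.split(": ", 1)[1]
        elif line.startswith("SHA256: "):
            digest = line.split(": ", 1)[1]
        elif not line.strip():
            if fname and digest:
                entries[fname] = digest
            fname = digest = None
    return entries

def verify_against_index(filename: str, payload: bytes, index: dict[str, str]) -> bool:
    """Recompute a downloaded package's SHA-256 and compare it
    with the digest recorded in the index."""
    return hashlib.sha256(payload).hexdigest() == index.get(filename)
```

The signature on the index authenticates the digests; reproducible builds add the separate guarantee that those digests can be independently re-derived from source.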