timbray,
@timbray@cosocial.ca avatar

1/2 Looking at one of the writeups, this caught my eye: “The release tarballs upstream publishes don't have the same code that GitHub has. This is common in C projects so that downstream consumers don't need to remember how to run autotools and autoconf.” Ah, GNU AutoHell, I remember it well. Tl;dr: With AutoHell, even if you're building for a 19-bit Multics variant from 1988, it’s got your back. Except that it’s just too hard to understand and use, hence the above.
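
A minimal sketch of what checking that claim could look like, assuming a clone of the upstream repo with git on PATH; the tarball path and tag name are placeholders, not the real xz artifacts. It hashes every file in a release tarball and compares against "git archive" of the matching tag, so files that exist only in the tarball (generated configure scripts, vendored m4 macros) are listed explicitly rather than trusted silently.

```python
#!/usr/bin/env python3
"""Rough sketch: diff a release tarball against the corresponding git tag.

Assumes `git` is on PATH and this runs inside a clone of the upstream repo.
The tarball path and tag passed on the command line are placeholders.
"""
import hashlib
import io
import subprocess
import sys
import tarfile


def file_hashes(tar: tarfile.TarFile, strip_top_dir: bool = True) -> dict[str, str]:
    """Map member path -> sha256 of its contents, skipping directories and symlinks."""
    hashes = {}
    for member in tar.getmembers():
        if not member.isfile():
            continue
        name = member.name.split("/", 1)[1] if strip_top_dir and "/" in member.name else member.name
        data = tar.extractfile(member).read()
        hashes[name] = hashlib.sha256(data).hexdigest()
    return hashes


def main(tarball_path: str, tag: str) -> None:
    # Release tarballs usually carry a top-level "project-x.y.z/" prefix.
    with tarfile.open(tarball_path) as tar:
        released = file_hashes(tar, strip_top_dir=True)

    # `git archive` reproduces exactly what is committed at the tag, no prefix.
    archive_bytes = subprocess.run(
        ["git", "archive", "--format=tar", tag],
        check=True, capture_output=True,
    ).stdout
    with tarfile.open(fileobj=io.BytesIO(archive_bytes)) as tar:
        tagged = file_hashes(tar, strip_top_dir=False)

    only_in_release = sorted(set(released) - set(tagged))
    changed = sorted(p for p in set(released) & set(tagged) if released[p] != tagged[p])

    print("Files only in the release tarball (e.g. generated configure, m4 macros):")
    for path in only_in_release:
        print("  +", path)
    print("Files whose contents differ from the tag:")
    for path in changed:
        print("  !", path)


if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])  # e.g. main("project-1.2.3.tar.gz", "v1.2.3")
```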

timbray,
@timbray@cosocial.ca avatar

2/2 Thus, another lesson. Don’t rely on build tools you don’t understand generally, and don’t rely on GNU AutoHell specifically.

[Yes, I understand this hack has many more moving parts, most much more sophisticated. But I didn't see anyone else saying this.]

yianiris,
@yianiris@kafeneio.social avatar

When you say GNU autohell, do you mean automake and autoconf?
Because it was the option NOT to use them that passed the code into the system.

@timbray

timbray,
@timbray@cosocial.ca avatar

@yianiris To be precise: It is their insane fragility and complexity that results in people accepting tarballs of their output.

luis_in_brief,
@luis_in_brief@social.coop avatar

@timbray it’s definitely out there; among others: https://mastodon.social/@kornel/112187783363254917

luis_in_brief,
@luis_in_brief@social.coop avatar

@timbray again, I have to wonder what a healthy GNU project might look like in this moment, dedicated to porting and revitalizing old code in a modern toolchain and language. It’d be an epic but worthy challenge that could inspire a community.

But alas the org can’t even get itself together to try :/

not2b,
@not2b@sfba.social avatar

@luis_in_brief @timbray It will have to wait until RMS is no longer active, because he can't abide not-invented-here, or replacement of software owned by the FSF with more widely dispersed ownership.

not2b,
@not2b@sfba.social avatar

@luis_in_brief @timbray That is a great piece, Luis. As someone who spent a decade on the GCC steering committee, which was mainly an effort to manage RMS from "below", I also have a lot of thoughts, and receipts. Maybe some day I will write something, but I got tired of the fights and just left.

luis_in_brief,
@luis_in_brief@social.coop avatar

@not2b @timbray totally reasonable response! My battles were mostly through/around GNOME but also skirmishes elsewhere. Thank you for your gcc service!

billjings,
@billjings@mastodon.online avatar

@timbray "Don’t rely on build tools you don’t understand generally"

😬

(thinking about gradle here)

acdha,
@acdha@code4lib.social avatar

@billjings @timbray I’m reminded of a Next.js project whose developers had a moment of terror when asked to reduce the render-blocking JavaScript below 4MB because nobody knew how the bundling process worked with enough confidence to commit.

josephholsten,
@josephholsten@mstdn.social avatar

@timbray One interesting thread about it is from @alanc of the X Windows and Solaris projects: https://fosstodon.org/@alanc/112181757523434254

jbqueru,
@jbqueru@fosstodon.org avatar

@timbray I'm also seeing surprisingly many lessons in there.

A partial list from me: Security through obscurity doesn't work. Everything is a security boundary, even build scripts and test files. Everything should be buildable from source and human-readable, even test files.

An indirect one that I remember sharply for having had to deal with it a lot in a past job: not everyone in an Open Source community has the same goals and priorities.

jbqueru,
@jbqueru@fosstodon.org avatar

@timbray Indirectly, Ken Thompson's paper comes to mind, as does a friend's PhD thesis on formally proving that the output of a compiler matched its input.

timbray,
@timbray@cosocial.ca avatar

@jbqueru I have personally never seen a Makefile.am or configure.ac that I would describe as “human-readable”.

jbqueru,
@jbqueru@fosstodon.org avatar

@timbray I sadly agree, and that's an issue. Ultimately, if such files aren't human-readable, we can't trust that the output of a build matches its source files... and that's what happened with xz.

lauren,
@lauren@mastodon.laurenweinstein.org avatar

@jbqueru @timbray I won't posit a "solution" at this time, but the core principles of "open source" come from a different time when, for all practical purposes, the attack vectors (and potential consequences) simply didn't exist. This is going to take more than technical changes alone to fix; in some ways the foundational "philosophy" of open source is unfortunately implicated.

jbqueru,
@jbqueru@fosstodon.org avatar

@lauren @timbray I believe that this is a pivotal moment, that there will be a before and an after. I agree that the issue here is mostly a human issue, and technical solutions can't fix the whole thing. At the same time, we need to realize that neither humans nor technology can be fully trusted, and that a strong solution will have to strengthen both sides.

klausfiend,
@klausfiend@dcerberus.com avatar

@lauren @jbqueru @timbray The gross assumption that private industry can build entire product lines on top of free software, give nothing back to the community, and then complain when bugs happen -- that definitely needs to change.

jbqueru,
@jbqueru@fosstodon.org avatar

@klausfiend @lauren @timbray Deep inside, it's the complaining that worries me (and that hurt me personally in the past).

Open Source might indeed imply "you can legally maintain the software you rely on without having to ask anything from the original authors and maintainers", but that comes with a counterpart: "this is provided as-is and you might be forced to maintain it yourself, so be prepared for that possibility."

We all rely on more software than we can maintain, and that's a major issue.

mark,
@mark@mastodon.fixermark.com avatar

@lauren @jbqueru @timbray

To be honest, I find it interesting that people are framing the entire thing as a failure of the process.

Somebody tried to maliciously backdoor an open source project; the problem was identified by a third party and communicated to the maintainer, the community was able to investigate because the entire project is open source, and the vulnerability was resolved quickly.

This sounds like a success story for the philosophy that "many eyes make all bugs shallow." And think of how expensive this attack was for how little it achieved: two years of active penetration work and social engineering, only to get busted within two minor versions?

I suspect no major sweeping ecosystem changes are necessary here, and perhaps the only cautionary tale is to remind everyone that xz is key security infrastructure and should be funded as such.

lauren,
@lauren@mastodon.laurenweinstein.org avatar

@mark @jbqueru @timbray However, the obvious question persists. Absent a reliable and formal process, how many of these may not have been detected in the past or even today -- and how many may remain undetected in the future?

kkeller,
@kkeller@curling.social avatar

@lauren @mark @jbqueru @timbray I wonder if any such process will have to fall on distro maintainers, especially those who sell support contracts. They are the ones who will bear much of the burden if one of these backdoors makes it into their distros.

jbqueru,
@jbqueru@fosstodon.org avatar

@kkeller @lauren @mark @timbray From a regulatory point of view, I believe that the EU's Cyber Resilience Act (CRA) will assign liability, which might end up falling onto such distro maintainers (or whichever is the upstream-most commercial non-person legal entity, IANAL).

mark,
@mark@mastodon.fixermark.com avatar

@jbqueru @kkeller @lauren @timbray Trying to hold people legally liable for how other people use software that was clearly licensed AS-IS will be a great way to kill publicly-provided free software.

Is the EU's goal to smother the open source ecosystem?

jbqueru,
@jbqueru@fosstodon.org avatar

@mark @kkeller @lauren @timbray For stuff that is both free-as-in-code and free-as-in-beer, I expect things to remain mostly unchanged (IANAL). Once you take away either of those two and you're not talking about a personal project or personal contributions, things will probably change indeed and there'll probably be some chilling effect (IANAL).

I'd be much more a fan of the CRA if it didn't contain a 5-year planned obsolescence - it only requires security updates for 5 years at most.

kkeller,
@kkeller@curling.social avatar

@mark @lauren @jbqueru @timbray I am curious why the exploit was found now. Is it possibly because distros were starting to move to 5.6, and therefore it was getting more testing by more people? Obviously the issue was found by one person, but perhaps that person wouldn't have found the exploit if it weren't getting ready to move to more widespread distribution.

jbqueru,
@jbqueru@fosstodon.org avatar

@kkeller @mark @lauren @timbray As I understand, now, because 5.6 was starting to see adoption in test channels of major distros. I haven't read through the code enough to figure out why it didn't or couldn't get caught earlier. I'd rather not speculate further, at least not in public.

tknarr,
@tknarr@mstdn.social avatar

@lauren @jbqueru @timbray I disagree. The same sort of long game can be played with proprietary software, with even less chance of detection. We'll never know if any were detected and removed, and any outside researchers who try to find them will face the software vendor's legal team trying to stop them. That's an even worse situation.

jbqueru,
@jbqueru@fosstodon.org avatar

@tknarr @lauren @timbray I think both of these can be true at the same time:
- It might be easier to infiltrate proprietary software than Open Source software.
- Open Source communities need to evolve their threat models and adapt accordingly.

And, in a scary Venn diagram kind of way, proprietary software companies probably also need to evolve their threat models, but might have a lot less ability to adapt accordingly.

tknarr,
@tknarr@mstdn.social avatar

@jbqueru @lauren @timbray I think open-source projects are going to adapt in a hurry. One obvious thing is making binary blobs verboten except under special circumstances where provenance can be verified. You need binary data for tests? Generate it as part of the tests. Another is obsoleting gibberish build scripts: if you need autotools scripts, generate them during the build.
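
A minimal sketch of that "generate it as part of the tests" idea, in Python rather than anything from the actual xz test suite; the fixture generator, seed, and sizes are made up for illustration. The test input is derived from readable code plus a fixed seed, so reviewers never have to trust an opaque committed blob, and a deliberately corrupted stream is checked to fail loudly.

```python
"""Sketch: generate binary test fixtures at test time instead of committing blobs.

The parameters below are illustrative, not taken from any real project.
"""
import lzma
import random
import unittest


def make_fixture(seed: int = 1234, size: int = 1 << 16) -> bytes:
    """Deterministically generate 'awkward' input: repetitive runs mixed with noise."""
    rng = random.Random(seed)
    chunks = []
    total = 0
    while total < size:
        if rng.random() < 0.5:
            chunk = bytes([rng.randrange(256)]) * rng.randrange(1, 512)
        else:
            chunk = rng.randbytes(rng.randrange(1, 512))
        chunks.append(chunk)
        total += len(chunk)
    return b"".join(chunks)[:size]


class RoundTripTest(unittest.TestCase):
    def test_compress_decompress_round_trip(self):
        data = make_fixture()
        compressed = lzma.compress(data)
        self.assertEqual(lzma.decompress(compressed), data)
        # A corrupted stream (footer overwritten) must fail loudly, not silently succeed.
        with self.assertRaises(lzma.LZMAError):
            lzma.decompress(compressed[:-8] + b"\x00" * 8)


if __name__ == "__main__":
    unittest.main()
```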

jbqueru, (edited )
@jbqueru@fosstodon.org avatar

@tknarr @lauren @timbray I expect that we'll see a deprecation of a lot of gibberish languages: when those proliferate in a project, it becomes harder to defend a project (you need expertise in all of those at the same time) than to attack it (you only need to know the oddities of a single one, like how a line with only a period in it terminates the script such that nothing after it gets executed).

(1/2)

jbqueru,
@jbqueru@fosstodon.org avatar

@tknarr @lauren @timbray Generally speaking, I think in 4 dimensions: Processes and Practices (which I cluster under People), and Technology and Tools (which I cluster under Technical). Having awareness of those four gives me a framework for the various options to approach a problem.

Not accepting binary blobs is a Practice thing, generating binary data is a Tools issue, eliminating gibberish is a Technology issue, etc...

(2/2)

robryk,
@robryk@qoto.org avatar

@jbqueru @timbray

Re test files, I don't think that is desirable, especially for parser-like or compressor-like projects. For anything that smells of parsing, fuzzer-generated regression tests have significant positive value (one can try to write regression tests by hand instead, but it's more work for, imo, an increased chance of getting it wrong), and fuzzer-generated example inputs to parsing have a very large value. Example files that were generated by weird tools, or that exercise weird ones, are also important test cases (as opposed to the previous ones, not just to assert lack of crashes, but to assert correct parsing).

I think that having a better split between building (which generates all non-test artifacts) and testing (which uses everything already built, generates test artifacts, and runs them) solves the same problem: it allows build environments to ensure that testing doesn't affect the output and that test files are not inspected by the build process. If done sufficiently well (which is admittedly hard to do in the current world), this can even allow test-only dependencies to not be visible to the build stage.
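
A minimal sketch of that build/test split, under assumptions that are mine: a hypothetical layout with src/ and tests/, a stand-in "build" step where a real project would compile, and unittest discovery for the test run. The build stage only ever sees a staging copy of the tree with tests/ removed, so binary fixtures cannot influence the shipped artifact; the test stage runs afterwards against the already-built output.

```python
"""Sketch: keep test files out of the build stage entirely.

Hypothetical project layout (src/ and tests/); the "build" here just copies
sources where a real project would run its compiler.
"""
import os
import shutil
import subprocess
import sys
import tempfile
from pathlib import Path


def build_stage(source_root: Path, work_dir: Path) -> Path:
    """Build from a copy of the tree that excludes tests/ entirely."""
    staging = work_dir / "staging"
    shutil.copytree(source_root, staging, ignore=shutil.ignore_patterns("tests"))
    # A real project would invoke its compiler here; the key point is that
    # the build only ever sees `staging`, never the test fixtures.
    artifact = work_dir / "artifact"
    shutil.copytree(staging / "src", artifact)
    return artifact


def test_stage(source_root: Path, artifact: Path) -> int:
    """Run the test suite as a separate step, against the already-built artifact."""
    env = {**os.environ, "ARTIFACT_DIR": str(artifact)}  # tests locate the artifact via this
    result = subprocess.run(
        [sys.executable, "-m", "unittest", "discover", "-s", str(source_root / "tests")],
        env=env,
    )
    return result.returncode


if __name__ == "__main__":
    root = Path(sys.argv[1])  # path to a checkout containing src/ and tests/
    with tempfile.TemporaryDirectory() as tmp:
        built = build_stage(root, Path(tmp))
        raise SystemExit(test_stage(root, built))
```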

jbqueru,
@jbqueru@fosstodon.org avatar

@robryk @timbray You and I are reaching the same conclusion: chances are, building and testing should be separate phases, separate enough that test files aren't available while building.

That should be especially true for large binary-only test files that can't be reviewed and have to be blindly trusted - being binary-only, those have no reason to be processed through a build pipeline.

timbray,
@timbray@cosocial.ca avatar

3/2 Oops forgot to mention the very decent write-up from which this came: https://gist.github.com/thesamesam/223949d5a074ebc3dce9ee78baad9e27

isotopp,
@isotopp@chaos.social avatar

@timbray if it is not Linux, it is broken. Unpopular opinion, I know. I am still right.

lobingera,
@lobingera@chaos.social avatar

@isotopp @timbray

I'd also like to raise the unpopular opinion that, in the wild, there is a spectrum of Linuces ...

Still, Kris is right.
