Debian Linux

Debian Discusses Vendoring -- Again (lwn.net)

Jake Edge, writing at LWN: The problems with "vendoring" in packages -- bundling dependencies rather than getting them from other packages -- seem to crop up frequently these days. We looked at Debian's concerns about packaging Kubernetes and its myriad of Go dependencies back in October. A more recent discussion in that distribution's community looks at another famously dependency-heavy ecosystem: JavaScript libraries from the npm repository. Even C-based ecosystems are not immune to the problem, as we saw with iproute2 and libbpf back in November; the discussion of vendoring seems likely to recur over the coming years. Many application projects, particularly those written in languages like JavaScript, PHP, and Go, tend to have a rather large pile of dependencies. These projects typically just download specific versions of the needed dependencies at build time. This works well for fast-moving projects using collections of fast-moving libraries and frameworks, but it works rather less well for traditional Linux distributions. So distribution projects have been trying to figure out how best to incorporate these types of applications.
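
For a sense of what that build-time downloading looks like in practice, an npm-based project declares its dependencies in a manifest and lets the build tool fetch them from the network; the fragment below is purely illustrative, with package names and versions chosen arbitrarily:

    {
      "name": "example-webapp",
      "dependencies": {
        "react": "17.0.1",
        "lodash": "4.17.20"
      }
    }

    npm install    # downloads everything listed above, plus its transitive dependencies, from the registry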

This time around, Raphael Hertzog raised the issue with regard to the Greenbone Security Assistant (gsa), which provides a web front-end to the OpenVAS vulnerability scanner (which is now known as Greenbone Vulnerability Management or gvm). "the version currently in Debian no longer works with the latest gvm so we have to update it to the latest upstream release... but the latest upstream release has significant changes, in particular it now relies on yarn or npm from the node ecosystem to download all the node modules that it needs (and there are many of them, and there's no way that we will package them individually). The Debian policy forbids download during the build so we can't run the upstream build system as is."

Hertzog suggested three possible solutions: collecting all of the dependencies into the Debian source package (though there would be problems creating the copyright file), moving the package to the contrib repository and adding a post-install step to download the dependencies, or removing gsa from Debian entirely. He is working on updating gsa as part of his work on Kali Linux, which is a Debian derivative that is focused on penetration testing and security auditing. Kali Linux does not have the same restrictions on downloading during builds that Debian has, so the Kali gsa package can simply use the upstream build process. He would prefer to keep gsa in Debian, "but there's only so much busy-work that I'm willing to do to achieve this goal". He wondered if it made more sense for Debian to consider relaxing its requirements. But Jonas Smedegaard offered another possible approach: analyzing what packages are needed by gsa and then either using existing Debian packages for those dependencies or creating new ones for those that are not available. Hertzog was convinced that wouldn't be done, but Smedegaard said that the JavaScript team is already working on that process for multiple projects.


Comments:
  • by brunes69 ( 86786 ) <slashdot@nOSpam.keirstead.org> on Wednesday January 13, 2021 @01:16PM (#60938826)
    "...but the latest upstream release has significant changes, in particular it now relies on yarn or npm from the node ecosystem to download all the node modules that it needs (and there are many of them, and there's no way that we will package them individually). The Debian policy forbids download during the build so we can't run the upstream build system as is...."

    Umm... have these folks never heard of a Local NPM registry [npmjs.com]? There are literally a dozen different solutions to choose from here. Do they think commercial vendors download all their NPM dependencies from the internet every time they build???
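
    For the curious, pointing npm at a local registry is a one-line configuration change; a minimal sketch, assuming a Verdaccio instance running on its default port (the registry software and URL are assumptions, not something from the story):

        # .npmrc in the project (or per-user)
        registry=http://localhost:4873/

        # or, equivalently
        npm config set registry http://localhost:4873/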

    • by Eric Sharkey ( 1717 ) <sharkey@lisaneric.org> on Wednesday January 13, 2021 @01:48PM (#60938958)

      For a long time, Debian has been striving for Reproducible Builds [lwn.net], where building a source package produces the same output every time. Even with a local NPM registry, it would be nearly impossible to keep track of what was in the registry at the time of doing the build unless that content is actually in the source package itself.
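
      Concretely, "reproducible" means that two independent builds of the same source package yield bit-for-bit identical artifacts, which is verified by comparing hashes; a sketch with hypothetical file names (diffoscope is the Reproducible Builds project's tool for explaining any differences):

          sha256sum first-build/gsa_21.4-1_amd64.deb second-build/gsa_21.4-1_amd64.deb
          diffoscope first-build/gsa_21.4-1_amd64.deb second-build/gsa_21.4-1_amd64.deb   # only needed when the hashes differ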

      • It shouldn't be hard at all, because every package should be digitally signed and it should never change. If it changes, you have bigger security related problems.
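
        For what it's worth, npm's lockfile already pins content: it records a checksum (not a signature, strictly speaking) for every resolved dependency, along these lines (hash elided, module chosen as an example):

            "node_modules/left-pad": {
              "version": "1.3.0",
              "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
              "integrity": "sha512-..."
            }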

        • by tkotz ( 3646593 )

          Debian already maintains a package management infrastructure.

          Maybe the solution is to lean even harder into this concept with something like a fake_npm package: an npm that just wraps dpkg and checks whether the deb containing the requested module is present. If it isn't, it could error out, telling the user that the package they are building is missing a build dependency on "whatever". This way the build process doesn't need to download anything, but it looks the same to the build commands (a rough sketch of the idea follows below).

          Then ideal
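
          A rough sketch of that wrapper idea, entirely hypothetical -- the shim name, the module-to-package mapping, and the single handled command are assumptions, not an existing Debian tool:

              #!/usr/bin/env python3
              # fake_npm: hypothetical npm stand-in that never downloads anything.
              # It only checks whether each requested module is already provided by
              # an installed Debian package (assuming the usual "node-<module>" naming).
              import subprocess
              import sys

              def deb_name(module):
                  # Assumed mapping; actual Debian package names can differ.
                  return "node-" + module.lower().lstrip("@").replace("/", "-")

              def installed(pkg):
                  # `dpkg -s` exits non-zero when the package is not installed.
                  return subprocess.run(["dpkg", "-s", pkg],
                                        stdout=subprocess.DEVNULL,
                                        stderr=subprocess.DEVNULL).returncode == 0

              if len(sys.argv) < 3 or sys.argv[1] != "install":
                  sys.exit("fake_npm: only 'install <module>...' is handled in this sketch")
              missing = [m for m in sys.argv[2:] if not installed(deb_name(m))]
              for m in missing:
                  print("fake_npm: missing build dependency on " + deb_name(m), file=sys.stderr)
              sys.exit(1 if missing else 0)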

  • by skids ( 119237 ) on Wednesday January 13, 2021 @01:50PM (#60938964) Homepage

    Many, many years ago I had a little fun making an extension to automatically convert CPAN Perl projects into installable packages. There was adequate metadata in CPAN to do a pretty good job of it. The current package managers for the Java/JS/C# ecosystems generally astound me with the amount of cruft in their packaging systems... It wouldn't surprise me if they were missing some information needed to make that possible, but I'd suspect they have all the bases covered there.

    • by WallyL ( 4154209 )
      Do you have a copy available to the public under a free-software-friendly license? I like to use system-level packages for everything, but I don't know how to make them for things like Perl or Python.
      • by skids ( 119237 )

        So crusty it's likely useless at this point. It looks like more proficient coders subsequently took the same route with dh-make-perl [metacpan.org].
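
        If memory serves, the usual invocation builds a .deb straight from a CPAN distribution; the module name below is just a placeholder:

            dh-make-perl --build --cpan Foo::Bar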

      • by Gyles ( 87774 )

        The fpm gem can do these sorts of conversions:
        https://fpm.readthedocs.io/en/latest/
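
        For example, fpm can take an npm module as its source and emit a Debian package (module name chosen arbitrarily; see the docs above for the full list of source and target types):

            fpm -s npm -t deb express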

  • by vanyel ( 28049 ) on Wednesday January 13, 2021 @02:13PM (#60939058) Journal

    Seems to me all too many projects, particularly websites, fetch resources from third parties on the fly, making them susceptible to compromise through components they have given up control over, whether the changes are security-related or just gratuitous. Keeping the pieces you know work local makes everything much more stable.

  • Debian is not a cutting-edge distro. It's not supposed to be fast-moving or on the bleeding edge of development. If you want to package your latest and greatest shiny, that's exactly what containers are for. Users who want all their packages like that shouldn't use Debian.

    • by tkotz ( 3646593 )

      The problem isn't with keeping the absolute latest version of the package in there. The problem is that these multiple dependency sources are preventing packages from ever getting in at all. As more and more software is built on massive dependency frameworks like Electron, Debian needs to come up with a solution so that it can carry even an older version of programs like, say, Electron.

  • by BAReFO0t ( 6240524 ) on Wednesday January 13, 2021 @03:45PM (#60939476)

    It is the cancer of seeing software as applications: monolithic fortresses that do not cooperate and do not share. All because they got the business model wrong (development is a service; software is not a "property"), and that model now depends on not sharing with others so that the blackmailing works.

    For example, it is ridiculous that I can't use the Photoshop tools in Word, or vice versa. If the data format fits, it should work, no questions asked.
    This is one reason for Unix's "do one thing and do it right" and "everything is a file".

    This is also why frameworks are part of that cancer.
    And container formats like Snap. (Dependency hell is a solved problem, boys and girls. Portage has no problem installing more than one version of a library in parallel and having software use the right version.)

    And developers that grew up in a world like that now pour into open source software, thinking they can take their 'culture' with them.
    Just look at how Steam is implemented. There is no Linux version of Steam. It ignores pretty much every rule there is. File system, place of files, package manager, hell it doesn't even use Linux facilities but runs on Wine. It is still 100% Windows. Grafted onto Linux like a truck onto a human cheek.

    I have hope that this will change though, even though you can't teach them. IFF we stay strong in the face of their clueless cockiness (yes, hello Poettering!), and they get older and hurt themselves with their bad choices a time or two, and learn from it by themselves.

    As a wise man once said: Those who do not understand Unix are doomed to re-invent it. Badly. In two or three decades. After much bickering and the arrogant, cocky behavior typical of youth. ... ;)

    • To be fair, Windows has rather regressed on this front. A long time ago the vision was that embeddable COM objects would provide just that kind of editable content inside compound documents. WinFS https://en.wikipedia.org/wiki/WinFS [wikipedia.org] was meant to be the next step: a relational filesystem that allowed documents to be formed from related data, with any application that understood a particular schema able to edit that part.
  • In my opinion, part of the problem is that some modern developers seem to have dropped the concept of a release. There is no more versioning; you are advised to pick up the latest code from GitHub.

    That makes it a nightmare to integrate software into package systems, since you never know whether today's latest will work correctly with the different components that depend on it.

  • I'm impressed at what a comprehensive and comprehensible summary was provided for this article.

  • It sounds like it's time to think about dpkg having overlays that can exist inside containers - OpenVZ or whatever. And then some automagic packaging tools to build a hairy mess out of npm dependencies or whatever. Apt would need to know how to descend and manage the tree. Post-Bullseye, at least.

    What we don't want is to have application vendors be responsible for security updates on all of their libraries. It just doesn't work. Like 99% of the time that strategy fails.

    Apologies for the very few consci

  • by dltaylor ( 7510 ) on Thursday January 14, 2021 @02:37AM (#60941972)

    I used to argue that all web designers and implementers should live at the end of a 9600 bps (later, 56k) pipe from their test systems. Most of their users had to do that at the time, and it would have made it obvious that their cruft was making life miserable for those users.

    These days, far too many of the clowns do not take into account real-world deployment of their "northbound end of a southbound mule" product. Nothing should be acceptable for a release (a development branch is fine) unless it can be delivered as an appropriate package on at least Red Hat and Debian. That doesn't help us BSD folks, but it would be a start.
