Debian GNU is Not Unix Open Source Linux

Glibc Steering Committee Dissolves; Switches To Co-Operative Development Model 102

Posted by timothy
from the part-of-an-autonomous-collective dept.
First time accepted submitter bheading writes "Following years under controversial leadership which, among other things, led to a fork (which was in turn adopted by some of the major distributions), the glibc development process has been reinvented to follow a slightly more informal, community-based model. Here's hoping glibc benefits from a welcome dose of pragmatism."
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward on Saturday March 31, 2012 @09:55AM (#39533757)

    Monolithic libraries are the way to go. They make software development much easier.

    If you don't believe me, just look at the GNOME project. The last time I tried to build it from scratch by hand, there must have been at least 50 libraries I had to build first. That was several years ago, so there are probably many more that are needed now. Those were just libraries from the GNOME project, too! That's not including glibc, the many X libraries, Gtk+, and so forth! Don't forget that you'll need to start making sure you're using the right versions, too. Some of these libraries are released yearly, while others have a new release every week.

    To realistically build something like GNOME, where they went absolutely stupid with unnecessary modularity, you need to use one of the scripts that are out there that'll do it all for you. Those scripts end up being a solution to a problem that shouldn't exist in the first place! They're only needed because what should be one monolithic library was split out into 60 smaller libraries. You'll need all of the libraries to get even a basic GNOME installation up and running, so there's no point in separating them.

    It's not the 1980s any longer. We don't statically link everything using dumb linkers that can't strip out unused executable code. Modern OSes use dynamic linking and delayed loading, so only the parts of a library that are actually called ever get loaded.

  • It's not a fork (Score:4, Informative)

    by aglider (2435074) on Saturday March 31, 2012 @10:00AM (#39533771) Homepage

    It's clearly written in the very first FAQ [eglibc.org]:

    EGLIBC is not meant to be a fork of GLIBC.

  • The icon is wrong (Score:5, Informative)

    by phoxix (161744) on Saturday March 31, 2012 @10:07AM (#39533807)

    It should be GNU, not Debian. Glibc is very much a GNU project. How do people not know glibc is a GNU project?

  • by TheRaven64 (641858) on Saturday March 31, 2012 @10:35AM (#39533929) Journal

    I don't have exact numbers here, but Ulrich has had a massive role in producing what may very well be the most widely used C library ever

    Is this 'may be' as in the Wikipedia definition, meaning 'isn't'? FreeBSD libc - not GNU libc - is the basis of both the Android and OS X libc implementations, and either of those alone likely dwarfs glibc's install count. And that's before you get to the Microsoft one (which is crap, but is installed on every Windows system).

  • by TheRaven64 (641858) on Saturday March 31, 2012 @11:24AM (#39534253) Journal

    Not that I have some kind of allegiance to glibc, but what do all those Linux-based GPS units, routers, TVs, etc. use, not to mention all those servers?

    Most embedded users don't use glibc either, they use something like newlib or uclibc, depending on the resource constraints.

