Fedora Holds Summit To Map Its Future

lisah writes "Last month members of the Fedora community met for a three-day summit (wiki here) designed to chart a course for future version releases as well as to plan other Fedora projects. Team members say they want to leverage the enthusiasm of a community that has demonstrated a willingness to develop Fedora Extras (add-on features to the Core package) and support Fedora Legacy (past releases). Red Hat's community development manager, Greg DeKoenigsberg, said, 'Community contributors have proven conclusively over the past 18 months that they can build packages every bit as well as Red Hat engineers — better, in some cases.' In addition to creating several proposals that will be introduced to the community for input and feedback, the summit also gave rise to the newly-created position of Fedora Infrastructure Leader." Linux.com and Slashdot are both owned by OSTG.
  • by slapys ( 993739 ) on Tuesday December 19, 2006 @05:59PM (#17305838)
    Fedora was the first Linux operating system I ever used. This applies to the majority of my Linux-using friends as well. Perhaps this is because people already know the name of Red Hat, and discover Fedora as a result. In any case, the quality of Fedora is significant because it determines the first impression of Linux for many people. Even though I have switched distributions, it is possible that I might have stopped using Linux had I concluded that Fedora was of too poor quality to use on a daily basis.
  • by Intron ( 870560 ) on Tuesday December 19, 2006 @06:09PM (#17305984)
    All of the planning described in the article seemed to be oriented toward how best to support developers. I didn't see anything about end-user goals.
  • by namityadav ( 989838 ) on Tuesday December 19, 2006 @06:15PM (#17306048)
    I think the first objective for all the open source teams should be to stop duplication. A lot of our resources are wasted in porting features from other applications and (even worse) reimplementing features in different applications because of underlying differences. I know that one of the strengths of open source is having "choices", but some of these choices are just plain silly. I am not asking for these choices to go away completely, but there should at least be some sort of coherence between the different alternatives. (They already have some coherence, thanks to the kernel, but we need to see a lot more of the same in higher-level applications too.)

    Imagine how much more work could be done on a package manager if every distro were using the same one. Imagine how good OpenOffice and KOffice could have been if there were not 200 other open source alternatives. I am glad to hear about efforts to unify KDE and GNOME. We need to focus on something similar for a lot of other applications too. And this should be one of the top priorities for the Red Hat, Novell, and Ubuntu/Debian teams.
  • by RAMMS+EIN ( 578166 ) on Tuesday December 19, 2006 @06:59PM (#17306606) Homepage Journal
    ``Not really. The developers are the guys who write the code, and the users are the ones who bitch about it. Same as any other piece of software.''

    While that's true to an extent, there are two things that make open source software different from the norm:

    1. Many developers write the software for their own use (rather than for money)
    2. Users can and do change the software to better suit them

    This is what blurs the line between developers and users. Of course, both of these are also reasons why developers can and do ignore users' requests, and get away with it.
  • by Junta ( 36770 ) on Tuesday December 19, 2006 @07:08PM (#17306742)
    Forks/duplications of effort can have negative repercussions, but they are not without reason. A fork reflects a difference of opinion on how to proceed. Duplication of work occurs on similar goals, but one of two things happens: either the reason behind the fork was not really popular, or not sufficiently different, to persuade the userbase, and the fork dies; or the cause for the work was justified and the fork lives on or overtakes the original.

    One can probably point out tons and tons of failed forks (I believe MPlayer has had a few unsuccessful forking attempts). They happen all the time.

    A shining example of a 'fork'-like endeavor coexisting with the original is Debian and Ubuntu. Ubuntu has a set of technical and marketing goals that didn't mesh perfectly with Debian's. Ubuntu was justified and the community has largely accepted it. Meanwhile, Debian has not really lost much of its userbase (most Ubuntu users come from RPM-based distros rather than Debian), because the concepts Debian holds as important still matter.

    And sometimes forks reflect the need to meaningfully continue a project that has, for all intents and purposes, lost touch. Xorg is a fork of XFree86 that has effectively killed off the original. They still twitch, but they've even taken down their ultimately embarrassing list of distros that still supported them (generally by not having updated yet rather than by a conscious decision about the future). The breaking point was a licensing technicality, but it's clear that XFree86 had technical problems as well in adopting new graphical features.

    Hell, Linux itself is spiritually (not technically) a fork of Minix. The basic point is simple: once established, projects by and large tend not to do revolutionary new things, as the people at the head are heading basically where they meant to go. Forking is a logical way for revolutionary change to happen, and the userbase decides the fate of both the original and the new.
  • by schwaang ( 667808 ) on Tuesday December 19, 2006 @07:20PM (#17306884)
    I think Red Hat is still working out how to allow real community involvement while still keeping control. And they seem to be making progress. If they get the balance right, maybe they'll end up with more people on their boards who will take users' needs into consideration more naturally.

    Transparency needed to come first, and that's way better now. Fedora's governance was non-obvious, with a different Leader of the Week handing down Red Hat fiats. Now they seem to be consciously trying to expose more of the decision making process, and the leadership team seems more stable and active. This is all to the good.

    I'd still like to see more voices on the advisory board that take the user point of view. You'll get some swinging dick who says "Hey let's just track all Fedora users so we know how many there are, and who cares if some people whine about privacy." And nobody is there to say: "Whoa there cowboy, we're not Microsoft yet."

    But if they're moving towards more open governance as it appears, I think they'll end up hearing out their users' concerns more as a consequence.
  • by Anonymous Coward on Tuesday December 19, 2006 @07:37PM (#17307086)
    You are, of course, assuming that the people who duplicate work are programming for others and not themselves. And that would be wrong. Programmers don't volunteer their efforts as a result of some arbitrary higher calling -- they do it because it benefits them in some way. This is not a bad thing; it is a very good thing. This is why all the good things in the world exist.

    They may value the learning experience or the skills they develop. They may value the recognition. They may value the experience of being involved in a volunteer group effort. They may value the fact that some annoying bug is finally fixed. Perhaps they simply value the feeling they get by helping out.

    So let's get to the point. The question is not "why aren't these programmers working on some other project instead of duplicating effort?" The question is "why should these programmers work on some other project valued by others (or some arbitrary group such as majority opinion) instead of the project they value for themselves?"

    I think we all know the answer: Because programming for YOUR project isn't what makes them happy, and making themselves happy, in whatever form that may take, is exactly why they program.
  • by psykocrime ( 61037 ) <mindcrime@cpphac ... k minus language> on Tuesday December 19, 2006 @09:31PM (#17308126) Homepage Journal
    Red Hat is important in only one way, from what I can see: they make Linux a commercial venture. Other than SCO, I don't think anybody has done a worse job from that perspective, either. Ximian, eventually bought by Novell, at least contributed to the development of Evolution and other GNOME software. Corel got into the Office for Linux market at a time when the biggest complaint about Linux was that there were no good applications available. IBM has contributed to the idea of commercial Linux more than anyone I can think of, both in terms of GPL-ed contributions to the codebase, and as a vendor promoting Linux-based solutions. Red Hat has been a purely profit-based venture, sacrificing the quality of the free distribution to make a few extra bucks.

    Right, because Red Hat has never contributed anything to the community...

    Fedora isn't perfect, and RH did - IMO - do a poor job of transitioning from the "old" RHL series to Fedora, but to suggest that they don't contribute anything to Linux and OSS is just ridiculous.
  • by jd ( 1658 ) <{moc.oohay} {ta} {kapimi}> on Wednesday December 20, 2006 @01:04AM (#17309348) Homepage Journal
    I cut my teeth on those and have the fillings to prove it. :) Seriously, I've mucked with .spec files and written a few. The documentation isn't so much poor as virtually non-existent, and you'll get far more information from grabbing good, working pre-existing ones. If you want anything compiled with the -march=pentium4 flag (which may or may not be a Good Thing, depending on the program) or to use optional libraries that need to be compiled in if present, then this is the way to do it. It's slower and more tedious to pick through than a simple ./configure script, but you can do it.

    Slower? Yeah - you don't know if any of those patches touch the configure options, so you've got to get part-way into an RPM build, break out, find the source directory, find the options, go back to the SPEC directory, find the .spec file, find the call to configure, rebuild, realize that that fixed the wrong architecture, go back, fix the right architecture's call, rebuild, realize your new dependencies aren't correctly reflected, go back, fix the dependency list for the options you want, rebuild, discover that the compiler doesn't include something that is needed, repeat all of the above for the GCC compiler, then rebuild the package for real.
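    For illustration, the kind of tweak described above typically lands in the %build section of the .spec file. A minimal, hypothetical fragment (the package name, flags, and --with option are made up, not taken from any real package) might look like this:

```spec
# example.spec -- illustrative fragment only
Name:           example
Version:        1.0
Release:        1
Summary:        Example package
License:        GPL
Source0:        example-1.0.tar.gz

%description
Illustrative package.

%build
# Append a CPU-specific flag to the distro's default optimization flags
export CFLAGS="$RPM_OPT_FLAGS -march=pentium4"
# Enable a hypothetical optional library at configure time
./configure --prefix=/usr --with-optional-lib
make

%install
make DESTDIR=$RPM_BUILD_ROOT install
```

    Once the flags live in the .spec, a plain `rpmbuild -ba example.spec` rebuild picks them up, rather than repeating the dig-and-retry loop described above; the catch, as noted, is first finding where in the existing .spec (and its patches) the configure call actually happens.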

    Sure, I went through dependency hell with tarballs. The "golden era" was more brass-plated than gold. The number of problems was probably comparable; the only package I ever recall swearing at to this degree was X11R4. (Do you know how long that takes to build on a 386SX-16? Do you know what it is like to build the entire distribution tree, only to discover that, due to some obscene/obscure bug on the Linux architecture, random portions will mis-configure, mis-compile, barf on GCC, or implode except when run at a non-existent resolution that causes the monitor to give a high-pitched scream and run down the street?)

    Nonetheless, with the exception of X, most problems were quick to discover and quick to fix. (In fact, I have yet to get X to compile correctly with any serious platform-specific optimizations. I won't forgive the Berlin/Fresco group for abandoning their alternative GUI.) The same cannot be said of exactly the same programs managed through .spec files and SRPMs, as there is way too much detail in too many different locations within the .spec file, within the patches, within the build system itself and within interactions with any quirks thrown up by already-installed RPMs. Too many unknown variables and no clean way of finding out what they are.
