GNU Make 4.3 Speeds Up Linux Kernel Builds, Debugger/Profiler Fork Released (phoronix.com)

Linus Torvalds himself "changed around the kernel's pipe code to use exclusive waits when reading or writing," reports Phoronix.

"While this doesn't mean much for traditional/common piping of data, the GNU Make job-server is a big benefactor as it relies upon a pipe for limiting the parallelism" -- especially on high-core-count CPUs.

This drew an interesting follow-up from Slashdot reader rockyb, who was wondering if anyone could verify that GNU Make 4.3 speeds up build times: I updated and released a fork of GNU Make called remake, which includes hooks to profile a build and has a complete debugger in it (although most of the time the better tracing that is in there is enough).

The most recent version has a feature, though, that I really like and use a lot: an option to look in parent directories for a Makefile if none is found in the current directory.

You can download the source code from either GitHub or SourceForge. Both have the full release notes.

Sorry, at the time of this writing no packagers have picked up the newest release. Repology has a list of packages for older versions though.
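For the curious, basic usage looks roughly like this. The --tasks and --profile options are the ones remake documents; the debugger flag spelling below is an assumption, so check remake --help:

    $ remake --tasks        # list "interesting" targets, an idea borrowed from rake
    $ remake --profile      # run the build while recording how long each target takes
    $ remake --debugger     # drop into the built-in debugger instead of just building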

  • Is that like the time Stephen Hawking himself degaussed the Internet so that Jen Barber could use it in her presentation?

  • I think it's too late for make, autotools, etc. anyway. A lot of projects are migrating to either cmake or meson and can use ninja instead of make.

    Perhaps these improvements will be good for finally building software written in the '70s, '80s and '90s.

    Most modern software projects don't use build environments that require make.

    Basically because cmake, meson, etc are simply flat out a trillion times better. It's not even a competition.

    • I think it's too late for make, autotools, etc. anyway. A lot of projects are migrating to either cmake or meson and can use ninja instead of make.

      Perhaps these improvements will be good for finally building software written in the '70s, '80s and '90s.

      Most modern software projects don't use build environments that require make.

      Basically because cmake, meson, etc are simply flat out a trillion times better. It's not even a competition.

      Maybe you missed it but it specifically says "Kernel" in the headline.

      • by freax ( 80371 )

        Yes, but the Linux kernel isn't the only project in the world using GNU make. The Linux kernel is just the example this article uses to illustrate what the improvements are about and what the gain is.

    • by Zelig ( 73519 ) on Saturday March 21, 2020 @03:34PM (#59857338) Homepage

      You've got a pretty low number; that suggests you've been around a bit. Can you make a succinct statement of why you think Cmake is superior? The handwaving made me want to dismiss you as a young enthusiast, but it's possible you've got a case in mind.

      I see cmake as one of a procession of local build systems ginned up by some subcommunity. A decade ago it was rake that was going to take over, and Maven before that. They grow really quickly in the environment where they can ignore the rest of the world, and then they senesce at the edges of their country, when it becomes necessary to coordinate with other ways of doing things. In the meantime, plain old GNU make continues to slowly, carefully increase its formal expressive power. I particularly liked the multi-result target in 4.3.
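      For reference, the 4.3 feature I mean is the new "&:" grouped-target syntax. A minimal sketch, with the bison rule purely as an illustration:

          # Both files come from a single run of the recipe, so a parallel
          # build won't invoke bison twice.
          parser.c parser.h &: parser.y
          	bison -d -o parser.c parser.y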

      • by freax ( 80371 )

        You've got a pretty low number; that suggests you've been around a bit. Can you make a succinct statement of why you think Cmake is superior? The handwaving made me want to dismiss you as a young enthusiast, but it's possible you've got a case in mind.

        As a (freelance) software developer I've seen quite a few companies' projects. And unlike the ones you just mentioned, CMake is being used increasingly. Also at the more conservative places (that are developing software as if they were still in the '80s).

        Sure, GNU make is still widely in use. But for new things, it's usually CMake that is being used. And in the build environment, whenever possible, ninja as the build tool, with CMake used to generate the ninja files.

        • The question:

          Can you make a succinct statement of why you think Cmake is superior?

          However, the response:

          As a (freelance) software developer I've seen quite a few companies' projects. And unlike the ones you just mentioned, CMake is being used increasingly.

          I don't think that answers the question. It may be used more frequently, but what would make it better? I've not used either for some time, but a while ago I had to construct a makefile meant for several wildly different hardware versions, not just flavors of x86.

          • by freax ( 80371 )

            Since CMake can be used to generate Makefiles instead of ninja files, you can naturally do everything CMake does with GNU make directly. It's just going to be a lot more work.

            CMake makes it all much easier and more standardized. It provides infrastructure to generate the package-config files, set up dependencies, etc. Things you also have with autotools, of course, but in a more modern shape, with fewer strange things like having to hack m4 scripts all over the place.

            Additionally can CMake instead of u
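            A hypothetical minimal CMakeLists.txt along those lines (project and library names are made up):

                cmake_minimum_required(VERSION 3.16)
                project(demo C)
                # dependency lookup via pkg-config, no hand-rolled m4 needed
                find_package(PkgConfig REQUIRED)
                pkg_check_modules(GLIB REQUIRED IMPORTED_TARGET glib-2.0)
                add_executable(demo main.c)
                target_link_libraries(demo PRIVATE PkgConfig::GLIB)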

      • That suggests you've been around a bit. Can you make a succinct statement of why you think Cmake is superior?

        It's not that CMake is superior, but if you want to target Windows and Unix, what else are you going to use? It passes the incredibly low bar of being the best system that does the job. That's like being the least smelly turd in the park, but if you absolutely have to have a turd, you'd rather it stank less.

        Hyperbole, yes; programmers love to arrogantly crap on everyone else's code, and I'm no exception.

    • The thing is, I can easily write a makefile without looking up documentation. It has the perfect balance between ease of use, syntactic clarity and flexibility (pattern-matching rules). Sure, it could be faster. Ninja is fast, but its grammar is explicit and you have to list every single file. Other build tools are focused around specific languages and project structures. Make has some rules built in for C/C++, but it can build anything that results in a file and has chained dependencies.
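      For example, two lines plus make's built-in %.o: %.c rule already give you a working, parallelizable C build (file names made up):

          prog: main.o util.o parser.o
          	$(CC) $(LDFLAGS) -o $@ $^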
      • by freax ( 80371 )

        That's why you use CMake or meson to set up a 'project structure' environment (with kinds of targets, dependencies between them, package-config files, etc.) and let it generate the ninja files. Then use ninja to build.

        Similar to how, back in the day, you used autotools to generate the configure script and Makefiles from configure.ac and Makefile.am.
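        In practice that's just (paths illustrative):

            cmake -G Ninja -S . -B build                  # generate build.ninja from CMakeLists.txt
            ninja -C build                                # do the actual build
            cmake -G "Unix Makefiles" -S . -B build-make  # or generate Makefiles instead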

        • Yes, but try making a csv from a pcap dump using cmake in under 1 minute. I've even used simple makefiles to build dependencies that themselves build with cmake/ninja. It's just the right tool for the job for a lot of things. I've made complex cmake builds and I've used ninja as the backend, but I still can't get over make's simple and clear syntax. So often more complicated tasks are infuriatingly difficult with build systems that impose an array of assumptions on your project structure. Unless you're interested
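          Something like this is the kind of rule I mean (the tshark field list is just an illustration):

              %.csv: %.pcap
              	tshark -r $< -T fields -E separator=, -e frame.time -e ip.src -e ip.dst > $@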
          • by freax ( 80371 )

            OK yes, but making csv files from pcap dumps also isn't really a typical task for makefiles. That sounds more like something you do with Python, Perl or some other scripting language.

            IMO make (and cmake) is for building source code into binaries. And as so often, you can use it for all sorts of things: making a Tetris game, launching a rocket or booting a Linux distribution (wasn't Upstart like that?). But then again, make is still mostly for and about building stuff.

    • That would be strange, since cmake creates makefiles on Linux.
    • > Basically because cmake, meson, etc are simply flat out a trillion times better. It's not even a competition.

      Um, cmake on Linux defaults to generating... you guessed it, Unix Makefiles.

      Can't speak for ninja, meson, etc., but the reality is they probably aren't flat-out better. Honestly, after experiencing some other 'alternative' build systems on projects, I'm convinced that most of them are just flavor-of-the-day programs written by people who don't understand Make and are doomed to re-implement it badly.

    • This, absolutely. In my latest project I am using cmake and ninja; it's somewhat easy to write a CMake file. At first I generated "Unix Makefiles" but now I generate Ninja files. My CMakeLists.txt generates ninja files in Visual Studio on Windows, and ninja files on an Ubuntu VM.

  • by rockyb ( 6698930 ) on Saturday March 21, 2020 @05:03PM (#59857552)

    Many thanks to EditorDavid, who greatly improved this, and to Slashdot for posting.

    I knew when I submitted this that I'd get a lot of negative comments.

    As someone who has spent a large part of his career working on tools [slashdot.org] to [readthedocs.io] improve [github.com] programming [pypi.org] environments, I have been hounded by Erinyes [theoi.com] for doing such things. There are two furies: "You shouldn't use" and "Why are you wasting your time on."

    A long, long time ago, before DevOps and when Systems Administrators were in great demand, I encountered the "You shouldn't use" fury when I asked about some different behavior of the Python debugger. A high-ranking chat poster told me that I shouldn't do any debugging. In all of his years of Python coding as a contractor he had never needed a debugger.

    He lived in another world: he mostly wrote code and apparently didn't have to deal with anyone else's code.

    The world I lived in was as a Systems Administrator in the trenches. We had lots of buggy code that we didn't write, and there was no documentation and no tests. Yet we were expected to make sure it ran okay. In this world, if you couldn't catch the applications programmer's fault, it was your own.

    I don't want to get into a flame war over build systems. I use different build systems, too, depending on the context. In fact, the --tasks [readthedocs.io] option in remake [sf.net] comes directly from rake [github.com]. The most recent version added an option to scan parent directories [readthedocs.io] when no Makefile is found in the current directory. That too comes from ideas in other programming environments.

    If you are starting a project, use whatever you think best, whether GNU Make or not. As the title pegs this specifically to GNU/Linux Kernel builds, rather than waste your time here with the small stuff, take this up with the GNU/Linux Kernel builders. I think, though, that there are a number of people who, for whatever reason, will have to build the GNU/Linux Kernel and may run into a problem. For them, since remake aims to be compatible with GNU Make (it just has additional options which are invalid in GNU Make), it might help them understand what's going on or help them debug a problem.

    • As someone who has spent a large part of his career working on tools to improve programming environments, I have been hounded by Erinyes for doing such things. There are two furies: "You shouldn't use" and "Why are you wasting your time on."

      I've encountered that before. A lot of people assume their view of the world is complete and if you're not doing things the way they are then you're an idiot. There cannot possibly be a valid reason why you're doing things differently because that would mean something t

      • Debuggers. I first encountered them at a place where they were hardware debuggers: a piece of hardware with a probe that fit into the CPU slot. Later these became known as ICEs (In-Circuit Emulators, a name I somehow never got used to). But then I encountered software versions, the GNU debugger (gdb). Either way I hated those things. The best thing about them was that they provided motivation to get your code right in the first place so you wouldn't have to use them.

        Printfs in code were something different.

      • by rockyb ( 6698930 )

        I'd also say if you're worrying about performance of the build system...

        There may be a misunderstanding here. While the profiling is done entirely inside remake, what it is profiling is the build time itself. Let's not get into yet another tangent on what you think of the GNU/Linux Kernel size and how it should have been done.

        Just accept that there are programs that are big and complex (compilers and web browsers are often like this too) and take a lot of time to build from source. Profiling in the build system is just offering a window into how long the components take to

        • Let's not get into yet another tangent on what you think of the GNU/Linux Kernel size and how it should have been done.

          Well, I guess then we really need to get into a tangent on it initially because that's required before we have "another" one.

          And now you're quoting out of context:

          I'd also say if you're worrying about performance of the build system when you're starting out

          is very clearly different from what you quoted:

          I'd also say if you're worrying about performance of the build system...

          You have made what

          • by rockyb ( 6698930 )

            OK, then the misunderstanding is mine. Just wanted to make clear, though, that the --profile [readthedocs.io] option in remake is about build times, not about profiling the build software remake in isolation.
