
Ubuntu 9.04 Daily Build Boots In 21.4 Seconds 654

Pizzutz writes "Softpedia reports that Ubuntu 9.04 boots in 21.4 seconds using the current daily build and the newly supported EXT4 file system. From the article: 'There are only two days left until the third Alpha version of the upcoming Ubuntu 9.04 (Jaunty Jackalope) will be available (for testing), and... we couldn't resist the temptation to take the current daily build for a test drive, before our usual screenshot tour, and taste the "sweetness" of that evolutionary EXT4 Linux filesystem. Announced on Christmas Eve, the EXT4 filesystem is now declared stable and it is distributed with version 2.6.28 of the Linux kernel and later. However, the good news is that the EXT4 filesystem was implemented in the upcoming Ubuntu 9.04 Alpha 3 a couple of days ago and it will be available in the Ubuntu Installer, if you choose manual partitioning.' I guess it's finally time to reformat my /home partition..."

  • by alain94040 ( 785132 ) * on Wednesday January 14, 2009 @06:10PM (#26456471) Homepage

    This is one of my pet peeves: why can't computers boot in a second or less?

    Imagine a visionary like Steve Jobs (by the way, enjoy your leave of absence and please come back). He goes to his team and says "I don't care what it takes, build me a computer which boots in one second".

    Ignore the past, the decades-long legacy of layer after layer of OS software. Can it be done?

    A 3 GHz dual-core processor can process 6 billion instructions in that first second. I know the disk is a problem. I'm not asking for all possible OS services to be up in a second... But I'm sure this could be improved greatly. It's all out there in the open. People want this.

    --
    FairSoftware.net [fairsoftware.net] -- work where geeks are their own boss

  • by beelsebob ( 529313 ) on Wednesday January 14, 2009 @06:13PM (#26456523)

    He already did that; that's why OS X 10.3 took about 40 seconds to boot, while 10.4/10.5 take about 4 seconds. They fixed it by replacing init with launchd, which schedules dependent tasks and makes optimal use of resources.

  • by wizardforce ( 1005805 ) on Wednesday January 14, 2009 @06:15PM (#26456547) Journal

    here:

    http://www.linuxdevices.com/news/NS5429881813.html [linuxdevices.com]
    http://lwn.net/Articles/299483/

  • by Thelasko ( 1196535 ) on Wednesday January 14, 2009 @06:16PM (#26456567) Journal
    Is EXT4 backwards compatible with EXT2 and EXT3? (3 is backwards compatible with 2) I'm asking because there are only Windows drivers for EXT2, and this could cause problems for those that dual boot.
  • by CannonballHead ( 842625 ) on Wednesday January 14, 2009 @06:17PM (#26456575)

    I think there's more to it than that, though. For example, you'd have to completely bypass all the checking, device discovery, etc., on boot (it takes time to discover drives and PCI/PCI-E/ISA ;) /USB devices). Yeah, you could just have that set up in the BIOS or something and just use that configuration, but that could be a pain, too.

    Now, if we're talking about post-POST boot-up, I think something could be done there. Even if it were just an option of, say, 8GB of onboard memory dedicated to a fast-boot operating system.

    As far as the extremely fast-boot idea goes, though, isn't that sort of what Good OS's Cloud partnership with GIGABYTE is supposed to be? The GIGABYTE Touch Netbook M912, to be precise. Link here [thinkgos.com]. It was mentioned on Slashdot a while ago as well.

  • by Anonymous Coward on Wednesday January 14, 2009 @06:19PM (#26456619)

    http://lwn.net/Articles/299483/

    Intel engineers have significantly reduced the boot time. However, a lot of hacks were needed to do so. Hopefully, many of those changes will make it upstream. At 10 seconds, you've got a PC that boots really fast. At 5, you've got a PC you're more likely to shut down than hibernate or sleep (Linux has session restore which, for the most part, counteracts some of the advantages of hibernating over shutting down).

    Also, a one-second boot is probably not possible, simply because it'll take about that long just to read all the initialization files. Maybe with an SSD it might become a possibility, but even then your system becomes a lot harder to maintain.

    For instance, you have to have all driver support compiled in ahead of time, meaning either you have a very custom kernel or you have a kernel that is way bigger than you need.

  • by Lord Byron II ( 671689 ) on Wednesday January 14, 2009 @06:25PM (#26456717)
    My understanding is that ext4 provides some very nice features, but faster data access isn't necessarily one of them. I'd imagine that an ext2 fs, which doesn't have journaling to slow it down, should be even faster.
  • disappointing... (Score:5, Interesting)

    by sofar ( 317980 ) on Wednesday January 14, 2009 @06:26PM (#26456741) Homepage

    This is a truly disappointing news item. Instead of setting the bar higher and truly trying to reduce boot time, they have not done much more than shave seconds off the existing boot time.

    For a generic desktop distro, 20+ seconds is still terribly long. 10 seconds should realistically be easy to achieve, especially as it took Arjan and me only a few months to get to 5 seconds on a netbook. We sure cut some corners, but we did not even use ext4 on those netbooks, and we still had buggy X starting times of 1.5 seconds, something which we can probably do in 0.5 seconds with kernel modesetting.

    I hate to see everyone settle down with "20 seconds" being "the next 5 second boot". This is really not progress at all, but rather, complacency.

  • by vux984 ( 928602 ) on Wednesday January 14, 2009 @06:27PM (#26456765)

    A 3 GHz dual-core processor can process 6 billion instructions in that first second. I know the disk is a problem. I'm not asking for all possible OS services to be up in a second... But I'm sure this could be improved greatly. It's all out there in the open. People want this.

    Hard to say if there's really a point to booting up before the services are running.

    What good is the PC being 'at the desktop' if the search service still hasn't started, the network still hasn't obtained an IP address, and half my tray icons aren't up, while the hard drive is still madly churning to get everything else running? Anything I try to launch is just going to be thrown into the queue, and it probably depends on something that hasn't started up yet anyway.

    Seriously, how much stuff could you really -defer- to after seeing the desktop and have a useful system?

    Remember, the average hard drive moves under 50MB/s. Even a fairly modest Ubuntu desktop requires several times that much RAM. If the hard drive started loading data at maximum speed, you've got maybe 50MB you can load in that first second, and probably far less in actual practice. That means your kernel, drivers, HAL, desktop environment, localization, firewall, network, background, theme, etc. would ALL have to fit in under 50MB. And you'd need some sort of impossible situation where the CPU could run all the initialization code for all of that in parallel, without waiting... never mind that it almost has to be initialized in sequence due to the layer dependencies.

    If you want instant-on PCs, the only real solution is to never turn them off; waking from suspend-to-RAM is about as good as it's going to get for the foreseeable future.

  • by Kent Recal ( 714863 ) on Wednesday January 14, 2009 @06:29PM (#26456809)

    10.4/10.5 take about 4 seconds

    On what hardware?
    I have a brand new MacBook sitting next to me that takes pretty much exactly 30 seconds from the push of the button to a usable desktop. Is it broken?

  • by ruiner13 ( 527499 ) on Wednesday January 14, 2009 @06:35PM (#26456911) Homepage

    This is one of my pet peeves: why can't computers boot in a second or less?

    Why do computers need to reboot at all?

  • by Thinboy00 ( 1190815 ) <thinboy00@@@gmail...com> on Wednesday January 14, 2009 @06:40PM (#26457001) Journal

    That means your kernel, drivers, HAL, desktop environment, localization, firewall, network, background, theme, etc has to ALL fit in under 50MB. [snip]

    Why not use DSL? (Damn Small Linux, not a second phone line)

  • by beelsebob ( 529313 ) on Wednesday January 14, 2009 @06:41PM (#26457005)

    Yes, but then, as we're comparing penis sizes, let's do it fairly. TFA explicitly states that they time from when the boot loader finishes to when the login window appears. Boot your Mac, and time from when the grey apple with a spinner appears (the grey apple is displayed while the boot loader does its thing) until the login window appears.

  • by jernejk ( 984031 ) on Wednesday January 14, 2009 @06:42PM (#26457017)

    I've been using Ubuntu for a year. I was quite happy with 8.04, but unfortunately I've switched to 8.10 64-bit (to support 4GB of RAM). You know what? I couldn't care less about how fast it boots. I do, however, care about these things:
    - switching from dual display to presentation (clone) mode and back totally messes up the X config; I have to uninstall and reinstall the nvidia drivers
    - in dual-screen mode, Nautilus opens on the first display. I have to open a terminal and run nautilus & to launch it on the second display
    - in dual-screen mode, the keyboard keeps focus on the previous screen. I have to minimize/maximize a window on the "new" screen to move keyboard focus
    - the RDP client crashes X in some cases (it does not close the drop-down list of recent servers... and bang)
    - oh, and NO, it's not AN ERROR if I close the RDP window. If I want to reconnect, I will; don't hide under my active windows and bring the RDP window back 30 seconds later. That's just plain stupid.
    - Java and window decorations don't play well together (popups without buttons, etc.)
    - how about opening a connection to a new server in a new tab, not in a new Nautilus window?
    - Flash stops working; I just see a gray square where the Flash content is supposed to be.
    - Firefox is not very stable.
    - windows become gray and unresponsive when there's a lot of disk activity.
    - I've seen Ubuntu crash on me far more times than I've seen a BSOD on the same hardware.
    - if I lock my computer, I want it to be locked. I don't want it to be locked for a minute or so and then display whatever was last on my desktop. Sure, you'd have to log in to get access, but there could be things for my eyes only on that screen. So don't you ever roll your eyes at Windows security, OK? You've got your own issues.

    I could probably think of more, but this is just a list of things off the top of my head. Sure, you'll downmod me and say I'm trolling. Maybe I am. But my point is: there are MUCH more IMPORTANT things to fix than the FUCKING BOOT TIME. Who the fuck even cares about boot time? Can't you just grab a coffee while it boots? What kind of idiotic metric is this?

    I guess SW development is hard and complex. And we've reached a point where maintaining these beasts is hard, for either open source or commercial products.

  • by auLucifer ( 1371577 ) on Wednesday January 14, 2009 @06:52PM (#26457187)
    I find it depends on what it's booting from. I use a MacBook Air now and used to use a Pro. I found that if I just close the lid so it goes into a shallow sleep, it is up and running in a couple of seconds. However, if it's powered down and needs to boot fully, it can take anywhere from under a minute to a few minutes. Waking from a deep sleep is faster than booting from powered off, and it restores the environment exactly how I left it. I almost never see the boot screen these days, which I think is how it should be, since I do much the same work every day.
  • by lindi ( 634828 ) on Wednesday January 14, 2009 @06:56PM (#26457241)
    My desktop uses NFS as its root filesystem, so it is easy to measure how much data it needs to read on boot by measuring network traffic. A complete reboot with "shutdown -r now" generated only 44 megabytes of traffic (including both read/written data and Ethernet overhead), so there is clearly no need to read a full gigabyte. The system runs Debian GNU/Linux 3.0 with Linux 2.6.18-4-486.
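    A minimal sketch of that kind of measurement, run on the NFS server rather than the client (assumptions: the client's traffic goes through the server's eth0, and the client is the only machine using that interface during the test):

        #!/bin/sh
        # Snapshot the server-side interface counters, let the NFS-root client
        # reboot, then diff the counters. The server's tx_bytes is roughly the
        # amount of data the client had to read in order to boot.
        IF=eth0
        rx1=$(cat /sys/class/net/$IF/statistics/rx_bytes)
        tx1=$(cat /sys/class/net/$IF/statistics/tx_bytes)

        echo "Reboot the client now; press Enter once it reaches the login screen."
        read dummy

        rx2=$(cat /sys/class/net/$IF/statistics/rx_bytes)
        tx2=$(cat /sys/class/net/$IF/statistics/tx_bytes)
        echo "sent to client:       $(( (tx2 - tx1) / 1048576 )) MB"
        echo "received from client: $(( (rx2 - rx1) / 1048576 )) MB"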
  • Re:disappointing... (Score:4, Interesting)

    by steveha ( 103154 ) on Wednesday January 14, 2009 @06:57PM (#26457257) Homepage

    This is a truly disappointing news item. Instead of setting the bar higher and truly trying to reduce boot time, they have not done much more than shave seconds off the existing boot time.

    I just checked, and it does seem that a fast boot time was one of the goals that Mark Shuttleworth set for Jaunty.

    There are some specific goals that we need to meet in Jaunty. One of them is boot time. We want Ubuntu to boot as fast as possible - both in the standard case, and especially when it is being tailored to a specific device. The Jackalope is known for being so fast that it's extremely hard to catch, and breeds only when lightning flashes. Let's see if we can make booting or resuming Ubuntu blindingly quick.

    https://lists.ubuntu.com/archives/ubuntu-devel-announce/2008-September/000481.html [ubuntu.com]

    Given that, I must confess that I'm also a bit disappointed that the boot time isn't closer to five seconds.

    I love your work with the 5 second boot, and I look forward to that technology being implemented widely. On a modern super fast CPU with a solid-state hard drive, I should hope that a desktop computer could boot as fast as a netbook. (And I'd be willing to install Coreboot [wikipedia.org] to get that speed.)

    steveha

  • by Chabo ( 880571 ) on Wednesday January 14, 2009 @06:58PM (#26457271) Homepage Journal
    The boss of Volkswagen did this after they bought Bugatti. He said "let's build a car that produces 1000bhp and goes 400kph". Then it took years for the engineers to figure out how such a thing might be possible. In the end, they did it, and it's probably the greatest car ever made.

    [/clarkson]
  • by rossz ( 67331 ) <ogre@@@geekbiker...net> on Wednesday January 14, 2009 @07:07PM (#26457423) Journal

    In the business world that pretty much sums it up. You don't need to know how to do something. However, despite what you say, figuring out what consumers really want and are willing to pay for is damn hard. Companies spend billions trying to answer this question. Most of the results are complete failures. A few ideas make a few people very rich.

    Geeks can be absolutely brilliant in their field. Given the right direction, they can come up with the next big thing. However, most geeks spend their time on little pet projects that will never make a dime. The sad part is that when the businessman comes up with an idea and the geek implements it, the businessman usually doesn't give the geek enough credit, aka $$$.

    The rarest of exceptions is when the geek comes up with an idea that becomes the next big thing.

  • Re:Oh YEAH? (Score:3, Interesting)

    by evanbd ( 210358 ) on Wednesday January 14, 2009 @07:07PM (#26457425)
    It does not reach full brightness in one cycle. 50 ms will get it nearly there, perhaps 100ms or a little more to get all the way. I'm afraid I'm away from my scope right now...
  • by Kalriath ( 849904 ) * on Wednesday January 14, 2009 @07:17PM (#26457649)

    The BIOS. The BIOS is pretty much the sole reason PCs take so long to boot. For example, at home I have a Core 2 Quad Q8200. When I push power, I get the XFX logo while the POST runs. The POST alone takes approximately 20 seconds because of the inherent slowness of actions like writing ones and zeroes to every byte of RAM and then reading them back to test whether any memory is faulty, initialising the video BIOS, and so on. Power-on to OS loaded (even if it's still spinning up services) is impossible in 1 second, because it takes about that long for the CPU itself to start!

  • by xenocide2 ( 231786 ) on Wednesday January 14, 2009 @07:22PM (#26457719) Homepage

    TFA explicitly states that they time from after the boot loader is finished, to when the login window appears.

    Not quite. It's when the login window process goes to sleep. Pretty close, though. Some people are arguing that this is too narrow a measure, and that we should wait for the GNOME login process to sleep before stopping the clock.

  • by collinstocks ( 1295204 ) on Wednesday January 14, 2009 @08:09PM (#26458435) Journal

    I set up an auto-login for my Ubuntu laptop, and then have the session-manager lock the screen immediately after logging on (before the panel or nautilus have even loaded, so while the desktop is still unusable). This way, after pressing the power button, I don't have to interact with my computer at all until immediately before I want to use it (i.e. to type my password in order to unlock the screen).

    Unfortunately, just putting `gnome-screensaver-command -l` into the session manager won't work because it doesn't seem to load immediately. Instead, I made it run a script that executes that command in between calls to `sleep 1` six or eight times. It works for me.
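
    A minimal sketch of the script that comment describes (hypothetical file name; assumes a GNOME session whose session manager launches it at login):

        #!/bin/sh
        # lock-early.sh -- keep asking gnome-screensaver to lock the screen for
        # the first few seconds of the session, since it may not be running yet
        # at the instant the session starts.
        for i in 1 2 3 4 5 6 7 8; do
            gnome-screensaver-command -l
            sleep 1
        done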

  • by Anonymous Coward on Wednesday January 14, 2009 @08:10PM (#26458441)

    Who are you defending? Blaming other people/projects doesn't make linux any better.

  • Re:*cries* (Score:5, Interesting)

    by FrankSchwab ( 675585 ) on Wednesday January 14, 2009 @08:11PM (#26458463) Journal

    Okay, you're right; resuming from power savings modes works perfectly in Vista.

    Now, run a test for me. Attach a secondary monitor, and place it to the LEFT of your laptop. Configure everything to work well. Reboot, and notice everything is still good. Open a few applications, move them to the secondary monitor, then close them. Something mainstream, like Outlook, will do.

    Now, suspend your laptop. Undock it, and walk to a conference room. Wake it up. Note that many applications now open on the (non-existent) second monitor, including mainstream applications from major software companies; Outlook, for example.

    Suspend it. Take it back in and dock it. Wake it. Notice that Vista now believes that your secondary monitor is on the RIGHT of your laptop.

    Heaven help you if you connected your laptop to the conference room projector when you were there.

    Yep, Vista works exceptionally well for all common usage scenarios with suspend/hibernate.

    That's why I'm interested in boot times. /frank

  • Re:Could care less? (Score:2, Interesting)

    by Maestro485 ( 1166937 ) on Wednesday January 14, 2009 @09:08PM (#26459233)
    How can you call someone an idiot while simultaneously admitting to not updating your machine for 5 years?
  • by Chabo ( 880571 ) on Wednesday January 14, 2009 @09:34PM (#26459555) Homepage Journal
    Right. At least one car (the SSC Ultimate Aero) has beaten the Bugatti's speed record for a production car, but the Bugatti is simply an engineering marvel. Most "really fast" cars are designed to hit their top speed a few times, and F1 cars are designed to last a couple of races, but the Veyron is designed to survive 20 or 30 years of road driving.

    The Top Gear presenters kept comparing it to Concorde. That's how big of a leap forward it was.
  • by freddy_dreddy ( 1321567 ) on Wednesday January 14, 2009 @10:34PM (#26460237)
    The problems I encounter in Windows are a few orders of magnitude smaller than those in Linux. I don't think I've had problems with sound, video or internet on MS products since the early '90s.
    The reason is fairly simple: no matter how complex these peripherals have become, they are ubiquitous. It's 2009... an OS failing to provide basic multimedia? Big failure.

    I see the humour in your reply, but to be honest: Ubuntu makes me feel ashamed of being in the S/W business.

    Oh, one more thing I forgot to mention/troll about: MS doesn't kill my hard drives the way Ubuntu did. About a year ago there was this small, peculiar bug which made hard drives go to sleep and wake up constantly. It's not something you notice until you read through the bug reports, but it killed one of my drives and I had to throw away the other one. The store refunded it, after I changed my story from "Linux" to "MS", btw.
  • by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Wednesday January 14, 2009 @10:35PM (#26460245) Homepage Journal

    legacy hardware. There's tons of timeouts.

    So why wait for them sequentially? Even better, why not supply optional kernels tuned to modern hardware? I don't own a system with an ISA bus (even a faked-up one on the southbridge or similar), so let me skip probing for one.

    Yeah, yeah, compile my own and all. But surely big distros like Ubuntu could make a legacy-free kernel available that skips ISA, serial, parallel, etc.
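
    For illustration only, a "legacy-free" build mostly comes down to a handful of options being switched off in the kernel's .config before compiling. The symbols below are real Kconfig options; which ones a distro kernel could safely drop for everyone is exactly the open question:

        # Hypothetical fragment of a "modern hardware only" .config
        # CONFIG_ISA is not set
        # CONFIG_PARPORT is not set
        # CONFIG_SERIAL_8250 is not set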

  • by HeronBlademaster ( 1079477 ) <heron@xnapid.com> on Wednesday January 14, 2009 @10:49PM (#26460385) Homepage

    But there's no reason the computer shouldn't function before the "real" drivers are installed - we have standards for a reason. Ever notice that your video card works before you install drivers for it? Windows is using a standard API to access the card.

  • by SteelX ( 32194 ) on Thursday January 15, 2009 @12:15AM (#26461185)

    From the original poster [slashdot.org]:

    Imagine a visionary like Steve Jobs (by the way, enjoy your leave of absence and please come back). He goes to his team and says "I don't care what it takes, build me a computer which boots in one second".

    From http://www.linuxdevices.com/news/NS5429881813.html [linuxdevices.com]:

    Yet, most BIOSes available for x86 chipsets were built for the desktop market, and thus have not been optimized in this area, according to Steve Jones, General Software CTO.

    When Steve Jobs is not around, leave it to Steve Jones!

  • by Rich0 ( 548339 ) on Thursday January 15, 2009 @08:11AM (#26464049) Homepage

    Frankly, what Linux really needs is a working network filesystem.

    NFS is just a real mess. It has all kinds of security issues, and it has no concept of users beyond the local machine. If /etc/passwd isn't synced across the network, all kinds of stuff goes wrong. It also has a lot of limitations around permissions, etc.

    Probably the next closest usable network filesystem for Linux is Samba - which really isn't ideal (for one thing, it is almost entirely reverse-engineered and depends on a spec that isn't open). That filesystem does handle many of these issues, but not in a Linux-friendly way.

    I just can't believe that nobody considers it important to have file sharing over a network that "just works". Windows has had fairly simple-to-set-up file serving from the desktop for at least a decade. Sure, like all things Windows, they had a number of security bugs, but that has settled down a bit.

    OpenAFS seems to have some potential, but nobody uses it, and it is a bit of a pain to set up since it has so many layers to it. Is anybody working on a simple way to let users on a network mount their home directories remotely, with working permissions and without a synced /etc/passwd?

  • by 4D6963 ( 933028 ) on Friday January 16, 2009 @02:44AM (#26479131)

    "Seeing sound" refers to the sound's spectrogram [wikipedia.org]. It's frequency on the vertical axis and time on the horizontal axis (see the axis on the screenshot). Yes, an horizontal line will sound like a tone, a vertical line like a click, a diagonal like a chirp. You can see which frequency and pitch a feature matches to by referring to these axis.

    If you want to make a guitar sound, it'll look like thin horizontal parallel straight lines that fade out, yes. The height of the lowest line defines which note is played; the distance between the parallel lines and their respective intensities define how the instrument sounds: how warm, metallic, or harmonic/inharmonic it is.

    It's more intuitive than some other approaches, but it's a bit more technical than that. You can't just obtain the sound you want by thinking about it hard and letting your hand do the rest ;-) You have to know how to do what you want. As I said, it's pretty low level, but if you want to create a guitar sound, you'll have to look at one first (that is, open one with Photosounder and look at the image); then you can "let your hand do the job" by reproducing what you saw, or what you remember seeing, and from that point on you can try tweaking things: doing things a bit differently, making some lines or parts brighter, spacing lines differently, adding a sort of glow to make it noisier, and so on.

    And yeah, it's more intuitive than looking at waveforms; a waveform tells you little more than the sound's envelope, which is its intensity regardless of frequency. A spectrogram (what I refer to as a sound's image) is more informative in that it shows everything you can hear in a very descriptive way. Although, once again, it's very low level and takes a bit of deciphering, depending on what you're looking at. It's very much like a musical score; in fact, if you loaded a musical score it would sound quite like the tune it's supposed to represent, if you omit the bar lines. To use a programming analogy, a piano score is like code in a high-level language. The compiler (the interpreter, or the synthesiser) turns it into sound. Photosounder disassembles this sound into an image, which is a low-level representation of the tune, but if you compare it to the original "high-level" score, you'll find that the "compiler" basically turned the score into music by graphically replacing each dot (representing a note) with a set of parallel horizontal lines whose horizontal length depends on the length of the note. The point of disassembling is that you can modify anything you want: you can shift a note up by moving it up, you can shorten one, or you can change the spacing of the parallel lines to make it sound like a more or less different instrument, and then reassemble the modified result into a sound by pressing the Play button. Of course, it might be more practical to just change the original score and reinterpret it, but maybe you're an electronic musician who wants to tweak a specific piano sample.

    There are many kinds of echoes. The mountain kind (the one that repeats what you say) can be achieved by just duplicating the sound's image many times, shifting the copies to the right, toning them down a bit (making them darker), and blending the whole thing. You can get creative and achieve something new by, for example, also shifting the copies downwards a bit to get something that progressively falls in pitch, or blurring them progressively, or rotating them a bit, or whatever else you can think of. A more room-like echo tends to look like random "scanlines" overlaid on the original image.

    Yes, you can make a sound fuzzy by blurring it, although if you just want to make it fuzzier in time (i.e. make it sound kind of slower while still playing at the same rate) you could use a horizontal motion blur. Yes, you can sharpen a sound by erasing some blur; that's actually what I do to fix the blur on some
