
Red Hat Develops Online Desktop

pete314 writes "Red Hat announced this week at their San Diego Red Hat Summit that they are planning to compete with Microsoft on the desktop by building an 'online desktop' that will integrate local data with online services. Red Hat CTO Brian Stevens argued: 'To us, the desktop metaphor is dead. We don't believe that recreating a Windows paradigm in an open source model will do anything to advance the productivity in the life of users.'"
  • by jaavaaguru ( 261551 ) on Wednesday May 09, 2007 @02:29PM (#19055311) Homepage
    Are they really competing with Microsoft at this point? As far as I can see, Google offer replacements for an increasing amount of desktop software at the moment (word processor, spreadsheet, email, calendar, photo management, IM, and various browser integrations such as their note-taking plugin for Firefox). That's a bit more than Microsoft has to offer at the moment.
  • by JCOTTON ( 775912 ) on Wednesday May 09, 2007 @02:45PM (#19055653) Homepage Journal
    Red Hat CTO Brian Stevens argued: 'To us, the desktop metaphor is dead.'

    I love it (ironically) when some CIO or other bigwig purports to talk for me. Actually, not only is the desktop still not "dead", but on my desktop is a Mainframe running COBOL/CICS/DB2. Still not dead. Not by a long shot.

    Hello, world.

  • Yet Another Attempt (Score:3, Interesting)

    by Coryoth ( 254751 ) on Wednesday May 09, 2007 @02:48PM (#19055715) Homepage Journal
    The network desktop has been tried many times in the past, by Microsoft (badly) with "ActiveDesktop" and in theory with XAML and .NET, and by Sun in various forms. All the efforts I've seen so far just don't cut it. That doesn't mean it isn't a good idea -- I think there's real promise in a distributed approach to the desktop -- just that it is hard to execute well. Stumbling blocks in the past have included: a lack of real network transparency (the "online" aspect was a thin veneer rather than being truly transparent); lack of sufficient bandwidth (all the "online" stuff was pitifully slow, and ignored); and security, security, security.

    To succeed you need a system that doesn't view the network as a bolted on thing, but integrates it at the core; Plan9 comes to mind on that front. At least X11 has network transparency, but it needs to be more efficient (think NX), and have far better security built in to really work for this. Bandwidth will slowly but surely fix itself. That leaves security -- and there's a lot required to make that happen. It is an ambitious and worthy goal, but in this case it is possibly a case of biting off more than you can chew: if it isn't transparent, efficient and secure, it isn't going anywhere, and fulfilling those requirements would require vast architectural changes.
  • by anoopjohn ( 992771 ) on Wednesday May 09, 2007 @02:48PM (#19055723) Homepage
    Linux has a reasonably big market share in the server market [Netcraft Survey]. However, it is still waiting for the day when it will be accepted in the Home PC market as a strong competitor to the Windows family of OSes. I am a strong fan of Linux and I have been trying to promote Linux in my market, but people still refuse to accept it open-heartedly. In spite of detailed explanations and demos, people are hesitant. I even offer free Linux installation assistance for people who already own computers. People still look at Linux with scepticism. I think it would be much better if more effort were put into making Linux acceptable to a wider audience. Though I personally disagree, I have to agree with what the market is saying - that Linux is still an operating system for the geeks.

    I like what Ubuntu is doing - i.e. making the whole Linux experience easier and better for the common man. In a country like India, where I live, we are talking about 800 million people who fit the description of the common man; two thirds of the world, i.e. about 4 billion people, could be classified the same way. We need Linux to target this market. We need Linux to focus on making the Linux experience much more comfortable for these people. We need more effort to be put into creating Linux drivers for hardware that is not yet Linux compatible. We need easier installations for a larger number of applications.

    I am not too excited about the proposition other than as a useful feature for a small percentage of the whole world.
  • by HW_Hack ( 1031622 ) on Wednesday May 09, 2007 @03:13PM (#19056177)
    This is absolutely the right step for our increasingly connected world - but the devil is in the details as usual.

    The desktop isn't dead but it's damn stale - what I would envision is a bi-modal operation: if you have wired or wireless access your "desktop" seamlessly includes your "on-line" resources - applications - data files - links - IM buddies - etc., all integrated into your applications and disk volumes. When offline you would have what you have right now. Of course you would need a method to mark certain files as bi-modal so they would reside in a file cache and be available offline - the OS would handle file sync'ing etc. Or a thumb drive could be a file cache.

    On the flip side, where the desktop is really dead (as in "Dead to You") - I could see you carrying a USB thumb drive that launches a mini-Linux session and then you connect to the "server in the sky" to access all your docs - email - applications - etc.

    Both ideas are steps in the right direction for Linux ... just doing "XP the right way" is not a path to success for Linux. The Linux industry is very nimble compared to Microsloth ... let's see what this baby can really do!
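
    The bi-modal file cache described in this comment - files marked for offline use, with the OS handling the sync'ing - could be sketched roughly like this. The function name and the newest-copy-wins policy are my own illustration of the idea, not anything Red Hat announced:

```python
import shutil
from pathlib import Path

def sync_newest_wins(local: Path, remote: Path) -> None:
    """Two-way sync of two directory trees: whichever side has the
    newer copy of a file wins; files missing on one side are copied
    over from the other."""
    for side_a, side_b in ((local, remote), (remote, local)):
        for src in side_a.rglob("*"):
            if not src.is_file():
                continue
            dst = side_b / src.relative_to(side_a)
            if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
                dst.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dst)  # copy2 preserves the mtime
```

    A real implementation would need conflict detection rather than silently preferring the newer timestamp, but the shape of the problem is the same.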

  • by evought ( 709897 ) on Wednesday May 09, 2007 @03:14PM (#19056197) Homepage Journal
    I have no interest whatsoever.

    When I was actively doing business travel, online collaborative apps were a supplement to applications on the desktop, given that the online apps were trustworthy (controlled by my own business). I never had any desire to get rid of local applications, especially since I had to be able to do office work, development and other tasks on the go, with no network access, expensive network access, insecure network access, or unreliable network access. If the "network applications" are downloadable and cached for off-line use, then you have nothing new, that's just semi-automated deployment and update. When it comes to that, externally controlled auto-update is a bad thing in many environments. I want to control when I upgrade, after I know the update is not going to break something. I don't want to log on, find out I can't access an old file, and have no way to restore the previous version of the application. Web services are continuously in beta.

    Currently, I have absolutely no need for remote apps. I do all of my work locally and live rurally. Why would I want my applications and/or data externally controlled and inaccessible if I don't have a connection? I have full-featured applications (which would take considerable time to download). I pay for them once (if I have to at all). I have low latency. I can pick and choose which applications I use. I can have multiple versions installed if I need to for compatibility reasons. I control encryption and backups when I need them. What advantage does a "network desktop" get me?

    Why bother?
  • by smilindog2000 ( 907665 ) on Wednesday May 09, 2007 @03:31PM (#19056507) Homepage
    I think it's nonsense for RedHat to say that the Windows desktop is dead. RedHat has always gone after the business server and workstation markets, and has done a great job taking down Sun while avoiding pissing off M$. The whole reason that Ubuntu has so much momentum is that they've made the desktop familiar, easy to use, and less buggy. RedHat could still hammer Ubuntu if they'd just ship a desktop-focused OS and stop claiming that M$ is doing it all wrong.
  • by 99BottlesOfBeerInMyF ( 813746 ) on Wednesday May 09, 2007 @03:43PM (#19056735)

    A "web application" isn't one if it does not require a remote server for processing and storage. It is just a local application run in a browser.

    True, but that's not what we're talking about. We're talking about applications that run via a Web browser and integrate with a Web service (Google Docs), but which also run locally without Web access, albeit with some features disabled. It is important to note that we were speaking about the desktop metaphor being dead, and when your app is running locally in a browser, that does seem to be the case to a significant extent.

    For example, gmail has a nice view run locally in the browser.

    I'm afraid I have no idea what you were trying to say with that sentence. Could you please be a little clearer?

    You cannot send or receive email if you are not online.

    So? The point is to allow you to compose messages in Gmail when offline. More importantly, for applications whose primary purpose is not communication via the Web (games, photo editor, word processor, spreadsheets, calculators, etc.) it will allow them to be functional offline, only adding functionality when online.

    Desktop is NOT dead.

    The desktop is not dead. The desktop is threatened as a control metaphor, by the browser. I, personally, don't think Redhat's plan is sound or their vision accurate, and I never argued that it was. I merely pointed out the problems with the assertion that Web applications are not useful because they don't function when offline. Since Web applications are moving towards a more hybrid approach, that is a dated view.

    I also understand where Redhat is coming from. The desktop OS has stagnated. Most users still do not (and will not for the foreseeable future) have spellchecking available in all applications. That is just sad. Any possible way to undermine MS's monopoly and add functionality despite their stubborn refusal to move forward gets developers excited. Anything that removes users' dependency on Windows is a plus for me.
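
    The offline-compose/online-flush pattern discussed in this thread can be illustrated with a toy outbox spool. Everything here (class name, file layout) is a hypothetical sketch of the idea, not how Gmail or any real client actually works:

```python
import json
from pathlib import Path

class Outbox:
    """A spool of messages composed offline, flushed through a
    send callable once a connection becomes available."""

    def __init__(self, spool: Path):
        self.spool = spool
        self.spool.mkdir(parents=True, exist_ok=True)

    def compose(self, to: str, body: str) -> None:
        # Number the spool files so flush() sends in composition order.
        n = len(list(self.spool.glob("*.json")))
        path = self.spool / f"{n:06d}.json"
        path.write_text(json.dumps({"to": to, "body": body}))

    def flush(self, send) -> int:
        """Hand each queued message to `send`; remove it on success."""
        sent = 0
        for path in sorted(self.spool.glob("*.json")):
            send(json.loads(path.read_text()))
            path.unlink()
            sent += 1
        return sent
```

    The application stays fully usable offline; going online only adds the delivery step.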

  • Hybrid approach (Score:2, Interesting)

    by Anonymous Coward on Wednesday May 09, 2007 @04:12PM (#19057371)
    I've always thought that rather than having local OR remote, a hybrid approach would be nice. Something like the way Exchange works: you have the desktop client, but if you are away you can log into the web client. The data is available in both places. It would seem to me that such a concept could be used for other things.

    I suppose it would require implementing clients twice. I think, though, that I would prefer a more accessible system with fewer features rather than a new Office suite every few years (or whatever other apps may be applicable).
  • While I might be missing something, this sounds kind of like Adobe's Apollo software idea.

    This would be like having a version of Google Docs that actually was installed on your computer, but communicated with a server in order to store your data. For an organization, the end user wouldn't be able to tell the difference, besides the speed of the software.

    I think the closest thing that this would resemble are Microsoft's roaming profiles, but in a way that actually worked.

    By having a copy on the machine for speed and a golden copy on a server for backup purposes, there would be the ability to move away from the idea of "my desktop" so that no matter what machine the user was on, they would have all of the same programs and info that they normally had on their personal computer.

    Another thing to remember (when comparing this to "services" as we know them now): if this were an Open Source project, it would be easier for individual organizations (or even individuals) to set up their own servers to store this information.
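
    The local-copy-plus-golden-copy idea in this comment might look something like the following sketch: the client compares its local files against a server manifest of content digests to decide what needs re-fetching. The manifest format and function names are assumptions for illustration only:

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def stale_files(local_dir: Path, manifest: dict) -> list:
    """Given a server-side manifest of {relative_path: sha256} for the
    golden copies, return the paths whose local copy is missing or
    differs and therefore needs to be fetched."""
    stale = []
    for rel, digest in manifest.items():
        local = local_dir / rel
        if not local.exists() or file_digest(local) != digest:
            stale.append(rel)
    return sorted(stale)
```

    Since only digests cross the wire until a mismatch is found, the local copy keeps its speed advantage while the server copy stays authoritative.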

  • Re:Yeah (Score:1, Interesting)

    by Anonymous Coward on Wednesday May 09, 2007 @05:39PM (#19058913)
    No, that's wrong, and you are a complete moron who hasn't even tried to read any of the wiki on this subject.

    1. The fastest way to have a project fail is to 'preach' to your audience. Re-educating Windows users and indoctrinating them into the 'unix/gnu' way is a long-term prospect, on a user-by-user basis. If you spent any time in the Ubuntu support channel you would know just how much work this is and how long it takes for the noobs to get a clue as to 'why' everything is the way it is.


    2. The stated strategy of Ubuntu is to NOT PROMOTE the use of proprietary/non-free software/drivers/firmware, but to allow for their ease of use. People are not getting SOLD on non-free software, but they are GIVEN AN OPTION to use it should they CHOOSE TO, and it is made easy to do so. TRANSLATION: the stated POLICY OF UBUNTU IS TO TREAT NON-FREE SOFTWARE AS A BUG. You read that right. It is the CORRECT and SANE way to look at the situation, because NON-FREE SOFTWARE IS A BUG. You aren't going to educate new users quickly and equally, and you aren't going to be able to satisfy new users' demands for non-free software without it, therefore the proper strategy is to treat non-free software as a bug. You get them hooked, then you get them hooked on the GPL and the concept of freedom, then you get them to raise a hell of a fuckload of noise to manufacturers about open-sourcing their drivers, then you wipe out the non-free software through the sheer volume of people demanding it.

    3. This saves projects like debian from having to handle the VERY VERY large workload of dealing with noobs and morons like you.

    So yes, you are wrong because you are too fucking stupid to actually do a bit of googling and digging in the wiki.

    Goddamned noobs.
