Government IT Linux

The Woes of Munich's Linux Migration

mikrorechner writes "The H Online has a write-up of the problems encountered by LiMux (Wikipedia entry), one of the most prominent Linux migration projects in the world, which is trying to introduce free software into the highly heterogeneous IT infrastructure of the City of Munich. Quoting: 'Florian Schiessl, deputy head of Munich's LiMux project for migrating the city's public administration to Linux, has, for the first time, explained why migrating the city's computing landscape to open source software has taken longer than originally planned.'" Here is Schiessl's blog, in which he details some of the transition problems.
  • by AvitarX ( 172628 ) <me@brandywinehund r e d .org> on Friday March 19, 2010 @03:45PM (#31542622) Journal

    Converting all computers to the Open Document Format (ODF) standard has overcome dependency on a single office software suite.

    Does ODF now define formulas for spreadsheets? Because my understanding was that this was still ambiguous in the spec, and it's a pretty big problem if it is.
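
    For context, here is a rough sketch (my own illustration; the file name is made up) of how you can check which formula dialect a given .ods file actually uses: the spreadsheet is a ZIP archive whose content.xml stores each formula in a table:formula attribute, prefixed with a namespace such as "of:" (ODF 1.2 OpenFormula) or "oooc:" (the legacy OpenOffice.org dialect).

        # Sketch: list the formula-namespace prefixes used inside an .ods file.
        import zipfile
        import xml.etree.ElementTree as ET

        TABLE_NS = "urn:oasis:names:tc:opendocument:xmlns:table:1.0"

        def formula_prefixes(path):
            """Return the set of formula prefixes (e.g. 'of', 'oooc') in an .ods file."""
            with zipfile.ZipFile(path) as ods:
                root = ET.fromstring(ods.read("content.xml"))
            prefixes = set()
            for cell in root.iter(f"{{{TABLE_NS}}}table-cell"):
                formula = cell.get(f"{{{TABLE_NS}}}formula")
                if formula and ":" in formula:
                    prefixes.add(formula.split(":", 1)[0])
            return prefixes

        # print(formula_prefixes("budget.ods"))  # hypothetical file; e.g. {'of'}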

  • by WrongSizeGlass ( 838941 ) on Friday March 19, 2010 @03:55PM (#31542744)
    Indeed.

    Every major project always takes longer than expected because so many small details are exposed as you uproot any existing system or workflow process. Instead of looking at this as something that may have been "more trouble than they bargained for" we should learn from it and understand that migrating to Linux won't be any easier than migrating to or from any other platform. I think there are two things to take away from Munich's Linux migration:
    * It can be done.
    * Being on the leading edge carries with it a lot of responsibility to those who will follow you.
  • Wrong approach (Score:5, Interesting)

    by Sub Zero 992 ( 947972 ) on Friday March 19, 2010 @04:03PM (#31542852) Homepage

    Well, they tried a horizontal migration strategy, moving from location to location and department to department. That meant the problems never stopped.

    A better approach might have been to do a vertical top-down migration: Servers: first roll out a directory server infrastructure, then a CIFS strategy, etc.; Clients: migrate away from MSIE/ActiveX, then to CUPS, then away from MS Office, etc. And then, finally, change the desktop OS out from underneath.

    A suggested strategy for those planning something similar: 1: migrate the server services (and create a shiny new unified and consistent infrastructure); 2: migrate the desktop apps to FOSS alternatives (choose apps which will work under your target desktop OS); 3: switch out the desktop OS for Linux (the users retain the apps they have become used to).

    Just my 0,02

  • by El Lobo ( 994537 ) on Friday March 19, 2010 @04:09PM (#31542950)
    Now seriously, I've read that all this migration has cost MILLIONS of public funds, and there are rumors that some heads are going to roll soon because of it. At the university I work for, some IT boss thought it was a great idea to replace a First Class conferencing system that had been working GREAT for years with the open source Sakai. Well, the results: several million have been wasted on it, there are (maaany) problems with the new platform, teachers hate it, students hate it... The "brilliant" IT chief is now working somewhere else after that disaster.
  • Re:Bad title is bad. (Score:5, Interesting)

    by MozeeToby ( 1163751 ) on Friday March 19, 2010 @04:10PM (#31542956)

    Very true; by the sound of the blog, most of their problems stem from how poorly the systems were managed before. Different versions of Windows running different levels of updates; hundreds of authorized apps, many with overlapping or duplicate functionality; unauthorized applications that had made their way into workflows without being documented; proprietary software that didn't follow open standards. I wonder how much of their effort has gone into just getting the infrastructure to where it should have been before the transition even started.
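
    To illustrate the kind of inventory work that implies (a rough sketch of my own, not anything from the blog), even just enumerating what is installed on each Windows box from the uninstall registry hive is a first step toward finding the undocumented applications:

        # Sketch: list installed programs recorded under the standard uninstall key.
        # Windows-only; run per machine to build an application inventory.
        import winreg

        UNINSTALL_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

        def installed_programs():
            """Yield display names of programs registered under the uninstall key."""
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL_KEY) as key:
                subkey_count = winreg.QueryInfoKey(key)[0]
                for i in range(subkey_count):
                    with winreg.OpenKey(key, winreg.EnumKey(key, i)) as subkey:
                        try:
                            name, _ = winreg.QueryValueEx(subkey, "DisplayName")
                            yield name
                        except FileNotFoundError:
                            pass  # patches and stubs without a DisplayName

        # for program in sorted(installed_programs()):
        #     print(program)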

  • Why so prominent? (Score:5, Interesting)

    by Mark_in_Brazil ( 537925 ) on Friday March 19, 2010 @04:22PM (#31543130)

    Why is the Linux migration project in Munich so prominent, as mentioned in TFS? I know of much larger migrations, both in terms of the number of computers and the geographic area covered. The Brazilian government has been migrating to Free Software en masse. The Bank of Brazil, for example, has over 100,000 computers running Firefox and BrOffice. As of last June, the estimate was right at 100,000, with 65,000 of those machines running Linux and 35,000 running other operating systems. The Bank of Brazil has branches and offices all over Brazil, which is a very large country. The mass migration happened in 2006, before the migration really began in Munich. The number of machines involved (counting the Linux boxes only) is about 5 times as large as the number of machines to be involved in Munich, and instead of being located in a single city, they are spread out all over a country that's larger than the US would be if it didn't have Alaska, but smaller than the US with Alaska (i.e., larger in area than the "lower 48" plus DC plus Hawaii). In the year 2006 alone, the Bank of Brazil estimated that it saved R$20MM by using Free Software.

    FWIW, I've also seen Linux desktops at the ITI (Brazil's IT Institute). Even totally non-nerdy ITI employees seemed perfectly at home on Linux desktops when I was there as long ago as early-to-mid 2005. The Bank of Brazil branch where my company has its account has all Linux desktops. The managers who take care of my account think it's funny when I crane my neck to look at their monitors and geek out on the software their 'puters are running. They are total non-nerds and not only appear to be happy with the Linux desktops, but told me they are. It took them a minute to figure out what I was asking - they didn't think of using Linux desktops as anything all that unusual.

  • Re:Bad title is bad. (Score:4, Interesting)

    by MightyMartian ( 840721 ) on Friday March 19, 2010 @04:27PM (#31543206) Journal

    I can't speak to that, but having read the article, it bears little resemblance to the title of this posting. From what I can tell, this sounds like some early mistakes in planning the migration. That would happen if you were moving to any new system, FOSS or proprietary.

  • by dave562 ( 969951 ) on Friday March 19, 2010 @04:33PM (#31543340) Journal

    VBA was probably their only choice. In 2000, where was OpenOffice? Where was the Linux desktop? VBA had been around for a "long time" when measured in IT years. At the time, they probably went with the "free" tool built into the application that happened to be compatible with the majority of their other applications.

    People bag on VBA like it is worthless. If it was totally worthless, it wouldn't have been used as often as it was. If there were good alternatives, it wouldn't have the market penetration that it does. It is only now that there are alternatives that people are complaining about it. It's kind of like bagging on a 10-year-old application for not being optimized for a dual-core CPU.
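
    For what it's worth, here is a hedged sketch of what the migration target for such a macro can look like: a small Python script driving OpenOffice/LibreOffice over the UNO bridge instead of VBA living inside the document. The document URL and port are made up, and the office has to be started listening first, e.g. soffice --headless --accept="socket,host=localhost,port=2002;urp;" (flag syntax as in current LibreOffice).

        # Sketch: connect to a running office instance via UNO and write into a cell.
        import uno

        def write_answer(url="file:///tmp/report.ods"):  # hypothetical document
            local_ctx = uno.getComponentContext()
            resolver = local_ctx.ServiceManager.createInstanceWithContext(
                "com.sun.star.bridge.UnoUrlResolver", local_ctx)
            ctx = resolver.resolve(
                "uno:socket,host=localhost,port=2002;urp;StarOffice.ComponentContext")
            desktop = ctx.ServiceManager.createInstanceWithContext(
                "com.sun.star.frame.Desktop", ctx)
            doc = desktop.loadComponentFromURL(url, "_blank", 0, ())
            sheet = doc.getSheets().getByIndex(0)
            sheet.getCellByPosition(0, 0).setValue(42)  # put 42 into A1
            return doc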

  • Re:Wrong approach (Score:5, Interesting)

    by 99BottlesOfBeerInMyF ( 813746 ) on Friday March 19, 2010 @04:44PM (#31543498)

    A better approach might have been to do a vertical top-down migration: Servers: first roll out a directory server infrastructure, then a CIFS strategy, etc.; Clients: migrate away from MSIE/ActiveX, then to CUPS, then away from MS Office, etc. And then, finally, change the desktop OS out from underneath.

    They seem to have taken a more blended approach. A separate project was revamping many of the servers at the same time. They did immediately move away from MS Office to OpenOffice and ODF because they could do so without having to worry about the servers, and they laud it as one of the biggest benefits so far. I don't know of any good reason why they should have held off on that. The problem with a top-down migration is that many times you don't know which services, inside your organization and out, are actually being used. Rolling out Linux clients in every department lets you discover what your platform-specific dependencies are. In some cases they changed the Linux client to work with those services, and in some they changed the services to work with Linux.

    A suggested strategy for those planning something similar: 1: migrate the server services (and create a shiny new unified and consistent infrastructure);

    The problem here is that in your first step you may have broken a bunch of things, and users will have to start changing the way they work. From their perspective, you've downgraded the system, because they're using a client that does not work as well with your new servers as your Linux clients will. So you've just given the majority of your users a bad taste for the whole thing and generated tons of pushback that can kill your whole migration.

    I think it would make more sense to switch to as many platform agnostic applications as possible, first. Then implement the servers and desktops simultaneously in one part of the company, while letting the users have access to their old desktop via a remote session. Fix the compatibility problems and move on to the next chunk of the company until you can start repurposing the old servers and getting rid of the remote desktop sessions altogether.

  • by Todd Knarr ( 15451 ) on Friday March 19, 2010 @04:54PM (#31543652) Homepage

    Well, migrating an entire organization to the newest version of Windows (with the accompanying upgrades to all the other MS software) isn't exactly cheap. That's why so many corporations are still running XP: they can't justify the costs of upgrading to Vista or Windows 7.

    I note that a lot of the problems they ran into weren't problems with the Linux-based software, they were problems with the proprietary (Windows and Windows-based) software not wanting to play nice with anybody else. One advantage of moving to open-source, standards-based software is competition. In the proprietary environment all those lock-in "features" that caused all the problems during the Linux migration also act to keep you locked in to a single vendor who can then charge high prices because you've no alternative. Once you're on standards-based and open-source software, though, any vendor can come in and take it over. That leads to lower costs down the road because you can dump vendors who try to over-charge without any disruption to your systems.

    It also leads to lower migration costs the next time. OpenOffice doesn't provide some features you need? You can replace it with any other software that handles ODF without any disruption and without any problems with document formatting. Need to talk to another organization that doesn't use OpenOffice? No problem, as long as their software understands ODF you should be able to read each other's documents reliably and correctly (and right now I think the only major word-processing software out there that doesn't handle ODF correctly is Microsoft Word).
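
    As a sketch of the mechanics behind that (my illustration, not Munich's tooling): once everything speaks ODF, re-converting a pile of legacy .doc files is a headless batch job. This assumes a LibreOffice recent enough to have the --convert-to flag; the OpenOffice.org builds of that era needed unoconv or a conversion macro instead.

        # Sketch: batch-convert legacy Word files to ODF with a headless office suite.
        import pathlib
        import subprocess

        def convert_to_odt(src_dir, out_dir):
            for doc in sorted(pathlib.Path(src_dir).glob("*.doc")):
                subprocess.run(
                    ["soffice", "--headless", "--convert-to", "odt",
                     "--outdir", str(out_dir), str(doc)],
                    check=True)

        # convert_to_odt("legacy_docs", "converted")  # hypothetical directories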

  • by JWSmythe ( 446288 ) <jwsmythe@nospam.jwsmythe.com> on Friday March 19, 2010 @05:14PM (#31543886) Homepage Journal

    A long time ago, I decided to start making copies of my floppies onto hard drives, so I'd have images of them before they deteriorated. I made that decision because I had a never-opened boxed version of Novell UnixWare (from around 1994). It had sat in a professionally air-conditioned office until sometime around 2000. It was given to me, and it sat in my computer room for a long time. I finally decided to unbox it and give it a try. It came on floppy disks (3 of them, if I remember right). I went out and bought a floppy drive for this adventure, since all of mine had either gathered such an accumulation of dust that I couldn't find the opening, or I had simply thrown them away.

    I put the first disk in, and halfway through reading it, there were errors. The disk, although in the original unopened envelope, in the original unopened shrinkwrapped box, had deteriorated. {sigh}

    I tried several other disks that I had been carrying around with me for years, "just in case" I needed them for something. As it turned out, about 2/3 of them were unreadable, just from age.

    So I tossed them all, and gave the drive away to someone else who wanted to use it. He had a better success rate; something like 75% were readable.

    I was talking to some kids not too long ago about disks. I kept asking them, to see if they even had a clue what a floppy disk was. One correctly described a 3.5" floppy, but none had seen a 5.25" floppy. :) It's probably all for the better; they really sucked.

  • Re:Wrong approach (Score:1, Interesting)

    by Anonymous Coward on Friday March 19, 2010 @06:01PM (#31544564)

    That was not your 0.02; that was more like your millions of dollars over budget, never able to recoup those losses.

    There's a point where everyone looks around and says, "You know, the estimates on this job have long since gone over budget, and we will forever be in debt trying this."

  • by Mathinker ( 909784 ) on Friday March 19, 2010 @06:07PM (#31544638) Journal

    > I've read that all this migration has cost MILLIONS

    Yes, there's a lot of shilling going on, trying to paint this transition in a bad light. man_of_mr_e provided me [slashdot.org] with a link to the Microsoft bid [usatoday.com] which was $23M. The original Linux bid was $36M. And it's probably cost more. But as I replied to man_of_mr_e [slashdot.org], this is still probably a good fiscal decision for Munich, since I find it hard to believe that if they save MS relicensing costs of about $23M every, say, 6 years, they won't pay for the extra conversion costs fairly quickly.

    And that's not even counting the advantages to being free of lock-in.
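
    Spelling the back-of-the-envelope arithmetic out with those figures (the six-year relicensing cycle is the same assumption as above):

        # Sketch: rough break-even on the quoted bids, all figures in millions.
        ms_bid = 23.0          # Microsoft's bid for staying on Windows/Office
        linux_bid = 36.0       # original bid for the Linux migration
        relicense_cycle = 6.0  # assumed years between relicensing rounds

        extra_cost = linux_bid - ms_bid              # 13.0 extra, one-off
        avoided_per_year = ms_bid / relicense_cycle  # about 3.8 per year avoided
        print(f"Break-even after roughly {extra_cost / avoided_per_year:.1f} years")
        # -> Break-even after roughly 3.4 years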

  • by Anonymous Coward on Friday March 19, 2010 @09:29PM (#31546444)

    It should also be considered that (1) much of the delay was caused by proprietary lock-in at several levels of the existing systems, and (2) they could have done it more quickly, but they chose instead to take a slower approach and fix several problems along the way. This doesn't stand out as a bad thing to me at all; it simply shows that doing this type of migration takes time and should be planned as such, but it will be worth it in the end.

  • by Nefarious Wheel ( 628136 ) on Saturday March 20, 2010 @12:06AM (#31547242) Journal

    VBA was probably their only choice.

    I worked at one of the major Australian banks; Excel/VBA was the norm, not the exception. It was uniformly horrid (except for the stuff I wrote, of course ;P). It was also highly portable, and standard enough to send between different financial organisations (we're talking "financial instruments worth billions").

    The real reason for all that VBA code, and one that nearly caused me to post this AC, was a bit more back-door.

    A department can hire people to write a few Excel macros locally, but anything that looks like a "programming project" is, by policy, sent offshore for development, which can double the cost and triple the duration of a project. Offshore development seems wise at the C-level, but the poor sods who have to get a report out for the tax people have an entirely different perception.

    So, a few thousand lines of "it's only a macro, really" keeps middle management out of the (rather painful) outsourcing mill. This is not speculation; this is how it worked.

    These middle managers are the equivalent of the senior staff sergeants who creatively interpret orders to avoid getting themselves and their platoons killed.

  • by Pav ( 4298 ) on Saturday March 20, 2010 @03:32AM (#31547966)

    How come in these discussions no one ever mentions the software they're using (e.g. GOsa, see https://www.gosa-project.org/ [gosa-project.org])? GOsa is a web admin front-end which allows management of clients and servers through an LDAP-based infrastructure and an RPC backend. Services that can be managed include Samba+PDC, email+groupware, FAI & OPSI (for auto-install of Linux and Windows clients), DNS, DHCP, Squid, Asterisk, Linux terminal server clients, and quite a bit more. It IS very hard to get working, though.

    Hmmm... I just noticed that Munich is no longer listed as a reference on the GOsa site - I wonder if there is a story there.
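
    To give a flavour of the underlying idea (not GOsa itself, just an illustration using the third-party ldap3 package; the server address, base DN and credentials are made up): all the client and service configuration lives in an LDAP directory, so the web UI and a few lines of script end up querying the same data.

        # Sketch: query workstation entries from the directory that drives the setup.
        from ldap3 import Server, Connection, ALL

        server = Server("ldap://ldap.example.org", get_info=ALL)
        conn = Connection(server,
                          user="cn=admin,dc=example,dc=org",
                          password="secret",
                          auto_bind=True)

        conn.search("ou=workstations,dc=example,dc=org",
                    "(objectClass=device)",
                    attributes=["cn", "description"])
        for entry in conn.entries:
            print(entry.cn, entry.description)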
