Critiquing Claims of an Open Source Jobs Boom
snydeq writes "InfoWorld's Bill Snyder examines what appears to be an open source job market boom, as evidenced by a recent O'Reilly Report. According to the study, 5 to 15 percent of all IT openings call for open source software skills, and with overall IT job cuts expected for 2009, 'the recession may be pushing budget-strapped IT execs to examine low-cost alternatives to commercial software,' Snyder writes. But are enterprises truly shifting to open source, or are they simply seeking to augment the work of staff already steeped in proprietary software? The study's methodology leaves too much room for interpretation, Savio Rodrigues retorts. 'That's why the 5% to 15% really doesn't sit well with me,' Rodrigues writes. 'I suspect that larger companies are looking for developers with a mix of experience with proprietary and open source products, tools and frameworks,' as opposed to those who would work with open source for 90 percent of the work day."
Many businesses are open-source based accidentally (Score:4, Informative)
Given that Java is now GPL'd and open source, and many of the popular third-party Java frameworks are open source as well, I would expect open source to be in heavy use at many businesses; they just don't know it.
When your developers code an interface to XYZ in a short time, it's not because they're reinventing the wheel. No, they're using Axis. Or HttpClient. With Hibernate, Spring, Struts, Tiles, and so on.
But if you look at databases, you'll still see a large investment in proprietary systems for core business data, with MySQL handling minor functionality around the edges. Cutbacks simply mean that upgrading your database platform won't happen: it's already paid for, so why migrate from Oracle to PostgreSQL?
The other big platform is MS proprietary. You all know the story. It keeps TheDailyWTF alive.
Re:I am with Bjarne on this one. (Score:3, Informative)
A free Java implementation of MapReduce and GFS (Apache Hadoop) already works fine on a 5000-node cluster.
And there's no real reason why it can't scale further.
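For anyone who hasn't looked at it, the MapReduce model that Hadoop scales out is simple enough to sketch in a few lines of plain Java. This is a toy single-machine word count, not the actual Hadoop API (the class and method names here are my own invention); it just shows the map and reduce halves that the framework distributes across a cluster:

```java
import java.util.*;
import java.util.stream.*;

// Toy sketch of the MapReduce model on one machine.
// Hadoop's real API differs; names here are illustrative only.
public class WordCountSketch {

    // "Map" phase: turn each input line into (word, 1) pairs.
    static Stream<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\s+"))
                     .filter(w -> !w.isEmpty())
                     .map(w -> Map.entry(w, 1));
    }

    // "Shuffle + reduce" phase: group the pairs by key and sum the counts.
    static Map<String, Integer> count(List<String> lines) {
        return lines.stream()
                    .flatMap(WordCountSketch::map)
                    .collect(Collectors.toMap(Map.Entry::getKey,
                                              Map.Entry::getValue,
                                              Integer::sum));
    }

    public static void main(String[] args) {
        Map<String, Integer> counts =
            count(List.of("the quick brown fox", "the lazy dog"));
        System.out.println(counts); // "the" maps to 2, every other word to 1
    }
}
```

On a real cluster, the map and reduce calls run on different machines and the shuffle moves the intermediate pairs between them, which is why the model parallelizes so well.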
Re:I am with Bjarne on this one. (Score:3, Informative)
Actually...
The largest "real, in-use" Hadoop cluster that Yahoo! has is around 2000 nodes, counting a dedicated name node. As far as we're aware, we've got the largest Hadoop cluster. [If there is a bigger one, we'd love to talk to you and compare notes. :) ]
That said, we do have Hadoop running on tens of thousands of machines. Just not as one big cluster.
It is also worth pointing out that most of our clusters are multi-user and multi-application. The number of nodes is really more indicative of the size of the Hadoop distributed file system than of the number of nodes given to a particular application in our (grid team's) use case.
There is a lot more about Hadoop, and Yahoo!'s particular Hadoop usage for internal utility-type computing, at http://wiki.apache.org/hadoop/HadoopPresentations.