Clearing Up Wayland FUD, Misconceptions
An anonymous reader writes "In clearing up common misconceptions about Wayland (e.g. that it breaks compatibility with the Linux desktop, or that it cannot support remote desktops the way X can), Eric Griffith (a Linux developer) and Daniel Stone (a veteran X.Org developer) have written The Wayland Situation, in which they clearly explain the shortcomings of X, the corrections made by Wayland, and the advantages of Wayland as an alternative to Canonical's in-development Mir."
Re:The Manchurian Candidate (Score:2, Informative)
Because it would require completely rearchitecting and breaking the protocol.
Re:show me hello world on my own pc or STFU (Score:3, Informative)
First, SDL isn't an alternative display technology; it's a library that works on top of X (and Weston). Second, I am running KDE on Weston right now and it is working pretty well. It's not ready to replace X for most users yet, but it's stable and getting close to ready for mass consumption.
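And since the parent asked for a hello world: this is roughly what it looks like with SDL2. A minimal sketch; the same code runs against X or a Wayland compositor, and (assuming an SDL build with the Wayland backend compiled in) you select the backend at runtime with the SDL_VIDEODRIVER environment variable.

/* Minimal SDL2 "hello world" window.
 * Build (flags may vary): gcc hello_sdl.c -o hello_sdl `sdl2-config --cflags --libs`
 * Run on a Wayland compositor (assumes SDL was built with Wayland support):
 *   SDL_VIDEODRIVER=wayland ./hello_sdl
 */
#include <stdio.h>
#include <SDL2/SDL.h>

int main(void)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init: %s\n", SDL_GetError());
        return 1;
    }

    SDL_Window *win = SDL_CreateWindow("hello world",
            SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
            640, 480, SDL_WINDOW_SHOWN);
    if (!win) {
        fprintf(stderr, "SDL_CreateWindow: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    SDL_Delay(3000);        /* keep the window up for three seconds */

    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}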
Re:The Manchurian Candidate (Score:5, Informative)
Well, he gave an answer in the article: if you move to "fix X," you end up making X12. And when you do that, all the stakeholders in X come out of the woodwork and insist on preserving all the legacy parts of the system that, frankly, don't belong.
The way things have unfolded, X11 will become a library on top of Wayland. And that's perfectly fine.
Re:The Manchurian Candidate (Score:4, Informative)
Why not fix X?
The article answers that question on the very first page. (Scroll down to the bottom.)
Re:No mention of remote anything in the article (Score:2, Informative)
FTFA, page 3, point VI:
every X app just gets its own mini X-server to deal with
That sounds like it will be pretty straightforward to support individual apps remotely.
Re:The Manchurian Candidate (Score:4, Informative)
There was a time when displays did everything by passing around rendering primitives -- lines, rectangles, black-and-white bitmap pattern tiles. At that time it made a lot of sense to integrate networking at a low level, because you had to figure out how to send and decode all those drawing primitives over the wire.
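You can still see that old model in plain Xlib: the client never touches pixels, it just asks the server to draw primitives, and each call below becomes a protocol request on the wire. A minimal sketch with error handling mostly omitted.

/* Core-protocol rendering: the client sends drawing primitives and the
 * X server rasterizes them. Build: gcc prims.c -o prims -lX11 */
#include <X11/Xlib.h>
#include <unistd.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);   /* connect, possibly over TCP */
    if (!dpy)
        return 1;

    int scr = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
            0, 0, 200, 200, 1,
            BlackPixel(dpy, scr), WhitePixel(dpy, scr));
    XSelectInput(dpy, win, ExposureMask);
    XMapWindow(dpy, win);

    GC gc = XCreateGC(dpy, win, 0, NULL);

    /* Wait until the window is actually visible, then draw. */
    XEvent ev;
    do {
        XNextEvent(dpy, &ev);
    } while (ev.type != Expose);

    /* Each of these is a request over the wire, not a pixel buffer. */
    XDrawLine(dpy, win, gc, 10, 10, 190, 190);
    XDrawRectangle(dpy, win, gc, 20, 20, 60, 40);
    XFillRectangle(dpy, win, gc, 120, 120, 50, 50);
    XFlush(dpy);

    sleep(3);
    XFreeGC(dpy, gc);
    XCloseDisplay(dpy);
    return 0;
}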
Display technology moved on. Displays became rich, complex, and colourful; different applications had very different needs, took on more and more of the rendering task themselves, and simply pushed bitmap buffers to the display system. Now the task of the display system was to mediate, manage, and request complex bitmap buffers from the various clients.
At this point remote display became a matter of having the client send (potentially compressed) bitmap buffers -- let the clients do their own rendering. This is how most remote display systems written in the last 15 years do it. Indeed, that is how X does it these days for most applications: the applications do their own rendering via GTK or Qt and Cairo, and X pushes the pixmaps down the wire.
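That client-side rendering is easy to see with Cairo itself: all the drawing happens inside the application, producing an ordinary in-memory ARGB buffer, and only the finished pixels get handed to the display system (here written to a PNG instead, just to keep the sketch self-contained).

/* Client-side rendering with Cairo: the app rasterizes its own
 * "primitives" into a plain pixel buffer.
 * Build (flags may vary): gcc render.c -o render `pkg-config --cflags --libs cairo` */
#include <cairo.h>

int main(void)
{
    cairo_surface_t *surface =
        cairo_image_surface_create(CAIRO_FORMAT_ARGB32, 200, 200);
    cairo_t *cr = cairo_create(surface);

    /* The drawing calls never leave the process; they become pixels here. */
    cairo_set_source_rgb(cr, 1.0, 1.0, 1.0);
    cairo_paint(cr);
    cairo_set_source_rgb(cr, 0.2, 0.4, 0.8);
    cairo_rectangle(cr, 20, 20, 160, 160);
    cairo_fill(cr);

    /* cairo_image_surface_get_data(surface) is the raw buffer that
     * would get pushed to the display system. */
    cairo_surface_write_to_png(surface, "out.png");

    cairo_destroy(cr);
    cairo_surface_destroy(surface);
    return 0;
}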
If all you are doing is throwing around bitmap buffers, and the display software is simply mediating and displaying them, then remote display doesn't need a whole lot of thought at the display level -- all it has to do is mediate and display the bitmaps it gets from clients. Now, providing a remoting system to let remote clients get their bitmap buffers to the display when requested ... well, that's still a thing that needs to be done, but the base display software doesn't have to care too much about how that gets done.
Think of it as teasing out the layers in the software. The base layer pushes pixels to the screen, no matter where the data for those pixels came from, remote or local. That's one job: pixels on screen. Focus on that and do it well. Another job is getting the data to that base layer in the first place, and the remote/local differences can be handled in that layer instead.
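As a toy illustration of that split: a remoting layer doesn't need to know anything about how the pixels were produced. It can be as dumb as writing a small header plus raw pixels to a socket. The wire format below is entirely made up for illustration; real systems (VNC and friends) add compression and damage tracking, but the shape of the job is the same.

/* Hypothetical remoting layer: ship a finished pixel buffer to a peer.
 * The format (width, height, raw ARGB) is invented for this sketch. */
#include <stdint.h>
#include <unistd.h>

int send_frame(int sockfd, uint32_t width, uint32_t height,
               const uint8_t *argb_pixels)
{
    uint32_t header[2] = { width, height };
    if (write(sockfd, header, sizeof header) != sizeof header)
        return -1;

    size_t len = (size_t)width * height * 4;  /* 4 bytes per ARGB pixel */
    size_t off = 0;
    while (off < len) {                       /* write() may be partial */
        ssize_t n = write(sockfd, argb_pixels + off, len - off);
        if (n <= 0)
            return -1;
        off += (size_t)n;
    }
    return 0;
}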