- This feels very right. The problem is that there are few entities invested enough in Linux as a consumer platform to have the motivation to push things forward, and to decide what their "Reference" system is. (by TechPlasma - 17 hours ago)
Valve is maybe the closest?
- > But there was no one to coordinate Linux on desktop. (by mongol - 17 hours ago)
Freedesktop.org?
- I'm not entirely clear why/how this is an open source issue? (by taeric - 17 hours ago)
My assertion: inertia of the user base is by far the largest predictor of what will stick in a market. If you can create a critical mass of users, then you will get a more uniform set of solutions.
For fun, look into bicycles and how standardized (or, increasingly not) they are. Is there a solid technical reason for multiple ways to make shoes that connect to pedals? Why are there several ways to shift from your handlebars? With the popularity of disk brakes, why don't we have a standard size for pads?
I think there are a lot of things that we tell ourselves won because of some sort of technical reasoning. The more you learn about them, the less true this seems, though.
Not, btw, that there aren't some things that die due to technical progress.
- Apple coordinates internally, since macOS works with Apple hardware. Windows can drive coordination among hardware vendors. In the Linux world, many organizations and projects share power; there is no comparable focal point of power pushing for a consistent end-user OS (dependencies, configuration). Declarative and deterministic build systems at the OS level allow different groups to package their subcomponents reliably. As various configurations get socialized, users gain the choice to trade off between customization and popularity/vetting. (by xpe - 17 hours ago)
- Solving the coordination problem in FOSS is one of the grand challenges of humanity. If we solve it, I think it will effect a tectonic shift with far-reaching implications and fix major socioeconomic problems like wealth concentration. E.g., a FOSS alternative to Visa, and of course to Windows/MS Office. (by dcreater - 16 hours ago)
- Ehh I don't buy that the market was ready 10 years earlier (in 2006) for open-source LSP implementations. (by muglug - 16 hours ago)
You gotta have someone write those language servers for free, and the language servers have to be performant. In 2006 that meant writing in a compiled language, which meant that anyone creating a language server for an interpreted language would need to be an expert in two languages. That was already a small pool of people.
And big multiplayer OSS platforms like GitHub didn't exist until 2008.
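For readers who have never looked under the hood: the "protocol" part of LSP is tiny, which is why the hard part was never the spec but finding people to write good servers behind it. Below is a minimal, hypothetical sketch of what an editor's first message to any language server looks like on the wire (a JSON-RPC 2.0 body behind a Content-Length header). The frame helper and printing to stdout instead of a real server's stdin are illustrative assumptions, not anything taken from the thread or from a particular LSP client.

```cpp
// Minimal sketch (illustrative only): the wire format an editor and a
// language server agree on is JSON-RPC 2.0 prefixed by a Content-Length header.
#include <iostream>
#include <string>

// Frame a JSON-RPC body the way LSP expects it.
std::string frame(const std::string& body) {
    return "Content-Length: " + std::to_string(body.size()) + "\r\n\r\n" + body;
}

int main() {
    // The first request an editor sends to any language server.
    const std::string initialize =
        R"({"jsonrpc":"2.0","id":1,"method":"initialize",)"
        R"("params":{"processId":null,"rootUri":null,"capabilities":{}}})";
    // A real client would write this to the server's stdin, not stdout.
    std::cout << frame(initialize);
    return 0;
}
```

The same framing works for every language server; the coordination win is that one editor-side client can drive servers for any language, which is exactly why the question of who writes performant servers for free is the real bottleneck.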
- I think the definition of Linux is much broader than what is considered today as a platform for the IDE. It's kind of like the IDE is the cart and the kernel is the horse, but 30 years later Linux is an engine with a cabin, a virtual machine, rather than a desktop per se. The parts interact at a different level now. (by initramfs - 16 hours ago)
- > The underlying force there is the absence of one unified baseline set of APIs for writing desktop programs. (by tbrownaw - 16 hours ago)
It's called the Common Desktop Environment.
- I was truly shocked at how bad the experience is when you are using an RPM-based distribution and a program is only available as a DEB. (by UltraSane - 15 hours ago)
- Who is being incentivised to reduce the friction of interoperation? (by RossBencina - 15 hours ago)
Coordination is hard. People who are good at coordinating are not necessarily the same people who are happy to contribute their time to FOSS. And FOSS may need to coordinate in ways that vertically integrated companies do not.
Coordinating between loosely aggregated volunteer projects is not the same as coordinating between vested stakeholders either. I would guess that most FOSS projects are more invested in their own survival than in some larger objective. Teams within a company are (presumably) by definition invested in seeing the company mission succeed.
The GNOME / KDE example mentioned elsewhere in this thread is interesting because these are two somewhat equivalent co-existing projects. Any coordination between them is surely not their highest priority. Same with all of the different distros. They each exist to solve a problem, as the fine article says.
I wonder how much the problem is actually "open source can't standardise on a single solution." Let one thousand flowers bloom, sure. But don't expect a homogeneous user experience. The great thing about standards is there are so many to choose from. xkcd 927. etc.
- I would love to tell people about Linux for their desktops, but the main issue I have is that people who are interested in it ask me one question about Linux distributions: (by colesantiago - 15 hours ago)
“Which one?”
This is pretty much the cause of a 90% drop-off in interest in Linux on the desktop.
I could say "use Ubuntu" (and I do) to some of the people I'm close with who are interested in Linux, but then they discover Lubuntu, or Linux Mint, or Debian, get easily confused, and give up.
And that is not even getting into updates and packaging, and heaven forbid anything breaks.
- I am reminded of someone I read recently who decried it as a loss that GNOME adopted systemd components as a critical dependency, because they want alternatives to systemd. (by shadowgovt - 15 hours ago)
... and this is a layer of open source flexibility I never wanted. I don't want alternatives to core system management; I want one correct answer that is rugged, robust, well-tested, and standardized, so that I don't have to play the "how is this service configured atop this service manager" game.
- What? How is this even at the top? Some no-name program is not getting an update or is not perfectly installable, and suddenly it's an open source problem? Stop being an entitled prick. (by udev4096 - 15 hours ago)
- The OP defeats his own argument. LSP was a collaborative effort that benefited from a degree of coordination that only hierarchical organizations can provide, yet it still sucks ass. (by fr4nkr - 15 hours ago)
OP blames FOSS for not providing an IDE protocol a decade earlier, but doesn't ask the rather obvious question of why language-specific tooling is not only still around, but as market-viable as ever. I'd argue it's because what LSP tries to do is just stupid to begin with, or at least exceptionally hard to get right. All of the best language tooling I've used is ad-hoc and tailored to the specific strengths of a single language. LSP makes the same mistake Microsoft made with UWP: trying to cram the same peg into every hole.
Meanwhile, Microsoft still develops their proprietary IntelliSense stuff because it actually works. They competed with themselves and won.
(Minor edit: I forgot that MS alone didn't standardize LSP.)
- "The reason why we have Linux, and BSDs, and XNU is that they all provide the same baseline API, which was defined from the outside [by POSIX]. The coordination problem was pre-solved, and what remained is just filling-in the implementation."by ninjin - 15 hours ago
But that is not at all how Posix operates or has operated. Posix standardises common denominators between existing implementations. The fact that we now have strlcpy(3) and strlcat(3) in Posix 2024 is not because Posix designed and stipulated them. Rather, they appeared in OpenBSD in 1998, were found useful by other *nixes over time, spread, and were finally taken aboard by Posix, which standardised what was already out there and being used! This, to me, is the very opposite of the point the author is trying to make!
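To make concrete why strlcpy(3) was "found useful" long before Posix 2024 picked it up: unlike strncpy, it always NUL-terminates the destination and returns the length it tried to write, so truncation is a single comparison. A minimal sketch, assuming a libc that actually ships strlcpy (the BSDs, or glibc 2.38 and later):

```cpp
// Minimal sketch: why strlcpy caught on before Posix standardised it.
// Assumes a libc that provides it (the BSDs, or glibc >= 2.38).
#include <cstdio>
#include <string.h>   // strlcpy is declared here on systems that have it

int main() {
    char buf[8];
    // Unlike strncpy, strlcpy always NUL-terminates and returns strlen(src),
    // so detecting truncation is a single comparison.
    size_t needed = strlcpy(buf, "a string that will not fit", sizeof buf);
    if (needed >= sizeof buf)
        std::printf("truncated: needed %zu bytes, kept \"%s\"\n", needed, buf);
    return 0;
}
```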
- idk. i don't really follow the argument. large projects in open source coordinate internally and engage externally when they need to - i suspect that isn't all that different from what you'd see in a large megacompany like apple or microsoft. (by a-dub - 15 hours ago)
open source people create reusable interfaces. i'd argue they go one step further and create open and public internet communities with standards, practices and distribution/release channels.
- Maybe open source doesn't need to coordinate. Perhaps users and developers should demand standards and interoperability from their platforms. Perhaps that's why we have things like Electron, Unreal Engine and Unity. One way or another we'll coordinate on something. (by bobajeff - 15 hours ago)
- Open source has the best kind of coordination. If there's a real use-case for two things to work together, you or someone else can implement it and share it without anyone's permission. Meanwhile in proprietary land, people sometimes build things that nobody wanted, and also leave out features with high demand. Proprietary optimizes for the profit of some individuals; open source optimizes for maximum utility. (by antonok - 14 hours ago)
Thus far, open source has optimized for maximum utility for individuals who can write code... but AI may be changing that soon enough.
- > But then, how can Linux exist? How does that square with “never break the user space?” (by throwaway2037 - 14 hours ago)
Hot take: this catch phrase is out of date. For Linux desktop normies like me, who don't really care about the stability of the kernel's user-space API, user space does break when GUI libraries (and their myriad of library dependencies) change their APIs. For example, I mostly use KDE, which depends on the Qt libraries for its GUI. Qt regularly introduces breaking changes to its API with each major version increment: 4 -> 5 -> 6, etc. (I don't hate them for it; it is normally carefully done and well-documented.)
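As one concrete instance of the kind of breakage described above (a well-known one, not taken from the comment): QRegExp was removed from QtCore in Qt 6, so Qt 4/5-era code stops compiling after an upgrade unless it pulls in the Core5Compat module or is ported to QRegularExpression. A rough sketch, assuming Qt development headers are installed; the function names are invented for illustration:

```cpp
// Illustrative only: the same tiny task written against the Qt 5 API that
// disappears in Qt 6, and against its replacement.
#include <QRegExp>               // exists in Qt 4/5; gone from QtCore in Qt 6
#include <QRegularExpression>    // the replacement, available since Qt 5.0
#include <QString>

// Qt 4/5 style: fails to compile against a plain Qt 6 install.
bool isVersionOld(const QString& s) {
    QRegExp re("\\d+\\.\\d+");
    return re.exactMatch(s);
}

// Qt 5/6 style: the ported equivalent.
bool isVersionNew(const QString& s) {
    QRegularExpression re("^\\d+\\.\\d+$");
    return re.match(s).hasMatch();
}
```

Note that the ported version also has to add its own anchors, since exactMatch anchored implicitly; even mechanical-looking ports carry these small semantic differences.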
- But open source foundations provide some guides/events/programs to coordinate. (by pacoxu2025 - 14 hours ago)