I’d go even further: developers ought to be required to submit reproducible builds to the Library of Congress in order to be eligible for copyright in the first place.
(And copyright ought to be shortened back to its original term length, by the way.)
Sadly, even though I’m morally in favor, there is so much arcane computer-science machinery (and so many proprietary mechanisms) behind the process of compilation, especially on the embedded systems where this issue comes up, that I doubt it could ever be pushed into law.
I understand it’s easy for a layperson to have that opinion, but I don’t think it can be hand-waved away as too difficult when people are actually doing it.
It being possible for some projects is quite literally using an anecdote to try to prove a norm. I sincerely hope you have the logic skills to understand why that’s bad reasoning…
You might not be surprised to learn that software projects have way, waaaay more in common with one another than you’d think. Especially games, which essentially all use one of a handful of game engines and asset sources.
I think properly codifying engineering standards for software would also help… maybe it should even happen first.
This doesn’t make sense, as the compilers would also be included in this new copyright scheme and would become public property after the same term.
There are open-source compilers for all major CPU architectures. In fact, the open-source compilers regularly outperform the closed-source ones. It’s also not that difficult to add more architectures to an existing compiler these days, thanks to the modular way modern compilers are built. Once you build a backend for LLVM, you unlock not just one language but about a dozen.
Others have mentioned existing efforts to produce reproducible builds, so this might be moot now; but I’m fairly sure that if open-source compilers were always better than the extremely expensive ones, the expensive ones wouldn’t have a reason to exist.
That could be an outdated mindset. (Of course, binaries built back in that era are part of how we got into this mess.)
Actually, their reason to exist is that some software and hardware platforms don’t have a real open-source alternative.
I have a friend who works with some of these compilers, and with low-level assembly and the like. He tells me most of the closed-source compilers he works with are way behind the open-source ones, including Microsoft’s. I’ve seen some evidence of this myself. People use Microsoft’s compiler because it integrates better with the Windows APIs and Visual Studio, or just because they don’t know better. I believe Microsoft even has an initiative to integrate LLVM into Visual Studio because they know how bad their compiler is in comparison. And since Microsoft is a large company specialising in systems software, theirs is probably one of the better closed-source examples.
In the Apple ecosystem, C and C++ are compiled with LLVM. The only stable Rust compiler, afaik, is LLVM-based, though they are working on their own alternative, which will also be open source.
Every line of code needs to be open source. The people or businesses responsible can buy a subscription to keep it from the public. No more money => publicly auditable sources + FOSS licensing.
This reminds me of Warzone 2100. After its developer (Pumpkin Studios) ceased trading, some dedicated ex-employees and community members managed to liberate the source code in 2004.
Now it’s available in some of the major distros and is still updated to this day.
I am out of the loop, but this seems like an occasion to once again promote my age-old belief that:
The moment purchased or licensed software is no longer serviced or supported it must become open-source.
No exceptions. I am still waiting on the firmware to reprogram my smart-ish oven.