The BSD-licensed Portable C Compiler (PCC) is steadily on the road to a 1.0 release and is now able to compile a FreeBSD/amd64 CURRENT system with almost no changes. The current version of PCC has evolved from the original PCC developed at Bell Labs during the 1970s and has been maintained by Anders Magnusson and a small team of developers over the last decade. It has received more attention during the last few years, especially from OpenBSD and NetBSD people who see it as a viable GCC replacement.
In the meantime, FreeBSD is close to being able to switch to LLVM/Clang:
http://www.osnews.com/story/23166
http://wiki.freebsd.org/BuildingFreeBSDWithClang
Which is the better candidate for use in the BSDs?
PCC, which is supposedly very portable and actually BSD licensed?
LLVM, which has lots of momentum and is an interesting platform beyond static C/C++ compilation?
6c and friends from Plan 9, although these should have caught on decades ago if they were actually interesting.
I think they picked up development of PCC because GCC was becoming increasingly buggy with each release.
The OpenBSD team like to use whatever is “simplest”, so having a smaller, more basic compiler was probably preferred.
I know a chap who is very much part of the OpenBSD team, and I once heard him lamenting some of the ridiculous things it had done when he was trying to port a particular piece of software.
It has more to do with licensing than anything else.
For what it is worth, GCC is fairly stable. But BSDs have a tradition of wanting to reinvent the wheel.
It’s because GCC is a slow, monstrous beast that is incredibly difficult to port to new architectures.
And yeah, it’s GPL, which is incompatible with the goals of OpenBSD and NetBSD. Kinda like why Linux doesn’t ship with a commercial, closed-source compiler.
GCC, which is supposedly so hard to port, has apparently managed to support the following platforms:
* alpha*-*-*
* alpha*-dec-osf5.1
* arc-*-elf
* arm-*-elf
* avr
* Blackfin
* DOS
* *-*-freebsd*
* h8300-hms
* hppa*-hp-hpux*
* hppa*-hp-hpux10
* hppa*-hp-hpux11
* *-*-linux-gnu
* i?86-*-linux*
* i?86-*-solaris2.[89]
* i?86-*-solaris2.10
* ia64-*-linux
* ia64-*-hpux*
* *-ibm-aix*
* iq2000-*-elf
* lm32-*-elf
* lm32-*-uclinux
* m32c-*-elf
* m32r-*-elf
* m6811-elf
* m6812-elf
* m68k-*-*
* m68k-uclinux
* mep-*-elf
* microblaze-*-elf
* mips-*-*
* mips-sgi-irix5
* mips-sgi-irix6
* powerpc*-*-*
* powerpc-*-darwin*
* powerpc-*-elf
* powerpc*-*-linux-gnu*
* powerpc-*-netbsd*
* powerpc-*-eabisim
* powerpc-*-eabi
* powerpcle-*-elf
* powerpcle-*-eabisim
* powerpcle-*-eabi
* s390-*-linux*
* s390x-*-linux*
* s390x-ibm-tpf*
* *-*-solaris2*
* sparc*-*-*
* sparc-sun-solaris2*
* sparc-sun-solaris2.10
* sparc-*-linux*
* sparc64-*-solaris2*
* sparcv9-*-solaris2*
* *-*-vxworks*
* x86_64-*-* amd64-*-*
* xtensa*-*-elf
* xtensa*-*-linux*
* Microsoft Windows
* *-*-cygwin
* *-*-interix
* *-*-mingw32
* OS/2
* etc (M68K, M88K, i860… and other very obsolete platforms)
If there is another compiler that supports as many platforms, please let us know.
Also, what exactly is “slow” about gcc?
“Also, what exactly is “slow” about gcc?”
http://bellard.org/tcc/#speed
Kochise
GCC 3.2, seriously?
Do you believe GCC 4.5.x has improved much in compilation speed since then? I mean, would it now beat TCC’s incredible compilation speed? If not, then GCC is definitely a slow horse, whatever you might think. Try PellesC as well.
Kochise
A) Yes, it has certainly improved a lot in compiling speed since 3.2, and is supposed to improve further in the coming 4.6 release. You realize that GCC 3.2 was released in April 2003!?
B) TCC may still be a lot faster, but speed is only one variable. Another is the performance of the generated code. It has been a LONG time since I tried TCC, but back then the speed of the generated code was very bad compared to the competition. And although that was many moons ago, from the home page it seems the last TCC release was in 2009, so it’s obviously not updated frequently.
And yes, I’ve also used PellesC; in fact it was my favourite compiler to develop with when I was programming under Windows, particularly due to its nice, fast IDE/debugger. But the code it generated was VERY POOR compared to gcc/icc back then, and I’m certain it’s still very poor compared to gcc/icc/clang/open64/pcc nowadays.
So what? You could put a Fiat engine in many other cars; that doesn’t mean Fiats have a wonderful engine. Being ported to many systems isn’t necessarily a good selling point. Sometimes you want something that might only be usable on your platform but runs beautifully, rather than something that runs moderately well on many platforms.
As already said, the consensus is that GCC gets buggier and more bloated with every release.
Try to read some mailing lists before making statements with no knowledge.
But you cannot compare the whole GCC infrastructure feature-wise with this C compiler.
Don’t get me wrong, I very much like the idea of having alternatives to gcc (competition is always good), but I think pcc is far from being as mature as gcc in several respects. Give me C++ support in pcc and we can put both in the same league.
It is called the “Portable C Compiler”.
So what?
Clang does C, ObjC and C++. GCC used to be called the “GNU C Compiler”.
Your point being? My point was that the project never intends to support C++, ObjC or anything else. It clearly states so on the project page.
http://pcc.ludd.ltu.se/
The “C” in the name is pretty damn important, it seems. The project goals aren’t to support more than one language like Clang or GCC.
Yeah, so a system that comes with this compiler will need another one for C++ anyway. Either they continue using GCC for that, or something else, like Clang. This might seem a bit silly at first, but it could actually make sense if the complete base system can be compiled with the preferred C compiler. The secondary compiler would then only be necessary for ports, and would itself be a port.
The point of bringing PCC into the source tree for a BSD is to use it as the system compiler. IOW, its only use will be to compile the OS (world, kernel, userland), thus making the source tree completely self-hosting.
To compile non-C projects, you install whichever compiler toolchain you want from the ports tree, packages, or pkgsrc (as the case may be).
The goal is to remove GPL’d software not under the project(s) control from the main source tree.
(Just like how some Linux distros have a main goal to remove non-free software from the main project repos.)
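To make “system compiler” concrete: on FreeBSD the toolchain used for building the base system can be overridden in /etc/make.conf. The pcc path below is a hypothetical illustration of the idea, not a tested or endorsed configuration:

```
# /etc/make.conf -- hypothetical: point the base-system build at pcc
CC=/usr/local/bin/pcc    # compiler used by 'make buildworld' / 'make buildkernel'
# Anything beyond C (C++, Fortran, ...) would come from a toolchain
# installed out of the ports tree, e.g. lang/gcc or devel/llvm.
```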
Apparently you have no idea what the actual names of these projects are or what they mean, so allow me to enlighten you:
GCC: Gnu Compiler Collection
Meaning: This compiler supports a collection of languages – C (gcc), C++ (g++), Java (gcj), Ada (GNAT), Objective-C (gobjc), Objective-C++ (gobjc++), Fortran (gfortran) and Go (gccgo).
PCC: Portable C Compiler
Meaning: This compiler ONLY supports the C language.
Was that explanation adequate to give you the understanding necessary for leaving comments without inserting your foot into your mouth? In the future, due diligence should be a priority.
Was I really so aggressive as to deserve a reply like that?
If PCC is a C-only compiler, then it lacks a lot of the modular features that modern compilers have (“pluggable frontends”, “multiple backend support”, etc.). With gcc, for example, no matter what the front-end language is, the optimization tree is the same for all the languages it supports.
What benefits do I get from a C-only compiler compared with a compiler infrastructure like gcc or LLVM?
Are there any benchmarks showing PCC-compiled software running faster than GCC-compiled software?
That’s simply a design choice. Stop confusing architectural choices with feature choices; the two are not the same. PCC’s design is one way of doing it. As I’ve said before, just because you can support multiple options doesn’t guarantee that you’ll be able to use all of those options optimally. Just because someone may be able to write their name with both hands doesn’t mean they’ll do it equally well with both.
Most useful and productive developers wouldn’t put a compiler’s architecture and design at a higher level of importance than its performance. Why else do you think tons of developers still use Watcom???
Refer to the developers, I’m sure that they can give you the data that you require.
Well, recently on the pcc mailing list you will find this:
Here’s a quick overview of major changes lately:
– cpp expansion logic is now is a good state.
– More Jira bug reports than ever before, most of them fixed.
– Fred Tydeman from Tydeman Consulting has been very helpful in
reporting bugs.
From which I would assume that C++ is being worked on for pcc? Also, gcc was initially the “GNU C Compiler”, so there’s nothing saying pcc won’t turn into a “portable compiler collection” if/once it adds more languages.
As for pcc and its usefulness, I have no problem with “reinventing the wheel”, since I think the desire to do so comes from an urge to improve on something and/or attack a problem from a different angle. In my opinion this is a GOOD THING™ which may help drive technology forward, since it means solving a problem in a different (potentially better) way than existing solutions. And as always, the more competition the better.
I think this is the C preprocessor: http://en.wikipedia.org/wiki/C_preprocessor
Ahh, you are most likely right.
cpp means “C preprocessor”. Look it up.
A long time ago there was a lengthy discussion about this on the OpenBSD news site http://undeadly.org
One reason mentioned by a developer was that the compiler is faster, resulting in less waiting time and therefore more time for everything else (http://undeadly.org/cgi?action=article&sid=20070915195203&pid=36&mo…).
Furthermore, another developer mentioned a lot of other reasons besides this and the license (http://undeadly.org/cgi?action=article&sid=20070915195203&pid=52&mo…).
Fixed.
NetBSD, FreeBSD, PC-BSD, OpenBSD… nope, no duplication of effort whatsoever 😉
PS: Linux does it too, but this is a post about BSDs.
PC-BSD is technically just FreeBSD with a few FreeBSD packages pre-installed and pre-configured.
So it’s not duplicating any effort really. Just saving a few people who want a painless FreeBSD desktop some effort.
It’s that easy:
PC-BSD is just a preconfigured FreeBSD for the desktop. NetBSD has its focus on portability, OpenBSD has its focus on security. DragonFlyBSD has its focus on clusters. FreeBSD has its focus on servers. And there is some code exchange between the BSDs.
Time for more education.
NetBSD, FreeBSD, OpenBSD: not duplicated work. Their differences are what caused the need for separate projects. Their similarities don’t come from duplicated work; the work was done once and carried over from the forks. Since they share their advancements, the work is still only done once and then shared with the other members.
Other BSDs share the same relationship with their host BSD. PC-BSD and DesktopBSD are just FreeBSD with a focus on the desktop environment. That’s not duplicate work; they resync with FreeBSD often. DragonFlyBSD is based on FreeBSD 4.8, and the differences that caused the fork haven’t led to duplicated work: the ideas there took a very different approach and went in a very different direction.
Seriously, some of you guys really need to do your homework before talking about things that you don’t understand and know nothing about.
Troll much? You seem to have mixed up “BSD” with “Linux”. A very strange form of dyslexia that appears to be common amongst Linux trolls.
Projecting much?
Where did I claim Linux wasn’t reinventing the wheel? And why would I have to discuss Linux in a post regarding *BSD?
For what it is worth, both communities reinvent the wheel at different levels: Linux does it at the distribution/package-management level (rpm-based, apt-based, etc. distros using the same kernel), whereas the *BSDs do it at the kernel/toolchain level (NetBSD, FreeBSD, OpenBSD, etc.).
I am neither a Linux nor a BSD troll. You, however…
Perhaps you need to do your research better. The BSDs don’t reinvent the wheel; they heavily share code with each other. They also don’t reinvent their kernels and toolchains very often. The BSD toolchains have not been reinvented; they had their own toolchain before Linux was even created. The only real reason for any of the BSDs to use the GNU toolchain is to support the Linux emulation environment, which really isn’t necessary.
Oh, you’re most certainly a troll. If you’re not a Linux troll, then it’s probably because they won’t have you. At least their trolls have been known to do their homework from time to time; you seem to prefer to just spout nonsense. Perhaps you just need a hug, but you won’t get one here.
Considering that the BSDs have been around longer than Linux and have real Unix genetics, it’s fairly obvious that it’s not any of the BSDs that are reinventing the wheel. Also, PCC’s codebase is older than GCC’s. This isn’t flamebait, it’s fact.
What exactly does age have to do with the lack of duplication of effort?
Also, who the heck was talking about Linux? GNU is not Linux, and it has been around for a while too.
Be careful about assuming that your knee-jerk reaction and “facts” are the same 😉
Are you dim? Age helps establish prior art, which shows who’s reinventing the wheel and who originally invented it. If you can’t understand that simple concept, then you shouldn’t be talking with the adults.
I know GNU isn’t Linux, genius. The fact of the matter is that GNU & Linux are in a symbiotic relationship at this point. GNU doesn’t have a kernel (no, HURD does not count), so it needs Linux. Linux does not have a userland, so it needs GNU at this point in time. What affects one, affects the other (just in case you don’t know what a symbiotic relationship is).
Funny, as I thought that was Linux’s trick.
Whether it’s the perpetual state of redevelopment that happens with Linux’s hardware abstraction layers, audio stacks and so forth, or the fact that GNU usually ends up reproducing existing BSD software because of BSD/GPL incompatibilities: the BSD kernel predates Linux, the BSD userland predates GNU and, most importantly, PCC predates GCC by more than 10 years!
Now I’m not about to start a “my OS is holier than thou” debate, as I use ArchLinux daily as my primary desktop and love it. But in my opinion it is unfair to argue that the BSDs are constantly reinventing the wheel just because they happen to be working on bringing an older piece of kit up to date rather than pandering to the GPL and the tangled mess of replicated software that’s licensed under it.
I don’t think you understood my point at all.
First off, I have no clue why some of you assume I was implying anything about Linux in my post, since I was simply discussing BSDs. That. is. all.
Also, having at least three or four different projects implementing competing kernels from an original common codebase, like the BSDs do, seems to me the very definition of “reinventing the wheel”, regardless of whether BSD predates Linux or PCC is older than GCC.
Furthermore, it is not an either/or proposition. The BSDs and Linux both waste a lot of resources duplicating effort to reach the same sort of goal with different approaches. In fact that seems to be a defining characteristic of free/open source: NetBSD vs OpenBSD vs FreeBSD vs… (in BSD-land), Apt vs RPM vs YaST vs Slack vs… (in Linux-land), GNOME vs KDE vs… etc, etc, etc.
I am not saying whether that is a good or a bad thing. I am just pointing out that is a characteristic.
I can’t speak for others, but I certainly didn’t assume you were making a direct Linux vs BSD comparison. However, intended or not, Linux is still part of the context, given that we’re comparing a GNU toolkit (GCC) with a BSD one (PCC). That’s why I replied: given the context of your statement, I considered it unfair.
Yeah, but BSD does this for how many OSes? NetBSD, OpenBSD, FreeBSD and DragonFlyBSD. Granted, there are the Mach kernels and the XNU BSD implementation too, but those aren’t really seen as BSDs so much as other flavours of UNIX.
Whereas Linux has dozens of hybrids, ranging from slight variations on vanilla (such as ArchLinux) to entire forks (like Android).
That’s a defining characteristic of any competitive market regardless of whether the source code, technologies or IP is open or closed.
Windows has a different kernel to OS X. Intel build different chips to AMD. Ferrari build different engines to BMW. While in each case the fundamentals are essentially the same, it’s the defining differences between the products that lead people to choose one item over another.
However, I will concede that it is annoying when you see one buggy and incomplete open source project replaced with another buggy and incomplete project.
Anyway, tangent aside, I’m sorry if I offended you with my earlier rant. As I said before, I objected mostly to the context rather than the statement. However, I suspect our views on the matter are ultimately pretty similar.
First off, your point is rather pointless. If you’re talking about open source projects, then the only viable license-wise comparisons are between the totally free licensed OSes (BSDs, OpenSolaris, etc.) and the virally licensed OSes (any GPL-based OS). What you seemingly fail to understand is that none of the BSDs are competing with each other; they complement each other. They’re based on the same codebase and they stay pretty close to each other code-wise. They aren’t implementing different kernels; their kernels are past the implementing stage and are mostly in maintenance mode until new features are deemed necessary. When that happens, the code usually propagates between the three main BSD projects. No reimplementation is needed when you can share code with no licenses putting artificial barriers up between projects.
You also seem to fail to realize this one principle: YOU AREN’T THE ONE REINVENTING THE WHEEL IF YOU ARE THE ONE WHO INVENTED IT AND YOU’RE STILL USING THE WHEEL THAT YOU ORIGINALLY INVENTED.
You’d think that it’d be obvious…
But I don’t think PCC development is done at the expense of LLVM/Clang. It doesn’t help anyone if development is so heavily geared towards one compiler over another that it makes code portability difficult. That is the current problem with so many open source projects: they’re so designed around GCC and Linux that code portability has become secondary to the mission of creating good tools that can be used on any open source *NIX-like operating system.
“That is the current problem with so many open source projects – they’re so designed for GCC and Linux that code portability has become secondary to the mission of creating good tools that can be used on any open source *NIX like operating system.”
What he said. There is a problem now: developers are only targeting Linux, which makes it hard to port software to other *nix OSes.
You’ve given me an idea about how to make Linux and BSD folks agree on something…
My suggestion is simple: people should just drop UNIX altogether. It’s a filthy mess, designed for a long-forgotten family of computers. Each time modern hardware comes out, it proves even harder to make UNIX scale on it. The UNIX model is simply not suitable for many of the use cases we currently try to shoehorn it into, and people should really see the elephant in the room at this stage instead of writing yet more incompatible variants in the hope that theirs will magically address all of the core UNIX design issues.
Should a sufficient number of people and large organizations drop UNIX, the remaining followers of this outdated OS family would suddenly start to unite and pay more attention to things like API stability and software portability.
Problem solved.
< /trolling >
Resistance is futile, you will be assimilated.
Seriously, I think this is a good thing. I will be modded down for this, but that is my opinion. The BSD system is good technically, but socially it is obsolete. The license is not strong enough, and what it has brought is a lot of incompatible proprietary forks with short-term goals and long-term damage. In my opinion the GNU project is socially more advanced, and new developments should happen there.
How is this relevant?
Unlike Richard Stallman and the GPL army… some of us use the computer for doing stuff rather than for mental masturbation about how righteous we are for setting the code free.
Most developers in this world honestly don’t give a shit whether the code is free… they have houses, cars, children etc.
I am fed up with this social-movement bollox when it comes to computing… it is a tool for getting things done… nothing more… nothing else.
I am going back to Visual Studio now and I am going to write some proprietary C#… because it belongs to my employer, because he pays me for sitting in a dark IT dungeon for 9 hours a day… and coffee break is over.
No problem, I will not put a gun to your head to make you care about the long-term social consequences of computer software.
All of this is probably not relevant to you. Sorry if I hit a nerve. Please just ignore my posts when I talk about this.
Well, yeah, but most people in this world honestly don’t give a shit whether their society is free once they have houses, cars, children etc. So it doesn’t prove much.
It is called a change in priority. Family is more important than whether all the code in the world is free.
Most of those developers eventually get tired of buying compiler licenses, operating systems and computers just to serve the interests of companies. Most of them eventually see through the planned obsolescence that is forced on them.
They know that their companies are not free to choose, that their governments are not free to choose, and that the money from planned obsolescence is also paid in part by developers, as they are customers and citizens of a country.
That money and time does not go to the developer’s family, but to certain interested parties.
So most of them… finally come to care about free software.
I am a developer and I contest your comment. You write as if developers working at software companies don’t get compensated.
While I have made some free and open source code available, I still understand that compilers have a high price: I’ve studied and taught compilers. I’ve been looking for a C++ IDE for my next project and considered Visual Studio. Given the price tag, I came to the conclusion it wasn’t a product for me, so I’ve installed the CDT plugin for Eclipse for the time being.
I understand that quality basic building blocks have a high price. While looking for tools, I also looked at the price of the commercial Qt license (I admit it left me breathless); it’s almost twice my monthly pay, so it’s not for me, at least not now. I’ll be looking for a more affordable toolkit. When I can afford it, I will gladly pay, because I know it’ll fund future developments.
I think most developers appreciate free software but also, given that they’re aware it doesn’t magically appear as a result to prayers, chants and dances, developers also understand that software has a price. For instance, what OopsBackup offers is priceless to me, I’ve made donations to FreeCommander and even posted a forum topic asking uTorrent to set up a donation possibility. For some other users, Sibelius or Photoshop would also be priceless. Each of us developers assign a value to software products, just like any consumer assigns a value to goods; if the product happens to be below that value, or even free, then fine. Otherwise, we leave it alone and just sigh. I wouldn’t be surprised if this description of what devs think and feel was the most accurate… but how would we know?
Unless you are planning to do something very strange… Qt will cost you nothing.
That is to say, you can make “commercial” programs without the “commercial license”.
Please see http://qt.nokia.com/products/licensing/
Thanks for the info. I read the licensing terms ten days ago when doing a little research on toolkits and languages, and I inferred that using Qt without a commercial license would impose conditions on my commercial product. So I decided that when I am ready to start, I will call them and have a little chat to make sure I understand the license terms.
I am not a member of any of the BSD groups & do not speak for them. With that being said…
Social??? BSD was never a social entity. It didn’t grow into prominence through social networking or through people doing OS development as a hobby. BSD got its start in academia. For those who don’t know what that means: BSD’s origins, which set the basic mood and pace of the BSD ecosystem, are deeply rooted in being technically sound and correctly implemented. There’s a heavy emphasis on solving the problem correctly the first time by using engineering techniques. This is also the reason none of the BSD groups feel the need to replace working frameworks at absurdly short intervals just because someone wants to do it another way with no real technical reason to do so. This is real technology with real engineering behind it; social graces need not apply.
License??? The BSD license does exactly what it’s supposed to do. The BSD teams aren’t so ego-driven and paranoid that they think everyone’s out to steal their code. They know that their code is good, and they’re generally happy just knowing that technology they created is being widely used. This is in the original spirit of what it once meant to be a computer geek: share and share alike. However, apparently, there are those who fear that other people will use their code without giving anything back, as if that were the reason they originally wrote the code in the first place. But we all know the true nature of fear. Fear leads to anger. Anger leads to hate. Hate leads to GPL-based code. Which becomes a self-replicating cycle, because with the viral nature of the GPL comes the viral spread of more fear.
Proprietary forks??? Licenses don’t cause incompatible proprietary forks with short-term goals and long-term damage; short-sighted programmers and hobbyists who think they know better than well-trained software engineers cause incompatible proprietary forks with short-term goals and long-term damage. Let’s get one thing clear: Linux is the newcomer here, and the GNU community are the ones causing incompatibility. It’s really just that simple. Standards that BSD and other *nix systems support are generally trampled on by the GNU ecosystem. The *nix world was starting to recover from all of the incompatible BS that was once rampant, and everyone was really starting to standardize on POSIX and other standards, when GNU came and started kicking the anthill again. And to be perfectly honest, none of that is Linus’s fault. It’s the fault of the GNU community that latched onto his kernel after not being able to get their own kernel up and running (btw, their own kernel still hasn’t arrived at the party).
Socially advanced??? This is computer science. It’s not meant to be socially advanced, it’s supposed to be technologically advanced, which GNU code clearly is not. If it were, maybe there would actually be a GNU OS rather than a pitiful attempt to bind GNU to Linux by calling it GNU/Linux. If Linux had never used GNU’s GPL license, Linux would probably still have progressed to where it is today; GNU, however, would still be on the sidelines trying to get their dead cow of a kernel up and running in a viable way.
I normally avoid the holy war between the GNU guys (who appear to want to start a fight with everyone) and the BSD guys (who actually seem to prefer writing code to fighting with the GNU guys), but this has gotten a bit ridiculous. This is the *nix community, which includes everyone, not just GNU’s developers and supporters. If you guys don’t like the way we do things and think that our ways are outdated, then you’re welcome to leave our community and keep to yourselves. Oh, and on your way out, leave our pipes, filesystems, system calls, windowing-system standards (well, actually you can keep those), file metaphor and device-driver style at the door. Then give Plan 9 back its proc filesystem and give Linus back his kernel. See what life is like when all you have is the HURD.
Sorry for the rant and the lengthy post, but someone needs to say something. What makes things worse is that 9 times out of 10, this guy doesn’t even write code!
Amen
Kinda love you man.
You guys obviously didn’t get a word of what I was saying, but that is OK. Let me just tell you that I do indeed write code. Everything has social consequences. Of course, most coders just want to write code and don’t care about that. Coders don’t care about the digital divide, politics or social matters. That’s not their thing. Most of them are not interested. That is what you guys keep repeating, and I fully agree with you; you don’t need to convince anyone of that. It just does not matter to me what coders care or do not care about. I was talking about what I care about.
You guys should not care about that. I may use your code to push my political agenda. The BSD code is technically good, I have already said it. It can be assimilated into GPL code with enough modifications, and you guys do not care about that, or you would use a stronger license. You do not care when Apple assimilates it. You guys should just write code and ignore the politics involved. If you do care about the politics involved, then you are welcome to discuss that with me. If politics get on your nerves, then just ignore me when I talk about them.
No, we don’t get what you are on about. Not every bit of code a developer writes has social consequences; in fact I pretty much think the social impact of my code is nil.
Very good post; for the sake of nitpicking I’d just like to correct a single point.
GNU is not the FSF. I can write GPL-licensed software because I think that’s the way copyright should be used (as a way to promote open innovation) without wanting to start a war with BSD folks every time I see them.
On the other hand, said folks should also understand that using BSD has nothing to do with realism, freedom, or anything of the sort. If BSD proponents actually believed in that instead of blindly copy-pasting the license, they would be using the public domain or its WTFPL variant already, or MIT licensing if they want a disclaimer to hide behind.
The very reason the BSD license (as can be found here: http://www.freebsd.org/copyright/license.html ) exists is the megalomaniac will of the University of California to see its name displayed in every software package using that license, without any risk of being sued in case said software happens not to work and causes the death or injury of people in the process.
That it has somehow managed to become widespread outside of Berkeley, while being nothing but the EULA version of an ad, keeps puzzling me. But well…
Kudos.
I didn’t know I could fall in love with a rant… Another first.
I hope someday someone will port PCC to Windows. On Windows, GCC is even slower than on Linux, and Microsoft’s compiler is no alternative: MS abandoned C and does not even attempt to support current standards.
A stable, fast, well-supported C99 compiler with useful error messages (GCC is horrible there) would be a dream. The best thing we have right now is Pelles C, but that’s a one-man, part-time effort.
I mean, I know that the PCC guys won’t do it themselves, but it’s called the “Portable” C Compiler, right? So I assume porting wouldn’t be that difficult. Unfortunately, I lack both the time and the skill.
PCC has been ported to Windows; someone was building snapshots of it for a while and advertising them on the PCC mailing lists.
You’ll probably have to build it yourself.
On Windows you have TCC and, more importantly, PellesC.
Kochise
The reason the BSDs are looking at other compilers is because GCC moved to GPLv3, which the BSD legal teams have decided cannot be packaged in the base distribution (the kernel and userspace that makes up the minimal BSD operating systems).
FreeBSD has basically forked the last GPLv2 GCC and has been slowly updating it to keep pace with what the Linux distributions ship, since GCC is still the king of optimizations. However, GCC is an extremely complex piece of software, built very hackishly, and it has been a losing battle for the FreeBSD developers.
Do not forget that GCC is actually descended from EGCS, which was a collection of patches and extensions created during a period of stagnation while RMS was in charge of GCC; he handed control over to the EGCS project in the late ’90s. Things built by committee are often hackish.
So the BSDs have been looking at other compilers (PCC and Apple’s clang/LLVM). OpenBSD looked at PCC even before GCC’s change to GPLv3, since Theo wants everything in his base distribution to be BSD- or ISC-licensed. However, nothing optimizes like GCC yet, though clang/LLVM is getting close.
GCC’s ability to compile other languages is pretty useless for the base distributions, which are C/C++ code, but it helps with compiling applications.
Also, GCC takes forever to actually compile the stuff.
If you really want to see the limitations of using the old GCC in the FreeBSD base, try compiling a new svn snapshot of mplayer.
THEN YOU WILL RAGE!