Free Software/open source often gets a bad rap when it comes to innovation. It just rips off the work of commercial developers, right? Not so, as this Linux Format piece argues. FLOSS has pioneered, or been a catalyst in, some notable changes in the computing world. Several of these innovations are OS-related.
The scripting languages we all use are all open source, and all have novel features.
3D desktops? How is that useful?
LiveCDs? DOS did this with boot floppies.
Collaborative wiki editing? Lmao[citation needed].
VNC? How is that a new idea? Remote desktop access is old as hell.
You're comparing the fully functional desktops you get with live CDs to a DOS boot disk? Which, btw, was nothing new and innovative.
I think the point was that you could fit all of DOS on a boot disk (i.e. a fully functional version of the operating system). Of course, fully functional is relative, and in the days of DOS that sure didn't include graphical desktops or anything like that. True, you had Windows 3.x, Desqview and a few others, but those were third-party add-ons and not widely considered to be part of DOS. Nowadays, though, graphical desktops are considered part of the OS.
Boot disks have been around since there were disk drives, though. A LiveCD is simply a progression of the boot disk concept; it is not an innovation. I find the comparison of LiveCDs and boot diskettes to be completely apt.
A better example of an OS on a floppy would be the original Macintosh System Software. They managed to fit an entire GUI into a 400 kB floppy diskette and something like 64 kB of ROM.
I haven’t played too much with Macs that old, but I have used machines that could fit the entire GUI and a modest sized application on a 800 kB floppy and 256 kB of ROM.
The big problem with innovation in the open source domain isn't finding it. The problem is finding innovations that people would actually want to use, because people want to use the sort of software they are already familiar with. So desktops end up with the look and feel of Windows or Mac OS X, and OpenOffice is popular because it looks and feels a lot like Microsoft Office.
I've used open source programs that have no analogs, that I'm aware of, in the world of commercial software. A lot of that software is quite good. Take something like wmii as an example. But very few people want to use it because it doesn't work like the stuff they're used to.
A better example would be the BeOS installation CDs, which were simply live CDs that ran a special bootscript that only launched the Installer app and a few other services. That was as early as ’98 IIRC, and there was also an old non-installable demo live CD of R3.5 I think.
And in turn, I believe that approach was taken from the way that install CDs of “classic” versions of MacOS worked.
VNC might be open source now, but it sure didn't start that way. Even reaching for Wikipedia is enough to find out that:
“VNC was created at the Olivetti & Oracle Research Lab (ORL), which was then owned by Olivetti and Oracle Corporation. In 1999 AT&T acquired the lab, and in 2002 closed down the lab’s research efforts.
[…]
Following the closure of ORL in 2002, several members of the development team (including Richardson, Harter, Weatherall and Hopper) formed RealVNC in order to continue working on open source and commercial VNC software under that name.
Several other versions of VNC have been developed from the original GPLed source code.”
Being pedantic here, I know, but it really troubles me when journalists (and I’m using the term liberally) don’t even bother to spend five minutes to get their facts straight.
Really? Wikipedia sure doesn't say that it started out closed-source. The key word is "continue". They formed RealVNC to continue working on OSS and non-OSS versions. Obviously, in order to continue working on something that's open source, it was already open source before.
Fair enough, but google for “vnc orl” and you’ll find:
http://grox.net/doc/apps/vnc/internalversion.html (VNC – the internal ORL version)
“Some of the functionality of the distributed VNC system is limited when compared to the version we use within ORL. This is not because we wish to deprive the rest of the world of a more sophisticated system, but because we want VNC to be easy to download and set up […]”
If you're in a hurry it might be easy to miss, but the copyright notice at the bottom of both pages reads 1998. I'm new here, but I'm not used to speaking unless I know what I'm talking about! 😉
Regardless of VNC’s pedigree, as the other poster said, remote desktop software has been around a long time. Farallon’s Timbuktu has been around since the late 80’s. http://en.wikipedia.org/wiki/Timbuktu_(software)
I think the most notable points that can be made about VNC are that:
– it’s freely-available.
– it’s available for almost any OS you can name.
– and assuming the presence of a VNC display driver, it's finally approaching the speed/responsiveness of stuff like remote X or RDesktop.
Holy cow. Did Microsoft write this? This is so inaccurate it could have come from their marketing department.
1) 3D Desktops
Compiz Fusion gives you 3D eye candy, not a 3D desktop. The 3D spinning cube pictured was copied from Mac OS X… Besides, Silicon Graphics were doing things more along the lines of a genuine 3D desktop about 2 decades ago, and I’m pretty certain that wasn’t open source.
For genuine 3D desktop things that are open source, there’s Sun’s Project Looking Glass, and Squeak’s Croquet project.
2) Live CDs
Don’t make me laugh. This is just a re-invention of floppy boot disks. People used to craft fully bootable Windows 3 floppies. Every version of Mac OS supplied on a CD was a Live CD… This is not innovation.
3) Wikis
The concept isn’t new. Ted Nelson did a bunch of work on this decades ago. Collaborative document editing has been around for an incredibly long time, in proprietary and commercial forms. Tim Berners-Lee’s original web was effectively a Wiki.
4) VNC
VNC is a remote desktop protocol… Timbuktu on the Mac was doing that in the late 80s, pre-dating VNC by nearly a decade, and indeed pre-dating Linux too for that matter.
If one wants to come up with innovation from FLOSS, you’re going to have to do much better than that little list.
"For genuine 3D desktop things that are open source, there's Sun's Project Looking Glass, and Squeak's Croquet project."
"If one wants to come up with innovation from FLOSS, you're going to have to do much better than that little list."
well there you go.
http://www.opencroquet.org
No it wasn’t. The cube was a part of Compiz first and the picture is definitely a Linux system.
Innovations are innovations. Whether they happen often has little to do with being either open source or closed source.
An example: you make an innovative new program. It is then up to you to choose whichever software license you prefer, an open source license or a closed source license. But the innovative program didn't see its birth because of the license.
Software licenses themselves don't innovate; people and developers do. The open source community can be a very fruitful place to innovate, and so can an innovative commercial company developing closed source software. It depends on the case and not on the preferred license.
Also, I agree with many other commentators that the examples mentioned in the article are kind of lame.
Perhaps by far the biggest "open source" innovation is the Internet, by the way. We wouldn't have the Internet as it is today if it were based only on closed source, proprietary non-standards and competing commercial technologies. Open standards and technologies (= open source) are what make the Internet such a successful and innovative place.
Also, for example, the Linux or BSD kernels include a whole lot more interesting innovations than those mentioned in the article (although aunt Tillie might not understand what those things are about).
The Gentoo and Debian package management systems are quite innovative too. You couldn’t have something like Portage in the closed source world.
SELinux is open source and quite innovative too, as are many other innovations in IT security.
The whole open source development model in big projects like the Linux kernel development with its global networked collaboration tools is innovative.
Many important computer languages and development platforms (Ruby on Rails etc.) are open source and innovative too. Etc.
It is remarkable when someone comes up with a great new idea and shares it with the rest of the world.
I suggest we also take a look at some of the features and extensions of Firefox. The extension framework itself is a major innovation.
[quote]Perhaps by far the biggest “open source” innovation is the Internet, by the way. We wouldn’t have Internet as it is today if it was based on closed source proprietary non-standards and competing commercial technologies only.[/quote]
I guess the key phrase that spares you here is “as it is today”, since networks existed long before the internet, and some of them were quite good.
Then came AOL, which was awful, but whose marketing somehow managed to attract everyone and his mother. So, yes, the internet is a fantastic achievement of FLOSS.
Yes, technically, and there still are many closed networks besides the huge and open Internet. But it would simply not be a working idea to base a global open network like the Internet on proprietary, closed source technology and semi-"standards" owned by some companies.
This article provides some perspective on the meaning of Linux (a free, open and stable server platform) for the birth of the Internet (as we know it today):
Would The Internet Exist Without Linux?
http://www.pcmech.com/article/would-the-internet-exist-without-linu…
Many basic network technologies used by everyone have also originated in the open source BSD world.
However, proprietary, closed source, maybe patented network and web technologies like Flash remain highly controversial: they may be expensive to use, many oppose them, and such proprietary technologies usually have many competitors too (in the case of Flash: MS Silverlight, Java-based equivalents etc.), making it very problematic to base a global open network/web on such things.
Open source is a great thing that I love, but innovation is NOT the best thing it has done.
As was said all of this existed way before…
BeOS CDs were live CDs, and I keep reminding Linux peeps of that… and it seems BeOS wasn't even the first.
X11 has been a remote desktop protocol from the start, since it uses the network.
And there are a lot of things in Linux that appeared recently that many people claim got “invented”, while they existed in BeOS or another OS a decade ago.
SMP, preempt, tickless…
I made a speech at RMLL about this very subject…
(slides in french) http://revolf.free.fr/RMLL/2008/Haiku/Haiku_RMLL_2008.pdf
My conclusion is that technodiversity is vital to "innovation", because innovation is just building from existing blocks.
This is why software patents are such a bad idea. The software world isn’t made up of sudden, big innovations. It’s an organic process of gradual improvement, with old ideas constantly being rediscovered and adapted to the current environment.
Patents stall that process, reducing efficiency. In the US, they now have the situation where a few huge corporations hold enormous patent portfolios. They all infringe each others’ IP rights, but the small developer is locked out of this free market of ideas.
Without the protection of IBM's patent portfolio and the (more-or-less) patentless haven that is the EU, open source would be vulnerable to nuisance lawsuits from MS that could cripple development.
Exactly why we fight software patents in the EU, despite the patent office granting many even though it's still illegal. SW patents are a plague. They also threaten all the small and medium businesses that make up the EU software economy, because all the patents are already held by US companies like IBM or MS. That goes for both FOSS and proprietary vendors.
Even Bill Gates acknowledged that if software patents had existed when he started, there would be no big Microsoft as we know it.
I can’t say for certain who was first with LiveCDs but
BeOS didn’t beat Linux to the punch on that one.
Look up Yggdrasil Linux sometime. They were selling LiveCDs with decent automatic hardware detection in 1992-95.
Selling? Hmm, R5PE was free as in beer.
Still, I think at that time Linux was still pretty obscure, but it's funny how easily we forget and come up with the same "new" ideas.
Well, as I said, it’s always about taking ideas and improving them.
four is a very small number!!!
there are a lot of huge contributions to the “state-of-the-art” from the open source world, including
* The Apache Web Server
* sendmail
* gcc. Though it is "yet another compiler collection", a lot of software, open and commercial, has been developed thanks to it… The world would not be the same without gcc
* Xen
* OpenSSH
* CVS and its brothers, sisters, sons, etc.
* Eclipse
* BSD jails
* ZFS
* DTrace
…
…
and so on…
Counting just four doesn't seem right to me.
I think many people will agree with me when I say that the biggest innovation from open source software is just that: "OPEN SOURCE".
Open Source itself is an innovation and has completely changed the software and Operating System market today.
The GPL is innovative too.
There are lots of innovations on a much smaller scale within open source software too – but it depends on one’s interpretation of “innovation” as to which ones make the grade.
Good point.
I’d say that the second major innovation of open source software that was entirely missed by the article is repositories and package managers.
ummm no. So much easier to download and install an app on windows and mac.
No, it isn’t. It depends on the case.
Now, how difficult is it to open a package management program and press a button a couple of times, and the program is installed with all its dependencies automatically?
Compare that to the MS Windows way, where you first have to find and go to a web site (or to a computer shop), check whether the software runs on your version of Windows or not, and whether it needs some additional software to work, download all the software packages, cross your fingers and try installing it. Besides, in the Linux and BSD world, installed software occupies much less hard disk space and doesn't leave traces of itself all over the hard drive after uninstalling it.
Also, the source-based software installation method à la Gentoo, BSD ports or Source Mage is quite innovative and has many benefits (as well as some difficulties, of course), and would be totally impossible in a closed source world.
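To make the dependency-resolution point above concrete, here is a toy Python sketch (not any real package manager's code; the package names and dependency data are made up) of how a manager can work out, from repository metadata, everything that one requested package pulls in:
[code]
# Toy illustration only: resolve the full install set for one package,
# dependencies before dependents, from a hypothetical metadata table.
DEPENDS = {
    "webbrowser": ["gui-toolkit", "libimage"],
    "gui-toolkit": ["libcore", "libimage"],
    "libimage":   ["libz"],
    "libcore":    [],
    "libz":       [],
}

def install_order(package, seen=None):
    """Return the packages to install, each listed once, dependencies first."""
    if seen is None:
        seen = set()
    order = []
    for dep in DEPENDS.get(package, []):
        order += install_order(dep, seen)
    if package not in seen:
        seen.add(package)
        order.append(package)
    return order

# One "click": the whole chain is resolved for you.
print(install_order("webbrowser"))
# -> ['libcore', 'libz', 'libimage', 'gui-toolkit', 'webbrowser']
[/code]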
Also located (in the repositories), downloaded and configured automatically.
If I want Firefox on Windows, I go to firefox.com, click on the big button, save file…done. Nice icon on the desktop. It's even easy to try out a nightly build.
If I'm on Linux and check out firefox.com, "hey, that app looks good…maybe I'll try it?" Sure, I can try downloading a tar.bz2 file but that won't install. I'd have to go out of my way to get out the package manager, scroll around a terrible UI, and if I'm lucky it will have the version I want.
Linux is completely shut out of popular websites such as download.com as a result.
Instead you have to go out of your way to open the file manager and locate the downloaded program.
And that's for Firefox, but what about installing some other program and finding that it requires something else (like, say, a newer version of the MSI installer)? Have fun googling for it.
Regarding download.com… why would anyone want to download anything from that site?
“Instead you have to go out of your way to open the file manager and locate the downloaded program. ”
no, I have an icon on the desktop when I download something. Then I click on the installer and get the icon for the app on the desktop.
If you don’t like download.com you can go to another website. You aren’t stuck with one system of finding software. And you can always download the app from the author’s website, which offers the most information about using the app.
Just yesterday I downloaded the prey demo for linux from the official web site (a “universal” .bin file). I had to execute it, click next several times, choose the install location… would have been a lot faster and easier to install it through the package manager (let alone uninstalling it).
I also have Doom3 installed but this time using the package manager. It tells me when there’re updates available, and I can browse a list of available mods which I can install or remove with a single click.
Anyway, to each his own. You just can't prove the "browse the web – download – install" approach to be better for everyone all the time.
All download managers (and browsers) that I know of provide very direct ways to (manually or automatically) open the file, or the folder where it has been saved, when the download has finished.
What people do not understand in this regard is that the system for resolving dependencies and the interaction model for locating desired program packages CAN be effectively independent of each other.
The "shop" model with its list-based UI is apparently comfortable for some users – but considering that it is the only install/update method available on some desktop Linux distributions, and that it depends on the distribution's centralized repositories, that comfort looks like a means to lure users into accepting a reality in which the system is not as "open" as it could be with some coordinated effort (which I doubt will come).
download.com is ONE possible source, together with ZDNet downloads and others I know of, the software author's site apart – but the point is, all of them usually provide the very same redistributable package; they DON'T modify and repackage the software they redistribute, as Linux distributions conversely and routinely do at the moment.
Packages should depend on the packaging system only, not on specific distributions (or their version numbers).
That is, distributions should restrain themselves to customizing only whatever components do not affect third-party applications (and even so, there's plenty of freedom), and redistribute mainly *the same* well-defined core system (additional dependencies over which should be downloadable from ANY available source).
This would prove that Linux as a whole is a viable and consistent target platform for all intents and purposes of software deployment.
Instead, we're locked in a mess caused by distributions taking over both the platform AND third-party applications, the only way to cope with which is to put ourselves in the hands of the specific distribution for every possible third-party package we could need (hoping to find one)…
You can download a deb from the web, dpkg it and use the package manager to fetch and install dependencies. The point is you don’t have to go hunt them yourself.
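For anyone picturing that flow, here is a rough sketch of it on a Debian-based system, driven from Python purely for illustration (the .deb file name is a placeholder, and you would normally just type the two commands in a terminal):
[code]
import subprocess

# A .deb downloaded from some project's website (placeholder name).
deb_file = "some-app_1.0_amd64.deb"

# Install the downloaded package itself; this may leave dependencies unmet.
subprocess.run(["sudo", "dpkg", "-i", deb_file], check=False)

# Ask the package manager to fetch and install whatever is still missing.
subprocess.run(["sudo", "apt-get", "-f", "install"], check=True)
[/code]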
Not really, commercial software has been using its own install scripts for a long time.
Well, as long as you trust them and/or go checksumming everything.
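A minimal sketch of the "checksumming everything" part, assuming the author publishes a SHA-256 sum for the download (the file name and expected hash below are placeholders, not real values):
[code]
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash the file in chunks so large downloads don't need to fit in RAM."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

expected = "0123456789abcdef..."             # copied from the author's download page
actual = sha256_of("some-app-1.0.tar.bz2")   # the file you just downloaded
print("OK, checksums match" if actual == expected else "MISMATCH - do not install")
[/code]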
That can be seen as a problem – but if you see the matter from a stability and security point of view it can be a huge benefit too. Especially if you are running a server.
If you are going to download hundreds of different binary packages and binary updates from hundreds of different online sources, how do you know how well those packages work together if nobody (the distributor) has tested that for you?
And what I see even more important:
Who can guarantee that all that software is trustworthy from a security point of view? Who knows what spyware, rootkits etc. might be hidden inside at least some of those packages? Are you just going to blindly trust everyone and install hundreds of binaries from maybe hundreds of different websites, commercial CDs etc.? I'd rather trust a well-established and trusted distributor to do the packaging for me and check what the source code really looks like.
MS Windows users are all too comfortable and used to installing who knows what odd third party software like games they have found on some odd web page. Isn’t this one of the main reasons for the huge spyware and virus problem in the MS Windows world? If you want security, that is not the way to go. You want some kind of a guarantee that the binaries you install are trustworthy and work well together.
Source and ports based software management (Gentoo, BSD ports etc.) at least gives you the chance to check what the source code looks like before compiling the software from it, so it may be more ok to automatically download source packages from many different sources and compile the software from it. But I don’t want to blindly trust every binary package offered me online – regardless of how easy the program installation might be advertised to be…
The work of Ubuntu and friends is needed just because they exist (in turn, they exist because of the sort of "anarchy" on which the FOSS world is based) – they exist and often apply patches to upstream projects, on the version they redistribute,
OR, they may build support for different architectures or processors,
OR, they may not use the exact same kernel,
OR, they may not use the same kernel as the vanilla one (oftentimes distributions settle on a kernel version and backport subsequent kernels' features to it),
OR, they may not use the exact same set of build options,
OR, they may not use the exact same point release of GCC,
OR, they may not use the exact same point release of a given library,
and so on –
From a software engineering POV, every discrepancy is a different test case – for this reason every distribution goes on the treadmill of testing and actually bundling available FOSS applications (notable exceptions of deploying half-baked software apart) to keep the chances of them breaking at a minimum;
and for this reason, third-party commercial apps are either scarce, or don't support anything other than some specific distributions (with the apparent exception of Opera).
Way to totally miss the point of my post…
Malware on Windows is largely credited as the product of people seeking users' sensitive data to exploit, and a consequence of Windows itself being the dominant desktop platform – and mostly it spreads via infected web pages (simple JavaScript will suffice) or mail spam with links or attachments, something requiring the user's intervention.
Malware-bearing closed source applications exist, but they're a relative minority in the vast landscape of third-party applications, utilities and games that can be downloaded and installed on Windows (and the Mac, for that matter) – for a minor software producer, intentionally putting spyware in their (often shareware) product is a double-edged sword: as soon as it's discovered, the producer's reputation is greatly damaged.
Even if one chooses to believe antimalware and antivirus software makers are the prime developers of malware, if one needs an antivirus he/she is usually well and safe buying one of the usual revered products or downloading it from the maker's website – as far as the installation method is concerned, it is essentially the same thing (that is, trusting the maker).
The same goes for any other kind of third-party software product, including the OS (Windows – btw, the source code for Windows is AFAIK available to security companies and government agencies to allow them to verify it against security holes).
To top it off, FOSS projects for Windows hosted on SourceForge most often supply a setup exe file.
Letting third-party applications have a common installer is something that's related to and allowed by a platform's architectural consistency; putting the blame for the presence of malware on it, and dismissing it as the root of evil (or just because it's "a Microsoft thing"), is a logical fallacy.
Moreover, it has nothing to do with the applications themselves being open or closed source software (open and closed source are nothing more than distributed development strategies chosen by the development team, not unlike extreme programming and waterfall being just workflow models, which mostly affect only the maker of a program), and could benefit everyone, from the upstream author to the user.
Except maybe the distributions themselves – in a scenario where all of them work on a common codebase, they'd have a harder time justifying investments devoted to independently reinventing the wheel; it's not that they don't have their own interest in maintaining the status quo to a certain extent.
If you say you'd trust distributions more than the original authors of the software they take and repackage, I'd say something is going wrong within the FOSS world these days…
Trustworthy: it's all up to the maker's reputation.
Work together: that's what a standard system core and packaging format, not dependent on specific distributions but developed and tested with contributions from all of them, would achieve.
Open source as a deployment model works only for us nerds; the normal user wants to be presented with a package file of some kind that he can install as quickly and with as little hassle as possible.
Nor is the possibility of looking at the source code of much use to him – he'd need someone else's help to read the code and attest to the absence of vulnerabilities, spyware or backdoors, so it'd become a matter of how much this someone is to be trusted.
I didn't say that you must download from sites you don't trust;
I said: I want some standards that all distributions should use, to achieve true binary compatibility across all distributions for the same architecture.
That would make it possible to download any software from any site (including, last but not least, the author's own, who would be able to cover the whole user base with minimum hassle and without giving up control over the building and packaging of his product) – but this likely wouldn't stop distributions from, ehm, redistributing packages for you 😉
Aunt Tillie might not be capable of doing any of that: find and go to Firefox.com, download something, install a program from an exe file etc.
Now, if you’re on Linux, Firefox is probably already there waiting for you, installed by default.
Naah.. For example, the Synaptic GUI is quite easy to use, and searching for software couldn't be made much easier. And like I said: what can be more straightforward than pressing a button a couple of times to install something?
Now, what Firefox version might you want to have? 0.1 beta..? Ubuntu has all the Firefox versions you could hope for in its default repositories, including many of the best Firefox extensions too. Besides, Ubuntu – or Debian and many other Linux distros – also takes care of updates to your Firefox, and most other software you've installed, half-automatically.
If the program you want (a nightly build of Firefox or whatever program) is really not available in any repository, usually the site you would download it from has clear instructions on how to install the software. On Linux the installation routine and package formats may naturally be different from those on MS Windows, but just being different is no excuse for saying that it would necessarily be any more difficult. If you use an OS regularly, you probably know what needs to be done.
The Synaptic GUI doesn't tell me much of anything about the software.
Click and run (cnr.com) is a whole lot better but why should a software developer have to go through *any* 3rd party when he can package the app himself and put it on his own website for people to download?
Firefox upgrades itself automatically. Again, no need for an external program.
I don’t know if you’ve tried Ubuntu in the last few years but you can download .deb installation files and install them just like installing an .exe in Windows. I do it all the time. You don’t need to use the package manager to download software. getdeb.net is a great place to find .deb install files for all kinds of software. I also find .deb install files offered on a lot of open source software developer’s websites. Maybe you should try Ubuntu? You may be more comfortable with Kubuntu though, it looks more like Windows.
Huh? What are you talking about, or are you just trolling? Of course, it is none of his business to do so. The Linux distribution in question will do the packaging for that distribution. That's how it works.
…Or if a developer of a commercial program wants to package his software for Linux, he can use some easy to use distribution neutral packaging format like autopackage http://en.wikipedia.org/w/index.php?title=Autopackage , or package it as an executable that simply runs when you click on its icon (the way many commercial games for Linux work).
If – as a developer – you think that even that would be too much, then just forget Linux and don’t bother porting your program to Linux. Maybe via WINE it is still possible to run that MS Windows program on Unix or Linux too to some extent.
Maybe autopackage is the way to go, though according to the link it still has some rough spots.
I have Ubuntu. I don't use it most of the time, but I'm not one of these fanboys that likes to attack Linux just for the heck of it. I just want it to be easier to use so that we can see more commercial software for it.
Just because I don't accept "how it works" on Linux doesn't mean I'm trolling. Linux used to be very hard to use in many other aspects. Some said "that's how it works; Linux is not meant to be easy to use" but some people acknowledged the problem and actually did something about it.
Alright. But following the Windows way in everything doesn’t necessarily mean that things would become easier. In some things Linux can already be easier and more advanced than MS Windows. Just because it is different from MS Windows doesn’t mean it would be worse.
But I happily admit that usability has traditionally been one of MS Windows’ strong points, usually, but not in everything.
Anyway, people new to Linux, Unix or some other operating system and coming from an MS Windows background shouldn't expect the new operating system to work just like MS Windows in everything, and if that's not the case, they shouldn't think that the new OS has got it all wrong. Of course different operating systems have many differences, and you just have to learn them if you want to use them. It would be the same for someone coming from a Unix background who needed to start learning MS Windows (what are computer viruses and why does one need antivirus software? What is an exe file? Or a C or A drive? Why doesn't Windows use "/" between directory names but the odd and difficult "\"? etc. etc.)
ari-free wrote:
-“I’d have to go out of my way to get out the package manager, scroll around a terrible UI and if I’m lucky it will have the version I want.”
Do you actually expect anyone to take you seriously? ‘Go out of your way to get out the package manager’? LOL, troll along dude…
Yes
http://www.cnr.com
Also, MS Windows software is "completely shut out of popular websites such as" http://www.linuxsoftware.org or http://www.icewalkers.com or packages.debian.org … Also, for example, Tucows (which is likely more popular than download.com) happily has lots of software for both MS Windows and Linux: http://www.tucows.com/Linux
There are simply websites that concentrate on Linux software and there are websites that concentrate on Windows software. That simple.
http://en.wikipedia.org/wiki/Innovation
“The term innovation means a new way of doing something. It may refer to incremental, radical, and revolutionary changes in thinking, products, processes, or organizations. A distinction is typically made between Invention, an idea made manifest, and innovation, ideas applied successfully ”
Apparently a lot of commenters don't make the distinction…
—–
Even though I'm not a programmer, I love open source "free" software; I have used it a lot here: http://www.lapantz4less.com