Today, we cannot know whether people would still buy Microsoft products, because the government protects the monopoly. What percentage of the market would Microsoft have in a fair market? The only way to answer that question is to stop manufacturers from preloading Windows. Until then, we do not have a free market, and Microsoft has no way to prove otherwise, says LXer.
There are many 'really alternative' operating systems currently in existence. Most of them exist purely for research, personal enjoyment, or as a coding sandbox. Some of them, however, want to achieve wider acceptance. Is that goal attainable in the current OS climate?
Many people take it as a given that the desktop computer market is ossified and completely dominated by Microsoft. But, taking the global view, the PC market is anything but saturated. Huge, untapped markets will ultimately decide how the market-share pie is divided. There will be room for Microsoft, Apple, and Linux, but how will it shake out?
It's the old catch-22 of the job market: It's hard to get a good job without experience, but it's hard to get respectable experience without a good job. But if you're looking to enter the job market, why not take advantage of the huge opportunity that Open Source Software provides? You can make a meaningful contribution to a high-profile project, based merely on your skills and initiative.
When Pythagoras invented a new way to make calculations with triangles, there was not yet a European Patent Organization. Bad luck, because everybody knows that patents stimulate innovation. Pythagoras invested much time in contemplation, yet anyone could use his new mathematical method for free. How could others be stimulated to make the same investment for no financial benefit?
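For reference, the method in question is presumably the Pythagorean theorem, which relates the sides of a right triangle:

```latex
% Pythagorean theorem: for a right triangle with legs a, b
% and hypotenuse c,
a^2 + b^2 = c^2
% e.g. the classic 3-4-5 triangle: 3^2 + 4^2 = 9 + 16 = 25 = 5^2
```

Anyone, anywhere, could apply this relation without paying a licensing fee, which is precisely the point the editorial is making.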
Most people switching to Linux aren't switching because of the ease of use, the shiny graphics, or the pretty interface. They switch for a variety of reasons: some say it lowers operating costs, some do it to support Free Software, and others are drawn by its technical capabilities.
Today's confirmation that Apple is going x86 makes this a historic day for the industry. It may mean that Microsoft sees its market share decline a few percent over the next few years, but what about Linux? If Linux were to lose a similar share, it would set back its spread to the desktop, a spread that has been very positive so far.
"What BitMover Got Wrong" over at The Better-SCM site analyzes the flawed mindset that guided BitMover, the company behind BitKeeper, through the long, eventful, and sad history of its gratis product.
What's the easiest way to ensure that GPS tracking technology doesn't become the tool of an evil elite? Make sure everyone has access to it! For people already thinking along these lines, the availability of tracking technologies such as RFID tags and GPS chipsets is confirmation that we're all living in a Panopticon. The saving grace is that the CIA and the Trilateral Commission don't have a monopoly on these technologies -- maybe we can turn a world of mass observation to our benefit.
Being the best doesn't always mean being the most popular. We all know of many inferior products that are immensely, sometimes perplexingly, popular. However, this does not mean that one must forsake the pursuit of excellence when pursuing a broad market share. As proponents of open source software, it should not be beneath us to pursue popularity or to look to proprietary developers as examples. And by following the right examples, we can help spread the usage of open source software without sacrificing the goal of software excellence, says NewsForge.
Wishful thinking? Yes, but let's consider the possibilities. The last couple years have seen significant advances in hardware production and design. One of the more interesting (and potentially revolutionary) developments to take place this past year is the announcement of a new CPU, the STI (Sony, Toshiba, IBM) Cell processor.
This article looks at user reactions to common problems with user interfaces and corporate policies, and how these reactions can make some common business decisions counterproductive. When it comes to inconveniencing your customers, and sometimes even offending them, are some sales tactics worth it in the long run?
A c|net editorial posits that Google may be well on its way to developing a complete suite of internet-based services that could act as a computing environment for any thin client that's capable of accessing it. And Microsoft may be planning a similar move.
Thom Holwerda has written a reply to Eugenia's editorial yesterday: "Yesterday, Eugenia, editor-in-chief of OSNews.com, published an editorial that angered the open-source software community. Even though I believe Eugenia can manage on her own just fine, I do want to support the editorial, with the use of some elaborations and clarifications."
With HP's high-flying CEO Carly Fiorina departing, the company's woes are well known. But how did a firm with such a storied history and vast assets get headed down the wrong path, and what does it need to do to set its course straight?
The recent interest in full, live operating systems that fit on a 50 MB CD-ROM is amazing. It's totally astounding that they can cram so much onto such a tiny disc. But wait... let's run back to the days of old... back to, say, 1988.
In the news media, flame wars and forks are generally portrayed as detrimental to the growth of FOSS (Free/Open Source Software). But if we look at the history of FOSS, both flame wars and forks have played a crucial role in determining the growth and direction of important projects. There are also arguments that they lead to fragmentation and marginalization. There is some truth in these arguments, but there are many benefits that are often overlooked. This article looks at some of the benefits of forking and flame wars throughout history.
I have been keeping a log of my Linux experiences since August of 2002. At first, I set it up as a textbase of tips. Using the wonderful program Tuxcards, I maintained a diary.
As a recent ACM Queue article observes, the evolution of computer languages is toward later and later binding and evaluation. So while one might quibble about the virtues of Java or the CLI (also known as Microsoft .NET), it seems inevitable that more and more software will be written for, or at least compiled to, virtual machines. While this trend has many virtues, not the least of which is compatibility, current implementations have several drawbacks. However, by cleverly incorporating these features into the OS, or at least including support for them, we can overcome these limitations and in some cases even turn them into strengths.
Problem: Even the most powerful PCs become unresponsive during resource-intensive computations, such as graphic design, media encoding, and image rendering and manipulation. The traditional solution has been to upgrade to a faster computer and throw more computing power at the problem to lessen the wait. But there's a simpler solution that utilizes multiple machines without resorting to grid computing or clustering. For now, it involves a hack, but how hard would it be for an OS vendor to streamline the process?