Linked by Howard Fosdick on Sat 24th Nov 2012 17:52 UTC
Editorial: Do you depend on your computer for your living? If so, I'm sure you've thought long and hard about which hardware and software to use. I'd like to explain why I use generic "white boxes" running open source software. These give me a platform I rely on for 100% availability. They also provide a low-cost solution with excellent security and privacy.
Thread beginning with comment 543304
earthwirehead
Member since:
2012-11-27

The main quibble I have is that the quality of entry-level stock components has gone down over the years and the failure rate has gone up. Unless you either have a great relationship with a local hardware dealer (and these are becoming increasingly difficult to find) or have someone on staff who can invest the required time and effort to ensure that you're sourcing good components, this business strategy could wind up soaking up an awful lot of otherwise billable time.

The only other issue I would raise is that using desktop Linux as part of a small-business IT strategy, even a business that does IT, is going to require more training than most computer users are going to willingly undertake. Most computer users--including IT professionals--are basically like most automobile users. They know how to put in gas and use the steering wheel and are otherwise quite content to remain in ignorance of technology they depend upon.

Although the actual usability gap between Windows 7 and the most popular current Linux distros is virtually nonexistent, stuff happens. When stuff happens in Windows, people shrug and accept it. When stuff happens in Linux, the typical response is a tirade about how crappy Linux is. Even if you're in a position to insist that your consultants are Linux-knowledgeable, I foresee needing the one box running Windows (probably w/a QuickBooks license) for whoever is answering the phone and keeping the books.

Reply Score: 1

tony Member since:
2005-07-06

The only other issue I would raise is that using desktop Linux as part of a small-business IT strategy, even a business that does IT, is going to require more training than most computer users are going to willingly undertake. Most computer users--including IT professionals--are basically like most automobile users. They know how to put in gas and use the steering wheel and are otherwise quite content to remain in ignorance of technology they depend upon.


That's true for all of the technology we use. You might know how to compile a kernel, figure out which kernel module you need to get the sound card to work correctly, or be able to diagnose that the constant dropping of the wireless signal is based on a buggy driver that hasn't been updated yet by your distro of choice (and when you bring the problem up someone invariably offers a "superior" distro that you should switch to).

But could you go through the driver code, line by line, and solve the problem? Could you design your own PCIe card? Have any idea what the individual traces do? Or the SATA signaling, what the preamble on a SATA command is for (or what it consists of)? Could you re-solder a cracked motherboard (or even know how the different trace lengths might affect timing)?

We all have a demarcation point with technology. There is a huge, quite literally unfathomable (by a single human mind) amount of complexity hidden from us in the technology we come to rely on. Not only do we choose to ignore it, we couldn't effectively use the technology without most of it being hidden.

But that's the point. The more of the technology that is hidden from us, the more useful it is. Computers used to be programmed in machine language, then assembler, then C and other compiled languages, then scripting languages. Every layer we bury from sight means we've reached a new level.

There will always be a need for those that understand the deeper levels (and those people will be highly valued), but it's not necessary (nor practical) that we all do.

I put gas in my car, and my car's computer tells me when to change the oil and perform other maintenance. That's fine by me, because I use it to get around. The workings of it don't interest me, but I enjoy the benefits that it brings. Barring a zombie apocalypse, I'm fine with that relationship. There are people that love to tear down engines, rebuild... whatever in an engine. And that's great. But thankfully today, you don't need to know that to own a car.

Reply Parent Score: 3

Alfman Member since:
2011-01-28

tony,

I agree with you & the OP. Generally, most people don't need to know the low-level details, and that's a good thing, because it makes us more efficient and less distracted.

"The more of the technology that is hidden from us, the more useful it is."

My own view though is that the low level things should remain out of the way, yet accessible for those of us who'd benefit from writing/installing third party modifications. We're seeing many modern platforms simply cutting off access to low levels. That's a big problem because it represents a growing inequality regarding access for developers/engineers who'd otherwise be able to further drive innovation.

Reply Parent Score: 3

zima Member since:
2005-07-06

The more of the technology that is hidden from us, the more useful it is. Computers used to be programmed in machine language, then assembler, then C and other compiled languages, then scripting languages. Every layer we bury from sight means we've reached a new level.

And before that, re-plugging cables or flipping binary switches.

I wonder what is the next level...

PS. Maybe distribution of task-specific VMs, with everything that's needed nicely included and not much else? (versus recent projects like the RPi, which seem to focus more on hardware - so a bit stuck in the past; of course, the RPi is genuinely useful for many things... but one of its goals - offering a safe way to experiment with OSes & programming while isolating potential damage - can be covered nicely by VMs)

Edited 2012-11-28 10:30 UTC

Reply Parent Score: 2

lucas_maximus Member since:
2009-08-18

Although the actual usability gap between Windows 7 and the most popular current Linux distros is virtually nonexistent, stuff happens. When stuff happens in Windows, people shrug and accept it. When stuff happens in Linux, the typical response is a tirade about how crappy Linux is. Even if you're in a position to insist that your consultants are Linux-knowledgeable, I foresee needing the one box running Windows (probably w/a QuickBooks license) for whoever is answering the phone and keeping the books.


Not being funny, but the other day I installed Fedora 17. Booting from a USB stick, I hit an error that basically stopped the live distro from booting up. I googled the problem and there were clues on how to fix it. I made a guess that I had to remap the UUID of the USB drive in GRUB, so I resorted to writing down the UUIDs of my disks (one was 16 characters long and I think it was my SD card).
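
For anyone who hits the same thing, the rough shape of the fix (the UUID, kernel version and file names below are made-up placeholders, not the exact values from my machine) was something like:

    # list the UUIDs of every attached disk/partition
    sudo blkid

    # then, in the GRUB menu entry, identify the boot device by UUID
    # instead of by device name, so drive enumeration order doesn't matter
    search --no-floppy --fs-uuid --set=root 1234-ABCD
    linux /boot/vmlinuz-3.6.10-2.fc17.x86_64 root=UUID=1234-ABCD ro
    initrd /boot/initramfs-3.6.10-2.fc17.x86_64.img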

The thing is that with a lot of Windows errors there tends to be a workaround... when Linux dumps you at a terminal with a cryptic error message, or something just offers no output after erroring out (usually GUI apps that are basically a front end to the CLI)... it does become somewhat frustrating.

In reality there are very few Windows errors now that aren't friendly.

Reply Parent Score: 4

Alfman Member since:
2011-01-28

lucas_maximus,

"Not being funny, but the other day I install Fedora 17. Booted from USB stick, I had an error that basically stopped the live distro to boot up, google the problem and there were clues on how to fix the problem. I made a guess that I had to remap the UUID of the USB drive in Grub so I resorted to writing UUIDs of disks down (one was 16 characters long and I think it was my SD card)."


I've had long-standing issues with GRUB on removable media. Infuriatingly, they didn't fix this with grub2. The partition map GRUB used was probably incorrect for your system, so GRUB stabs around cluelessly, loading from arbitrary drives. If you manually fixed the UUID, you should also check that it's loading the right kernel. It might still be using a kernel on your hard drive and just using the UUID to boot the distro on your SD card.
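
A quick sanity check (the exact output will obviously differ from machine to machine) is to compare the kernel and root device you actually booted against what's on the stick:

    # which kernel did we actually boot, and with what parameters?
    uname -r
    cat /proc/cmdline

    # which device/UUID is mounted as root right now?
    findmnt /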

Is this an image you compiled yourself? Most Linux live boot disks use syslinux instead, because it doesn't get confused about which media it needs to boot off of.


"The thing is that with a lot of Windows errors there tends to be a work around ... when Linux dumps you at a terminal with a cryptic error message or something just offers no output after erroring out..."

There's no excusing the GRUB problem you had, but I don't think your generalisation is fair. Sometimes the easiest path to fixing a Windows problem is to reinstall it.

Reply Parent Score: 2

zima Member since:
2005-07-06

The main quibble I have is that the quality of entry-level stock components has gone down over the years and the failure rate has gone up.

I think that's looking at the past through somewhat rose-tinted glasses. Computers of the past were often notoriously unreliable (remember 8-bit micros?).

And/or: entry-level components in the PC world now have a much lower absolute price, so it's not that bad of a trade-off.

Reply Parent Score: 2