It used to happen sporadically but now it is a daily experience. As I am browsing the net I click on a link (usually a newspaper website). The page starts to load. Then I wait. And I wait. And I wait. It takes several seconds.
Once loaded, my patience is not rewarded, since my mid-2011 MacBook Air seems barely able to keep up. Videos start playing left and right. Sound is not even turned off by default anymore. This shitshow festival of lights and sounds is discouraging, but I am committed to learning about world news. I continue.
I have the silly idea to scroll down (searching for the meaty citations located between double quotes) and the framerate drops to 15 frames per second. Later, for no apparent reason, all fans start running at full speed. The air exhaust expels burning hot air. Mac OS X's Activity Monitor reveals countless "Helper" processes which are not helping at all. I wonder if the machine is going to die on my lap, or take off like a jet and fly away.
This happens even on my brand new laptop or my crazy powerful custom PC. This short article is basically a reply to the article we talked about earlier this week, and I'm pretty sure this is a subject we won't be done with for a long time to come.
I've been programming for 15 years now. Recently our industry's lack of care for efficiency, simplicity, and excellence started really getting to me, to the point of me getting depressed by my own career and IT in general.
Modern cars work, let's say for the sake of argument, at 98% of what's physically possible with the current engine design. Modern buildings use just enough material to fulfill their function and stay safe under the given conditions. All planes converged to the optimal size/form/load and basically look the same.
Only in software is it fine if a program runs at 1% or even 0.01% of the possible performance. Everybody just seems to be okay with it. People are often even proud of how inefficient it is, as in "why should we worry, computers are fast enough".
A bit ranty here and there, but this entire "old man yells at cloud" article is very much music to my ears. Software is bad. We expect software to be bad. We accept that software is bad. We make excuses why software is bad. We tell people it's okay that software is bad. We say it is inevitable that software is bad.
If any other industry were as lax about quality and performance as the software industry, we'd be up in arms.
PowerPoint is so ingrained in modern life that the notion of it having a history at all may seem odd. But it does have a very definite lifetime as a commercial product that came onto the scene 30 years ago, in 1987. Remarkably, the founders of the Silicon Valley firm that created PowerPoint did not set out to make presentation software, let alone build a tool that would transform group communication throughout the world. Rather, PowerPoint was a recovery from dashed hopes that pulled a struggling startup back from the brink of failure - and succeeded beyond anything its creators could have imagined.
Fascinating story. I despise PowerPoint because PowerPoint presentations are difficult to translate (my actual job), but there's no denying it's used in meeting rooms all over the world - for better or worse.
Many science fiction writers - including myself, Roger MacBride Allen, Gerald Brandt, Jeffrey A. Carver, Arthur C. Clarke, David Gerrold, Terence M. Green, James Gunn, Matthew Hughes, Donald Kingsbury, Eric Kotani, Paul Levinson, George R. R. Martin, Vonda McIntyre, Kit Reed, Jennifer Roberson, and Edo van Belkom - continue to use WordStar for DOS as our writing tool of choice.
Still, most of us have endured years of mindless criticism of our decision, usually from WordPerfect users, and especially from WordPerfect users who have never tried anything but that program. I've used WordStar, WordPerfect, Word, MultiMate, Sprint, XyWrite, and just about every other MS-DOS and Windows word-processing package, and WordStar is by far my favorite choice for creative composition at the keyboard.
That's the key point: aiding creative composition. To understand how WordStar does that better than other programs, let me start with a little history.
An old article from 1990, updated in 1996 and since reprinted, but still a good read.
One of the fundamental things in a medieval book is letters - those symbols that fill up page after page and that make up meaning. Each one of us human beings writes differently and considering that medieval books were made before the invention of print, it follows that the scripts they carry show a great variety in execution styles. This is perhaps the most amazing experience of spending a day going through a pile of medieval books in the library: the immense variation in the manner in which the text is written on the parchment pages.
From monks and scribes copying books letter by letter, we have now arrived at the point where the best book ever written is just a few clicks away.
Since then the '%' has gone from strength to strength, and today we revel in a whole family of "per" signs, with '%' joined by '‰' ("per mille", or per thousand) and '‱' (per ten thousand). All very logical, on the face of it, and all based on a fundamental misunderstanding of how the percent sign came to be. Nina and I can comfort ourselves that we are not the first people, and likely will not be the last, to have made the same mistake.
I love stories like this. The history of our punctuation marks and symbols is often quite fascinating.
The highlight of the new release is a far-reaching visual refresh, with menus, toolbars, status bars, and more being updated to look and work better. While LibreOffice retains the traditional menus-and-toolbars approach that Microsoft abandoned in Office 2007, the new version is meant to make those menus and toolbars easier to navigate.
What are the reasons to use either OpenOffice or LibreOffice?
Way back in 2009, I wrote about a few specific cases in which computers led to (subtle) changes in the Dutch language. While the changes highlighted in that article were subtle and not particularly substantial, there are cases around the world where computing threatens much more than a few subtle, barely noticeable features of a language.
This article is a bit too politicised for my taste, but if you set that aside and focus on its linguistic and technological aspects, it's quite, quite fascinating.
Urdu is traditionally written in a Perso-Arabic script called nastaliq, a flowy and ornate and hanging script. But when rendered on the web and on smartphones and the entire gamut of digital devices at our disposal, Urdu is getting depicted in naskh, an angular and rather stodgy script that comes from Arabic. And those that don’t like it can go write in Western letters.
It'd be fantastic if Microsoft, Google, and Apple could include proper support for nastaliq in their products. It's one thing to see Dutch embrace a new method of displaying direct quotes under the influence of computers, but to see an entire form of script threatened is quite another.