Seven years after AMD first unleashed its Athlon 64 desktop chips, it looks as though the 64-bit revolution is finally about to kick off. Microsoft has revealed that out of all the Windows 7 installations out there, a massive 46 per cent of them are 64-bit.
I cannot stand how Microsoft can brag that its efforts have helped make 64-bit computing viable. Do they have no shame at all? Do they not understand that it is exactly their actions that made the transition as difficult as it has been?
I should confess that I previously thought incompetence had a lower limit. Alas, I just met a Computer Science graduate who asked me to help install Linux on his computer (for his graduate work he would be, how to say, closer to the textbook if he used Linux). I asked if he could use 64-bit Linux, because I had the disc ready, but he said his machine was only 32-bit, judging from the Windows Control Panel. It was a Core 2 Duo. Again, I am beaten by the realities of life.
I shall reiterate: hiring incompetent people costs more than hiring competent people, because their stupidity can reduce productivity for the entire network of people around them. Cue the adage about accountants knowing the cost of everything and the value of nothing.
EDIT: Fast forward a few decades, and prepare to repeat the cycle for the transition to 128-bit. It is inevitable.
1. Why would you want to go to 128-bit? What would you gain?
2. Don't you think virtualisation will have improved to the point where you can just emulate the legacy system at 90% of native performance?
Theoretically, computing can work at any word size. Memory addresses are tied to the word size only because it would be silly and slow to implement BigNum arithmetic for memory addresses.
However, when 16-bit came out, the 8-bit crowd jumped on the bandwagon, and the same happened with 32-bit and then 64-bit. What would hinder 128-bit adoption when we exhaust memory addresses again? I do not get your objection, because I meant the far future, probably more than a generation away (and I am young).
EDIT: I should clarify that I meant architecture transitions are an inevitable fact (with good reason and history behind them). The transition for Linux was no more than a breeze, while Windows is having this whole hoo-ha. We know that transitions only get more and more difficult the longer the gap between them, because stagnation begets dependency. So I am predicting that it will take even longer and cause even more trouble when we transition to 128-bit in the far future, which is actually quite plausible, isn't it?
Virtualisation does not help one bit. The Windows 64-bit transition used a huge amount of virtualisation, but it is always the drivers that cause trouble. No doubt we may have a better scenario next time, but it is clear that we are having lots of trouble on the kernel front (if we could divorce drivers from kernels we would have less trouble in transitions, but that is difficult to do), and there will be little will to do so (because 64-bit will seem to dominate for an extremely long time to any planner). We also have trouble virtualising drivers. Linux, on the other hand, did not need much virtualisation when transitioning: most things just worked right away with a recompile.
Which is why I brought up competency: competent people tend, if not always, to incorporate long-term vision. However, I can understand if that stings, because kernel devs are about as far from incompetent as you can get, in any incarnation of the kernel, and still none of these brilliant people have solved these problems.
The REAL question is whether we are going to exhaust memory addressing again. I don't think so. The 64-bit address space is not two times bigger than the 32-bit one; it is over 4 billion times bigger! 64-bit integers can encode numbers as big as 18 quintillion, and most people don't even know how much that is. There are really very few uses for numbers that big.
I agree here. I may be wrong, but I don't believe we will ever need more RAM than 64-bit can address. It is an immense amount of memory. Think of it this way: it is 4,294,967,296 times as much memory as 32-bit could access.
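To put a number on that in code, here is a tiny sketch in plain C (nothing assumed beyond the standard headers) that prints the 32-bit address-space size, the largest 64-bit value, and the ratio between the two spaces:

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    /* 2^32 bytes: everything a 32-bit machine can address (4 GiB). */
    uint64_t space32 = UINT64_C(1) << 32;
    /* 2^64 - 1: the largest 64-bit value, roughly 18.4 quintillion. */
    uint64_t max64 = UINT64_MAX;

    printf("32-bit address space : %" PRIu64 " bytes\n", space32);
    printf("largest 64-bit value : %" PRIu64 "\n", max64);
    printf("ratio 2^64 / 2^32    : %" PRIu64 "\n", max64 / space32 + 1);
    return 0;
}

The last line prints 4294967296, the same factor quoted above.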
Maybe file systems will move to 128-bit, but that is a completely different subject; it is much easier to implement and has no performance impact (whereas a 128-bit processor would actually be slower than a 64-bit one because of the extra data that has to be pushed around).
We're already at that point. 32-bit code can run on 64-bit OSes at basically 100% speed. Performance is not the issue; compatibility is.
I think your reply is much calmer and more thoughtful than the other posts, so I'll reply here instead of to the others. I think I'm already overstaying my welcome here.
First of all, we have no real idea of what is “enough”. Well, Bill Gates himself talked about 640KiB of memory being “enough” for anybody.
Your point that data transfer rates are levelling off is more plausible, but even then we have always seen progress just when things start looking as if they were going to stagnate. I think it is arrogant for human minds, with their linear intuition, to predict things that tend to move in quantum leaps.
All that said, I suppose I would also agree that 128-bit is more than likely to be confined to filesystems, and even there there is not much reason to go 128-bit, because 64-bit systems can easily couple two registers for it.
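For what it's worth, here is a minimal sketch of that register-pairing idea in C, using a made-up u128 struct of two 64-bit halves (the type and function names are just for illustration):

#include <stdint.h>
#include <stdio.h>

/* Hypothetical 128-bit value stored as two 64-bit halves. */
typedef struct { uint64_t lo, hi; } u128;

/* Add two 128-bit values using only 64-bit operations plus a carry. */
static u128 u128_add(u128 a, u128 b)
{
    u128 r;
    r.lo = a.lo + b.lo;
    r.hi = a.hi + b.hi + (r.lo < a.lo);   /* carry out of the low half */
    return r;
}

int main(void)
{
    u128 a = { UINT64_MAX, 0 };           /* 2^64 - 1 */
    u128 b = { 1, 0 };
    u128 c = u128_add(a, b);              /* expect hi = 1, lo = 0, i.e. 2^64 */
    printf("hi=%llu lo=%llu\n", (unsigned long long)c.hi, (unsigned long long)c.lo);
    return 0;
}

This is roughly what a compiler does behind the scenes when 128-bit arithmetic is requested on a 64-bit machine.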
(I’m replying sequentially, so I’ll reply here that I think compatibility is a real problem. It means that we will always be stuck with the cesspool that is x86. Stagnation here is just no good. I’m not saying that we should blindly migrate or something, but x86 is clearly suboptimal.)
I want to state here that I really do mean the super long term, probably even after I die, when I talk about 128-bit. It is something at the very end of my original post, so why is everybody so fixated on it? I cannot imagine a world without progress in this field; I'd say the probability that we will eventually go 128-bit is almost 1!
In fact, it is not uncommon to hear of 128-bit hardware in gaming systems or GPUs. With CPUs and GPUs converging, be it through GPGPU or whatever, we should be aiming to save transition effort by programming with caution with regard to architecture, not starting another round of unhealthy integration. That means portable code is what we should concentrate on, not hardware-level compatibility.
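To make "portable with regard to architecture" concrete, here is a small sketch in C contrasting a common non-portable habit (stuffing a pointer into an int, which breaks on 64-bit LP64 targets) with the fixed-width and pointer-sized types that recompile cleanly on both 32-bit and 64-bit:

#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

int main(void)
{
    int value = 42;
    int *p = &value;

    /* Non-portable habit: on LP64 systems a pointer is 64 bits but an int
       is 32, so casting a pointer to int can silently truncate it.       */
    /* int bad = (int)p; */

    /* Portable: uintptr_t is wide enough to hold a pointer on any target,
       and size_t is the right type for object sizes at any word size.    */
    uintptr_t addr = (uintptr_t)p;
    size_t len = sizeof(value);

    printf("pointer width here: %zu bits\n", sizeof(void *) * 8);
    printf("addr=%ju len=%zu\n", (uintmax_t)addr, len);
    return 0;
}

Code written this way is a large part of what makes the "just recompile" experience mentioned earlier possible.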
That user doesn't know where to look. In Control Panel, under Performance Information and Tools, you click "View and print detailed performance and system information" and it will tell you whether the computer is 64-bit capable or not.
Which is why I said the incompetency is so appalling! Confusing Windows’ bit-ness with the hardware’s bit-ness. Sigh.
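For the curious, the hardware's capability can also be checked directly, independently of whichever Windows happens to be installed. A rough sketch using GCC's cpuid.h: the long-mode bit in CPUID leaf 0x80000001 marks a 64-bit-capable x86 CPU.

#include <stdio.h>
#include <cpuid.h>   /* GCC/Clang helper for the x86 CPUID instruction */

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    /* Extended leaf 0x80000001: EDX bit 29 ("LM") is set when the CPU
       supports 64-bit long mode, regardless of the installed OS.      */
    if (__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx) && (edx & (1u << 29)))
        puts("CPU is 64-bit capable");
    else
        puts("CPU is 32-bit only (or the CPUID leaf is unavailable)");
    return 0;
}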
Thanks for that.
Mate, I went through a local polytechnic and I can tell you that the vast majority are there not because they have a passion for technology but because they think it'll get them a job. So you end up with legions of people entering the IT world, not because they're passionate about technology, but because it is "just another job".
There is a reason the state of the IT world (along with other industries) is so appalling: people choose qualifications for a job rather than a career, studying something they have no scholarly interest in beyond the ability to get a job and make the mortgage payments on time. It is mind-numbingly depressing to see the sorts of people who enter the IT world; it truly is.
When I talk to people in the IT world who have never used *BSD or Linux, or run anything other than a PC, I look at them sideways wondering how the hell they made it this far. What is even funnier: I was at a large school as the system administrator and IT teacher, and it is amazing how many students came up and told me how they used to run circles around the previous sysadmin and teacher, who knew virtually nothing about computers. Oh well, I'm not surprised; it seems that once people get to a certain age they hang up their brain and stop learning, up-skilling, and expanding their knowledge.
+1 please.
Let me add to your argument. I have quite a few friends in Computer Science who all think it will let them find employment making anime. Every time I hear of such cases, or even of people treating Computer Science as just a degree to help get a job, I feel sad.
I mean, I think someone should study computing only if they already have something they think they can do with it in the future. If I had a computer science degree, I'd list it as a footnote in my CV, not as the main thing, and I know I'd be interested in anybody who did so.
In fact, I think the whole idea in our education system that it is all right to pick a degree just to get a job should be shot. If people have no passion for a field, they should not take it up, because that is more dangerous to society as a whole than a few terrorists.
And I’ve met some computer science graduates who:
1. Cannot tell the difference between wireless routers and wireless access points (they think they are the same device with different names).
2. Think that the Windows API and MFC are precisely the same thing. Some even think that MFC was obsoleted by the Windows API, as in the former came out earlier than the latter. C'mon...
3. Claim to be web developers but know next to nothing about CMSes.
4. See open source as equivalent to freebie products.
5. Have almost zero knowledge of non-Microsoft OSes and enterprise solutions.
Something is definitely wrong with our education system.
I think it is a shame I cannot mod you up. But it also depends on where you are from; I heard that the Scandinavian countries are doing particularly well with their education systems.
Guess what: it seems that war, which I do not advocate, is good at breaking lousy education systems. Sad but true. If we have to resort to killing to bring about change, I am really torn over whether it is justifiable, though I tend to say it is not.
It is also sad if the system itself is so entrenched that it has to be changed by something so drastic.
One of the worst ideas is to use the system itself as a model template. I keep hearing from teachers that they are not encouraged to experiment; they are supposed to teach only according to methods that are tested and proven. Most systems also insist on setting minimum standards and punishing those who fail to deliver on them every period. However, we also know that you can get short-term gains and consistent improvement via rote learning. But rote learning is just like employing incompetence: it is mind-numbing and switches off the learning part of the brain! I'd rather have a teacher who is much slower at the beginning but passionate, and who later begins to inspire exponential learning.
But to boil it all down, I suppose it is precisely incompetent policy makers in the education system, smothering teachers, who make the entire system incompetent. That, and the fact that they are so busy playing "keeping up with the Joneses" and become complacent at every rank, is what powers the vicious cycle so strongly. (I suppose over-zealous and misdirected parent-teacher associations also matter, in Japan at least.)
…maybe people [read: developers] will start compiling 64-bit versions of their software. Yeah, I know that isn't usually necessary, but… I hate seeing those "*32" suffixes all over my Task Manager process window.
I am doing that with my newest game/product right now; it is 64-bit only on both the client and server side. I know of several developers who are doing the same. It just isn't worth supporting both 32-bit and 64-bit platforms.
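(Not from my actual codebase, but as a rough sketch: one way to enforce a 64-bit-only policy in C is to check the pointer width at compile time and fail the build early on 32-bit targets.)

#include <stdint.h>
#include <stdio.h>

/* Hypothetical guard for a 64-bit-only project: stop the compile if
   someone tries to build it for a 32-bit target.                     */
#if UINTPTR_MAX != 0xFFFFFFFFFFFFFFFFull
#error "This project is 64-bit only; please use a 64-bit toolchain/target."
#endif

int main(void)
{
    printf("Built for a %zu-bit target.\n", sizeof(void *) * 8);
    return 0;
}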
Probably Flash is the big handicap.