It's true that projects like MenuetOS, written in assembler, sport a tiny memory footprint, are blazingly fast and whatnot, but the average programmer doesn't usually learn, or even want to learn, assembler. They want to write in a high-level language, preferably a scripting language or one utilizing a virtual machine, so that once the code is written, it can run everywhere. Think of calling a taxi versus first buying the parts for a car, then building it, and only after that actually driving where you originally wanted to go.
The point is, programmers are and always have been lazy. Now, thanks to fast hardware, it's possible to write in very high-level languages compared to the 1980s. Back then you could write programs for a DOS/Windows PC in assembler, Pascal or C, and that was about it. If you wanted a GUI and the OS didn't provide one, you coded it yourself or did without. Now you're free to choose whatever language you happen to like. If you're given the task of programming a new word processor, you can do it in C, C++, Java, Perl, Python, Objective-C, Visual Basic, Object Pascal, Scheme/Lisp, etc., to say nothing of the different GUIs or frameworks (GTK, Qt, .NET/Mono, ...). Because programmers are lazy and because there is a huge amount of code and libraries ready to be used, programmers take shortcuts and thus (usually) spawn bloat.
Reusability is a two-edged sword (or three-edged, if you're a Vorlon). On one hand it saves both time and money, enforces modular programming and lets the programmer concentrate on the task at hand rather than worry about how to implement the stuff that's beside the point. On the other hand, if your program uses a bloated library, your program is bloated. Complex libraries often have complex APIs, negating the time saved by using them; break such a library up into a set of simple libraries that don't depend on each other and you might both reduce bloat and simplify the API. Then again, if you have n+1 small libraries, each doing only one thing, you have bloat again...
Let's all hope we run out of silicon next year, so that we have to use the hardware we have now for the next twenty years. Maybe that would teach us how to optimize code.