Linked by MOS6510 on Wed 17th Apr 2013 21:20 UTC
General Development "You often hear about how important it is to write 'readable code'. Developers have pretty strong opinions about what makes code more readable. The more senior the developer, the stronger the opinion. But, have you ever stopped to think about what really makes code readable?"
RE: Comment by lucas_maximus
by hhas on Thu 18th Apr 2013 22:47 UTC in reply to "Comment by lucas_maximus"

"Sorry if someone has a problem understanding what a whole number is, they shouldn't be programming."

Actually, the fact that NumberOfCoins is declared as an Integer is itself misleading: it should be a whole number (i.e. an integer with a minimum value of zero and no upper bound), since you can't hold negative coinage in your pocket but could in theory have any number above that (tho' the US Mint and the laws of physics might put a practical cap on that eventually).

(Also, I strongly believe any C*/Java developer who bangs on about the vast importance of Integer and Float needs to be locked in a room full of Type theorists and Haskell and Eiffel programmers until they learn what a real type system is. >:)

As for who should or shouldn't be 'programming', there are lots of folks in the world who'd recognize the term 'number' or 'whole number' a lot more easily than 'integer'. Consider any spreadsheet user, for example: they might be a domain expert in finance and accounting, but they're not what you'd call a 'conventional' programmer by any measure. They don't care what a cell's 'type' is called, only that its contents adhere to their required rules and hold their exact value 100% reliably at all times.

While you're at it, consider the fundamental difficulty, not to mention overt danger, of handing Floating Point types to such users. Especially those who deal with precision-critical tasks like money management, where Very Important Folks like shareholders and the IRS might not be so happy to take "just some FP rounding errors" as an excuse.
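To see why FP and money don't mix, here's a quick illustration in Java (matching the thread's C*/Java context); binary floating point simply cannot represent ten cents exactly:

```java
double total = 0.0;
for (int i = 0; i < 10; i++) {
    total += 0.10;          // add ten cents, ten times
}
// total is now 0.9999999999999999, not 1.0:
System.out.println(total == 1.00);   // prints "false"
```

One penny of drift per dollar won't show on a bank statement, but accumulate it over a few million transactions and the auditors will have questions.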

Frankly, the only thing nastier than dealing with numbers in a computer program is dealing with text: contrary to traditional programmer beliefs, most of the planet does not speak ASCII, many middle- and far-eastern scripts are crazy hard to handle correctly, and even quite unexotic ones can have a few nasty tricks up their sleeves.
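Even the seemingly trivial question "how long is this string?" goes wrong once you leave ASCII. For example, in Java, whose strings are UTF-16, any character outside the Basic Multilingual Plane takes two char units (a "surrogate pair"):

```java
String s = "🙂";     // one visible character (U+1F642, outside the BMP)
System.out.println(s.length());                       // prints 2 -- char units
System.out.println(s.codePointCount(0, s.length()));  // prints 1 -- actual code points
```

And that's before you get to combining marks, normalization forms, or scripts where "one character" isn't even well defined.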

So, y'know, while it's easy to say "Integer or GTFO", being able to recall what 'integer' means is only a small first step down the actual rabbit hole - and I think even experienced developers can easily overlook just how deep the devil really goes.


Soulbender:

since you can't hold negative coinage in your pocket

No, but you could be in debt.

Yes, those are things to watch out for with numbers in programming, but none of it has any relevance to the dubious claim that NumberOfCoins "hides" anything about the nature of the data it stores.


RE[3]: Comment by lucas_maximus
by hhas on Fri 19th Apr 2013 12:22 in reply to "RE[2]: Comment by lucas_maximus"
hhas:

"since you can't hold negative coinage in your pocket

No, but you could be in debt.

Do not mistake Coins, which are shiny pieces of metal of various shapes and sizes, for Money, which is an abstract concept, and frequently overdrawn.

Oh sure, if one really insists one can always write stuff like:

int NumberOfCoins = 5;
int DaylightRobbery = -5;
NumberOfCoins = NumberOfCoins + DaylightRobbery;

But then I wouldn't blame either the language or the reader, but the smarmy smartass who wrote it - because that's exactly the sort of rotten communication skill that rightly gives practitioners a bad name.

As for dealing with money, I still wouldn't want to type it as Integer or Decimal (and definitely never Float). I'd declare my own Money type or Dollar type or whatever was the most descriptive name appropriate to the task and then use that. That clearly describes my intent in creating that variable and properly decouples my own program logic from underlying hardware/language implementation.
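A minimal sketch of what such a domain type might look like in Java - all names here are mine, not from the thread. It stores whole cents in a long (so no binary rounding) and enforces the "no negative coinage" invariant at construction:

```java
// Hypothetical Money type: whole cents only, never negative.
public final class Money {
    private final long cents;

    private Money(long cents) {
        if (cents < 0) {
            throw new IllegalArgumentException("Money cannot be negative: " + cents);
        }
        this.cents = cents;
    }

    public static Money ofCents(long cents) { return new Money(cents); }

    public Money plus(Money other) { return new Money(this.cents + other.cents); }

    public long cents() { return cents; }

    @Override
    public String toString() {
        return String.format("$%d.%02d", cents / 100, cents % 100);
    }
}
```

Usage: Money.ofCents(500).plus(Money.ofCents(25)) prints as $5.25, while Money.ofCents(-5) throws at construction - the DaylightRobbery trick from above simply won't compile into existence. That's the point: the invariant lives in the type, not in every reader's head.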

BTW, this is why I've grown a tad contemptuous of C*/Java type systems and the folk who hold them up as the One True Way Typing Should Be Done. Compared to the sorts of type systems you can find in functional and other 'exotic' languages, C*/Java typing is downright crude in concept and execution, limited in its expressiveness and mindlessly bureaucratic to use.

A good type system ought to be a positive productivity tool for expressing your program's intent as effectively and efficiently as possible. But too often C*/Java typing is used as a personal crutch for lazy thinking, sloppy coding, premature 'optimization' and OCD tendencies: negative programming habits blessed with an official seal of approval. Such a mismatch/shortfall is not only a technical deficiency but a cultural one as well. My earlier (long) post has more discussion of why cultural problems are every bit as important as - and fundamentally inseparable from - the technical ones, so I'll not repeat it here.
