Linked by MOS6510 on Wed 17th Apr 2013 21:20 UTC
General Development "You often hear about how important it is to write 'readable code'. Developers have pretty strong opinions about what makes code more readable. The more senior the developer, the stronger the opinion. But, have you ever stopped to think about what really makes code readable?"
Thread beginning with comment 559247
RE[3]: Comment by lucas_maximus
by hhas on Fri 19th Apr 2013 12:22 UTC in reply to "RE[2]: Comment by lucas_maximus"
hhas
Member since:
2006-11-28

"since you can't hold negative coinage in your pocket


No, but you could be in debt.
"

Do not mistake Coins, which are shiny pieces of metal of various shapes and sizes, for Money, which is an abstract concept, frequently overdrawn.

Oh sure, if one really insists one can always write stuff like:

int NumberOfCoins = 5;
int DaylightRobbery = -5;
NumberOfCoins = NumberOfCoins + DaylightRobbery;

But then I'd blame neither the language nor the reader, but the smarmy smartass who wrote it - because that's exactly the sort of rotten communication skill that rightly gives practitioners a bad name.

As for dealing with money, I still wouldn't want to type it as Integer or Decimal (and definitely never Float). I'd declare my own Money type or Dollar type or whatever was the most descriptive name appropriate to the task and then use that. That clearly describes my intent in creating that variable and properly decouples my own program logic from underlying hardware/language implementation.
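Very roughly, something like this - a minimal Java sketch, where the Dollars name, its methods and the choice of BigDecimal underneath are all just illustrative assumptions on my part, not a finished design:

import java.math.BigDecimal;
import java.math.RoundingMode;

// Hypothetical domain type: callers think in Dollars, not in whatever
// representation happens to live underneath it.
public final class Dollars {
    private final BigDecimal amount;  // one possible underlying representation

    public Dollars(String literal) {
        this.amount = new BigDecimal(literal).setScale(2, RoundingMode.HALF_UP);
    }

    private Dollars(BigDecimal amount) {
        this.amount = amount.setScale(2, RoundingMode.HALF_UP);
    }

    public Dollars plus(Dollars other) {
        return new Dollars(amount.add(other.amount));
    }

    @Override
    public String toString() {
        return "$" + amount;
    }
}

// e.g. new Dollars("19.99").plus(new Dollars("4.95")) prints as $24.94.

Swap the internals later (integer cents, a fixed-point library, whatever) and the rest of the program never notices.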


BTW, this is why I've grown a tad contemptuous of C*/Java type systems and the folk who hold them up as the One True Way Typing Should Be Done. Compared to the sorts of type systems you can find in functional and other 'exotic' languages, C*/Java typing is downright crude in concept and execution, limited in its expressiveness and mindlessly bureaucratic to use.

A good type system ought to be a positive productivity tool for expressing your program's intent as effectively and efficiently as possible. But too often C*/Java typing is used as a personal crutch for lazy thinking, sloppy coding, premature 'optimization' and OCD tendencies: negative programming habits blessed with an official seal of approval. Such a mismatch/shortfall is not only a technical deficiency but a cultural one as well. My earlier (long) post has more discussion of why cultural problems are every bit as important as - and fundamentally inseparable from - the technical ones, so I'll not repeat it here.

Reply Parent Score: 2

Alfman Member since:
2011-01-28

hhas,

"As for dealing with money, I still wouldn't want to type it as Integer or Decimal (and definitely never Float). I'd declare my own Money type or Dollar type or whatever was the most descriptive name appropriate to the task and then use that."

If a language has a decimal type (like databases do), that would be ideal; why wouldn't you want to use that?

In practice, though, many web languages have such weak typing that you don't have much choice (JavaScript/PHP). Floats aren't great for cash, but it's not usually a noticeable problem outside of banking because typical cash arithmetic doesn't produce irrational numbers, so rounding should be sufficient. However, to be honest, I wasn't even sure which way fractional taxes etc. *should* be rounded.

http://boston.cbslocal.com/2010/10/01/curious-if-6-25-sales-tax-wil...


If you buy two items for $1 each, and tax is charged at the end, it could result in discrepancies if you return them individually.

Purchase:
subtotal = $1 * 2 = $2.00
tax = $2.00 * 6.25% = $0.125 # round up or down?
total = $2.125 # round up or down?

Obviously, if the purchase price is not evenly divisible by the number of products, then there is no way for every individual return to be refunded the same amount while still summing back to the original total. Does anyone know whether the tax on returns typically gets recalculated in the context of the original transaction? If they are done as new transactions, one may be able to exploit the rounding differences to earn a penny ;)
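To make the ambiguity concrete, here's a rough Java sketch using BigDecimal; the half-up rounding to whole cents is just my assumption, and whether a real register rounds per item or on the subtotal is exactly the open question:

import java.math.BigDecimal;
import java.math.RoundingMode;

public class TaxRounding {
    public static void main(String[] args) {
        BigDecimal price = new BigDecimal("1.00");
        BigDecimal rate  = new BigDecimal("0.0625");  // 6.25% sales tax

        // Tax rounded to whole cents per item, then summed across both items:
        BigDecimal perItemTax = price.multiply(rate).setScale(2, RoundingMode.HALF_UP);  // 0.06
        BigDecimal taxedSeparately = perItemTax.add(perItemTax);                         // 0.12

        // Tax computed once on the subtotal, then rounded:
        BigDecimal subtotal = price.add(price);                                               // 2.00
        BigDecimal taxedTogether = subtotal.multiply(rate).setScale(2, RoundingMode.HALF_UP); // 0.125 -> 0.13

        System.out.println(taxedSeparately + " vs " + taxedTogether);  // 0.12 vs 0.13
    }
}

Same rate, same goods, and the two methods already disagree by a penny.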



"BTW, this is why I've grown a tad contemptuous of C*/Java type systems and the folk who hold them up as the One True Way Typing Should Be Done. Compared to the sorts of types systems you can find in functional and other 'exotic' languages, C*/Java typing is downright crude in concept and execution, limited in its expressiveness and mindlessly bureaucratic to use."

It's not clear to me what you are criticizing. Both of these languages let you create your own Dollar class if you wanted to.

I do criticize languages for not having unsigned types (Java, VB); this occasionally causes problems when the data type is truly not supposed to be signed.

http://javamex.com/java_equivalents/unsigned_arithmetic.shtml

Reply Parent Score: 2

RE[5]: Comment by lucas_maximus
by hhas on Sat 20th Apr 2013 13:29 in reply to "RE[4]: Comment by lucas_maximus"
hhas Member since:
2006-11-28

"As for dealing with money, I still wouldn't want to type it as Integer or Decimal (and definitely never Float). I'd declare my own Money type or Dollar type or whatever was the most descriptive name appropriate to the task and then use that."

If a language has a decimal type (like databases do), that would be ideal; why wouldn't you want to use that?


I'd use Decimals to represent the cash values, sure, but to explain what the variable actually represents I would want to type that as, say, USD or UKP. Or, if the program allowed users to mix-n-match, I'd type the variable as Money then define a Money subclass for Decimals that also indicates the monetary system and provides methods for combining and converting disparate values (e.g. $5.50 + £2.20 as Yen).
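Something along these lines, as a rough Java sketch - the Money class, its currency handling and the exception are all made up here to show the shape of the idea, not a finished design:

import java.math.BigDecimal;
import java.util.Currency;

// Hypothetical currency-aware value type: the amount is a decimal underneath,
// but the type also records *which* money it is.
public final class Money {
    private final BigDecimal amount;
    private final Currency currency;

    public Money(String amount, String currencyCode) {
        this.amount = new BigDecimal(amount);
        this.currency = Currency.getInstance(currencyCode);
    }

    public Money plus(Money other) {
        if (!currency.equals(other.currency)) {
            // A fuller design would convert via an exchange-rate service here.
            throw new IllegalArgumentException("Cannot mix " + currency + " and " + other.currency);
        }
        return new Money(amount.add(other.amount).toPlainString(), currency.getCurrencyCode());
    }

    @Override
    public String toString() {
        return currency.getCurrencyCode() + " " + amount;
    }

    public static void main(String[] args) {
        Money usd = new Money("5.50", "USD");
        Money gbp = new Money("2.20", "GBP");
        System.out.println(usd.plus(new Money("1.25", "USD")));  // USD 6.75
        System.out.println(gbp);                                 // GBP 2.20
        // usd.plus(gbp) would throw rather than silently adding unlike units.
    }
}

The point isn't this particular class; it's that "5.50 of US dollars" keeps its meaning instead of being flattened to the bare number 5.50.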


In practice, though, many web languages have such weak typing that you don't have much choice (JavaScript/PHP). Floats aren't great for cash, but it's not usually a noticeable problem outside of banking because typical cash arithmetic doesn't produce irrational numbers, so rounding should be sufficient.


You can easily get rounding errors on just a couple of decimal places, e.g.:

>>> 324.21 * 100 // 1 / 100
324.2


The pennies become pounds, and the pounds become big honking liabilities.

You can mitigate that a bit by representing monetary values in their smallest units, e.g. cents/pennies rather than dollars/pounds. But the mere fact that you cannot trust floats to be accurate should be a red flag, and I wonder how many systems implemented that way even bother to mention all these caveats to their users. To give another example:

>>> 0.7 * 0.7 == 0.49
False
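And coming back to the smallest-units mitigation: a quick Java sketch (names made up) of why integer cents at least keep addition exact, even though multiplication (tax, interest) still forces a rounding decision somewhere:

public class CentsDemo {
    public static void main(String[] args) {
        // The same sum, once in floating-point dollars and once in integer cents.
        double dollars = 0.10 + 0.20;
        System.out.println(dollars == 0.30);  // false: binary floats can't hold 0.10 or 0.30 exactly

        long cents = 10 + 20;                 // $0.10 + $0.20 held as exact integers
        System.out.println(cents == 30);      // true
        System.out.printf("$%d.%02d%n", cents / 100, cents % 100);  // $0.30
    }
}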


"BTW, this is why I've grown a tad contemptuous of C*/Java type systems and the folk who hold them up as the One True Way Typing Should Be Done. Compared to the sorts of types systems you can find in functional and other 'exotic' languages, C*/Java typing is downright crude in concept and execution, limited in its expressiveness and mindlessly bureaucratic to use."

It's not clear to me what you are criticizing. Both of these languages let you create your own Dollar class if you wanted to.


The problem with the likes of C*/Java languages is primarily a cultural one, which is then reinforced by the design of the language itself. Such user-language combinations take a highly mechanistic, reductionist view of domain problems, where everything is dragged down to the lowest, generic level; thus £2.20 is stripped of most of its core meaning until it can be viewed and treated as just another decimal number, 2.20.

Users of such languages are encouraged and conditioned to throw away vital meaning in this way because that is what the language makes easiest for them to do. Languages are not simply a one-way tool: they also have a strong feedback element that influences the ways in which we do (and don't) think. And once such languages become entrenched in mainstream business and education, their whole philosophy and behavior becomes cyclical and self-reinforcing, until their users cannot conceive of any other way to think or behave.


Compare and contrast this with the Lisper philosophy of taking the base language and then building it up with domain-specific vocabulary until it is able to efficiently solve problems within that domain.

Of course, to perform this bottom-up, domain-centric building strategy effectively, the coder needs to have a good understanding of that specialist domain themselves. Which, alas, is something a lot of C*/Java devs have absolutely no interest or desire to do. They just want one group of people (Architects, Business Analysts, User Liaisons) to hand them a complete and comprehensive finished Specification which they can then unthinkingly convert to code form and hand off to another group of people (Testers, Users) to check that it does indeed fulfill requirements.

Many such programmers actively and jealously guard their deliberate ignorance of - and complete disconnect from - the actual problem domain in question, often aided and abetted by equally ignorant management which believes that if they aren't hammering out code for the full 50 hours per week then they aren't actually working. The notion that a programmer might spend a day/week/month sitting with users and learning about what they do and how they think and work is anathema to both: to the code lover it means stepping outside their comfort zone and having to perform hard mental ('menial!') exercise; to the PHB it means turning their collection of ten-a-penny disposable code monkeys into valuable and hard-to-replace company assets.

The only winners in this are OCD jobsworths and toxic martinets. And the ultimate losers are the end-users who have to do all the actual (i.e. revenue-generating) work, because when all this shit flows downhill, guess who's left standing at the bottom? This is not programming, it is self-serving code masturbation and a total willful abdication of responsibility; the intellectual equivalent of crackhead tweaking.


Yes, it's a rant, but not everything that's wrong in programming and programs can be boiled down to a trivial purely technical problem with an equally pat technical solution: mainstream programming culture is a serious source of major modern ills too. I cannot abide people who, when faced with a problem they do not understand, drag it down to their own level of ignorance, laziness and effed-up understanding, rather than build up their own level of understanding until they can tackle it competently and effectively. I've long since fixed such toxic tendencies in myself, so I've no tolerance now for others who continue to make my and others' lives needlessly painful by failing/refusing to do the same.


I do criticize languages for not having unsigned types (Java, VB); this occasionally causes problems when the data type is truly not supposed to be signed.


That's just a degenerate instance of the much larger general requirement to specify appropriate bounds on any type.

It also illustrates my previous point perfectly: you are missing the woods for the trees, because you have been trained to think in terms of low-level literal machine implementation (bits and bytes in memory) rather than expressing your high-level abstract requirements completely, concisely and with total precision.

IOW, you're thinking about numbers as a CPU thinks about them, not as a mathematician or scientist or accountant or economist thinks about them. But the whole point of writing a program isn't to make the machine happy, it's to satisfy the exact requirements of the domain expert who uses that program. Step into their shoes and walk around in them awhile, and perhaps you can shrug off such ingrained preconceptions and prejudices and learn to see the bigger picture.
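To make that concrete with a made-up Java sketch (the Quantity name and its single check are purely illustrative): "unsigned" is just one bound among many that a type can carry for you, stated once where the domain rule lives rather than re-checked all over the code.

// Hypothetical bounded type: the constructor enforces the domain rule,
// so "this value is never negative" lives in the type, not in scattered checks.
public final class Quantity {
    private final long value;

    public Quantity(long value) {
        if (value < 0) {
            throw new IllegalArgumentException("Quantity cannot be negative: " + value);
        }
        this.value = value;
    }

    public Quantity plus(Quantity other) {
        return new Quantity(this.value + other.value);
    }

    public long toLong() {
        return value;
    }
}

The same pattern covers "between 0 and 100", "a valid ISO currency code", and so on - the bound is part of what the value means.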

Reply Parent Score: 2