Post by JQP
But to follow your suggestion would make almost every operation
on every such field nearly an order of magnitude more complex.
Here's the math.
Currency type -------------
Add: Z=X+Y
Subtract: Z=X-Y
Multiply: Z=X*Y
Divide: Z=X\Y
Scaled Integers----------
Add: Z=X+Y
Subtract: Z=X-Y
Multiply: Macro CurMul(X,Y) = (X*Y)\10000
Divide: Macro CurDiv(X,Y) = (X\Y)*10000
String: Macro CurStr(Z) = FORMAT$(Z\10000)+"."+FORMAT$(ABS(Z) MOD 10000,"0000")
Addition and subtraction are unaffected
X*Y gets replaced by CurMul(X,Y)
X\Y gets replaced by CurDiv(X,Y)
Str$(Z) gets replaced by CurStr(Z)
The errors in your examples above clearly illustrate my point. Consider
your statement: "Addition and subtraction are unaffected". Try adding
two numbers with different scaling (e.g. two and four places to the right
of the decimal). Zing! A similar problem applies to all your other
operations. What? You didn't think that you sometimes need two decimal
places (e.g. currency) and sometimes 3 or 4 (e.g. interest rates), and
sometimes 6 or more (e.g. unit cost), even after I pointed that out: "the
programmer must be conscious of the scaling (that it's there, and how
much for each field)"? :-)
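To spell out just the first one (a quick sketch in Python, since I am not
advocating BASIC either; the arithmetic is the same in any language):

  x = 15000               # 1.5000 held at four decimal places (x10000)
  y = 275                 # 2.75 held at two decimal places (x100)
  print(x + y)            # 15275 -- not a correct sum at either scale
  print(x + y * 100)      # 42500, i.e. 4.2500, only after rescaling y by hand
  a, b = 15000, 40000     # 1.5000 / 4.0000 should be 0.3750
  print((a // b) * 10000) # 0    -- dividing before rescaling (your CurDiv) loses it
  print((a * 10000) // b) # 3750 -- the scaling has to be applied first

Every one of those adjustments is something the programmer has to remember,
field by field.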
You assumed that I was advocating BASIC, even after I stated otherwise.
Exactly my point: all people, including programmers, are error-prone,
you and me as well. While a fixed-scale variable type is better than
none at all, only variably scaled decimal fields (e.g. COBOL, PL/I) are
truly suited to serious financial work. Anything else requires more
complex programming, which means human programmers *will* make
more errors. Humans can do complex things, but they *do* make more
errors when doing more complex things. Unquestionably, undeniably,
easily provable and widely known and understood by anyone even slightly
familiar with human psychology. Q.E.D. ;-)
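To make "variably scaled" concrete, here is a rough analogue (my own sketch,
using Python's decimal module rather than COBOL or PL/I picture clauses, but
the principle is the same: the scale is declared once, with the field, and the
language does the bookkeeping):

  from decimal import Decimal, ROUND_HALF_UP

  price = Decimal("19.99")      # currency, two places
  rate  = Decimal("0.0525")     # interest rate, four places
  cost  = Decimal("3.141593")   # unit cost, six places

  # mixed scales add and multiply exactly, with no manual rescaling anywhere
  total = price + price * rate            # exactly 21.039475
  print(Decimal("150") * cost)            # exactly 471.238950
  # round only where the business rule says so, e.g. back to cents
  print(total.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP))   # 21.04

That is all I mean: the scale lives with the field, not in the programmer's
head.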
Post by JQP
Does this look an order of magnitude more complex to you?
Well, your 'this' isn't sufficient. And the complexity is more than simply
X bytes more code. The programmer's constant need to be aware of the
requirement, the documentation and training considerations, and the
human propensity to do the intuitive but incorrect, are probably much
more significant. When you consider what it really does take, it is at
least an order of magnitude more complex. The complexity of doing what
is simple, intuitive and obvious is virtually nil.
The idea should be to work with human nature, not against it. Use human
characteristics to advantage, don't ignore them. Over many years of
working in situations requiring very high precision, I have always found
this to work much, much better. Unfortunately, this is an area in which
most programmers are virtually clueless.

You can see this clearly in the design
of reports and many computer languages. The designers usually try to look
at things from an abstract logical standpoint. But humans are not abstract
logical creatures. We do such things with difficulty, and it takes much
practice and discipline to do them well. We will never be as precise as
machines in such things.

But we do have strengths, such as habit and
intuition, and learned patterns of thought. These are more difficult to
quantify, but it is not so difficult to recognize patterns of behavior, both
physical and psychological. For example, humans have an expanded
ability to learn verbal language when young. These learned language
patterns are 'sublimated' (a form of mental abstraction) at an early age,
and become deeply ingrained. And since programming languages are
always learned at a much later age than verbal languages, they will never,
ever, ever be as deeply ingrained in a human mind as a native verbal
language. It is simply an unavoidable consequence that programming
languages which exploit these patterns will be easier to learn and master
than those that go against them. Logical elegance will never outweigh or
outperform facility in the human psyche. Better to recognize this and use
it to our advantage.
--
Judson McClendon ***@sunvaley0.com (remove zero)
Sun Valley Systems http://sunvaley.com
"For God so loved the world that He gave His only begotten Son, that
whoever believes in Him should not perish but have everlasting life."