[chuck-users] On Division, Arithmetic and expressions

Jim Hinds jahbini at jahbini.info
Wed May 10 15:45:58 EDT 2006


Kassen has run into a "feature" of ChucK that needs some
explanation.  The apparently strange behavior of integer division
is common to most computer languages that support integers.

The basic common feature is that we want integer computations to
result in integers.  If a computation starts with all integer
variables and constants, we want it to finish with integers (when
possible).  You learned about integer division well before you
learned about fractions or decimals, and WAY before you learned
binary.  You remember that the answer to 2 / 3 was: "it doesn't go
even once" -- that's zero.  The result is well defined and
consistent.  Integer division goes hand-in-hand with the remainder
operation (mod, % or whatever it is called in other languages).
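For instance, here is a minimal ChucK sketch of that pairing (the
variable names are just for illustration):

// integer division and its remainder go together:
// 3 goes into 7 two whole times, with 1 left over
7 / 3 => int quotient;    // 2
7 % 3 => int remainder;   // 1
<<< quotient, remainder, 3 * quotient + remainder >>>;  // 2 1 7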

In most cases, you don't notice the subtle nature of integer
computations.  5 + 3 = 8 is much the same as 5.0 + 3.0 = 8.0;
whether the answer is a float or an int, you just don't care.
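For example (a small sketch; ChucK's exact print formatting may
differ a little between versions):

<<< 5 + 3 >>>;      // an int result: 8
<<< 5.0 + 3.0 >>>;  // a float result: 8.0 (same value either way)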

It is, however, important when you do division.  It is ALSO important
when you are working with "big" numbers (or very negative numbers).
When this happens, well-formed integer arithmetic will overflow in a
predictable way (and we want it that way).  For example, since 32767
(2 raised to the 15th power, minus 1) is the largest positive number
in 16-bit two's-complement arithmetic, we DO WANT our integer
computation to come out this way: 32767 + 1 = -32768, and not as
32768.0.
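ChucK's own int is wider than 16 bits, so 32767 + 1 will not actually
wrap there; but as a sketch of the idea, here is a hypothetical helper
that simulates 16-bit two's-complement wrap-around by hand:

// hypothetical helper: simulate 16-bit two's-complement wrap-around
// (ChucK's int is wider, so we mask down to 16 bits ourselves)
fun int wrap16( int x )
{
    x & 65535 => int low;   // keep only the low 16 bits
    if( low >= 32768 )
    {
        low - 65536 => low; // top bit set: re-read the value as negative
    }
    return low;
}

<<< wrap16( 32767 + 1 ) >>>;  // -32768, the predictable wrap-around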

Yes, this is bizarre, and probably not what you are expecting.  But
it is very common for computer languages, and for those of us with
little music and much computer (that's guys like me), it is like
breathing: we are so aware of it that we don't even mention it.

OK, so how do you GUARANTEE that there is no funny stuff happening  
(like truncation or overflow)?  Simple.  You FORCE at least one side  
of the operation to be in floating point.  The operation is then  
carried out in floating point and the result is in floating point.

Example: 2 / 3 == 0 (integer divided by integer results in an exact
integer),
BUT 2.0 / 3 == .666667 (float divided by integer results in a
non-exact float)
or
2 / 3.0 == .666667 (integer divided by float results in a non-exact
float)

or casting one side to a float, as in (a $ float) / 3, should result
in a float, not an integer.
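Or, as runnable ChucK (a minimal sketch; the printed digits may
differ slightly):

<<< 2 / 3 >>>;            // 0        : int / int stays an int
<<< 2.0 / 3 >>>;          // 0.666667 : float / int is carried out in float
<<< 2 / 3.0 >>>;          // 0.666667 : int / float is carried out in float

2 => int a;
<<< (a $ float) / 3 >>>;  // 0.666667 : casting one side works too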

This is mostly the way it is.  (You can tell I'm a programmer, since
I almost always put that caveat word "mostly" in my explanations:
there always seems to be an exception.)

I hope this helps you understand integer computations in ChucK, and
in other languages.


> Kassen sez:
>
> Hi list,
>
> Dividing an int by an int results in an int being returned, like here;
> ------
> <<<(2 / 3)>>>;
>
> //or even like this;
>
> 2 / 3 => float a;
> <<<a>>>;
> ------
>
> Is that really the intended behaviour?
> In that last case I personally find what happens a bit silly.
>
> If this really is the way it should be, I think it would be nice to add
> a line about this to the manual, because I only found out what was
> happening exactly after several rounds of testing.  Perhaps this is
> typical behaviour for some languages, but I wasn't used to it.
>
> Yours,
> Kas.
>
>

Jim Hinds
http://www.jahbini.org/