Acknowledged
CIQQA-3581

Casting Float to Number does not yield Number

This code:

import Toybox.Lang;

var myFloat = 5.0;
var myNumber = myFloat as Number;
var mod = myNumber % 1;    // crashes here at run time

Compiles without warnings, but crashes at runtime with:

Error: Unhandled Exception
Exception: UnexpectedTypeException: Expected Number/Long, given Number/Float

As far as I can tell:

  • this behavior is undocumented
  • the types "Number/Long" and "Number/Float" are undocumented
  • the effects of casting numeric types are undocumented

Consequently I'm not sure how to characterize this bug. If it's expected, this is a documentation bug. If it's not expected, this is a runtime and/or compiler bug.

I read the relevant section on "type casting" in the SDK docs. I've reproduced the totality of the documentation on type casting here:

The as keyword can also be used in an expression to type cast a value to another type. This can be useful if the type is not clear to the type system.

This, unfortunately, does not help very much.

---

Tested on Descent G1 & G2 simulators

SDK 8.3.0

  • > Underflow, overflow

    I think these are handled by wrapping around (like C), rather than throwing a catchable exception or raising a fatal error.
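
    A quick way to check (just a sketch, assuming Number really is a 32-bit signed integer that wraps like C):

    import Toybox.System;
    var max = 0x7FFFFFFF;       // 2147483647, the largest Number
    System.println(max + 1);    // if it wraps, this prints -2147483648 instead of throwing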

    > arithmetic

    > promotion 

    Yes, there are various implicit conversions/promotions when you do arithmetic, but they're not really documented.

    - division: if one of the operands is a Float or Double, then floating-point division is performed

    - integer division: if both operands are integers, then integer division is performed

    - comparisons: numeric types are automatically converted/promoted as necessary so that comparisons with <, >, ==, ... work as expected
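
    In code, the rules above look something like this (a small sketch):

    import Toybox.System;
    System.println(5 / 2);      // 2 - both operands are Numbers, so integer division
    System.println(5 / 2.0);    // 2.5 - one operand is a Float, so floating-point division
    System.println(5 == 5.0);   // true - the comparison promotes as needed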

    Some things that may be unexpected:

    - Adding 2 Chars with "+" results in a String: 'a' + 'b' => "ab" 

    - Adding Char and Number/Long results in incrementing the Char: 'a' + 1 => 'b'

    - Null (or any other non-Boolean type) cannot be implicitly coerced to Boolean. This will be a compile-time error. If you disable the type checker or cast the value to Boolean, it will compile, but there may be unexpected effects at runtime, including crashes.
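
    The Char cases above, in code:

    import Toybox.System;
    System.println('a' + 'b');    // "ab" - a String
    System.println('a' + 1);      // 'b' - still a Char, incremented by 1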

    > While numeric types are objects, there also seems to be no operator overloading. Correct?

    Correct.

    Another thing to note: Long and Double are immutable value types like the other "primitive" types (i.e. Boolean, Number, Float, String, Null), but unlike Number and Float they are allocated on the heap rather than the stack, which gives them a huge memory penalty that adds up. So it's best to avoid using Long or Double in an array with many elements unless you really need to.

    (The docs hilariously once said Monkey C doesn't have primitive types, because everything is an object, but then referred to primitives anyway, all in one sentence. So you can also think of them as the immutable value types. Even though they are Objects, they have much less memory overhead than other objects in Monkey C.)

    - Stack: Boolean, Number, Float, Null.

    These values take up 5 bytes of memory (1 byte for the type tag, 4 bytes for the value).

    - Heap: Long, Double, String

    Long and Double take up 13 bytes of memory (1 byte for the type tag, 4 bytes for the pointer, 8 bytes for the value).

    - Heap: Array, Dictionary

    These have roughly 13 bytes of memory overhead above and beyond their contents. Dictionaries are more expensive than arrays in general, because there's significant overhead for each key/value pair.

    - Heap: other Objects

    Objects have a huge memory overhead of ~35 bytes, which makes them very expensive even compared to Arrays and Dictionaries
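
    Using those rough figures, the difference mostly shows up in large containers. A sketch (the numbers in the comments are estimates based on the sizes above, not measurements):

    var floats = new [1000];
    var doubles = new [1000];
    for (var i = 0; i < 1000; i++) {
        floats[i] = 1.0;     // ~5 bytes per element - the value lives in the slot
        doubles[i] = 1.0d;   // ~13 bytes per element - the slot points to a heap-allocated Double
    }
    // roughly ~5 KB vs ~13 KB for the same 1000 values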

    --

    Since there's no comprehensive language reference, unfortunately the only way to discover a lot of this stuff is to find out on your own, or hope someone else posted about it.

    No worries! Like I said, Monkey Types is very similar to TypeScript, which is a compile-time type-checking layer on top of JavaScript. Monkey Types is basically a compile-time type-checking layer on top of the original Monkey C language.

    As in Monkey C, a cast in TypeScript only serves to change the type checker's idea of what the type is, not to perform run-time type coercion. TypeScript is better in this respect in the sense that it does give you a warning or error when you attempt to cast between unrelated types, and you have to write some additional code to tell TypeScript you *really* meant to do that. It's also better in the sense that its syntax for type declarations is completely different from its syntax for type casts.

    > It seems to be impossible to specify Double or Long literals. Instead you have to write "5.0.toDouble()" or "5.toLong()". Correct?

    Incorrect. There are examples of Long and Double literals here:

    https://developer.garmin.com/connect-iq/monkey-c/functions/ 

    var l = 5l;                 // 64-bit signed integers
    var d = 4.0d;               // 64-bit floating point

    Of course, without this syntax it would be impossible to specify a Double literal that exceeds the precision of a Float, or a Long literal that exceeds the range of a Number. Indeed, you would get a compile-time warning if you wrote such a value without the appropriate suffix.
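
    For instance (the specific values here are just illustrative):

    var big = 5000000000l;              // exceeds the 32-bit range of Number, so the l suffix is needed
    var precise = 3.14159265358979d;    // more digits than a Float can hold, so the d suffix is needed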

    Thank you very much for taking the time to clarify all of this.

    Therefore, the only purpose of a type cast is to override the type checker's idea of the type of a variable/value. A cast can *never* change the actual run-time type (or value) of a variable.

    This is helpful. Considering this code:

    import Toybox.Lang;
    import Toybox.System;

    // Number is a 32-bit signed integer
    var myNumber as Number = 0;

    function setMyNumber(value) {
        myNumber = value;
    }

    function test() {
        setMyNumber(5.12);
        System.println(myNumber.format("%.2f"));
    }

    According to the docs, this code should not compile:

    Once a type has been bound to a value, the compiler will only allow values of that type to be assigned.

    Since it does compile, this implies type coercion; but thanks to your explanation, I now understand that the docs are wrong.

    Prior to your explanation, I expected a numeric conversion to occur and the output to be "5.0". In fact the output is "5.12". I see now that this is just normal duck typing behavior. The presence of a casting operator made me think that a numeric conversion was performed, but I suppose this is on me, as the docs do state that there are no primitive types, only objects. Intuitively, I would expect "as Float" to call ".toFloat()" on numeric types; this would follow the least-surprise principle which Monkey C claims is a priority. In any case, it's now clear why casting effects aren't documented - there aren't any effects.
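
    To make the contrast concrete for anyone else reading (a sketch; the real conversions are the toNumber()/toFloat()/etc. methods):

    import Toybox.Lang;
    import Toybox.System;
    var a = 5.12 as Number;        // compile-time only: the run-time value is still the Float 5.12
    var b = (5.12).toNumber();     // an actual conversion: the run-time value is the Number 5
    System.println(a);             // 5.12
    System.println(b);             // 5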

    > Btw, this is why type casting is so dangerous in Monkey C, and why I always suggest avoiding it unless absolutely necessary.

    I completely agree with you and will now eliminate casting from my program.

    > Those aren't distinct types, it's just typically bad CIQ error messages. [...] the "/" has 2 different meanings in the same message. In "Expected A/B", the "/" means "or", but in "given C/D", the "/" means "and".

    Incredible. Thank you for explaining this. It's clear to me now.

    var x = ("goodbye" as Number);
    var y = ("hello" as Number);

    The fact that this compiles drives home your point about casting even more.

    Since you seem to be a better language reference than the official docs, I have a couple more questions:

    1. How is arithmetic handled? Underflow, overflow, promotion etc. are all undocumented as far as I've seen.
    2. It seems to be impossible to specify Double or Long literals. Instead you have to write "5.0.toDouble()" or "5.toLong()". Correct?
    3. While numeric types are objects, there also seems to be no operator overloading. Correct?

    Thank you again, I think you have saved me a lot of confusion.

  • > Exception: UnexpectedTypeException: Expected Number/Long, given Number/Float

    > the types "Number/Long" and "Number/Float" are undocumented

    Those aren't distinct types, it's just typically bad CIQ error messages.

    What that error means in the context of "x % y" is that:

    - "Expected Number/Long": each of the operands x and y is expected to be Number or Long 

    - "given Number/Float": one operand was Number and the other operand was Float

    Yes, I realize this is vague and lacking in self-consistency - for example, the "/" has 2 different meanings in the same message. In "Expected A/B", the "/" means "or", but in "given C/D", the "/" means "and".

    But you can validate what I'm saying with a few examples:

    var x = (1.0 as Number);
    var y = (2.0 as Number);
    var z = x % y; // Exception: UnexpectedTypeException: Expected Number/Long, given Float
    // ("expected either Number or Long, given Float")
    --

    var x = (1.0 as Number);
    var y = ("hello" as Number);
    var z = x % y; // Exception: UnexpectedTypeException: Expected Number/Long, given Float/String
    // ("expected either Number or Long, given Float and String")
    --
    var x = ("goodbye" as Number);
    var y = ("hello" as Number);
    var z = x % y; // Exception: UnexpectedTypeException: Expected Number/Long, given String
    // ("expected either Number or Long, given String")
  • Btw Monkey C has both run-time and compile-time types, which are related but not identical.

    Run-time types existed in Monkey C from the beginning, while compile-time types / type checking were only introduced a few years ago.

    Obviously if you have 0 casts in your code, then assuming there are no bugs in the type checker, the compile-time type should at least not contradict the run-time type. Although the compile-time type may be *vaguer* than the run-time type - like the compile-time return type of loadResource() being a generic Resource, as opposed to a specific resource type like BitmapResource.
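
    That's one of the few legitimate uses of a cast, e.g. (a sketch - the resource id here is made up):

    import Toybox.WatchUi;
    // Rez.Drawables.id_logo is a hypothetical resource id, just for illustration
    var bmp = WatchUi.loadResource(Rez.Drawables.id_logo) as WatchUi.BitmapResource;
    // the cast only narrows the vague compile-time type for the checker;
    // the run-time type was already BitmapResource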

    But there have been bugs in the API type definitions, where values in the API have had the wrong compile-time types (e.g. Number instead of Float or Number). And there are known issues where the compile-time type for values in a container (like Array or Dictionary) can be wrong, since the type checker sometimes doesn't take into account the fact that you can change/add elements of a different type than the existing elements in the container.
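
    For example, something like this can slip past the checker (a sketch of the kind of issue I mean):

    var a = [1, 2, 3];    // the checker infers something like Array<Number>
    a[1] = "hello";       // the run-time element type is now String
    var n = a[1] % 2;     // may still type-check as Number % Number,
                          // then throw UnexpectedTypeException at run time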

    So it's definitely not like a strongly typed language (like Java or Swift), where the compile-time type system is almost infallible (in the absence of casts).