Length of number is always 1

Here's a function I've created to add a zero at the beginning of a number when it has one digit:

function addZero(number) {
    var numLength = number.toChar().toString().length();
    System.println( "Number is" +  number + "and has length of" + numLength);
    if (numLength<=1) {
        return "0"+number;
    } else {
        return number;
    }
}
The console output always shows a length of 1. I've also tried converting to a char array and using size(), with no luck.
Example output:
Number is7and has length of1
Number is11and has length of1
Number is95655553and has length of1

  • try

    return number.format("%02d");

    Not sure why you are doing the toChar().toString(). toString() alone would do what I think you want to do, but just using format() is easier.
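
    Folding that into the original function, a minimal sketch might look like this (note one difference: format() returns a String, so this version always returns a String, whereas the original returned the Number unchanged for multi-digit input):

        function addZero(number) {
            // "%02d" zero-pads to a minimum width of 2 characters,
            // so 7 becomes "07" while 11 stays "11"
            return number.format("%02d");
        }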

  • Can you please explain what "%02d" means, and why I now get a length of 2 for the number 7 after changing to 

    var numLength = number.format("%02d").length();
  • It's a basic printf-style format: it displays a decimal integer that's at least 2 characters wide, and the 0 flag means it's zero-filled when the value has fewer than 2 digits.

    Consider this small bit of code:

    var num=1;
    var str=num.format("%02d");
    System.println(str+" "+str.length());

    The output from the println() is 01 2

    (note it's 0 padded to 2 characters)

    change it to num=123 and the output is 123 3

    (no padding needed)

  • So, I've found out what %02d means - "format the integer with 2 digits, left-padding it with zeroes". I've changed to:

    var numLength = number.format("%1d").length();

    Which means a minimum width of 1 digit and no zero padding. Now it counts the digits perfectly.
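
    For example, counting digits this way (a quick sketch):

        var num = 95655553;
        var numLength = num.format("%1d").length();
        System.println(numLength); // prints 8 - no padding, so length equals digit count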
  • and why toChar().toString().length() doesn't work then? It would work in JavaScript.

  • and why toChar().toString().length() doesn't work then? It would work in JavaScript.

    I don't think numbers have a toChar() method in JavaScript.

    The reason Number.toChar().toString().length() doesn't work as you expected in Monkey C is because Number.toChar() converts a number to a single unicode character, and toString() converts that single character to a string, which will naturally always have a length of 1.

    Your code would've worked with just number.toString().length(), but as noted it's not the optimal solution.
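
    For reference, a sketch of the original function using just toString(), keeping the original's return behavior (String for one digit, Number otherwise):

        function addZero(number) {
            // toString() gives the decimal representation, e.g. 7 -> "7", 11 -> "11"
            if (number.toString().length() <= 1) {
                return "0" + number;
            } else {
                return number;
            }
        }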

  • This isn't JavaScript. toChar() converts a number into its corresponding character, not into the characters of the number itself.

    Consider this 

    var num=48;
    var str=num.toChar();
    System.println(str);

    The output is 0. One character, because 48 is the character code for "0".

  • It doesn't look like there is a "toChar()" function in JavaScript. There are "toCharCode()" and "toString()" functions.

    Anyway, a "toChar()" function should produce a single character (if the name is at all reasonable). One that produced multiple chars would be "toChars()" or "toString()" (a string is multiple characters).

  • JavaScript actually has String.charCodeAt(index) and String.fromCharCode(code).

    Monkey C's Number.toChar() does the same thing as String.fromCharCode(), *not* String.charCodeAt(). As dpalwyk pointed out, it wouldn't make sense for a function named toChar() to produce multiple chars anyway.

    This is a bit of a rehash of everything everyone has said, but to be absolutely clear, here's the JavaScript and Monkey C code for converting a char code (as a number) to a string containing that character:

    (JavaScript)

    var charCode = 65; // 65 is unicode/ASCII for "A"
    var str = String.fromCharCode(charCode);
    console.log(str); // prints "A"

    (Monkey C)

    var charCode = 65;
    var str = charCode.toChar();
    System.println(str); // prints "A"

    And here's the code for converting a character (contained in a string) to the equivalent character code (as a number):

    (JavaScript)

    var str = "ABCD";
    var charCode = str.charCodeAt(0);
    console.log(typeof(charCode)); // prints "number"

    console.log(charCode); // prints "65"

    (Monkey C)

    var str = "ABCD";
    var char = str.toCharArray()[0];
    System.println(char); // prints "A"

    var charCode = char.toNumber();
    System.println(charCode); // prints "65"

    Finally, here's the code just to get the string representation of a number, like OP wanted in the first place.

    (JavaScript)

    var num = 65;
    var str = num.toString();
    console.log(str.length); // prints "2"
    console.log(str); // prints "65"

    (Monkey C)

    var num = 65;
    var str = num.toString();
    System.println(str.length()); // prints "2"
    System.println(str); // prints "65"