Here's a function I've created to add a zero at the beginning of a number when it has one digit:
It's a basic format string: it displays a decimal integer that is at least 2 characters wide, and the 0 indicates it is zero-filled when it has fewer than 2 digits.
Consider this small bit of code:
var num=1;
var str=num.format("%02d");
System.println(str+" "+str.length());
The output from the println() is 01 2
(note it's 0 padded to 2 characters)
change it to num=123 and the output is 123 3
(no padding needed)
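For comparison, here's a sketch of the JavaScript equivalent of that zero-padding, using String.prototype.padStart (which pads on the left to a given width):

```javascript
var num = 1;
var str = String(num).padStart(2, "0"); // pad to width 2 with "0"
console.log(str + " " + str.length); // prints "01 2"

num = 123;
str = String(num).padStart(2, "0"); // already 3 digits, nothing added
console.log(str + " " + str.length); // prints "123 3"
```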
So, I've found out what %02d means - "format the integer with 2 digits, left-padding it with zeroes". I've changed it to:
And why doesn't toChar().toString().length() work then? It would work in JavaScript.
I don't think numbers have a toChar() method in JavaScript.
The reason Number.toChar().toString().length() doesn't work as you expected in Monkey C is that Number.toChar() converts a number to a single Unicode character, and toString() converts that single character to a string, which will naturally always have a length of 1.
Your code would've worked with just number.toString().length(), but as noted it's not the optimal solution.
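The same pitfall can be demonstrated in JavaScript, where String.fromCharCode plays a role similar to Monkey C's toChar(): going through a character first always yields a length-1 string, while converting the number directly to a string gives the digit count you actually wanted.

```javascript
var num = 65;

// Via a character: fromCharCode returns a single character,
// so the resulting string always has length 1
var viaChar = String.fromCharCode(num);
console.log(viaChar + " " + viaChar.length); // prints "A 1"

// Directly: toString() gives the decimal digits
var direct = num.toString();
console.log(direct + " " + direct.length); // prints "65 2"
```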
It doesn't look like there is a "toChar()" function in JavaScript. There are "toCharCode()" and "toString()" functions.
Anyway, a "toChar()" function should produce a single character (if the name is at all reasonable). One that produced multiple characters would be "toChars()" or "toString()" (a string is multiple characters).
JavaScript actually has String.charCodeAt(index) and String.fromCharCode(code).
Monkey C's Number.toChar() does the same thing as String.fromCharCode(), *not* String.charCodeAt(). As dpalwyk pointed out, it wouldn't make sense for a function named toChar() to produce multiple chars anyway.
This is a bit of a rehash of everything everyone has said, but to be absolutely clear, here's the JavaScript and Monkey C code for converting a char code (as a number) to a string containing that character:
(JavaScript)
var charCode = 65; // 65 is unicode/ASCII for "A"
var str = String.fromCharCode(charCode);
console.log(str); // prints "A"
(Monkey C)
var charCode = 65;
var str = charCode.toChar();
System.println(str); // prints "A"
And here's the code for converting a character (contained in a string) to the equivalent character code (as a number):
(JavaScript)
var str = "ABCD";
var charCode = str.charCodeAt(0);
console.log(typeof(charCode)); // prints "number"
console.log(charCode); // prints "65"
(Monkey C)
var str = "ABCD";
var char = str.toCharArray()[0];
System.println(char); // prints "A"
var charCode = char.toNumber();
System.println(charCode); // prints "65"
Finally, here's the code just to get the string representation of a number, like OP wanted in the first place.
(JavaScript)
var num = 65;
var str = num.toString();
console.log(str.length); // prints "2"
console.log(str); // prints "65"
(Monkey C)
var num = 65;
var str = num.toString();
System.println(str.length()); // prints "2"
System.println(str); // prints "65"