Man... Go could have been so good.
Instead it just takes all the problems that exist in programming and pretends they don't exist.
"How big is an array?" "An int big!" "...a signed int?" "Yeah sure why not." "What happens if your array is 2049 MB long on a 32-bit platform?" "*shrug*"
Srsly, there's 3 types of languages as far as I can tell: those that define a specific size for numerical types, those that let those types be any size, and Go+C.
Cause if C does it, it has to be a good idea, right?
I mean... Correct me if this is wrong, I am sleepy. But I can't think of any other languages where your most common number type varies in size as the machine word size changes.
If you have programs that talk over a network it's just begging for trouble.
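[Sketch, for the thread: a tiny program showing both problems. Nothing beyond the standard library is assumed, and the width it prints depends on whatever machine runs it.]

package main

import (
	"encoding/binary"
	"fmt"
	"strconv"
)

func main() {
	// int is 32 bits on 32-bit platforms, 64 bits on 64-bit ones.
	fmt.Println("int is", strconv.IntSize, "bits wide here")

	// len() returns that platform-dependent, *signed* int...
	xs := []byte{1, 2, 3}
	n := len(xs)

	// ...so before a length goes over the network you have to pick a
	// fixed width yourself, or two machines will disagree about it.
	var buf [8]byte
	binary.BigEndian.PutUint64(buf[:], uint64(n))
	fmt.Printf("wire bytes: % x\n", buf)
}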
@icefox F* does something interesting: integers are “mathematical” integers (so, arbitrary-precision arithmetic), and you can define more precise types, like type u128 = n: nat { 0 <= n < 2¹²⁸ }, and so on.
The compiler then maps those to the best arithmetic implementation, IIRC.
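[Roughly the same refinement written as a Lean subtype, for a concrete picture; a loose translation, not actual F* syntax:]

-- A natural number packaged with a proof that it fits in 128 bits.
def U128 := { n : Nat // n < 2 ^ 128 }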
@kellerfuchs actually that's quite nice cause it gets you transparent bignums, a la Lisps. Sorta wish that was more common, but alas, you need transparent memory allocation to make it actually work.
Hmmmm...
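[For contrast, a sketch of how non-transparent Go's own bignums are; only the standard math/big package is assumed:]

package main

import (
	"fmt"
	"math/big"
)

func main() {
	// In a Lisp this is just (* 3 (expt 2 100)) and the runtime silently
	// promotes to a bignum; in Go every allocation and step is spelled out.
	x := new(big.Int).Lsh(big.NewInt(1), 100) // x = 2^100
	x.Mul(x, big.NewInt(3))                   // x *= 3, in place
	fmt.Println(x)
}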
@icefox So there is this weird tension between having transparent memory allocation and having opt-in, very explicit memory management.
@kellerfuchs That sounds really cool. I should check it out.