Kabuki Fox & Security Theater is a user on vulpine.club.

Man... Go could have been so good.

Instead it just takes all the problems that exist in programming and pretends they don't exist.

"How big is an array?" "An int big!" "...a signed int?" "Yeah sure why not." "What happens if your array is 2049 MB long on a 32-bit platform?" "*shrug*"

Srsly, there's 3 types of languages as far as I can tell: those that define a specific size for numerical types, those that let those types be any size, and Go+C.

'Cause if C does it, it has to be a good idea, right?

I mean... Correct me if this is wrong, I am sleepy. But I can't think of any other languages where your most common number type varies in size as the machine word size changes.

If you have programs that talk over a network it's just begging for trouble.

Kabuki Fox & Security Theater @kellerfuchs

@icefox F* does something interesting: integers are “mathematical” integers (so, arbitrary-precision arithmetic), and you can define more precise types, like type u128 = n: nat { 0 <= n < 2¹²⁸ }, and so on.

The compiler then maps those to the best arithmetic implementation, IIRC.

@kellerfuchs actually that's quite nice 'cause it gets you transparent bignums, à la Lisps. Sorta wish that was more common, but alas, you need transparent memory allocation to make it actually work.

Hmmmm...

@icefox Yeah. F* is in a bit of a strange situation there, because it's a pretty usual functional language (ignoring the ridiculously-powerful type system)... up until you discover there is a subset of the language (called Low*), where basically memory is explicit, and you interact with it through some (abstract) model.

Once you've proved correctness of what you are doing in the abstract model (i.e. no use-after-free, no out-of-bounds accesses, ...), the Low* compiler turns it into runtime-free C code.

@icefox So there is this weird tension between having transparent memory, and having opt-in very-explicit memory.

@kellerfuchs That sounds really cool. I should check it out.