💯% accurate. funny how the typescript developer thinks this is some kind of “gotcha!”… like maybe just try a language besides typescript and find out for yourself 😆
Exactly. Most languages I know of that allow this at all will coerce the “1” to an integer and give x = 2. They get away with this because they define the “+” operator as taking numbers only as arguments, so if you hand them x = x + "cheese" they’ll error out.
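For concreteness, here's roughly how that plays out in TypeScript itself, the language the thread is poking at: its + accepts a string operand and switches to concatenation, so the compiler only steps in when the result is assigned back to a number-typed variable. A minimal sketch (variable names are just for illustration):

let x: number = 1;

const y = x + "1";      // type string, value "11" at runtime, not 2
// x = x + "1";         // compile error: Type 'string' is not assignable to type 'number'
// x = x + "cheese";    // rejected the same way, before it ever runs

x = x + Number("1");    // an explicit conversion is what gets you x = 2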
Type error unless there's an implementation of + that specifies adding together an integer and a string.
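TypeScript has no user-defined operator overloading, but a small hypothetical add helper (not anything from the original posts) sketches what such an implementation could look like: overloads that explicitly accept an integer together with a numeric string.

function add(a: number, b: number): number;
function add(a: number, b: string): number;
function add(a: number, b: number | string): number {
  // the string overload parses instead of concatenating
  return a + (typeof b === "string" ? parseInt(b, 10) : b);
}

let x = add(1, "1");    // 2, the coercing behaviour described above
// add(1, true);        // compile error: no overload accepts a boolean
// add(1, "cheese");    // still type-checks, but yields NaN at runtime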
deleted by creator
“brought” 😏
my complaint is that typescript is stupid, yes. so why wouldn’t i compare to what other languages do that is less stupid?
on the plus side, at least now i know that the ad-hominem minded devs came here too, and brought their righteousness with them.
deleted by creator
OCaml 😍