Is it correct to assume that built-in JS objects are operating behind the scenes whenever you assign a primitive value to a variable? And, if these wrapper objects exist, is that what enables your code to identify a value as a certain data type and claim that it is an instance of a certain type?
For example, if my code has
"11" > 4, what exactly happened here? Did I forget to type-cast one of the values? This evaluates to true… but
"11" > "4" is false… (presumably lexicographic ordering by character code rather than numeric comparison, though I'm not sure of the exact rule). This kind of ambiguity is not good even if the interpreter can make a guess at what's happening. So in the code, where it isn't obvious — particularly to other people interacting with your code — you might add a comment making clear that both inputs to the function/method have to be of a certain type.