by RonPurewal Wed Jul 24, 2013 11:16 pm
expressions have to be defined uniquely. if x^(1/2) could stand for two different values, then the notation x^(1/2) would be pointless in the first place, because no one would ever be able to tell which value was meant.
the same thing is true of the "√" sign, which always represents the nonnegative square root of a number (positive whenever the number under the sign is positive).
just think about it for a sec. e.g., how long is the diagonal of a 1-by-1-foot square?
the answer is "√2" feet, of course (by the pythagorean theorem: √(1² + 1²) = √2). but, if "√" were allowed to take either a positive or a negative value, then there would simply be no way to write the desired length.
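the same convention shows up in code, by the way. here's a small python sketch (not part of the original discussion) showing that the standard-library square root follows exactly this rule, always returning the nonnegative root:

```python
import math

# math.sqrt, like the "√" sign, always returns the nonnegative root
root = math.sqrt(2)          # 1.4142..., never -1.4142...

# diagonal of a 1-by-1 square, via the pythagorean theorem
diagonal = math.hypot(1, 1)  # √(1² + 1²) = √2

print(root > 0)                        # the root is positive
print(math.isclose(root, diagonal))    # both expressions name the same positive number
```

because sqrt is defined to return one unique value, an expression like math.sqrt(2) always means the same thing, which is the whole point of the convention.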