Monday, February 25, 2019

Computer languages aren't languages

I posted this on a newsgroup:

A computer language isn't a language. Formatting code to make it easier for humans to use doesn't make it a language. Computer code at root describes switching sequences. That's all. Recall that the very first machines were coded by rerouting cables between switches.
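To make that point concrete, here is a minimal sketch (my own illustration, not part of the original post) using Python's standard dis module; the add function is just a throwaway example. Even a readable one-line function is only a notation for a fixed instruction sequence the machine steps through.

```python
# Even a short, human-readable function is shorthand for a fixed
# sequence of lower-level operations. The standard dis module prints
# the bytecode the interpreter actually executes.
import dis

def add(a, b):
    return a + b

# Prints instructions such as LOAD_FAST followed by an add opcode
# (exact instruction names vary by Python version).
dis.dis(add)
```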

Human languages don't code; they present what (for want of a better phrase) I call "intersecting ambiguities". That's why you can understand sentences that include words you've never encountered before (it's how we learned our native language in the first place). It's why we can shift word usage and be fairly confident that our hearers and readers will get at least enough of what we intend that they can ask good questions about what we mean.

The ambiguities and redundancies of human language are, paradoxically, also the reason that statistical and pattern analysis of samples has produced successful translation AIs, and AIs that write boilerplate news reports (e.g., for sports and business). The most recent language-writing AI can imitate your personal style well enough that a casual reader cannot detect the forgery. That will not end well.
