Computer languages aren't languages
Monday, February 25, 2019

I posted this on a newsgroup:
A computer language isn't a language. Formatting code to make it easier for humans to read doesn't make it a language. Computer code at root describes switching sequences. That's all. Recall that the very first machines were coded by rerouting cables between switches.

Human languages don't code; they present what (for want of a better phrase) I call "intersecting ambiguities". That's why you can understand
sentences that include words you've never encountered before (it's how
we learned our native language in the first place). It's why we can
shift word-usage, and be fairly confident that our hearers and readers
will get at least enough of what we intend that they can ask good
questions about what we mean.

The ambiguities etc. of human language are, paradoxically, also the reason that statistical and pattern analysis of samples has produced successful translation AIs, and AIs that write boilerplate news reports (e.g., for sports and business). The most recent language-writing AI can imitate
your personal style well enough that it's impossible for the casual
reader to detect a forgery. That will not end well.