Friday, March 28, 2025

A Poker Hand's A Clue (Eric Wright, The Last Hand, 2001)

Eric Wright. The Last Hand (2001) Charlie Salter is approaching retirement, and has been assigned office duties. An apparently simple murder case turns out not to be. Salter gets the case because one of the people close to the victim wants him to do it. He’s assigned Terry Smith, a brand-new constable recently arrived from Glasgow, to work with him. After a lot of palaver and fact-checking, we find out what we probably inferred around the quarter mark: it was a passion-driven murder. A very large pile of misleading information and surmise has to be cleared away, mostly because much of it, if true, would implicate a number of important legal people in corruption and scandal.

A good read, but not a great one. Salter goes off into the sunset of retirement happy that he’s played one last hand. A poker game figures in the solution by providing the clue that unravels the knot.

OK, that’s enough clichés. I enjoyed the book because I like the Salter series. The book could have stood a lot more story about Salter and Smith. **½

Tuesday, March 25, 2025

Nasturtium

September 2009. This was a test of the close-up capability of my then-new Canon SX-20 digital camera.

Sunday, March 09, 2025

The Library of Babel (The Universal Library) (long read)


Some thoughts on the Universal Library problem

The problem was fictionalised by Jorge Luis Borges in his story “The Library of Babel”. It may be stated thus: Can we specify a procedure for writing a Universal Library? A universal library contains all texts ever written and ever to be written, in all the languages that have ever and will ever be spoken, and many more that will never be spoken by anyone. The paradoxical answer to this question is yes, and several proofs exist that such a library is not only possible, but is of a finite size, albeit a very large one. One such procedure (adapted from one described by Martin Gardner) is the following:

Suppose a book of 100 pages, each of 100 lines of 100 characters. Each such book contains a total of 10^6 characters, including spaces. Using the Latin alphabet in upper and lower case (52 characters), 7 punctuation marks, the space, and 10 numerals brings the total to 70 characters. If each book contains exactly one permutation, the total number of books will be 70^(10^6), a very large number. It is so large that if every atom in the universe were a printing machine printing at the rate of one character per second, it would take many lifetimes of our universe to print all the books.
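A back-of-the-envelope check of these magnitudes, sketched in Python. The 10^80 atom count and the ~4.3×10^17-second age of the universe are rough standard estimates assumed here, not figures from the essay:

```python
import math

# Characters per book: 100 pages x 100 lines x 100 characters
CHARS_PER_BOOK = 100 * 100 * 100          # 10**6
ALPHABET = 70                             # 52 letters + 7 punctuation + space + 10 digits

# 70**(10**6) is far too large to print, so express it as a digit count.
log10_books = CHARS_PER_BOOK * math.log10(ALPHABET)
print(f"70^(10^6) is a number with about {int(log10_books) + 1:,} decimal digits")

# Sanity check on the printing-press image: ~10^80 atoms, each printing
# one character per second for the universe's ~4.3e17-second age.
log10_chars_printable = 80 + math.log10(4.3e17)
log10_chars_needed = log10_books + math.log10(CHARS_PER_BOOK)
print(f"characters printable: ~10^{log10_chars_printable:.0f}")
print(f"characters needed:    ~10^{log10_chars_needed:.0f}")
```

Working in log10 is the only practical option here: the book count itself has nearly two million digits, so "many lifetimes of the universe" turns out to be a severe understatement.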

Clearly, this is a very, very large library. Does it in fact contain all possible books?

Each book in the library is a specific combination of characters, 10^6 characters long. Given that any printed book is a combination of characters, that combination will occur at least once somewhere in the library. A book shorter than 10^6 characters will occur many times: a text of n characters can be padded out to a full book in 70^(10^6−n) ways, so it appears in at least that many books.
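That counting argument can be sketched in a few lines; `log10_occurrences` is my name for a hypothetical helper, not anything from the essay:

```python
import math

CHARS_PER_BOOK = 10**6   # 100 pages x 100 lines x 100 characters
ALPHABET = 70            # the essay's 70-character set

def log10_occurrences(n):
    """Lower bound (as log10) on how many library books contain a given
    n-character text: fix the text at the start of a book; the remaining
    10^6 - n positions can then be filled in 70^(10^6 - n) ways."""
    return (CHARS_PER_BOOK - n) * math.log10(ALPHABET)

# Even a 300,000-character novel opens an astronomical number of books.
print(f"~10^{log10_occurrences(300_000):,.0f} books begin with a 300,000-character novel")
```

The bound is loose on purpose: it only counts books where the text sits at the very start, and the text may also appear at other offsets.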

The same consideration applies to books not yet written, for each such book is a combination of characters. Books that will never be written by anyone will also occur in this library. And since all spoken languages can be represented by some scheme of matching characters to sounds, books written in all possible spoken languages will occur in this library.



This summary proof shows that all books ever written, ever to be written, and never to be written occur in this library, many of them more than once. Since every book can be printed with typographical errors, all possible combinations of typographical errors will also occur. In short, not only will all possible books occur, all possible variations on each book will occur. What’s more, a very large proportion of the books will be nonsense in any language, including languages not spoken on Earth (if there are such). That includes Klingon and any other fictional language.

This Universal Library is too large. It’s clear that “too large” means not only “utterly infeasible”, it also means “containing too much nonsense.” But mulling over the consequences of the procedure for constructing the library is a useful exercise in handling very large numbers, numbers that are unimaginably large. The Universal Library problem shows that we can conceive of entities that we cannot imagine, and that we can reason accurately about them.


Can the Library be made smaller? Yes, by using an encoding scheme that compresses the data. One such scheme might work as follows.

Suppose we use binary code. Then we use only 2 characters, and the size of the library will be 2^(10^6), still a very large number. Is it smaller than the library using 70 characters? Yes. The ratio is 2^(10^6) / 70^(10^6) = (2/70)^(10^6), a very small fraction.
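In log terms the comparison looks like this; the constants are the essay's own figures, and the log trick is needed only because neither number fits in any ordinary representation:

```python
import math

CHARS = 10**6

log10_binary = CHARS * math.log10(2)    # size of the binary library, 2^(10^6)
log10_full   = CHARS * math.log10(70)   # size of the 70-character library

# The fraction 2^(10^6) / 70^(10^6) = (2/70)^(10^6), computed as a
# difference of logarithms.
log10_fraction = log10_binary - log10_full

print(f"binary library:       ~10^{log10_binary:,.0f} books")
print(f"70-character library: ~10^{log10_full:,.0f} books")
print(f"their ratio:          ~10^{log10_fraction:,.0f}")
```

Both libraries are absurdly large, yet one is smaller than the other by a factor whose exponent itself has seven digits.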

That’s still enormous, though. Is it enormous enough to contain all possible books? Paradoxically, yes. Every character will be encoded in binary, and hence every combination of characters will occur as a combination of binary characters. What’s more, since binary code can be represented by some combination of alphabetic characters (e.g., a for 1, b for 0), this binary-coded Universal Library will be included in the alphabetic one, once for every encoding of the binary characters: (a, b), (one, zero), and their equivalents in every possible language, known and unknown. No wonder encoding the universal library using the alphabet is so inefficient.

Hence the supposedly larger set of books containing every possible combination of 70 characters will be contained in the smaller set of books containing every possible combination of only two characters. Thus, the library utilising 70 characters encodes its information very inefficiently. Can we improve that efficiency?


Suppose we limit ourselves to English books. Since any conceivable language should be translatable into English, surely we can reduce the size of the library? Yes, we can. We need only ensure the inclusion of every combination of characters that represents an English translation of a book written in some other language. But our multilingual library includes all translations of all books into every language; limiting ourselves to one language omits those multiple translations. If there are L possible languages, then there are L! translations of all books into all languages, so the one-language library’s size will be [70^(10^6)]/L!. This will be a fraction of the multilingual library. But it will still be enormous.

Nevertheless, we can estimate its size. Suppose there are 500,000 English words, and that the average English word is 10 characters long, including one space. Then each of our English books of 10^6 characters will contain about (10^6)/10 = 10^5 English words. The size of this library (in binary characters) will be about 2^(10^5) books. That is still very large: its exponent is only a tenth of the complete binary library’s (10^5 against 10^6), yet 2^(10^5) remains unimaginably large. Not much of a saving. What’s more, it will be this size regardless of the total number of languages.
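Under the same assumptions (the 500,000-word vocabulary and 10-character average word are the essay's round figures), we can also count word sequences rather than character sequences — a related estimate, sketched here rather than taken from the essay:

```python
import math

# The essay's round figures, assumed rather than measured:
ENGLISH_WORDS = 500_000     # vocabulary size
AVG_WORD_LEN = 10           # characters per word, space included
CHARS_PER_BOOK = 10**6

words_per_book = CHARS_PER_BOOK // AVG_WORD_LEN   # 10^5 word slots per book

# One of 500,000 word choices at each of the 10^5 slots, versus one of
# 70 character choices at each of the 10^6 positions.
log10_word_library = words_per_book * math.log10(ENGLISH_WORDS)
log10_char_library = CHARS_PER_BOOK * math.log10(70)

print(f"word-sequence library: ~10^{log10_word_library:,.0f} books")
print(f"70-character library:  ~10^{log10_char_library:,.0f} books")
```

Even counted this way the English-only library runs to roughly 10^570,000 books: a much smaller exponent, but no less unimaginable a number.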


