In about five minutes, the first decade of the 21st century will be drawing to a close in France. And festivities are under way.
I've just noticed a press article indicating that some 45,000 police and gendarmes have been deployed throughout France this evening to make sure that peace prevails.
Yes, but 1137 cars were burned in that uniquely French "sport" for the mindless.
And on a pedantic note, of course the first decade of the 21st century ends in just under a year's time!
When people throughout the world were celebrating the start of the year 2000, many experts pointed out, rightly or wrongly, that the 21st century would not come into existence before the start of the year 2001. My personal thinking has always been dominated by the culture of computer programming, in which we systematically start counting with zero. We think of the first ten decimal digits as 0, 1, 2, 3, 4, 5, 6, 7, 8 and 9. Similarly, the two binary digits (bits) are written as 0 and 1, not 1 and 2. And the sixteen hexadecimal symbols are 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, A, B, C, D, E and F. Likewise, the highest decimal value that can be stored in a byte (eight bits) is 255, not 256. So there is a distinct modern technical culture that pushes us to consider that the 21st century did indeed start on January 1, 2000. Besides, the notorious but nonexistent Y2K bug was supposed to go into action on January 1, 2000.

Historically, people use the BC dating system as if it referred to times before the symbolic birth of Christ. Let's imagine that Jesus grew in the womb of the Virgin Mary for a period of nine months. In that case, on what date did the Holy Spirit fecundate Mary? I would say that this miraculous event took place around the beginning of March of the year 1 BC. If Jesus was actually born on December 25 of that same year, then I would say that his first birthday took place on December 25 of the year zero. It's true that I can appreciate why many people might prefer to say that there was never a year zero. In other words, Jesus was growing in the womb during the year 1 BC, and he was a six-month-old baby during the year 1 AD. But, to people with a vaguely mathematical or computing outlook on numbers, that sounds like talk from an epoch preceding the invention of the concept of zero. Just as Nature is said to abhor a vacuum, computer programmers detest numbering systems that don't include a zero.
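To make that count-from-zero habit concrete, here is a tiny Python sketch (the choice of Python is mine, purely for illustration) showing the ten decimal digits, the sixteen hexadecimal symbols and the maximum value of an eight-bit byte, all starting from zero:

```python
# The first ten decimal digits, counted from zero
print(list(range(10)))                      # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]

# The sixteen hexadecimal symbols also start at 0
print([format(n, "X") for n in range(16)])  # ['0', '1', ..., '9', 'A', ..., 'F']

# The highest value an eight-bit byte can hold is 255, not 256
print(2**8 - 1)                             # 255
print(0xFF)                                 # 255, the same value written in hexadecimal
```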
ReplyDelete"According to the Gregorian calendar, the 1st century C.E. started on January 1, 1 and ended on December 31, 100. The 2nd century started at year 101, the third at 201, etc. The n-th century started/will start on the year 100×n - 99. A century will only include one year, the centennial year, that starts with the century's number (e.g. 1900 is the final year in the 19th century)."
Wikipedia
Yes, things are perfectly clear at this level. People with a mathematical and/or computing culture are aware that their style of counting (starting with zero) differs from that of the Church (starting with one). In this domain, as in many others, our decision to differ from the arbitrary traditions and dictates of the Church is conscious, deliberate and in harmony with the methods of number theory and computer programming.
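For readers who like to see the two conventions side by side, here is a small Python sketch (the function names are my own invention, just for illustration) computing the bounds of the nth century under the Gregorian rule quoted above and under the popular begin-with-zero habit:

```python
def gregorian_century_bounds(n):
    """Nth century under the quoted rule: years 100*n - 99 through 100*n."""
    return 100 * n - 99, 100 * n

def popular_century_bounds(n):
    """Nth century under the begin-with-zero habit: years 100*(n - 1) through 100*n - 1."""
    return 100 * (n - 1), 100 * n - 1

print(gregorian_century_bounds(21))  # (2001, 2100)
print(popular_century_bounds(21))    # (2000, 2099)
```

For n = 21, the first rule gives 2001 through 2100 and the second gives 2000 through 2099, which is precisely the disagreement being discussed here.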
Funnily enough, certain dictionaries and encyclopedias tend to make the distinction fuzzy by using the adjective "cultural" for the begin-with-zero approach (because it corresponds to popular culture, which considers, say, that the Roaring Twenties started on January 1, 1920), and the adjective "mathematical" for the begin-with-one approach (because of the trivial arithmetical formula presented by the Church).
[interesting but irrelevant observation] As I said, people who are accustomed to playing around with number systems (binary, decimal, hexadecimal, etc.) find it abominable that anybody should ask them to start counting at 1. Now, there's a minor detail that changes nothing whatsoever in this domain. The series of positive integers does indeed start with 1, simply because zero is definitely not a positive integer. In counting systems (such as the binary symbols of computing, or the symbols we use to designate the names of years), zero is a mere sign which does not differ intrinsically from any of the other familiar signs such as 1, 2, 3, etc. But when we talk about genuine mathematical entities such as the positive integers, zero becomes a quite different, exotic beast... for the simple reason that any number whatsoever can be divided by a positive integer, whereas no number can be divided by zero without stepping into a new ballpark that includes the concept of infinity.
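A throwaway Python sketch of that last remark (once again, the language is my own choice): division by any positive integer always yields an answer, whereas division by zero is flatly refused.

```python
# Dividing by any positive integer is always well defined...
for divisor in range(1, 6):
    print(42 / divisor)

# ...whereas dividing by zero is refused outright: Python raises an
# exception rather than venturing anywhere near the concept of infinity.
try:
    print(42 / 0)
except ZeroDivisionError as err:
    print("Refused:", err)   # Refused: division by zero
```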