Posts Tagged ‘time’

Class Time

Thursday, 3 December 2015

At a site whose content seems intended to entertain, I read of a teacher who is said to have challenged his or her students to explain time and to define time. The words explain and define are treated in the narrative as if referring to the same task, which suggests something about the sort of answer sought. None of the students succeeded in doing what the teacher asked.

While we might perhaps have different conceptions of time, the essential concept of time is not one that we assemble from and with other concepts. Time is fundamental in our experience. Thus, when we seek to define time, the best that we can do is to find synonyms that might seem to put us into loops. For example, The Shorter Oxford English Dictionary defines time with duration, and duration with time. But to define a term is to coördinate it with a concept; so either definition actually works just fine as a definition, on the assumption that we have a concept for the complementary term.

Definitions often involve conveying a concept by showing how to assemble it from and with other concepts; that is perhaps what one expects when asked to explain a concept or a word. But disassemblies that somehow never reached an end would never reach a concept. We must at some stage somehow point to a concept without further use of definition. In the case of time, we have reached a concept that we cannot disassemble; in the case of time, we have found a word for which we can find only either simple synonyms or assemblies in which its concept lurks undisintegrated, even if unrecognized.

Will This Time Be Any Different?

Tuesday, 11 June 2013

Time is measured by sequences of changes. Classic examples of sequences that have been used are beats of the heart, apparent positions of the sun and of the moon, and the solstices. In theory, any sequence might be used.

When one measure of time is gauged against another, it is ultimately a matter of counting how many changes of each sort occur. When the rate of some different changes seems to be constant relative to the changes by which time has been measured, and these different changes occur in greater number, then it may be decided to substitute them as the primary measure of time, and thus to have a finer-grained measure.

Speeds and frequencies are not normally recognized as conversion factors for measures of time against each other, but that is exactly what they are. In the case of speeds generally, the changes are quantifications (of physical position, perhaps). In the more specific case of frequencies, one has a sequence of changes that are attainment of — or departure from — what is regarded as a recurring state; a count of these attainments or departures might be quantified as a pure number.

To say that some changes take place at a constant rate over time is simply to say that there is an invariant correspondence between the number of the changes by which we measure time and the number of the changes whose rate is being measured.

Such constancy occurs trivially when the changes that are used to measure time are measured against time — they're just being measured against themselves. If time is measured by appearances of the sun above the horizon, then the conversion factor for the frequency with which the sun appears above the horizon is necessarily 1. If we use these appearances to measure the frequency with which the heart beats, that frequency will almost certainly be inconstant; but if we use heart-beats as our measure of time, then their frequency will necessarily be 1 (and the frequency with which the sun appears will almost certainly be inconstant).
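The arithmetic here can be made concrete with a small sketch. The numbers below are invented, and the function name is mine, not anything from the posts; the point is only that a frequency is a ratio of two counts of changes, and that any measure divided by itself is necessarily 1.

```python
# Illustrative sketch: a "measure of time" is a count of reference changes,
# and a frequency is a conversion factor between two such counts.

def frequency(event_count, reference_count):
    """Events per unit of time, where the unit is one reference change."""
    return event_count / reference_count

# Suppose that, over some interval, we observe 7 appearances of the sun
# above the horizon and 700_000 heart-beats (hypothetical numbers).
sunrises = 7
heartbeats = 700_000

# Measuring time by sunrises, the heart beats 100_000 times per "day":
print(frequency(heartbeats, sunrises))   # 100000.0

# But any measure's frequency against itself is necessarily 1:
print(frequency(sunrises, sunrises))     # 1.0
print(frequency(heartbeats, heartbeats)) # 1.0
```

Whether the heart-beat frequency above counts as "constant" depends entirely on which of the two counts we have elected to treat as the clock.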

If someone should ask about using the average rate at which the heart beats as the measure of time, then he or she is implicitly presuming something other than heart-beats in measuring them, or their average frequency will necessarily be 1. Any comfort in imagining that the reference changes are being averaged is a comfort in confusion.

Someone else will insist that it is obvious that the rate at which any reference-heart beats is objectively inconstant, but it is no such thing. Rather, some measures are so inconvenient for the formulation of descriptions of the world around us, and others so convenient, that it seems as if some are closer to objective constancy than are others.

If the measure of time were in terms of Scott's heart-beats, then his mood would figure into descriptions of the universe (the rest of the world would be slower when Scott were excited), and we'd need his pulse whenever we timed things (in particular, all other changes would happen infinitely fast after Scott died). These characteristics make Scott's heart-beats grossly inconvenient for everyone excepting, perhaps, Scott.

What one wants of measures of time are accessibility, for them to result in manageable descriptions of the rest of the world, and for them not to deviate intolerably from our subjective experiences of time. There is a certain amount of rivalry at least between the first two desiderata. Those measurements that are most easily made don't correspond to the simplest descriptions of the rest of the world. But some measurements of time result in many fairly simple descriptions (some of those descriptions are even of other rates as constants) that appear perfectly accurate. It is the latter sort of measurement that is most likely to be taken as objective; but if the measurements that supported the simplest statements of physical laws made a very poor fit for subjective experiences of time, then these measurements would not widely be accepted (even by scientists) as measurements of time at all, objective or otherwise!


We're often told that the speed of light (in vacuo) is constant. Many people wonder why this constancy is so; others whether it is so. But what is actually meant by the assertion itself that the speed of light be constant is that the changes in the position of light maintain a fixed ratio with the changes by which we have chosen to measure time. In effect, those (such as Einstein) who said that this speed were constant were declaring that we ought to measure time in a manner that made the speed of light constant. And the reason that we ought to measure it thus was because accurate descriptions of the behavior of things of interest would be as simple as possible (or, at the least, as simple as possible without resulting in a measure unrecognizable as time).

The real question is not of why or whether the speed of light is constant; the real question is of why treating it thus simplifies accurate descriptions. And the basic answer is because light and stuff very much like it do a lot. One whoozit affects another by way of that stuff.

Of course, that basic answer shouldn't satisfy anyone with more than passing curiosity. My point is just that the idea that the speed of light is constant doesn't represent a mystery of the sort that many people take it to be.


This entry was primarily motivated by my desire to get the point about the constancy of the speed of light off my chest, but the more general part of it actually has application to questions of method in economics.

When I was in the graduate programme at UCSD, there was a student who wanted to do some sort of econometric work where the changes by which time were measured would be transactions, rather than ticks of an ordinary clock — he called this alternate measure market time. (I don't know whether he arrived at the concept or at this term on his own, but he didn't cite a prior source.) Sadly, his initial presentation was a disaster; when he attempted to explain this idea to the professors who were to judge his work, he came across to them and to most of the rest of the audience as incoherent. I might well have been the only other person in the room who really understood what he was trying to say. (I had for some while been thinking skeptically about the propensity of economists to use the physicists' t.)
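The idea of market time can be sketched in a few lines. Everything here — the trade data, the variable names — is invented for illustration; the only point carried over from the story is that the transaction count itself, rather than the ordinary clock, becomes the measure by which time is counted.

```python
# Hypothetical sketch of "market time": re-index a sequence of trades by
# transaction count instead of by clock time.

trades = [  # (clock_time_in_seconds, price) — invented data
    (0.0, 100.0),
    (0.4, 100.1),
    (0.5, 100.3),
    (7.9, 100.2),   # a quiet stretch: much clock time, few transactions
    (8.0, 100.4),
]

# In market time, each transaction is one tick of the clock; the quiet
# stretch between the third and fourth trades simply vanishes.
market_time_series = [(tick, price) for tick, (_, price) in enumerate(trades)]

print(market_time_series)
# [(0, 100.0), (1, 100.1), (2, 100.3), (3, 100.2), (4, 100.4)]
```

Under this measure, the rate of transactions is constant by construction (one per tick), just as heart-beats have frequency 1 when heart-beats are the clock; what becomes inconstant instead is the apparent speed of the ordinary clock.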

Afterwards, I sat down with him and tried to explain to him what he had to communicate; but he seemed not to listen to me as he insisted that the response of the professors were unfair. And, the next time that he gave a presentation on the idea, he essentially repeated his previous performance.

It is at least plausible to me that a major part of the reason that he could not communicate what he was proposing to do was that he had only a vague intuïtion about the nature of measures of time and about distinctions amongst them.

I would note that the particular measure of time that he suggested is certainly not the one for economists to use in all or even most cases, and that it has never been one that had distinctively useful application to a problem that I've investigated.

Randomness and Time

Sunday, 20 February 2011

When someone uses the word random, part of me immediately wants a definition.[1]

One notion of randomness is essentially that of lawlessness. For example, I was recently slogging through a book that rejects the proposition that quantum-level events are determined by hidden variables, and insists that the universe is instead irreducibly random. The problem that I have with such a claim is that it seems incoherent.

There is no being without being something; the idea of existence is no more or less than that of properties in the extreme abstract. And a property is no more or less than a law of behavior.

Our ordinary discourse does not distinguish between claims about a thing and claims about the idea of a thing. Thus, we can seem to talk about unicorns when we are really talking about the idea of unicorns. When we say that unicorns do not exist, we are really talking about the idea of unicorns, which is how unicorns can be this-or-that without unicorns really being anything.

When it is claimed that a behavior is random in the sense of being without law, it seems to me that the behavior and the idea of the behavior have been confused; that, supposedly, there's no property in some dimension, yet it's going to express itself in that dimension.

Another idea of randomness is one of complexity, especially of hopeless complexity. In this case, there's no denial of underlying lawfulness; there's just a throwing-up of the hands at the difficulty in finding a law or in applying a law once found.

This complexity notion makes awfully good sense to me, but it's not quite the notion that I want to present here. What unites the notion of lawlessness with that of complexity is that of practical unpredictability. But I think that we can usefully look at things from a different perspective.


After the recognition that space could be usefully conceptualized within a framework of three orthogonal, arithmetic dimensions, there came a recognition that time could be considered as a fourth arithmetic dimension, orthogonal to the other three. But, as an analogy was sensed amongst these four dimensions, a puzzle presented itself. That puzzle is the arrow of time. If time were just like the other dimensions, why can we not reverse ourselves along that dimension just as along the other three? I don't propose to offer a solution to that puzzle, but I propose to take a critical look at a class of ostensible solutions, reject them, and then pull something from the ashes.

Some authors propose to find the arrow of time in disorder; as they would have it, for a system to move into the future is no more or less than for it to become more disorderly.

One of the implications of this proposition is that time would be macroscopic; in sufficiently small systems, there is neither increase nor decrease in order, so time would be said neither to move forward nor backward. And, as some of these authors note, because the propensity of macroscopic systems to become more disorderly is statistical, rather than specifically absolute, it would be possible for time to be reversed, if a macroscopic system happened to become more orderly.

But I immediately want to ask what it would even mean to be reversed here. Reversal is always relative. The universe cannot be pointed in a different direction, unless by universe one means something other than everything. Perhaps we could have a local system become more orderly, and thus be reversed in time relative to some other, except, then, that the local system doesn't seem to be closed. And, since the propensity to disorder is statistical, it's possible for it to be reversed for the universe as a whole, even if the odds are not only against that but astronomically against it. What are we to make of a distinction between a universe flying into reverse and a universe just coming to an end? And what are we to make of a universe in which over-all order increases for some time less than the universe has already existed? Couldn't this be, and yet how could it be if the arrow of time were a consequence of disorder?

But I also have a big problem with notions of disorder. In fact, this heads us back in the direction of notions of randomness.

If I take a deck of cards that has been shuffled, hand it to someone, and ask him or her to put it in order, there are multiple ways that he or she might do so. Numbers could be ascending or descending within suits, suits could be separated or interleaved, &c. There are as many possible orderings as there are possible rules for ordering, and for any sequence, there is some rule to fit it. In a very important sense, the cards are always ordered. To describe anything is to fit a rule to it, to find an order for it. That someone whom I asked to put the cards in order would be perfectly correct to just hand them right back to me, unless I'd specified some order other than that in which they already were.
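The claim that for any sequence there is some rule to fit it can be demonstrated directly. The hand of cards below is invented; the trick is that, given any arrangement whatever, we can construct the very rule under which that arrangement counts as sorted.

```python
# Sketch: for any sequence, there is a rule under which it is ordered.
# Given an arbitrary arrangement, build the ranking that declares exactly
# that arrangement to be "sorted".

shuffled = ['7H', 'AS', '3D', 'KC', '9S']  # an arbitrary hand (hypothetical)

# The rule fitted to the sequence: each card's rank is its current position.
rule = {card: position for position, card in enumerate(shuffled)}

# Under this rule, the "shuffled" hand is already perfectly ordered:
print(sorted(shuffled, key=rule.get) == shuffled)  # True
```

In this sense the cards are never out of order; they are only out of *some particular* order that we happen to have specified in advance.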

Time's arrow is not found in real disorder generally, because there is always order. One could focus on specific classes of order, but, for reasons noted earlier, I don't see the explanation of time in, say, thermodynamic entropy.


But, return to decks of cards. I could present two decks of cards, with the individual cards still seeming to be in mint state, with one deck ordered familiarly and with the other in unfamiliar order. Most people would classify the deck in familiar order as ordered and the other as random; and most people would think the ordered deck more likely to have come straight from the pack than the random deck. Unfamiliar orderings of some things are often the same thing as complex orderings, but the familiar orderings of decks of cards are actually conventional. It's only if we use a mapping from a familiar ordering to an unfamiliar ordering that the unfamiliar ordering seems complex. Yet even people who know this are going to think of the deck in less familiar order as likely having gone through something more than the deck with more familiar order. Perhaps it is less fundamentally complexity than experience of the evolution of orderings that causes us to see the unfamiliar orderings as random. (Note that, in fact, many people insist that unfamiliar things are complicated even when they're quite simple, or that familiar things are simple even when they're quite complex.)

Even if we do not explain the arrow of time with disorder, we associate randomness with the effects of physical processes, which processes take time. Perhaps we could invert the explanation. Perhaps we could operationalize our conception of randomness in terms of what we expect from a class of processes (specifically, those not guided by intelligence) over time.

(Someone might now object that I'm begging the question of the arrow of time, but I didn't propose to explain it, and my readers all have the experience of that arrow; it's not a rabbit pulled from a hat.)


[1] Other words that cause the same reäction are probability and capitalism.