An old clock
Image credit: Illymarry [CC BY-SA 4.0], via Wikimedia Commons

The whole notion of time fascinates me endlessly—speaking metaphorically, of course. Numerous articles here at Interesting Thing of the Day have involved time or timekeeping in one form or another. In one of these articles, about analog clocks, I made what I thought was a commonsense and uncontroversial remark:

…time itself is continuous, not an infinite series of discrete steps…. Units like seconds, minutes, and hours are just a convenient, arbitrary fiction, after all—they don’t represent anything objectively real in the world.

A reader wrote in to suggest that I wasn’t up to date on my quantum physics, according to some theories of which time is indeed quantized, or fundamentally composed of very tiny but indivisible units.

At first, I had a hard time getting my head around this notion, and after considerable research…I still have a hard time getting my head around this notion. Although I try to keep generally abreast of the latest developments in the world of science, I can’t claim to do anything more than dabble in theoretical physics, and complex equations simply make my eyes glaze over. Nevertheless, not only do many scientists take the notion of quantized time for granted, there was also a fairly major uproar in the early 2000s when a young upstart from New Zealand published a paper that dared to challenge this notion with a theory that says, in effect, that there’s no such thing as an indivisible moment in time.

Second Thoughts

To understand what it would mean for time to be quantized, think of a unit of time, such as a second. You can divide that in half, getting two shorter periods of a half-second each. You can go much smaller, too, dividing a second into a thousand parts called milliseconds, a million parts called microseconds, a billion parts called nanoseconds, a trillion parts called picoseconds, and so on. A trillionth of a second is, to me, such an unimaginably short period of time that I’d be happy to consider it, for all practical purposes, indivisible—an “atom” of time, as it were. But that’s nothing. Written out in full, a trillionth of a second is a decimal point, 11 zeroes, and a 1. Some scientists say that meaningful distinctions in time can be made down to about 10⁻⁴⁴ second, or 43 zeroes after the decimal point before you reach that 1. But the question is: how low can you go? Is there some point, some number of zeroes, beyond which time cannot be divided any further?
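To make the zero-counting concrete, here’s a quick Python sketch. The unit names and exponents are the standard SI prefixes; the final entry is the Planck-scale figure of roughly 10⁻⁴⁴ second mentioned above.

```python
# Writing out successively smaller subdivisions of a second in full
# decimal notation, to count the zeroes.
units = [
    ("millisecond", 3),
    ("microsecond", 6),
    ("nanosecond", 9),
    ("picosecond", 12),
    ("Planck-scale tick", 44),  # roughly where 10^-44 s sits
]

for name, exponent in units:
    # 10**-exponent, written out in full: a decimal point,
    # (exponent - 1) zeroes, and then a 1.
    written_out = "0." + "0" * (exponent - 1) + "1"
    print(f"1 {name} = 10^-{exponent} s = {written_out} s")
```

For the picosecond line, for instance, this prints a decimal point followed by 11 zeroes and a 1.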

One of the fundamental notions of calculus, and of physics, is that one can determine a moving object’s exact position at some instant in time. That there should be such a thing as an “instant” is taken as a given. An instant, by definition, has no duration; if it did, a moving object would change its position between the start of that instant and its end—in other words, its position couldn’t be known precisely. And yet seemingly it can, or at least that operational assumption has served calculus well all these centuries. But is the notion of an instant merely a convenient fiction, or does it in some sense represent reality?
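The tension is easy to see numerically. Every velocity we can actually compute is an average over some finite interval; the “instantaneous” velocity of calculus is only the limit those averages approach as the interval shrinks toward zero. A minimal Python sketch, using free fall from rest (x(t) = ½gt², ignoring air resistance) as an assumed example:

```python
# Average velocity of a falling object over shrinking intervals,
# approaching the "instantaneous" velocity calculus assigns to t = 1 s.
g = 9.8  # m/s^2, gravitational acceleration (assumed constant)

def position(t):
    # x(t) = (1/2) * g * t^2 for free fall from rest
    return 0.5 * g * t * t

t = 1.0
for dt in (0.1, 0.01, 0.001, 0.000001):
    # Every computable velocity spans an interval dt, never an instant.
    avg = (position(t + dt) - position(t)) / dt
    print(f"dt = {dt:g} s -> average velocity = {avg:.6f} m/s")

# The calculus answer, x'(t) = g * t, is the limit as dt -> 0:
print(f"limit (instantaneous) velocity = {g * t:.6f} m/s")
```

Each printed average is a perfectly real measurement over an interval; the “instantaneous” value at the end is the idealized limit that the debate below is about.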

Among scientists studying quantum theory, and particularly among those working on the quixotic task of unifying general relativity with quantum physics, the question of whether time is truly continuous is of particular interest. Some scientists say that, as far as general relativity goes, time is continuous, but that a theory of quantum gravity might require us to accept that time can be treated as a succession of temporal quanta (or chronons), in much the same way that light can be treated as either a wave or a particle. Others say that time is not merely a fourth dimension but is itself three-dimensional: from our point of view time is continuous, but from a point of view that encompasses time’s other dimensions, it’s quantized.

But all kinds of mysterious things happen in the quantum realm. What about the macro world we’re all familiar with?

Time for a Kiwi

In 2003, a then-27-year-old student from New Zealand named Peter Lynds published a paper in the peer-reviewed journal Foundations of Physics Letters that caused a great deal of controversy. Lynds claimed, essentially, that the whole notion of an instant is flawed, because if there were such a thing, a moving object measured and observed at that instant would appear to be static, and thus indistinguishable from a genuinely static object measured at that same instant. Since the two measurements clearly represent objects with different states, Lynds argued, it must be the case that there really aren’t any instants, only intervals (though those intervals might be very tiny). If true, this means that a moving object’s position can only ever be approximated—whether at the macro level or at the quantum level. And for this very reason, most of Zeno’s paradoxes turn out not to be paradoxical after all. Lynds went on to claim that time doesn’t flow because flow presumes an ongoing series of instants, that there is no “now” as such, and that our perception of time is just an odd consequence of the way our brains are wired.

The term “snapshot” is frequently used to describe the instant of time at which an object’s position might be determined, but I think it actually helps to make Lynds’s point. If you’re taking a picture of something that’s moving, you need a fast shutter speed to “freeze” the action, and the faster your subject is moving, the faster the shutter speed has to be. But if you set your shutter to, say, 1/4000 of a second and the photograph shows an arrow in mid-flight, with no blurring to suggest motion, that still doesn’t mean the arrow didn’t cover any distance during that tiny portion of a second the shutter was open. Of course it did. It’s just that the distance was sufficiently small, given the resolution of the camera and the human eye, to create the illusion of being frozen. So even if your hypothetical “shutter speed” is a zillionth of a second long, so that your measurement appears to give an exact, fixed location, that, too, is merely an illusion. The object in fact occupies more than one position during that time. Nothing mysterious about that at all.
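To put hypothetical numbers on that: assume an arrow traveling at about 60 m/s (a plausible but made-up figure) and the 1/4000-second shutter from the example above.

```python
# Distance an arrow covers while a fast shutter is open.
arrow_speed = 60.0        # m/s, hypothetical arrow speed
shutter_time = 1 / 4000   # s, the shutter speed from the example

distance_m = arrow_speed * shutter_time
print(f"Distance covered during exposure: {distance_m * 1000:.0f} mm")
# 60 m/s * (1/4000) s = 0.015 m, i.e. 15 mm: the arrow genuinely
# moves during the exposure; the photo merely fails to resolve it.
```

However short you make the exposure, the distance covered is small but never zero, which is exactly the point about “instants.”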

Instant Controversy

When I heard Lynds’s idea, I thought it made perfectly good sense, and what I couldn’t comprehend was how scientists claimed, with considerable fervor, that they either couldn’t understand it or thought it was wrong-headed. I confess that I have not followed the debate about Lynds’s paper very closely in the years since its publication, and that I can understand only part of what I’ve read. However, it seems to me that many criticisms rest on one or both of two points. First, critics note that Lynds was uncredentialed—he had only six months of university study at the time—so who was he to gainsay PhDs with years of experience? And second, if he were correct, that would mean that calculus as we know it must be essentially wrong, or at least incomplete. And we all know it’s right. Right?

As to the matter of Lynds’s lack, at the time, of an advanced degree, all I have to say is: if he’s correct, that doesn’t matter, and those who say otherwise take themselves, and their formal education, way too seriously. As for the supposed assault on calculus, well, Lynds implies that calculus is not exactly wrong so much as very slightly inaccurate. Calculus as it stands appears to be right, but then, so did Newton’s laws of physics, and those aren’t always right: Newtonian physics breaks down both at the quantum level and when objects approach the speed of light. It seems to me—and again, I’m speaking as a nonmathematician here—that the very same thing could be true in this case: calculus can be right at one level, and the absence of quantized time can be right at another.

Of course, those are not the only criticisms, and the debate between Lynds’s supporters and detractors has gone through so many rounds of rebuttals and rejoinders that I can no longer keep track of who thinks what. But on the whole, the debate has made me feel even more secure in my personal, nonscientific belief that time is continuous, and I’m not going to doubt that for one instant.

Note: This is an updated version of an article that originally appeared on Interesting Thing of the Day on July 21, 2006.