Updated 2014 by Don Koks. Original by Steve Carlip (1997) and Philip Gibbs (1996).

The short answer is that it depends on who is doing the measuring: the speed of light is only guaranteed to have a value of 299,792,458 m/s in a vacuum when measured by someone situated right next to it. But let's approach the question by considering its various meanings.

First, yes: light is slowed down in transparent media such as air, water, and glass. The ratio by which it is slowed is called the refractive index of the medium, and is *usually* greater than one.^{*} This was discovered by Léon Foucault in 1850.

When people talk about "the speed of light" in a general context, they usually mean the speed of light in a
vacuum. They also usually mean the speed as measured in an inertial frame. This vacuum-inertial
speed is denoted *c*.

At the 1983 *Conférence Générale des Poids et Mesures*, the following SI (Système International)
definition of the metre was adopted:

The metre is the length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second.

This defines the speed of light in vacuum to be *exactly* 299,792,458 m/s. Unfortunately it
doesn't mention anything about inertial frames, but you can consider a measurement in an inertial frame to be
implied.
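
The definition effectively turns every distance measurement into a time measurement; a minimal sketch of the arithmetic (the flight times below are illustrative, not real data):

```python
# The SI definition fixes c exactly, so a length is just a flight time
# multiplied by this exact constant.
C = 299_792_458  # m/s, exact by definition since 1983

def distance_from_flight_time(seconds: float) -> float:
    """Distance (in metres) covered by light in vacuum in the given time."""
    return C * seconds

# By construction, light travelling for 1/299,792,458 s covers one metre:
print(distance_from_flight_time(1 / 299_792_458))   # ~1 metre

# A 10-nanosecond one-way flight time corresponds to about 3 metres:
print(distance_from_flight_time(10e-9))             # ~3 metres
```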

However, this is not the end of the matter. The SI is based on very practical considerations.
Definitions are adopted according to the most accurately known measurement techniques of the day, and are
constantly revised. At the moment you can measure macroscopic distances most accurately by sending out
laser light pulses and timing how long they take to travel using a very accurate atomic clock. (The best
atomic clocks are accurate to about one part in 10^{13}.) It therefore makes sense to define the
metre unit in such a way as to minimise errors in such a measurement.

The SI definition makes certain assumptions about the laws of physics. For example, it assumes that
the particle of light, the photon, is massless. If the photon had a small rest mass, the SI definition
of the metre would become meaningless because the speed of light would change as a function of its
wavelength. The SI Committee could not just define it to be constant; instead, they would have to fix
the definition of the metre by stating which colour of light was being used. Experiments have shown that
the mass of the photon must be very small if it is not zero (see the FAQ entry
What is the mass of the photon?). Any such
possible photon rest mass is certainly too small to have any practical significance for the definition of the
metre in the foreseeable future, but it cannot be shown to be exactly zero—even though currently
accepted theories indicate that it is. If the mass weren't zero, the speed of light would not be
constant; but from a theoretical point of view we would then take *c* to be the upper limit of the
speed of light in vacuum so that we can continue to ask whether *c* is constant.

The SI definition also assumes that measurements taken in different inertial frames will give the same results for light's speed. This is actually a postulate of special relativity, discussed below.

Previously, the metre and second had been defined in various different ways according to the measurement techniques of the time, and they could change again in the future. If we look back to 1939, the second was defined as 1/86,400 of a mean solar day, and the metre as the distance between two scratches on a bar of platinum-iridium alloy held in France. We now know that there are variations in the length of a mean solar day as measured by atomic clocks. Standard time is adjusted by adding or subtracting a leap second from time to time. There is also an overall slowing down of Earth's rotation by about 1/100,000 of a second per year due to tidal forces between Earth, Sun, and Moon. There may have been even larger variations in the length of the metre standard caused by metal shrinkage. The net result is that the value of the speed of light as measured in m/s was slowly changing at that time. Obviously it would be more natural to attribute those changes to variations in the units of measurement than to changes in the speed of light itself, but by the same token it's nonsense to say that the speed of light is now constant just because the SI definitions of units define its numerical value to be constant.

But the SI definition highlights the point that we need first to be very clear about what we mean by
constancy of the speed of light, before we answer our question. We have to state what we are going to
use as our standard ruler and our standard clock when we measure *c*. In principle, we could get
a very different answer using measurements based on laboratory experiments, from the one we get using
astronomical observations. (One of the first measurements of the speed of light was derived from
observed changes in the timing of the eclipses of Jupiter's moons by Olaus Roemer in 1676.) We could,
for example, take the definitions of the units as they stood between 1967 and 1983. Then, the metre was
defined as 1,650,763.73 wavelengths of the reddish-orange light from a krypton-86 source, and the second was
defined (then as now) as 9,192,631,770 periods of the radiation corresponding to the transition between the
two hyperfine levels of caesium-133. Unlike the previous definitions, these depend on absolute physical
quantities which apply everywhere and at any time. Can we tell if the speed of light is constant in
those units?
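
We can at least write those standards down numerically; this sketch derives the wavelength and frequency that the two definitions imply (values rounded for display):

```python
# The pre-1983 metre: 1,650,763.73 wavelengths of a krypton-86 emission line.
KR86_WAVELENGTHS_PER_METRE = 1_650_763.73
kr_wavelength_nm = 1e9 / KR86_WAVELENGTHS_PER_METRE
print(f"Kr-86 line: {kr_wavelength_nm:.2f} nm")          # 605.78 nm, reddish-orange

# The second (then as now): 9,192,631,770 periods of the caesium-133
# hyperfine transition, i.e. a microwave frequency of about 9.19 GHz.
CS133_PERIODS_PER_SECOND = 9_192_631_770
cs_frequency_ghz = CS133_PERIODS_PER_SECOND / 1e9
print(f"Cs-133 transition: {cs_frequency_ghz:.3f} GHz")  # 9.193 GHz
```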

The quantum theory of atoms tells us that these frequencies and wavelengths depend chiefly on the values of Planck's constant, the electronic charge, and the masses of the electron and nucleons, as well as on the speed of light. By eliminating the dimensions of units from the parameters we can derive a few dimensionless quantities, such as the fine-structure constant and the electron-to-proton mass ratio. These values are independent of the definition of the units, so it makes much more sense to ask whether these values change. If they did change, it would not just be the speed of light which was affected. All of chemistry depends on their values, and significant changes would alter the chemical and mechanical properties of all substances. Furthermore, the speed of light itself would change by different amounts according to which definition of units was used. In that case, it would make more sense to attribute the changes to variations in the charge on the electron or the particle masses than to changes in the speed of light.
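
Those dimensionless combinations are easy to evaluate; this sketch plugs in approximate CODATA values (the numerical constants are assumptions of the example, not quoted in the text):

```python
import math

# Approximate CODATA values in SI units (illustrative precision only).
e    = 1.602176634e-19   # elementary charge, C
h    = 6.62607015e-34    # Planck's constant, J s
c    = 299_792_458       # speed of light, m/s (exact)
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

hbar = h / (2 * math.pi)

# Fine-structure constant: dimensionless, so independent of unit choices.
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha     = {alpha:.8f}")      # ~0.00729735, i.e. ~1/137.036
print(f"1/alpha   = {1 / alpha:.3f}")

# Electron-to-proton mass ratio: another unit-free number.
m_e = 9.1093837015e-31   # kg
m_p = 1.67262192369e-27  # kg
print(f"m_e / m_p = {m_e / m_p:.8f}")  # ~1/1836
```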

In any case, there is good observational evidence to indicate that those parameters have not changed over most of the lifetime of the universe. See the FAQ article Have physical constants changed with time?

(Note that the fine-structure constant does change with energy scale, but I am referring to the constancy of its low-energy limit.)

Another assumption about the laws of physics made by the SI definition of the metre is that the theory of relativity is correct. It is a basic postulate of the theory of relativity that the speed of light is the same in all inertial frames. This can be broken down into two parts:

- The speed of light is independent of the motion of the observer.
- The speed of light does not vary with time or place.

To state that the speed of light is independent of the velocity of the observer is very counterintuitive. Some people even refuse to accept this as a logically consistent possibility, but in 1905 Einstein was able to show that it is perfectly consistent if you are prepared to give up assumptions about the absolute nature of space and time.

In the late nineteenth century it was thought that light must propagate through a medium in space, the
*ether*, just as sound propagates through the air and other substances. Michelson and Morley set up an
experiment to attempt to detect the ether, by observing relative changes in the speed of light as Earth
changed its direction of travel relative to the Sun during the year. To their surprise, they failed to
detect any change in the speed of light.

Fitzgerald then suggested that this might be because the experimental apparatus contracted as it passed through the ether, in such a way as to foil the attempt to detect the change in velocity. Lorentz extended this idea to changes in the rates of clocks to ensure complete undetectability of the ether. Einstein then argued that those transformations should be understood as changes of space and time rather than of physical objects, and that the absoluteness of space and time introduced by Newton should be discarded. Just after that, the mathematician Minkowski showed that Einstein's theory of relativity could be understood in terms of a four-dimensional non-Euclidean geometry that treated space and time as one entity, ever after called spacetime.

The theory is not only mathematically consistent, it agrees with many direct experiments. The
Michelson-Morley experiment was repeated with greater accuracy in the years that followed. In 1925
Dayton Miller announced that he *had* detected a change in the speed of light and was even awarded
prizes for the discovery, but a 1950s appraisal of his work indicated that the most likely origin of his
results lay with diurnal and seasonal variations in the temperature of his equipment.

Modern instruments could easily detect any ether drift if it existed. Earth moves around the Sun at a
speed of about 30 km/s, so if velocities added vectorially as newtonian mechanics requires, the last 5 digits
in the value of the speed of light now used in the SI definition of the metre would be meaningless.
Today, high energy physicists at CERN in Geneva and Fermilab in Chicago routinely accelerate particles to
within a whisker of the speed of light. Any dependence of the speed of light on inertial reference
frames would have shown up long ago, unless it is very slight indeed. Their measurements are actually
made in a *non-inertial* frame because gravity is present. But in the context of the
measurements, this non-inertial frame is almost identical to a "uniformly accelerated frame" (this is actually
the content of Einstein's Principle of Equivalence). And it turns out that a measurement of light's
speed made in a uniformly accelerated frame directly by someone who is very close to the light will return the
inertial value of *c*—although that observer *must* be close to the light to
measure this value.
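
The "last 5 digits" remark a few lines up is simple arithmetic, sketched here:

```python
C = 299_792_458   # m/s, the SI value
V_ORBIT = 30_000  # m/s, Earth's orbital speed around the Sun (approximate)

# If velocities added vectorially as newtonian mechanics requires, the
# measured speed of light would swing over the year between roughly:
lo, hi = C - V_ORBIT, C + V_ORBIT
print(lo, hi)  # 299762458 299822458

# That is a swing of about one part in 1e4, so the final digits of the
# SI value would carry no stable meaning:
print(f"fractional swing: {V_ORBIT / C:.1e}")  # 1.0e-04
```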

But what if we pursued the original theory of Fitzgerald and Lorentz, who proposed that the ether is there, but is undetectable because of physical changes in the lengths of material objects and the rates of clocks, rather than changes in space and time? For such a theory to be consistent with observation, the ether would need to be completely undetectable using clocks and rulers. Everything, including the observer, would have to contract and slow down by just the right amount. Such a theory could make exactly the same prediction in all experiments as the theory of relativity; but it would reduce the ether to essentially no more than a metaphysical construct unless there was some other way of detecting it—which no one has found. In the view of Einstein, such a construct would be an unnecessary complication, to be best eliminated from the theory.

That the speed of light depends on position when measured by a *non*-inertial observer is a fact
routinely used by laser gyroscopes that form the core of some inertial navigation systems. These
gyroscopes send light around a closed loop, and if the loop rotates, an observer riding on the loop will
measure light to travel more slowly when it traverses the loop in one direction than when it traverses the
loop in the opposite direction. The gyroscope *does* employ such an observer: it is the
electronics that sits within the gyro. This electronic observer detects the difference in those light
speeds, and attributes that difference to the gyro's not being inertial: it is accelerating within some
inertial frame. That measurement of an acceleration allows the body's orientation to be calculated,
which keeps it on track and in the right position as it flies.
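
The size of the effect such a gyro works with can be estimated from the standard Sagnac result, Δt = 4AΩ/c² (both this formula and the example loop are assumptions of this sketch; neither appears in the text above):

```python
import math

C = 299_792_458  # m/s

def sagnac_delay(area_m2: float, omega_rad_s: float) -> float:
    """Arrival-time difference between counter-propagating beams in a
    rotating loop: the standard Sagnac result, dt = 4*A*Omega / c**2."""
    return 4 * area_m2 * omega_rad_s / C**2

# A 10 cm x 10 cm ring sensing Earth's rotation rate (one turn per
# sidereal day, about 7.29e-5 rad/s):
area = 0.10 * 0.10                  # m^2
omega_earth = 2 * math.pi / 86_164  # rad/s
dt = sagnac_delay(area, omega_earth)
print(f"{dt:.2e} s")  # ~3e-23 s: tiny, but interferometry resolves it
```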

Discussing non-inertial observers can be simpler if we consider not the rotating frame of a laser gyroscope, but the "uniformly accelerated" frame of someone who sits inside a rocket, far from any gravity source, accelerating at a rate that makes them measure their weight as constant. (That's a very natural definition of uniform acceleration. I am using what I called the "contact camp" definition of weight in the FAQ entry What is Weight?.) In fact, the room in which you are sitting right now is a very good approximation to such a frame—as mentioned above, this is the content of Einstein's Principle of Equivalence.

So consider the question: "Can we say that light confined to the vicinity of the ceiling of this room is
travelling faster than light confined to the vicinity of the floor?". For simplicity, let's take Earth
as not rotating, because that complicates the question! The answer is then that (1) an observer
stationed on the ceiling measures the light on the ceiling to be travelling with speed *c*, (2) an
observer stationed on the floor measures the light on the floor to be travelling at *c*, but (3) within
the bounds of how well the speed can be defined (discussed below, in the General Relativity section), a
"global" observer can say that ceiling light *does* travel faster than floor light.

That might sound strange, so let's take it in stages. Begin with the relativity idea that an inertial
observer does measure the speed of light to be *c*. In particular, we'll need the all-important
topic of the relativity of simultaneity, for which you can find the expression *vL/c ^{2}* in
most textbooks that discuss the fundamentals of special relativity. This quantity is the amount of time
by which the clock on the tail of a train reads ahead of the driver's clock when the train has rest length L,
approaches us at velocity *v*. Now apply that result to a much bigger "train": the Andromeda galaxy, about 2.5 million light-years away. Whenever we change our velocity toward Andromeda, even by as little as a normal walking pace, our standard of simultaneity in Andromeda sweeps forward by about two days: the speed *v* is tiny, but the distance *L* is enormous.

Imagine that two planets in that galaxy are 2 light-days apart, and one sends a pulse of light to the
other. During the period that we accelerated and clocks in Andromeda jumped 2 days ahead of us, that
light pulse travelled from one planet to the other. But we can accelerate however quickly we like, so
we'll conclude that during our brief period of acceleration, the light passing between those two planets
travelled much, much faster than *c*. So while we accelerate towards Andromeda, both light and
clocks (i.e. the flow of time itself) speed up in Andromeda—but only while we accelerate.
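
Putting numbers to that jump with the *vL/c^{2}* expression (Andromeda's distance of roughly 2.5 million light-years is an assumed figure for this sketch):

```python
C = 299_792_458           # m/s
LY = 9.4607e15            # metres per light-year
L_ANDROMEDA = 2.5e6 * LY  # rest distance to Andromeda, m (approximate)

def simultaneity_shift(v: float, L: float) -> float:
    """Shift v*L/c**2 (in seconds) in the readings of distant clocks when
    we change our velocity by v toward an object at rest distance L."""
    return v * L / C**2

# Change velocity by a brisk walking pace, 1.5 m/s, toward Andromeda:
shift = simultaneity_shift(1.5, L_ANDROMEDA)
print(f"{shift / 86_400:.1f} days")  # clocks there 'jump' by several days
```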

None of the preceding discussion actually depends on the distances being large; it's just easier to
visualise if we use such large distances. So now transfer that discussion to a rocket you are sitting
in, far from any gravity and uniformly accelerated, meaning you feel a constant weight pulling you to the
floor. "Above" you (in the direction of your acceleration), time speeds up and light travels faster
than *c*, arbitrarily faster the higher up you want to consider. Now use the Equivalence
Principle to infer that in the room you are sitting in right now on Earth, where real gravity is present and
you aren't really accelerating (we'll neglect Earth's rotation!), light and time must behave in the same way
to a high approximation: light speeds up as it ascends from floor to ceiling, and it slows down as it descends
from ceiling to floor; it's not like a ball that slows on the way up and goes faster on the way down.
Light travels faster near the ceiling than near the floor. But where *you* are, you always
measure it to travel at *c*; no matter where you place yourself, the mechanism that runs the clock
you're using to measure the light's speed will speed up or slow down precisely in step with what the light is
doing. If you're fixed to the ceiling, you measure light that is right next to you to travel
at *c*. And if you're fixed to the floor, you measure light that is right next to you to travel
at *c*. But if you are on the floor, you maintain that light travels faster than *c* near
the ceiling. And if you're on the ceiling, you maintain that light travels slower than *c* near
the floor.
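
The floor-to-ceiling difference can be put into numbers using the standard weak-field result that the coordinate speed of light a height *h* above an observer is larger by a fraction of roughly *gh/c^{2}* (that formula is an assumption of this sketch, not derived above):

```python
C = 299_792_458  # m/s
G_EARTH = 9.8    # m/s^2

def fractional_speedup(h: float) -> float:
    """Fractional increase, roughly g*h/c**2, in the coordinate speed of
    light at height h above the observer (weak-field approximation)."""
    return G_EARTH * h / C**2

# For a 3-metre-high ceiling the effect is absurdly small, which is why
# no floor-bound observer will ever notice it with a ruler and stopwatch:
print(f"{fractional_speedup(3.0):.2e}")  # ~3.3e-16
```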

You can also infer that as a distant wavefront travels transversely to your "up" direction, the more distant parts of it will be travelling faster than the nearer parts. So, just as light bends when it enters glass at an angle, you won't be surprised to see the distant light bend toward you. And, of course, bending light is something you'll find in textbooks that illustrate the Equivalence Principle with a picture of a guy in an elevator encountering a beam of light.

Next step: again in the zero-gravity accelerated frame, as you accelerate toward Andromeda, ask what
happens in the direction opposite to Andromeda. Think of another train behind you if you prefer, but now
the velocity *v* has changed sign: the train is receding instead of approaching. So your changing
standard of simultaneity makes clock readings behind you jump backwards, even though the "train clocks"
themselves are still "timing forwards" as far as they are concerned. The clocks immediately behind you
will appear almost normal, but at some critical distance further back, the amount by which your new standard
of simultaneity makes them seem to jump back just balances the amount by which they have timed forwards, and
the result is that, as far as your standard of simultaneity is concerned, they have stopped. This is all
about *your* standard of simultaneity. The clocks themselves don't know anything about what
you're doing of course; they just continue to do what they were built to do. It turns out that if you
accelerate with some value *a* (meaning you feel a constant acceleration of *a*—and that
means your world line is actually a hyperbola on a spacetime diagram on which inertial observers follow
straight world lines), then this critical distance behind you at which you maintain that time and light have
stopped is *c ^{2}/a*. So if you accelerate at one Earth gravity, that distance is about
0.97 light-years, which is near enough to one light-year to make a nice rule of thumb. The more strongly
you accelerate, the closer this "horizon" will be to you. If you stop accelerating, the horizon moves
off to be infinitely far away.
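
The *c^{2}/a* figure quoted above is easy to check numerically:

```python
C = 299_792_458  # m/s
G = 9.8          # m/s^2, one Earth gravity
LY = 9.4607e15   # metres per light-year

# Distance behind a uniformly accelerated observer at which their standard
# of simultaneity says time (and light) have stopped:
horizon_m = C**2 / G
print(f"{horizon_m / LY:.2f} light-years")  # 0.97, the rule of thumb

# Accelerate ten times harder and the horizon sits ten times closer:
print(f"{(C**2 / (10 * G)) / LY:.3f} light-years")  # 0.097
```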

So imagine again that the room in which you're sitting is an accelerating rocket far from gravity, and your
weight is due to its acceleration upwards. Your 1-g acceleration means you infer that light and time
flow faster above you and slower below you. About one light-year below you is a plane parallel to the
floor on which light and time slow to a stop, the horizon mentioned a few lines back. Below that plane
time flows backwards, but you can never receive a signal from below that plane—a fact that you can prove
easily with a quick sketch on the spacetime diagram of an inertial observer, where you'll notice that you'll
forever outrun a light signal that was sent to chase you from that far away, even though an inertial observer
says that the light is travelling (at *c*) faster than you are. So you'll never see any weird
breakage of causality occurring beyond the horizon.

Saying that light and time have stopped on this horizon is a consequence of your changing standard of simultaneity as you accelerate. Anyone sitting on or beyond the horizon just continues life as usual; they can't be influenced by your state of motion. Although you maintain that they have stopped ageing, they themselves notice nothing unusual. In that sense, what we say about the flow of time and the speed of light is all about the coordinates that we have used to describe the world of our accelerated frame. But those coordinates are not silly and arbitrary, because they reflect the fact that we can build our accelerated frame by using the standard mechanism of making measurements in special relativity: we construct a rigid lattice of observers whose clocks always agree with ours, and who don't move relative to us. This construction is precisely what a uniformly accelerated frame is, and it's by no means obvious that it's possible to do: for example, an inertial observer will measure the accelerations of those other accelerated observers to differ from our own acceleration—even though we and all the accelerated observers say that they remain a fixed distance from us and from each other. That might sound odd, and to see why it's true, you have to follow the special-relativistic ideas of simultaneity, timing, and length very carefully. So although this changing standard of simultaneity might be referred to by some as just some kind of coordinate artifact, we shouldn't trivialise the use of such coordinates. They are what our world is built on.

(This changing standard of simultaneity of an accelerating observer is the real kernel behind resolving the Twin Paradox. Most discussions of the Twin Paradox try to simplify things by having the space traveller maintain constant speed on both the outbound and inbound leg, necessitating an infinitesimal period of infinite acceleration at the start of the return trip. In so doing, these discussions throw the baby out with the bath water by producing an analysis that contains an awkward gap in the timing at the moment the space traveller changes direction. If those analyses were to have the traveller accelerate in a more realistic way, what would result would be a very much more difficult, yet far more complete, analysis of the Twin Paradox that has no weird timing gaps.)

You can see that as you go about your daily life, accelerating every which way as you walk around, your
standard of simultaneity is see-sawing madly all around you. Playing around with lines of simultaneity
on a spacetime diagram and maintaining that time is doing weird things as we accelerate might seem like a
departure from good common sense. We must appeal to experiment to keep from straying into an abstract
fairy world that has nothing to do with reality. But via the Equivalence Principle, these
special-relativistic ideas of changing simultaneity feed into general relativity, and in this day and age
we *do* have the luxury of experiments that daily confirm that more advanced theory. If general
relativity didn't work, then the GPS satellite system would fail dismally at telling you where you are and
what the time is.

One note: how can you measure the speed of light if it's not right next to you? You do that through
the standard mechanism, mentioned above, of employing a lattice of observers whose clocks always agree with
yours, and who don't move relative to you. You then use the measurement of the observer who was right
next to the light whose speed you wanted to measure. And that's fine, because that observer is not
moving relative to you, and their clock always agrees with yours. That's the standard way
that *all* measurements are done in the context of special relativity.

Making observations from an *inertial* frame (and using its coordinates) produces a speed of light
that is always *c*. In that respect the inertial frame's coordinates are better for some
analyses; but the accelerated frame is more natural to our description of the world around us. After
all, we don't live our lives in free fall.

Einstein went on to propose a more general theory of relativity which explained gravity in terms of curved spacetime, and the next level of sophistication of treating our ceiling and floor observers takes real gravity into account.

It's easy to build a continuum of observers in flat spacetime with everyone inertial, who each measure
events only in their vicinity. It's possible, but much harder, to do the same for a uniformly accelerated
frame. For more complicated frames, and also for real gravity, we find that we simply can't populate space
with a continuum of observers who all agree with us on distances and simultaneity. We just won't have a
common standard of rulers and clocks. Each observer is going to measure the speed of light to
be *c* in their vicinity, but we can't accurately talk about the speed of a distant light ray (or anything
else), because we can't enlist anyone to make measurements for us in such a way that we all agree on what space
and time standards we're using.

Given this situation, in the presence of more complicated frames and/or gravity, relativity generally
relinquishes the whole concept of a distant object having a well-defined speed. As a result, it's often
said in relativity that light always has speed *c*, because only when light is right next to an
observer can he measure its speed—which will then be *c*. When light is far away, its
speed becomes ill-defined. But it's not a great idea to say that in this situation "light everywhere has
speed *c*", because that phrase can give the impression that we *can* always make measurements
of distant speeds, with those measurements yielding a value of *c*. But no, we generally can't
make those measurements. And the stronger gravity is, the more ill-defined a continuum of observers
becomes, and so the more ill-defined it becomes to have any good definition of speed. Still, we can say
that light in the presence of gravity does have a position-dependent "pseudo speed". In that sense, we
could say that the "ceiling" speed of light in the presence of gravity is higher than the "floor" speed of
light.

Einstein talked about the speed of light changing in his new theory. In the English translation of
his 1920 book "Relativity: the special and general theory" he wrote: *"according to the general theory of
relativity, the law of the constancy of the velocity* [Einstein clearly means speed here, since velocity
(a vector) is not in keeping with the rest of his sentence] *of light in vacuo, which constitutes one of
the two fundamental assumptions in the special theory of relativity* [...] *cannot claim any unlimited
validity. A curvature of rays of light can only take place when the velocity* [speed] *of
propagation of light varies with position."* This difference in speeds is precisely that referred to
above by ceiling and floor observers.

In special relativity, the speed of light is constant when measured in any *inertial* frame.
In general relativity, the appropriate generalisation is that the speed of light is constant in any freely
falling reference frame (in a region small enough that tidal effects can be neglected). In this passage,
Einstein is not talking about a freely falling frame, but rather about a frame at rest relative to a source of
gravity. In such a frame, the not-quite-well-defined "speed" of light can differ from *c*,
basically because of the effect of gravity (spacetime curvature) on clocks and rulers.

In general relativity, the constancy of the speed of light in *inertial* frames is built in to the
idea of spacetime being a geometric entity. The causal structure of the universe is determined by the
geometry of "null vectors". Travelling at the speed *c* means following world-lines tangent to
these null vectors. The use of *c* as a conversion between units of metres and seconds, as in the
SI definition of the metre, is fully justified on theoretical grounds as well as practical terms, because
*c* is not merely the vacuum-inertial speed of light, it is a fundamental feature of spacetime
geometry.

Like special relativity, some of the predictions of general relativity have been confirmed in many different observations. The book listed below by Clifford Will is an excellent reference for further details.

C.M. Will, "Was Einstein Right?" (Basic Books, 1986)

For an in-depth analysis of the speed of light in an accelerated frame, see Chapter 7 of "Explorations in Mathematical Physics" by D. Koks (Springer, 2006).

^{*} The refractive index can be less than one. Indeed, it is almost always less than
one for X-rays. This is because the phase speed of X-rays in a medium (i.e. the speed of their wave
fronts) is faster than *c*, and the refractive index is the ratio of *c* to the phase speed in the
medium. The speed of the photons themselves is the "group speed", which is always slower than *c* (except when
it isn't :-). For simplicity we ignore the distinction in this article. See the Relativity FAQ
article on faster than light (phase speed) for an explanation. (Thanks to
Pieter Kuiper for pointing this out.)