Stories do change every decade or so, and Farsight has a much larger and more impressive log of the documented stories (which also change now and then), but from what I have gathered it all started when Einstein was a young professor teaching physics. He taught how radio waves travel through the aether: if you raise the electric potential on an antenna, it takes time for that change to propagate through space. But the wave doesn’t care where it came from. It travels at a speed determined by the space itself, not by the source. Thus a radio wave carrying the changes created by any antenna will travel at the same speed through space regardless of how fast the antenna might have been moving.
Since Einstein believed that light was merely another type of wave, the same had to be true of light. His example was that if a man on a train shot an arrow, a man at the station would see the arrow traveling at the speed of the train plus the speed at which the arrow left the bow. But with light it wouldn’t be so. If that same man shined a “torch” ahead of him, the man at the station would see the light traveling at the speed of light regardless of the speed of the train.
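To make the train example concrete, here is a minimal sketch of the two cases. The speeds are my own illustrative numbers, not anything from the story itself; the point is only that the arrow’s speeds add classically while the light’s speed is claimed to be independent of the train.

```python
# Illustrative numbers only: a train doing 30 m/s and an arrow
# leaving the archer at 60 m/s relative to the train.
C = 299_792_458.0  # speed of light in vacuum, m/s

def galilean_sum(v_source: float, v_emitted: float) -> float:
    """Classical (Galilean) velocity addition: speeds simply add."""
    return v_source + v_emitted

v_train = 30.0   # m/s
v_arrow = 60.0   # m/s, relative to the train

# The man at the station sees the arrow at train speed plus arrow speed...
print(galilean_sum(v_train, v_arrow))  # 90.0 m/s

# ...but, per the claim, the light from the torch is seen at C
# no matter what v_train is.
print(C)
```

The asymmetry between those two print lines is the whole puzzle the rest of the story tries to resolve.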
But then things got complicated when Michelson and Morley attempted to prove the existence of the aether. Even though their experiment was horribly flawed, they assumed that because they hadn’t proven the aether, they must have disproved it. For more interesting reasons, it became excessively important that people not believe in aether. But then, if there is no aether, would Einstein’s theory of wave speed (shared by just about everyone at the time) still be right?
As it turned out, when they analyzed physics with the Galilean relativity postulate (it doesn’t matter who is moving; the laws have to be the same for all), if they assumed that wave speed was constant but that time itself changed for the moving observer, then everything would work out… (almost). If the time (the tick rate of a clock) held by the man on the train slowed, then the man would perceive (if he could) the light traveling away from him at the speed of waves, and the man at the station would also see the light traveling at the speed of waves. Sure enough, experiments showed that clocks really do slow down when they move faster relative to a clock that isn’t moving. But of course, that raised the question of who is really moving and who isn’t. About that time, things got all too complicated for them to keep it all actually straight, but nonetheless we ended up with Special Relativity as the consequence of what must be true if their prior assumptions from observations were true.
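The slowing-clock claim in the paragraph above is usually quantified with the Lorentz factor. A minimal sketch of the standard textbook formula, with an illustrative speed of my own choosing:

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def lorentz_gamma(v: float) -> float:
    """Standard Lorentz factor: gamma = 1 / sqrt(1 - v^2 / c^2)."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# Illustrative value: a clock moving at 80% of light speed.
v = 0.8 * C
gamma = lorentz_gamma(v)
print(round(gamma, 4))  # 1.6667 -- one tick of the moving clock spans
                        # about 1.67 ticks of the stationary one
```

This is the factor by which, in the mainstream account, the train man’s clock runs slow as judged from the station.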
So the current popular theory is that if a man were traveling near the speed of light, his clock would be running so slowly that when he measured the light that seemed to others to be moving at nearly the same speed as he was, he would still measure the light to be traveling away from him at the same speed it always travels.
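That claim follows from the standard relativistic velocity-composition formula: compose the speed of light with any observer speed and the formula returns the speed of light again. A minimal sketch (the 99% figure is my own illustration):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def relativistic_sum(u: float, v: float) -> float:
    """Einstein velocity composition: w = (u + v) / (1 + u*v / c^2)."""
    return (u + v) / (1.0 + u * v / C**2)

# Illustrative: an observer moving at 99% of light speed.
v_observer = 0.99 * C

# Composing the observer's speed with a light beam's speed...
w = relativistic_sum(v_observer, C)
print(w)  # equals C (up to floating-point rounding), whatever v_observer is
```

Algebraically, (v + c) / (1 + v·c/c²) = c for every v, which is exactly the “he would still measure the same speed” result the popular theory asserts.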
But as I have substantially pointed out (FINALLY, after over 800 posts) in the Stopped Clock Paradox - page 12, their reasoning is still seriously flawed and, in fact, impossible.