On Friday, August 8, 2025 at 2:10:19 PM UTC-6 Brent Meeker wrote:
On 8/7/2025 10:17 PM, Alan Grayson wrote:

I finally was able to identify and resolve my confusion about Hubble's Law. First, let's use a geometric model to establish that the recessional velocity of distant galaxies increases as the universe expands. For convenience, assume the universe is spherically shaped and uniformly expanding, and consider two galaxies at distances of one and ten billion light years from our own. As r, the radius of the universe, increases linearly, so will the separation distances to these remote galaxies, since the arc distances to them, if they are placed, e.g., on the equator, will also increase linearly. So in some unit of time, if, say, the rate of increase is 10%, the closer galaxy will recede by 10% of 1 billion light years, or 100 million light years, whereas the more distant galaxy will recede 1 billion light years in the same time duration. So clearly, in an expanding universe, more distant galaxies recede faster than nearer galaxies.

Let's now consider the light emitted from these galaxies. The light reaching us left those galaxies 1 and 10 billion years ago respectively. If their red shifts represented their recessional velocities when the light was emitted, it would imply that in the early universe those galaxies were receding very rapidly: the farther away in time they are, that is, the more distant they are, the more rapidly they must have been receding.

Why not phrase this as the equally true statement, "The more distant they are, the more rapidly *we* must be receding," which is then consistent with your first paragraph? Anyway, I'm glad you resolved it to your own satisfaction.

Brent

*Thanks for your kind thought, but unfortunately I am still confused. I think the geometric model is conclusive: the more distant a galaxy is, the more rapid is its recessional velocity, which is Hubble's Law. Moreover, considering the red shifts of two galaxies at different distances, from the pov of time moving forward, there is a slowing of recessional velocity due to gravity (ignoring the speed-up discovered in 1998). But my problem arises when I consider time flowing backward, where in remote times the recessional velocity inferred from the red shift is huge. Clark seems to be of two minds on this; he has stated that in very early times, after the manifestation of the CMB of course, the galaxies were very close and receding from each other slowly, and once recently he stated the opposite. What, IYO, is going on in the very early universe wrt recessional velocities, and why? TY, AG*

But this contradicts the geometric model, wherein we have inferentially proven the opposite: that in early times, those galaxies were receding with *decreasing* velocity as their separation distances from us were decreasing. So what the hell is going on? The answer is that although the light from those galaxies was emitted in the distant past, the expansion of the universe distorted those emissions as they propagated in our direction. That is, the red shifts observed were caused by the expansion of the universe, and therefore represent the current red shifts of those receding galaxies. AG
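[Editor's note: for concreteness, the geometric argument in the first paragraph can be written out in one line. This is only a minimal sketch, assuming the standard parameterization in which a uniform scale factor a(t) multiplies a fixed comoving separation chi; it is not anything specific to the spherical picture above.]

    d(t) = a(t)\,\chi, \qquad
    v(t) = \dot d(t) = \dot a(t)\,\chi = \frac{\dot a(t)}{a(t)}\, d(t) \equiv H(t)\, d(t)

With the 10% example: if a grows by 10% over the chosen interval, the galaxy at 1 billion light years recedes about 0.1 billion light years while the one at 10 billion light years recedes about 1 billion light years in the same interval, so at any fixed time v is proportional to d, which is Hubble's Law.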

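[Editor's note: and for the last paragraph, the textbook relation for how expansion stretches light in transit, again only a sketch in terms of the same scale factor a(t):]

    1 + z = \frac{\lambda_{\rm obs}}{\lambda_{\rm emit}} = \frac{a(t_{\rm obs})}{a(t_{\rm emit})}

That is, the observed red shift is set by the total growth of the scale factor between emission and observation, accumulated along the photon's path.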
