I took physics in college, just so people know. The zeroth law of thermodynamics states that two objects each at the same temperature as a third object are at the same temperature as each other. It is simple thermodynamics.
Why, then, do astronomers assume that no "star" can cool below 2400 Kelvin, given that outer space is an enormous heat sink and can easily allow solid objects to drop to single-digit Kelvin temperatures?
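To put a rough number on how fast a bare object sheds its heat into space, here is a minimal sketch. It treats the body as a uniform blackbody sphere cooling by radiation alone, with no internal heat sources and no insulating crust; the radius, density, specific heat, and temperatures used in the example are assumed illustrative values, not measurements.

```python
import math

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def cooling_time(radius_m, density, specific_heat, t_start, t_end):
    """Seconds for a uniform blackbody sphere to cool from t_start to
    t_end Kelvin by radiation alone (lumped-capacitance estimate).

    Energy balance: m * c * dT/dt = -sigma * A * T^4
    Integrating gives t = (m*c)/(3*sigma*A) * (t_end**-3 - t_start**-3).
    """
    mass = density * (4.0 / 3.0) * math.pi * radius_m ** 3
    area = 4.0 * math.pi * radius_m ** 2
    k = mass * specific_heat / (SIGMA * area)
    return (k / 3.0) * (t_end ** -3 - t_start ** -3)

# Example: an Earth-sized rock sphere cooling from 2400 K to 300 K
# (density and specific heat are assumed round numbers).
seconds = cooling_time(6.371e6, 5500.0, 1000.0, 2400.0, 300.0)
years = seconds / 3.156e7
print(f"Radiative cooling time: {years:.2e} years")
```

Note that this bare-sphere estimate comes out to tens of thousands of years; an insulating outer layer, as in the ember analogy below, would stretch that timescale enormously.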
That seems quite short-sighted. It is like saying, "All campfires have flames." By that logic, there is no such thing as red-hot embers, which are not apparent in daylight. It seems to me someone made a huge mistake, which is why I am posting this paper here. I think this makes much more sense: the heat would become trapped as the star lost its outer envelope and began solidifying in its internal regions, just as ash-covered embers in the campfire example block heat radiation.

Thus, it makes more sense to suppose that "planets" are ancient stars. The heat of the Earth is simply the long-cooled remains of a star's differentiation process from long ago. The Earth is much colder and vastly older than assumed, and this view also accounts for its age. Unfortunately, it makes the Sun quite young, meaning the objects in our solar system are not actually related but were all adopted by the Sun under temporary conditions, with the Earth having orbited different stars in the past, which would explain the cyclical extinctions occurring on periods of many tens of millions of years.
More importantly, when their assumptions are questioned, it becomes apparent that astronomers have ignored half of stellar evolution by calling the most ancient stars "planets". They assumed all stars shine, which this paper argues is false.
This paper states that the colder a star is, the older it is; the hotter it is, the younger.
Seems quite straightforward to me.