Lloyd wrote: Fault Rilles & Sinuous Rilles: But what do you think about these possibilities?

Well, there are actually a lot of possibilities here. For example, what if an impactor breaks up just before impact, and then fractures the crust -- is it possible that the rille is both a fracture, due to an impact on a crust that was already under tensile stress, and a conduit for electrical discharges through the fracture, further excavating the rille? I think that my point here is just that I don't see a reason to lock down on a hypothesis, with so few data and so many possibilities. It isn't going to prove anything.
Lloyd wrote: Sun's Age: I thought 378 million years was your upper limit for the Sun's age. But now you're saying that's the approximately exact age.

The latter is correct -- I'm saying that if the Sun started out at 10,000 K and cooled according to the Stefan-Boltzmann Law, it's now 378 million years old (+/-).
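The cooling curve behind that figure can be sketched with a toy calculation. This is a minimal lumped black-body model, not the actual derivation from the post: the effective thermal-inertia constant k is back-solved here so that cooling from 10,000 K to the present 5,778 K takes 378 Myr (an assumption for illustration, not an independent fit), which then shows how the cooling slows down as the star gets colder.

```python
import math

MYR = 3.156e13  # seconds per million years

def cooling_time(T0, T, k):
    """Time for a lumped black-body radiator to cool from T0 to T.

    Integrating dT/dt = -(1/k) * T**4, where k = C / (sigma * A) is an
    effective thermal-inertia constant, gives:
        t = (k / 3) * (T**-3 - T0**-3)
    """
    return (k / 3.0) * (T**-3 - T0**-3)

# Hypothetical k, chosen so that cooling from 10,000 K to today's 5,778 K
# takes 378 Myr (matching the figure quoted above by construction).
k = 378.0 * MYR * 3.0 / (5778.0**-3 - 10000.0**-3)

# The curve is steep at first and flattens: the first 1,000 K of cooling
# goes by much faster than the most recent 1,000 K.
t_hot = cooling_time(10000.0, 9000.0, k) / MYR   # Myr from 10,000 -> 9,000 K
t_cool = cooling_time(6778.0, 5778.0, k) / MYR   # Myr from 6,778 -> 5,778 K
print(f"10,000 -> 9,000 K: {t_hot:.1f} Myr; 6,778 -> 5,778 K: {t_cool:.1f} Myr")
```

The T**-3 dependence is why the heat loss "levels off" as described below: each additional degree of cooling takes longer than the last.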
Lloyd wrote: Are you fairly sure what size and temperature the Sun had initially?

The size is limited at the upper end by supernova theory, where anything above 1.4 solar masses would produce the internal pressure necessary for a runaway thermonuclear reaction (i.e., a Type Ia supernova). So I believe that all stars that survived the star formation process, and did not create a supernova, are < 1.4 solar masses. That piece is contrary to some aspects of mainstream stellar theory, which allows stars many times more massive than the Sun, but it's consistent with conventional supernova theory, as well as nuclear physics, in that it acknowledges the well-known limits on how much pressure you can have before the runaway reaction occurs.
As discussed elsewhere, I don't know what the lower limit is, but it seems that a lot of stars begin at something like 1/3 the mass of the Sun. If the Earth was once a star, as I believe, then 1/333,000 solar masses is still possible for a star.
The initial temperature of the Sun is an interesting question. Just from adiabatic compression of the primordial dusty plasma, plus the thermalization of the kinetic energy in the implosion, it should have been a lot hotter than 10,000 K. So I'm saying that most of the kinetic energy got converted to electrostatic potential instead of heat. So how did I settle on the 10,000 K figure?
If stars form at roughly the same mass (?), and if they cool according to the Stefan-Boltzmann Law, and if they're forming at random times, then in a large population of stars, we should see specific quantities of stars at each temperature. The Stefan-Boltzmann Law requires that stars cool rapidly at first, and thereafter, the heat loss levels off with time, asymptotically approaching absolute zero at an infinite time from now. So we should see just a few stars at the higher temperatures, and lots of stars at the lower temperatures. And that's exactly what we see in star inventories.
Code:
class   temperature (K)      percent
        min        max       of total
-------------------------------------
O       30,000     ∞          0.00003
B       10,000     30,000     0.13
A        7,500     10,000     0.60
F        6,000      7,500     3.00
G        5,200      6,000     7.60
K        3,700      5,200    12.10
M        2,400      3,700    76.45
Where the Stefan-Boltzmann curve diverged from the observations of large populations of stars was in the K class. But we know that stars in that class are flare stars, where sporadically the temperature jumps way up, and so does the heat loss rate. So a star isn't just a simple black-body radiator that will cool according to the Stefan-Boltzmann Law -- it's a complex EMHD system that undergoes some sort of transition in the K class, and that needs to be taken into account. Then the M class falls right in line.
So most stars seem to begin with masses between 1.4 and 0.3 solar masses, and at something like 10,000 K, to produce the large population statistics that we're seeing. Any given individual star could be anywhere within the valid range.
Lloyd wrote: Isn't it possible that the Sun could have started at near its present size and temperature?

If the Sun began at its present temperature, then why isn't it getting cooled off by radiative heat loss? I "think" that the only answer to that question would be that the Sun's heat is being generated dynamically, such as from nuclear fusion. But by my reckoning, fusion is only responsible for 1/3 of the Sun's power -- the rest is electrostatic potential getting reconverted to kinetic energy (in the form of ohmic heating). And that energy source will be reduced by radiative heat loss.
BTW, another implication of the Sun and the Earth beginning at their current temperatures is that radiometric dating should be reliable. I'm saying that it isn't, because both the Sun and the Earth used to be a lot hotter, and radioactive decay rates run faster at higher temperatures. So I can get away with saying that the Earth is a lot younger than in the standard model.
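The radiometric-dating point can be made concrete with a toy calculation. The temperature-dependent decay rate is the post's conjecture, and the 40x factor and the durations below are made-up illustrative numbers, not measured physics; the sketch only shows the arithmetic of how a constant-rate model reads accelerated early decay as extra elapsed time.

```python
import math

# Standard age inference assumes a constant decay rate lambda_0:
#   t_inferred = ln(N0 / N) / lambda_0

LAMBDA_0 = math.log(2) / 4.468e9  # U-238 decay constant, per year

def inferred_age(t_fast, f, t_slow):
    """Age a constant-rate model would report for a sample that actually
    decayed at f * LAMBDA_0 for t_fast years, then at LAMBDA_0 for t_slow
    years (f and t_fast are hypothetical inputs, per the conjecture above)."""
    total_decay = f * LAMBDA_0 * t_fast + LAMBDA_0 * t_slow  # = ln(N0 / N)
    return total_decay / LAMBDA_0

# Illustrative only: 100 Myr of 40x-faster decay, then 278 Myr at today's
# rate, produces the same parent/daughter ratio as 4.278 Gyr of constant
# decay -- so the constant-rate model reports the larger age.
print(inferred_age(1.0e8, 40.0, 2.78e8))
```

With f = 1 the model reduces to the standard result (inferred age equals real elapsed time), so the entire disagreement rides on whether decay rates were in fact higher early on.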
Lloyd wrote: And isn't it also possible that it could have formed at one size and then accreted other bodies and gotten larger a long time later? And, if it got larger by accretion, wouldn't it also have gotten hotter? And wouldn't that mean it could be very young?

That could also be taken to mean that both the Sun and the Earth are a lot older.