
>The numbers we are given don't make sense

These are all numbers you just provided, with no source for them.

But even using your numbers: 300 billion is 3×10^11. The Sun provides about 10^5 lux, while starlight overall provides about 10^-4 lux[1], which is a ratio of 10^9. Dividing, 3×10^11 / 10^9 ≈ 3×10^2, so the difference between "all the starlight on a dark night" and "just the starlight from Sirius" would be around 10^2, which... seems about right?

1. https://en.wikipedia.org/wiki/Orders_of_magnitude_%28illumin...
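A quick sanity check of that arithmetic, as a Python sketch (the lux values are the order-of-magnitude figures from the Wikipedia table; the 3×10^11 Sun-to-Sirius ratio is the number quoted upthread):

    sun_lux = 1e5                 # direct sunlight at Earth, order of magnitude
    all_starlight_lux = 1e-4      # total starlight on a moonless night, order of magnitude
    sun_vs_sirius = 3e11          # the "300 billion" ratio quoted upthread

    sun_vs_all_starlight = sun_lux / all_starlight_lux              # ~1e9
    all_starlight_vs_sirius = sun_vs_sirius / sun_vs_all_starlight  # ~3e2

    print(f"{sun_vs_all_starlight:.0e}, {all_starlight_vs_sirius:.0e}")  # 1e+09, 3e+02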



Look up the provided numbers if you disagree.

You're comparing the Sun's illuminance at Earth (10^5 lux at 1 AU) to all starlight combined (10^-4 lux), then trying to work backward to what a single star should provide. That's not how this works.

The question isn't "what's the ratio between sunlight and all starlight." The question is: what happens when you move the Sun to stellar distances using the inverse square law?

At 1 AU: ~10^5 lux

At 544,000 AU: 10^5 / (544,000)^2 = 10^5 / 3×10^11 ≈ 3×10^-7 lux

That's the Sun at Sirius's distance. Multiply by 25 for Sirius's actual luminosity: ~7.5×10^-6 lux.
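Here's that same calculation as a minimal Python sketch, using the 544,000 AU distance and 25× luminosity figures assumed above:

    sun_lux_at_1au = 1e5          # direct sunlight at Earth, order of magnitude
    sirius_distance_au = 544_000  # ~8.6 light-years expressed in AU
    sirius_luminosity_x = 25      # Sirius emits ~25x the Sun's power

    # Inverse square law: dim the Sun by the square of the distance ratio,
    # then scale up by Sirius's higher luminosity.
    sun_at_that_distance = sun_lux_at_1au / sirius_distance_au**2   # ~3.4e-7 lux
    sirius_at_earth = sun_at_that_distance * sirius_luminosity_x    # ~8.4e-6 lux

    print(sun_at_that_distance, sirius_at_earth)

(Rounding the intermediate step to 3×10^-7 is what gives ~7.5×10^-6 above rather than ~8.4×10^-6; same order of magnitude either way.)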

Your own Wikipedia source says the faintest stars visible to the naked eye are around 10^-5 to 10^-4 lux. So we're borderline at best, and that's with the 25× boost.

Moreover, you said "the difference between all starlight and just Sirius would be around 10^2." There are ~5,000-9,000 stars visible to the naked eye. If Sirius provides 1/100th of all visible starlight, and there are thousands of other stars, the math doesn't work. You can't have one star be 1% of the total while thousands of others make up the rest, unless most stars are providing almost nothing, which contradicts the "slightly brighter" compensation model.

Address the core issue: the inverse square law predicts invisibility. The 25× luminosity factor is insufficient compensation. Citing aggregate starlight illuminance doesn't resolve this.


It's been a long time since my astrophysics, but I think the seeming contradiction you're running into might come from treating lux (illuminance) as a measure of emitted energy, when it's actually a measure of received energy.

The Sun's (or any star's) emitted energy is measured in terms of solar luminosity.[1] The nominal value of solar luminosity is 3.83×10^26 watts. At twenty-five times as luminous, Sirius's luminosity is 9.57×10^27 watts. We can divide that by your factor of 296 billion, which gives... 3.2×10^16 watts as what actually makes it to Earth. If we convert that back into solar luminosities (to figure out the apparent brightness at Earth), it's about 8.36×10^-11.

Now, if we look up at the sky and check how bright the Sun and Sirius appear from Earth on the magnitude scale, where each step is ~2.5 times brighter than the one below it (and vice versa), the Sun has an apparent magnitude of about -27, while Sirius's is -1.46. I.e., the Sun in the sky is on the order of ten billion times brighter than Sirius. That's within an order of magnitude of the apparent brightness calculated above (1/8.36×10^-11 ≈ 1.2×10^10). Again, it seems about right.

1. https://en.wikipedia.org/wiki/Solar_luminosity
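A hedged Python sketch of both halves of that check (the -26.74 apparent magnitude for the Sun is the commonly quoted value; the comment above rounds it to -27):

    L_SUN = 3.83e26         # nominal solar luminosity, watts
    L_SIRIUS = 25 * L_SUN   # ~9.6e27 W
    DILUTION = 2.96e11      # (544,000)^2, the "296 billion" factor from upthread

    # Sirius's apparent brightness at Earth as a fraction of the Sun's
    apparent_fraction = (L_SIRIUS / DILUTION) / L_SUN    # ~8.4e-11

    # Cross-check via the magnitude scale: flux ratio = 10^(delta_m / 2.5)
    m_sun, m_sirius = -26.74, -1.46
    ratio_from_mags = 10 ** ((m_sirius - m_sun) / 2.5)   # Sun/Sirius flux ratio, ~1.3e10

    print(1 / apparent_fraction, ratio_from_mags)        # ~1.2e10 vs ~1.3e10

Both routes land around 10^10, which is the "seems about right" agreement above.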


Yeah, people get really messed up by just how good our eyes are. (For a close-to-home example, people think indoor plants get a lot closer to sunlight-level amounts of light than they really do.)

We can spot a single photon in the right conditions. https://www.nature.com/articles/ncomms12172


Eye sensitivity isn't the issue. Sirius isn't barely visible at the detection threshold, it's the brightest star in our sky. If a 25× luminosity boost over the Sun only gets you to the edge of naked-eye visibility at that distance, where do the additional orders of magnitude come from to make it one of the most prominent objects in the night sky? Show me the math.


> Sirius isn't barely visible at the detection threshold, it's the brightest star in our sky.

And it's entirely washed out during the day. The full Moon is very bright, but it's still about 400,000 times dimmer than the Sun when seen from Earth, and that's only about 14 magnitudes of difference. The brightest star in our sky is simply not very bright; our eyes are just pretty awesome.

That star you are seeing is about 25 magnitudes dimmer than the Sun, roughly ten orders of magnitude in intensity.

https://astro.wku.edu/labs/m100/mags.html

"While you may perceive one star to be only a few times brighter than another, the intensity of the two stars may differ by orders of magnitude. (Light intensity is defined as the amount of light energy striking each square cm of surface per second.) The eye is a logarithmic detector. While the eye is perceiving linear steps in brightness, the light intensity is changing by multiplicative factors. This is fortunate; if the eye responded linearly instead of logarithmically to light intensity, you would be able to distinguish objects in bright sunlight, but would be nearly blind in the shade! If logarithms are a faint memory, you should peruse a refresher on logs and logarithmic scales before continuing."

https://physics.stackexchange.com/questions/329971/how-many-... says looking up at a sunny sky lets you take in 3×10^14 photons per second per eye. Yet you can see a single photon! https://www.nature.com/articles/ncomms12172
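Rough Python numbers, if it helps put those links side by side (the 3×10^14 figure is from the Stack Exchange answer; 400,000 is the Moon-vs-Sun factor mentioned above):

    import math

    # Dynamic range of the eye: a bright daytime scene vs. a single detectable photon
    photons_per_sec_sunny_sky = 3e14
    print(math.log10(photons_per_sec_sunny_sky))   # ~14.5 orders of magnitude

    # The full Moon vs. the Sun, expressed in magnitudes (each magnitude is a factor of ~2.512)
    moon_dimmer_factor = 400_000
    print(2.5 * math.log10(moon_dimmer_factor))    # ~14 magnitudes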

Or we can conclude that the entire field, in dozens of countries, simply can't do math. Your choice.



