If the sun emits 10^45 photons per second,
If 1 attosecond is the time it takes light to travel the length of about three hydrogen atoms,
If 1 attosecond = 10^-18 seconds,
then the sun emits 10^27 photons per attosecond.
If Planck time, the smallest meaningful quantum of time, is approximately 10^-43 seconds (more precisely, about 5.39×10^-44 seconds),
then the sun emits roughly 100 photons per Planck time.
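The rate arithmetic above can be sketched in a few lines of Python (the 10^45 photons-per-second figure is this note's premise, not a measured constant):

```python
# Photon counts per time slice, from the note's premise of 1e45 photons/s.
PHOTONS_PER_SECOND = 1e45
ATTOSECOND = 1e-18   # seconds
PLANCK_TIME = 1e-43  # seconds (rounded from ~5.39e-44 s)

per_attosecond = PHOTONS_PER_SECOND * ATTOSECOND    # ~1e27 photons
per_planck_time = PHOTONS_PER_SECOND * PLANCK_TIME  # ~100 photons

print(per_attosecond, per_planck_time)
```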
If the Sun's diameter is 864,400 miles (1,391,000 kilometers),
then the Sun's surface area is about 6.09×10^12 km².
If the average visible photon's wavelength is 400 nanometers (roughly 1/250th the width of a human hair, which is about 100 micrometers),
and a hydrogen atom is about 0.1 nanometers across,
then the photons' positions at the surface of the sun would be random, but we can estimate the average spacing: divide the surface area by the photon count to get the area per photon, and take the square root for the average separation.
Per Planck time (about 100 photons spread over 6.09×10^12 km²), that gives sqrt(6.09×10^10 km²) ≈ 247,000 km: roughly one photon every 2.5×10^8 meters (about 800 million feet) per Planck-time burst of photon emission.
The finest light-pulse timing humans have achieved is on the order of 12 attoseconds.
If the sun were to burst photons for only 1 attosecond (10^27 photons), the area per photon would be 6.09×10^12 km² / 10^27 ≈ 6.09×10^-9 m², so the average spacing would be
about 7.8×10^-5 meters, or 78 micrometers. That is still roughly 200 times the 400-nanometer wavelength, so even at the surface the photons would not overlap on average.
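A minimal sketch of the spacing estimate, assuming N photons scattered uniformly over the Sun's surface, so the average separation is roughly sqrt(area / N):

```python
import math

A_SUN_KM2 = 6.09e12  # Sun's surface area in km^2 (premise from this note)

def avg_spacing_m(area_km2: float, n_photons: float) -> float:
    """Average nearest-photon spacing in meters: sqrt(area per photon)."""
    area_m2 = area_km2 * 1e6  # 1 km^2 = 1e6 m^2
    return math.sqrt(area_m2 / n_photons)

planck_burst = avg_spacing_m(A_SUN_KM2, 100)  # ~2.5e8 m (~247,000 km)
atto_burst = avg_spacing_m(A_SUN_KM2, 1e27)   # ~7.8e-5 m (~78 micrometers)
print(planck_burst, atto_burst)
```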
The Earth's distance from the sun is about 149,500,000 km.
Therefore, the surface area of the virtual sphere representing Earth's orbit is about 2.81×10^17 km².
At this distance, a 1-attosecond burst gives an area per photon of 2.81×10^17 km² / 10^27 ≈ 2.81×10^-4 m², so the average separation between the sun's photons would be
about 1.7 centimeters, which is far coarser than a 100-micrometer human hair. Over a full second, though, the area per photon drops to about 2.81×10^-22 m², for an average separation of roughly 1.7×10^-11 meters (0.017 nanometers), finer than a single atom. At this range, sunlight integrated over any human-perceptible interval is effectively perfectly smooth.
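The same sqrt(area / photon count) estimate at Earth's orbit, using the orbital radius quoted in this note:

```python
import math

R_ORBIT_M = 1.495e11                  # Earth's orbital radius, ~149.5 million km
area_m2 = 4 * math.pi * R_ORBIT_M**2  # sphere of Earth's orbit, ~2.81e23 m^2

atto_spacing = math.sqrt(area_m2 / 1e27)    # ~0.017 m per attosecond burst
second_spacing = math.sqrt(area_m2 / 1e45)  # ~1.7e-11 m per one-second burst
print(atto_spacing, second_spacing)
```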
If the Milky Way galaxy is 1,000,000,000,000,000,000 km (10^18 km), or ~100,000 light-years, in diameter, and the sun were
in the middle...
The virtual surface area of the sphere at the galaxy's edge (radius 5×10^17 km) would be 4π(5×10^17)² = π×10^36 km²,
approximately 3.14×10^36 km².
By the time the light reached someone at the edge of the galaxy, a 1-attosecond burst would give an area per photon of π×10^36 km² / 10^27 ≈ 3.14×10^9 km², for an average separation of about 56,000 km (5.6×10^7 meters). Well, let's see about a 1-second burst of photons. What resolution would that provide?
It would provide π×10^36 / 10^45 ≈ 3.14×10^-9 km² per photon, an average separation of about 5.6 centimeters. Even a full second of the sun's output is sparse at the galaxy's edge.
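A sketch of the galaxy-edge case, placing the sun at the center of a 10^18 km diameter galaxy (edge radius 5×10^17 km):

```python
import math

R_EDGE_M = 5e20                      # 5e17 km to the galaxy's edge, in meters
area_m2 = 4 * math.pi * R_EDGE_M**2  # ~3.14e42 m^2 (pi * 1e36 km^2)

atto_spacing = math.sqrt(area_m2 / 1e27)    # ~5.6e7 m per attosecond burst
second_spacing = math.sqrt(area_m2 / 1e45)  # ~0.056 m per one-second burst
print(atto_spacing, second_spacing)
```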
z8_GND_5296 is at least 1.2299×10^23 km away. The surface area of the sphere at that distance is about 1.90×10^47 km². A 1-attosecond burst gives 1.90×10^20 km² per photon, an average separation of about 1.4×10^10 km (1.4×10^13 meters). A full one-second burst gives about 190 km² per photon, a separation of roughly 14 kilometers.
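And the same estimate for z8_GND_5296, using the minimum distance quoted in this note:

```python
import math

R_Z8_M = 1.2299e26                 # 1.2299e23 km to z8_GND_5296, in meters
area_m2 = 4 * math.pi * R_Z8_M**2  # ~1.90e53 m^2

atto_spacing = math.sqrt(area_m2 / 1e27)    # ~1.4e13 m per attosecond burst
second_spacing = math.sqrt(area_m2 / 1e45)  # ~1.4e4 m (about 14 km) per second
print(atto_spacing, second_spacing)
```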
If the resolution of the human eye is 7 megapixels across an image region approximately 20 mm wide, then the eye's "pixel" pitch is 20 mm / sqrt(7×10^6) ≈ 7.6 micrometers. In comparison, a human hair is about 100 micrometers wide. In order to maintain satisfactory acuity, z8's per-second photon separation of ~14 km would have to shrink by a factor of roughly 2×10^9.
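A sketch of the eye comparison, taking the 7-megapixel figure and ~20 mm eye as this note's premises; the pixel pitch comes from dividing the width by the pixels per side (the square root of the pixel count), not by the total count:

```python
import math

EYE_WIDTH_M = 20e-3  # ~20 mm image region (premise)
PIXELS = 7e6         # 7 megapixels (premise)

pitch_m = EYE_WIDTH_M / math.sqrt(PIXELS)  # ~7.6e-6 m per "pixel"

# Per-second photon spacing at z8_GND_5296: sqrt of area per photon.
z8_area_m2 = 4 * math.pi * (1.2299e26) ** 2
z8_spacing_m = math.sqrt(z8_area_m2 / 1e45)  # ~1.4e4 m

needed_factor = z8_spacing_m / pitch_m  # ~1.8e9
print(pitch_m, needed_factor)
```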
There would also be significant attenuation at distances beyond the scale of the Milky Way galaxy, approximately 100,000 light-years.
Conclusion: we cannot see the aliens on z8_GND_5296 with our telescopes.
Hypothesis: our eyes are only meant to see within the Milky Way galaxy.
Hypothesis: the Milky Way galaxy's diameter is approximately 1 (in units of 10^18 km), a convenient round number for easy mathematical access. This correlates with our orbit, because the light-year is defined by our year, and hence by Earth's orbit. Therefore, planet Earth is a mathematical hub of the galaxy, from which other maths and observations can be formed.