If the Sun's light is supposed to be 'focused' in the manner of a flashlight, then why is it that wherever and whenever on Earth we look, we see the Sun shining at full brightness (when not attenuated by clouds, fog, volcanic ash, etc.) anytime it is above the horizon? For everyone, everywhere, at every moment of the day, the Sun's 'beam' is directed toward them. Any such 'focusing' or 'constriction' of the light must therefore encompass the entirety of the Earth with essentially uniform illumination (apart from atmospheric and other attenuation).
If the Earth is flat, how is this to be accounted for? An observer with the Sun at the zenith sees the solar disk subtend an apparent angle of about 1/2 degree. On the phone with a second person 1/8 of the terrestrial circumference (as we know it) away, roughly 5,000 km, he is told the Sun stands 45 degrees above the horizon. Simple geometry then requires the Sun to be √2 ≈ 1.414 times as distant from the second observer, and since apparent size falls as 1/distance and received flux as 1/distance², the Sun must subtend an apparent angle 0.707 times as large as at the zenith and supply half the light/energy. For ever lower solar altitudes the numbers quickly get into ludicrous territory.
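A minimal sketch of this scaling, assuming the hypothetical flat-plane model with the Sun at a fixed height h above the plane (the altitude values and the 0.5-degree zenith diameter are illustrative assumptions, not figures from the argument above):

```python
import math

# Hypothetical flat-plane model: Sun at fixed height h above the plane.
# An observer who sees the Sun at altitude angle a is at slant distance
# h / sin(a) from it. Apparent angular size scales as 1/distance and
# received flux as 1/distance**2, so relative to the zenith observer:
#   size ratio = sin(a),  flux ratio = sin(a)**2
ZENITH_DIAMETER_DEG = 0.5  # assumed apparent solar diameter at the zenith

for alt_deg in (90, 45, 20, 10, 5, 1):
    s = math.sin(math.radians(alt_deg))
    print(f"altitude {alt_deg:2d} deg: "
          f"distance = {1/s:7.2f} h, "
          f"apparent diameter = {ZENITH_DIAMETER_DEG * s:.3f} deg, "
          f"relative flux = {s**2:.4f}")
```

At 45 degrees this reproduces the 0.707 size and 1/2 flux figures above; at 5 degrees the flat model demands a Sun over eleven times as distant, a tenth the apparent diameter, and under one percent of the zenith flux, none of which is observed.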
If atmospheric refraction is then invoked as a counterargument, to provide the coincidentally precise magnification required to compensate for this geometric scaling, the retort is this. The patterns among the stars are essentially unchanged down to very near the horizon, where only a vertical distortion of small extent is incurred, owing to the stratified density gradient. The required 'magnification' cannot be imposed along the horizontal axis, because the density gradient operates only in the vertical (which is why the solar disk at setting or rising is often somewhat compressed vertically, with no alteration of its angular width). Moreover, how could magnification of a field of stars possibly leave the scale of their large-scale pattern unchanged? The concept of magnification without change of scale, at whatever scale one looks, is utterly ludicrous.
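For a sense of how little refraction actually supplies, here is a rough sketch using Bennett's (1982) empirical formula for astronomical refraction under standard atmospheric conditions (the 32-arcminute solar diameter is a typical value I am assuming for illustration). The few arcminutes of differential refraction at the horizon act only vertically, and fall hopelessly short of the horizontal magnification the flat model would need, a factor of 1/sin(altitude) that grows without bound as the Sun nears the horizon:

```python
import math

def bennett_refraction_arcmin(apparent_alt_deg: float) -> float:
    """Bennett's (1982) empirical refraction formula for standard
    conditions: input apparent altitude in degrees, output refraction
    in arcminutes. Refraction acts only along the vertical axis."""
    arg_deg = apparent_alt_deg + 7.31 / (apparent_alt_deg + 4.4)
    return 1.0 / math.tan(math.radians(arg_deg))

# Vertical compression of a 32-arcminute solar disk whose lower limb
# sits on the horizon: the lower limb is lifted more than the upper,
# squeezing the disk vertically while its horizontal width is untouched.
lower = bennett_refraction_arcmin(0.0)          # lower limb, at horizon
upper = bennett_refraction_arcmin(32.0 / 60.0)  # upper limb, 32' higher
print(f"refraction at lower limb: {lower:.1f}'")
print(f"refraction at upper limb: {upper:.1f}'")
print(f"vertical compression:     {lower - upper:.1f}' of a 32' disk")
```

Running this gives roughly 34.5 arcminutes at the lower limb against 28.4 at the upper: about a 6-arcminute (under 20 percent) vertical squeeze, zero horizontal change, and nothing remotely like the several-fold uniform magnification the flat model requires at low altitudes.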