I failed geometry and never took calculus, so I don't even know how to go about doing this... but if someone is good at math and can help out, I am all ears.

Here are the specifics.

Station A: 450' ASL, 30' antenna height, 1090 MHz operating frequency. Ridge at 550' ASL, 1 mile from Station A, blocking signals approximately 2 to 5 degrees above horizontal.

Station B: 259' ASL, 40,000' antenna height (airplane), 1090 MHz operating frequency.

Now I know that because of the ridge, the "line" to the horizon is tilted up about 2 degrees or more if I point at the top of the hill. This creates a "shadow" area behind it. What I need to figure out is this: if I am pointing not at the horizon but 2 degrees above it (to clear the hill), at what distance would a receiver at 40,000 feet no longer be in sight of the antenna, but instead fall into the shadow of the hill?
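Not a full answer, but the geometry above can be sketched numerically. This is a rough Python sketch, assuming the standard 4/3-effective-earth model for refraction and taking the antenna at 450' + 30' = 480' ASL; the model choice and the flat-earth small-angle approximation are my assumptions, not anything from the post.

```python
import math

# Hedged sketch: distance at which a 40,000 ft aircraft drops below a ray
# launched 2 degrees above horizontal, using the (assumed) 4/3-earth model.

EARTH_RADIUS_M = 6_371_000
K = 4 / 3                      # effective-earth factor, standard atmosphere (assumption)
RE = K * EARTH_RADIUS_M        # effective earth radius, meters
FT = 0.3048                    # feet -> meters
MI = 1609.344                  # meters per statute mile

h_station = (450 + 30) * FT    # Station A antenna, ASL
h_aircraft = 40_000 * FT       # aircraft altitude
theta = math.radians(2.0)      # minimum clearance angle over the ridge

# Small-angle elevation of a target at height H and ground range d over an
# effective-radius earth:  tan(el) ~ (H - h0)/d - d/(2*RE).
# Setting el = theta and rearranging gives a quadratic in d:
#   d^2/(2*RE) + d*tan(theta) - (H - h0) = 0
dh = h_aircraft - h_station
t = math.tan(theta)
d = RE * (-t + math.sqrt(t * t + 2 * dh / RE))

print(f"Aircraft drops into the ridge shadow at about {d / MI:.0f} miles")
```

Under those assumptions this comes out in the low 150s of miles, which is interestingly close to the observed range.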

A line-of-sight calculator shows the distance to be 253 miles. In real-world testing, I am getting about 120-150 miles.

What I am trying to determine is whether that 120-150 mile range is the actual radio horizon, or whether a higher-gain antenna would improve it.
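For comparison, here is a sketch of the smooth-earth horizon for both stations, ignoring the ridge entirely. The classic d = sqrt(2·R·h) horizon formula and the choice of effective-earth factor are assumptions on my part; heights are from the post.

```python
import math

# Hedged sketch: combined smooth-earth horizon range for the two stations,
# with and without the (assumed) 4/3-earth refraction correction.

EARTH_RADIUS_M = 6_371_000
FT = 0.3048                    # feet -> meters
MI = 1609.344                  # meters per statute mile

def horizon_mi(h_ft, k=1.0):
    """Horizon distance in statute miles for an antenna h_ft above a
    smooth earth; k is the effective-earth factor (k=4/3 adds refraction)."""
    return math.sqrt(2 * k * EARTH_RADIUS_M * h_ft * FT) / MI

# Maximum range is the sum of the two stations' horizon distances.
geometric = horizon_mi(30) + horizon_mi(40_000)              # no refraction
refracted = horizon_mi(30, 4 / 3) + horizon_mi(40_000, 4 / 3)

print(f"Geometric line of sight: {geometric:.0f} mi")
print(f"4/3-earth radio horizon: {refracted:.0f} mi")
```

The geometric figure lands right around 252 miles, which suggests the calculator's 253-mile number is plain geometric line of sight with no refraction; the refracted figure is closer to 290 miles. Either way, both assume no ridge, so the observed 120-150 miles would be terrain-limited rather than gain-limited.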

Station A is using a 1/2-wave (coaxial) dipole with 50' of LMR400 coax. Receiver RF gain is set at max (49.5 dB) to obtain the highest data rate.