A while back, a friend of mine set up a 220 MHz repeater on his 50 foot tower for testing purposes. (Can't remember what antenna he used.) For my mobile station using a 1/4 wave antenna, it had an effective range of about 8 miles. I could trip it with my 5 Watt handheld at 4 to 5 miles.
I could work it from my home station 25 miles away. I have a 40 foot tower with a 3dB omni antenna almost at the top. I've always been able to talk to him directly on VHF/UHF simplex.
This is over the flat prairie.
Of course, by "flat" you mean no rises or obstructions above the round earth, like a calm ocean. Lacking a better term (perhaps "smooth round earth" would be one), I will use "flat" the same way you do.
Let's do a little calculation. There is simple, unquestionable math to predict line-of-sight range:

Line_of_sight_distance [miles] = 1.22 * (sqrt(transmit_antenna_height [ft]) + sqrt(receive_antenna_height [ft]))

In the above case, this becomes (ignoring the "almost"):
1.22 * (sqrt(40) + sqrt(50)) = 16 miles
If you add, say, 10 feet to each height for the antennas themselves (your mobile ground plane was shorter than that), the result becomes 18 miles.
Radio range beyond line of sight will be only insignificantly farther at 220 MHz (unless one has tropo, sporadic E, etc.), and gain or power is not going to help beyond that.
Hearing it at 25 miles means the path is not actually flat. I would have been surprised if it were "flat" over that distance, since finding truly flat terrain that far is extremely hard; this is why real-world range is often farther than the above calculation predicts.
(If the ground at each tower were 50 ft above the round earth, the above formula would give 24 miles.)
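For anyone who wants to play with the numbers, here is a minimal sketch of the same calculation in Python; the `line_of_sight_miles` function name is just my own label for the formula above:

```python
import math

def line_of_sight_miles(h1_ft, h2_ft):
    """Approximate radio line-of-sight distance in miles between two
    antennas at heights h1_ft and h2_ft (feet) over smooth round earth,
    using d = 1.22 * (sqrt(h1) + sqrt(h2))."""
    return 1.22 * (math.sqrt(h1_ft) + math.sqrt(h2_ft))

# The three cases from the post:
print(round(line_of_sight_miles(40, 50)))    # 40 ft and 50 ft towers -> 16
print(round(line_of_sight_miles(50, 60)))    # add 10 ft of antenna each -> 18
print(round(line_of_sight_miles(90, 100)))   # towers on 50 ft of ground rise -> 24
```

The 24-mile figure in the last case shows how just a little terrain elevation under each tower closes most of the gap to the 25-mile path that was actually worked.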