RDF gets pretty involved, and while it's true that, so long as the antenna is capable of receiving the frequency the RDF unit is tuned to, it is ultimately the digital processing of the received signal that determines how well the RDF performs.
The problem with most consumer RDF units is that they are built to work with specific antennas and antenna placements, i.e. they do not have the electronics to generate calibration signals that can separate changes or variations in signal phase/amplitude/frequency or angle of arrival caused by the operator changing antenna placement or type from changes caused by the transmitted/monitored signal's own parameters. Professional and Mil-Spec units can and do accommodate and recognise such changes. It costs big bucks to write the logic and software for FPGAs and other RDF hardware to process this sort of data.
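To make that concrete, here is a minimal sketch (Python/NumPy) of the kind of per-channel correction a unit with a built-in calibration source can apply: inject a known reference tone, estimate each receive chain's complex gain/phase response, and invert it. The function names and array layout here are my own illustrative assumptions, not any vendor's actual API:

```python
import numpy as np

def calibration_factors(rx_cal, ref):
    """Estimate one complex correction factor per antenna channel.

    rx_cal: (n_channels, n_samples) complex samples of an injected
            calibration tone as seen by each receiver chain.
    ref:    (n_samples,) complex samples of the tone as generated.

    Least-squares fit of ref onto each channel gives that chain's
    gain/phase response (cabling, LNA, mixer, ADC skew); inverting
    it removes the hardware's contribution from live data.
    """
    h = (rx_cal @ ref.conj()) / (ref @ ref.conj())  # per-channel response
    return 1.0 / h

def apply_calibration(rx, factors):
    """Apply the per-channel corrections to live signal samples."""
    return rx * factors[:, None]
```

A unit without this facility has those factors baked in at the factory for one specific antenna and layout, which is exactly why swapping the antenna silently breaks its bearing solutions.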
Remember - we are talking here about changes that the RDF unit's electronics often need to read and measure in the time domain, on micro-, nano- and even picosecond scales, to carry out the bearing calculations. One of the most important effects that has to be recognised and processed for accurate DF tasking is the effect all of the above have on signal timing, for which antenna type/placement is only one of the "inputs".
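To give a feel for how tight those timing budgets are, here is a short sketch of a two-element TDOA (time-difference-of-arrival) bearing calculation using the standard arcsine relation. The 0.5 m baseline and function name are illustrative assumptions on my part:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def bearing_from_tdoa(dt, baseline_m):
    """Angle of arrival (degrees from broadside) for two antenna
    elements separated by baseline_m, given the arrival-time
    difference dt in seconds: theta = arcsin(c * dt / d)."""
    return np.degrees(np.arcsin(np.clip(C * dt / baseline_m, -1.0, 1.0)))

# Two elements 0.5 m apart: the entire +/-90 degree field of view
# spans only about +/-1.7 ns of arrival-time difference.
d = 0.5
print(bearing_from_tdoa(0.2e-9, d))  # ~6.9 degrees
print(bearing_from_tdoa(0.3e-9, d))  # ~10.4 degrees
```

Note that a 0.1 ns timing error shifts the bearing by roughly 3.5 degrees on that baseline, so any uncalibrated delay introduced by a different antenna, cable length or mounting position goes straight into the bearing error.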
LarrySC, I hear where you are coming from idea-wise, but after a lifetime working with Mil and space-type antennas, I have to tell you that unless you can cater for the factors I have described above, I'm afraid any changes to an out-of-the-box RDF setup, or to its antenna type/layout, are unlikely to yield any real-world performance improvement that will consistently show itself across all the changes that occur in carrying out DF tasks.