Video quality varies with a few factors: placement of the receiver and transmitter antennas; the noise floor of the surrounding area (how noisy certain frequencies are in your location); and how many times the signal is interrupted along the way, by which I mean whether other video components are plugged in line. It is entirely possible to splice in a display component that overlays data on the screen, but the extra solder joint or two in the wiring can slightly (sometimes unnoticeably) degrade video quality. Again, this is completely dependent on the setup. There is also the possibility of interference from surrounding equipment. For instance, an electric drill two feet from your receiver will impact quality, but if the receiver is placed away from such equipment the problem disappears.
In my testing to come I will be experimenting with different broadcast frequencies. The frequency the transmitter broadcasts on can impact video quality: some frequencies penetrate objects better than others, while others sit on a busy noise floor. For example, 900 MHz isn't a great band to use because of how heavily cordless (house) telephones occupy it.
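The idea of checking the noise floor before picking a frequency can be sketched in code. This is a minimal illustration, not a real driver: `read_rssi_dbm` is a hypothetical stand-in for however your receiver actually reports signal strength (OSD menu, serial protocol, or an SDR sweep), and the channel list is just an example set of frequencies.

```python
import random

# Example channel list in MHz — substitute whatever band your gear uses.
CHANNELS_MHZ = [900, 1200, 1280, 2370, 2450, 5800]

def read_rssi_dbm(freq_mhz: int) -> float:
    """Hypothetical stand-in for a real noise-floor measurement.

    With the transmitter OFF, the RSSI on a channel approximates the
    local noise floor. Simulated here with random values in dBm.
    """
    return random.uniform(-95.0, -70.0)  # more negative = quieter

def quietest_channel(channels) -> int:
    # Pick the channel whose measured noise floor is lowest.
    return min(channels, key=read_rssi_dbm)

if __name__ == "__main__":
    print("quietest channel:", quietest_channel(CHANNELS_MHZ), "MHz")
```

In practice you would take several readings per channel and average them, since a single sample can catch a momentary burst (that cordless phone ringing) rather than the typical noise level.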
I’m working on sorting some of these questions out.
Keep in mind this technology is generally good for transmitting up to 10-30 km. Transmission quality degrades rapidly with distance (in free space, received power falls off with the square of the distance), meaning a system that shows a slight bit of interference at 10 km will be dramatically stronger (less interference, better video) at short distances, such as <500 m at a kart track.
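The drop-off with distance can be put in rough numbers using the standard free-space path loss formula (distance in km, frequency in MHz). This is free-space only; walls, terrain, and multipath add further loss on top of it.

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (standard formula: d in km, f in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Compare a 10 km link against 500 m at 900 MHz.
loss_far = fspl_db(10.0, 900.0)
loss_near = fspl_db(0.5, 900.0)
print(round(loss_far - loss_near, 1))  # → 26.0 dB less loss at 500 m
```

Note that every doubling of distance costs about 6 dB regardless of frequency, which is why a link that is marginal at its maximum range has so much headroom at kart-track distances.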