Yes, a UHF antenna can pick up VHF signals, but not well. The size difference makes the UHF antenna inefficient at the longer VHF wavelength, but some energy can still be received by it.
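The size mismatch above comes straight from the wavelength formula. As a rough sketch (the frequencies are illustrative picks, not tied to any particular station), a resonant half-wave dipole is cut to half the signal's wavelength, so the required element length scales inversely with frequency:

```python
# Why a UHF-sized antenna is a poor physical match for a VHF wave:
# a half-wave dipole is cut to half the wavelength, lambda = c / f.

C = 299_792_458  # speed of light, m/s

def half_wave_dipole_m(freq_hz: float) -> float:
    """Length in metres of a half-wave dipole resonant at freq_hz."""
    return C / freq_hz / 2

# Illustrative: VHF channel 2 (~57 MHz) vs. a UHF channel (~569 MHz)
vhf_len = half_wave_dipole_m(57e6)    # ~2.6 m
uhf_len = half_wave_dipole_m(569e6)   # ~0.26 m
print(f"VHF dipole = {vhf_len:.2f} m, UHF dipole = {uhf_len:.2f} m")
```

A UHF element is roughly a tenth the size a VHF signal wants, which is why it intercepts only a fraction of the energy.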
Different people draw the boundary between VHF and UHF differently. I tend to regard UHF as starting at 200 MHz.
First, you need an antenna that is both UHF and VHF compatible. Usually it will have a pair of rabbit ears (VHF) and a center loop or plate (UHF). Make sure the rabbit ears are completely extended, then just re-scan your channels on your digital box or your HDTV.
A wireless VHF microphone set is more prone to interference than UHF microphones.
It really depends where you are. UHF stands for ultra high frequency and VHF stands for very high frequency. VHF has been around longer than UHF, which makes VHF gear cheaper to use, but UHF means smaller antennas and better output. VHF works better in open rural locations, because buildings and natural barriers affect it; UHF gets through buildings and natural barriers more easily, but you pay a lot more for it. Anyway, to answer your question: both are used, so just ask your locals what they use.
VHF is the more common band for TV channels; you are probably thinking of UHF.
I wouldn't think so. The UHF coupler is made NOT to pass VHF frequencies. The transmitter and the coupler are designed to operate in different bands.
You need a VHF/UHF antenna (channels 2 to 69). Check out: http://www.fcc.gov/cgb/consumerfacts/digitaltv.html
Digital TV signals are transmitted on VHF starting at channel 2 and ending at the top of the UHF spectrum, channel 69. At one time the plan was not to use the VHF band, but it didn't work out that way.
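For reference, the channel numbers mentioned above map to frequencies via the US (NTSC/ATSC) channel plan: 6 MHz channels, with VHF split across three sub-bands and UHF running contiguously from channel 14 to 69. A small sketch of that mapping:

```python
# US broadcast TV channel plan: lower/upper edge in MHz for a given
# channel number (2-69). Each channel is 6 MHz wide.

def channel_band_mhz(ch: int) -> tuple[int, int]:
    """Return (lower_edge, upper_edge) in MHz for US TV channel `ch`."""
    if 2 <= ch <= 4:
        lo = 54 + 6 * (ch - 2)    # VHF low band
    elif 5 <= ch <= 6:
        lo = 76 + 6 * (ch - 5)    # VHF low band, above the 72-76 MHz gap
    elif 7 <= ch <= 13:
        lo = 174 + 6 * (ch - 7)   # VHF high band
    elif 14 <= ch <= 69:
        lo = 470 + 6 * (ch - 14)  # UHF band
    else:
        raise ValueError("channel must be 2-69")
    return lo, lo + 6

print(channel_band_mhz(2))   # (54, 60)
print(channel_band_mhz(69))  # (800, 806)
```

So "channel 2 to channel 69" spans roughly 54 MHz to 806 MHz, with gaps between the VHF sub-bands.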
Yes
The frequency. Oddly enough, UHF actually stands for Ultra High Frequency and VHF stands for Very High Frequency. VHF TV channels covered roughly 54 MHz to 216 MHz; UHF TV channels cover roughly 470 MHz to 806 MHz.
A couple of points. When you're on the receiving end, UHF signals have a few disadvantages compared with VHF signals, owing to UHF's higher frequencies:

1) UHF transmitters tend to be less powerful than VHF transmitters.
2) Transmission lines lose appreciably more signal at UHF than at VHF, and that's just for transporting the signal from the transmitter room to the antenna.
3) UHF signals tend to weaken more quickly than VHF signals as they propagate outwards from their transmitters.
4) UHF receivers tend to be less sensitive than VHF receivers.

UHF signals have one particular advantage over VHF signals: the shorter wavelengths of UHF allow a smaller antenna to provide the same performance as a larger VHF antenna. Alternately, you can make the UHF antenna larger for enhanced performance, and it may still be small compared to a nominal VHF antenna. (Please pardon all the vague, qualitative references.) High-performance UHF antennas that are not especially huge can more than compensate for lower-power transmitters, lossier transmission lines, higher path losses, and less sensitive receivers.

Another advantage of UHF (and microwave) is that more frequencies tend to be available there than at the lower VHF frequencies. As technology advanced over the years, radio-spectrum occupancy moved from the lower frequencies to the higher frequencies, so the tendency is for lower, "older" (VHF) frequencies to be more crowded than higher, "newer" (UHF/microwave) frequencies.

More than likely, the frequency you operate on will be determined by the radio-communications licensing authority in your country (e.g., the FCC in the USA). A particular service might have allocations available in more than one band, i.e., VHF and UHF, and then it's up to the engineering department to decide which of those legally available frequencies to choose for their system.
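Point 3 above, that UHF weakens faster with distance, can be made concrete with the standard free-space path loss formula, FSPL(dB) = 20*log10(4*pi*d*f/c). The distances and frequencies below are illustrative assumptions, not tied to any specific transmitter:

```python
# Free-space path loss comparison: at the same distance, a tenfold
# increase in frequency costs an extra 20 dB of path loss.
import math

C = 299_792_458  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB at the given distance and frequency."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

d = 30_000  # illustrative 30 km path
print(f"VHF  60 MHz: {fspl_db(d, 60e6):.1f} dB")
print(f"UHF 600 MHz: {fspl_db(d, 600e6):.1f} dB")  # 20 dB more loss
```

That extra 20 dB of loss is exactly what the larger (or higher-gain) UHF antennas mentioned above are compensating for.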