What does a DVI (Digital Visual Interface) input do over a VGA (Video Graphics Array) input on a monitor?
Rule of thumb: if your output is digital, try to keep it digital all the way; if your output is analog, same deal. The more times you convert between the two, the worse the end result will be.

If your computer supports a digital (DVI) output and your monitor does too, keeping the signal digital end to end helps maintain the integrity of the feed. Every time you convert between a digital (DVI) signal and an analog (VGA) signal, you're "interpreting" between the two, and interpreting always involves some degree of error. That might show up as frame stuttering in motion video or horizontal sync problems in still frames.

The best approach is to avoid interpreting as much as possible. If your computer can output DVI and your display can accept DVI, connect them directly with a DVI cable. If one side is DVI and the other is VGA, a single conversion is the best you can do.

In essence, the fewer translations the better. I frequently see DVI computers connected to DVI displays through two VGA conversions (the cable connections translate DVI to VGA, and then back again). That's "interpreting" a true-to-life signal twice, and each pass comes with some loss of data. The fewer of those conversions, the more true-to-life your output will be.
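To see why each conversion costs you something, here's a toy Python sketch (not a model of actual DVI/VGA electronics; the noise level, 8-bit quantization, and signal values are all made-up illustrative numbers). Each simulated digital-to-analog-to-digital round trip adds a little analog noise and then re-quantizes, and the accumulated error only grows with each extra trip:

```python
import random

def dac_adc_round_trip(samples, noise=0.02, levels=256):
    """Toy model of one digital -> analog -> digital conversion.

    Each trip adds a bit of analog-stage noise, then re-quantizes
    to `levels` discrete steps (8-bit here, purely illustrative).
    """
    out = []
    for s in samples:
        analog = s + random.uniform(-noise, noise)   # analog noise pickup
        analog = min(max(analog, 0.0), 1.0)          # clip to valid range
        digital = round(analog * (levels - 1)) / (levels - 1)  # re-quantize
        out.append(digital)
    return out

def max_error(a, b):
    """Worst-case difference between two versions of the signal."""
    return max(abs(x - y) for x, y in zip(a, b))

random.seed(0)
original = [i / 99 for i in range(100)]   # a clean ramp signal

once = dac_adc_round_trip(original)       # one conversion (DVI -> VGA)
twice = dac_adc_round_trip(once)          # and back again (VGA -> DVI)

print(f"worst error after one conversion:  {max_error(original, once):.4f}")
print(f"worst error after two conversions: {max_error(original, twice):.4f}")
```

Running it shows the error bound after two round trips is roughly double the single-trip bound, which is the whole point of the answer above: a DVI-to-DVI path skips both trips entirely.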