It's called DVI, which stands for Digital Visual Interface. There are virtually no cons to DVI compared with VGA (which is what you'll be using now, assuming I understood your description correctly). The main difference is that DVI is digital, so it doesn't suffer from signal degradation: the signal is sent as 1s and 0s rather than as a wave, which is how analogue signals like VGA are sent. That said, with a decent VGA cable you shouldn't see any noticeable degradation as long as the cable is shorter than 1.5 m.
DVI can also support higher resolutions, but you'd need a monitor that supports those resolutions as well, and I assume that's not needed in your case. DVI is the newer standard, so the majority of monitors and graphics cards support it as their default, but you can use converter cables and adapters to go back to the old VGA connector if you wish.
GPUs (graphics cards) offload graphical and visual calculations from the CPU to the GPU, giving you faster all-round performance in graphically intensive programs such as games, CAD design, and media editing.
In short, as a casual computer user you don't need DVI or a dedicated GPU. However, if you do just want to spend some money, I would go with this:
Newegg.com - ATI 100-505564 FirePro V3700 256MB PCI Express 2.0 x16 Workstation Video Card - Workstation Graphics / Video Cards
It's a workstation card, so it's optimised for photo editing and pixel-perfect accuracy; however, it would be pretty dire at playing video games.