No, that's incorrect. UDMA and ATA are not manufacturer-specific naming conventions — everyone uses them to describe IDE device transfer rates and motherboard IDE support. They are both interface types.
UDMA stands for Ultra DMA (Ultra Direct Memory Access). This is used to rate what the IDE controllers on the motherboard support.
ATA is the rating given to IDE devices like hard disks to tell you what data transfer speed they support.
These are maximum (burst) speeds; it does not mean the devices transfer at this rate all the time.
UDMA 133/100/66/33 means that the motherboard IDE controller supports IDE devices with these data transfer rates: 133 MB/s, 100 MB/s, 66 MB/s, and 33 MB/s (MB = megabytes).
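For reference, the marketing names UDMA33/66/100/133 correspond to numbered Ultra DMA modes in the ATA/ATAPI standards. A quick sketch of that mapping (the function name here is just for illustration):

```python
# UDMA mode number -> approximate maximum burst rate in MB/s
# (mode numbers per the ATA/ATAPI standards; the familiar labels
# UDMA33/66/100/133 refer to the rate, not the mode number)
UDMA_MODES = {
    0: 16,    # 16.7 MB/s
    1: 25,    # 25.0 MB/s
    2: 33,    # "UDMA33"  (introduced in ATA-4)
    3: 44,    # 44.4 MB/s
    4: 66,    # "UDMA66"  (ATA-5)
    5: 100,   # "UDMA100" (ATA-6)
    6: 133,   # "UDMA133" (ATA-7)
}

def marketing_name(mode):
    """Return the familiar 'UDMA<rate>' label for a mode number."""
    return f"UDMA{UDMA_MODES[mode]}"

print(marketing_name(5))  # UDMA100
```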
So ATA 133/100/66/33 is the same thing, except it tells you what the IDE device supports. In your case the IDE device is a hard disk.
What you do is match up what the motherboard supports with what the IDE device supports. If your hard disk is ATA133 but your motherboard only supports UDMA100, then your hard disk will slow down to ATA100.
The reason they say 133/100/66/33 is that the hardware can run at any of those rated speeds if it must. By default it runs at the maximum transfer speed both sides support — the negotiation happens automatically.
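The matching rule above boils down to "the link runs at the slower of the two maximums." A toy sketch of that (hypothetical function, just to make the rule concrete):

```python
def negotiated_speed(controller_max_mbs, drive_max_mbs):
    """Controller and drive auto-negotiate; the link runs at the
    slower of the two rated maximums (in MB/s)."""
    return min(controller_max_mbs, drive_max_mbs)

# An ATA133 hard disk on a motherboard whose controller tops out
# at UDMA100 ends up running at ATA100 speed:
print(negotiated_speed(100, 133))  # 100
```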
Yes, there is a real difference between the interfaces. Given the same hard disk manufacturer and cache buffer size, an ATA133 drive can run noticeably faster than an ATA100 drive, especially in burst transfers out of the drive's cache.