In computing, storage forms a hierarchy, ordered by capacity, latency, and cost. A cache is a storage component that is faster than the larger, slower storage it sits in front of; the example PP gave is CPU cache compared to RAM. The tradeoff for the increased speed is that the cache has a much smaller capacity and costs more per byte.
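The same idea shows up in software: a small, fast store in front of a bigger, slower one, with some policy for deciding what to evict when the cache fills up. Here's a minimal sketch of a least-recently-used (LRU) cache in Python as an analogy (the class and its methods are illustrative, not any particular library's API):

```python
from collections import OrderedDict

class LRUCache:
    """A small, fast store in front of slower storage.
    When capacity is exceeded, the least-recently-used entry is evicted."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # insertion order doubles as recency order

    def get(self, key):
        if key not in self.store:
            return None  # cache miss: caller falls back to the slower storage
        self.store.move_to_end(key)  # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict the least recently used entry
```

A hardware cache is obviously more complicated (associativity, cache lines, coherence), but the core tradeoff is the same: tiny capacity bought in exchange for speed.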
RAM comes in multi-gigabyte capacities, takes between 10 and 100 nanoseconds to read or write to, and costs about $0.03 per megabyte.
CPU cache is usually less than 10 megabytes in a normal desktop CPU, takes less than 10 nanoseconds to read or write to, and costs in the range of a dollar per megabyte.
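To make the cost gap concrete, here's a back-of-the-envelope calculation using the per-megabyte figures above (the 8 GB / 8 MB sizes are assumed for illustration):

```python
# Ballpark per-MB prices from the text above; sizes are assumed examples.
ram_cost_per_mb = 0.03    # dollars per MB of RAM
cache_cost_per_mb = 1.00  # roughly "a dollar per megabyte" of CPU cache

ram_mb = 8 * 1024   # a typical 8 GB of RAM
cache_mb = 8        # a typical 8 MB of CPU cache

print(f"8 GB of RAM:    ${ram_mb * ram_cost_per_mb:.2f}")
print(f"8 MB of cache:  ${cache_mb * cache_cost_per_mb:.2f}")
print(f"cache costs ~{cache_cost_per_mb / ram_cost_per_mb:.0f}x more per MB")
```

So at these prices an entire 8 MB cache costs about $8, while 8 GB of RAM runs about $245, yet per megabyte the cache is roughly 33 times more expensive. That's why the fast layer stays small.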
Here's a good diagram that illustrates the point: