A microprocessor works by executing simple operations listed in a program. These operations include reading from and writing to memory (the RAM in your computer), performing arithmetic (additions, multiplications, and so on), and handling Input/Output signals (such as writing to disk or displaying things on screen). To keep these instructions moving along in order, there is a clock signal that tells the CPU when to execute each step. Imagine it as a device "feeding" the processor these instructions one by one.
A clock cycle is the basic unit of time in which the processor carries out a defined number of these actions. In one clock cycle, the CPU could, for instance, READ from memory, perform an operation on the data, and then WRITE the result back to memory. The clock signal acts like an orchestra conductor: it tells the CPU when to do things.
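That read/operate/write rhythm can be sketched as a tiny simulation. This is a hypothetical toy model, not how real hardware is built: each loop iteration stands in for one clock tick, `memory` is a toy RAM, and the three made-up operations (READ, ADD, WRITE) mirror the steps described above.

```python
# Toy model of a clock-driven processor (hypothetical, heavily simplified).
# Each loop iteration represents one clock tick, during which the CPU
# performs one step of the program.

memory = [5, 7, 0]  # toy RAM: two input values and one output slot

program = [
    ("READ", 0),   # load memory[0] into the accumulator
    ("ADD", 1),    # add memory[1] to the accumulator
    ("WRITE", 2),  # store the accumulator into memory[2]
]

accumulator = 0
for tick, (op, addr) in enumerate(program, start=1):
    if op == "READ":
        accumulator = memory[addr]
    elif op == "ADD":
        accumulator += memory[addr]
    elif op == "WRITE":
        memory[addr] = accumulator
    print(f"tick {tick}: {op} {addr} -> accumulator = {accumulator}")

print(memory)  # memory[2] now holds 5 + 7 = 12
```

After three ticks the sum of the two inputs has been written back to memory, which is exactly the read/operate/write sequence a real CPU might perform over its clock cycles.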
Why is this important? Because the shorter a clock cycle is, the faster the CPU does its work. The clock frequency of a CPU tells us how many cycles happen in one second. A 1 GHz processor goes through 1 billion clock cycles per second! The higher the frequency, the more instructions the CPU can execute per second.
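The relationship between frequency and cycle length is just a reciprocal, which a quick calculation makes concrete:

```python
# Cycle time is the inverse of clock frequency: time = 1 / frequency.
freq_hz = 1_000_000_000          # 1 GHz = 10^9 cycles per second
cycle_time_s = 1 / freq_hz       # duration of a single clock cycle
print(cycle_time_s)              # 1e-09 seconds, i.e. one nanosecond
```

So at 1 GHz each cycle lasts one nanosecond; double the frequency to 2 GHz and each cycle shrinks to half a nanosecond.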
I hope that explained the issue a bit.