A memory leak is something that occurs in software written in languages that do not perform garbage collection for you, such as C and C++. (Garbage-collected languages like Java and the .NET family reclaim unused memory automatically, so this particular problem mostly goes away there.)
There are two ways to allocate memory in a program:
The "Stack":
Code:
void func3()
{
    int int_variable;
    int_variable = 1;
    int_variable++;
    // some other math
}
Allocating memory on the "stack" means that the actual memory is stuck right on top of a stack of other stuff. If my program called functions like main() calls func1() calls func2() calls func3(), and inside func3() I define a variable x, it would be like this on the stack:
Code:
func3() & x, other local variables
func2() & local variables
func1() & local variables
main() & local variables
When func3() completes, its entry is wiped off the top of the stack, including all of its local variables. That is why int_variable in the sample above is destroyed automatically when the function returns.
The other (more flexible) way to do things is "heap allocation", also known as "dynamic allocation".
For example:
Code:
void func1()
{
    int *int_variable;
    int_variable = new int;
    *int_variable = 1;
    (*int_variable)++; // parentheses needed: *int_variable++ would advance the pointer, not the value
    // More math here
}
Now, when that function completes, the int it allocated stays in memory, in a region called the "heap". Heap memory lives for as long as the program is running and is not garbage collected. So if I allocate memory and never deallocate it, it is never freed up. If I call func1() 1,000 times in a row, I waste 1,000 little blocks of memory. Over time, in long-running applications, this adds up to a vast memory "leak".
This can be corrected by adding a line:
Code:
void func1()
{
    int *int_variable;
    int_variable = new int;
    *int_variable = 1;
    (*int_variable)++; // parentheses needed: *int_variable++ would advance the pointer, not the value
    // More math here
    delete int_variable;
}
The delete operator hands the memory back to the program's heap allocator, which marks it as free so that later calls to new can reuse it (the allocator may, in turn, eventually return it to the OS).