System Performance Degrades Over Time

What is the exact definition of a memory leak? I can't say that I've heard of that.
 
Usually it's a poorly written program that isn't handling its memory right. WinFS in MS Vista was leaking way too much memory, so they had to shelve it for now.
 
That's right, I knew I'd heard that term somewhere; I wasn't sure what it meant though.

WinFS as in their new file system? What are they using if that is shelved?
 
A memory leak is something that occurs in software written in languages that do not perform garbage collection (in other words, pretty much everything before the Java/.NET era).

There are two ways to allocate memory in a program:

The "Stack":
Code:
void func3()
{
   int int_variable;   // allocated on the stack

   int_variable = 1;
   int_variable++;
   // some other math
}  // int_variable is destroyed automatically when func3() returns
Allocating memory on the "stack" means that the memory is placed right on top of a stack of other stuff. If my program's call chain is main() calls func1(), which calls func2(), which calls func3(), and inside func3() I define a variable x, the stack would look like this:
Code:
func3() & x, other local variables
func2() & local variables
func1() & local variables
main() & local variables
When func3() completes, the top entry is wiped off the stack, including all the associated variables. Likewise, when my sample func3() above finishes, the stack variables it contains are destroyed.

The other (more common, better, whatever) way to do things is "Heap allocation", also known as "Dynamic allocation".

For example:
Code:
void func1()
{
   int *int_variable;
   int_variable = new int;   // allocated on the heap

   *int_variable = 1;
   (*int_variable)++;   // note the parentheses: *int_variable++ would move the pointer, not bump the value

   // More math here
}

Now, when that function completes, the pointer int_variable is destroyed (it was on the stack), but the int it pointed to stays in memory, in an area called the "heap". Heap memory stays allocated as long as the program is running, and is not "garbage collected". So, if I allocate memory and never free it, it never gets released. If I call func1() 1,000 times in a row, I waste 1,000 little chunks of memory. Over time, in applications, this adds up to vast memory "leakage."
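To make that concrete, here's a rough sketch of the leak in action (the loop and the name leak_demo are just my own illustration):
Code:
// Calls the leaky version of func1() 1,000 times. Each call
// allocates one int with "new" and never deletes it, so roughly
// 1,000 * sizeof(int) bytes are stranded until the program exits.
void leak_demo()
{
   for (int i = 0; i < 1000; i++)
      func1();   // every call strands another int on the heap
}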

This can be corrected by adding a line:
Code:
void func1()
{
   int *int_variable;
   int_variable = new int;

   *int_variable = 1;
   (*int_variable)++;

   // More math here

   delete int_variable;   // hand the memory back when we're done with it
}

The delete statement tells the memory manager that the block is free and unused again, so it can be handed back out, either to my process on a later allocation or, eventually, returned to the OS for other processes.
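As a side note beyond the original example: modern C++ (C++11 and later) also has smart pointers, which call delete for you when the pointer goes out of scope, so you get heap allocation without having to remember the cleanup. A rough sketch of the same function using one:
Code:
#include <memory>

void func1()
{
   // std::unique_ptr owns the int and calls delete automatically
   // when it goes out of scope, even if the function returns early,
   // so this version can't leak.
   std::unique_ptr<int> int_variable(new int);

   *int_variable = 1;
   (*int_variable)++;

   // More math here
}  // memory freed here, no explicit delete needed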
 
Ouch

Not a programmer, but I kind of understood.

What is Garbage Collection?
 
It's a background thread, provided by the runtime in every program written in a "managed" (which is to say, garbage-collected) language, that constantly watches memory and looks for allocations that are no longer needed, for example because the variables referring to them have gone "out of scope".

For instance, if I translated the above code into Java, I would never be *required* to explicitly call delete (Java doesn't even have it); once my function completed, the garbage collection thread would see that the object "went out of scope" (in other words, is no longer in use) and would free up the memory for me.

EDIT: Some nice examples of modern "managed" or garbage collected languages: Java, C#, C++ .NET, Visual Basic .NET, etc.
 
That makes perfect sense, I dig. I'm pretty sure I understand now. Thanks for the knowledge bomb!
 
Yeah, I just thought I'd toss the more specific technical explanation out there since it's of some interest.

Basically, just remember that things like memory leaks are going to become less and less common in the future thanks to managed code.

If there is one thing in software development that is universal, it is that programmers are lazy. You can always count on programmers to cobble together solutions and then go back and fix the problems later, even in huge projects where they really shouldn't be doing that.

The problem is so pervasive in software development that there are entire "fields of study" that revolve solely around the software development process and creating stable, standards-compliant, neatly packaged code. But even then, programmers hack stuff together for beta revisions and so on and so forth.

This is what is giving rise to all the new programming languages recently... people realized that hardware is cheap, and programmers are expensive. It used to be the other way around, and so efficiency was the name of the game.

Standard C/C++ was not left "unmanaged" arbitrarily; it was made that way because the hardware of the day (C dates back to 1973) was so slow that the overhead of managed code would have made software prohibitively slow. It is left to the programmer to do what is necessary, and only what is necessary, if they want.

Now, we can build a monster of a box (compared to just 2-3 years ago) for <$1000. So people started saying: why design to the hardware? Let's design for the user and for the programmer. They realized that software was getting so large that going back and finding all those little bugs was becoming an unmanageable task.

Enter managed code and garbage collection. It makes large projects just that little bit easier to manage. Sure, it runs slower, but it's easier to program for and, most importantly, a hell of a lot easier to debug.

Notice that your XP box probably crashes a lot less than any 95/98 box you used to run. This isn't just because 98 sucks (it did), but because tighter code management is being implemented at all levels.

Consider these huge problems and sources of instability, hacks, security risks, whatever:

- Memory leakage --- solved almost entirely by modern managed languages.
- Buffer overflows --- largely mitigated by managed languages (which bounds-check every array access) and by modern processors that can mark memory regions as protected or non-executable at the hardware level (see the sketch after this list).
- Incompatibility/non-adherence to standards --- solved almost entirely by modern managed languages, where the standard protocols come already implemented in the language's libraries.
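For the curious, here's a minimal sketch of the classic stack buffer overflow that the second bullet refers to (the function names vulnerable and safer are just my own illustration):
Code:
#include <string.h>

void vulnerable(const char *input)
{
   char buffer[16];

   // Classic bug: strcpy() copies until it hits the terminating
   // '\0' and has no idea how big 'buffer' is. Any input longer
   // than 15 characters writes past the end of the array,
   // trashing whatever sits next to it on the stack.
   strcpy(buffer, input);
}

void safer(const char *input)
{
   char buffer[16];

   // Bounded copy: never writes more than sizeof(buffer) - 1
   // characters, and the string is always terminated.
   strncpy(buffer, input, sizeof(buffer) - 1);
   buffer[sizeof(buffer) - 1] = '\0';
}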

Code today is achieving levels of stability and complexity that were almost unattainable years ago. Sure, it doesn't run as fast as hand-tuned assembly, but writing an OS like Windows XP entirely in machine language would be an undertaking the size of the space program.
 
Did you just write that? It's like a goddamn essay on coding! It was very informative and quite interesting. I would like to do some programming, but I'm so impatient (and I taught myself HTML back in the day and ended up HATING it because it was so monotonous... I can only imagine coding a real program) and I'm terrible at math. Thing is, I love math, but I'm so ridiculously bad at it. Aren't you supposed to be good at things you enjoy?
 