Well, the way I see it, there are really only four choices: C, C++, C#, and Java.
I learned on Pascal, but unfortunately it's a dead language now.
VB is alright, but I think people who learn on VB tend to develop nasty habits and turn out inferior code compared to people who learned things the "old school" way and moved up to that sort of environment.
As for LISP and Prolog, I can't even see them making the list. I think functional and logic programming have no place being taught to beginners, since they are so abstract as to be almost incomprehensible even to veteran programmers. Human beings just do not have an easy time thinking recursively, in my opinion. As long as computers are essentially von Neumann machines, I think programming should be taught in imperative languages, since they are the closest to the underlying assembly language instructions that actually get executed.
I am not familiar with Scheme.
As for the C/C++ question, I consider it a wash, since most people use just a subset of C++ anyway, so you might as well learn C in a C++ compiler, since that's probably how it'll be done later anyway. In my opinion, teaching procedural computing is still important, so people should start with simple procedural (non-OO) programming and move up later to OO. I think starting with OO is inviting disaster. You wouldn't believe the number of people I've run into in CS classes at my university who are absolutely abysmal programmers, because once you take away Java/C++/C#/VB and their precious object-oriented nature, they can't think.
I say get down into the trenches and learn it the way people have been learning it since the '70s. Be it C, FORTRAN, etc., learn the basics, then move up. Once you master merge sort, linked lists, binary trees, etc., move on up to classes and actual objects.
Doing it this way makes it easier to go back and adapt later. Someone versed in C, and by extension C++ concepts, can pick up almost any language (except Lisp, Prolog, etc.) usually in a matter of hours.
Not only that, but their syntax is so widespread that it turns up nearly everywhere. PHP, shell scripts, Perl, and even hardware description languages follow C/C++ syntactic conventions. (For instance, for one class I had to debug a MIPS R3000 processor synthesized completely in Verilog, which is a very C-like 'language'.)
I say, why start with Java or C#? Not everything needs to be object-oriented. In a world where more and more devices are embedded, my feeling is there's going to be a shortage of people with experience in the grunt work. Of all the people in my CS classes, I doubt 80% of them could read and understand basic MIPS, x86, or PowerPC assembly code, or for that matter even basic C libraries.
In designing a lot of embedded applications, every bit, every line of code, every transistor counts. Why not learn to do it the hard way, then spend a few minutes adapting to the new way (Java, C#, etc.)? At least this way you'll be able to do both. On one hand, you'll be able to use the powerful features of the OO languages, and on the other, you'll still be a competent, resource-conscious, efficiency-minded programmer who can optimize code with the best of them.
JMHO