The pre-calculus definition is:
0.333.... = 1/3
Calculus definition:
The partial sums of sum(3/10^n), for n = 1, 2, 3, ..., can be made arbitrarily close to 1/3 as n tends to infinity; the limit of those partial sums is 1/3.
When you read them, you can see a subtle difference in the definitions. The first definition treats infinity as something that can be quantified, whereas the latter treats it as a "concept" that can be used to give you an arbitrarily close approximation.
To me, it just doesn't make sense to say "when you add an infinite number of 3's, you get 1/3" - at least not in a useful sense - because infinity is a never-ending thing. It IS a concept. When you put the infinity symbol (or some version of it) directly into your equations, I think your equations break down very fast. And that is why the limit notation exists.
The limit just says that (for this situation) you can make the sum arbitrarily close to the limit as n approaches infinity, where n is the number of 3's in this case.
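As an illustration of what that limit statement means (my own sketch, not from any particular textbook), here are the partial sums 0.3, 0.33, 0.333, ... computed with exact rational arithmetic. The error below 1/3 shrinks by a factor of 10 with each extra 3, so it can be made smaller than any tolerance you pick.

```python
from fractions import Fraction

def partial_sum(n):
    """Exact value of 3/10 + 3/100 + ... + 3/10^n (n threes after the point)."""
    return sum(Fraction(3, 10**k) for k in range(1, n + 1))

# Each extra term divides the remaining gap to 1/3 by 10:
# the gap after n terms is exactly 1/(3 * 10^n).
for n in (1, 5, 10):
    gap = Fraction(1, 3) - partial_sum(n)
    print(f"n={n}: partial sum = {partial_sum(n)}, gap to 1/3 = {gap}")
```

No single partial sum ever equals 1/3; the limit is the number the gaps shrink toward.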
SAYING something like {0.333 with infinitely many 3's} = 1/3 might be an "ok" DEFINITION for some people, but IMO you can't really "do" anything with that definition by itself. I mean, how do you work with "infinities"?
What's that famous grade-school fraction trick:
x=0.333....
10x = 3.333....
10x - x = 3
9x = 3
x = 3/9 = 1/3
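For what it's worth, the series definition lets you reach the same answer without ever subtracting one infinite string from another: the partial sums form a finite geometric series, and the limit is taken only once, at the end. A sketch of that derivation:

```latex
S_N = \sum_{n=1}^{N} \frac{3}{10^n}
    = \frac{3}{10} \cdot \frac{1 - 10^{-N}}{1 - \tfrac{1}{10}}
    = \frac{1 - 10^{-N}}{3}
\qquad\Longrightarrow\qquad
\lim_{N \to \infty} S_N = \frac{1}{3}
```

Every step here manipulates a finite sum, which is exactly the kind of bookkeeping the "lim" notation was invented to make safe.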
But I don't think this is correct in a strict sense. Subtracting infinities from other infinities is a strange thing indeed. We are essentially treating them like finite sequences when we do this, IMO. Whenever you find yourself dividing infinities, subtracting infinities (which is the case here), or dealing with "infinities of different sizes", you are doing something wrong with your formulation, IMO.
That is why I think mathematicians introduced very specific notation to deal with infinite sequences and series - one of them being the limit ("lim").