DZ wrote: Oh they do exist independent of man. Numbers do not just "do what we define them to do"... they appear to, but we can be wrong, theories are overturned all the time. Mathematics evolves, we realize that we were wrong. For all we know 2+2=4 could be wrong if someone proves it so. You are arrogant to assume that all current mathematics is "right", just as all previous mathematicians presumed about their ideas. 100 years from now someone could prove that all calculus is a pile of horse shit.

What? I'm sorry for the confusion, DZ, but you a) don't seem to really know what you're talking about and consequently b) don't seem to have understood my reply in the first place. Our modern Calculus is built entirely upon how we choose to define and understand the real numbers (Dedekind cuts, at the moment), which is, at the very fundamentals, an arbitrary thing - based upon the observations we feel like categorizing thusly. There are assumptions (logical, sound, but assumed nonetheless) at the core of mathematics and, like I said, there aren't actually numbers in nature. There is no math in nature. Numbers are just symbols that we use to artificially group and represent the transient and transcendent physical forces that we can observe; this allows the human recognition, classification, and prediction thereof.

We simply pattern the logic (re: theory) by which we manipulate these arbitrary creations (re: mathematics) after the interactions of these forces as best we can - keeping what seems to work, and marginalizing what does not. Where mathematics is concerned, what we are right or wrong about is simply how our **creation** (re: math and *numbers*) applies to the forces in nature - which, again, aren't numbers or mathematics. Calculus, therefore, functions and is navigable *as we define it* to be. There is no system floating in the aether - there are only physical properties, trends, actions, reactions, and a host of unknowns. And even saying that may not be entirely accurate, because that is just how we quantify what we observe from our linear, physical perspective.

Math is to Nature as Language is to Nature. If I point to a shovel and say "this is a shovel", it isn't as if the word shovel, or my understanding of what a shovel is or does, is somehow intrinsic to the object. The letters come together to form the word "shovel" and then relate to that object *only* because the man-made structure of English specifies that they do. "S" corresponds to its sound because someone said that it does. Putting "s" and "h" together makes a specific blend because someone said that it should. Math, as a system, is the same. The data is taken from the physical environment, but the system operates only as we have constructed it to operate. We make it mimic the physical world in a way that we can sort - it is not actually the physical world.

And I really wish people would stop using that stupid 2+2=4 example. You can make 2+2 equal ElephantAndCheese if you feel like it. It isn't some tangible piece of the universe; it is merely a component of the logic that we have fabricated and use to interpret and describe the physical universe that we can observe. It isn't actually out there somewhere - it is in our heads. You don't prove it wrong any more than you prove a screwdriver wrong. It's a tool.

DZ wrote: What is a limit then?

Let us say that an open interval (g, h) contains a number x₀, and that a function **F**(x) is defined on (g, h), except possibly at x₀ itself. The limit L of **F**(x) at x₀ is the value, determined by the surrounding/nearby x-values, that **F**(x) closes in on as x approaches x₀. **F**(x) can be made to go as close to L as one feels like putting it without x ever actually reaching x₀ - there becomes no applicable mathematical difference between the two. And blah blah blah, I'm not going to go into detail about the epsilons and deltas.
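For anyone who does want the epsilon/delta detail being glossed over here, the standard formal statement can be sketched like this (same g, h, x₀, F, and L as above):

```latex
% L is the limit of F(x) as x approaches x_0, with F defined on (g, h):
\lim_{x \to x_0} F(x) = L
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0 \;\exists \delta > 0 :\;
0 < |x - x_0| < \delta \;\Rightarrow\; |F(x) - L| < \varepsilon
```

Read it as: however small a tolerance ε you demand around L, there is some neighborhood of x₀ (excluding x₀ itself) whose F-values all land within that tolerance.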

This relates to our example in a specific way. Limits are used in the formal proof of the relationship, for reasons of uniformity, but that is not even necessary. Some people can't seem to understand that we aren't simply talking about some {.99...} infinite sum, pulled out of the air. What you see here is merely an imperfection in decimal notation - a representation - at which we arrive by dividing the whole 9 by itself. The infinite number {.99...} is derived entirely from an operation that **already equals 1** (re: **not** some infinite sum). Thus, as we try to move backwards from 1 into {.99...}, we discover that we always remain at 1 because there is in fact no difference between the two. There *can be no distance between infinity and a 1 that lies at the 'end' of it*. It isn't an infinitely small distance, it is NO DISTANCE. It's impossible. They MUST BE representative of the same number, according to the logic of Calculus. That makes them the same number. That's all there is to it, and debating the point in this instance is purely philosophical (re: not mathematical).
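If anyone wants to watch the "no distance" claim happen numerically, here is a minimal sketch using Python's `fractions` module (exact rational arithmetic, so floating-point rounding can't be blamed): the gap between 1 and the truncated decimals 0.9, 0.99, 0.999, ... shrinks below any tolerance you name, which is exactly what a limit says.

```python
# Sketch: the exact distance between 1 and the truncated decimals
# 0.9, 0.99, 0.999, ... computed with rational arithmetic.
from fractions import Fraction

partial = Fraction(0)
for k in range(1, 21):
    partial += Fraction(9, 10 ** k)   # append another 9 to 0.999...9
    gap = 1 - partial                 # exact distance still left to 1
    assert gap == Fraction(1, 10 ** k)

# After n digits the gap is exactly 1/10**n; it never settles at any fixed
# positive number, so the full {.99...} can differ from 1 by nothing at all.
print(gap)  # 1/100000000000000000000
```

Every truncation falls short of 1 by exactly one part in 10ⁿ; the untruncated {.99...} falls short by nothing, which is the whole point.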