Nootropic Posted May 17, 2007 So a few (well, more than a few) months back, my calculus teacher had us look at how to formulate the derivatives of the inverse trig functions. "Look" here meant spending about two hours investigating (which I would happily do if I ever had time...). Being in a rush, I instead had a thought: the tangent lines of two functions that are inverses of one another should be inverses as well. Little did I know this would turn into an hour-long discussion in my class... well, mostly me explaining. Anyhow, I didn't mention it to my teacher until today. He had originally told me he had never heard of the tangent lines being inverses, but today he said he was pretty sure I was right and that he had written up a proof himself, and he was interested in seeing how I would formulate one. So afterwards I went to the library to work one out, which wasn't all too difficult (I'll write it up, scan it, and post it, given my lack of LaTeX knowledge). But I'm curious whether anyone else has heard of this, because I haven't been able to find any information about it. I would think most people wouldn't look at the tangent lines and would go straight to the derivatives. Oh, and as a sub-topic, has anyone ever seen a Taylor series for y = x^x? I attempted one, but the derivatives became long and laborious, and I have a short attention span. I expanded about x = 1; f(1) is obviously 1, and I think I got as far as the fourth derivative before going off to do something else. This was after I assumed the derivatives evaluated at x = 1 would continue to be integers, which, if I remember correctly, is not the case.
Jay-qu Posted May 18, 2007 I'm not sure what you mean... do you mean an equation for the tangent to the function at an arbitrary point?
Qfwfq Posted May 18, 2007 Well, the derivative is the tangent of the angle that the tangent line forms with the x axis, and for a function and its inverse these two angles sum to a right angle (the slopes are reciprocals, since the two graphs are reflections of each other across the line y = x).
sanctus Posted May 18, 2007 I don't really see a use for a Taylor series of x^2, because usually you use a Taylor series to transform a given function into a power series, but here you already have a power series (with all the coefficients zero apart from the coefficient of the x^2 term, which is one). But anyway, you would have:[math]x^2=1+2(x-1)+\underbrace{\frac{2}{2!}}_{=1}(x-1)^2+0[/math]which actually gives an exact result (i.e. compute the right side and you'll see it gives exactly x^2). This can be another reason why you never saw such a Taylor expansion. Finally, I have to add that you must have made a mistake somewhere, because (x^2)' = 2x, (x^2)'' = 2, and [math](x^2)^{(n)}=0[/math] for all n > 2, so I don't know how you got to a term in the fourth derivative.
Nootropic Posted May 19, 2007 I don't really see a use for a Taylor series of x^2... You misread the function: not y = x^2, but y = x^x, which most certainly has a fourth derivative. The derivatives aren't insanely difficult to calculate, especially with logarithmic differentiation (which is really just shorthand, but most argue it provides an easier route than writing f(x) = e^(ln(f(x))) and differentiating from there). I'm not sure what you mean... do you mean an equation for the tangent to the function at an arbitrary point? I wasn't really clear there. Consider the point (a, b) with b = f(a) on the original function f(x), and the tangent line to f at x = a: y = f'(a)(x - a) + f(a). On the inverse function g(x) we have g(b) = a (this follows directly from the definition of an inverse: the domain and range swap). That point has a tangent line at x = b: y = g'(b)(x - b) + g(b). You can use the chain rule to show that g'(b) = 1/f'(a), and from the definitions above, g(b) = a and b = f(a); substituting these in shows that this line is exactly the inverse of the line tangent to the graph of f(x) at x = a.
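A quick numeric sketch of the claim in the post above. The choice f(x) = e^x with inverse g(x) = ln x is my own illustration (not from the thread); the check is that composing the two tangent lines returns the input, i.e. the tangent lines are themselves inverse functions.

```python
import math

# Illustrative example: f(x) = e^x, g(x) = ln x, tangency point a (arbitrary).
a = 0.7
b = math.exp(a)                      # b = f(a)

fa, fpa = math.exp(a), math.exp(a)   # f(a) and f'(a)
gb, gpb = math.log(b), 1.0 / b       # g(b) = a and g'(b)

def tangent_f(x):
    # tangent line to f at x = a:  y = f'(a)(x - a) + f(a)
    return fpa * (x - a) + fa

def tangent_g(x):
    # tangent line to g at x = b:  y = g'(b)(x - b) + g(b)
    return gpb * (x - b) + gb

# g'(b) = 1/f'(a), as the chain rule predicts
assert abs(gpb - 1.0 / fpa) < 1e-12

# composing the two tangent lines gives back the input,
# so each line is the inverse of the other
for x in [0.1, 0.5, 1.3]:
    assert abs(tangent_g(tangent_f(x)) - x) < 1e-9
print("tangent lines are inverses (numerically)")
```

The composition works out exactly here because g'(b) = 1/f'(a) and the lines pass through the mirrored points (a, b) and (b, a).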
Jay-qu Posted May 20, 2007 This makes sense graphically, but what are you getting at? Do you propose it is of some consequence, or is it just an interesting proof?
Nootropic Posted May 20, 2007 Taylor series:[math]f(x) = x^x = 1 + x \cdot \ln x + \frac{(x \cdot \ln x)^2}{2} + \frac{(x \cdot \ln x)^3}{6} + \dots + \frac{(x \cdot \ln x)^n}{n!} + \dots[/math] [math]f(x) = x^x = \sum_{n=0}^\infty \frac{(x \cdot \ln x)^n}{n!}[/math] Reference: Taylor series - Wikipedia, the free encyclopedia The problem with this series is that y = x^x has a discontinuity at x = 0 (a removable discontinuity, but a discontinuity nonetheless). Inserting the expression x ln(x) into the Taylor series for y = exp(x) about x = 0 will not work, since the Maclaurin series (the Taylor series about x = 0) gives exact information at x = 0. On top of that, this series, while not impossible to integrate (applying integration by parts is plausible), certainly does not make integration easy. This makes sense graphically, but what are you getting at? Do you propose it is of some consequence, or is it just an interesting proof? It's not of great consequence, or even anything big; it's more of an interesting proof, noting that there are always things which people may overlook. I suppose one of the few things it might do is let someone write a tangent line a bit quicker, but who knows, I'm no mathematician... yet.
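The series above can be checked numerically for x > 0. This sketch (my own, with an illustrative function name and a fixed term count) sums partial sums of the quoted series and compares them against x**x directly.

```python
import math

def xx_series(x, terms=30):
    """Partial sum of x^x = exp(x ln x) = sum_{n>=0} (x ln x)^n / n!, x > 0."""
    u = x * math.log(x)
    total, term = 0.0, 1.0          # term holds u^n / n!, starting at n = 0
    for n in range(terms):
        total += term
        term *= u / (n + 1)         # advance u^n/n! -> u^{n+1}/(n+1)!
    return total

for x in [0.25, 1.0, 2.0, 3.0]:
    assert abs(xx_series(x) - x ** x) < 1e-9
print("series matches x**x for sample x > 0")
```

The factorial in the denominator makes convergence fast for moderate x; the x = 0 endpoint is exactly where the ln x inside breaks down, as the post notes.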
CraigD Posted May 21, 2007 [math]f(x) = x^x = \sum_{n=0}^\infty \frac{(x \cdot \ln x)^n}{n!}[/math]From the Taylor series,[math]f(x) = x^x[/math][math]f'(x) = (1 + \ln x)x^x[/math]Like the Taylor series, it's not defined at x = 0. I suspect f'(x) isn't, either, though a proof is beyond me. The derivative of [math]x^x[/math] is interesting, though, and provides an identity I've either not seen or don't remember,[math]\ln x = \lim_{p \rightarrow 0} \frac{(x+p)^{x+p} - x^x}{px^x} - 1[/math], but I can't immediately see that it's very useful. Taking p = 1 when x is very large,[math]\ln x \dot= \frac{(x+1)^{x+1} - x^x}{x^x} - 1[/math], which hints at some sort of rational approximation using Pascal's triangle. Mostly, though, this hints gloomily at how out-of-practice I've become at calculus. What you don't use, you lose :(
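The limit identity above follows from the difference quotient for x^x: since the derivative is x^x(1 + ln x), dividing the forward difference by p·x^x and subtracting 1 should tend to ln x. A hedged numeric check (function name and step sizes are my own choices):

```python
import math

def approx_ln(x, p):
    # difference-quotient form of the identity:
    # ln x ~ ((x+p)^(x+p) - x^x) / (p * x^x) - 1  as p -> 0
    return ((x + p) ** (x + p) - x ** x) / (p * x ** x) - 1.0

for x in [0.5, 2.0, 5.0]:
    # shrinking p drives the approximation toward ln x
    assert abs(approx_ln(x, 1e-6) - math.log(x)) < 1e-4
print("difference quotient approaches ln x")
```

The error shrinks roughly linearly in p (it is a one-sided difference), so p cannot be taken too small in floating point before cancellation dominates.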
Nootropic Posted May 22, 2007 One way to fix the problem with discontinuities in the Taylor series is to write an expansion for exp(u) about u = 1. The nth derivative of exp(u) is just exp(u), so every coefficient involves exp(1) = e, and the general term becomes e(u - 1)^n / n!. Insert u = x ln(x) into that expression and you have a bona fide Taylor series. Not the prettiest, but it's a Taylor series nonetheless.
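A sketch of the proposal above (my own code; the function name is illustrative): expand exp(u) about u = 1, giving exp(u) = sum of e(u - 1)^n / n!, then substitute u = x ln x. Since the exponential's Taylor series converges everywhere, this reproduces x^x for any x > 0.

```python
import math

def xx_via_exp_about_1(x, terms=40):
    """x^x via the expansion of exp(u) about u = 1, with u = x ln x."""
    u = x * math.log(x)
    total, term = 0.0, math.e       # term holds e * (u-1)^n / n!, starting at n = 0
    for n in range(terms):
        total += term
        term *= (u - 1.0) / (n + 1)
    return total

for x in [0.5, 1.0, 2.5]:
    assert abs(xx_via_exp_about_1(x) - x ** x) < 1e-9
print("expansion about u = 1 also reproduces x**x")
```

One caveat worth noting: as x tends to 0 from the right, u = x ln x tends to 0 and the series tends to exp(0) = 1, matching the removable-limit value, but at x = 0 exactly the ln x inside is still undefined.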
CraigD Posted May 23, 2007 [math]f(x) = x^x = \sum_{n=0}^\infty \frac{(x \cdot \ln x)^n}{n!} \; \; \; x > 0[/math]The problem with this series is that y = x^x has a discontinuity at x = 0 (a removable discontinuity, but a discontinuity nonetheless). [math]f(x) = g(h(x)) = \sum_{n=0}^{\infty} \frac{e(x \cdot \ln x - 1)^n}{n!} = x^x[/math]How has the latter series fixed the discontinuity at [math]x=0[/math]? [math](\ln 0) - 1[/math] is as undefined as [math]\ln 0[/math]. Please forgive my ignorance, but I'm confused by what is meant (by implication) by the derivative of the function [math]f(x) = x^x[/math] being removable, fixable, or otherwise made not undefined. [math]f(x)[/math] is clearly defined at [math]x=0[/math] in its [math]x^x[/math] form, but I don't think [math]f'(x)[/math] can be expressed with a definite value at [math]x=0[/math]. [math]f(x)[/math] appears to me to be simply a function defined at [math]x=0[/math] whose derivatives ([math]f'(x)[/math], [math]f''(x)[/math], etc.) are not. Its graph is a "fishhook" with its left "point" at (0,1) with a slope of [math]-\infty[/math], its minimum at [math](e^{-1}, e^{-e^{-1}}) \dot= (0.3679, 0.6922)[/math], and its right "shank" vanishing into [math](\infty,\infty)[/math] with a slope of [math]+\infty[/math]. [math]f'(x)[/math] is no more defined at [math]x=0[/math] than at [math]x=+\infty[/math].
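The "fishhook" description above can be verified numerically (my own sketch, using the derivative f'(x) = x^x(1 + ln x) quoted earlier in the thread): the minimum sits at x = 1/e, and the slope diverges toward minus infinity approaching 0 from the right.

```python
import math

def f(x):
    return x ** x

def fp(x):
    # derivative from the thread: f'(x) = x^x (1 + ln x), valid for x > 0
    return x ** x * (1.0 + math.log(x))

x_min = 1.0 / math.e
assert abs(fp(x_min)) < 1e-9                              # slope zero at the minimum
assert abs(f(x_min) - math.e ** (-1.0 / math.e)) < 1e-9   # minimum value e^(-1/e)

# the slope is unbounded below near 0: x^x -> 1 while (1 + ln x) -> -inf
assert fp(1e-4) < fp(1e-2) < fp(1e-1) < 0
print("minimum at (1/e, e^(-1/e)); slope diverges to -inf near 0")
```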
Qfwfq Posted May 23, 2007 I'm confused by what is meant (by implication) by the derivative of the function [math]f(x) = x^x[/math] being removable, fixable, or otherwise made not undefined.He said the discontinuity is removable, also called "of third species". It means the limit is not infinite (and the right and left limits agree), so the function may be extended by continuity. While [math]\lim_{x\rightarrow0^+}\ln x = -\infty[/math] and thus isn't a removable discontinuity, [math]\lim_{x\rightarrow0^+}(\ln x)^{-1} = 0[/math], not infinity. P.S. I wouldn't say, though, that the discontinuity of [math]x^x[/math] at 0 is removable.
Nootropic Posted May 23, 2007 When I said there was a "removable discontinuity," I meant that, upon direct substitution, the limit of f(x) takes the indeterminate form 0^0. To verify that the limit exists and is finite, a combination of l'Hopital's rule and logarithmic manipulation can be used. Before doing this, the function y = x^x can be defined as f(x) = x^x for all x not equal to 0, and 1 for x = 0. The above method can then be used to verify the continuity of f(x) at x = 0.
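The limit claimed above, x^x tending to 1 as x approaches 0 from the right, is easy to watch numerically (a sketch of mine, not a substitute for the l'Hopital argument):

```python
# along x -> 0+, x^x = exp(x ln x) -> exp(0) = 1,
# so defining f(0) = 1 extends f continuously from the right
values = [x ** x for x in (1e-2, 1e-4, 1e-8, 1e-12)]
for v in values:
    print(v)

# the values approach 1 as x shrinks
assert all(abs(v - 1.0) < 0.05 for v in values)
assert abs(values[-1] - 1.0) < 1e-10
```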
Qfwfq Posted May 24, 2007 I could agree if you're considering only non-negative values of x but, if you consider all real values, then the right and left limits have opposite signs: 1 and -1.
Nootropic Posted May 24, 2007 That's true for negative integers, but for non-integer rational and irrational x, like (-1/2)^(-1/2) and (-1/pi)^(-1/pi), I can't say whether there's a limit or not, at least on the real line.
Qfwfq Posted May 25, 2007 Well, OK :doh: in my hurry, and without even writing the substitution on paper, I had confused [math](-x)^{-x}[/math] with [math]-x^{-x}[/math]. I can't say whether there's a limit or not, at least on the real line.Well, I can. There is a left limit, although obviously the function isn't real-valued for x < 0 and isn't even univocally defined there (but all choices have the same x --> 0 limit). Just think a bit more carefully... ;)
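One way to see the left limit Qfwfq alludes to (my own sketch, using Python's complex math and the principal branch; the function name is illustrative): for t > 0, the principal value of (-t)^(-t) is exp(-t(ln t + i*pi)), and both the real factor and the phase tend to 1 as t shrinks, so every branch choice shares the limit 1.

```python
import cmath

def neg_power(t):
    # principal-branch value of (-t)^(-t) for t > 0:
    # exp(-t * Log(-t)) = exp(-t * (ln t + i*pi))
    return cmath.exp(-t * cmath.log(-t + 0j))

for t in (1e-2, 1e-4, 1e-6):
    print(t, neg_power(t))

# the values tend to 1 + 0i as t -> 0+
assert abs(neg_power(1e-6) - 1.0) < 1e-4
```

Other branches differ only by a factor exp(-2*pi*i*k*t), which also tends to 1, consistent with the remark that all choices have the same limit.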