Point A is located at (0, 5) and B is located at (10, 0). Point P(x, 0) lies on the line segment OB, with O(0, 0). The coordinates of P that make the length AP + PB minimal are ....
A. (3, 0)
B. (3 1/4, 0)
C. (3 3/4, 0)
D. (4 1/2, 0)
E. (5, 0)
What I did:
f(x) = AP + PB = [tex]\sqrt{5^2+x^2}+(10-x)=\sqrt{25+x^2}+10-x[/tex]
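(To make sure the setup itself is right, I also compared this closed form against the raw distance formula numerically; a quick Python sketch, not part of the original problem:)

[code]
import math

A, B = (0, 5), (10, 0)

def f_closed(x):
    # my closed form: sqrt(25 + x^2) + 10 - x
    return math.sqrt(25 + x**2) + 10 - x

def f_distances(x):
    # AP + PB computed directly from the distance formula
    return math.dist(A, (x, 0)) + math.dist((x, 0), B)

for x in [0, 2.5, 5, 7.5, 10]:
    print(x, f_closed(x), f_distances(x))  # both columns agree
[/code]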
To minimize AP + PB, I set [tex]f'(x) = 0[/tex]:
[tex]\frac12(25+x^2)^{-\frac12}(2x)+(-1)=0[/tex]
[tex]\frac{x}{\sqrt{25+x^2}}=1[/tex]
[tex]x=\sqrt{25+x^2}[/tex]
[tex]x^2=25+x^2[/tex]
This is where I got stuck. Subtracting [tex]x^2[/tex] from both sides leaves 0 = 25, which is obviously impossible. Where did I go wrong?
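In case it helps, here is a quick numerical look at f and f' over [0, 10] (my own rough Python check, assuming the setup above):

[code]
import math

def f(x):
    return math.sqrt(25 + x**2) + 10 - x

def fprime(x):
    # the derivative I computed above: x / sqrt(25 + x^2) - 1
    return x / math.sqrt(25 + x**2) - 1

for x in [0, 2, 4, 6, 8, 10]:
    print(f"x = {x:4.1f}   f(x) = {f(x):7.4f}   f'(x) = {fprime(x):+.4f}")
# f'(x) comes out negative at every sample point and f keeps decreasing,
# which seems consistent with f'(x) = 0 having no solution
[/code]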