Question : linear approximations and percentage error


This is the question:

Use the linear approximation method to derive approximations for the following functions by setting x to 0 and making dx very small:

If y = (1+x)^(1/3), show that (1+dx)^(1/3) ~ 1 + dx/3

Determine the percentage error introduced by these approximations for dx = 0.01, 0.1 and 0.5

I was able to use a linear approximation to show the first part, but I didn't know how to do the second bit about percentage error. I didn't really know what it meant.


Answer : linear approximations and percentage error

You can calculate from the original equation, y = (1+x)^(1/3), what y is when dx = 0.01, i.e. compute (1.01)^(1/3). That is your base (exact) number.
The approximate number is 1 + dx/3. Calculate it, subtract the base number, divide by the base number, and multiply by 100. That is your percentage error. Repeat with dx = 0.1 and dx = 0.5.
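The recipe above is easy to check numerically. A minimal Python sketch (not part of the original answer, just an illustration of the arithmetic):

```python
# Percentage error of the linear approximation (1 + dx)^(1/3) ~ 1 + dx/3
for dx in (0.01, 0.1, 0.5):
    exact = (1 + dx) ** (1 / 3)   # base number from the original equation
    approx = 1 + dx / 3           # linear approximation
    pct_error = abs(approx - exact) / exact * 100
    print(f"dx = {dx}: percentage error = {pct_error:.4f}%")
```

For dx = 0.01 the error is only about 0.001%, while for dx = 0.5 it grows to roughly 1.9%, which shows how the approximation degrades as dx gets larger.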