I'm not sure how to categorize this, and I feel like it should be pretty simple, but I'm a little confused.
GPS uses at least three satellites to find the position of a receiver. The satellite clocks are all synchronized to atomic time; however, the receiver clock is fast by 0.05 microseconds. How much error in the distance measurement between a satellite and the receiver is induced by the receiver clock being fast by 0.05 microseconds?
No measurements are given besides the 0.05 microseconds. So, assuming I can pick an arbitrary speed for the signal (say, 13,000 meters per minute) to use in the equation, wouldn't it be:
13,000 m/min = 216.67 m/s
216.67 m/s × (1 s / 1,000,000 μs) × 0.05 μs = 1.083 × 10⁻⁵ m
so the distance error would be 1.083 × 10⁻⁵ m for a receiver clock error of 0.05 microseconds.
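For comparison, since GPS signals are radio waves that travel at the speed of light (not an arbitrary speed), I also tried plugging in c instead. A minimal sketch of that version of the calculation, assuming the only given value is the 0.05 μs clock error:

```python
# Distance error from a receiver clock that is fast by 0.05 microseconds,
# assuming the GPS signal travels at the speed of light.
c = 299_792_458.0   # speed of light in m/s
dt = 0.05e-6        # clock error: 0.05 microseconds, in seconds

error_m = c * dt    # distance error = speed × time error
print(error_m)      # roughly 15 meters
```

With the speed of light, the same 0.05 μs clock error corresponds to roughly 15 meters, rather than the tiny value my arbitrary-speed assumption gives.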
Thanks for the help!