Distance Error Caused by GPS Receiver Error

Post by Guest » Sun Sep 30, 2018 1:44 pm

Not sure how to categorize this, and I feel like it should be pretty simple, but I'm a little confused.

GPS uses at least three satellites to find the position of a receiver. The satellite clocks are all synchronized to atomic time; however, the receiver clock is fast by 0.05 microseconds. How much error in the distance measurement between a satellite and the receiver is induced by a receiver clock that is fast by 0.05 microseconds?

No measurements are given besides the 0.05 microseconds. At first I thought I could pick an arbitrary speed (say 13,000 meters per minute) for the equation, but the receiver measures its distance from each satellite as the signal's travel time multiplied by the signal's speed, and the GPS signal is a radio wave, so it travels at the speed of light, c ≈ 3 × 10^8 m/s. So wouldn't it be:

3 × 10^8 m/s × 1 s/1,000,000 microseconds × 0.05 microseconds = 15 m

so the distance error would be about 15 m for a receiver clock error of 0.05 microseconds?
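Here's a quick sanity-check of that arithmetic in Python (the variable names are just my own; the only physics assumed is that range error = speed of light × clock error):

```python
# Sanity check: distance error caused by a receiver clock offset.
# A GPS receiver computes its range to a satellite as
#   range = c * (measured signal travel time),
# so a clock error dt translates directly into a range error of c * dt.

C = 299_792_458.0   # speed of light in vacuum, m/s
dt = 0.05e-6        # receiver clock error: 0.05 microseconds, in seconds

range_error = C * dt  # meters
print(f"range error = {range_error:.2f} m")  # about 15 m
```

Using the rounded value c = 3 × 10^8 m/s gives exactly 15 m; the precise value of c gives about 14.99 m.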

Thanks for the help!
Guest