# Calculation of phase error for GSM standard

Discussion in 'General Electronics Discussion' started by splatapus, Jul 24, 2013.

1. ### splatapus

Hi,

I was reading the article "Introduction to GSM and GSM mobile RF transceiver derivation" by Paul Kimuli [http://www.rfdh.com/ez/system/db/li..._and_GSM_mobile_RF_transceiver_derivation.pdf].

On page 2 (page 16 within the article) under the heading "Phase Error and frequency error", he says: "To measure phase and frequency error, test sets can be used to sample transmitted output of the devices under test to capture the actual phase trajectory. This is then demodulated and the ideal phase trajectory is derived mathematically."

I'm wondering why you would calculate the ideal phase trajectory from the demodulated transmitted output. Shouldn't the ideal phase trajectory be calculated from the raw data bits being fed into the transceiver? If the transceiver's transmitted output is completely off, then the demodulated signal will be completely off, and the calculated "ideal" phase trajectory will be completely off as well. The comparison between the ideal and actual phase trajectories would then be invalid, since the calculated "ideal" trajectory is itself wrong.

Thank you!

Last edited: Jul 24, 2013
2. ### shrtrnd

You're comparing apples and oranges.
I MEASURE phase error every day, with precision instruments.
I CALCULATE phase error pretty closely.
Your article (which I didn't read) probably tells you to MEASURE what you actually have,
and then CALCULATE the correction.

3. ### splatapus

I'm sorry I don't understand what you mean.

From my understanding of the quoted text from the article in my original post, it was saying this:
1) Take the transmitted output signal
2) Demodulate it
3) Using the demodulated signal compute the perfect phase trajectory
4) Compare perfect phase trajectory with actual phase trajectory
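The four steps above can be sketched numerically. This is my own illustration, not from the article: it assumes a simplified MSK-style model in which each bit advances the carrier phase by ±π/2 per symbol, with the signal sampled once per symbol. Real test sets work on the full GMSK waveform, but the measurement logic is the same.

```python
import math

def phase_trajectory(iq):
    """Unwrapped phase (radians) of a complex baseband sample sequence."""
    phases = [math.atan2(s.imag, s.real) for s in iq]
    unwrapped = [phases[0]]
    for p in phases[1:]:
        # wrap the phase step into (-pi, pi] before accumulating
        d = p - (unwrapped[-1] % (2 * math.pi))
        while d <= -math.pi:
            d += 2 * math.pi
        while d > math.pi:
            d -= 2 * math.pi
        unwrapped.append(unwrapped[-1] + d)
    return unwrapped

def ideal_from_bits(bits, start_phase=0.0):
    """Ideal MSK-style trajectory: each bit advances phase by +/- pi/2."""
    traj = [start_phase]
    for b in bits:
        traj.append(traj[-1] + (math.pi / 2 if b else -math.pi / 2))
    return traj

def measure_phase_error(iq_at_symbols):
    """Test-set style measurement: demodulate, rebuild ideal, compare."""
    actual = phase_trajectory(iq_at_symbols)               # step 1+2
    # demodulate: the sign of each phase step gives the bit
    bits = [actual[k + 1] - actual[k] > 0 for k in range(len(actual) - 1)]
    ideal = ideal_from_bits(bits, start_phase=actual[0])   # step 3
    errors = [a - i for a, i in zip(actual, ideal)]        # step 4
    rms = math.sqrt(sum(e * e for e in errors) / len(errors))
    peak = max(abs(e) for e in errors)
    return math.degrees(rms), math.degrees(peak), bits
```

Note the design point this makes concrete: demodulation only needs the *sign* of each phase step, so even a fairly large phase error (anything under ±90° per symbol here) still recovers the correct bits, and the rebuilt "ideal" trajectory is therefore correct.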

The problem is I don't understand why you would calculate the perfect phase trajectory from the demodulated signal. What if the transmitted output signal was already so wrong that the demodulated signal would be wrong as well? Wouldn't a better approach be to calculate the perfect phase trajectory from the raw data bits before they were fed into the transceiver?

I'm new to the world of telecommunications, BTW, so if you explain something, please keep it understandable.

Last edited: Jul 24, 2013
4. ### Laplace

The purpose of the test is to verify that the transmitted phase error is within the limits set by the GSM standard. A test set capable of recovering the ideal phase trajectory from the transmitted signal itself is more versatile and simpler to operate, since it doesn't need access to the transmitter's raw data bits. And a transmitter so far out of spec that correct recovery is impossible is not a problem, because the purpose of the test is to measure how well the signal stays within spec. The interesting question is: at what point does the test set stop giving a numerical phase-error measurement and instead indicate that the transmitted signal is too far out of spec?
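To make the pass/fail idea concrete: the GSM conformance limits are, as far as I recall (verify against 3GPP TS 45.005 before relying on them), 5° RMS and 20° peak phase error per burst. A sketch of the verdict a test set might report, given the two measured numbers:

```python
# Limits as I recall them from 3GPP TS 45.005 -- check the spec.
GSM_RMS_LIMIT_DEG = 5.0
GSM_PEAK_LIMIT_DEG = 20.0

def check_phase_error(rms_deg, peak_deg):
    """Return a test-set style verdict: the numeric results plus pass/fail."""
    ok = rms_deg <= GSM_RMS_LIMIT_DEG and peak_deg <= GSM_PEAK_LIMIT_DEG
    return {"rms_deg": rms_deg, "peak_deg": peak_deg, "pass": ok}
```

Whether the numbers are 2° or 19°, the demodulation-based measurement still works; only when the error is so gross that bit decisions flip does the numeric result lose meaning, and by then the device has failed by a wide margin anyway.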

5. ### splatapus

Ah, you're right. I guess this test only checks that the transmitted signal is within the phase-error and frequency-error limits. Detecting that the transmitted signal is too far out of spec would probably fall to something like a modulation-quality or Error Vector Magnitude (EVM) test?

Thanks, I understand now.