
Cesium Magnetometer Sensor Bandwidth

Topic starter  

The subject of "Bandwidth" comes up often when discussing cesium magnetometers. There are two distinct meanings of bandwidth here that need to be kept separate:

The cesium magnetometer uses an atomic resonance of the Cs-133 atom (see note 1 below) whose frequency varies in proportion to the ambient magnetic field. This atomic resonance is used to set/control the frequency of an oscillator. Therefore the output signal from the magnetometer is a *frequency* which is proportional to the earth's magnetic field at a coefficient of 3.498572 Hz per nT. Thus the output frequency (called the Larmor frequency) varies from roughly 70 kHz at the equator to 350 kHz at the poles.
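The field-to-frequency relationship above is just a linear scaling, which a short sketch makes concrete (the function names are mine; the 3.498572 Hz/nT coefficient and the rough equator/pole field values are from the text):

```python
# Conversion between ambient field and Larmor output frequency for a
# cesium magnetometer, using the 3.498572 Hz/nT coefficient cited above.
COEFF_HZ_PER_NT = 3.498572

def field_to_larmor_hz(field_nt: float) -> float:
    """Larmor output frequency (Hz) for a given ambient field (nT)."""
    return field_nt * COEFF_HZ_PER_NT

def larmor_to_field_nt(freq_hz: float) -> float:
    """Ambient field (nT) recovered from the Larmor frequency (Hz)."""
    return freq_hz / COEFF_HZ_PER_NT

# Roughly 20,000 nT at the equator and 100,000 nT near the poles:
print(field_to_larmor_hz(20_000))   # ~70 kHz
print(field_to_larmor_hz(100_000))  # ~350 kHz
```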

Because the cesium magnetometer is an oscillator, and because phase is important in an oscillator, the "Bandwidth" of the electronics in the magnetometer must be at least ten times the maximum output frequency of 350 kHz, or roughly 3.5 MHz.

This bandwidth should not be confused with the magnetic field measurement "Bandwidth", i.e. how fast a magnetic field change can be measured. To put a scalar value on any magnetic field reading, the output frequency of the magnetometer must be counted and then scaled appropriately to get a field reading in nanoteslas. The counting process involves opening a gate period, counting the number of Larmor (frequency) cycles that occur, dividing that count by the precise time interval of the gate period, and then scaling the result by dividing by the 3.498572 Hz/nT Larmor coefficient. You get one reading per gate period, which by default is five or ten per second (a 200 ms or 100 ms gate period). What you get for a reading during any gate period is the time-interval average of the Larmor frequency over that period.
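The count-then-scale steps above can be sketched in a few lines (the function name and the worked numbers are mine, derived from the 3.498572 Hz/nT coefficient in the text):

```python
# One gated reading: count Larmor cycles during the gate, divide by the
# gate time to get frequency, then divide by the coefficient to get nT.
def field_from_gated_count(cycle_count: float, gate_s: float,
                           coeff_hz_per_nt: float = 3.498572) -> float:
    """Time-interval-averaged field (nT) from one gate period."""
    avg_freq_hz = cycle_count / gate_s
    return avg_freq_hz / coeff_hz_per_nt

# A 100 ms gate (10 readings per second): a 50,000 nT field yields
# 174,928.6 Hz, i.e. about 17,492.86 Larmor cycles in one gate.
print(field_from_gated_count(17_492.86, 0.100))  # ≈ 50,000 nT
```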

The transfer function of a "time-interval averaged" signal is [sin(x) / x], with the first zero falling at the sample frequency. Thus if the G-882 is sampling at 10 Hz, the maximum resolvable magnetic field change is roughly 5 Hz.
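A short sketch of that sin(x)/x response, evaluated at a few signal frequencies (the function name is mine; the shape follows from averaging over a gate period equal to one sample interval, as described above):

```python
import math

def tia_response(f_signal_hz: float, sample_rate_hz: float) -> float:
    """Magnitude response of a time-interval-averaged measurement:
    |sin(pi*f*T) / (pi*f*T)| with gate period T = 1/sample_rate.
    The first zero falls at f = sample_rate."""
    x = math.pi * f_signal_hz / sample_rate_hz
    return 1.0 if x == 0 else abs(math.sin(x) / x)

# At a 10 Hz sample rate, a 5 Hz field variation is already attenuated:
print(tia_response(5, 10))   # ≈ 0.637 (2/pi)
print(tia_response(10, 10))  # ≈ 0 (first null at the sample frequency)
```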

The sample interval of the G-882 is adjustable by sending commands to it. If the sample rate is set to 20 Hz, the measurement bandwidth doubles (relative to a 10 Hz sample rate), but the baseline noise goes up as well.

It should also be noted that the basic system noise level of the G-882 for a stationary sensor is set by the counter resolution, not by the signal-to-noise ratio of the oscillator electronics. If the sensor is tilted away from its optimum orientation, the magnetometer signal (and therefore the signal-to-noise ratio) will decrease, but the counted field output will not show any significant degradation until the sensor approaches the dead zone (where the signal is really low).
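A rough sketch of why counter resolution sets the noise floor, and why a shorter gate (higher sample rate) makes it worse. The counter architecture and timebase frequency here are my assumptions for illustration, not G-882 specifications:

```python
# Assumed: a reciprocal counter whose one-count uncertainty scales with
# Larmor frequency and inversely with (gate time * timebase clock).
TIMEBASE_HZ = 100e6          # hypothetical counter clock, NOT a G-882 spec
COEFF_HZ_PER_NT = 3.498572   # Larmor coefficient from the text

def counter_resolution_nt(larmor_hz: float, gate_s: float) -> float:
    """One-count frequency uncertainty mapped into nT for one gate period."""
    delta_f_hz = larmor_hz / (gate_s * TIMEBASE_HZ)   # +/- 1 timebase count
    return delta_f_hz / COEFF_HZ_PER_NT

# Same ~175 kHz Larmor signal, 10 Hz vs 20 Hz readings: halving the gate
# period doubles the per-reading resolution limit.
print(counter_resolution_nt(175e3, 0.100))  # 100 ms gate
print(counter_resolution_nt(175e3, 0.050))  # 50 ms gate, twice as coarse
```

Note that in this model the resolution depends on the Larmor frequency and gate time but not on the oscillator's signal amplitude, which is consistent with the observation above that tilting the sensor does not visibly degrade the counted output until the dead zone is approached.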