Post by leviw on Apr 1, 2017 8:56:43 GMT -7
Hi everyone, we're having a problem with our antenna tracking system: it's not very accurate. We've been struggling with this for a couple of weeks now, trying different things, but haven't found a solution yet. I'm hoping someone can give us some advice.

The setup: the antenna tracker is set up in a field near a parking lot. It's level, and the Arduino has a GPS lock (it blinks every ~15 seconds, I believe). We turn on the Iridium modem (GPS and Iridium locks) and move it around the parking lot to practice tracking it. We complete the IMU calibration, but have also tried the cardinal-direction and GPS-location options. We use the MSU website at 153.90.202.26/ to verify that our payload reports the correct position, which is usually within ~50 feet of the real location.

The end result is that our tracking is off by an amount that seems related to the angle away from center. If the payload is 90 degrees away from the center position, the tracker points around 20 degrees short of the payload (correct angle 90 degrees, actual angle ~70 degrees).

If it helps, the magnetic declination in our region is around 15 degrees, but I believe the tracking software already takes that into account. I'm wondering if maybe it's a hardware issue with the servo/pot, or something else. Is the system working well for other people, or is it just us?
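One sanity check for this kind of error is to compute the expected bearing from the tracker's GPS fix to the payload's fix and compare it to where the dish actually points. This is just a sketch, not part of the actual tracker software; the function names are made up, and the 15-degree declination figure is only the regional value mentioned above:

```cpp
#include <cmath>

// Initial great-circle bearing from the tracker (lat1, lon1) to the
// payload (lat2, lon2), in degrees clockwise from true north.
double trueBearingDeg(double lat1, double lon1, double lat2, double lon2) {
    const double d2r = 3.14159265358979323846 / 180.0;
    double p1 = lat1 * d2r, p2 = lat2 * d2r;
    double dlon = (lon2 - lon1) * d2r;
    double y = std::sin(dlon) * std::cos(p2);
    double x = std::cos(p1) * std::sin(p2) -
               std::sin(p1) * std::cos(p2) * std::cos(dlon);
    double deg = std::atan2(y, x) / d2r;
    return std::fmod(deg + 360.0, 360.0);  // normalize to [0, 360)
}

// With an easterly declination, a magnetic compass reads lower than the
// true bearing: magnetic heading = true bearing - declination.
double magneticHeadingDeg(double trueDeg, double declinationEastDeg) {
    return std::fmod(trueDeg - declinationEastDeg + 360.0, 360.0);
}
```

One thing this check can distinguish: an un-applied declination would show up as a roughly constant ~15-degree offset everywhere, whereas an error that grows with angle from center (as described above) looks more like a scaling or mapping problem in the pointing hardware.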
Post by David MSGC on Apr 19, 2017 14:25:33 GMT -7
Most of it has to do with the IMU calibration; sometimes I think it is dumb luck when it works. Ours can be within 2 degrees or so, and then the next time we calibrate it will be off by 20 degrees. We usually just re-calibrate if we notice it being way off. The software has a lot of room for improvement, and some of our interns will be working on it when school is not so busy.

Because of the tolerance on the pots inside the servos, every servo has a different minimum and maximum position relative to pulse width. The software assumes all servos are the same, so you almost have to re-map every servo for it to be more accurate. The software also assumes the change in rotational position is linear with the change in pulse width, and it is not. For example: say that for a particular servo/potentiometer pair, the minimum position of the servo, all the way clockwise (the large gear clockwise, the servo gear counter-clockwise), is at a 1000 us pulse width and the maximum is at 2000 us; that would put the center at 1500 us. Near center, moving 5 degrees in either direction might take a change in pulse width of 15 us, but near the max or min a change of 15 us might move the servo 8 degrees. Those numbers are purely for discussion; the actual numbers are more like a 980 us minimum and a 2180 us maximum, and they are different for every servo due to the tolerance on the pots. Some servos might have a minimum of 900 us or less and a maximum of 2200 us or more.

We are thinking about making a self-calibration method that would map each servo better based upon its actual minimum and maximum positions in microseconds of pulse width. If the servo gets close to the min or max it will drift, and you will have to re-zero the servo. Typically, if we get a good calibration the antenna, as is, will point to within 2 degrees, but when the balloon gets far enough away those 2 degrees can be a difference of 3 dB in signal strength, which could be the difference between getting a video stream and no video stream.
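The per-servo endpoint remapping described above can be illustrated with a small sketch. The names are made up, and the 980/2180 us endpoints are just the example figures from this post; the idea is to map a commanded angle onto the servo's measured range instead of the assumed 1000-2000 us:

```cpp
#include <cmath>

// Hypothetical per-servo calibration data: measured pulse-width endpoints
// instead of the 1000/2000 us the stock software assumes.
struct ServoCal {
    double minUs;      // measured pulse width at one end of travel
    double maxUs;      // measured pulse width at the other end
    double travelDeg;  // total mechanical travel, e.g. 180 degrees
};

// Map a commanded angle (0 = min-endpoint position) to a pulse width
// using the servo's measured endpoints. Note this is still a linear map;
// fixing the nonlinearity near the ends of travel would need a measured
// lookup table on top of this.
double angleToPulseUs(const ServoCal& cal, double angleDeg) {
    double frac = angleDeg / cal.travelDeg;
    if (frac < 0.0) frac = 0.0;   // clamp to the servo's real travel
    if (frac > 1.0) frac = 1.0;
    return cal.minUs + frac * (cal.maxUs - cal.minUs);
}
```

For the 980-2180 us example above, the calibrated center comes out at 1580 us, 80 us away from the assumed 1500 us center; at the rough sensitivity of 15 us per 5 degrees near center, an offset of that size alone could be a double-digit pointing error.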
We are also trying to find a way to verify this accuracy: where is the dish actually pointing compared to where it thinks it is pointing? There are so many places in the mechanics of the antenna gimbal that can make it point differently than what the IMU reads. We are also thinking about changing how the IMU is mounted.