This file implements the discrete-time observer task. It reads motor-input Shares, measured wheel displacement Shares, and IMU heading/yaw-rate Shares, then estimates forward position, heading, and wheel speeds. It can also stream those estimated states over UART.
This task provides model-based state estimation. It goes beyond simply relaying measurements and instead uses a state-space update to estimate the robot’s internal motion state.
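The update described here can be sketched as a standard discrete-time Luenberger-style observer. Note that only `_Ad` and `_Bd` appear in this file, so the output matrix `C` and correction gain `L` below are assumptions added for illustration, not names from the code:

```python
import numpy as np

def observer_step(xhat, u, y, Ad, Bd, C, L):
    """One discrete-time observer update.

    Predicts the next state with the model (Ad, Bd), then corrects
    it using the residual between the measurement y and the
    predicted output C @ xhat.
    """
    y_pred = C @ xhat                      # predicted measurement
    residual = y - y_pred                  # innovation
    return Ad @ xhat + Bd @ u + L @ residual
```

With `L = 0`, this reduces to pure model prediction; a larger `L` weights the measurements more heavily.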
The observer matrices are imported from `observer_matrices.py`.

| Name | Description |
|---|---|
| `uL`, `uR` | Input Shares for left and right motor voltage. |
| `sL`, `sR` | Measured wheel-displacement Shares. |
| `psi`, `psiDot` | Measured heading and yaw-rate Shares. |
| `s_hat`, `psi_hat`, `wL_hat`, `wR_hat` | Output Shares for the estimated states. |
| `_xhat` | Internal estimated state vector. |
| `_Ad`, `_Bd` | State-space matrices used for the observer update. |
| `_use_bd_full` | Indicates whether the full input-matrix form is being used. |
| `_stream_hz` | Streaming rate for UART output. |
| `_x_path`, `_y_path` | Integrated path values used during streamed output. |
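The `_load_matrices()` import-and-validate step might look something like the following sketch. The attribute names `Ad` and `Bd` inside `observer_matrices.py`, and the state/input dimensions, are assumptions:

```python
import numpy as np

def load_matrices(n_states=4, n_inputs=2):
    """Try to import the observer matrices; return (Ad, Bd, ok).

    Returns ok=False if the module is missing, the expected
    attributes are absent, or the shapes do not match, which
    signals the task to run in fallback mode.
    """
    try:
        import observer_matrices as m          # hypothetical attribute names
        Ad = np.asarray(m.Ad, dtype=float)
        Bd = np.asarray(m.Bd, dtype=float)
    except (ImportError, AttributeError):
        return None, None, False
    if Ad.shape != (n_states, n_states) or Bd.shape != (n_states, n_inputs):
        return None, None, False
    return Ad, Bd, True
```

Validating shapes at import time means a malformed matrix file degrades gracefully into fallback mode instead of crashing the task mid-run.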
| Method / Logic | Description |
|---|---|
| `_load_matrices()` | Attempts to import and validate the observer matrices. |
| `_startup_print()` | Prints a summary of observer startup status. |
| `_reset_stream_path()` | Zeros the streamed x-y path and writes a header over UART. |
| `_maybe_stream_sample()` | Streams one line of estimated-state data if streaming is enabled and enough time has elapsed. |
| Fallback mode | If the matrices are unavailable, falls back to a simpler estimate based on measured values. |
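The rate-limited streaming logic of `_maybe_stream_sample()` can be sketched as below. This uses `time.monotonic()` and a pluggable `write` callback for portability; the actual task would write to a UART and likely use the platform's tick functions instead:

```python
import time

class StateStreamer:
    """Emit one CSV line of estimated states at most _stream_hz times per second."""

    def __init__(self, stream_hz, write=print):
        self._period = 1.0 / stream_hz
        self._last = 0.0          # timestamp of the last emitted sample
        self._write = write       # stand-in for the UART write call

    def maybe_stream_sample(self, s_hat, psi_hat, wL_hat, wR_hat):
        now = time.monotonic()
        if now - self._last < self._period:
            return False          # not enough time has elapsed; skip
        self._last = now
        self._write(f"{s_hat:.4f},{psi_hat:.4f},{wL_hat:.4f},{wR_hat:.4f}")
        return True
```

Gating on elapsed time rather than a fixed loop count keeps the UART rate stable even if the estimator task's period changes.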
This file is one of the strongest demonstrations of controls theory in the project: the robot is not just reacting to measurements, but estimating its internal state with a model-based observer. By bridging raw sensing and state estimation, it provides one of the clearest connections between the final robot and the theory covered in class.