Talk:Controlled Autonomous Driving for a JetRacer
* The heading from the camera is a localized heading with the optical axis of the camera as the reference direction. The angle between the optical axis and the chord of the target path is measured and used as the local heading for that particular frame.
* The gyroscope and the motor steering unit are located on the RP2040 extension board. The RP2040 extension board communicates with the Jetson Nano over USB serial (PORT = '/dev/ttyACM0'), hence there is latency and an FPS drop when gyro reads and steering-command writes are done in the same application, compared to an application that only sends steering commands.
Version as of 13 April 2026, 13:15
| # | date | plan for this week | progress |
|---|---|---|---|
| 4 | 31.03.26 | | |
Concept
1) Per-frame camera processing (lower rate)
- Extract lane/track features (lane lines, road edges, path points).
- Estimate curvature κ_cam(s) at a chosen look-ahead distance s (or a small set of look-ahead distances).
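The curvature estimate κ_cam(s) can be sketched as follows, assuming the lane/path points have already been projected into the vehicle frame (x forward, in metres). The function name and the quadratic fit are illustrative choices, not part of the project code:

```python
import numpy as np

def curvature_at_lookahead(xs, ys, s):
    """Fit a quadratic y(x) to detected path points (vehicle frame, x forward)
    and evaluate curvature kappa = y'' / (1 + y'^2)^(3/2) at look-ahead x = s."""
    a, b, c = np.polyfit(xs, ys, 2)   # y = a*x^2 + b*x + c
    dy = 2 * a * s + b                # first derivative at x = s
    d2y = 2 * a                       # second derivative (constant for a quadratic)
    return d2y / (1 + dy ** 2) ** 1.5

# Points on a straight line give (numerically) zero curvature;
# points on y = 0.5*x^2 give kappa = 1 at x = 0.
xs = np.array([0.5, 1.0, 1.5, 2.0])
```

Evaluating at a small set of look-ahead distances (e.g. 0.5 m, 1 m, 2 m) just means calling this with several values of `s` on the same fit.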
2) High-rate IMU loop
- Read yaw rate r, steering angle δ (if available), wheel speeds.
- Use a simple kinematic/dynamic model (e.g. bicycle model) to predict short-term motion and where the vehicle will be in 1–2 m given current yaw rate and speed.
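The short-term prediction in the IMU loop can be a single Euler step of the kinematic bicycle model, run at the gyro rate. The wheelbase value here is an assumed figure for a JetRacer-class car, not a measured one:

```python
import math

def predict_pose(x, y, psi, v, delta, dt, L=0.15):
    """One Euler step of the kinematic bicycle model.
    x, y: position [m]; psi: heading [rad]; v: speed [m/s];
    delta: steering angle [rad]; L: wheelbase [m] (assumed ~0.15 m)."""
    x += v * math.cos(psi) * dt
    y += v * math.sin(psi) * dt
    psi += (v / L) * math.tan(delta) * dt
    return x, y, psi
```

Iterating this step for a horizon of 1–2 m / v seconds predicts where the vehicle will be between camera frames.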
3) State estimation / sensor fusion
- Run an EKF/UKF or Kalman filter with states like lateral error, heading error, yaw rate, possibly curvature.
- Inputs/measurements: IMU yaw rate, steering angle, odometry; camera curvature or look-ahead heading as a lower-rate measurement/observation.
- Let the IMU provide high-frequency propagation and the camera provide periodic corrections (adapt measurement covariances when camera quality is low).
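A minimal sketch of this predict/correct split, assuming a linearised two-state model x = [lateral error e_y, heading error e_psi] along a locally straight reference (path curvature omitted for brevity); the class name, noise values, and measurement model are assumptions for illustration:

```python
import numpy as np

class LateralKF:
    """2-state Kalman filter: IMU drives predict(), camera drives correct()."""
    def __init__(self):
        self.x = np.zeros(2)                 # [e_y, e_psi]
        self.P = np.eye(2)                   # state covariance
        self.Q = np.diag([1e-4, 1e-4])       # process noise (to be tuned)

    def predict(self, v, yaw_rate, dt):
        # e_y grows with heading error; e_psi integrates the gyro yaw rate.
        F = np.array([[1.0, v * dt],
                      [0.0, 1.0]])
        self.x = F @ self.x + np.array([0.0, yaw_rate * dt])
        self.P = F @ self.P @ F.T + self.Q

    def correct(self, z, R):
        # z = [e_y, e_psi] from the camera; inflate R when image quality is poor.
        H = np.eye(2)
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(2) - K @ H) @ self.P
```

Calling `predict()` at the IMU rate and `correct()` only when a camera frame arrives realises the high-frequency propagation / periodic correction split described above; raising `R` de-weights the camera when confidence is low.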
4) Compute steering command (feedforward + feedback)
- Feedforward: compute required steering angle from κ_cam and vehicle speed v (use steering geometry/bicycle model).
- Feedback: compute a corrective term from the state errors (e.g. lateral error, heading error, yaw-rate error) using a controller (PID on yaw rate or lateral error, LQR, or state feedback).
- Final command: δ_cmd = δ_ff + δ_fb, then apply actuator limits (max steering angle, rate limits).
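The feedforward + feedback combination can be sketched like this, with the feedforward from bicycle-model steering geometry and a simple proportional state feedback. The gains, wheelbase, and steering limit are illustrative assumptions, not tuned values:

```python
import math

def steering_command(kappa_cam, v, e_y, e_psi, L=0.15,
                     k_y=0.5, k_psi=1.0, delta_max=math.radians(25)):
    """delta_cmd = delta_ff + delta_fb, then saturate at actuator limits.
    All gains/limits are placeholder assumptions."""
    delta_ff = math.atan(L * kappa_cam)       # steer angle to track curvature kappa
    delta_fb = -k_y * e_y - k_psi * e_psi     # corrective state feedback
    delta = delta_ff + delta_fb
    return max(-delta_max, min(delta_max, delta))  # max-angle limit
```

A rate limit (clamping the change per control step) would be applied the same way on successive outputs.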
Latency compensation & look-ahead tuning
Compensate for camera processing delay by propagating the estimated state forward by the measured latency before using camera-derived references.
Choose the look-ahead distance depending on speed: higher speed → longer look-ahead; lower speed → shorter.
Dynamically weight camera vs IMU in the estimator based on confidence (visibility, image quality).
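The two tuning rules above can be sketched as small helpers. The clamp bounds, the look-ahead time constant, and the small-angle error propagation are assumptions for illustration:

```python
def lookahead_distance(v, d_min=0.5, d_max=2.0, t_la=0.8):
    """Speed-proportional look-ahead, clamped to [d_min, d_max] metres.
    t_la is an assumed look-ahead time constant in seconds."""
    return max(d_min, min(d_max, t_la * v))

def compensate_latency(e_y, e_psi, v, yaw_rate, latency):
    """Propagate camera-derived errors forward by the measured processing
    delay before using them as references (small-angle sketch)."""
    e_psi_now = e_psi + yaw_rate * latency
    e_y_now = e_y + v * e_psi * latency
    return e_y_now, e_psi_now
```

In the estimator, the same effect is obtained by running extra `predict()` steps covering the camera latency before applying the correction.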