Talk:Controlled Autonomous Driving for a JetRacer
Current version as of 13 April 2026, 13:30

| # | date | plan for this week | progress |
|---|---|---|---|
| 4 | 31.03.26 | * Is the heading from the camera a good solution?<br>* Compare the heading from camera, gyro, TopCon and map in one plot.<br>* Why does the gyro measurement slow down the data processing?<br>* Decide which is best.<br>* Speed up the car.<br>* Compare different controllers for this task: Pure Pursuit, Stanley Controller, Linear-Quadratic Regulator (LQR), Model Predictive Control (MPC), ... | * The heading from the camera is a local heading with the optical axis of the camera as the reference direction: the angle between the optical axis and the chord to the target path is measured and used as the local heading for that particular frame.<br>* The gyroscope and the motor/steering unit sit on the RP2040 extension board, which communicates with the Jetson Nano over the USB serial port '/dev/ttyACM0'. Reading the gyro and writing steering commands in the same application therefore adds latency and drops the FPS compared to an application that only writes steering commands.<br>* The best solution for maximum speed is steering from vision-based lane detection alone, which achieved a top speed of 1.45 m/s.<br>* Increasing the look-ahead distance to 1.5 m changes the pixel-to-physical-distance ratio. Currently 1 pixel corresponds to 0.5 cm both horizontally and vertically; a 1.5 m look-ahead would stretch the vertical resolution beyond 0.5 cm per pixel while the horizontal resolution stayed the same, which would change the whole controller behaviour and mathematics, and the controller is not designed for such uneven resolutions. Increasing the image resolution instead would drop the FPS, since the lane detection algorithm would have to process more pixels. |
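The local camera heading described in the progress notes can be sketched as follows. This is a minimal, hypothetical Python example, not the project's actual code: it assumes a bird's-eye-view image with the vehicle at the bottom centre, the optical axis pointing up the image, and the stated scale of 1 px = 0.5 cm; the function name and image geometry are illustrative.

```python
import math

# Pixel-to-metric scale from the notes: 1 px = 0.5 cm (assumed equal on both axes).
PX_TO_M = 0.005

def local_heading(col, row, image_w, image_h):
    """Local heading from one frame: the angle between the camera's
    optical axis (the image centre column, pointing 'up' in a
    bird's-eye view) and the chord to a look-ahead point on the path.

    col, row: pixel coordinates of the look-ahead point (row 0 at the top).
    Returns the heading in radians, positive to the right.
    """
    # Lateral offset of the point from the optical axis, in metres.
    dx = (col - image_w / 2) * PX_TO_M
    # Forward distance from the bottom edge (assumed vehicle position).
    dy = (image_h - row) * PX_TO_M
    return math.atan2(dx, dy)
```

For example, a point 64 px right of centre and 200 px up the image maps to a chord of 0.32 m lateral over 1.0 m forward, i.e. a heading of about 0.31 rad to the right.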
Concept
1) Per-frame camera processing (lower rate)
- Extract lane/track features (lane lines, road edges, path points).
- Estimate curvature κ_cam(s) at a chosen look-ahead distance s (or a small set of look-ahead distances).
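The curvature estimate in step 1 can be sketched like this. It is a sketch under assumptions, not the project's implementation: it presumes the lane-detection step already yields path points in the vehicle frame, fits a quadratic, and evaluates the standard curvature formula at the look-ahead distance.

```python
import numpy as np

def curvature_at_lookahead(path_xy, s):
    """Estimate path curvature kappa_cam at forward distance s.

    path_xy: Nx2 array of path points in the vehicle frame
             (x forward [m], y lateral [m]).
    Fits y(x) = a*x^2 + b*x + c and evaluates
    kappa = y'' / (1 + y'^2)**1.5 at x = s.
    """
    x, y = np.asarray(path_xy, dtype=float).T
    a, b, c = np.polyfit(x, y, 2)   # highest-degree coefficient first
    dy = 2 * a * s + b              # y'(s)
    d2y = 2 * a                     # y''(s), constant for a quadratic
    return d2y / (1 + dy ** 2) ** 1.5
```

A single look-ahead distance keeps this cheap; calling it for a small set of distances, as suggested above, just reuses the same fit.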
2) High-rate IMU loop
- Read yaw rate r, steering angle δ (if available), wheel speeds.
- Use a simple kinematic/dynamic model (e.g. bicycle model) to predict short-term motion and where the vehicle will be in 1–2 m given current yaw rate and speed.
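The short-term prediction in step 2 can be sketched with a constant-turn-rate kinematic model. This is an illustrative sketch, assuming speed and IMU yaw rate stay constant over the 1–2 m horizon; step size and signatures are placeholders.

```python
import math

def predict_pose(v, r, dt, horizon):
    """Predict the vehicle pose a short horizon ahead from speed v [m/s]
    and IMU yaw rate r [rad/s], assuming both stay constant.

    Integrates a kinematic unicycle/bicycle approximation in steps of dt
    and returns (x, y, heading) in the current vehicle frame
    (x forward, y left, heading in radians).
    """
    x = y = psi = 0.0
    n = max(1, round(horizon / dt))
    for _ in range(n):
        x += v * math.cos(psi) * dt
        y += v * math.sin(psi) * dt
        psi += r * dt
    return x, y, psi
```

At, say, 1.45 m/s this answers "where will the car be in 1–2 m" between camera frames, which is what the fusion step below consumes.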
3) State estimation / sensor fusion
- Run an EKF/UKF or Kalman filter with states like lateral error, heading error, yaw rate, possibly curvature.
- Inputs/measurements: IMU yaw rate, steering angle, odometry; camera curvature or look-ahead heading as a lower-rate measurement/observation.
- Let the IMU provide high-frequency propagation and the camera provide periodic corrections (adapt measurement covariances when camera quality is low).
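A minimal version of step 3 can be sketched with a two-state Kalman filter. This is a hand-written sketch, not the project's estimator: the state (lateral error, heading error), the linearised kinematics, and all noise values are illustrative placeholders that would need tuning on the car.

```python
import numpy as np

class LateralKF:
    """Minimal Kalman filter, state x = [lateral error e, heading error th].

    predict() runs at IMU rate (high-frequency propagation);
    update_camera() applies the lower-rate camera measurement of [e, th].
    """

    def __init__(self):
        self.x = np.zeros(2)
        self.P = np.eye(2) * 0.1
        self.Q = np.diag([1e-4, 1e-4])  # process noise (placeholder)
        self.R = np.diag([1e-2, 1e-2])  # camera noise; raise it when image quality is low

    def predict(self, v, r, kappa_ref, dt):
        """Propagate: e grows with v*th, th with (r - v*kappa_ref)."""
        e, th = self.x
        F = np.array([[1.0, v * dt],
                      [0.0, 1.0]])
        self.x = np.array([e + v * th * dt,
                           th + (r - v * kappa_ref) * dt])
        self.P = F @ self.P @ F.T + self.Q

    def update_camera(self, e_meas, th_meas):
        """Correct with a camera-derived lateral/heading error pair."""
        H = np.eye(2)
        z = np.array([e_meas, th_meas])
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(2) - K @ H) @ self.P
```

Adapting `R` frame-by-frame is exactly the "adapt measurement covariances when camera quality is low" idea above: a poor frame gets a large `R` and therefore a small correction.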
4) Compute steering command (feedforward + feedback)
- Feedforward: compute the required steering angle δ_ff from κ_cam and vehicle speed v (use steering geometry/bicycle model).
- Feedback: compute a corrective term δ_fb from the state errors (e.g. lateral error, heading error, yaw-rate error) using a controller (PID on yaw rate or lateral error, LQR, or state feedback).
- Final command: δ = δ_ff + δ_fb, then apply actuator limits (max steering angle, rate limits).
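Step 4 can be sketched as follows. The wheelbase, steering limit, and feedback gains are placeholders to be measured and tuned on the JetRacer, and the simple proportional state feedback stands in for whichever controller (PID/LQR) is finally chosen.

```python
import math

WHEELBASE = 0.15               # JetRacer wheelbase [m]; placeholder, measure on the car
MAX_STEER = math.radians(25)   # actuator limit; placeholder

def steering_command(kappa_cam, e_lat, e_heading, k_e=1.0, k_th=0.8):
    """Feedforward + feedback steering using bicycle-model geometry.

    Feedforward: delta_ff = atan(L * kappa) tracks the path curvature.
    Feedback:    proportional state feedback on lateral and heading error
                 (illustrative gains k_e, k_th).
    Returns delta = delta_ff + delta_fb clipped to the actuator limit.
    """
    delta_ff = math.atan(WHEELBASE * kappa_cam)
    delta_fb = -(k_e * e_lat + k_th * e_heading)
    delta = delta_ff + delta_fb
    # Apply actuator limit; a rate limit could be added the same way.
    return max(-MAX_STEER, min(MAX_STEER, delta))
```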
Latency compensation & look-ahead tuning
Compensate for camera processing delay by propagating the estimated state forward by the measured latency before using camera-derived references.
Choose look-ahead distance dependent on speed: higher speed → longer look-ahead; lower speed → shorter.
Dynamically weight camera vs IMU in the estimator based on confidence (visibility, image quality).
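The two tuning ideas above can be sketched as small helpers. Both are illustrative: the propagation reuses the same linearised error kinematics as the estimator sketch, and the look-ahead constants are starting points, not measured values.

```python
def compensate_latency(e, th, v, r, kappa_ref, latency):
    """Propagate the estimated (lateral error e, heading error th)
    forward by the measured camera processing latency [s] before
    using camera-derived references."""
    e_new = e + v * th * latency
    th_new = th + (r - v * kappa_ref) * latency
    return e_new, th_new

def lookahead_distance(v, base=0.5, gain=0.6, d_min=0.3, d_max=2.0):
    """Speed-dependent look-ahead: higher speed -> longer look-ahead,
    clamped to [d_min, d_max]. All constants are placeholders."""
    return max(d_min, min(d_max, base + gain * v))
```

At the reported top speed of 1.45 m/s these placeholder constants give a look-ahead of about 1.37 m, i.e. close to the 1.5 m discussed in the progress notes.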