Discussion:Controlled Autonomous Driving for a JetRacer

From HSHL Mechatronik
Week 4 (31.03.26): plan for this week / progress
  • Is heading from camera a good solution?
  • Compare Heading from cam, gyro, TopCon and map in one plot.
  • Why is the gyro measurement slowing down the data processing?
  • Decide which is best.
  • Speed up the car.
  • Extend the propagation distance to 1.5 m.
  • Focus on lane following instead of lidar.
  • heading: 0° = north, 90° = east, 180° = south, 270° = west
  • In our map the first straight leads the car to the east, then north ...
  • New scenario: the AMR starts in the left lane and automatically steers into the middle of the right lane.
  • Compare different controllers for this task: Pure Pursuit, Stanley controller, Linear-Quadratic Regulator (LQR), Model Predictive Control (MPC), ...
  • The heading from the camera is a local heading, referenced to the camera's optical axis: for each frame, the angle between the optical axis and the chord to the target path point is measured and used as the local heading of that frame.
  • The gyroscope and the motor/steering unit sit on the RP2040 extension board, which communicates with the Jetson Nano over the USB serial port PORT = '/dev/ttyACM0'. Because gyro reads and steering writes share this one serial link, an application that does both suffers latency and an FPS drop compared to one that only sends steering commands.
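One way to ease the serial-port bottleneck is to move the blocking gyro reads onto a background thread, so the camera/steering loop only ever grabs the most recent sample instead of waiting on I/O. A minimal Python sketch; the `read_sample` callable is a stand-in for the actual RP2040 serial read (e.g. via pyserial on '/dev/ttyACM0') and is injected so the sketch works without hardware:

```python
import threading

class GyroReader:
    """Background-thread gyro reader: the blocking serial read runs in its
    own thread, and the control loop just grabs the latest sample.
    `read_sample` is any callable returning one gyro reading; on the
    JetRacer it would wrap the RP2040 serial port ('/dev/ttyACM0')."""

    def __init__(self, read_sample):
        self._read = read_sample
        self._latest = None
        self._lock = threading.Lock()
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._loop, daemon=True)

    def start(self):
        self._thread.start()

    def _loop(self):
        while not self._stop.is_set():
            sample = self._read()      # blocking I/O stays off the control loop
            with self._lock:
                self._latest = sample

    def latest(self):
        """Most recent sample (None until the first read completes)."""
        with self._lock:
            return self._latest

    def stop(self):
        self._stop.set()
        self._thread.join(timeout=1.0)
```

The control loop then calls `latest()` at its own rate; steering writes keep their own access to the serial link and are never blocked behind a pending gyro read.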

Concept

1) Per-frame camera processing (lower rate)

  • Extract lane/track features (lane lines, road edges, path points).
  • Estimate curvature κ_cam(s) at a chosen look-ahead distance s (or a small set of look-ahead distances).
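The curvature estimate in step 1 can be sketched by fitting a low-order polynomial to the extracted lane-center points and evaluating the standard curvature formula at the look-ahead distance. A sketch, assuming the points are already given in the vehicle frame (x forward, y left, metres); the upstream feature extraction is not shown:

```python
import numpy as np

def curvature_at_lookahead(xs, ys, s_look):
    """Fit y(x) = a x^2 + b x + c to lane-center points and evaluate
    kappa = y'' / (1 + y'^2)^(3/2) at the look-ahead x = s_look.
    Call once per value in a small set of look-ahead distances if needed."""
    a, b, c = np.polyfit(xs, ys, 2)
    dy = 2.0 * a * s_look + b       # first derivative at the look-ahead point
    d2y = 2.0 * a                   # second derivative (constant for a parabola)
    return d2y / (1.0 + dy ** 2) ** 1.5
```

A straight lane gives kappa near zero; the sign of kappa then distinguishes left from right curves in this frame convention.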

2) High-rate IMU loop

  • Read yaw rate r, steering angle δ (if available), wheel speeds.
  • Use a simple kinematic/dynamic model (e.g. bicycle model) to predict short-term motion and where the vehicle will be in 1–2 m given current yaw rate and speed.
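The short-term prediction in step 2 can be sketched with a constant-speed, constant-yaw-rate kinematic model, integrated until the requested propagation distance (1.5 m in the plan above) is covered. State layout and step size are assumptions:

```python
import math

def predict_pose(x, y, yaw, v, yaw_rate, horizon_dist=1.5, dt=0.01):
    """Propagate the pose (x, y, yaw) forward until `horizon_dist` metres
    have been travelled, given the current speed v and IMU yaw rate.
    Units: metres, radians, seconds; assumes v > 0."""
    dist = 0.0
    while dist < horizon_dist:
        x += v * math.cos(yaw) * dt     # advance along current heading
        y += v * math.sin(yaw) * dt
        yaw += yaw_rate * dt            # integrate the measured yaw rate
        dist += abs(v) * dt
    return x, y, yaw
```

Because only v and the IMU yaw rate are needed, this runs in the high-rate loop between camera frames.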

3) State estimation / sensor fusion

  • Run an EKF/UKF or Kalman filter with states like lateral error, heading error, yaw rate, possibly curvature.
  • Inputs/measurements: IMU yaw rate, steering angle, odometry; camera curvature or look-ahead heading as a lower-rate measurement/observation.
  • Let the IMU provide high-frequency propagation and the camera provide periodic corrections (adapt measurement covariances when camera quality is low).
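A minimal version of step 3 with just two states, lateral error and heading error: the IMU yaw rate drives the high-rate prediction, and the camera's lower-rate (e_y, e_psi) measurement corrects it. The noise values are placeholder assumptions, not tuned for the JetRacer; passing larger r_* values when image quality is poor implements the adaptive covariance idea:

```python
import numpy as np

class LateralKF:
    """Minimal linear Kalman filter over x = [e_y, e_psi]
    (lateral error, heading error). Illustrative noise values."""

    def __init__(self):
        self.x = np.zeros(2)
        self.P = np.eye(2)
        self.Q = np.diag([1e-4, 1e-4])            # process noise (assumed)

    def predict(self, v, yaw_rate, path_curv, dt):
        # e_y grows with the heading error; e_psi with the difference
        # between the vehicle yaw rate and the path's yaw rate v * kappa.
        self.x[0] += v * self.x[1] * dt
        self.x[1] += (yaw_rate - v * path_curv) * dt
        F = np.array([[1.0, v * dt],
                      [0.0, 1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z, r_ey=0.02, r_epsi=0.05):
        # Camera correction; inflate r_ey / r_epsi when confidence is low.
        R = np.diag([r_ey ** 2, r_epsi ** 2])
        H = np.eye(2)                              # camera measures both states
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z, float) - H @ self.x)
        self.P = (np.eye(2) - K @ H) @ self.P
```

`predict` runs at IMU rate; `update` only when a camera frame arrives, exactly the high-frequency-propagation / periodic-correction split described above.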

4) Compute steering command (feedforward + feedback)

  • Feedforward: compute required steering angle δff from κ_cam and vehicle speed v (use steering geometry/bicycle model).
  • Feedback: compute a corrective term δfb from the state errors (e.g. lateral error, heading error, yaw-rate error) using a controller (PID on yaw rate or lateral error, LQR, or state feedback).
  • Final command: δ = δff + δfb, then apply actuator limits (maximum steering angle, rate limits).
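The steps above (feedforward + feedback + limits) fit in one small function. Wheelbase, gains, the steering limit, and the feedback sign convention are illustrative assumptions, not the JetRacer's actual values:

```python
import math

def steering_command(kappa_cam, e_y, e_psi,
                     wheelbase=0.17, k_y=1.0, k_psi=0.8,
                     delta_max=math.radians(25)):
    """delta = delta_ff + delta_fb, saturated to +/- delta_max.
    The kinematic feedforward atan(L * kappa) is speed-independent;
    a dynamic model would add a speed-dependent understeer term."""
    delta_ff = math.atan(wheelbase * kappa_cam)   # steer to match path curvature
    delta_fb = -(k_y * e_y + k_psi * e_psi)       # drive errors to zero (sign per frame convention)
    delta = delta_ff + delta_fb
    return max(-delta_max, min(delta_max, delta))
```

A rate limit (maximum change in δ per control step) would be applied on top of the angle saturation shown here.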

Latency compensation & look-ahead tuning

  • Compensate for the camera processing delay by propagating the estimated state forward by the measured latency before using camera-derived references.
  • Choose the look-ahead distance depending on speed: higher speed → longer look-ahead; lower speed → shorter.
  • Dynamically weight camera vs. IMU in the estimator based on confidence (visibility, image quality).
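The first two points can be sketched as two small helpers: a forward propagation of the camera-frame errors by the measured latency (same small-angle model as the prediction step) and a clamped, speed-proportional look-ahead. The gain and bounds are assumptions to be tuned on the track:

```python
def compensate_latency(e_y, e_psi, v, yaw_rate, path_curv, latency):
    """Propagate the camera-derived lateral/heading errors forward by the
    measured processing delay before using them as references."""
    e_y_now = e_y + v * e_psi * latency
    e_psi_now = e_psi + (yaw_rate - v * path_curv) * latency
    return e_y_now, e_psi_now

def lookahead_distance(v, k=0.8, d_min=0.5, d_max=2.0):
    """Speed-proportional look-ahead, clamped to [d_min, d_max] metres."""
    return max(d_min, min(d_max, k * v))
```

The third point, confidence-based weighting, corresponds to scaling the camera measurement covariance in the estimator rather than changing these helpers.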