Autonomous Driving Analysis, Summary and Conclusion

From HSHL Mechatronik
== Autonomous Driving Neural Network Training ==

=== Materials and Setup ===
* Jetson Nano Developer Kit
* USB Gamepad
* MATLAB® 2020B (compatible with CUDA 10.2) [https://de.mathworks.com/support/requirements/supported-compilers.html]
* Wi-Fi connection between Jetson Nano and host PC

=== Training Results ===
* Models trained on discrete steering values showed significantly better results than regression-based approaches.
* Grayscale recordings increased model stability.
* Calibrated images helped reduce the steering error caused by lens distortion.
* The project switched to '''classification''' using '''discrete steering values''', with the parameters shown in the figures below:

[[Datei:Cleaned CNN Classify.png|mini|Fig. 38: Classification Training]]
[[Datei:regression.png|mini|Fig. 39: Classification Result]]


More details on further work in autonomous driving can be found here [https://wiki.hshl.de/wiki/index.php/JetRacer:_Autonomous_Driving,_Obstacle_Detection_and_Avoidance_using_AI_with_MATLAB].


=== Experiment Design and Requirements ===

* MATLAB® was used to connect to the Jetson Nano via SSH.
* The Gamepad receiver had to be physically connected to the Jetson Nano.
* MATLAB® controlled the Gamepad indirectly, using SSH commands to access libraries on the Jetson.
* Data such as images and steering values were collected and transferred via Wi-Fi.

[[Datei:Matlab ssh code.jpg|mini|frame|Fig. 3: SSH code snippet for connecting MATLAB to the JetRacer]]
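The SSH method described above has to launch processes on the Jetson that keep running after the command itself returns. A minimal sketch of how such a background-launch command can be composed (illustrative Python, not the project's MATLAB code; the host address, user name, and script path are invented placeholders):

```python
import shlex

def build_remote_command(host, user, remote_cmd):
    """Compose an ssh invocation that starts remote_cmd on the Jetson in
    the background, detached from the SSH session, so the host PC is not
    blocked while the remote process keeps running."""
    # nohup + '&' detaches the process; output is discarded so the
    # SSH channel can close immediately after launching it.
    wrapped = f"nohup {remote_cmd} > /dev/null 2>&1 &"
    return f"ssh {user}@{host} {shlex.quote(wrapped)}"

# Hypothetical script name and address, for illustration only.
cmd = build_remote_command("192.168.0.42", "jetson",
                           "python3 /home/jetson/read_gamepad.py")
print(cmd)
```

In the actual setup, MATLAB® would issue an equivalent command string over its SSH connection to start the Gamepad-reading library on the Jetson.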


== Experiment Procedure ==


=== Step 1: System Connection and Gamepad Control ===
Since MATLAB® could not directly access the Gamepad receiver, an SSH command method was developed to control the Jetson Nano libraries from the background without delay or signal loss.
=== Step 2: Data Collection ===
* Data was collected using the integrated camera at 256×256 pixel resolution.
* Real-time steering commands were logged and synchronized with the images.
* To reduce image distortion, the camera was calibrated using the calibration256by256.mat parameter file.
[[Datei:Camera_no_calibration.png|mini|Fig. 5: Camera setup for image without calibration]]
 
[[Datei:Camera_with_calibration.png|mini|Fig. 6: Camera setup for image with calibration]]
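Synchronizing the logged steering commands with the saved frames amounts to nearest-timestamp matching. A small illustrative sketch (Python, not the project's MATLAB code; the timestamps and steering values are invented):

```python
import bisect

def pair_frames_with_steering(frame_times, steer_log):
    """For each camera frame timestamp, pick the steering sample whose
    timestamp is closest, so every saved image gets a steering label.
    steer_log: list of (timestamp, steering_value), sorted by timestamp."""
    times = [t for t, _ in steer_log]
    labels = []
    for ft in frame_times:
        i = bisect.bisect_left(times, ft)
        # Compare the neighbours on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        best = min(candidates, key=lambda j: abs(times[j] - ft))
        labels.append(steer_log[best][1])
    return labels

log = [(0.00, 0.0), (0.05, -0.2), (0.10, -0.2), (0.15, 0.1)]
print(pair_frames_with_steering([0.02, 0.11], log))  # -> [0.0, -0.2]
```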
 
=== Step 3: Neural Network Training ===
* Initially, a pretrained network was used with RGB images, with the following parameters, but the results were poor:
 
[[Datei:Pretrained 50 epoch.png|mini]]
[[Datei:Pretrained 50 epoch 2.png|mini]]
 
* Regression methods were also tried, with the following parameters, but likewise showed poor performance:
 
[[Datei:regression_50_epoch2.png|mini|Fig. 9: regression_50_epoch2]]
 
[[Datei:regression_50_epoch.png|mini|frame|Fig. 10: regression_50_epoch]]
 
* The project therefore switched to '''classification''' using '''discrete steering values''', with the following parameters:
 
* MATLAB® Apps (GUI) were used to manually annotate images by clicking the correct path.
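Switching from regression to classification requires mapping the continuous steering values onto discrete class labels. A minimal sketch of such a binning scheme (illustrative Python, not the project's MATLAB pipeline; the class count and steering range are assumptions):

```python
def steering_to_class(angle, n_classes=5, max_angle=1.0):
    """Map a continuous steering value in [-max_angle, max_angle] to one of
    n_classes evenly spaced bins (0 = full left, n_classes-1 = full right)."""
    angle = max(-max_angle, min(max_angle, angle))    # clamp out-of-range input
    frac = (angle + max_angle) / (2 * max_angle)      # normalise to [0, 1]
    return min(int(frac * n_classes), n_classes - 1)  # bin index

def class_to_steering(cls, n_classes=5, max_angle=1.0):
    """Inverse map: bin index back to the bin-centre steering value,
    used when converting a predicted class into a drive command."""
    return -max_angle + (2 * max_angle) * (cls + 0.5) / n_classes

print([steering_to_class(a) for a in (-1.0, -0.3, 0.0, 0.4, 1.0)])  # -> [0, 1, 2, 3, 4]
```

Consistent, discrete labels like these are also what made the manually annotated data from the MATLAB® App usable for training.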
 
=== Step 4: System Optimization ===
* Lane tracking algorithms were integrated to automatically label lane positions.
* Additional features like lane angles were extracted to improve model inputs.
* Grayscale image recordings were used to achieve better generalization and faster processing.
* The model still used '''classification''' with discrete steering values, with the following parameters:
[[Datei:Cleaned CNN Classify.png|mini|Fig. 13: Classification training parameters]]
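Step 4 extracts lane-angle features and converts the recordings to grayscale. Both computations can be sketched as follows (illustrative Python; the coordinate convention and the BT.601 luminance weights are standard choices, not taken from the project):

```python
import math

def lane_angle_deg(p1, p2):
    """Angle of a detected lane segment relative to the image's vertical
    axis, in degrees; 0 means the lane runs straight ahead.
    p1, p2: (x, y) pixel coordinates, with y growing downwards."""
    dx = p2[0] - p1[0]
    dy = p1[1] - p2[1]  # flip y so 'up the image' counts as positive
    return math.degrees(math.atan2(dx, dy))

def to_grayscale(r, g, b):
    """Luminance-weighted grayscale value for one RGB pixel (ITU-R BT.601)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

print(lane_angle_deg((128, 255), (128, 0)))  # vertical lane -> 0.0
```

Feeding the network single-channel grayscale frames cuts the input size to a third, which matches the faster processing reported above.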
 
 
 
== Experiment Analysis ==










=== Limitations ===
* Hardware constraints limited the maximum processing speed (without GPU Coder).
* The camera did not allow 50 fps, and the recording dimension was limited to 256 × 256 pixels.
* Initial data quality was poor due to randomized manual steering input.
* Better results were obtained after switching to discrete, consistent control signals.
* Simultaneous recording, steering, and saving caused performance bottlenecks due to the Jetson Nano's limited resources.

Note: I have provided alternatives and solutions to the limitations above, but could not implement them myself due to time constraints.
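One common mitigation for the recording-and-saving bottleneck is to move disk writes off the capture loop onto a background thread with a bounded queue. A minimal producer-consumer sketch (illustrative Python; the project itself ran MATLAB code on the Jetson Nano):

```python
import queue
import threading

def start_writer(save_fn):
    """Run saves on a separate thread so the capture/steering loop is
    never blocked by slow disk I/O."""
    q = queue.Queue(maxsize=64)   # bounded: backpressure instead of OOM
    def worker():
        while True:
            item = q.get()
            if item is None:      # sentinel: shut the writer down
                break
            save_fn(item)
            q.task_done()
    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return q, t

saved = []
q, t = start_writer(saved.append)    # save_fn would write to disk in practice
for frame in ("img0", "img1", "img2"):
    q.put(frame)                     # capture loop returns immediately
q.put(None)
t.join()
print(saved)  # -> ['img0', 'img1', 'img2']
```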


== Summary and Outlook ==


This experiment aimed to use the Jetson Nano and MATLAB® to complete various tasks, from "Familiarization with the existing framework" to "Autonomous Driving". Although the final result of the car driving several laps in the right lane could not be attained, there is plenty of room for future improvements, such as:

* Improving the training method (more data, different training algorithms such as reinforcement learning, etc.).
* Expanding the dataset with more diverse environmental conditions.
* Flashing the working model directly onto the Jetson Nano using GPU Coder.
* Using a more accurate gamepad.
* Improving automated labeling and refining the PD controller parameters for faster driving without loss of robustness.

Current version as of 12 June 2025, 22:36