JetRacer: Lane Following with Artificial Intelligence

[[Category:Projects]]
[[Category:JetRacer]]
[[File:JetRacer Waveshare.jpg|thumb|right|500px|Fig. 1: JetRacer AI Pro from Waveshare]]
 
{| class="wikitable"
|-
| '''Author:'''  || [[User:Evrard_Leuteu-Feukeu|Evrard Leuteu]]
|-
| '''Type:''' || Project Work
|-
| '''Start Date:''' || TBD
|-
| '''Submission Date:''' || TBD
|-
| '''Supervisor:''' || [[User:Ulrich_Schneider|Prof. Dr.-Ing. Schneider]]
|}
== Introduction ==
The goal of this project was to optimize the JetRacer’s AI for autonomous navigation on the test circuit of the [[Labor_Autonome_Systeme|Autonomous Systems Laboratory]] at HSHL. The JetRacer was required to drive counterclockwise in the right lane as fast and robustly as possible using MATLAB® and a self-developed PD controller.

Tasks included training a neural network, limiting the maximum speed to approximately 1 m/s, recording videos during laps, and deploying the system using GPU Coder.

== Task Description ==
# Familiarization with the existing framework.
# Optimization of the AI for the circuit in the [[Labor_Autonome_Systeme|Autonomous Systems Laboratory]] (speed, robustness).
# Optimization of the controller (e.g., PD controller).
# Use of MATLAB® to train the Jetson Nano.
# Drive the JetRacer in the right lane counterclockwise with the gamepad controller. Limit the speed via software to a maximum (e.g., 1 m/s).
# Take a video while driving a lap with MATLAB<sup>®</sup> using a MATLAB<sup>®</sup> script.
# Load the pretrained NN.
# Train the pretrained NN with MATLAB<sup>®</sup> using a MATLAB<sup>®</sup> app (GUI) by clicking the desired path in the images.
# Option: Use classic lane-tracking algorithms to teach the NN automatically.
# Write a PD controller that uses the NN to drive in the right lane. Program this in MATLAB<sup>®</sup> and run it on the JetRacer GPU using GPU Coder.
# Goal: the car should autonomously drive several laps in the right lane as fast as possible.
<!--
# Use of ROS2 to train the Jetson Nano.
# Evaluation of the advantages and disadvantages of the programming environments.
# Selection of an AI development environment.
-->
# Documentation according to scientific standards in the HSHL Wiki.


== Requirements ==
The project requires prior knowledge in the following subjects. If you do not meet the requirements, you must acquire these skills during the course of the project using literature or online courses.

* Experience with Artificial Intelligence/Deep Learning
* Programming in C++, Python
* Document versioning with SVN


== Requirements for Scientific Work ==
* Scientific approach (project plan, etc.), useful article: [[Gantt-Diagram|Create a Gantt diagram]]
* Weekly progress reports (informative), update the [[Discussion:JetRacer:_Lane_Following_with_Artificial_Intelligence|Meeting Minutes]] – live discussion with Prof. Schneider
* Project presentation in the Wiki
* Daily backup of project results in SVN
* Daily documentation of working hours
* [[Student_Projects_with_Prof._Schneider|Student Projects with Prof. Schneider]]
* [[Requirements_for_Scientific_Work|Requirements for Scientific Work]]


== Theoretical Background ==
The NVIDIA® Jetson Nano™ is a small, powerful computing platform ideal for embedded AI projects. It allows the execution of real-time video analysis, neural network inference, and hardware-accelerated image processing using tools like MATLAB® and GPU Coder.


Classical lane detection algorithms and steering control (e.g., PD controllers) are fundamental in autonomous driving and were combined with neural networks to improve the JetRacer’s decision-making.


== Experiment Preparation ==

=== Materials and Setup ===


* Jetson Nano Developer Kit
* USB Gamepad
* MATLAB® R2020b (compatible with CUDA 10.2) [https://de.mathworks.com/support/requirements/supported-compilers.html]
* Wi-Fi connection between Jetson Nano and host PC


[[File:Camera-setup.png|frame|Fig. 2: Camera setup for image recording during driving]]


=== Experiment Design and Requirements ===
* First, MATLAB® was connected to the JetRacer according to the [https://de.mathworks.com/help/coder/nvidia/ref/jetson.html#mw_82bd63bc-9f39-4bf7-8b06-794227589348 MATLAB documentation], which describes the full setup process.
* To keep the modeling as simple as possible, the implementation follows ideas shown in this paper: [https://www.researchgate.net/publication/289116142_Implementation_and_evaluation_of_image_processing_techniques_on_a_vision_navigation_line_follower_robot Implementation and evaluation of image processing techniques on a vision navigation line follower robot].
* An outline of the line-following algorithm:
[[File:Model diagram.png|frame|Outline of the line-following algorithm]]


* MATLAB® was used to connect to the Jetson Nano via SSH.
* The Gamepad receiver had to be physically connected to the Jetson Nano.
* MATLAB® controlled the Gamepad indirectly using SSH commands to access libraries on the Jetson.
* Data such as images and steering values were collected and transferred via Wi-Fi.


[[File:Matlab ssh code.jpg|frame|Fig. 3: SSH code snippet to connect MATLAB and the JetRacer]]
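The SSH-based connection described above can be sketched in MATLAB® as follows. This is a minimal sketch: the IP address, user name, password, and the script name <code>gamepad_reader.py</code> are placeholders, and the <code>jetson</code> function requires the MATLAB Coder Support Package for NVIDIA Jetson platforms.

```matlab
% Connect to the Jetson Nano over Wi-Fi (placeholder credentials).
% Requires the MATLAB Coder Support Package for NVIDIA Jetson platforms.
hwobj = jetson('192.168.1.10', 'jetracer', 'jetracer');

% Run a shell command on the Jetson in the background via SSH, e.g. to
% start a gamepad reader (hypothetical script) without blocking MATLAB.
system(hwobj, 'nohup python3 /home/jetracer/gamepad_reader.py > /dev/null 2>&1 &');
```

Running the command with <code>nohup … &</code> is what allows the gamepad libraries to keep running in the background while MATLAB® continues to collect data.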


== Experiment Procedure ==


=== Step 1: System Connection and Gamepad Control ===
Since MATLAB® could not directly access the Gamepad receiver, an SSH command method was developed to control the Jetson Nano libraries from the background without delay or signal loss.


=== Step 2: Data Collection ===
* Data was collected using the integrated camera at 256×256 pixel resolution.
* Real-time steering commands were logged and synchronized with images.
* To reduce image distortion, a camera calibration was performed; the resulting parameters are stored in calibration256by256.mat.
[[File:Camera_no_calibration.png|frame|Fig. 4: Camera image without calibration]]
[[File:Camera_with_calibration.png|frame|Fig. 5: Camera image with calibration]]
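The acquisition and calibration step can be sketched as follows. The credentials, the camera name <code>'vi1'</code>, and the variable name <code>cameraParams</code> inside calibration256by256.mat are assumptions; <code>undistortImage</code> comes from the Computer Vision Toolbox.

```matlab
% Acquire a 256x256 frame from the JetRacer camera (Jetson support package).
hwobj = jetson('192.168.1.10', 'jetracer', 'jetracer');  % placeholder credentials
cam   = camera(hwobj, 'vi1', [256 256]);
img   = snapshot(cam);

% Undistort the frame with the stored calibration parameters.
% 'cameraParams' is the assumed variable name inside the MAT file.
S   = load('calibration256by256.mat');
img = undistortImage(img, S.cameraParams);
```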


=== Step 3: Neural Network Training ===
* For the camera configuration, the JetRacer camera is detected and a video input object is created to set parameters such as the frame rate and resolution; image acquisition and video labeling can then start. [https://de.mathworks.com/help/supportpkg/usbwebcams/ug/webcam.html]
* Initially, a pretrained network with the following parameters was used on RGB images, but the results were poor:
[[File:pretrained_NN.png|frame|Parameters of the pretrained NN]]
* Regression methods with the following parameters were also tried but showed poor performance:
[[File:Regression_NN.png|frame|Parameters of the regression NN]]
* In the classical line-following pipeline, each frame captured from the track is converted to grayscale; edge detection (Canny) and thresholding are then applied to locate the line, and the loop repeats for every new frame.
[[File:Image-processing.png|frame|Image-processing pipeline for line following]]
* The project therefore switched to '''classification''' using '''discrete steering values''' with the following parameters:
[[File:Classification_NN.png|frame|Parameters of the classification NN]]
* MATLAB<sup>®</sup> apps (GUI) were used to manually annotate images by clicking the correct path.
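A minimal training sketch for the classification approach, assuming the annotated frames are stored in one subfolder per discrete steering class (folder name <code>training_data</code>, the layer sizes, and the grayscale 256×256 input are all assumptions, not the project's exact configuration):

```matlab
% Load labeled frames; folder names act as the discrete steering classes.
imds = imageDatastore('training_data', ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
[imdsTrain, imdsVal] = splitEachLabel(imds, 0.8, 'randomized');

numClasses = numel(categories(imds.Labels));
layers = [
    imageInputLayer([256 256 1])              % grayscale 256x256 input (assumed)
    convolution2dLayer(3, 16, 'Padding', 'same')
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    convolution2dLayer(3, 32, 'Padding', 'same')
    reluLayer
    fullyConnectedLayer(numClasses)           % one output per steering class
    softmaxLayer
    classificationLayer];

opts = trainingOptions('adam', 'MaxEpochs', 10, ...
    'ValidationData', imdsVal, 'Plots', 'none');
net = trainNetwork(imdsTrain, layers, opts);
```

A network trained this way can then be deployed to the Jetson Nano GPU with GPU Coder, as required by the task description.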


=== Step 4: System Optimization ===
* Lane tracking algorithms were integrated to automatically label lane positions.
* Additional features like lane angles were extracted to improve model inputs.
* Grayscale image recordings were used to achieve better generalization and faster processing.
* Classification with '''discrete steering values''' was retained, with the following parameters:
[[File:lane_tracking.png|frame|Parameters of the lane-tracking classification network]]
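The automatic labeling step can be sketched as follows, following the classical pipeline described above (grayscale conversion, Canny edge detection, thresholding). The sampled image row, the class boundaries, and the file name are illustrative assumptions:

```matlab
% Derive a lane-position label from one grayscale 256x256 frame.
img = imread('frame_0001.png');            % hypothetical recorded frame
if size(img, 3) == 3, img = rgb2gray(img); end
edges = edge(img, 'canny');                % Image Processing Toolbox

% Sample a row near the bottom of the image and take the mean edge column
% as the lane position, normalized to [-1, 1] around the image center.
row  = edges(230, :);
cols = find(row);
if isempty(cols)
    label = "straight";                    % no edge found: keep heading
else
    offset = (mean(cols) - 128) / 128;     % normalized lateral offset
    if     offset < -0.2, label = "left";
    elseif offset >  0.2, label = "right";
    else,                 label = "straight";
    end
end
```

Labels produced this way replace the manual clicking in the MATLAB® GUI and allow much larger training sets.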




== Experiment Analysis ==


=== Training Results ===
* Models trained on discrete steering values showed significantly better results than regression-based approaches.
* Grayscale recordings increased model stability.
* Calibrated images helped reduce the steering error caused by lens distortion.


=== System Performance ===
* A maximum speed limit (~1 m/s) was successfully enforced via software.
* The JetRacer could autonomously drive several laps counterclockwise in the right lane.
* However, simultaneous recording, steering, and saving caused performance bottlenecks due to the Jetson Nano’s limited resources.


=== Limitations ===
* Hardware constraints limited the maximum processing speed.
* Initial data quality was poor due to randomized manual steering input.
* Better results were obtained after switching to discrete, consistent control signals.


== Summary and Outlook ==
This project demonstrated that it is possible to optimize autonomous navigation on the JetRacer using a combination of classical control algorithms and neural networks. Despite hardware limitations, stable autonomous driving behavior was achieved.

After data collection, each captured frame is analyzed to extract the line features: the line position, the direction of the car, and a prediction of the path to follow next. From this analysis, the speed and steering commands for the PID controller are calculated.

Controlling the JetRacer with a PID controller involves several interdependent steps, such as maintaining the speed, the line, and the steering angle. The gains are first set to small values; after the error of the predicted line has been identified, the parameters are tuned to obtain good control results.
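The steering control loop described above can be sketched as follows. The gains, the control period, and the helper functions <code>getLaneError</code> and <code>setSteering</code> are hypothetical placeholders for the NN inference and the servo interface:

```matlab
% Sketch of the PD steering controller. Gains are assumed starting values
% that would be tuned experimentally on the track.
Kp = 0.8;  Kd = 0.15;                 % assumed proportional/derivative gains
dt = 0.05;                            % assumed control period in seconds
prevError = 0;

for k = 1:1000                        % control loop, one iteration per frame
    e = getLaneError();               % hypothetical helper: NN output -> lateral error
    steering = Kp * e + Kd * (e - prevError) / dt;
    steering = max(-1, min(1, steering));   % clamp to the servo range
    setSteering(steering);            % hypothetical helper writing to the servo
    prevError = e;
end
```

The derivative term damps oscillation around the lane center, which is what allows a higher speed limit without losing robustness.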
Future improvements could include:
* Expanding the dataset with more diverse environmental conditions.
* Upgrading the hardware to a Jetson Xavier NX for better real-time performance.
* Improving automated labeling and refining the PD controller parameters for faster driving without loss of robustness.


The structure of the model is divided into several parts:
* <code>main.m</code>: contains the main loop and integrates the other modules.
* <code>camera.m</code>: contains the code for camera initialization.
* <code>processimage.m</code>: handles image processing and finds the line to follow.
* <code>PID_controller.m</code>: manages the PID control logic.
* <code>adjustSteering.m</code>: contains the code to adjust the JetRacer's steering.

== SVN Repository ==


* [https://svn.hshl.de/svn/HSHL_Projekte/trunk/JetRacer/ SVN Repository]

== Possible Follow-up Topics ==
* Intersection detection
* Right-of-way detection
* Obstacle detection and avoidance
* Traffic sign recognition

== Getting Started ==
To get started, read the following articles:
* Siddiquy, T.: ''[[Automated lane following of a Waveshare JetRacer with artificial intelligence]]''. Bachelor's Thesis
* Kamal, A.: ''[[JetRacer: Optimization of Lane Following]]''. Project Work
* [[Gantt-Diagram|Create a Gantt Diagram]]
* [[Writing_Wiki_Articles | Tips for Writing a Wiki Article]]
* [[Software_Planning|Introduction to PAP Designer]]
* [[Software_Version_Management_with_SVN|Introduction to SVN]]
 
== [[Step by Step How to Program Jetson Nano Using Matlab]] ==
 
== Useful Articles ==


* [https://de.mathworks.com/videos/matlab-and-simulink-robotics-arena-deep-learning-with-nvidia-jetson-and-ros--1542015526909.html Deep Learning with MATLAB, NVIDIA Jetson, and ROS]
* [https://github.com/NVIDIA-AI-IOT/jetracer NVIDIA: JetRacer]
* [https://www.formulaedge.org/what-is-jetracer formulaedge.org]
* [https://www.waveshare.com/wiki/JetRacer_AI_Kit Waveshare Wiki]
* [https://de.mathworks.com/matlabcentral/answers/2072946-matlab-to-control-jetracer-jetson-nano-tx1-motor/?s_tid=ans_lp_feed_leaf FAQ: MATLAB to control JetRacer (Jetson Nano TX1) motor]
* [https://arxiv.org/html/2404.06229v1 Towards Autonomous Driving with Small-Scale Cars: A Survey of Recent Development]
* [https://www.mathworks.com/help/ros/ug/lane-detection-in-ros2-using-deep-learning-with-matlab.html Lane Detection in ROS 2 Using Deep Learning with MATLAB]
* [https://www.mathworks.com/videos/machine-learning-with-simulink-and-nvidia-jetson-1653572407665.html Machine Learning with Simulink and NVIDIA Jetson]
* [https://www.researchgate.net/publication/365940001_Comparative_Transfer_Learning_Models_for_End-to-End_Self-Driving_Car Comparative Transfer Learning Models for End-to-End Self-Driving Car]
* [https://www.researchgate.net/publication/349219294_Multi-task_deep_learning_with_optical_flow_features_for_self-driving_cars Multi-task Deep Learning with Optical Flow Features for Self-Driving Cars]
* [https://www.mathworks.com/help/reinforcement-learning/ug/train-dqn-agent-for-lane-keeping-assist.html Train DQN Agent for Lane Keeping Assist]
* [https://www.researchgate.net/publication/372281750 Real-Time End-to-End Self-Driving Car Navigation]
* [https://developer.nvidia.com/embedded/learn/jetson-ai-certification-programs Jetson AI Courses and Certifications]
* [https://forums.developer.nvidia.com/ NVIDIA Developer Forums]
* [[ChatGPT Guide for a Lane Following Algorithm]]


== Literature ==
S<span style="font-variant:small-caps">chreiber</span>, C.: ''AI-Supported "Follow-Me" Function Using the JetRacer as an Example''. Mittweida, Hochschule Mittweida – University of Applied Sciences, Faculty of Engineering Sciences, Master's Thesis, 2023. URL: [https://monami.hs-mittweida.de/frontdoor/deliver/index/docId/14754/file/MA_55114_Christian-Schreiber_geschwaerzt.pdf]


----
back to the main article: [[Student_Projects|Student Projects]]
