Title Page
Abstract (Korean)
Table of Contents
1. Introduction 9
1.1. Research Background 9
1.2. Research Objectives 11
1.3. Literature Review 12
2. Materials and Methods 14
2.1. Autonomous Driving Robot Platform for Greenhouses 14
2.1.1. Operating Environment 14
2.1.2. System Configuration 15
2.1.3. Kinematics 22
2.2. Working Path Detection and Posture Correction Algorithms 25
2.2.1. Y-Correction 25
2.2.2. Yaw-Correction 29
2.2.3. X-Correction 32
2.3. Driving Algorithm 37
2.4. Experimental Design for Field Evaluation 41
2.4.1. Overview 41
2.4.2. Verification of the Full-Path Driving Scenario 42
3. Results and Discussion 45
3.1. Working Path Detection and Robot Posture Correction 45
3.1.1. Analysis of Y-Correction Results 45
3.1.2. Analysis of X-Correction Results 47
3.1.3. Analysis of Yaw-Correction Results 48
3.2. Analysis of Full-Path Driving Results 50
4. Conclusion 52
References 54
Abstract (English) 58
Table 2-1. Specifications of the depth camera 17
Table 2-2. Specifications of the proximity sensor 18
Table 2-3. Specifications of the 1D LiDAR 19
Table 2-4. Specifications of the main controller (SBC) 20
Table 2-5. Specifications of the autonomous greenhouse robot 21
Table 3-1. RMSE of robot position error relative to the rail 46
Table 3-2. RMSE of robot stop-position error relative to the ground boundary 48
Table 3-3. RMSE of robot rotation error relative to the rail 50
Fig. 1-1. (a) Different distances between rails, (b) Robot posture when exiting the rail, (c) Rails deviated with a lateral shift 11
Fig. 2-1. Operating environment of greenhouse robot 14
Fig. 2-2. Double wheel structure of greenhouse robot 15
Fig. 2-3. Moving direction of mecanum wheeled mobile robot 16
Fig. 2-4. Entering the next working path by wheel type: (a) general robot wheels, (b) mecanum wheels 17
Fig. 2-5. View of autonomous greenhouse robot 21
Fig. 2-6. Kinematics of mecanum wheeled mobile robot 22
Fig. 2-7. Y-axis driving of greenhouse robot 26
Fig. 2-8. Problems with RGB-based rail detection: (a) Fallen leaves or branches around the rail (daytime), (b) Paint peeled off the rail -... 26
Fig. 2-9. Comparison of RGB and Depth images for rails 27
Fig. 2-10. Rail detection algorithm for Y-Correction 28
Fig. 2-11. Correction information for Y-Correction algorithm 29
Fig. 2-12. Rotational posture correction of greenhouse robot 29
Fig. 2-13. Vanishing point detection algorithm for Yaw-Correction 30
Fig. 2-14. Correlation between vanishing point and robot posture: (a) When the rail and the robot are parallel, (b) When the robot rotates to the left... 31
Fig. 2-15. Correction information for Yaw-Correction algorithm 32
Fig. 2-16. Adjusting the X-axis distance of the greenhouse robot 32
Fig. 2-17. Problems with RGB-based ground boundary detection: (a) When the boundary is covered by plants (b) When the boundary is invisible 33
Fig. 2-18. Depth difference between working path area and ground area 33
Fig. 2-19. Correction information for X-Correction algorithm 36
Fig. 2-20. Full path for greenhouse environment driving 37
Fig. 2-21. System node diagram based on ROS 2 38
Fig. 2-22. Flowchart of full path driving algorithm for greenhouse robot 40
Fig. 2-23. Test field and greenhouse robot for evaluation 41
Fig. 2-24. Robot position error measurement criteria for rail 42
Fig. 2-25. Robot stop position error measurement criteria for ground boundary 43
Fig. 2-26. Robot rotation error measurement criteria for rail 44
Fig. 3-1. Changes in cxerror by Y-Correction 45
Fig. 3-2. Results for Y-Correction 46
Fig. 3-3. Depth image comparison for rail: (a) Daytime (b) Nighttime 47
Fig. 3-4. Results for X-Correction 48
Fig. 3-5. Changes in vxerror by Yaw-Correction 49
Fig. 3-6. Results for Yaw-Correction: (a) Before (b) After 50