National Kaohsiung University of Science and Technology
Department of Mechanical Engineering
Doctoral Dissertation

Development of machine vision systems on object classification and measurement for robot manipulation

Graduate student: Ngo Ngoc Vu
Advisor: Prof. Quang-Cherng Hsu

A Dissertation Submitted to the Department of Mechanical Engineering, National Kaohsiung University of Science and Technology, in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Mechanical Engineering
January 2019, Kaohsiung, Taiwan, Republic of China

ABSTRACT

This research presents the development of machine vision systems for object classification and measurement for robot manipulation. First, a machine vision system for the automatic classification and measurement of metal parts was developed under different lighting conditions and applied to the operation of a robot arm with six degrees of freedom (DOF). To obtain accurate positioning information, the overall image is captured by a CMOS camera mounted above the working platform. The effects of back-lighting and front-lighting conditions on the proposed system were investigated. Under front lighting, four different conditions were tested, and for each condition, global and local threshold operations were used to obtain good image quality. The relationship between the image coordinates and the world coordinates was determined through Zhang's method, a linear transformation, and a quadratic transformation during the calibration process.

Experimental results show that in a back-lighting environment the image quality is improved, so the positions of the object centers are more accurate than in a front-lighting environment. According to the calibration results, the quadratic transformation is more accurate than the other methods. Using the quadratic transformation, the maximum positive calibration deviation is 0.48 mm and 0.38 mm in the X and Y directions, respectively, and the maximum negative deviation is -0.34 mm and -0.43 mm. The proposed system is effective, robust, and can be valuable to industry.
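The abstract names the quadratic transformation used to relate image coordinates to world coordinates during 2-D calibration but does not reproduce its formulation here. As a minimal sketch only, assuming corresponding image points (u, v) and world points (X, Y) collected from a calibration board, such a second-order mapping could be fitted by least squares as follows; the function names and sample points are illustrative and are not taken from the dissertation.

```python
import numpy as np

def fit_quadratic_transform(img_pts, world_pts):
    """Fit a second-order (quadratic) mapping from image (u, v) to world (X, Y).

    img_pts and world_pts are (N, 2) arrays of corresponding calibration
    points; N >= 6 so the six coefficients per axis are determined.
    """
    u, v = img_pts[:, 0], img_pts[:, 1]
    # Design matrix with the six quadratic terms: 1, u, v, u^2, u*v, v^2
    A = np.column_stack([np.ones_like(u), u, v, u**2, u * v, v**2])
    coeff_x, *_ = np.linalg.lstsq(A, world_pts[:, 0], rcond=None)
    coeff_y, *_ = np.linalg.lstsq(A, world_pts[:, 1], rcond=None)
    return coeff_x, coeff_y

def apply_quadratic_transform(coeff_x, coeff_y, uv):
    """Map one image point (u, v) to world coordinates (X, Y)."""
    u, v = uv
    terms = np.array([1.0, u, v, u**2, u * v, v**2])
    return float(terms @ coeff_x), float(terms @ coeff_y)

if __name__ == "__main__":
    # Hypothetical detected grid corners (pixels) and their known board
    # positions (mm); real values would come from the calibration images.
    img_pts = np.array([[102, 98], [412, 101], [720, 105],
                        [100, 405], [410, 408], [718, 412]], dtype=float)
    world_pts = np.array([[0, 0], [50, 0], [100, 0],
                          [0, 50], [50, 50], [100, 50]], dtype=float)
    cx, cy = fit_quadratic_transform(img_pts, world_pts)
    print(apply_quadratic_transform(cx, cy, (410.0, 408.0)))  # ~ (50.0, 50.0)
```

Because the quadratic terms can absorb mild lens distortion and perspective foreshortening over a flat working plane, a fit of this form can reduce residual deviation compared with a purely linear mapping, which is consistent with the calibration comparison summarized above.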
Second, a machine vision system for color object classification and measurement was developed for a robot arm with six degrees of freedom (DOF). To obtain accurate positioning information, the overall image is captured by two cameras, a C615 and a C525, mounted above the working platform. The relationship between the image coordinates and the world coordinates is established through a calibration procedure: a quadratic transformation and a generalized perspective transformation are used to transform coordinates in the 2-D and 3-D calibration processes, respectively. According to the calibration results, for the 2-D calibration the maximum positive deviation is 1.29 mm and 1.12 mm in the X and Y directions, respectively, and the maximum negative deviation is -1.48 mm and -0.97 mm. For the 3-D calibration, the deviation is 0.07 mm, -0.418 mm, and -0.063 mm in the X, Y, and Z directions, respectively. The proposed system can obtain the three-dimensional coordinates of an object and perform automatic classification and assembly operations using the data from the visual recognition system.

Keywords: machine vision, robot arm, camera calibration, image analysis, object recognition, lighting source.

ACKNOWLEDGMENTS

The fulfillment of over three years of study at National Kaohsiung University of Science and Technology (NKUST) has brought me into closer relations with many enthusiastic people who wholeheartedly devoted their time, energy, and support to help me during my studies. This is my opportunity to acknowledge my great debt of thanks to them.

I wish to express my thanks and gratitude to my academic supervisor, Prof. Dr. Quang-Cherng Hsu, for his continuous guidance, valuable advice, and helpful support during my studies. He has always been supportive of my research work and gave me the freedom to fully explore the different research areas related to my study.

I wish to acknowledge my deepest thanks to the Vietnam Ministry of Education and the Taiwan Ministry of Education for giving me a great opportunity and the necessary scholarships to study at NKUST through the VEST500 scholarship, a cooperation between the Vietnamese and Taiwanese governments, and for their enthusiastic help during my time at NKUST. I am also particularly grateful to Thai Nguyen University of Technology (TNUT) for providing me with unflagging encouragement, continuous help, and support to complete this course.

My gratitude also goes to all of the teachers, the Dean, and the staff of the Department of Mechanical Engineering at NKUST for their devoted teaching, great help, and thoughtful service during my study.

I would also like to express my sincere gratitude to all of my colleagues at the Precision and Nano Engineering Laboratory (PANEL), Department of Mechanical Engineering, NKUST. I want to express my sincere thanks to all my Vietnamese friends at NKUST for their helpful sharing and precious help over the past years. I also wish to express my gratitude to all those who directly or indirectly helped me during my study at NKUST.

Finally, my special thanks to my dad Ngo The Long and my mom Vu Thi Hai, to my older sister Ngo Thi Phuong, to my adorable wife Duong Thi Huong Lien, and to my two lovely little daughters Ngo Duong Anh Thu and Ngo Phuong Linh, who have been my greatest motivation over these years in Taiwan!
CONTENTS

中文摘要 (Chinese Abstract)
ABSTRACT
ACKNOWLEDGMENTS
CONTENTS
LIST OF FIGURES
LIST OF TABLES
NOMENCLATURE

Chapter 1 Introduction
1.1 Motivation of the research
1.2 Scopes of the research
1.3 Contributions
1.4 Organization of the dissertation

Chapter 2 Theory of image processing and machine vision
2.1 Image processing system
2.1.1 Basics in image processing
2.1.1.1 Pixels
2.1.1.2 Resolution of image
2.1.1.3 Gray level
2.1.1.4 Histogram
2.1.1.5 Image presentation
2.1.1.6 Color models
2.1.1.7 Neighbors of pixels
2.1.2 Morphological image processing
2.1.2.1 Erosion operation
2.1.2.2 Dilation operation
2.1.2.3 Opening and closing operations
2.1.3 Blob analysis
2.1.3.1 Goal of blob analysis
2.1.3.2 Feature extraction
2.1.3.3 Steps to perform blob analysis
2.2 Machine vision system
2.2.1 Lighting design
2.2.2 Lens
2.2.3 Image sensors

Chapter 3 Coordinate calibration methods and camera calibration
3.1 Two-dimensional coordinate calibration
3.1.1 Linear transformation
3.1.2 Quadratic transformation
3.2 Three-dimensional coordinate calibration
3.2.1 Stereo imaging
3.2.2 The generalized perspective transformation
3.2.3 The perspective transformation with camera model
3.3 Camera calibration
3.3.1 Intrinsic parameters
3.3.2 Extrinsic parameters
3.4 Lens distortions
3.4.1 Radial distortion
3.4.2 Tangential distortion
3.5 Camera calibration using the Matlab tool

Chapter 4 Development of a metal part classification and measurement system under different lighting conditions
4.1 Materials and experimental setup
4.1.1 Platform description
4.1.2 Robot arm
4.2 Research methodology
4.2.1 Image processing
4.2.2 Camera calibration
4.2.3 Coordinate calibration method
4.2.4 Resolution of the measurement system
4.2.5 Calculating the coordinate deviation
4.3 Experimental procedures
4.3.1 Illumination conditions
4.3.2 Calibration work
4.4 Image segmentation
4.4.1 Using back-lighting
4.4.2 Using front-lighting
4.5 Algorithm for object classification
4.6 Algorithm for the bolt head determination
4.7 Results and discussion
4.7.1 Calibration results
4.7.2 Coordinate deviation among different lighting conditions
4.8 Implementation

Chapter 5 Development of a color object recognition and measurement system
5.1 Platform description
5.1.1 Experimental setup
5.1.2 The implementation process of the proposed system
5.2 Calibration work
5.2.1 2-D coordinate calibration using the quadratic transformation
5.2.2 3-D coordinate calibration using the perspective transformation
5.3 Image analysis and object segmentation
5.4 Algorithm for color object classification
5.4.1 Algorithm for solid classification
5.4.2 Algorithm for hole classification
5.5 Determination of orientation for triangular holes
5.6 Results and discussion
5.6.1 2-D calibration and spatial position measurement results
5.6.2 3-D calibration and spatial position measurement results
5.6.3 Determination process for the first points of triangular holes
5.7 Implementation

Chapter 6 Conclusions and future works
6.1 Conclusions
6.1.1 Conclusion for the classification and measurement system for metal parts
6.1.2 Conclusion for the classification and measurement system for color objects
6.2 Future works

List of publications
References
Appendices

LIST OF FIGURES

Figure 1.1 Flowchart of dissertation
Figure 2.1 Flowchart of image processing system
Figure 2.2 Positions and gray-scale values within an image
Figure 2.3 Histogram between intensity and frequency
Figure 2.4 The conversion from gray-scale image into binary image
Figure 2.5 Representation of image
Figure 2.6 Relationship between pixels
Figure 2.7 Erosion operation
Figure 2.8 Dilation operation
Figure 2.9 Closing operation
Figure 2.10 Result of blob analysis
Figure 2.11 Architecture of machine vision system
Figure 2.12 Front lighting source
Figure 2.13 Back lighting source
Figure 2.14 Side lighting source
Figure 2.15 Lens model
Figure 2.16 Depth of field
Figure 2.17 Field of view
Figure 2.18 Chip size
Figure 3.1 The schematic diagram of a stereo imaging system
Figure 3.2 The specific perspective projection model
Figure 3.3 Pinhole model
Figure 3.4 The generalized perspective projection model
Figure 3.5 Converting from object to camera coordinate system
Figure 3.6 Radial distortion
Figure 3.7 Tangential distortion
Figure 3.8 Chessboard pattern
Figure 4.1 Structure of experimental system
Figure 4.2 Articulated and Cartesian coordinate system
Figure 4.3 Camera calibration using Matlab
Figure 4.4 Calibration board
Figure 4.5 Binary thresholding operation
Figure 4.6 Classification and determination of the coordinates of object centers
Figure 4.7 Setting the global threshold and binary threshold result
Figure 4.8 Setting the local threshold
Figure 4.9 Result of the local threshold process
Figure 4.10 Applying the closing function three times
Figure 4.11 Flowchart of the algorithm for the object classification process
Figure 4.12(a) Determining the head of bolts (left side)
Figure 4.12(b) Determining the head of bolts (right side)
Figure 4.13 Result of determining the head of bolts
Figure 4.14 The calibration error, using the Matlab toolbox
Figure 4.15 Diagram of calibration error of the linear transformation method
Figure 4.16 Diagram of calibration error of the quadratic transformation
Figure 4.17 Diagram of deviation percentage of bolts
Figure 4.18 Diagram of deviation percentage of washers
Figure 4.19 Diagram of deviation percentage of nuts
Figure 4.20 Flowchart of the implementation process
Figure 4.21 Model of robot arm
Figure 4.22 End effector of robot arm
Figure 4.23 Experiment result for top view
Figure 4.24 Experiment result for front view
Figure 5.1 Experimental system
Figure 5.2 Flowchart of the implementation process of the proposed system
Figure 5.3 2-D calibration board
Figure 5.4 3-D calibration pattern
Figure 5.5 Determining coordinates of the 3-D calibration part on the CMM
Figure 5.6 Flowchart of the algorithm for the shape classification process
Figure 5.7 Flowchart of the algorithm for the hole classification process
Figure 5.8 Assembly parts
Figure 5.9 Recognition results of assembly parts
Figure 5.10 Assembly models
Figure 5.11 Recognition results of assembly models
Figure 5.12 Scan line to find the first point of triangular holes
Figure 5.13 All orientations of triangular holes
Figure 5.14 Determination of the first point for triangular holes of the first world orientation
Figure 5.15 Determination of the first point for triangular holes of the second world orientation
Figure 5.16 Determination of the first point for triangular holes (left image plane)
Figure 5.17 Diagram of calibration error
Figure 5.18 The calibration system verification screen
Figure 5.19 Calibration parameters and calibration accuracy checking
Figure 5.20 Determining the orientations of triangular holes
Figure 5.21 The robot arm implemented in this work

LIST OF TABLES

Table 2.1 The type of the image forming device
Table 2.2 Advantages and shortcomings of scan modes
Table 4.1 Specifications of the CMOS camera (C910)
Table 4.2 Specifications of the robot arm
Table 4.3 Determination of bolt orientation
Table 5.1 Specifications of the CMOS camera (C525)
Table 5.2 Specifications of the CMOS camera (C615)
Table 5.3 Specifications of the CMM
Table 5.4 The world coordinates of calibration points measured by the CMM
Table 5.5 Color recognition algorithms
Table 5.6 Threshold values used in this study
Table 5.7 The world coordinates of shapes
Table 5.8 The world coordinates of holes
Table 5.9 The world coordinates of the first points of triangular holes

NOMENCLATURE

WCS    World Coordinate System
CCS    Camera Coordinate System
ICS    Image Coordinate System
CCDs   Charge-Coupled Devices
CMOS   Complementary Metal Oxide Semiconductor
2-D    Two-dimensional space
3-D    Three-dimensional space
CMM    Coordinate Measuring Machine
ATOS   Advanced Topometric Optical Sensor
SCARA  Selective Compliance Assembly Robot Arm
DOF    Degrees of Freedom
RGB    Red, Green, Blue
CMYK   Cyan, Magenta, Yellow, and Black
HSB    Hue, Saturation, and Brightness
PC     Personal Computer
DFOV   Diagonal Field of View
J      Joint