Journal of Harbin Institute of Technology, 2019, Vol. 51, Issue (1): 178-183. DOI: 10.11918/j.issn.0367-6234.201803093

### Cite this article

ZHENG Jingxiang, CAO Bo, BI Shusheng, YANG Dongsheng. Visual target tracking based on dynamic fuzzy-model control[J]. Journal of Harbin Institute of Technology, 2019, 51(1): 178-183. DOI: 10.11918/j.issn.0367-6234.201803093.


Visual target tracking based on dynamic fuzzy-model control
ZHENG Jingxiang, CAO Bo, BI Shusheng, YANG Dongsheng
School of Mechanical Engineering and Automation, Beijing University of Aeronautics and Astronautics, Beijing 100191, China
Abstract: In a mobile robot's visual target-following system, angle control based on a traditional linear control law cannot meet the demands of fast, efficient response, so the target is easily lost. To address this problem, a visual following method based on dynamic T-S fuzzy control is proposed. The HOG algorithm is used to detect the target, and the target position vector is obtained from the camera model. On the basis of T-S fuzzy control, dynamic processing is applied to further improve the response speed of angular-error convergence. MATLAB simulation shows that the angle error converges in less than 0.4 s; the improved fuzzy control method therefore effectively speeds up the angle-error response, shortens the convergence time, and gives the following system better rapidity and adaptability. In experiments on the mobile robot platform, the angle error converges in less than 0.5 s.
Keywords: visual inspection; HOG feature; T-S fuzzy; wheeled mobile robot; target tracking

1 Following model based on pedestrian detection

 $\frac{f}{y} = \frac{{{k_c}{k_h}\Delta h}}{H} = \frac{{{k_c}\Delta \delta }}{x}.$ (1)
 Figure 1 Parameters of human detection

 $\left\{ \begin{array}{l} x = \frac{{H\Delta \delta }}{{{k_h}\Delta h}},\\ y = \frac{{fH}}{{{k_c}{k_h}\Delta h}}. \end{array} \right.$ (2)

 $\left( {\begin{array}{*{20}{c}} \rho \\ \alpha \end{array}} \right) = \left( {\begin{array}{*{20}{c}} {\sqrt {{x^2} + {y^2}} }\\ {\arctan \frac{x}{y}} \end{array}} \right) = \left( {\begin{array}{*{20}{c}} {\frac{H}{{{k_h}\Delta h}}\sqrt {\Delta {\delta ^2} + {{\left( {\frac{f}{{{k_c}}}} \right)}^2}} }\\ {\arctan \frac{{{k_c}\Delta \delta }}{f}} \end{array}} \right).$ (3)
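As a sketch, Eqs. (1)-(3) can be evaluated directly to recover the target's range ρ and bearing α from the pixel measurements; the function below is illustrative, with the camera parameters f, k_c, k_h and the target height H supplied by calibration (values not taken from the paper).

```python
import math

def target_polar(delta, delta_h, f, k_c, k_h, H):
    """Eqs. (1)-(3): recover range rho and bearing alpha from the
    pixel offset delta (the Δδ of Eq. (1)) and the pixel height
    delta_h (Δh), given focal length f, scale factors k_c, k_h,
    and the known target height H."""
    x = H * delta / (k_h * delta_h)        # lateral offset, Eq. (2)
    y = f * H / (k_c * k_h * delta_h)      # forward distance, Eq. (2)
    rho = math.hypot(x, y)                 # range, Eq. (3)
    alpha = math.atan(k_c * delta / f)     # bearing, Eq. (3)
    return rho, alpha
```

Note that α depends only on the horizontal pixel offset Δδ, which is why the angle loop can be closed directly in image space.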

 $\left( {\begin{array}{*{20}{c}} {{v_x}}\\ {{v_y}}\\ \omega \end{array}} \right) = \frac{1}{{\chi L}}\left( {\begin{array}{*{20}{c}} {\frac{{\chi L}}{2}}&{\frac{{\chi L}}{2}}\\ 0&0\\ { - 1}&1 \end{array}} \right)\left( {\begin{array}{*{20}{c}} {{v_l}}\\ {{v_r}} \end{array}} \right).$ (4)
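A minimal sketch of the differential-drive kinematics in Eq. (4), mapping the wheel speeds (v_l, v_r) to the body-frame velocity; χ and L (a scale factor and the wheel base) are taken as given parameters.

```python
def body_velocity(v_l, v_r, chi, L):
    """Differential-drive kinematics, Eq. (4): left/right wheel
    speeds to body-frame forward speed v_x, lateral speed v_y
    (zero under the no-slip assumption), and yaw rate omega."""
    v_x = (v_l + v_r) / 2.0
    v_y = 0.0
    omega = (v_r - v_l) / (chi * L)
    return v_x, v_y, omega
```

Equal wheel speeds give pure translation; opposite speeds give pure rotation about the axle midpoint.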

 $\left[ {\begin{array}{*{20}{c}} {\dot \rho }\\ {\dot \alpha }\\ {\dot \beta } \end{array}} \right] = \left[ {\begin{array}{*{20}{c}} { - \cos \alpha }&0\\ {\frac{{\sin \alpha }}{\rho }}&{ - 1}\\ { - \frac{{\sin \alpha }}{\rho }}&0 \end{array}} \right]\left[ {\begin{array}{*{20}{c}} v\\ \omega \end{array}} \right].$ (5)
 Figure 2 Dynamic target tracking model

 $\left\{ \begin{array}{l} v = {k_\rho }\rho ,\\ \omega = {k_\alpha }\alpha + {k_\beta }\beta . \end{array} \right.$ (6)

 $\left[ {\begin{array}{*{20}{c}} {\dot \rho }\\ {\dot \alpha }\\ {\dot \beta } \end{array}} \right] = \left[ {\begin{array}{*{20}{c}} { - {k_\rho }\rho \cos \alpha }\\ {{k_\rho }\sin \alpha - {k_\alpha }\alpha - {k_\beta }\beta }\\ { - {k_\rho }\sin \alpha } \end{array}} \right].$ (7)
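The convergence implied by Eq. (7) can be checked numerically. The sketch below integrates the closed-loop dynamics with forward Euler, using illustrative gains that satisfy the usual stability conditions k_ρ > 0, k_β < 0, k_α > k_ρ; the specific gain and initial-error values are not from the paper.

```python
import math

# Forward-Euler integration of the closed-loop polar dynamics,
# Eq. (7), under the linear law of Eq. (6). Gains are illustrative.
k_rho, k_alpha, k_beta = 1.0, 3.0, -0.5
rho, alpha, beta = 2.0, 0.6, -0.3      # initial pose error
dt = 0.001
for _ in range(20000):                 # simulate 20 s
    d_rho = -k_rho * rho * math.cos(alpha)
    d_alpha = k_rho * math.sin(alpha) - k_alpha * alpha - k_beta * beta
    d_beta = -k_rho * math.sin(alpha)
    rho += dt * d_rho
    alpha += dt * d_alpha
    beta += dt * d_beta
print(rho, alpha, beta)  # all three errors decay toward zero
```

The linearized (α, β) subsystem has trace k_ρ − k_α < 0 and determinant −k_β k_ρ > 0 under these conditions, which is what guarantees the decay observed in the simulation.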
2 Dynamic T-S fuzzy control

T-S fuzzy system, rule i: IF z1(t) is Mi1, ⋯, and zp(t) is Mip, THEN

 $\mathit{\boldsymbol{\dot y}}\left( t \right) = {\mathit{\boldsymbol{A}}_i}y\left( t \right) + {\mathit{\boldsymbol{B}}_i}\mathit{\boldsymbol{u}}\left( t \right),i = 1,2, \cdots ,r.$ (8)

 $\mathit{\boldsymbol{\dot y}} = \frac{{\sum\limits_{i = 1}^r {{w_i}\left( {\mathit{\boldsymbol{z}}\left( t \right)} \right)\left[ {{\mathit{\boldsymbol{A}}_i}y\left( t \right) + {\mathit{\boldsymbol{B}}_i}\mathit{\boldsymbol{u}}\left( t \right)} \right]} }}{{\sum\limits_{i = 1}^r {{w_i}\left( {\mathit{\boldsymbol{z}}\left( t \right)} \right)} }},$ (9)
 ${w_i}\left( {z\left( t \right)} \right) = \prod\limits_{j = 1}^p {{M_{ij}}\left( {{z_j}\left( t \right)} \right)} ,$ (10)
 ${\mu _i}\left( {z\left( t \right)} \right) = \frac{{{w_i}\left( {z\left( t \right)} \right)}}{{\sum\limits_{i = 1}^r {{w_i}\left( {z\left( t \right)} \right)} }}.$ (11)
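Eqs. (10)-(11) amount to a product-and-normalize step; a minimal sketch, where grades[i][j] stands for the membership grade M_ij(z_j(t)):

```python
from math import prod

def rule_weights(grades):
    """Eq. (10): w_i is the product of rule i's membership grades.
    Eq. (11): mu_i normalizes the w_i so they sum to one."""
    w = [prod(g) for g in grades]
    total = sum(w)
    mu = [w_i / total for w_i in w]
    return w, mu
```

The normalized μ_i are the blending coefficients used in Eq. (9) to mix the local linear models.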

 $\left\{ \begin{array}{l} {s_2} \cdot \alpha \le \sin \alpha \le {s_1} \cdot \alpha ,\;0 \le \alpha < \pi /2,\\ {s_1} \cdot \alpha \le \sin \alpha \le {s_2} \cdot \alpha ,\; - \pi /2 \le \alpha < 0. \end{array} \right.$ (12)
 ${c_2} \le \cos \alpha \le {c_1},\left| \alpha \right| < \pi /2.$ (13)

When α satisfies Eqs. (12) and (13), the nonlinear system model can be expressed as

 $\left\{ \begin{array}{l} {n_1} = \cos \alpha = {\rm{T}}{{\rm{C}}_1}\left( {{n_1}} \right) \cdot {c_1} + {\rm{T}}{{\rm{C}}_2}\left( {{n_1}} \right) \cdot {c_2},\\ {n_2} = \sin \alpha = {\rm{T}}{{\rm{S}}_1}\left( {{n_2}} \right) \cdot {s_1} + {\rm{T}}{{\rm{S}}_2}\left( {{n_2}} \right) \cdot {s_2}. \end{array} \right.$ (14)
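One way to realize the membership functions of Eq. (14) is shown below. This is a sketch under my reading of the sector bounds in Eqs. (12)-(13): s1, s2 are treated as slopes, so sin α = (TS1·s1 + TS2·s2)·α, and the default bounds c1 = 1, c2 = 0, s1 = 1, s2 = 2/π cover |α| < π/2; the paper's exact sector constants may differ.

```python
import math

def sector_memberships(alpha, c1=1.0, c2=0.0, s1=1.0, s2=2.0 / math.pi):
    """Membership grades for the sector-bounded nonlinearities of
    Eqs. (12)-(14): cos(alpha) = TC1*c1 + TC2*c2 and
    sin(alpha) = (TS1*s1 + TS2*s2)*alpha, with TC1+TC2 = TS1+TS2 = 1."""
    tc1 = (math.cos(alpha) - c2) / (c1 - c2)
    tc2 = 1.0 - tc1
    sinc = math.sin(alpha) / alpha if alpha != 0.0 else 1.0
    ts1 = (sinc - s2) / (s1 - s2)
    ts2 = 1.0 - ts1
    return tc1, tc2, ts1, ts2
```

By construction the weighted combination reproduces cos α and sin α exactly at every α in the sector, which is what lets the exact nonlinear model be written as the fuzzy blend of Eq. (16).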

 ${\mathit{\boldsymbol{A}}_i} = \left[ {\begin{array}{*{20}{c}} { - {c_u}{k_\rho }}&0&0\\ 0&{{s_v}{k_\rho } - {k_\alpha }}&{ - {k_\beta }}\\ 0&{ - {s_v}{k_\rho }}&0 \end{array}} \right]$ (15)

 $\mathit{\boldsymbol{\dot x}} = \sum\limits_{i = 1}^4 {{\mu _i}{\mathit{\boldsymbol{A}}_i}\mathit{\boldsymbol{x}}}$ (16)
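Eq. (16) blends the four local models A_i with the normalized memberships μ_i; a pure-Python sketch of the blended state derivative:

```python
def blended_derivative(mu, A_list, x):
    """Eq. (16): x_dot = sum_i mu_i * A_i * x, blending the local
    linear models A_i with the normalized memberships mu_i."""
    n = len(x)
    return [sum(mu[i] * A_list[i][r][c] * x[c]
                for i in range(len(A_list)) for c in range(n))
            for r in range(n)]
```

Because the μ_i sum to one, the blend is a convex combination of the local dynamics at every instant.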

 Figure 3 Function values under different parameters

 Figure 4 Comparison of improved T-S fuzzy control

 $\left( \begin{array}{l} v\\ \omega \end{array} \right) = \left( \begin{array}{l} \frac{{{\eta _\rho }H}}{{{k_h}\Delta h}}\left( {\frac{{{k_c}\Delta {\delta ^2}}}{f} + \frac{f}{{{k_c}}}} \right)\\ \frac{{{\eta _\rho }{k_c}\Delta \delta }}{f} + {\eta _\alpha }\arctan \frac{{{k_c}\Delta \delta }}{f} \end{array} \right).$ (17)
3 Simulation and experiment

3.1 MATLAB software simulation

 Figure 5 Linear trajectory tracking simulation

 Figure 6 Circular trajectory tracking simulation

3.2 Dynamic target following experiment

 Figure 7 Falconbot mobile robot platform

 Figure 8 Photo of the parameter-determination experiment

 $\frac{f}{{{k_c}}} = \frac{{y\Delta \delta }}{x},$
 $\frac{f}{{{k_c}}} = \frac{{y{k_h}\Delta h}}{H}.$

 $\left( \begin{array}{l} v\\ \omega \end{array} \right) = \left( \begin{array}{l} \frac{{2.18{\eta _\rho }}}{{\Delta h}}\sqrt {\frac{{\Delta {\delta ^4}}}{{{{10}^6}}} + 1.3\Delta {\delta ^2} + 3 \times {{10}^5}} \\ \frac{{{\eta _\rho }\Delta \delta }}{{1\;050}} + {\eta _\alpha }\arctan \frac{{\Delta \delta }}{{1\;050}} \end{array} \right),$
 $\left\{ \begin{array}{l} {\eta _\rho } = \sum\limits_u {\sum\limits_v {{c_u}{k_\rho }T{C_u} \cdot T{S_v}} } ,\\ {\eta _\alpha } = \sum\limits_u {\sum\limits_v {\left[ {\gamma \left( {a\alpha + \frac{b}{\alpha } + c} \right) - {s_v}{k_\rho }} \right]T{C_u} \cdot T{S_v}} } . \end{array} \right.$
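A numerical sketch of the calibrated control law above. The constants 2.18 and 1 050 and the radicand coefficients come from the parameter-determination experiment (with f/k_c = 1 050); the gains η_ρ and η_α passed in here are illustrative placeholders, not values from the paper.

```python
import math

def control_outputs(delta, delta_h, eta_rho, eta_alpha):
    """Evaluate the calibrated law: linear speed v from the range
    term and yaw rate omega from the bearing term, with the
    calibration constant f/k_c = 1050."""
    v = (2.18 * eta_rho / delta_h) * math.sqrt(
        delta**4 / 1e6 + 1.3 * delta**2 + 3e5)
    omega = eta_rho * delta / 1050.0 + eta_alpha * math.atan(delta / 1050.0)
    return v, omega
```

For a centered target (Δδ = 0) the yaw command vanishes and only the forward-speed term, driven by the pixel height Δh, remains.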

 Figure 9 Trajectory of the contrast experiment

 Figure 10 Photo of the visual following experiment
 Figure 11 Comparison of angle-error changes during following

4 Conclusion
