[1] Saripalli, S., Montgomery, J. F., and Sukhatme, G. S., “Visually Guided
Landing of an Unmanned Aerial Vehicle,” IEEE Transactions on
Robotics and Automation, Vol. 19, No. 3, 2003, pp. 371–380.
doi:10.1109/TRA.2003.810239
[2] Wulan, B., Jizhong, H., and Yuanming, X., “Vision-Based Unmanned
Helicopter Ship Board Landing System,” Second International
Congress on Image and Signal Processing (CISP), IEEE Publ.,
Piscataway, NJ, 2009, pp. 1–5.
doi:10.1109/cisp.2009.5305201
[3] Shakernia, O., Vidal, R., Sharp, C. S., Ma, Y., and Sastry, S., “Multiple
View Motion Estimation and Control for Landing an Unmanned Aerial
Vehicle,” Proceedings of IEEE International Conference on Robotics
and Automation (ICRA), Vol. 3, IEEE Publ., Piscataway, NJ, 2002,
pp. 2793–2798.
doi:10.1109/robot.2002.1013655
[4] Meingast, M., Geyer, C., and Sastry, S., “Vision Based Terrain Recovery
for Landing Unmanned Aerial Vehicles,” 43rd IEEE Conference on
Decision and Control (CDC), Vol. 2, IEEE Publ., Piscataway, NJ, 2004,
pp. 1670–1675.
doi:10.1109/cdc.2004.1430284
[5] Zhengpeng, Y., Zhenbang, G., Jiaqi, W., Jinbo, C., and Jinjun, R., “Real-
Time Vision-Based Guided Method for Autonomous Landing of a
Rotor-Craft Unmanned Aerial Vehicle,” IEEE International Conference
on Mechatronics and Automation, Vol. 4, IEEE Publ., Piscataway, NJ,
2005, pp. 2212–2215.
doi:10.1109/icma.2005.1626908
[6] Saripalli, S., “Vision-Based Autonomous Landing of an Helicopter on a
Moving Target,” AIAA Paper 2009-5660, 2009.
[7] Wenzel, K. E., Masselli, A., and Zell, A., “Automatic Take Off, Tracking
and Landing of a Miniature UAV on a Moving Carrier Vehicle,” Journal of
Intelligent & Robotic Systems, Vol. 61, Nos. 1–4, Oct. 2011, pp. 221–238.
doi:10.1007/s10846-010-9473-0
[8] Izzo, D., and de Croon, G., “Nonlinear Model Predictive Control
Applied to Vision-Based Spacecraft Landing,” EURO GNC 2013,
Second CEAS Specialist Conference on Guidance, Navigation &
Control, TU Delft, Delft, The Netherlands, 2013, pp. 91–107.
[9] Tandale, M. D., Bowers, R., and Valasek, J., “Trajectory Tracking
Controller for Vision-Based Probe and Drogue Autonomous Aerial
Refueling,” Journal of Guidance, Control, and Dynamics, Vol. 29,
No. 4, 2006, p. 846.
doi:10.2514/1.19694
[10] Srinivasan, M., Zhang, S., Lehrer, M., and Collett, T., “Honeybee
Navigation en Route to the Goal: Visual Flight Control and Odometry,”
Journal of Experimental Biology, Vol. 199, No. 1, 1996, pp. 237–244.
[11] Franceschini, N., Ruffier, F., and Serres, J., “Optic Flow Based
Autopilots: Speed Control and Obstacle Avoidance,” Flying Insects and
Robots, Springer-Verlag, Berlin, 2010, pp. 29–50.
[12] Izzo, D., Weiss, N., and Seidl, T., “Constant-Optic-Flow Lunar Landing:
Optimality and Guidance,” Journal of Guidance, Control, and
Dynamics, Vol. 34, No. 5, 2011, p. 1383.
doi:10.2514/1.52553
[13] Hérissé, B., Hamel, T., Mahony, R., and Russotto, F. X., “Landing a
VTOL Unmanned Aerial Vehicle on a Moving Platform Using Optical
Flow,” IEEE Transactions on Robotics, Vol. 28, No. 1, 2012, pp. 77–89.
doi:10.1109/TRO.2011.2163435
[14] Rasshofer, R. H., and Gresser, K., “Automotive Radar and Lidar
Systems for Next Generation Driver Assistance Functions,” Advances in
Radio Science, Vol. 3, May 2005, pp. 205–209.
doi:10.5194/ars-3-205-2005
[15] Ruffier, F., and Franceschini, N., “Optic Flow Regulation: The Key to
Aircraft Automatic Guidance,” Robotics and Autonomous Systems,
Vol. 50, No. 4, March 2005, pp. 177–194.
doi:10.1016/j.robot.2004.09.016
[16] Honegger, D., Meier, L., Tanskanen, P., and Pollefeys, M., “Open
Source and Open Hardware Embedded Metric Optical Flow CMOS
Camera for Indoor and Outdoor Applications,” 2013 IEEE International
Conference on Robotics and Automation (ICRA), IEEE Publ.,
Piscataway, NJ, 2013, pp. 1736–1741.
doi:10.1109/ICRA.2013.6630805
[17] Lee, D. N., “Theory of Visual Control of Braking Based on Information
About Time-to-Collision,” Perception, Vol. 5, No. 4, 1976, pp. 437–
459.
doi:10.1068/p050437
[18] Spurr, R. T., “Subjective Aspects of Braking,” Automobile Engineer,
Vol. 59, IPC Transport Press, London, England, 1969, pp. 58–61.
[19] Lee, D. N., and Reddish, P. E., “Plummeting Gannets: A Paradigm of
Ecological Optics,” Nature, Vol. 293, No. 5830, 1981, pp. 293–294.
doi:10.1038/293293a0
[20] Lee, D. N., Young, D. S., and Rewt, D., “How Do Somersaulters Land on
Their Feet?” Journal of Experimental Psychology: Human Perception
and Performance, Vol. 18, No. 4, 1992, pp. 1195–1202.
doi:10.1037/0096-1523.18.4.1195
[21] Lee, D. N., Davies, M. N. O., and Green, P. R., “Visual Control of
Velocity of Approach by Pigeons When Landing,” Journal of
Experimental Biology, Vol. 180, No. 1, 1993, pp. 85–104.
[22] Lee, D. N., Reddish, P. E., and Rand, D. T., “Aerial Docking by
Hummingbirds,” Naturwissenschaften, Vol. 78, No. 11, 1991, pp. 526–
527.
doi:10.1007/BF01131406
[23] Shimada, Y., Seto, R., and Ito, K., “Detection of the Tau-Margin and
Application to Autonomous Control of a Flying Robot,” Fifth
International Conference on Intelligent Sensors, Sensor Networks and
Information Processing (ISSNIP), 2009, pp. 103–108.
doi:10.1109/issnip.2009.5416779
[24] Miyagawa, Y., Kondo, Y., and Ito, K., “Realization of Flock Behavior by
Using Tau-Margin,” International Conference on Control Automation
and Systems (ICCAS), Institute of Electrical and Electronics Engineers,
New York, 2010, pp. 957–961.
[25] Izzo, D., and de Croon, G., “Landing with Time-to-Contact and Ventral
Optic Flow Estimates,” Journal of Guidance, Control, and Dynamics,
Vol. 35, No. 4, July 2012, pp. 1362–1367.
doi:10.2514/1.56598
[26] Xie, P., Ma, O., and Zhang, Z., “Bio-Inspired Approach for UAV
Landing and Perching,” AIAA Paper 2013-5108, 2013.
[27] Kendoul, F., “Four-Dimensional Guidance and Control of Movement
Using Time-to-Contact: Application to Automated Docking and
Landing of Unmanned Rotorcraft Systems,” International Journal of
Robotics Research, Vol. 33, No. 2, Feb. 2014, pp. 237–267.
doi:10.1177/0278364913509496
[28] Brockers, R., Susca, S., Zhu, D., and Matthies, L., “Fully Self-
Contained Vision-Aided Navigation and Landing of a Micro Air Vehicle
Independent from External Sensor Inputs,” SPIE Defense, Security, and
Sensing, Vol. 8387, SPIE, Bellingham, WA, May 2012, Paper 83870Q.
doi:10.1117/12.919278
[29] Weiss, S., Achtelik, M. W., Lynen, S., Achtelik, M. C., Kneip, L., Chli,
M., and Siegwart, R., “Monocular Vision for Long-Term Micro Aerial
Vehicle State Estimation: A Compendium,” Journal of Field Robotics,
Vol. 30, No. 5, 2013, pp. 803–831.
doi:10.1002/rob.21466
[30] Bresciani, T., Modelling, Identification and Control of a Quadrotor
Helicopter, M.S. Thesis, Lund Univ., Lund, Sweden, 2008.
[31] Stevens, B. L., and Lewis, F. L., Aircraft Control and Simulation,
2nd ed., Wiley, New York, 2003, p. 27.
[32] Wang, J. H., “Active Low-Order Fault-Tolerant State Space Self-Tuner
for the Unknown Sample-Data Linear Regular System with an Input-
Output Direct Feedthrough Term,” Applied Mathematical Sciences,
Vol. 6, No. 97, 2012, p. 4813.
[33] Shaffer, D. M., Krauchunas, S. M., Eddy, M., and McBeath, M. K.,
“How Dogs Navigate to Catch Frisbees,” Psychological Science,
Vol. 15, No. 7, 2004, pp. 437–441.
doi:10.1111/j.0956-7976.2004.00698.x
[34] Shaffer, D. M., and Gregory, T. B., “How Football Players Determine
Where to Run to Tackle Other Players: A Mathematical and
Psychological Description and Analysis,” Open Sports Sciences
Journal, Vol. 2, 2009, pp. 29–36.
doi:10.2174/1875399X00902010029
[35] Alkowatly, M. T., Becerra, V. M., and Holderbaum, W., “Estimation of
Visual Motion Parameters Used for Bio-Inspired Navigation,” Journal
of Image and Graphics, Vol. 1, No. 3, 2013, pp. 120–124.
doi:10.12720/joig.1.3.120-124
[36] De Croon, G. C. H. E., Ho, H. W., De Wagter, C., Van Kampen, E.,
Remes, B., and Chu, Q. P., “Optic-Flow Based Slope Estimation for
Autonomous Landing,” International Journal of Micro Air Vehicles,
Vol. 5, No. 4, 2013, pp. 287–298.
doi:10.1260/1756-8293.5.4.287
[37] Longuet-Higgins, H. C., and Prazdny, K., “Interpretation of a Moving
Retinal Image,” Proceedings of the Royal Society of London, Series B:
Biological Sciences, Vol. 208, No. 1173, 1980, pp. 385–397.
doi:10.1098/rspb.1980.0057
[38] Menold, P. H., and Pearson, R. K., “Online Outlier Detection and
Removal,” Proceedings of the Seventh Mediterranean Conference
on Control and Automation (MED99), Institute of Electrical and
Electronics Engineers, New York, 1999, pp. 1110–1133.
[39] Harris, L. R., Jenkin, M. R., Zikovitz, D., Redlick, F., Jaekl, P.,
Jasiobedzka, U. T., Jenkin, H. L., and Allison, R. S., “Simulating Self-
Motion I: Cues for the Perception of Motion,” Virtual Reality, Vol. 6,
No. 2, 2002, pp. 75–85.
doi:10.1007/s100550200008
[40] Hain, T. C., and Helminski, J. O., “Anatomy and Physiology of the
Normal Vestibular System,” Vestibular Rehabilitation Contemporary
Perspectives in Rehabilitation, Davis, Philadelphia, 2007, pp. 2–18,
Chap. 1.
[41] Julier, S. J., and Uhlmann, J. K., “Unscented Filtering and Nonlinear
Estimation,” Proceedings of the IEEE, Vol. 92, No. 3, 2004, pp. 401–
422.
doi:10.1109/JPROC.2003.823141
[42] Crassidis, J. L., and Junkins, J. L., Optimal Estimation of Dynamic
Systems, CRC Press, Boca Raton, FL, 2011, pp. 196–198.
[43] Kandepu, R., Imsland, L., and Foss, B. A., “Constrained State
Estimation Using the Unscented Kalman Filter,” Proceedings of the
16th Mediterranean Conference on Control and Automation, Institute
of Electrical and Electronics Engineers, New York, 2008, pp. 1453–
1458.
doi:10.1109/MED.2008.4602001
[44] Ljung, L., System Identification: Theory for the User, 2nd ed., Prentice–
Hall, Upper Saddle River, NJ, 1999, p. 199, Chap. 7.