Show simple item record

dc.contributor.advisor: Sümer, Bilsay
dc.contributor.author: Keskin, Ali
dc.date.accessioned: 2021-11-24T12:44:52Z
dc.date.issued: 2021
dc.date.submitted: 2021-09-21
dc.identifier.uri: http://hdl.handle.net/11655/25644
dc.description.abstract: In this thesis, a target geolocation measurement model is developed for a fixed-wing unmanned aerial vehicle equipped with an image processing system and standard sensors. The measurement model begins calculating the position of the relevant stationary target from the moment the target of interest is selected in the image. An extended Kalman filter is developed for the measurement model; it uses the pixel position of the target in the image plane together with the position and angular position of the aircraft. Unlike previous studies, the designed filter takes deviations in the camera mounting angles into account. In flight tests, the target position was calculated with an accuracy of 5 meters, from a distance of around 400 meters, within a time interval of 15-20 seconds; the camera mounting angle errors were also estimated with a precision of 1 degree. In addition, a loitering maneuver control algorithm has been designed that uses the pixel position of the target on the image plane. This control algorithm keeps the target in the camera field of view and maintains the desired distance between the UAV and the target. Unlike common loitering control schemes such as waypoint navigation, this algorithm is designed independently of any GPS measurement; it therefore works without GPS as long as the angular position of the aircraft is measured correctly. The control algorithm is modular, so it can be used on any fixed-wing aircraft that has an image processing system and standard sensors, and it can easily be implemented as an outer loop for a roll attitude controller. Keywords: Target geolocation, Image-based GPS-denied loitering, Image-based target line-of-sight estimation
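The abstract describes back-projecting the target's pixel position through the camera model and the aircraft attitude to obtain a line of sight to the target. The following is a minimal, hypothetical Python sketch of that geometry only (a single-shot flat-ground intersection; all function names, the camera-to-body mounting, and the frame conventions are illustrative assumptions, and the thesis's actual extended Kalman filter, which also estimates camera mounting-angle deviations, is not reproduced here):

```python
import numpy as np

def rot_zyx(yaw, pitch, roll):
    """Body-to-NED rotation matrix from ZYX (yaw-pitch-roll) Euler angles in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def geolocate_flat_ground(p_uav_ned, euler_zyx, pixel_uv, K):
    """Estimate a ground target's NED position from one pixel observation.

    Assumes: flat ground at NED down = 0, pinhole intrinsics K, and a camera
    mounted with its optical axis along the body x-axis (no mounting-angle
    error, which the thesis's filter would estimate).
    """
    # Back-project the pixel to a ray in the camera frame (z = optical axis).
    ray_cam = np.linalg.inv(K) @ np.array([pixel_uv[0], pixel_uv[1], 1.0])
    # Camera-to-body axis swap: optical z -> body x, image x -> body y, image y -> body z.
    C_body_cam = np.array([[0.0, 0.0, 1.0],
                           [1.0, 0.0, 0.0],
                           [0.0, 1.0, 0.0]])
    # Rotate the line-of-sight ray into the NED frame using the aircraft attitude.
    ray_ned = rot_zyx(*euler_zyx) @ (C_body_cam @ ray_cam)
    if ray_ned[2] <= 0:
        raise ValueError("line of sight does not intersect the ground")
    # Scale the ray so it reaches the ground plane (down coordinate = 0).
    t = -p_uav_ned[2] / ray_ned[2]
    return p_uav_ned + t * ray_ned
```

In a filtering formulation, this single-shot intersection would serve as the measurement model: repeated pixel observations from different vantage points let the EKF refine the target estimate and separate it from camera mounting-angle bias.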
dc.language.iso: en
dc.publisher: Fen Bilimleri Enstitüsü
dc.rights: info:eu-repo/semantics/openAccess
dc.rights: Attribution-NonCommercial 3.0 United States
dc.rights.uri: http://creativecommons.org/licenses/by-nc/3.0/us/
dc.subject: Target geolocation
dc.subject: Image based GPS denied loitering
dc.subject: Image based target line of sight estimation
dc.subject.lcsh: Mechanical engineering
dc.title: Fixed Wing UAV Target Geolocation Estimation From Camera Images
dc.title.alternative: Sabit Kanat İHA'lar İçin Kamera Bazlı Hedef Konum Kestirimi
dc.type: info:eu-repo/semantics/masterThesis
dc.description.ozet: In this thesis, a measurement model that calculates the geographic position of a target of interest is created for a fixed-wing unmanned aerial vehicle equipped with an image processing system and standard sensors. The measurement model starts calculating the position of the relevant stationary target from the moment the target of interest is selected in the image. An extended Kalman filter is used for the measurement model; it uses the pixel position of the target in the image plane and the position, velocity, and angular position of the aircraft. Unlike previous studies, the designed filter takes the deviation of the camera mounting angles into account. In flight tests, the target position was calculated with 5-meter accuracy from a distance of about 400 meters within a 15-20 second interval, and the camera mounting angle errors were also estimated with 1-degree precision. In addition, a control algorithm was designed that takes the pixel position of the target in the image plane as its reference and makes the aircraft loiter around the target. This control algorithm keeps the target in the camera field of view and controls the distance between the UAV and the target. Unlike existing maneuvers such as waypoint navigation, this algorithm does not require any GPS measurement; it therefore works independently of GPS as long as the angular position of the aircraft is measured correctly. The control algorithm is designed to be modular, so it can be used on a fixed-wing aircraft with an image processing system and standard sensors, and it can easily be implemented as an outer loop for roll attitude controllers. Keywords: Target geolocation, Image-based GPS-independent loitering around a target, Image-based target line-of-sight estimation
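The loitering controller described in the abstracts drives a bank-angle command from the target's pixel offset alone, with no GPS position fix. A hypothetical proportional sketch of such an outer loop follows; the function name, gain `k_p`, field-of-view value, and roll limit are illustrative assumptions, not the thesis's actual design, which also regulates the UAV-target distance:

```python
def loiter_roll_command(u_px, cx_px, fov_x_rad, img_width_px,
                        k_p=0.8, roll_limit=0.6):
    """Map the target's horizontal pixel offset to a roll (bank) command in radians.

    Holding the target at a fixed bearing in the image while banking makes the
    aircraft circle the target; the command feeds an inner roll attitude loop.
    """
    # Small-angle conversion: pixel offset from image center -> bearing error (rad).
    bearing_err = (u_px - cx_px) * (fov_x_rad / img_width_px)
    # Proportional outer loop, saturated to a safe bank-angle envelope.
    roll_cmd = k_p * bearing_err
    return max(-roll_limit, min(roll_limit, roll_cmd))
```

Because the only feedback is the pixel coordinate and the aircraft's measured attitude, the loop degrades gracefully when GPS is denied, which is the property the abstracts emphasize.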
dc.contributor.department: Mechanical Engineering
dc.embargo.terms: Open access
dc.embargo.lift: 2021-11-24T12:44:53Z
dc.funding: None
dc.subtype: learning object


