
An Image-Based Approach for the Co-Registration of Multi-Temporal UAV Image Datasets / Aicardi, Irene; Nex, Francesco; Gerke, Markus; Lingua, Andrea Maria. - In: REMOTE SENSING. - ISSN 2072-4292. - 8:9(2016), pp. 779-798. [10.3390/rs8090779]

An Image-Based Approach for the Co-Registration of Multi-Temporal UAV Image Datasets

Aicardi, Irene; Lingua, Andrea Maria
2016

Abstract

In recent years, UAVs (Unmanned Aerial Vehicles) have become very popular as low-cost image acquisition platforms, since they allow for flexible, repeatable flights at high resolution. One application is the monitoring of dynamic scenes. However, the fully automatic co-registration of the acquired multi-temporal data remains an open issue. Most UAVs cannot provide accurate direct image georeferencing, and co-registration is mostly performed through the manual introduction of ground control points (GCPs), which is time consuming, costly, and sometimes not possible at all. This paper investigates a new technique to automate the co-registration of multi-temporal high-resolution image blocks without the use of GCPs. Image orientation is initially performed on a reference epoch, and the registration of the following datasets is achieved by including some anchor images from the reference data. The interior and exterior orientation parameters of the anchor images are then fixed in order to constrain the bundle block adjustment of the slave epoch to be aligned with the reference one. The study used two different datasets, acquired over a construction site and a post-earthquake damaged area. Different tests were performed to assess the registration procedure, using both a manual and an automatic approach for the selection of anchor images. The tests showed that the procedure provides results comparable to the traditional GCP-based strategy, and that both the manual and the automatic selection of anchor images can provide reliable results.
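The registration idea summarised above — anchor images with fixed poses tie the slave epoch to the reference datum — can be illustrated in its simplest rigid form. As a purely illustrative sketch (not the paper's bundle block adjustment, and the function name and setup are assumptions), the following NumPy function estimates the closed-form 3D similarity (Helmert) transform that maps the camera centres of anchor images in a slave block onto their fixed counterparts in the reference block:

```python
import numpy as np

def similarity_transform(src, dst):
    """Closed-form (Umeyama) estimate of s, R, t such that dst ~ s * R @ src + t.

    src, dst: (N, 3) arrays, e.g. camera centres of the anchor images in the
    slave block and their fixed counterparts in the reference block.
    """
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    # Cross-covariance between the two centred point sets.
    cov = dst_c.T @ src_c / len(src)
    U, D, Vt = np.linalg.svd(cov)
    # Guard against a reflection sneaking into the least-squares rotation.
    sgn = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
    S = np.diag([1.0, 1.0, sgn])
    R = U @ S @ Vt                                   # best-fit rotation
    s = np.trace(np.diag(D) @ S) / (src_c ** 2).sum(axis=1).mean()  # scale
    t = mu_d - s * R @ mu_s                          # translation
    return s, R, t
```

At least three non-collinear anchor positions are needed for a unique solution; in the paper's actual procedure the anchors act as fixed observations inside the bundle block adjustment rather than in a separate alignment step.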
Files in this record:
2016 an image remotesensing-08-00779 rid.pdf

Open access

Type: 2a Post-print editorial version / Version of Record
License: Creative Commons
Size: 1.03 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/11583/2650534
Note: the displayed data have not been validated by the university.