Differential Geometric Approach to Change Detection Using Remotely Sensed Images

N Panigrahi1, B. K. Mohan2, and G. Athithan1
1. Center for Artificial Intelligence and Robotics, DRDO, Bangalore, India
2. Indian Institute of Technology, Bombay, India
Abstract---Change detection using multi-temporal satellite images of the same area is an established and actively pursued research problem. Most change detection techniques use algebraic or transform methods to perform a pixel-by-pixel comparison of the images. These techniques depend heavily on the correct choice of a threshold value to segregate the real changed pixels from the apparent ones. Moreover, all of these techniques can compute only the two-dimensional change of the terrain surface from remotely sensed data. In this paper we propose a differential geometric approach to detecting changes in remotely sensed images, which identifies change using the geometric property of each pixel with respect to its surroundings. It can compute and filter changed pixels having high curvature from flat (2D) changed pixels.

Keywords---Change Detection, Difference of Gaussian, Hessian, Differential Geometry, Spatio-Temporal Change Detection
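The abstract describes filtering changed pixels by the curvature of the difference surface, computed from the Hessian at each pixel. The following is a minimal illustrative sketch of that general idea in Python, not the authors' published method: it forms a difference image, estimates the per-pixel Hessian with finite differences, and flags pixels whose Hessian determinant (a curvature measure) exceeds an assumed threshold. The function names and the threshold value are hypothetical.

```python
import numpy as np

def hessian_determinant(diff):
    # Finite-difference Hessian of the difference surface; the
    # determinant gxx*gyy - gxy*gyx acts as a curvature measure.
    gy, gx = np.gradient(diff.astype(float))
    gyy, gyx = np.gradient(gy)
    gxy, gxx = np.gradient(gx)
    return gxx * gyy - gxy * gyx

def detect_high_curvature_changes(img_t1, img_t2, curv_thresh=0.5):
    # Difference of the two co-registered images, then keep only
    # pixels where the change surface has high curvature
    # (separating peaked changes from flat, 2D ones).
    diff = img_t2.astype(float) - img_t1.astype(float)
    return np.abs(hessian_determinant(diff)) > curv_thresh

# Toy example: an unchanged flat scene versus one with a sharp bump.
t1 = np.zeros((16, 16))
t2 = t1.copy()
t2[8, 8] = 10.0  # a peaked (high-curvature) change
mask = detect_high_curvature_changes(t1, t2)
```

In practice the difference image would first be smoothed (e.g. with a Difference of Gaussian, as the keywords suggest) so that sensor noise does not produce spurious high-curvature responses; that step is omitted here for brevity.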

Cite: N Panigrahi, B. K. Mohan, and G. Athithan, "Differential Geometric Approach to Change Detection Using Remotely Sensed Images," Journal of Advances in Information Technology, Vol. 2, No. 3, pp. 134-138, August 2011. doi: 10.4304/jait.2.3.134-138
This work is licensed under the Creative Commons Attribution License (CC BY-NC-ND 4.0).