Show simple item record
Evaluation and comparison of techniques for reconstructing the point spread function of images blurred by uniform linear motion
Evaluación y comparación de técnicas para la reconstrucción de la función de dispersión de punto de imágenes degradadas por difuminación lineal uniforme
dc.creator | Cortés-Osorio, Jimmy A. | |
dc.creator | López-Robayo, Cristian D. | |
dc.creator | Hernández-Betancourt, Nathalia | |
dc.date | 2018-05-14 | |
dc.date.accessioned | 2021-03-18T21:06:57Z | |
dc.date.available | 2021-03-18T21:06:57Z | |
dc.identifier | https://revistas.itm.edu.co/index.php/tecnologicas/article/view/789 | |
dc.identifier | 10.22430/22565337.789 | |
dc.identifier.uri | http://test.repositoriodigital.com:8080/handle/123456789/11739 | |
dc.description | In the field of digital image processing, it is common to find different types of degradation. One of them is motion blur, which is caused by the relative movement between the camera and the observed object. It produces a low-contrast trace on the image that follows the trajectory of the movement. If the relative velocity is constant and the blur is invariant across the entire image, the resulting blur can be modeled by means of the Point Spread Function (PSF), parameterized by the length and angle of the trace. This work evaluated the accuracy of the estimation of the angle and length parameters, and the robustness to Additive White Gaussian Noise, of a set of spatial and frequency approaches for reconstructing the PSF. The algorithms' processing time was also considered. In total, 20 synthetically degraded images of 512x512 pixels were used. In addition, five of the best-known techniques for estimating the angle of the PSF and three for estimating its length were evaluated. The experimental results revealed that the techniques with the lowest mean absolute error for estimating the angle and the length of the PSF in noise-free images are the 2D Cepstrum Transform and the 1D Cepstrum Transform, respectively. | en-US |
dc.description | En el área del procesamiento digital de imágenes, es frecuente encontrar diferentes tipos de degradaciones, como lo es la difuminación por movimiento (motion blur), la cual es causada por el movimiento relativo entre la cámara y el objeto observado. Esto produce sobre la imagen una estela de bajo contraste que sigue la trayectoria del movimiento. Si la velocidad relativa es constante y el desenfoque es invariante sobre toda la imagen, la difuminación causada puede ser modelada por medio de la Función de Dispersión de Punto (PSF) usando los parámetros de longitud y ángulo de la estela dejada. Este trabajo evaluó la exactitud en la estimación de dichos parámetros y la robustez al Ruido Aditivo Blanco Gaussiano de un grupo de estrategias espaciales y en frecuencia para la reconstrucción de la PSF, además se consideró el tiempo de ejecución de los algoritmos presentados. Se usaron 20 imágenes de 512x512 píxeles degradadas sintéticamente. Se evaluaron cinco de las técnicas más conocidas para la estimación del ángulo y tres para la longitud. Los resultados experimentales revelaron que las técnicas con los errores absolutos promedio más bajos para la estimación del ángulo y la longitud de la PSF en imágenes sin ruido son la Transformada Cepstrum 2D y la Transformada Cepstrum 1D, respectivamente. | es-ES |
dc.format | application/pdf | |
dc.format | text/html | |
dc.format | text/xml | |
dc.language | spa | |
dc.publisher | Instituto Tecnológico Metropolitano (ITM) | en-US |
dc.relation | https://revistas.itm.edu.co/index.php/tecnologicas/article/view/789/917 | |
dc.relation | https://revistas.itm.edu.co/index.php/tecnologicas/article/view/789/991 | |
dc.relation | https://revistas.itm.edu.co/index.php/tecnologicas/article/view/789/1211 | |
dc.relation | https://revistas.itm.edu.co/index.php/tecnologicas/article/view/789/1257 | |
dc.rights | https://creativecommons.org/licenses/by/3.0/deed.es_ES | en-US |
dc.source | TecnoLógicas; Vol. 21 No. 42 (2018); 211-229 | en-US |
dc.source | TecnoLógicas; Vol. 21 Núm. 42 (2018); 211-229 | es-ES |
dc.source | 2256-5337 | |
dc.source | 0123-7799 | |
dc.subject | Cepstrum | en-US |
dc.subject | Motion blur | en-US |
dc.subject | Steerable Filters | en-US |
dc.subject | Linear Point Spread Function | en-US |
dc.subject | Reconstruction | en-US |
dc.subject | Hough transform | en-US |
dc.subject | Radon transform | en-US |
dc.subject | Cepstrum | es-ES |
dc.subject | Difuminación por movimiento | es-ES |
dc.subject | Filtros adaptativos | es-ES |
dc.subject | Función de Dispersión | es-ES |
dc.subject | Movimiento lineal uniforme | es-ES |
dc.subject | Reconstrucción | es-ES |
dc.subject | Transformada de Hough | es-ES |
dc.subject | Transformada de Radon | es-ES |
dc.title | Evaluation and comparison of techniques for reconstructing the point spread function of images blurred by uniform linear motion | en-US |
dc.title | Evaluación y comparación de técnicas para la reconstrucción de la función de dispersión de punto de imágenes degradadas por difuminación lineal uniforme | es-ES |
dc.type | info:eu-repo/semantics/article | |
dc.type | info:eu-repo/semantics/publishedVersion | |
dc.type | Research Papers | en-US |
dc.type | Artículos de investigación | es-ES |
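The abstract above models uniform linear motion blur with a PSF defined by a blur length and angle, and reports the 2D Cepstrum Transform as the most accurate angle estimator on noise-free images. The following is a minimal sketch of that setup, not the authors' code: it builds a line-shaped PSF with NumPy, applies it to a random stand-in image, and computes the 2D real cepstrum, in which motion blur leaves characteristic negative peaks along the motion direction. The function names, kernel size, and the length/angle values (15 px, 30 degrees) are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the article's implementation):
# uniform linear-motion PSF + 2D real cepstrum of the blurred image.
import numpy as np

def linear_motion_psf(length, angle_deg, size=31):
    """Line-shaped kernel of `length` pixels at `angle_deg`, normalized to sum 1."""
    psf = np.zeros((size, size))
    c = size // 2
    theta = np.deg2rad(angle_deg)
    # Sample points densely along the line and rasterize them onto the kernel.
    for t in np.linspace(-(length - 1) / 2.0, (length - 1) / 2.0, 4 * length):
        x = int(round(c + t * np.cos(theta)))
        y = int(round(c - t * np.sin(theta)))
        if 0 <= x < size and 0 <= y < size:
            psf[y, x] = 1.0
    return psf / psf.sum()

def cepstrum_2d(img, eps=1e-8):
    """2D real cepstrum: inverse FFT of the log magnitude spectrum."""
    spectrum = np.abs(np.fft.fft2(img))
    return np.real(np.fft.ifft2(np.log(spectrum + eps)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scene = rng.random((512, 512))                 # stand-in for a 512x512 test image
    psf = linear_motion_psf(length=15, angle_deg=30)
    # Circular convolution in the frequency domain simulates the uniform blur.
    blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) *
                                   np.fft.fft2(psf, s=scene.shape)))
    cep = np.fft.fftshift(cepstrum_2d(blurred))
    print("cepstrum shape:", cep.shape)            # negative peaks lie along the blur direction
```

On real data, the blurred input would replace the random image, and the blur angle and length would be read from the orientation and distance of the strongest negative cepstral peaks relative to the origin.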
Files in this item
Files | Size | Format | View |
---|---|---|---|
There are no files associated with this item. |
This item appears in the following collection(s)
-
tecnologia [520]