FPGA-Based Multimodal Embedded Sensor System Integrating Low- and Mid-Level Vision


Botella Juan, Guillermo and Martín Hernández, José Antonio and Santos Peñas, Matilde and Meyer-Baese, Uwe (2011) FPGA-Based Multimodal Embedded Sensor System Integrating Low- and Mid-Level Vision. Sensors, 11 (8). pp. 8164-8179. ISSN 1424-8220

License: Creative Commons Attribution.


Official URL: https://doi.org/10.3390/s110808164


Motion estimation is a low-level vision task that is especially relevant due to its wide range of real-world applications. Many of the best motion estimation algorithms include features found in mammalian vision, which demand huge computational resources and are therefore not usually feasible in real time. In this paper we present a novel bioinspired sensor based on the synergy between optical flow and orthogonal variant moments. The bioinspired sensor has been designed for Very Large Scale Integration (VLSI) using properties of the mammalian cortical motion pathway. This sensor combines low-level primitives (optical flow and image moments) in order to produce a mid-level vision abstraction layer. The results are described through experiments showing the validity of the proposed system, together with an analysis of the computational resources and performance of the applied algorithms.
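The two low-level primitives named in the abstract — optical flow and image moments — can be illustrated with a minimal NumPy sketch. This is only an assumption-laden toy, not the paper's VLSI design or its orthogonal variant moments: it computes raw image moments and a single global flow vector by least squares on the brightness-constancy equation (Ix·u + Iy·v + It = 0), on a synthetic one-pixel shift.

```python
import numpy as np

def image_moment(img, p, q):
    """Raw image moment M_pq = sum over pixels of x^p * y^q * I(x, y)."""
    h, w = img.shape
    y, x = np.mgrid[0:h, 0:w]
    return np.sum((x ** p) * (y ** q) * img)

def global_flow(frame0, frame1):
    """Estimate one global translation (u, v) by least squares on
    the brightness-constancy constraint Ix*u + Iy*v + It = 0."""
    Ix = np.gradient(frame0, axis=1)   # horizontal intensity gradient
    Iy = np.gradient(frame0, axis=0)   # vertical intensity gradient
    It = frame1 - frame0               # temporal derivative
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

# Synthetic frames: a Gaussian blob shifted one pixel to the right.
h, w = 32, 32
yy, xx = np.mgrid[0:h, 0:w]
blob = lambda cx, cy: np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 20.0)
f0, f1 = blob(15.0, 16.0), blob(16.0, 16.0)

u, v = global_flow(f0, f1)            # expect rightward motion: u > 0, v ~ 0
m00 = image_moment(f0, 0, 0)
cx = image_moment(f0, 1, 0) / m00     # centroid x from first-order moment
```

A combined sensor in the spirit of the paper would fuse such primitives (flow fields plus moment descriptors) into a mid-level description; here the centroid from the moments and the flow vector together describe "a blob near x = 15 moving right".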

Item Type: Article
Uncontrolled Keywords: Bio-inspired systems; machine vision; optical flow; orthogonal variant moments; VLSI
Subjects: Sciences > Computer science > Artificial intelligence
Sciences > Computer science > Expert systems (Computer science)
ID Code: 68463
Deposited On: 02 Nov 2021 12:45
Last Modified: 03 Nov 2021 12:16
