The Computational Cognitive Robotics Group was formed in 1984, supported by a competitive grant from the Fundación Ramón Areces for an interdisciplinary research project on human and artificial vision applied to robotic manipulators. Within that project, our group developed a computerized hand-eye system in collaboration with a research team from the Faculty of Psychology of the Complutense University of Madrid.
Since then, we have worked on vehicle detection and tracking, vehicle counting, assisted mobility, automated car parking, humanoid and biped robot locomotion, helicopter flight controllers, and the development of intelligent computational solutions for robot and multi-robot systems.
We have designed and built an automated (robot-based) visual inspection system for used industrial pallets, with operating plants in several European countries (Portugal, France, Spain, and the United Kingdom) and in the USA. The system is protected by a patent issued by the European Patent Office in The Hague, the Netherlands.
We are currently conducting fundamental and applied research on symbolic, language-like communication in robot teams, using evolutionary algorithms at the design phase and reinforcement learning algorithms for on-line language acquisition. We plan to build a physical multi-robot system in which vision cameras supply the sensory information for the meaning construction and cognitive categorization underlying the robots' symbolic language, and musical sound synthesizers implement the phonation of the words of that language, trained with on-line reinforcement learning and on-line embodied evolution algorithms.
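To give a flavour of how reinforcement learning can drive on-line language acquisition in a robot team, the following is a minimal sketch of a classic two-agent naming game, not the group's actual algorithm: each agent keeps a matrix of meaning-word association scores, and successful communication reinforces the association used while failure inhibits it on the speaker's side. All names and parameters (`naming_game`, `lr`, the score-update rules) are illustrative assumptions.

```python
import random

def naming_game(n_meanings=4, n_words=6, rounds=5000, lr=0.1, seed=7):
    """Toy sketch (not the group's method): two agents play repeated
    naming rounds and reinforce meaning-word associations that lead to
    communicative success."""
    rng = random.Random(seed)
    # Per-agent association scores: scores[agent][meaning][word],
    # initialised with small random values to break symmetry.
    scores = [[[rng.random() for _ in range(n_words)]
               for _ in range(n_meanings)] for _ in range(2)]

    def best_word(a, m):
        return max(range(n_words), key=lambda w: scores[a][m][w])

    def best_meaning(a, w):
        return max(range(n_meanings), key=lambda m: scores[a][m][w])

    successes = 0
    for _ in range(rounds):
        speaker, hearer = rng.sample((0, 1), 2)
        m = rng.randrange(n_meanings)          # topic to communicate
        w = best_word(speaker, m)              # speaker names the topic
        if best_meaning(hearer, w) == m:       # hearer decoded correctly
            successes += 1
            scores[speaker][m][w] += lr
            scores[hearer][m][w] += lr
        else:                                  # failure: topic is revealed;
            scores[speaker][m][w] -= lr        # speaker inhibits the word,
            scores[hearer][m][w] += lr         # hearer adopts the pairing
    return scores, successes / rounds
```

In the physical system described above, the discrete "meanings" would instead be perceptual categories built from camera input, and "words" would be synthesized sounds, but the reinforcement structure is analogous.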
We are also developing novel data clustering algorithms based on cellular automata and social segregation models. The key idea is to treat individual data items as mobile agents that, like individuals in a social neighborhood, are able to move autonomously in a lattice.
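The mobile-agent idea can be sketched with a Schelling-style dynamic, under the assumption (ours, for illustration) of one-dimensional features in [0, 1] and a similarity threshold: each item occupies a lattice cell, and an item dissatisfied with its neighbourhood jumps to a random empty cell, so similar items gradually aggregate into spatial clusters. The function name and parameters are hypothetical, not the group's published algorithm.

```python
import random

def segregation_clustering(items, grid_size=10, steps=5000,
                           sim_threshold=0.7, seed=42):
    """Illustrative sketch: data items as mobile agents on a toroidal
    lattice; a dissatisfied agent (too few similar neighbours) moves to
    a random empty cell. Requires len(items) < grid_size**2."""
    rng = random.Random(seed)
    cells = [(r, c) for r in range(grid_size) for c in range(grid_size)]
    rng.shuffle(cells)
    pos = {i: cells[i] for i in range(len(items))}       # item -> cell
    occupied = {cells[i]: i for i in range(len(items))}  # cell -> item
    empties = cells[len(items):]

    def similar(a, b):
        # Similarity of two 1-D features in [0, 1].
        return 1.0 - abs(items[a] - items[b])

    def neighbours(cell):
        r, c = cell
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if (dr, dc) != (0, 0):
                    n = ((r + dr) % grid_size, (c + dc) % grid_size)
                    if n in occupied:
                        yield occupied[n]

    def satisfied(i):
        neigh = list(neighbours(pos[i]))
        if not neigh:
            return False
        alike = sum(1 for j in neigh if similar(i, j) >= sim_threshold)
        return alike / len(neigh) >= 0.5

    for _ in range(steps):
        i = rng.randrange(len(items))
        if not satisfied(i):
            new = rng.choice(empties)     # jump to a random empty cell
            empties.remove(new)
            empties.append(pos[i])
            del occupied[pos[i]]
            occupied[new] = i
            pos[i] = new
    return pos
```

Clusters can then be read off as connected groups of occupied cells; the cellular-automaton flavour comes from the purely local satisfaction rule.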
We are now extending this work on symbolic communication in robot teams to a new project on enactive (action-oriented) coordination in teams of mobile robots, in which the robots learn to coordinate their actions based on a collectively agreed interpretation of visual symbols. This work has applications to the autonomous navigation of mobile robots and to automatic driving systems for commercial cars.
Finally, we plan to continue our previous research on automatic parking of car-like robots in cooperation with the IAI-CSIC Autopia Program. In this line of work, we intend to migrate our simulated algorithms, inspired by the behavior-based and bio-mimetic paradigms, to the commercial vehicles developed by the IAI-CSIC.