Assisted remote viewing of a teleoperation work cell
Lewis, M.G. and Sharkey, P. (1996) Assisted remote viewing of a teleoperation work cell. Proceedings of SPIE, 2901, p. 69.
Full text not archived in this repository.
To link to this item, use DOI: 10.1117/12.263015
Intelligent viewing systems are required if efficient and productive teleoperation is to be applied to dynamic manufacturing environments. These systems must automatically provide remote views that assist the operator in completing the task. This assistance increases the productivity of the teleoperation task if the robot controller is responsive to the unpredictable dynamic evolution of the work cell. Behavioral controllers can be used to provide this reactive 'intelligence.' The inherently complex structure of current systems, however, places considerable time overheads on any redesign of the emergent behavior. In industry, where the remote environment and task change frequently, this continual redesign process becomes inefficient. We introduce a novel behavioral controller, based on an 'ego-behavior' architecture, to command an active camera (a camera mounted on a robot) within a remote work cell. Using this ego-behavioral architecture, the responses from individual behaviors are rapidly combined to produce an 'intelligent,' responsive viewing system. The architecture is single-layered: each behavior is autonomous, with no explicit knowledge of the number, description or activity of any other behaviors present. This lack of imposed structure reduces development time, as it allows each behavior to be designed and tested independently before insertion into the architecture. The fusion mechanism allows each behavior to compete and/or co-operate with the others for full or partial control of the active camera. Each behavior continually reassesses its degree of competition or co-operation by measuring its own success in controlling the active camera against pre-defined constraints. The ego-behavioral architecture is demonstrated through simulation and experimentation.
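The fusion mechanism the abstract describes could be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the class and function names (`Behavior`, `fuse`), the weighted-average fusion rule, and the weight-update heuristic are all assumptions. Each behavior proposes a camera command together with a weight expressing its current degree of competition or co-operation, and re-tunes that weight from its own measured success against a pre-defined constraint, with no knowledge of the other behaviors.

```python
# Hypothetical sketch of ego-behavior fusion for a shared active camera.
# All names and the specific update rules are illustrative assumptions.

class Behavior:
    """An autonomous behavior: knows only its own constraint and weight."""

    def __init__(self, name, target, gain=1.0):
        self.name = name
        self.target = target   # desired value of this behavior's viewing constraint
        self.weight = 1.0      # degree of competition/co-operation for camera control
        self.gain = gain

    def propose(self, state):
        """Return (command, weight): this behavior's bid for camera control."""
        error = self.target - state
        return self.gain * error, self.weight

    def reassess(self, state, tolerance=0.1):
        """Measure own success against the constraint and adjust the weight:
        compete harder while the constraint is violated, yield once it is met."""
        if abs(self.target - state) > tolerance:
            self.weight = min(self.weight * 1.5, 10.0)  # compete for control
        else:
            self.weight = max(self.weight * 0.5, 0.1)   # co-operate / yield


def fuse(behaviors, state):
    """Weighted fusion: each behavior gets partial control of the camera
    in proportion to its current weight."""
    proposals = [b.propose(state) for b in behaviors]
    total = sum(w for _, w in proposals)
    return sum(c * w for c, w in proposals) / total if total else 0.0


# Usage: two behaviors compete to pull a 1-D camera toward different viewpoints.
track = Behavior("track-target", target=5.0)
avoid = Behavior("avoid-occlusion", target=3.0)
state = 0.0
for _ in range(50):
    cmd = fuse([track, avoid], state)
    state += 0.1 * cmd          # simple first-order camera model
    for b in (track, avoid):
        b.reassess(state)
```

Because no behavior knows about the others, a new behavior can be designed and tested in isolation and then simply appended to the list passed to `fuse`, which is the development-time advantage the abstract claims for the single-layered architecture.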