About this product
- Description: Robots able to imitate human beings have been at the core of stories of science fiction as well as dreams of inventors for a long time. Among the various skills that Mother Nature has provided us with and that often go forgotten, the ability of sight is certainly one of the most important. Perhaps inspired by the tales of Isaac Asimov, comics and cartoons, and surely helped by the progress of electronics in recent decades, researchers have progressively made the dream of creating robots able to move and operate by exploiting artificial vision a concrete reality. Technically speaking, we would say that these robots position themselves and their end-effectors by using the view provided by some artificial eyes as feedback information. Indeed, the artificial eyes are visual sensors such as cameras whose function is to acquire an image of the environment. Such an image describes if and how the robot is moving toward the goal and hence constitutes feedback information. This procedure is known in robotics as visual servoing, and it is nothing else than an imitation of the intrinsic mechanism that allows human beings to perform daily tasks such as reaching the door of the house or grasping a cup of coffee.
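The feedback loop sketched in the description can be illustrated with a minimal image-based visual servoing (IBVS) step. This is a generic textbook-style control law, not code from the book; the interaction matrix `L`, the gain `lam`, and the feature vectors are illustrative assumptions.

```python
import numpy as np

def ibvs_step(s, s_star, L, lam=0.5):
    """One image-based visual servoing step: the camera image supplies
    feature measurements s, which act as feedback driving the robot
    toward the desired features s_star.

    s      : current image features (e.g. point coordinates), shape (n,)
    s_star : desired image features at the goal, shape (n,)
    L      : interaction (image Jacobian) matrix relating camera velocity
             to feature motion -- assumed known here for illustration
    lam    : proportional gain (assumed value)
    Returns a camera velocity command.
    """
    error = s - s_star                        # feedback: how far the view is from the goal
    return -lam * np.linalg.pinv(L) @ error   # classic proportional control law

# Usage sketch: two image points (4 coordinates), identity interaction matrix.
s = np.array([0.2, 0.1, -0.3, 0.4])
s_star = np.zeros(4)
v = ibvs_step(s, s_star, np.eye(4))
```

With an identity interaction matrix the command simply moves each feature proportionally toward its goal; in practice `L` depends on the camera model and feature depths.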
- Author Biography: Graziano Chesi received the Laurea in Information Engineering from the University of Florence (1997) and the Ph.D. in Systems Engineering from the University of Bologna (2001). He was with the Department of Information Engineering of the University of Siena (2000-2006) and then joined the Department of Electrical and Electronic Engineering of the University of Hong Kong (2006-present). He was a visiting scientist at the Department of Engineering of the University of Cambridge (1999-2000) and at the Department of Information Physics and Computing of the University of Tokyo (2001-2004). Dr. Chesi was the recipient of the Best Student Award of the Faculty of Engineering of the University of Florence (1997). He was Associate Editor of the IEEE Transactions on Automatic Control (2005-2009) and Guest Editor of the Special Issue on Positive Polynomials in Control of the IEEE Transactions on Automatic Control (2009). Since 2007 he has been Associate Editor of Automatica. He is the Founder and Chair of the Technical Committee on Systems with Uncertainty of the IEEE Control Systems Society. He is the author of the book Homogeneous Polynomial Forms for Robustness Analysis of Uncertain Systems (Springer, 2009) and editor of the book Visual Servoing via Advanced Numerical Methods (Springer, 2010). He is first author of more than 100 technical publications. Koichi Hashimoto is a Professor at the Graduate School of Information Sciences, Tohoku University. He received his BS, MS and DE degrees from Osaka University in 1985, 1987 and 1990, respectively. His major research interests include visual servoing, parallel processing and biological systems. Prof. Hashimoto is a member of IEEE, RSJ, SICE, ISCIE, IPSJ and JSME. He is the editor of the book Visual Servoing: Real-Time Control of Robot Manipulators Based on Visual Sensory Feedback (World Scientific, 1993) and the book Control and Modeling of Complex Systems (Birkhauser, 2003).
- Publisher: Springer London Ltd
- Date of Publication: 15/03/2010
- Subject: Electronics Engineering & Communications Engineering
- Series Title: Lecture Notes in Control and Information Sciences
- Series Part/Volume Number: 401
- Place of Publication: England
- Country of Publication: United Kingdom
- Imprint: Springer London Ltd
- Content Note: 10 black & white tables, biography
- Weight: 597 g
- Width: 156 mm
- Height: 234 mm
- Spine: 22 mm
- Edited by: Graziano Chesi, Koichi Hashimoto
- Contained items statement: Contains Paperback and Online resource
- Format Details: Trade paperback (US)