Machine Learning in a Humanoid Intelligent Service Robot
Hsueh-Chuan Yen *
College of Living Technology
Chienkuo Technology University
No. 1, Chieh Shou N. Road
Changhua
Taiwan, R.O.C.
Tai-Chang Hsia
Ren-Chieh Liao
Department of Industrial Engineering and Management
Chienkuo Technology University
Changhua
Taiwan, R.O.C.
Abstract
The service robot described herein is an engineering application combining mechanical
technology and computer science. For service robots, intelligence is the core competence.
The humanoid intelligent service robot developed at Chienkuo Technology University will
possess five intelligent capabilities: marketing, advertisement, acceptance, guidance, and security. It must therefore be taught in order to acquire this intelligence. The purpose of this research is to
describe how an instructor teaches the humanoid intelligent service robot. First, according
to the required basic action and distinct service script, the robot instructor demonstrates
the basic human action. The digital 3D motion capture system captures these motions via
video. The movements of the robot instructor’s limbs and trunk will be digitized, encoded,
and stored in the robot’s intelligent computer memory. When service instructions activate
the robot, it will replay the motions taught using 27 DC motors, 5 DC servo motors, and
10 solenoids.
Keywords: humanoid intelligent service robot, robot instructor, service script, digital 3D motion
capture system, service instructions
*E-mail: anny@ctu.edu.tw
Journal of Information & Optimization Sciences
Vol. ( ), No. ( ), pp. 1–13
© Taru Publications
1. Introduction
One of the major purposes of robots is to perform repetitive or
dangerous work in place of human beings. At present, industrial robots
are the main practical achievement of robotics. The International Federation of Robotics and the United Nations Economic Commission for Europe (UNECE) indicated that annual installations of multipurpose industrial robots worldwide now exceed 100,000, and the market demand for such robots in 2012 was projected to reach an output value of $80 to $250 billion [1]. With the
success of industrial robots, the obvious next step in robot evolution is the
development of service robots.
Chienkuo Technology University established a Humanoid Intelligent
Service Robot research team in 2006, under the leadership of Dr. Shyu. The
purpose of the research team was to develop a service robot with capabilities in five service areas: marketing, advertisement, acceptance, guidance,
and security. Each of these areas represents a potential area of application
for service robots.
2. Literature review
The development of the robot has progressed considerably since the growth of computers beginning in the 1970s. In 1971 Stanford University developed the electrically powered robotic arm known as the Stanford Arm. In 1973 the first computer robot programming language, WAVE, was published. In 1974 Cincinnati Milacron Corporation introduced the computer-controlled robot T3, the first to combine robot and minicomputer functions [2]. In 1978 the American Unimation Corporation and General Motors developed and promoted the industrial assembly robot PUMA, a key step in the maturation of industrial robot technology. In 1984 Engelberger developed an early service robot that could deliver food, medicine, and mail to patients in a hospital.
Since the late 1990s, service robots for home applications have begun to appear. In 1999 Sony Corporation introduced a dog-like robot called AIBO, a service (entertainment) robot for the ordinary family. In 2002 the American iRobot Corporation introduced a vacuum-cleaning robot called Roomba [3]. Roomba avoids obstacles, plans its forward path, and returns to its charging station automatically when battery power is insufficient. In 2007, with Microsoft Corporation’s release of Microsoft Robotics Studio (MSRS), the trend toward robot modularization and unified platforms became clearer. Bill Gates predicted
at the time that the household service robot would quickly spread to families worldwide [4].
The industrial robot vocabulary of ISO 8373:1994 specifies that a robot should include a manipulator, an actuating unit, and a control system comprising software and hardware [5]. The Institute of Occupational Safety and Health points out that a typical robot must possess six subsystems: processing, machinery, electronics, sensing, control, and planning [6]. The service robot developed at Chienkuo Technology University satisfies these requirements. Furthermore, our robot possesses human-like intelligence and can think, judge, observe, and learn.
The history of robot intelligence originates with industrial robots. Industrial robots use a programmed actuating unit to activate the machine and maintain specific behaviors. Therefore, an industrial robot can only work in a predefined environment and has no ability to deal with unanticipated work. For example, a robot used on an aviation-industry production line performs stable behavior according to pre-established paths or processes. Once the external environment changes or its process position shifts, the robot cannot perform its normal actions and behaviors, with potentially serious consequences. In practice, a robot’s deployment should therefore be rigorous, meaning that its operating limits must be clearly defined.
An intelligent robot obtains and enhances its cognitive abilities
through artificial intelligence. In early research into intelligent robots,
researchers used genetic algorithms, artificial neural networks, or a combination of the two to implement robot learning [7-15, 19]. Wang
developed a state generator based on the genetic algorithm, which enables
users to generate various motion states without using any reference motion data [16]. By specifying various types of constraints such as configuration constraints and contact constraints, the state generator can generate
stable states that satisfy the constraint conditions for humanoid robots.
Ram applied genetic algorithms to the learning of local robot navigation
behaviors for reactive control systems. This approach evolves reactive
control systems in various environments, thus creating sets of “ecological
niches” that can be used in similar environments [17]. However, because the evaluation function is difficult to define, and because the learning mechanisms described above restrict learning behavior, the development of the robot’s intelligence faces limits. To resolve these problems,
we propose an innovative teaching method to instruct the robot in its
movements and actions. Once it has accomplished the training course, the
robot can duplicate and integrate the digital information on movement
that it has learned.
The humanoid intelligent service robot developed at Chienkuo Technology University has a contour, scale, movements, and expressions similar to those of a human being. As the robot executes its duties, it uses a visual sensor to assess the situation and determine which action is appropriate.
3. System framework of the humanoid intelligent service robot
The humanoid intelligent service robot developed at Chienkuo
Technology University has five predefined cognitive areas:
(1) Instinctive intelligence: the robot can find the shortest path and return to the charging station automatically when battery power is insufficient. The robot can also self-activate, detect failures, diagnose problems, protect itself, and follow machine moral norms.
(2) Action intelligence: the robot can stand, squat, sit, kneel, and bend over. The robot can also step forward, step backward, turn in a given direction, turn its body, and climb.
(3) Adoption intelligence: the robot can coordinate with other robots, communicate with the service center, and execute specific responses when confronting specific persons or affairs.
(4) Service knowledge and intelligence: the robot possesses knowledge of courtesy, advertising, marketing, security, acceptance, and guidance.
(5) Learning intelligence: the robot can imitate human language, action, and expression. In addition, the robot can collect and reorganize human language and movement information automatically in order to get along with humans.
Based on the intelligences of the robot as defined above, a system
framework for the humanoid intelligent service robot is shown in Figure 1.
4. Humanoid robot service action learning
To develop the five capabilities necessary for the humanoid intelligent
robot, the robot must learn from an instructor. The teaching procedure
is described below. First, the instructor must understand the necessary
abilities in the five fields: marketing, advertising, acceptance, guidance,
and security. Then, the instructor writes service scripts for use in teaching the robot.
Figure 1
System framework of the humanoid intelligent service robot
Based on the service script, the instructor demonstrates human
actions. The digital 3D motion capture system then captures these motions
via video. The instructor’s movements are digitized, encoded, and stored
in the robot’s computer, which has a capacity of 80 gigabytes. In the future,
whenever the specific file instruction stored in the memory of the robot is
activated, the robot’s limbs and trunk will replay the recorded motions.
The recorded demonstration of the service script by the instructor is then transferred to the robot. To avoid jerkiness caused by discontinuities between two sets of movements, by file editing, or by inconsistencies between the initial and final movements of different scripts, single basic actions, such as head nod, head wag, and head swing, are defined first. There are 29 elementary actions (Table 1). Combinations of these elementary actions constitute simple continuous actions.
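As an illustration of how elementary actions might be composed into a continuous action while guarding against the discontinuities mentioned above, consider the following sketch (Python; the action names come from Table 1, but the transition set and the continuity check are our illustrative assumptions, not the authors' implementation):

```python
# Hypothetical sketch: composing elementary actions (Table 1) into a
# continuous action. The transition table is an assumption for illustration.

def compose(script, transitions):
    """Concatenate elementary actions, verifying that each adjacent pair
    is listed as a smooth transition, to avoid jerky playback."""
    for a, b in zip(script, script[1:]):
        if (a, b) not in transitions:
            raise ValueError(f"discontinuous transition: {a} -> {b}")
    return list(script)

# Allowed smooth transitions between elementary actions (illustrative).
transitions = {("Head Nod", "Head Wag"), ("Head Wag", "Head Swing")}

sequence = compose(["Head Nod", "Head Wag", "Head Swing"], transitions)
print(sequence)
```

A sequence containing a pair not listed as smooth would be rejected before playback rather than producing a jerky motion.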
The instructor uses a digital 3D motion capture system to teach the robot. The system is composed of six high-speed, high-resolution video cameras, a light-emitting diode device, and a server computer. The six video cameras are arrayed in a circle of 3-meter radius at 60° intervals and networked together. The instructor wears a leotard to which 37 light-emitting diodes are attached. Staying within 1 meter of the center of the circle of cameras, the instructor then executes a single elementary action, a combination of single elementary actions, or a specific service script. While recording, the video cameras capture the complete signal from the instructor’s light-emitting diodes, including position, speed, and acceleration values. The digital 3D motion capture system simultaneously generates a skeleton frame, digitizes the data, and stores the data in the memory of the robot’s computer, as shown in Figures 2 and 3 [18].
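The speed and acceleration values mentioned above can be derived from the sampled LED positions by finite differences, as in the following sketch (Python; the 100-frames-per-second rate and the sample data are assumptions for illustration, not values from the paper):

```python
# Illustrative sketch: deriving speed and acceleration from sampled
# marker positions, as a motion capture system might. Sample rate and
# data are assumed, not taken from the paper.

def derivatives(positions, fps):
    """Finite-difference velocity and acceleration for one coordinate track."""
    velocity = [(b - a) * fps for a, b in zip(positions, positions[1:])]
    accel = [(b - a) * fps for a, b in zip(velocity, velocity[1:])]
    return velocity, accel

x_mm = [0, 10, 30, 60, 100]      # one marker coordinate, in millimetres
v, a = derivatives(x_mm, 100)    # assume 100 frames per second
print(v)  # velocities in mm/s
print(a)  # accelerations in mm/s^2
```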
Each action generates a distinct code, which must be mapped to a corresponding actuator-instruction file. The code consists of 6 to 8 alphanumeric characters.
Figure 2
Instructor demonstrates a crouch
Figure 3
Digital signal retrieved from the instructor in Figure 2 by the digital 3D motion capture system
Once the movements are recorded into the robot’s memory,
when it is sent to its place of service and receives the proper service instructions, it will activate and play back the motions it learned before. The
robot’s elementary actions and corresponding codes are shown in Table 1.
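Reading the codes in Table 1 (e.g. HDND00~99) as a four-letter action mnemonic followed by a two-digit variant number, a validator for this format might be sketched as follows (Python; the split into mnemonic plus variant is our interpretation of the table, not a format the paper defines explicitly):

```python
import re

# Hedged sketch of the action-code format suggested by Table 1: a 4-letter
# mnemonic plus a two-digit variant (00-99). This reading is an assumption.

CODE_RE = re.compile(r"^([A-Z]{4})(\d{2})$")

def parse_code(code):
    """Split a code such as 'HDND07' into its mnemonic and variant number."""
    m = CODE_RE.match(code)
    if not m:
        raise ValueError(f"invalid action code: {code!r}")
    return m.group(1), int(m.group(2))

print(parse_code("HDND07"))
```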
The behaviors and actions of the robot are activated by the action
control computer, while the activation command is issued by the intelligent computer. The intelligent computer can communicate with humans
through its remote control teaching monitor (a refurbished PDA). That is,
the intelligent computer receives commands from a remote call center or
user, and then orders the action computer to execute a single elementary
action, a combination of single elementary actions, or a distinct service
script.
When executing actions, the robot can feed external messages to the action computer to correct a specific action. The robot can also communicate with other robots through wireless or Bluetooth systems, and receive orders from service personnel or computers in the call center through the Global System for Mobile Communications (GSM). The action computer controls each motor and actuator when executing an action,
combination of actions, or service script. Since all actions and behaviors
the robot can accomplish have been taught, transformed, coded, and then
saved in the intelligent computer, if the robot is commanded to carry out
an action or behavior, the corresponding code need only be activated and
the movement will be played back by the machine.
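The playback step described above, where a command merely activates a stored code and replays the corresponding taught motion, can be sketched as a simple lookup (Python; the file names and registry structure are illustrative assumptions):

```python
# Minimal sketch of playback: a command activates a stored code, and the
# machine replays the motion file mapped to it. File names are invented
# for illustration.

motion_files = {
    "HDND00": "motions/head_nod_00.dat",
    "LEBU03": "motions/left_elbow_bend_up_down_03.dat",
}

def play_back(code):
    """Return the playback action for a taught code, or fail if untaught."""
    try:
        return f"replaying {motion_files[code]}"
    except KeyError:
        raise ValueError(f"no taught motion for code {code}") from None

print(play_back("HDND00"))
```

The point of the design is that nothing is computed at service time: every motion was taught in advance, so command handling reduces to activating a code.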
Table 1
Basic actions and codes of the humanoid intelligent robot

No.  Action                      Code
1    Head Nod                    HDND00~99
2    Head Wag                    HDWG00~99
3    Head Swing                  HDSG00~99
4    Eye Blink                   EYNN00~99
5    Mouth Open                  MHON00~99
6    Left Elbow Bend Up-Down     LEBU00~99
7    Right Elbow Bend Up-Down    REBU00~99
8    Left Elbow Bend In-Out      LEBI00~99
9    Right Elbow Bend In-Out     REBI00~99
10   Left Elbow Turn             LETN00~99
11   Right Elbow Turn            RETN00~99
12   Waist Bend Down             WTBD00~99
13   Waist Turn                  WTTN00~99
14   Waist Swing                 WTSG00~99
15   Left Thigh Swing Out        LTSO00~99
16   Right Thigh Swing Out       RTSO00~99
17   Left Thigh Front-Back       LTFB00~99
18   Right Thigh Front-Back      RTFB00~99
19   Left Shank Bend             LSBD00~99
20   Right Shank Bend            RSBD00~99
21   Left Foot Bend              LFBD00~99
22   Right Foot Bend             RFBD00~99
23   Left Foot Turn              LFTN00~99
24   Right Foot Turn             RFTN00~99
25   Left Foot Forward Drive     LFFD00~99
26   Right Foot Forward Drive    RFFD00~99
27   Left Foot Backward Drive    LFBE00~99
28   Right Foot Backward Drive   RFBE00~99
29   Feet Crouch                 FTCH00~99
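The mnemonic-to-action mapping of Table 1 lends itself to a simple lookup table, sketched below (Python, purely illustrative; the entries shown are a representative subset of the 29 listed actions):

```python
# Mnemonic-to-action mapping taken directly from Table 1 (subset of 29).
ACTIONS = {
    "HDND": "Head Nod",
    "HDWG": "Head Wag",
    "HDSG": "Head Swing",
    "EYNN": "Eye Blink",
    "MHON": "Mouth Open",
    "FTCH": "Feet Crouch",
}

def describe(code):
    """Resolve a full code such as 'HDND07' to its Table 1 action name."""
    return ACTIONS[code[:4]]

print(describe("FTCH12"))
```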
Figure 4
Flowchart of robot action playback
The use of Ethernet as the transmission interface between the intelligent computer and the action computer is a key factor in why the CTU robots can successfully imitate instructor actions. First, the intelligent computer transmits signals to 4 servo motors via an RS-232 transmission line in order to control eye rotation; it also transmits a signal to the visual identification system, where a servo motor controls the mouth opening and a speaker generates a voice. Next, the action computer transmits commands via the RS-232 transmission line to an 80C51 communication chip, which then drives an 80C51 control chip that triggers the 27 DC motors and 10 solenoids. The robot’s limbs and trunk replay the recorded motions according to the position, speed, and acceleration values assigned by the digital 3D motion capture system. If the robot must carry out a continuous service action, it must recall and combine a script from the teaching program.
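A command carrying position, speed, and acceleration values to a motor controller over a serial line might be framed as below (Python; the frame layout — header byte, motor id, three 16-bit big-endian values, additive checksum — is entirely our invention for illustration, since the paper does not specify the RS-232 protocol):

```python
import struct

# Speculative sketch of a serial command frame for one motor. The layout
# (header, motor id, position/speed/acceleration, checksum) is assumed;
# the paper does not define the actual protocol.

def make_frame(motor_id, position, speed, accel):
    """Pack one motor command into a 9-byte frame with a simple checksum."""
    payload = struct.pack(">BHHH", motor_id, position, speed, accel)
    checksum = sum(payload) % 256
    return b"\xAA" + payload + bytes([checksum])

frame = make_frame(motor_id=5, position=1200, speed=300, accel=50)
print(frame.hex())
```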
The action computer issues commands to control the playback functions. That is, the action computer transmits signals to the 80C51 chips of the motors and solenoids in the limbs and trunk in order to control the current flows, and the robot’s limbs and trunk generate the actions simultaneously. If the robot’s action is not correct, the visual identification system identifies the deviations and transmits them to the intelligent computer. The intelligent computer then issues another command to the 80C51 communication chips to rectify the error. A flowchart of the playback action is given in Figure 4. Figure 5 shows the instructor’s teaching action, followed by the robot’s replay.
Figure 5
The robot plays back a right elbow bend in-out
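The correction loop described above, in which the visual identification system reports deviations and the intelligent computer issues a rectifying command, can be sketched as follows (Python; the angular units, tolerance value, and function names are illustrative assumptions):

```python
# Hedged sketch of the correction loop: compare the commanded pose with
# the observed pose and emit a corrective offset when the deviation
# exceeds a tolerance. Tolerance and units are assumed for illustration.

TOLERANCE_DEG = 2.0

def correction(commanded_deg, observed_deg):
    """Return a corrective offset in degrees, or None if within tolerance."""
    error = commanded_deg - observed_deg
    if abs(error) <= TOLERANCE_DEG:
        return None
    return error  # would be forwarded to the 80C51 communication chips

print(correction(45.0, 41.5))  # large deviation: a correction is issued
print(correction(45.0, 44.0))  # within tolerance: no correction
```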
5. Conclusions and suggestions for future research
When an intelligent robot possesses a higher degree of intelligence, it demonstrates actions and behaviors similar to those of humans, and humans appear more willing to accept the services provided by such a robot. In this research, we constructed a teaching and learning mechanism for a humanoid intelligent robot. During the research period, we wrote a variety of service scripts, trained the instructors, recorded their demonstrations with the digital 3D motion capture system, captured the instructors’ motions, digitized them in file form, and input them to the robot’s intelligent computer. We trained six instructors over two years in preparation for teaching the robot. This method of robotic acquisition of intelligence is more accurate and rapid than an artificial neural network or genetic algorithm. This method also has greater adaptability to the external environment than traditional programming controls.
The humanoid intelligent service robot developed at Chienkuo Technology University was given two public demonstrations. Both achieved the anticipated functions, indicating that the research and development of the robot was successful. In the future we look forward to preparing more practice scripts, training more instructors, and expanding the range of the robot’s service functions, thus establishing its commercial usefulness. We expect that the humanoid intelligent service robot developed at Chienkuo Technology University will be marketed at some point in the not-too-distant future.
References
[1] Industrial Technology Research Institute, Critical Technique in Software Development for Intelligent Robot, Taipei, 2007.
[2] Groover, M. P., Weiss, M., Nagel, R. N., Odrey, N. G., Industrial Robotics: Technology, Programming, and Applications, The McGraw-Hill Companies, Inc., 1994.
[3] Robot world information website, http://www.robotworld.org.tw/
index.htm, Accessed July 2011.
[4] Wang, W. H. Special Issue of Intelligent Robot’s Technique, Journal of
the Mechatronic Industry, Vol. 305, 2008, pp. 1–92.
[5] ISO-8373:1994, Manipulating Industrial Robots, International Organization for Standardization, Geneva, 1994.
[6] Executive Yuan Institute of Occupational Safety & Health, Robot
Damage Prevention Handbook, Taiwan, R.O.C. 2008.
[7] Pomerleau, D. A., Gowdy, J., Thorpe, C. E., Combining Artificial
Neural Networks and Symbolic Processing for Autonomous Robot
Guidance, Engineering Applications of Artificial Intelligence, Vol. 4(4),
1991, pp. 279–285.
[8] Sugihara, K., Smith, J., Genetic Algorithms for Adaptive Motion Planning of an Autonomous Mobile Robot, Proceedings of the 1997 IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA'97), 1997, pp. 138–143.
[9] Millan, J. R., Arleo, A., Neural Network Learning of Variable Grid-Based Maps for the Autonomous Navigation of Robots, Proceedings of the 1997 IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA'97), 1997, pp. 40–45.
[10] Schaal, S. Is Imitation Learning the Route to Humanoid Robots?
Trends in Cognitive Sciences, Vol. 3, 1999, pp. 233–242.
[11] Clark, C. M., Mills, J. K., Robotic System Sensitivity to Neural
Network Learning Rate: Theory, Simulation, and Experiments, The
International Journal of Robotics Research, Vol. 19, 2000, pp. 955–968.
[12] Bennewitz, M., Burgard, W., Cielniak, G., Thrun, S., Learning Motion Patterns of People for Compliant Robot Motion, The International Journal of Robotics Research, Vol. 24, 2005, pp. 31–48.
[13] Conforth, M., Meng, Y., An Artificial Neural Network Based Learning
Method for Mobile Robot Localization. in: Conforth, M. and Meng, Y.
Robotics Automation and Control, Publisher: IN-TECH, 2008.
[14] Ekvall, S., Kragic, D., Robot Learning from Demonstration: A
Task-level Planning Approach, International Journal of Advanced
Robotic Systems, Vol. 5(3), 2008, pp. 223–234.
[15] Seredin, O., Kopylov, A., Mottl, V., Selection of Subsets of Ordered
Features in Machine Learning, Transactions on Machine Learning and
Data Mining, Vol. 2(2), 2009, pp. 65–79.
[16] Wang, X., Lu, T., Zhang, P. State Generation Method for Humanoid
Motion Planning Based on Genetic Algorithm, Journal of Humanoids,
Vol. 1, 2008, pp. 17–24.
[17] Ram, A., Boone, G., Arkin, R., Pearce, M., Using Genetic Algorithms to Learn Reactive Control Parameters for Autonomous Robotic Navigation, Adaptive Behavior, Vol. 2(3), 1994, pp. 277–305.
[18] Phase Space, Inc. PSImpulse system operating procedure, 2004.
[19] Kuo-Wei Lin, Che-Chung Wang, Jung-Hsing Chen, Application of
Integrated Robust Designation Methodology in Multi-Objective
Optimization, Journal of Information & Optimization Sciences, Vol.
33(04-05), 2012, pp. 527–551.
Received March, 2013