Robot Brain Project CREST Development of Brain-Informatics Machines through Dynamical Connection of Autonomous Motion Primitives


Nakamura Group

Dynamical Information Processing for Motion Pattern Generation and Transition
[Motion generation and transition of the humanoid robot using the dynamics-based information processing system]

Motion generation and transition for a humanoid robot with 20 degrees of freedom are realized using nonlinear dynamics with multiple limit cycles in a reduced three-dimensional space. The robot's motion follows the motion of the dynamics.
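As a minimal illustration of the idea (not the group's actual dynamics), a Hopf oscillator shows how a nonlinear system with a limit-cycle attractor can serve as a motion primitive; all names and parameters below are hypothetical:

```python
import numpy as np

def hopf_step(x, y, dt=0.001, mu=1.0, omega=2.0 * np.pi):
    """One Euler step of a Hopf oscillator: its attractor is a circular
    limit cycle of radius sqrt(mu), so nearby states are drawn onto it."""
    r2 = x * x + y * y
    return (x + dt * ((mu - r2) * x - omega * y),
            y + dt * ((mu - r2) * y + omega * x))

# A trajectory started off the cycle converges to the unit circle;
# joint angles can then be read out as functions of the cycle phase.
x, y = 0.1, 0.0
for _ in range(20000):          # 20 s of simulated time
    x, y = hopf_step(x, y)
print(f"final radius = {np.hypot(x, y):.2f}")   # settles close to sqrt(mu) = 1
```

Because the cycle is an attractor, small perturbations of the state decay, which is what makes such dynamics attractive as robust motion generators.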
[Motion of the humanoid robot HOAP-1]

The humanoid robot HOAP-1 moves according to the whole-body motion data generated by the nonlinear dynamics.
[Motion generation of the humanoid robot using the connection of the sensor space and motor space]

Based on the sensor signal, the basins of attraction are changed in the sensor space; this changes the motion of the dynamics in the motor space and generates the whole-body motion of the humanoid robot.
[On-line design of the dynamics]

The nonlinear dynamics is designed on-line using an on-line least-squares method. Because of the forgetting factor, the robot memorizes the new whole-body motion while old motions gradually fade.
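An on-line least-squares update with a forgetting factor can be sketched as follows; this is a generic recursive least-squares (RLS) routine, not the group's actual design method, and the 1-D regression at the end is only a toy illustration:

```python
import numpy as np

class ForgettingRLS:
    """Recursive least squares with forgetting factor lam < 1:
    recent samples dominate the estimate, so old data fades out."""
    def __init__(self, n_params, lam=0.99):
        self.w = np.zeros(n_params)
        self.P = np.eye(n_params) * 1e3   # large initial covariance
        self.lam = lam

    def update(self, phi, y):
        """phi: feature vector, y: observed target."""
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)        # Kalman-like gain
        self.w = self.w + k * (y - phi @ self.w)  # correct the estimate
        self.P = (self.P - np.outer(k, Pphi)) / self.lam

# Fit y = 2x, then switch to y = -3x: the old slope is forgotten.
rls = ForgettingRLS(1, lam=0.95)
for x in np.linspace(0, 1, 200):
    rls.update(np.array([x]), 2.0 * x)
for x in np.linspace(0, 1, 200):
    rls.update(np.array([x]), -3.0 * x)
print(round(rls.w[0], 2))   # near -3.0: the new relation replaced the old
```

With lam = 1 the estimate would average over all past data; lam < 1 is what lets a new whole-body motion overwrite an old one.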
[Motion of the designed dynamics]

This movie shows the motion of the designed dynamics. The dynamics forgets the old closed curve and memorizes the new one.
[The whole body motion generation based on the continuous symbol space]

The nonlinear dynamics that generates the robot's whole-body motion is represented by a single point in the symbol space. The motion of this point in the symbol space (left-hand side; only three of the eight dimensions are shown) determines the nonlinear dynamics in the motor space that generates the humanoid robot's motion.
Humanoid Robot Design for Mobility Evolution
[Free motion of the backlash clutch]

Free motion is realized by keeping the gap of the backlash clutch constant.
[Motion of the backlash clutch mechanism]

This movie shows another view of the above experiment. The gap of the backlash clutch mechanism is kept constant.
[Squat motion of the humanoid robot UT-Theta]

This movie shows the squat motion of the UT-Theta.
Stochastic Information Processing that Unifies Recognition and Generation of Motion Patterns - Toward Symbolic Understanding of the Continuous World -
[Motion acquisition by mimesis model based on discrete hidden Markov model]

The mimesis model abstracts an input motion pattern (top) into a proto-symbol representation using a discrete hidden Markov model. The bottom motion is generated from the proto-symbol using a genetic algorithm and the discrete hidden Markov model. The result demonstrates the feasibility of imitation learning.
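The generation half of such a model can be illustrated by sampling from a discrete HMM; the two-state model below is hypothetical and merely stands in for a learned proto-symbol:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2-state discrete HMM standing in for a proto-symbol:
# A = state transition matrix, B = output probabilities over 3 motion elements.
A = np.array([[0.9, 0.1],
              [0.0, 1.0]])
B = np.array([[0.8, 0.2, 0.0],
              [0.0, 0.3, 0.7]])

def generate(T=20):
    """Sample a motion-element sequence of length T from the HMM."""
    s, seq = 0, []
    for _ in range(T):
        seq.append(int(rng.choice(3, p=B[s])))  # emit a motion element
        s = int(rng.choice(2, p=A[s]))          # move to the next state
    return seq

print(generate())   # e.g. mostly element 0 early, element 2 after the switch
```

In the actual mimesis framework the emitted symbols correspond to elements of whole-body posture, and recognition runs the same model in the likelihood direction.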
[Real-time recognition of unknown motion and real-time generation of novel motion using proto-symbol space]

We have realized real-time motion recognition and generation using a proto-symbol space constructed from continuous hidden Markov models, the Kullback-Leibler divergence, and multidimensional scaling.
The movie on the top right is an input motion pattern performed by a human, the movie on the bottom left is the real-time recognition result in the proto-symbol space, and the movie on the top left is the generation result on a humanoid robot.
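The embedding step can be sketched with classical multidimensional scaling applied to a symmetric distance matrix, e.g. symmetrized Kullback-Leibler divergences between motion HMMs; the distance values below are hypothetical:

```python
import numpy as np

def classical_mds(D, dim=2):
    """Embed points with pairwise distances D into a dim-dimensional
    Euclidean space (classical MDS via double centering)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J            # centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:dim]   # keep the largest eigenvalues
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# Hypothetical symmetrized KL divergences between three motion HMMs.
D = np.array([[0.0, 1.0, 4.0],
              [1.0, 0.0, 3.0],
              [4.0, 3.0, 0.0]])
X = classical_mds(D, dim=2)
print(np.round(np.linalg.norm(X[0] - X[1]), 1))  # reproduces the distance 1.0
```

Each motion model becomes one point; nearby points correspond to similar motions, which is what makes interpolation between proto-symbols meaningful.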
Dynamics Computation and Behavior Capture of Human Figures
Simulation of a structure-varying kinematic chain using our efficient forward dynamics algorithm. The user can change the structure interactively by a mouse click.
Physically consistent walking motion with a turn, generated from a captured straight walking motion using the dynamics filter.
Operation of UTPoser, an intuitive interface for generating whole-body motions using an enhanced inverse kinematics algorithm. The user can generate natural motions simply by specifying the pinned and dragged links.
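A pin-and-drag interface of this kind typically rests on differential inverse kinematics. As a generic sketch (not UTPoser's actual algorithm), damped least-squares IK for a planar two-link arm looks like this:

```python
import numpy as np

def fk(q, l1=1.0, l2=1.0):
    """End-effector position of a planar 2-link arm."""
    return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                     l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

def jacobian(q, l1=1.0, l2=1.0):
    s1, s12 = np.sin(q[0]), np.sin(q[0] + q[1])
    c1, c12 = np.cos(q[0]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def drag_to(q, target, damping=0.1, steps=200):
    """Pull the end effector toward target with damped least-squares IK."""
    for _ in range(steps):
        e = target - fk(q)
        J = jacobian(q)
        # dq = (J^T J + lambda^2 I)^-1 J^T e; damping keeps steps stable
        dq = np.linalg.solve(J.T @ J + damping ** 2 * np.eye(2), J.T @ e)
        q = q + 0.5 * dq
    return q

q = drag_to(np.array([0.3, 0.3]), np.array([1.0, 1.0]))
print(np.round(fk(q), 2))   # end effector reaches the dragged target [1, 1]
```

The damping term is what keeps the solution well behaved near kinematic singularities, at the cost of slightly slower tracking.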
Simultaneous measurement of behavior and intention using the behavior capture system, where the motion capture system, the eye-mark recorder, and the force plates measure the subject's motion, the gaze direction and target, and the center of pressure, respectively.
The muscle forces required to realize the kick motion, computed by the inverse dynamics algorithm for musculoskeletal human models. The red lines indicate the muscles that need larger forces.
Asada Group
[Behavior Acquisition by Multi-Layered Reinforcement Learning]

We proposed a multi-layered learning system that constructs the state and action spaces of a higher-level learning system from situations and behaviors abstracted by learning modules at the lower levels. The two movies show simple navigation and shooting behaviors, and camera images from the robot, after the multi-layered learning system was applied to the robot.
[The result of reinforcement learning of the parameters of rhythmic walking of a humanoid robot based on vision sensor]

The robot learned to reach the shooting position by a reinforcement learning method. The lower-left picture shows the image from the fish-eye-lens camera on the robot. Rewards are given when the ball is observed at the robot's feet and the goal is observed in the center. The robot learned the walking parameters that maximize the expected reward.
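Parameter learning of this kind can be sketched, in highly simplified form, as stochastic search over walking parameters that maximizes an expected reward; the reward function and its optimum below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_reward(theta):
    """Stand-in for the robot's reward: highest when the (hypothetical)
    stride and phase parameters hit their best values (0.3, 0.7)."""
    return -np.sum((theta - np.array([0.3, 0.7])) ** 2)

# Simple stochastic hill climbing over the walking parameters: keep a
# perturbed parameter vector whenever it improves the expected reward.
theta = np.zeros(2)
for _ in range(500):
    candidate = theta + rng.normal(scale=0.05, size=2)
    if expected_reward(candidate) > expected_reward(theta):
        theta = candidate
print(np.round(theta, 1))   # close to the hypothetical optimum [0.3, 0.7]
```

On the real robot the reward is not a closed-form function but is estimated from trials, which is why the actual learning uses a reinforcement learning method rather than direct search.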
Tsuchiya Group
This movie shows the results of hardware experiments using the humanoid robot HOAP-1. The stride and duty ratio are fixed at 3 [cm] and 0.50, respectively. The scenes show the cases of uphill walking and downhill walking.
The first two scenes show the cases where the duty ratio is 0.50 and 0.70, respectively. The final scene shows the case where the duty ratio is commanded to change from 0.70 to 0.50 over 20 [sec] during locomotion.
Ushio Group
Using hybrid state nets, we obtain dynamical representations of periodic motions and the transitions among them. This figure shows a sequence of motion transitions among walk, squat, and footing.
Using modular state nets for the left and right arms and timed Petri nets, we obtain collision-free motions. This figure shows the humanoid robot HOAP-1 waving flags without collision between its arms.
Yoshizawa Group
[Face recognition system based on online linear discriminant analysis (OLDA)]

By switching between recognition mode and learning mode, one can dynamically add or update object classes on the fly.
[OLDA-based arbitrary pattern recognition system with interactive trainability]

Applications of OLDA are not limited to face images. One can register arbitrary image patterns to our recognition system interactively through a graphical user interface.
[Heel tracking by arbitrary pattern recognition system]

To extract the human walking pitch, the person's heel is visually tracked. The arbitrary pattern recognition system based on OLDA is used for registration and discrimination of the heel.


Nakamura Group - Humanoid Robot Design for Mobility Evolution
[Humanoid robot UT-theta]

This robot has the cybernetic shoulder, double spherical hip joints, and a backlash clutch. The body height is 157 [cm], the body weight is 47 [kg], and the total number of degrees of freedom is 23. It has six-axis force sensors (in the wrists and ankles), a color CCD camera, and two monochrome progressive-scan cameras.
[Communication robot UT-omega]

This figure shows the wheeled mobile robot UT-omega. It has 10 tactile sensors and a six-axis force sensor.
[A Miniature Anthropomorphic Robot UT-mu for Fundamental Motion Control Experiments]
Height: 58 [cm], Weight: 6.5 [kg], Number of joints: 20, Actuators: coreless DC motors (11 [W], 6.5 [W], 4 [W]), Sensors: 3-axis force sensor PicoForce
[High power intelligent motor driver]
Based on the Titch Intelligent Motor Driver produced by Okazakisangyo Co., we developed a high-power version of the motor driver in cooperation with Okazakisangyo Co. The rated voltage is 36 [V], and the maximum current is 20 [A].
[2-channel compact high-power motor driver]
We developed a 2-channel compact high-power motor driver in cooperation with 3TEC Co. The rated voltage is 36 [V], and the maximum current is 20 [A] per channel (30 [A] total per board). It has an SH4 CPU and communicates with a PC through RS-232C.
[TSU-CNT: Sensor signal processing board and digital sensor units]

We developed a small computer board for sensor signal processing. Sensor signals are acquired through 16 digital channels and processed on an SH-4 CPU running at 118 MHz. The board communicates with a PC through an RS-232C serial port. The three sensors above the board in the photo are, from left to right, gyro sensors, a high-frequency accelerometer, and a low-frequency accelerometer.
Nakamura Group - Dynamics Computation and Behavior Capture of Human Figures
Capture scene in the behavior capture studio, where the motion capture system, the force plate, and the eye-mark recorder are used. The lower-right picture shows the electromyograph (EMG).
Tsuchiya Group
This photo shows the developed biped robot. The robot has 15 DOF in total, consisting of two legs with 6 DOF each and a trunk with 3 DOF. The upper body houses electrical circuits such as the controller and motor drivers. The robot communicates with the host PC through an Ethernet LAN.
This photo shows the developed counter board for optical encoders. The board has 16 channels with 24-bit resolution and can count the encoders' pulse signals in asynchronous mode. The interface bus is a subset of the PC/104 bus. The size of the board is 80 mm × 60 mm.
This photo shows the developed D/A converter board. The board has 16 analog output channels with 12-bit resolution and can output analog signals in asynchronous mode. The output range of each channel is selectable between -5 to +5 V and 0 to 10 V. The interface bus is a subset of the PC/104 bus. The size of the board is 80 mm × 60 mm.