====== Robot Simulator ======

This page describes desired specifications of the robotic simulator to be used around WBAI.
==== Simulation Environment ====
A sample environment with PyGazebo and an agent controlled with BriCA, Brain Simulator<sup>TM</sup>, or Nengo can be found on [[https://github.com/wbap/WinterMaze2016/|GitHub]].\\ 
[[https://github.com/wbap/lis/|LIS (Life in Silico)]], another environment built with [[http://unity3d.com|the Unity Game Engine]] and [[http://chainer.org|Chainer]], is being developed.
=== Simulator ===
Currently a prototype is being developed with [[http://pygazebo.readthedocs.org/|PyGazebo]] ([[https://www.youtube.com/watch?v=V6MfM0d_h4k|video]]) and with the Unity Game Engine (LIS above).
=== Control environment ===
Robots are to be controlled with [[BriCA]], [[http://www.goodai.com/#!brain-simulator/c81c|Brain Simulator]]<sup>TM</sup>, or [[http://www.nengo.ca|Nengo]] from outside of the simulator.\\
The recommended control language is Python, as it is easy to use and has low platform dependency.
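The external-control arrangement above can be sketched as a plain Python sense–act loop. The ''SimulatedRobot'' stub and its method names below are hypothetical placeholders for whatever transport the simulator actually exposes (e.g., PyGazebo publishers/subscribers); only the loop structure is the point.

```python
# Minimal sketch of a controller running outside the simulator.
# SimulatedRobot is a hypothetical stand-in for the real simulator endpoint.

class SimulatedRobot:
    """Stub endpoint: a robot that can sense its x position and drive along x."""
    def __init__(self):
        self.x = 0.0

    def sense(self):
        # A real wrapper would return sensor data received from the simulator.
        return {"x": self.x}

    def act(self, forward_speed, dt=0.1):
        # A real wrapper would publish a motor command to the simulator.
        self.x += forward_speed * dt

def control_step(observation):
    # Trivial policy: drive forward until x reaches 5.0, then stop.
    return 1.0 if observation["x"] < 5.0 else 0.0

def run(robot, steps=100):
    # The sense -> decide -> act loop an external controller would run.
    for _ in range(steps):
        command = control_step(robot.sense())
        robot.act(command)
    return robot.sense()
```

A BriCA, Brain Simulator<sup>TM</sup>, or Nengo controller would replace ''control_step'' with its own module graph while keeping the same sense–act cycle against the simulator.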
=== Task environment ===
With regard to rodent-level intelligence, [[http://www.ratbehavior.org/RatsAndMazes.htm|mazes for behavioral tests]] are to be implemented.\\
As for the task environment for human-level intelligence, the simulation environment for [[http://www.robocupathome.org|RoboCup@Home]] will be considered.  Currently, its reference environment is implemented with [[http://www.sigverse.org/wiki/en/|SigVerse]], so anyone contesting in a RoboCup league would have to use SigVerse.
However, as the simulator we use for our prototype is PyGazebo, we might propose using PyGazebo in a future RoboCup.
==== Robot (Overview) ====
The body shape of a simulated robot may be:\\
  * Two-wheel turtle (good enough for rodent-level)
  * ‘Centurtle’: with a turtle-like lower body and humanoid upper body ([[https://twitter.com/RoboCupHomeSim/media|as used in the RoboCup@Home Simulation League]])
  * Bipedal humanoid
It is desirable that a simulated robot has the following functions:\\
  * Speech function (optional)
==== Input ====
It is desirable that a simulated robot has the following input functions:
  * Visual perception
    * Color perception\\ While animals do not always have color vision, for engineering purposes it is easier to process visual information with color.
    * Depth perception\\ While animals do not always have stereo vision, for engineering purposes it is easier to process visual information with depth.
  * Reward
    * External reward is given when the robot obtains a specific item (e.g., bait).
    * Internal reward is given by internal logic when, for example, curiosity is satisfied.
  * Tactile perception (optional)
  * Text input (optional)
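The two reward channels above can be combined in the controller. The sketch below is a hypothetical illustration only: the additive weighting and the count-based novelty bonus standing in for "curiosity is satisfied" are assumptions, not part of this specification.

```python
# Sketch: combining an external reward (e.g., reaching bait in a maze)
# with an internal curiosity reward. The count-based novelty bonus is one
# common stand-in for curiosity; the spec does not prescribe a formula.
from collections import Counter

class RewardModel:
    def __init__(self, curiosity_weight=0.1):
        self.visits = Counter()          # how often each state was visited
        self.curiosity_weight = curiosity_weight

    def external(self, state, bait_cell):
        # External reward: 1.0 when the robot reaches the bait cell.
        return 1.0 if state == bait_cell else 0.0

    def internal(self, state):
        # Internal reward: novelty bonus that decays with repeated visits.
        self.visits[state] += 1
        return 1.0 / self.visits[state]

    def total(self, state, bait_cell):
        return (self.external(state, bait_cell)
                + self.curiosity_weight * self.internal(state))
```

For example, revisiting the same cell halves the curiosity bonus, while reaching the bait cell adds the full external reward on top of it.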
==== Output ====
It is desirable that a simulated robot has the following output functions:
  * Locomotion
    * Default: L/R bi-wheel
    * Challenger’s option: N-pedal walker
  * Manipulator (optional)\\ As a minimal means of object manipulation, a robot can move exterior objects by pushing them with its own body.
  * Vocalization (optional)\\ One of the following:
    * Text-to-speech (parser and word-phonetic dictionary)
    * Phoneme vocalizer (Phoneme sets are language dependent.)
    * General-purpose sound synthesizer
  * Text output (optional)
  * Emotional expression (optional)\\ Robots with social interaction may require it.
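For the default L/R bi-wheel base, standard differential-drive kinematics gives the pose update from the two wheel speeds. The sketch below is generic textbook kinematics (one Euler integration step), not the API of any particular simulator.

```python
import math

def diff_drive_step(x, y, theta, v_left, v_right, axle_width, dt):
    """One Euler step of differential-drive (L/R bi-wheel) kinematics.

    v_left / v_right are wheel ground speeds, axle_width is the wheel
    separation, and (x, y, theta) is the robot pose; returns the new pose.
    """
    v = (v_right + v_left) / 2.0             # forward speed of the midpoint
    omega = (v_right - v_left) / axle_width  # yaw rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```

Equal wheel speeds drive the robot straight along its heading; opposite wheel speeds make it turn in place.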
==== Perception API ====
While perceptual information processing may be implemented with machine learning algorithms, when it is not the main subject of research, it would be easier to use off-the-shelf libraries.  With the simulator, some information may be obtained ‘by cheating’ directly from the simulation environment.\\
APIs are to be wrapped for access from BriCA / Brain Simulator<sup>TM</sup>.
  * Visual information processing
The following are to be served:
    * Deep Learning APIs for visual information processing
    * Image processing APIs such as OpenCV / SimpleCV
And the following may be served as options:
    * Object detection API (utilizing depth information)
      * Border Ownership / Figure-Ground Separation API
      * Measuring apparent size, relative direction, distance, relative velocity, etc.
  * Face detection API
  * Human skeletal mapping API (as found in the Kinect API)
  * Facial expression recognition API (adapted to the facial expressions of robots)
  * Auditory information processing
    * Sound and speech recognition API\\ Functions such as those found in [[https://en.wikipedia.org/wiki/Julius_(software)|Julius]] / [[http://www.hark.jp|HARK]]
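Obtaining measurements ‘by cheating’ directly from the simulation state, as mentioned above, can be as simple as geometry on ground-truth poses. The function below is a hypothetical sketch: the names and the flat 2-D disc model for apparent size are assumptions, not an API of any listed library.

```python
import math

def cheat_object_measurement(robot_pose, obj_pos, obj_radius):
    """Ground-truth 'perception' read straight from simulator state.

    robot_pose = (x, y, heading in radians); obj_pos = (x, y).
    Returns (distance, direction relative to heading, apparent angular size),
    using simple 2-D geometry with no sensor model.
    """
    rx, ry, heading = robot_pose
    ox, oy = obj_pos
    dx, dy = ox - rx, oy - ry
    distance = math.hypot(dx, dy)
    relative_direction = math.atan2(dy, dx) - heading
    # Apparent size: angle subtended by a disc of obj_radius at this distance.
    apparent_size = 2.0 * math.atan2(obj_radius, distance)
    return distance, relative_direction, apparent_size
```

Relative velocity could be obtained the same way by differencing two such measurements over a known time step.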
  • robot_simulator.txt
  • Last modified: 2018/12/29 10:21
  • by n.arakawa