====== Robot Simulator ======
robot_simulator, last revised 2018/12/29 10:21 by n.arakawa.
Robotic simulators can be used as virtual environments in which AGI (wannabe) systems are trained and evaluated.
  
This page describes desired specifications of the robotic simulator to be used around WBAI.\\ Please also check our request for research: [[https://wba-initiative.org/en/research/rfr/3d-agent-test-suites/|3D Agent Test Suites]].
==== Simulation Environment ====
A sample environment with PyGazebo and an agent controlled with BriCA, Brain Simulator<sup>TM</sup>, or Nengo can be found on [[https://github.com/wbap/WinterMaze2016/|GitHub]].\\
[[https://github.com/wbap/lis/|LIS (Life in Silico)]], another environment built with [[http://unity3d.com|the Unity Game Engine]] and [[http://chainer.org|Chainer]], is being developed.
=== Simulator ===
Currently a prototype is being developed with [[http://pygazebo.readthedocs.org/|PyGazebo]] ([[https://www.youtube.com/watch?v=V6MfM0d_h4k|video]]) and with the Unity Game Engine (LIS above).
=== Control environment ===
Robots are to be controlled with [[BriCA]], [[http://www.goodai.com/#!brain-simulator/c81c|Brain Simulator]]<sup>TM</sup>, or [[http://www.nengo.ca|Nengo]] from outside of the simulator.\\
The recommended controlling language is Python (as it is easy to use and low in platform dependency).
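Controlling a robot from outside the simulator typically reduces to a sense-command loop over the simulator connection. The sketch below illustrates that pattern in Python; ''DummySimulator'' and its method names are hypothetical stand-ins for a real connection (e.g. to PyGazebo), not an actual API.

```python
import random

class DummySimulator:
    """Hypothetical stand-in for a simulator connection; a real backend
    such as PyGazebo exposes a different API. Only the control pattern
    (read sensors, send a command, repeat) is the point here."""
    def __init__(self):
        self.position = 0.0  # 1-D robot position, metres

    def read_sensors(self):
        # Return an observation: true position plus small sensor noise.
        return {"position": self.position + random.gauss(0.0, 0.01)}

    def send_command(self, velocity):
        # Integrate the commanded velocity over one 0.1 s simulation step.
        self.position += velocity * 0.1

def control_loop(sim, target=1.0, steps=50, gain=0.5):
    """Simple proportional controller driving the robot toward `target`,
    running entirely outside the (dummy) simulator."""
    for _ in range(steps):
        obs = sim.read_sensors()
        error = target - obs["position"]
        sim.send_command(gain * error)
    return sim.position

final = control_loop(DummySimulator())
```

The same loop structure applies whether the controller is a hand-written policy or a BriCA / Brain Simulator<sup>TM</sup> module graph; only the sensing and command calls change.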
=== Task environment ===
The body shape of a simulated robot may be:\\
  * Two-wheel turtle (good enough for rodent-level)
  * ‘Centurtle’: with a turtle-like lower body and a humanoid upper body ([[https://twitter.com/RoboCupHomeSim/media|as used in the RoboCup@Home Simulation League]])
  * Bipedal humanoid
It is desirable that a simulated robot has the following functions:\\
It is desirable that a simulated robot has the following output functions.
  * Locomotion
    * Default: L/R bi-wheel
    * Challenger’s option: N-pedal walker
  * Manipulator (optional)\\ As a minimal way for object manipulation, a robot can move exterior objects by pushing them with its own body.
  * Vocalization (optional)\\ One of the following:
    * Text-to-speech (parser and word-phonetic dictionary)
    * Phoneme vocalizer (Phoneme sets are language dependent.)
    * General-purpose sound synthesizer
  * Text output (optional)
  * Emotional expression (optional)\\ Robots with social interaction may require it.
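For the default L/R bi-wheel locomotion, a controller usually converts a body velocity command into per-wheel speeds via standard differential-drive kinematics. A minimal sketch (the wheel-base value and function name are illustrative assumptions, not part of this spec):

```python
def diff_drive_wheel_speeds(linear, angular, wheel_base=0.3):
    """Convert a body velocity command (linear m/s, angular rad/s)
    into left/right wheel surface speeds for a two-wheel robot."""
    left = linear - angular * wheel_base / 2.0
    right = linear + angular * wheel_base / 2.0
    return left, right

# Driving straight at 0.5 m/s: both wheels run at the same speed.
# diff_drive_wheel_speeds(0.5, 0.0)  ->  (0.5, 0.5)
```

Equal speeds give straight motion, opposite speeds turn the robot in place; this is why the bi-wheel base is sufficient for rodent-level navigation tasks.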
==== Perception API ====
While perceptual information processing may be implemented with machine learning algorithms, it is easier to use off-the-shelf libraries when perception is not the main subject of research. With the simulator, some information may also be obtained ‘by cheating’, i.e., read directly from the simulation environment.\\
APIs are to be wrapped for access from BriCA / Brain Simulator<sup>TM</sup>.
  * Visual information processing
The following are to be served:
    * Deep Learning APIs for visual information processing
    * Image processing APIs such as OpenCV / SimpleCV
And the following may be served as options:
    * Object detection API\\ (utilizing depth info.)
      * Border Ownership / Figure Ground Separation API
      * Measuring the apparent size, relative direction, distance, relative velocity, etc.
  * Face detection API
  * Human skeletal mapping API (as found in the Kinect API)
  * Facial expression recognition API (adapted to the facial expressions of robots)
  * Auditory information processing
    * Sound and speech recognition API\\ Functions such as those in [[https://en.wikipedia.org/wiki/Julius_(software)|Julius]] / [[http://www.hark.jp|HARK]]
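Wrapping a perception backend behind a uniform interface, as described above, might look like the following sketch. The class and attribute names are hypothetical, not the actual BriCA API, and the dummy detector merely stands in for a real library such as OpenCV.

```python
class PerceptionModule:
    """Hypothetical wrapper exposing a perception backend through a single
    port-like interface so that a control framework can poll it each tick."""
    def __init__(self, backend):
        self.backend = backend  # any callable: image -> list of detections
        self.out_port = []      # latest results, read by the controller

    def step(self, image):
        # Run the backend once and publish its results on the output port.
        self.out_port = self.backend(image)
        return self.out_port

# Dummy backend standing in for a real detector: report bright pixels.
def bright_pixel_detector(image):
    return [(x, y) for y, row in enumerate(image)
                   for x, v in enumerate(row) if v > 200]

module = PerceptionModule(bright_pixel_detector)
detections = module.step([[0, 255], [10, 0]])  # one bright pixel at (1, 0)
```

Because every wrapped API (object detection, face detection, speech recognition, ...) shares this read-sensor / publish-result shape, the controller can treat 'cheated' simulator data and real library output interchangeably.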