Simulated Pioneer GPS Service

The Simulated Pioneer GPS service provides a simple localization service to be used in the Microsoft Robotics Developer Studio simulator. It consists of a simple box-shaped entity (PioneerGPSEntity) that can be attached to a simulated robot, and a service (SimulatedPioneerGPS) that provides notifications indicating the updated X, Y, Z coordinates of the robot in the simulated world.

 This service is based on the Simulated GPS service that comes with MRDS 2008 R2.
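The service itself is written in C# against the MRDS DSS runtime; the following Python sketch (all names here are hypothetical, not the actual service API) merely illustrates the notification pattern it follows: subscribers receive the updated coordinates each time the simulator moves the entity.

```python
# Hypothetical sketch of the SimulatedPioneerGPS notification pattern.
# Not the real MRDS C# service -- names and types are illustrative only.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class GpsState:
    x: float
    y: float
    z: float

class SimulatedGps:
    def __init__(self) -> None:
        self.state = GpsState(0.0, 0.0, 0.0)
        self._subscribers: List[Callable[[GpsState], None]] = []

    def subscribe(self, handler: Callable[[GpsState], None]) -> None:
        # In DSS terms, this stands in for a Subscribe operation.
        self._subscribers.append(handler)

    def update_pose(self, x: float, y: float, z: float) -> None:
        # Called with the entity's world pose on each simulator update;
        # every subscriber gets a Replace-style notification.
        self.state = GpsState(x, y, z)
        for handler in self._subscribers:
            handler(self.state)
```

A subscribing service would register a handler once and then simply react to each incoming state update.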

Service Download

 Simulated Pioneer GPS source code for MRDS 2008 R2 is available here.

Installation Instructions

Download the ZIP file to your MRDS home directory. When you unzip the file, it creates one project in the
packages\crubots\simulation\sensors directory under your MRDS installation:

The folder SimulatedPioneerGPS will contain the source code.

If you want to compile the project yourself, then open
it and do a rebuild (see the note below first!):



In order to have the project references work with your particular settings,
you will need to run DssProjectMigration.exe. For instance (from the MRDS
command prompt):

 bin\DssProjectMigration.exe packages\crubots\simulation\sensors\SimulatedPioneerGPS

See the Readme.txt file for more details. Use the MRDS forum if you have any questions about this service.

Can a robot pass the mirror test?

First of all, the mirror test is not exactly intended as a general test for consciousness, but as a specific test for self-consciousness, and more exactly for self-recognition. It is generally applied to some higher mammals and to human infants. The test consists of determining whether or not the subject can recognize its own reflection in a mirror. So far, only subjects belonging to the following species have passed the mirror test:

humans (over 2 years old),
great apes (bonobos, chimps, orangutans, and gorillas),
rhesus monkeys,
bottlenose dolphins,
and octopuses.

I think it is important to note that only a limited number of individuals of these species have passed the test, while others generally fail it. Obviously, the test has to be adapted to each species, although it typically consists of an odorless paint mark applied to the forehead while the animal is anesthetized.

The mirror test has been considered by some researchers as one of the best available ways to test self-consciousness in organisms (see for instance how it is applied to elephants in [1], and see [2] for an open discussion about the validity of the mirror test). The mirror test is famous thanks to its application to primates, as introduced by Gordon Gallup in the 1970s [3]. However, little work has been done on applying the mirror test to robots.

Can we build a robot able to successfully pass the mirror test? And if so, does it really mean that the robot is self-aware?

Takeno et al. [4] at Meiji University in Japan claim that they have succeeded in achieving mirror image cognition for a robot. They define four steps for their experiments, where four robots are used: the self robot Rs, the other robot Ro, the controlled robot Rc, and the automatic robot Ra. The first two robots are endowed with the mirror image cognition system. The third robot is controlled by the self robot, while the last one moves automatically.

The four experiments are as follows:

1) The self robot Rs imitates the action of its own image reflected in a mirror.
2) The self robot Rs imitates an action taken intentionally by the other robot Ro as imitative behavior.
3) The controlled robot Rc is completely controlled by the self robot Rs so as to imitate its behavior.
4) The self robot Rs imitates the random actions of the automatic robot Ra.

The robot is able to recognize its own image reflected in a mirror without confusing it with the image of another robot of the same physical appearance. The mirror image cognition system is based on an artificial neural network. The aim of this system is to recognize and differentiate the robot’s own behavior from another robot’s behavior. Takeno also suggests that imitation is a proof of consciousness, as it requires the recognition of another subject’s behavior and then the application of that behavior to oneself.

The results described in the paper indicate that in some way the robots are passing the mirror test with an accuracy of 70%, but I am reluctant to claim that they are self-conscious. I would rather say that they present a-consciousness of their recognized image.

[3] Gallup, G.G., Jr. (1977). Self-recognition in primates: A comparative approach to the bidirectional properties of consciousness. American Psychologist, 32, 329-337.
[4] Junichi Takeno, Keita Inaba, Tohru Suzuki. Experiments and examination of mirror image cognition using a small robot. Proceedings. 2005 IEEE International Symposium on Computational Intelligence in Robotics and Automation, 2005. CIRA 2005. Full paper available at:


Wakamaru, the communication robot by Mitsubishi.

Wakamaru is a robot designed for human communication and interaction, with its own “personality”.

This robot has been designed by Toshiyuki Kita to live with humans. Its robotic body features distinctive eyes, a mouth, and eyebrows. The built-in technology is expected to endow the robot with rich, human-like communication. Wakamaru is able to keep eye contact, move autonomously, and connect to the network. Wakamaru is also designed to be safe, following the same standards used in toy manufacturing and design.

Wakamaru is equipped with wheels for mobility and the following sensors: self-position measurement, infrared obstacle detection, ultrasonic obstacle detection, and collision detection. Social skills are based on detection of moving persons, face detection, voice recognition, and speech synthesis. Wakamaru is 1 meter tall and weighs around 30 kg.

The following video shows Wakamaru in New York City:

More information at Wakamaru Homepage.

Aldebaran NAO

The following is a video of an Aldebaran Nao prototype robot at WAF 2008 (Workshop de Agentes Físicos, 2008, Vigo, Spain). This one is used by the University Rey Juan Carlos of Madrid to participate in the RoboCup competition. After Sony decided to stop manufacturing the Aibo, the Nao has become the new physical robot in that category.

As can be observed in the demo, some improvements to balance and movement are still to be made in order to obtain an effective robotic soccer player.

RoboChamps Competition

Microsoft has recently launched the RoboChamps simulated robotics competition and portal.

The portal supports the league competitions with training, access to software, and community features. RoboChamps is based on simulation, so you don’t need any real robotics hardware in order to compete. The idea is that you can start programming robots simply using Microsoft Robotics Developer Studio 2008 and Visual Studio.

As the competition is based on MRDS 2008 and its simulation capabilities, you can use any .NET language to program your simulated robot. The good thing is that you could use exactly the same code to control a real physical robot. In any case, the great advantage of this simulated competition is that you are provided with rich simulation environments. It is usually very hard to build such scenarios in the real world, so these simulation environments allow us to test and train our robot controllers in nearly real-world situations. Imagine controlling your rescue robot in a city that has just suffered a natural disaster or a terrorist attack, or letting your autonomous car drive under intense traffic conditions… or the much more relaxing exploration of the surface of Mars. These are some of the challenges proposed in RoboChamps.

Each challenge consists of a 3D simulation environment, a robot, a challenge scenario, and a set of rules for completing the scenario. In addition, a referee service will monitor your control service, ensuring that the rules are followed, determining your score, and submitting it to the portal.

The available RoboChamp challenges are:

  • AMAZED CHALLENGE:  Use your sensors to avoid traps and other surprises as you navigate the twists and turns of Amazed.

  • MARS ROVER CHALLENGE: Next Stop, Mars! Navigate the terrain of the red planet and collect data for analysis back on Earth.

  • URBAN CHALLENGE: Do you hate driving downtown? What if you could program a car to do the driving itself? Now you can.

  • SEARCH & RESCUE CHALLENGE: Ready to be a hero? Scour through the post-disaster rubble to find and rescue survivors in this challenge.

  • SUMO CHALLENGE: Two robots. One ring. It’s sumo time! Outmaneuver your opponent and push it out of the ring to reign victorious.

  • TOURNAMENT: Are you the best of the best? Take to the field with your robot and your best code to compete head to head against fellow leaguers.

I have registered as a competitor in the Academic zone.

WowWee Rovio

WowWee, the creators of Robosapien, are preparing a new robot called Rovio, which was presented at CES 2008. It will be available this summer and will cost around 200 euros.

Rovio is an omnidirectional wheeled robot equipped with an onboard webcam. The robot can be controlled from anywhere using a PC, and the images from the webcam can also be viewed. Even the camera angle can be remotely controlled in order to inspect the robot's surroundings. It can be used for surveillance, and for remote listening and talking. Rovio is also able to automatically navigate to the nearest charging station.

The following video is a Rovio demo at CES 2008:

Simulated Pioneer 3DX Bumper

SimulatedPioneerBumper Service

Microsoft Robotics Studio comes with a simulated Pioneer 3DX entity that can be used in the Visual Simulation Environment. This simulated robot can be equipped with several simulated sensors, like the LRF or the simulated bumper. Usually the P3DX bumper is modeled as just one frontal contact sensor and one rear contact sensor. However, the real Pioneer robot usually comes with two bumper rings, each having five bump panels:


Given the need for more accurate models of the real sensors, we have been working on additional simulation services, like the Simulated Sonar service. In this case, we wanted to accurately simulate the frontal and rear bumper rings of the Pioneer ARCOS robot base. The Simulated Pioneer Bumper service models the ten bump panels using ten BoxShapes located at approximately the same positions as on the real robot. Note the boxes that represent the simulated contact sensors in the following pictures:

NOTE: the BoxShapes arranged at angles around the robot are used to calculate the physics collisions with other elements of the simulated world. Altering their positions will affect the robot's physical behavior.

Service Download

Installation Instructions

Download the ZIP file and unzip it into your MSRS directory. Note that this is assumed to be:

C:\Microsoft Robotics Studio (1.5)

When you unzip the file, it creates one project in the Apps\UC3M directory under your MSRS installation:


If you want to compile the projects yourself, then open the project and do a rebuild:



The SimulatedPioneerBumper service creates a visual entity that models the front and rear bump panels of a Pioneer robot base. Additionally, the service implements the generic Contact Sensor contract, so it can be used by any code that deals with a contact sensor. It will send notifications to subscribed services every time a bumper hits any surface in the simulated world.

The state of this service maintains a set of ten contact sensors, identified by the names b1, b2, b3, b4, b5, b9, b10, b11, b12, and b13, as depicted below:

The Pioneer 3 DX front and rear bumper rings each provide five points of sensing, with one reading per bump panel, which can be reproduced in the Microsoft Robotics Studio simulator thanks to this service. Each bump panel is 3.97 in long and 1 in wide. The panels are distributed at angles around the robot. The real distribution is -52, -19, 0, 19, and 52 degrees, as shown in the picture below:

However, we have slightly adapted the orientation and size of segments to match the 3D model provided with Robotics Studio.
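As a rough illustration of this geometry, the following Python sketch computes candidate poses for the five front panels from the real angle distribution quoted above. The ring radius is a made-up placeholder, and the actual C# service hand-tunes positions and sizes to the MSRS 3D model, so treat this only as a sketch of the placement math.

```python
# Sketch of front bump panel placement from the stated angle distribution.
# RING_RADIUS is an assumption, not the value used by the real service.
import math

FRONT_ANGLES_DEG = [-52, -19, 0, 19, 52]   # real Pioneer 3 DX distribution
RING_RADIUS = 0.22                          # metres, hypothetical

def panel_pose(angle_deg, radius=RING_RADIUS):
    """Return (x, y, heading_deg) of a panel centre on the front ring.

    x points forward, y to the left; each panel faces outward along the
    ring normal, so its heading equals its angular position.
    """
    a = math.radians(angle_deg)
    return (radius * math.cos(a), radius * math.sin(a), angle_deg)

# Panel names b1..b5 match the front sensors listed in the service state.
front_panels = {f"b{i+1}": panel_pose(a)
                for i, a in enumerate(FRONT_ANGLES_DEG)}
```

The rear ring (panels b9 to b13) would be laid out the same way, mirrored about the robot's lateral axis.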

Additionally, we’ve added a graphical representation of the bumpers' state in the Cranium Dashboard (the following figure depicts the state when bump panels b1 and b2 are pressed):

Disclaimer and License

This program is licensed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License; you can redistribute it and/or modify it. (If you build any application using this software, I’d like to know about it, so please provide feedback.)

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

Creative Commons License 

Who is this website’s robot model?

The robot model that you can see in this website's header and section summaries is an animatronic human sculpture in bronze and stainless steel designed and built by the artist Mark Ho. Its name is ZOHO ARTFORM NO.1.

The copyright of the images of this sculpture also belongs to Mark Ho. If you are interested in this sculpture and how to acquire it, please check the following links:



ExplorerSimSonar Application

ExplorerSimSonar Application is a set of MSRS services used to recreate a virtual maze-like environment and experiment with a simulated Pioneer 3 DX robot equipped with a frontal Sonar array.

The archive available for download includes several Robotics Studio services that are used to simulate a robot sonar array, recreate a maze-like virtual world, and autonomously control a robot within this environment using the sonar readings.

Most of the code is based on Trevor Taylor’s QUT Applications for Robotics Studio.


ARCOS-based robots (like the Pioneer 3 DX) can integrate up to four sonar rings, each with eight transducers. These sensors provide object detection and distance information. The Robotics Studio platform comes with a sample service called Explorer that uses the Laser Range Finder as its sensing device. However, as I don’t have such a device, I wanted to use the P3DX frontal sonar ring instead.

The original ExplorerSim service written by Trevor Taylor builds a map using the laser scans that the explorer is retrieving as the robot wanders around. In this version (ExplorerSimSonar) I’ve added support for a simulated sonar. Therefore, the map is created based on frontal sonar scans.
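The mapping idea can be sketched as follows in a simplified, single-ray Python version. The real services are C# MSRS code and a sonar actually senses a wide cone rather than a line, so the function name, grid representation, and numbers here are illustrative assumptions only: each range reading marks the cells along the beam as free and the cell at the measured distance as occupied.

```python
# Simplified single-ray occupancy update from one sonar range reading.
# Hypothetical sketch, not the actual ExplorerSimSonar implementation.
import math

def update_grid(grid, robot_x, robot_y, beam_angle_rad, range_m,
                cell_size=0.1):
    """Ray-trace one sonar reading into a dict-based occupancy grid.

    Cells are keyed by (column, row); 0 marks free space, 1 an obstacle.
    """
    steps = int(range_m / cell_size)
    for i in range(steps):
        d = i * cell_size
        cx = round((robot_x + d * math.cos(beam_angle_rad)) / cell_size)
        cy = round((robot_y + d * math.sin(beam_angle_rad)) / cell_size)
        grid[(cx, cy)] = 0          # free space along the beam
    # the cell at the measured range is (probably) an obstacle
    ex = round((robot_x + range_m * math.cos(beam_angle_rad)) / cell_size)
    ey = round((robot_y + range_m * math.sin(beam_angle_rad)) / cell_size)
    grid[(ex, ey)] = 1
    return grid
```

Repeating this for every transducer in the frontal ring as the robot wanders is what gradually builds up the map, albeit with much coarser walls than laser scans would produce.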

Services included in this Application:

SimulatedSonar -> Implements a simulated Sonar.
CraniumDashBoard -> Control panel window (formerly known as the Control Panel service).
MazeSimulatorRA -> a version of Maze Simulator.
ExplorerSimSonar -> Autonomous robot control.
DifferentialDriveTT -> Trevor’s DifferentialDrive.

Application Download

Installation Instructions – Usage

Download the ZIP file and unzip it into your MSRS directory. Note that this is assumed to be:  C:\Microsoft Robotics Studio (1.5)

When you unzip the file, it creates four projects in the Apps\UC3M directory under your MSRS installation:

– SimulatedSonar
– CraniumDashboard
– MazeSimulatorRA
– ExplorerSimSonar

And one project under Apps\QUT directory:

– DifferentialDriveTT
(this is an unmodified service from Trevor Taylor).
Use the following scripts provided within this application distribution:

These commands must be run from the MSRS root directory, in an MSRS DOS Command Prompt window.

1. Rebuild ExplorerSimSonar Services and all
dependencies by running


2. Run the ExplorerSimSonar application

– Please refer to the readme.txt files located
under each service directory for more details.

Last Version Information

Update 11. November 9, 2007.

Portions of the code have been rewritten in this update in order to remove all vision services. The idea is to provide a clean environment for testing and experimenting with autonomous robot navigation using sonar sensors.

I hope to release a future set of services including the vision processing stuff.


This program is free software; you can redistribute it and/or modify it. (If you build any application using this software, I’d like to know about it, so please provide feedback.) This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.


Finding papers about consciousness and robotics

The following list is a compilation of bibliographic search engines and scientific paper indexes, where you can look for publications related to your areas of interest.

Note that, given my particular area of interest (Machine Consciousness), I have focused on neuroscience, robotics, and computer science (specifically Artificial Intelligence). However, these resources are useful for any researcher or student looking for scientific publications in other related areas.

You should also note that many of the websites listed below offer subscription services, i.e. you have to be subscribed in order to access the full paper text or other information.

In most of the cases the subscription is checked by looking at your IP address, so make sure you are connected (directly or through VPN) from your institution network. Doing so, you will be entitled to access the content.

Bibliographic Science Search Engines

Google Scholar
Google Books
CiteSeer.IST Scientific Literature Digital Library
DBLP Computer Science Bibliography
CSB – The Collection of Computer Science Bibliographies

Directories of Journals

IEEE Computer Society Publications
ACM Publications
Directory of Computer Science Journals
Index of Information Systems Journals
HighWire Press
Thomson Scientific

Online Access to Scientific Publications

IEEE Xplore
PubMed Central
Wiley InterScience
Blackwell Synergy
Project MUSE

Research Database Access


Open Access

Directory of Open Access Journals
PLoS One
ARXIV e-print archive

Cite Management

Reference Manager

In Spanish (En Español)

Biblioteca UC3M
Portal de acceso a la Web of Knowledge
Ranking de revistas (JCR – Journal Citation Report)
INIST-CNRS