Sony Aibo is back!

I’m so glad Sony finally decided to come back to the AI and robotics frontline. Like many of my colleagues working in robotics, I was deeply disappointed 11 years ago when Sony decided to discontinue Aibo and Qrio… Aibo played a very significant role in research, being a strong performer in challenges such as the RoboCup, and it was a shame to lose all the research momentum that we had acquired over the years with such a nice platform.

Aibo’s resurrection is great news. The new model, the Aibo ERS-1000, closely resembles the last model, the ERS-7. It is now available to pre-order in Japan, going on sale around mid-January 2018. Check the official Aibo website for more details. The rest of the world will have to wait a little longer for the new Sony Aibo.

It remains to be seen whether Sony has been making a real effort during these years to improve the robot, or whether it is just coming back with a redecorated version of the ERS-7 (according to the company, they have been working on AI internally over these years, although no commercial products were available). My humble opinion: you need to be in the market, even when AI is not the trending topic, in order to really advance the field and play a significant role in the development of new AI and robotics products. Over these years, in many research areas and business prototype applications, the Aldebaran Nao more or less took the place left by the discontinuation of Aibo. It will be interesting to see now whether Aibo is able to reclaim its lost position. Again, in my opinion, that depends on how well the company does with its SDK and APIs.

Of course, apart from research and business applications, there is the wide potential market for robotic pets. But again, I think success in this market depends on Sony’s ability to foster a strong research and developer community around Aibo. Keeping the platform technologically closed or fully proprietary will prevent Aibo from reaching its full potential. On the other hand, if Sony decides to go for open community collaboration, my guess is that Aibo will have a successful future.

There is no doubt that the current enthusiasm about AI and Deep Learning is influencing companies worldwide, across all industries. Sony decided to take a step back from this arena more than a decade ago; let’s see now if they can hold a strong position in such a competitive domain these days. One key feature of the new Aibo is clearly artificial vision using deep learning, and here we can make some comparisons with Google TensorFlow. Sony has developed their own neural network libraries, which are written in C++ and designed to be embedded in devices such as Aibo. Apart from that, the whole library scheme seems to be much like TensorFlow. There is a web-based and app-based neural network console you can use to play with Sony’s neural network library. However, it looks like it is only available in Japan for the time being… So, we’ll stay tuned for news about this.

The new Aibo is being evaluated at Psicobōtica Labs as a possible platform for social interaction training.

A better way to test for consciousness?

ConsScale Pyramid

No doubt we need better ways to test for consciousness, as much as we need better definitions. From the point of view of science, consciousness is a controversial and quite elusive phenomenon, and trying to build good tests is actually a crucial part of the quest to understand it. Assessment and definition are always inseparable (otherwise, we wouldn’t know what we are testing).

Effectively testing for consciousness in humans is something we take for granted, especially when we put our trust in anesthesiologists right before going into surgery, or when ER physicians perform a Glasgow coma test to assess the level of consciousness of a patient. Assessing the level of consciousness in human subjects is usually not a problematic or challenging task (except for locked-in syndrome patients and the like). However, when we consider the vast number of other sorts of organisms, such as other animals, plants and machines, things get much more complicated. Scientifically speaking, the problem is essentially the same for all creatures. However, there is a huge difference when we speak about humans: for humans we do assume consciousness as a legitimate feature of the living organism.

So, what do we do if we want to come up with a universal test for consciousness? One that might be applicable to virtually any creature possible, including biological organisms, artificial machines or cyborgs? What features do we need to look at, and what measurements do we need to take? In other words, what is consciousness made of, so that we can measure it?

ConsScale Summary

As neatly described by Musser in his Aeon essay (Consciousness Creep: Our machines could become self-aware without our knowing it), we are making efforts, and hopefully some progress, in building new ways to test for consciousness. ConsScale is an example of this quest both to understand and to measure consciousness.

Musser presents in his essay several of the tests that have been proposed recently, explaining the vision and position of the authors, including my own. It’s interesting to see how different approaches to testing imply different assumptions about what consciousness is. Tononi’s approach is based on Integrated Information Theory, ConsScale is based on cognitive development, Haikonen stresses the importance of inner talk, and Schwitzgebel raises the question of consciousness in groups (super-organisms). Perhaps we need to look for a new approach able to deal with all these aspects within a single framework.

New Generation of Atlas Humanoid Robot

It looks like Boston Dynamics is moving fast from quadrupedal locomotion to the outstanding Atlas biped robot. In Atlas the upper limbs are free and are effectively used to carry and handle loads, and even to open doors! This new Atlas prototype seems to be much more agile and closer to human size. Although the focus for Atlas is on balance and dynamics, I guess the next step is endowing this humanoid robot with proper hands (I don’t really see this Atlas version dealing with a regular door handle. Anyhow, I can imagine how it might manage to open the door anyway…).

I’m always talking about robots and emotions…, but this time I just think it’s a good thing that the Atlas robot didn’t show any anger when dealing with the annoying guy with the hockey stick (luckily for the human).

Most people think robots will become conscious

Can robots become conscious? poll results

Back in 2006 I started an online poll published on the front page of that old authentic machine consciousness site (which I just turned into a modern posh blog in 2016 ;-). Over these years more than 1,100 followers have participated in the poll, and guess what? Around 85% of the participants believe machine consciousness will become a reality. Almost 72% believe robots will become as conscious as humans. Only 4.3% of the participants think machine consciousness is not possible:

Conscious Robots Poll
Conscious Robots Poll (2006-2016)

Ok, ok, the people answering the question are not exactly a simple random sample. Let’s say that followers of the site are expected to have a strong bias towards believing that machine consciousness is doable. Nevertheless, quite interesting results…

Robotica 2012: 12th International Conference on Autonomous Robot Systems and Competitions


April 11, 2012, Guimarães, Portugal

The 12th International Conference on Autonomous Robot Systems and Competitions is an international scientific meeting in the field of autonomous robotics and related areas, which will take place in conjunction with the 12th Portuguese Robotics Open, a RoboCup Local Event.

As in previous years, we expect the conference to receive the technical co-sponsorship of the IEEE Robotics and Automation Society. The conference will be held in the beautiful city of Guimarães, Portugal, on April 11, 2012. Guimarães, the cradle of Portugal, is classified by UNESCO as a World Heritage Site and is the European Capital of Culture 2012.


The main conference topics include:

• Artificial Intelligence
• Architectures for Mobile Robots
• Sensors and Sensor Integration
• Motion and Actuation Systems
• Multi-Robot Systems
• Human-Robot Interaction
• Simulation and Visualization
• Robotic Competitions
• Planning, Reasoning and Modeling
• Cooperative Navigation and Control
• Cooperative Perception
• Computer Vision and Image-Processing
• Navigation and Control of Mobile Robots
• Recognition, Localization, Tracking, SLAM
• Robot Learning
• Applications of Autonomous Intelligent Robots
• Computer and Robotic Entertainments
Important Dates:

All deadlines are 23:59 GMT.

• 23rd January, 2012: Submission of full-length papers
• 20th February, 2012: Notification of acceptance
• 5th March, 2012: Camera-ready papers
• 12th March, 2012: Early registration deadline
• 11th April, 2012: Conference takes place

Urbi goes Open Source

Gostai releases an Open Source Edition of Urbi (Urbi 2.1 – Open Source AGPL v3)

URBI is a software development platform for robotics that supports asynchronous event management and orchestration. URBI also provides software components and interfaces for many robots, such as the Lego NXT and Aldebaran Nao.

Recently, Gostai has decided to go Open Source and release a new version of URBI under the AGPL v3 licence (they also maintain a commercial licence for commercial partners). Source code is available for the URBI Kernel; however, the code of the Gostai Studio graphical interface is not included in the open source initiative.

More details can be found at:

Robotics Developer Studio 2008 R3

Microsoft Robotics has recently announced the release of Robotics Developer Studio 2008 R3, which is now available for download free of charge.

More information and a direct download are available at the RDS official website:

Details about this new 2008 R3 version are:

– Microsoft RDS is now offered free of charge.
– Microsoft RDS is now available as a single edition — containing all of the functionality of the previous Standard Edition at no cost.
– New features in Microsoft RDS 2008 R3 include added support for Visual Studio 2010 and two additional simulation environments (Multi-level House and Factory).
– Additional samples have been made available on CodePlex, including Sumo and Soccer simulations. By making source code available on CodePlex, the community can modify and extend the Microsoft RDS platform.

Other new updates/changes include:

– The CCR & DSS Toolkit has been merged into RDS 2008 R3.
– CCR & DSS will remain a core component of RDS.
– CCR & DSS can be obtained by installing the full RDS package.
– R3 is no longer compatible with Compact Framework (CF) development.
– Samples for languages other than C# have been moved to CodePlex.

Logitech Webcam Pan Tilt Service

Logitech Webcam Pan Tilt Service (PanTiltCam Service)

The CRUBOTS Pan Tilt Service for Logitech webcams (PanTiltCam service) is a component to be used with Microsoft Robotics Developer Studio to control the pan and tilt functions of a Logitech webcam such as the Logitech Quickcam Sphere MP (also known as the Quickcam Orbit MP).

This service is based on the Logitech Orbit/Sphere Mover DLL library and makes it possible to control the pan and tilt motors of the Logitech webcam from MRDS 2008 services using C# or any other .NET programming language.

Service Source Code Download

Pan Tilt Cam service source code is available at: MRDS 2008 R2 Download section.

Service Details and Instructions

To illustrate how the service works I’ve added Pan Tilt control buttons to the CRANIUM Dashboard service included in CRUBOTS. As far as I know there is no simulated webcam that supports pan-tilt functions, therefore the service can only be used with a real Logitech cam.

The CRANIUM Dashboard service is able to subscribe to a generic Pan Tilt Service, and expose the functionality of such a service in the form of Up, Down, Left, Right buttons. Every time you press these buttons the corresponding commands are sent to the Pan Tilt services you connected to.

The distribution ZIP file actually contains two services:

 – GenericPanTilt Service: The Generic Pan Tilt Service is a Robotics Developer Studio generic service for Pan Tilt actuators. In other words, this generic service defines the behavior (operations) that any pan tilt controller will support. This service is the one you should reference from your control code if you want it to be hardware independent.

 – PanTiltCam Service: The Pan Tilt Cam Service is an implementation of the Generic Pan Tilt Service which can control Logitech pan tilt cameras. It uses the Logimove dll library to communicate with the Logitech driver.
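The split between a generic contract and a hardware-specific implementation is the key design choice here: control code written against the generic service keeps working even if the camera hardware changes. As a rough illustration of the same pattern (sketched in Python rather than the actual MRDS C# services; all names below are hypothetical, not part of CRUBOTS):

```python
from abc import ABC, abstractmethod

class GenericPanTilt(ABC):
    """Generic contract: operations any pan/tilt controller must support."""

    @abstractmethod
    def move(self, op: str) -> None:
        """op is one of: MoveUp, MoveDown, MoveLeft, MoveRight, Reset."""

class LogitechPanTilt(GenericPanTilt):
    """Hardware-specific implementation (here it just records the command)."""

    def __init__(self):
        self.log = []

    def move(self, op: str) -> None:
        # A real service would call the vendor driver DLL here.
        self.log.append(op)

def dashboard_button_pressed(device: GenericPanTilt, op: str) -> None:
    # Control code stays hardware independent: it only sees the contract.
    device.move(op)
```

Swapping in a different camera then only requires a new subclass of the generic contract; the dashboard code is untouched.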

Download the ZIP file to your MRDS home directory. When you unzip the file, it creates two project directories in the packages\crubots\Actuators directory under your RDS installation:

The folder GenericPanTilt contains the source code of the generic service, and the folder PanTiltCam contains the Logitech pan tilt service.

If you want to compile the projects yourself, then open each project and do a rebuild (see the note below first!).


Note: the logimove.dll file has to be copied to the application working directory so that the Logitech service finds it at runtime. It can be placed in the MRDS bin directory.

In order to have the project references working for your particular settings, you will need to run DssProjectMigration.exe. For instance (from the MRDS command prompt):

 bin\DssProjectMigration.exe packages\crubots\Actuators

The Pan Tilt service adds just one new operation, called PanTiltOperation, which takes a PanTiltOperationRequest object as a parameter. This request is defined as follows:

    /// <summary>
    /// Pan Tilt Operation Request
    /// </summary>
    public class PanTiltOperationRequest
    {
        public enum OpType
        {
            MoveUp,     // Move the camera up
            MoveDown,   // Move the camera down
            MoveLeft,   // Move the camera left
            MoveRight,  // Move the camera right
            Reset       // Reset
        }
    }
See the Readme.txt file for more details. Use the MRDS forum if you have any questions about this service.


CRUBOTS Utilities for Robot Simulation

(available at:

CRUBOTS is a set of Robotics Developer Studio (MRDS) services developed as part of a research program in Machine Consciousness. Although these services were developed originally to work with the CERA-CRANIUM cognitive architecture, they can be reused in any robotics project.


As we work primarily with a Pioneer 3DX robot, most of the simulation services have been designed as reproductions of the real robotic mobile base.

CRUBOTS is distributed as a ZIP file containing the source code for all MRDS services. Each service’s code is enclosed in its own folder under packages/crubots in the MRDS home directory.

See below for specific instructions and a description of the services included in CRUBOTS.

The following services are included in the present release of CRUBOTS:


Cranium Dashboard Service

The Cranium Dashboard service is based on the Simple Dashboard service included in MRDS and has been extended (based on this older version) to display real-time information from other sensors: robot camera, sonar, bumpers, and GPS. Additionally, the simulation window with the Pursue Camera has been integrated into the same GUI:

Using the Cranium Dashboard you can inspect the readings of all sensors (including the camera image) and see the simulation window, all in the same integrated GUI.

Pioneer Sonar Representation

For instance, sonar readings from the Pioneer robot can be monitored both graphically and quantitatively.

Each Pioneer 3DX sonar ring is composed of eight transducers arranged at angles of -90, -50, -30, -10, 10, 30, 50, and 90 degrees. They are polled sequentially at a configurable rate (usually 25 Hz, i.e. 40 ms per transducer per array).

The graphical representation you can see in the Dashboard depicts each sonar transducer’s reading as a 2D cone: the red line represents the scaled range measured by each transducer, while the blue lines represent the aperture of each transducer, and therefore the area which is free of obstacles according to the sonar readings (note that the blue lines get darker as closer obstacles are detected).

You can also check the actual measured values (in millimeters) obtained by the sonar. They are the S0 to S7 values that appear to the right of the graphical representation.
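The mapping from the S0–S7 ranges to the tips of the 2D cones can be sketched as follows (illustrative Python, not the actual Dashboard C# code; the display scale factor is an assumption):

```python
import math

# Pioneer 3DX front sonar ring: transducer angles in degrees
SONAR_ANGLES = [-90, -50, -30, -10, 10, 30, 50, 90]

def sonar_endpoints(ranges_mm, scale=0.01):
    """Map raw sonar ranges S0..S7 (millimeters) to 2D endpoints.

    Transducer i points along SONAR_ANGLES[i]; the returned (x, y)
    pair is the tip of the red range line in robot-centric
    coordinates (x forward, y to the left), scaled for display.
    """
    points = []
    for angle_deg, r in zip(SONAR_ANGLES, ranges_mm):
        a = math.radians(angle_deg)
        points.append((r * scale * math.cos(a), r * scale * math.sin(a)))
    return points
```

For example, with all eight transducers reporting 1000 mm, the fourth endpoint lies just right of straight ahead, along the -10 degree direction.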

Pioneer Bumpers

A graphical representation of the state of the Pioneer robot’s bumper panels is also available (the figure depicts the state when bump panels b1 and b2 are pressed).

Most of the code in the Cranium Dashboard also works with the real robot, hence it can be used for real robot monitoring and control as well.

Simulated GPS Service

The Simulated Pioneer GPS service provides a simple localization service to be used in the Microsoft Robotics Developer Studio simulator. It consists of a simple box-shaped entity (PioneerGPSEntity) that can be attached to a simulated robot, and a service (SimulatedPioneerGPS) that provides notifications indicating the updated X, Y, Z coordinates of the robot in the simulated world.

More information about Simulated GPS Service.

Simulated Pioneer 3DX Bumpers

The Simulated Pioneer 3DX bumper service provides an accurate simulation of all the independent bump panels in both frontal and rear bumpers.

More information about Simulated Pioneer 3DX Bumpers.

Explorer Sim Sonar Service

This service is basically a modification of the Explorer service that comes with MRDS, adapted to use sonar ranging for mapping.

More information about ExplorerSimSonar service.

Maze Simulator Service

Simulated Maze

The Maze Simulator Service creates a simulated 3D world made of walls from a 2D bitmap (bmp) image. This service also creates (programmatically) a simulated Pioneer 3DX robot equipped with a frontal camera, frontal sonar, frontal and rear bumpers, and a laser range finder (LRF).

The pixels in the 2D color bitmap specify the position, color, and texture of the walls.
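The pixel-to-wall mapping can be sketched like this (illustrative Python using an in-memory grid instead of a real .bmp; the cell size and color encoding are assumptions, not the service’s actual parameters):

```python
CELL = 0.5  # assumed world size of one pixel cell, in meters

def walls_from_bitmap(rows):
    """Return (x, y, color) wall entries for every non-zero pixel.

    rows: 2D grid of ints; 0 = free space, any other value = wall,
    with the value acting as a color/texture index.
    """
    walls = []
    for j, row in enumerate(rows):
        for i, value in enumerate(row):
            if value != 0:
                walls.append((i * CELL, j * CELL, value))
    return walls

# A tiny 4x3 "bitmap": a closed room with two free cells inside
maze = [
    [1, 1, 1, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
]
print(len(walls_from_bitmap(maze)))  # prints 10
```

Each wall entry would then be instantiated as a box entity in the simulated world at the computed (x, y) position.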

More information about the Maze Simulator Service.

Source Code Download

CRUBOTS and related services can be downloaded from CR download pages, category MRDS 2008 R2 Services.

Any questions, feedback or comments can be posted and shared at the MRDS Forum.