Finding papers about consciousness and robotics

The following list is a compilation of bibliographic search engines and scientific paper indexes, where you can look for publications related to your areas of interest.

Note that, given my particular area of interest (Machine Consciousness), I have focused on neuroscience, robotics, and computer science (specifically Artificial Intelligence). However, these resources are useful for any researcher or student looking for scientific publications in other related areas.

You should also note that many of the websites listed below offer subscription services, i.e. you have to be subscribed in order to access the full paper text or other information.

In most cases the subscription is checked against your IP address, so make sure you are connected (directly or through a VPN) from your institution's network. That way you will be able to access the content.

Bibliographic Science Search Engines

Google Scholar
Google Books
CiteSeer.IST Scientific Literature Digital Library
DBLP Computer Science Bibliography
CSB: The Collection of Computer Science Bibliographies

Directories of Journals

IEEE Computer Society Publications
ACM Publications
Directory of Computer Science Journals
Index of Information Systems Journals
HighWire Press
Thomson Scientific

Online Access to Scientific Publications

IEEE Xplore
PubMed Central
Wiley InterScience
Blackwell Synergy
Project MUSE

Open Access

Directory of Open Access Journals
PLoS One
arXiv e-print archive

Citation Management

Reference Manager

In Spanish (En Español)

Biblioteca UC3M
Web of Knowledge access portal
Journal rankings (JCR – Journal Citation Reports)
INIST-CNRS

3rd Computer Science Symposium at UEM

The UEM Computer Science Symposium is a seven-year-old event held at the Villaviciosa de Odón campus of the European University of Madrid. The aim of this symposium is to bring computer science students and technology companies together.

This symposium is organized by the UEM Free Software and Linux user group (GLUEM) and by the UEM itself. This year the symposium is sponsored by companies such as Fon, Stratebi, Hakin9, and Linux+, with website collaborators like TodoBI. I will be part of the event, offering a presentation about Cognitive Robotics and Machine Consciousness as part of the Symposium.


The Huggable Robot

The Huggable project started in the MIT Media Lab (The Robotic Life Group) in 2005. The Huggable is a new type of robotic companion for healthcare, education, and social communication. It is inspired by traditional companion animal therapy.

The Huggable is equipped with a full-body multi-modal sensory skin (see the video below for details), quiet mechanical servos, inertial sensors, cameras in its eyes, microphones in its ears, and a speaker in its mouth. In addition, it has an embedded PC with WiFi (802.11) communications capability.

The Huggable project has two main components: the Huggable robot itself and a set of Huggable technologies. Additionally, the Huggable robot has two modes of operation. On one hand, it can work as a fully autonomous robot interacting with the patient. On the other hand, it can also work as a semi-autonomous robot avatar with some level of human control via the Internet.

These capabilities make the Huggable robot a really interesting platform for many applications in the fields of healthcare and education.

A human operator (say, a nurse) can collect remote data from the Huggable and the patient, such as a live video feed, live audio feed, and live sensor feed. Moreover, the human operator can send commands to the robot, so the interaction with the patient is done via the robotic avatar. The Huggable uses Microsoft Robotics Studio (MSRS) for its "communication avatar" service.
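The semi-autonomous avatar mode boils down to a simple two-way message exchange: the robot streams sensor reports up to the operator, and the operator sends commands back down. Here is a minimal sketch of that idea in Python, assuming JSON messages; the message types, field names, and functions are hypothetical illustrations, not the actual MSRS service interface.

```python
import json

# Hypothetical message formats for a Huggable-style robot avatar.
# The real system uses Microsoft Robotics Studio services; the names
# and fields below are illustrative assumptions, not the real protocol.

def make_sensor_report(video_frame_id, audio_level_db, touch_readings):
    """Bundle the robot's live feeds into one message for the operator."""
    return json.dumps({
        "type": "sensor_report",
        "video_frame": video_frame_id,
        "audio_level_db": audio_level_db,
        "touch": touch_readings,  # e.g. per-region pressure values
    })

def make_command(gesture, gaze_target=None):
    """Operator-side command: a gesture to play, optionally a gaze target."""
    return json.dumps({"type": "command",
                       "gesture": gesture,
                       "gaze": gaze_target})

def handle_message(raw):
    """Dispatch on message type, as either endpoint of the avatar might."""
    msg = json.loads(raw)
    if msg["type"] == "sensor_report":
        return f"operator sees touch on {len(msg['touch'])} region(s)"
    if msg["type"] == "command":
        return f"robot plays gesture '{msg['gesture']}'"
    return "unknown message"
```

For example, `handle_message(make_command("wave"))` would drive the robot side, while `handle_message(make_sensor_report(1, -20.0, {"paw": 0.4}))` shows what reaches the operator. In a deployed system these messages would travel over the robot's WiFi link rather than being handled locally.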

The following video shows some of the main features of the Huggable.

More information can be found at:

New anesthesia derived from chillis block pain without impairing movement

Scientists at the Massachusetts General Hospital have combined a normally inactive lidocaine derivative with capsaicin, the ‘heat’-generating ingredient in chili peppers, to produce pain-specific local anesthesia. When injected into rats, this combination completely blocked pain without interfering with either motor function or sensitivity to non-painful stimuli.

This technique could revolutionize pain management, as it specifically targets pain-sensing neurons. Current local anesthetics block all neurons, not just pain-sensing ones, and produce dramatic side effects such as temporary paralysis and complete numbness. [1]

This means that, using this drug, you are still aware of touch while remaining unaware of pain. Many new applications could result if the new method is validated in humans (hopefully within 2 or 3 years).

As reported in Nature [2], rats given an injection of the anesthetic were able to tolerate more heat than usual while moving around normally. The researchers then injected the anesthetic near the sciatic nerve of the rats and pricked their paws with nylon probes. The animals seemed to ignore the painful prick, but continued to move normally and responded to other stimuli.

Researcher Professor Clifford Woolf, of Harvard Medical School and Massachusetts General Hospital in Boston in the US, said: ‘We’re optimistic that this method will eventually be applied to humans and change our experience during procedures ranging from knee surgery to tooth extractions.

‘Eventually this method could completely transform surgical and post-surgical analgesia, allowing patients to remain fully alert without experiencing pain or paralysis. In fact, the possibilities seem endless. I could even imagine using this method to treat itch, as itch-sensitive neurons fall into the same group as pain-sensing ones.’

In time, it may be possible to package it in pill form, rather than giving it as an injection. There are, however, several hurdles to be crossed before the technique can be tested on human patients. Scientists will have to find a way of removing the temporary burning sensation associated with the use of capsaicin, as well as prolonging the pain-relieving effect of the drug.

[1] News release, Massachusetts General Hospital and Harvard Medical School.

[2] Alexander M. Binshtok, Bruce P. Bean, and Clifford J. Woolf, “Inhibition of nociceptors by TRPV1-mediated entry of impermeant sodium channel blockers”, Nature, Oct. 4, 2007.

CB2 The baby robot

CB2 (Child-robot with Biomimetic Body) is a young android created by Japanese researchers Minoru Asada and Hiroshi Ishiguro (who famously created an android twin of himself). The following video shows this silicone-skinned baby bot rolling around and trying to kind-of speak:

This 130 cm tall robot weighs 33 kg and is endowed with 197 tactile sensors and 51 compressed-air-powered actuators. The robot is able to develop behavior similar to that of a one- or two-year-old baby: it reacts to touch and turns its gaze towards the person who touched it. The next step for this project from the Science and Technology Agency in Osaka is to develop a new version able to emulate a three-year-old child, able to walk and talk. That will be a real challenge, requiring some degree of consciousness…

Robotic Carp

The robotic carp developed by Ryomei Engineering (a subsidiary of Mitsubishi Heavy Industries) is a curious example of a fish robot. This remote-controlled metal fish resembles a koi carp, and it’s actually a great catch: 80 cm and 12 kg. The following video shows the smooth tail movement.

The robotic koi is able to swim in reverse and rotate in place thanks to its five motors. Additionally, it is equipped with a CCD camera and sensors for analyzing water quality.

New version of Asimo in Barcelona

A new version of the famous Asimo robot was presented last month in Barcelona, Spain. In addition to the features of the previous humanoid model, the new Asimo has a streamlined design, can perform receptionist tasks and carry objects, and offers improved mobility.

Advanced speed and mobility:


Running (6km/h)
Running whilst cornering (5km/h)
Turning on the spot
Slaloming (5km/h)


Advanced functions to operate in a human environment:


Interaction with people by recognising them
Interaction with people by calculating their distance
Greeting passers-by
Walking hand-in-hand and moving in sync
Receiving and delivering a tray
Walking whilst holding a tray
Walking and changing directions whilst pushing a trolley
Operating a trolley in a number of ways


Honda demonstrated the new version of Asimo for the first time in Europe at the Barcelona Biomedical Research Park, in an event organised in partnership with the Barcelona City Council as part of the city’s ‘Year of Science’ activities.
Have a look at the Asimo commercial:
More information at: