Here are my thoughts on the cognitive processes involved in the conversation, along with some thoughts on how we can create a more human-like robot.
Attention
What is it that affects my attention? If I find my friend unstimulating, my thoughts sometimes start to wander and I lose attention. Perhaps we meet in a public place with a lot of background noise and people passing by. I try to filter out the disruptive environment, but it is exhausting and I easily lose focus. Perhaps I can improve my attention if I come well rested to the meeting with my friend? Perhaps I can temporarily increase my concentration by drinking a cup of coffee?
Does a robot need to have different degrees of attention? Probably not, if we assume that future computers will have almost unlimited resources (CPU, memory, etc.). However, if a resource such as processing power is limited, then perhaps the system should include some sort of control mechanism that ensures the resources focus on the part of the system which, at that particular moment, is most critical. And a robot may need to be designed with the ability to filter information in cases where the information flow becomes unmanageable.
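The control mechanism above could be sketched as a simple proportional allocator: each subsystem reports how critical it currently is, and a fixed processing budget is divided accordingly. This is only an illustrative toy, and all the subsystem names and scores are hypothetical assumptions.

```python
def allocate_attention(subsystems, budget):
    """Divide a fixed processing budget among subsystems in proportion
    to how critical each one currently is; the most critical part of
    the system gets the most focus."""
    total = sum(criticality for _, criticality in subsystems)
    if total == 0:
        # Nothing demands attention: spread the budget evenly.
        share = budget / len(subsystems)
        return {name: share for name, _ in subsystems}
    return {name: budget * criticality / total
            for name, criticality in subsystems}

# Example: vision spikes in criticality (someone passes by), so it
# receives most of the limited processing budget.
allocation = allocate_attention(
    [("vision", 0.8), ("hearing", 0.15), ("planning", 0.05)],
    budget=100.0)
```

Filtering out an unmanageable information flow then amounts to simply dropping the input of subsystems whose allocation falls below some threshold.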
Perception
What do I see? Perhaps my senses are distorted and my internal view of the world is less objective if I talk to someone I love or hate. And is the sky behind my friend really blue, or do I interpret it as blue because the brain is programmed with blue as the default colour of the sky? And if my friend also perceives the sky as blue, does that mean we perceive the sky in the same way, or do we have different internal interpretations of the meaning of blue?
Can we build a robot that is able to see colours? How can the robot know that an object exposed to different types of light (or total darkness) has the same colour? It is probably a challenge to build a robot that understands colours, smells and tastes, but I think it is an even greater challenge to build a robot that is able to communicate that information in a human way. Building a robot that can describe in words how food tastes or flowers smell is not easy.
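Recognising the same colour under different lighting is a known problem (colour constancy), and one classic heuristic is the "gray-world" assumption: the average colour of a scene is assumed to be gray, so each channel is rescaled to cancel out the illuminant. This is only one simple approach, a sketch rather than how a robot "should" see colour.

```python
def gray_world_correct(pixels):
    """pixels: list of (r, g, b) tuples.
    Returns pixels rescaled so the scene averages to gray, which
    roughly cancels a uniform colour cast from the light source."""
    n = len(pixels)
    # Average of each channel across the whole scene.
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    # The gray level the scene is assumed to average to.
    mean = sum(avg) / 3
    gains = [mean / a if a else 1.0 for a in avg]
    return [tuple(p[c] * gains[c] for c in range(3)) for p in pixels]
```

A scene lit by reddish light, say every pixel (200, 100, 100), comes out with equal channels after correction: the red cast is removed.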
Memory
There are different types of memory functions: procedural (which, for example, manages the ability to sit on a chair) and declarative (which, for example, manages the ability to remember places). And there are three different phases: encoding, storage and retrieval. If you want to create a human-like robot, it probably makes sense to take advantage of our knowledge of how human memory works. And storage technology probably needs to take a real step forward if we are going to fit the memory inside a human-sized robot.
In our scenario where I talk to my friend, which parts of our conversation will I remember, and for how long? My own experience is that I can more easily recall memories from conversations that are emotionally engaging.
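The three phases, plus the observation that emotionally engaging memories last longer, can be put together in a toy model: encoding gives each item a trace whose strength depends on emotional weight, storage is the dictionary holding those traces, and retrieval only succeeds while a trace remains above threshold. The decay rule is a hypothetical illustration, not a claim about how human memory actually works.

```python
class DeclarativeMemory:
    def __init__(self):
        self._store = {}  # storage phase: item -> remaining trace strength

    def encode(self, item, emotional_weight=1.0):
        # Encoding phase: stronger emotions produce a stronger trace.
        self._store[item] = emotional_weight

    def decay(self, amount=0.5):
        # Time passes: every trace weakens by the same amount.
        for item in self._store:
            self._store[item] -= amount

    def retrieve(self, item):
        # Retrieval phase: only traces above threshold can be recalled.
        return item if self._store.get(item, 0.0) > 0.0 else None

memory = DeclarativeMemory()
memory.encode("small talk about the weather", emotional_weight=0.4)
memory.encode("my friend's engagement news", emotional_weight=3.0)
memory.decay()  # after some time, only the engaging memory survives
```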
Thinking
I look at thinking as the body's processor and switchboard: it is here that our sense impressions are converted into thoughts, and it is here that our will is converted into action in the form of movements. All impressions from the conversation are processed here and result in emotions and actions (such as speech). The cerebellum is the part of the brain responsible for learning and for coordinating movements (including eye movements). The basal ganglia are responsible for movement sequences that have become habits. Over time, repeated movement sequences result in changes in the procedural memory, and from that memory we can automatically (without being aware of it) recall the movement sequence again.
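The idea that repetition turns a movement sequence into a habit can be sketched as a counter that crosses a threshold: once a sequence has been practised enough, recalling it no longer needs deliberate control. The threshold of ten repetitions is an arbitrary assumption made purely for illustration.

```python
AUTOMATICITY_THRESHOLD = 10  # assumed number of repetitions to form a habit

class ProceduralMemory:
    def __init__(self):
        self._repetitions = {}

    def practice(self, sequence):
        # Each repetition strengthens the stored movement sequence.
        self._repetitions[sequence] = self._repetitions.get(sequence, 0) + 1

    def is_automatic(self, sequence):
        # Once practised enough, recall no longer needs conscious attention.
        return self._repetitions.get(sequence, 0) >= AUTOMATICITY_THRESHOLD

pm = ProceduralMemory()
for _ in range(12):
    pm.practice("sit down on a chair")
```

After twelve repetitions, "sit down on a chair" is automatic, while an unpractised sequence still requires deliberate control.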
Ethics and morality are interesting: can a robot be ethical and moral? If we program the robot to act according to Swedish law, does that mean the robot will always behave ethically and morally correctly?
And I don’t think today's processor technology is good enough; I think we must find a completely new technology. Perhaps quantum computing, where we can have more states than just 1 and 0, could be the solution?
Language
Do we adjust our language depending on who we are talking to and in what context? And if so, how are we going to create a robot that adapts its speech or writing to the context? In a conversation about the universe, how should the robot understand that it must express itself differently depending on whether it is talking to an astronomy professor or a child? Getting a robot to understand written text is one thing, but how can a robot be programmed to understand speech? When we talk, we use different intonation depending on what we mean, we use irony, we sometimes read between the lines, and we interpret facial expressions and gestures.
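The professor-versus-child example can be illustrated, at its very simplest, as choosing a register per listener. This is only a minimal sketch assuming the robot has been given explicit per-audience phrasings in advance; real context adaptation is far harder, and all the phrasings here are hypothetical.

```python
# Hypothetical per-audience phrasings of the same underlying fact.
EXPLANATIONS = {
    "astronomy professor": "Stellar nucleosynthesis in the late stages "
                           "of massive stars produces the heavier elements.",
    "child": "Stars are like big ovens that cook the stuff everything "
             "is made of.",
}

def explain_universe(listener):
    # Fall back to a neutral register for unknown listeners.
    return EXPLANATIONS.get(
        listener, "Heavy elements are formed inside stars.")
```

The hard part, of course, is not the lookup but knowing which register the situation calls for, which is exactly the kind of contextual judgment humans make without noticing.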
We will probably get more benefit from the conversation if we both have roughly the same frame of reference, that is to say, if we understand each other without the need to explain everything in detail.
Some believe that the brain has a built-in feature, called the language acquisition device, that all people have from birth and that manages basic linguistic and grammatical understanding. If this function exists, I think there will be plenty to learn from it, and maybe we can use that knowledge to build better talking robots.
Emotions
Scientists believe that emotion is governed by a number of interconnected brain structures called the limbic system. If I am nervous about meeting my friend, perhaps my pulse will rise and I will begin to sweat. This is driven by the autonomic nervous system. The hypothalamus connects the limbic system and the autonomic nervous system.
How do our emotions affect the five factors above? If the conversation is emotional, does our attention increase? I think so. Do strong feelings affect our ability to memorise? I think so. Maybe emotions are connected to creativity, and maybe some of our actions need that creativity. Maybe we need emotions and creativity in combination with logic to evolve as human beings?
.....