Although artificial self-recognition is getting closer to completion, with the detection of self in conversation adding a real feeling of talking to a human in chat, there will always be a major gap in understanding what is inputted, stored, and echoed back.
Although artificially intelligent chat systems are getting better at arranging new data from conversation, sorting it with keyword detection and filtering, artificial intelligence still has a long way to go in understanding the details of the data itself within a single process.
Data from chat inputs is stored and arranged for later use; the stored inputs are relayed back at appropriate points in the conversation, but are not analysed on a deeper level by the artificial intelligence.
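The store-and-replay pattern described above can be sketched in a few lines. This is a hypothetical toy, not any real chatbot's implementation: all class and variable names are invented for illustration. It files each input under detected keywords and later replays a stored line verbatim when a keyword reappears, with no analysis of what the text means.

```python
# Toy sketch of keyword-based storage and recall (hypothetical design).
# The bot files each user input under the keywords it detects, then
# replays a stored input later when the same keyword reappears. No
# understanding of the text is involved; it is matching and retrieval.

KEYWORDS = {"weather", "food", "music"}

class KeywordChatbot:
    def __init__(self):
        self.memory = {}  # keyword -> list of previously seen inputs

    def _detect_keywords(self, text):
        words = {w.strip(".,!?").lower() for w in text.split()}
        return words & KEYWORDS

    def store(self, text):
        for kw in self._detect_keywords(text):
            self.memory.setdefault(kw, []).append(text)

    def respond(self, text):
        for kw in self._detect_keywords(text):
            if self.memory.get(kw):
                return self.memory[kw][-1]  # replay a stored line verbatim
        self.store(text)
        return "Tell me more."

bot = KeywordChatbot()
bot.store("I love rainy weather.")
print(bot.respond("What do you think about the weather?"))
```

The bot appears responsive, yet its reply is only a previously stored sentence retrieved by a matching keyword.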
An artificial intelligence chat system does not truly know or understand what it is saying; it only repeats what was stored from previous conversations.
Training a chatbot solo can have an echo chamber effect: it stores how you speak and uses it back on you. How you speak to the chatbot trains how it speaks in return. If you are aggressive, you are training it to be aggressive back; if you are polite, the artificial intelligence will be trained to be polite back.
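A minimal sketch of this echo chamber effect, under the assumption of a crude word-list tone detector (all names and word lists here are hypothetical): the bot tallies the tone of everything its lone trainer says and mirrors whichever tone dominates.

```python
# Hypothetical sketch of the solo-training echo chamber: the bot counts
# the tone of each training line and mirrors the dominant tone back.

AGGRESSIVE = {"stupid", "shut", "hate", "idiot"}
POLITE = {"please", "thanks", "thank", "kindly"}

class EchoChamberBot:
    def __init__(self):
        self.tone_counts = {"aggressive": 0, "polite": 0}

    def train(self, text):
        words = set(text.lower().split())
        if words & AGGRESSIVE:
            self.tone_counts["aggressive"] += 1
        if words & POLITE:
            self.tone_counts["polite"] += 1

    def respond(self):
        # Mirror whichever tone the lone trainer has used most often.
        if self.tone_counts["aggressive"] > self.tone_counts["polite"]:
            return "What do you want now?"
        return "How may I help you?"

bot = EchoChamberBot()
for line in ["please help me", "thank you kindly"]:
    bot.train(line)
print(bot.respond())
```

Feed the same bot mostly aggressive lines instead, and the mirrored reply flips to an aggressive one; the bot has no attitude of its own, only a tally of yours.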
The words and letters themselves have no meaning to the artificial intelligence, even as it detects and arranges data for storage; to the machine, it is just zeros and ones. To the system, it is simply matching characters and arranging data to be recalled later, once certain conditions are met.
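The "zeros and ones" point can be shown directly. Encoding a word reveals the numeric values the machine actually compares when it "matches" characters; nothing about those numbers carries meaning.

```python
# Illustration: to the machine, text is only numbers. Encoding a word
# shows the byte values the system compares when "matching" characters;
# no meaning is attached to any of them.

word = "hello"
byte_values = list(word.encode("ascii"))
print(byte_values)   # the numbers the machine sees

bits = [format(b, "08b") for b in byte_values]
print(bits)          # the same word as zeros and ones

# Character "matching" is just numeric equality on these values:
print(word[0] == "h", ord("h") == 104)
```

The character comparison at the end succeeds purely because 104 equals 104, which is all "recognition" amounts to at this level.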
Even if emotions run high, and even if the artificial intelligence seems emotional, it is not. It is merely repeating what has been said to it in previous conversations, with no emotion behind it whatsoever.