
Learning by Observation of Agent Software Images

Costa, P.; Botelho, L.M.

Journal of Artificial Intelligence Research, Vol. 47, pp. 313-349, June 2013.

ISSN (print): 1076-9757
ISSN (online):

Journal Impact Factor: 1.056 (in 2012)

Digital Object Identifier: 10.1613/jair.3989

Abstract
Learning by observation can be of key importance whenever agents sharing similar
features want to learn from each other. This paper presents an agent architecture that
enables software agents to learn by directly observing the actions executed by expert
agents while they perform a task. This is possible because the proposed architecture
displays the information that is essential for observation, so that software agents can
observe each other.
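
As an illustration of the kind of observation interface implied above, the following Python sketch shows one way a software agent could expose its executed actions to observers. The class and method names (Action, ObservableAgent, register_observer, execute) are hypothetical and are not taken from the paper; this is a simplified assumption-based sketch, not the authors' implementation.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Action:
    # An observed action: its name plus the arguments it was executed with.
    name: str
    arguments: tuple = ()

@dataclass
class ObservableAgent:
    # Callbacks registered by observing agents.
    observers: List[Callable[[Action], None]] = field(default_factory=list)

    def register_observer(self, callback: Callable[[Action], None]) -> None:
        # Let another agent subscribe to this agent's stream of actions.
        self.observers.append(callback)

    def execute(self, action: Action) -> None:
        # Perform the action (domain-specific code omitted) and display it
        # to every registered observer, making the behaviour observable.
        for notify in self.observers:
            notify(action)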
The agent architecture supports a learning process that covers all aspects of learning
by observation: discovering and observing experts, learning from the observed data,
applying the acquired knowledge, and evaluating the agent's progress. The evaluation
controls the decision either to obtain new knowledge or to apply the acquired knowledge
to new problems.
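
A minimal sketch of this observe/learn/apply/evaluate cycle follows, assuming hypothetical agent methods (discover_expert, observe, learn, solve, evaluate) and an arbitrary success threshold. It only illustrates how evaluation can drive the choice between acquiring and applying knowledge; it does not reproduce the paper's architecture.

def learning_cycle(agent, problems, success_threshold=0.8, max_rounds=10):
    # Alternate between acquiring knowledge and applying it, guided by evaluation.
    for _ in range(max_rounds):
        score = agent.evaluate(problems)          # measure current performance
        if score >= success_threshold:
            # Performance is good enough: apply the acquired knowledge.
            return [agent.solve(p) for p in problems]
        expert = agent.discover_expert()          # find an agent worth observing
        observations = agent.observe(expert)      # record the expert's actions
        agent.learn(observations)                 # update internal knowledge
    # Apply whatever was learned once the learning budget is exhausted.
    return [agent.solve(p) for p in problems]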
We combine two methods for learning from the observed information. The first, the
recall method, uses the sequence in which the actions were observed to solve new problems.
The second, the classification method, categorizes the information in the observed data
and determines to which set of categories the new problems belong.
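
The two methods can be pictured with the simplified Python sketches below. The state/action-pair representation and the distance and categorize helpers are assumptions made for illustration only and do not correspond to the paper's actual algorithms.

from collections import defaultdict

def recall_method(observed_sequence, new_problem_state):
    # Replay the observed action sequence starting from the closest observed state.
    # observed_sequence: list of (state, action) pairs in the order they were seen.
    start = min(range(len(observed_sequence)),
                key=lambda i: distance(observed_sequence[i][0], new_problem_state))
    return [action for _, action in observed_sequence[start:]]

def classification_method(observed_examples, new_problem_state):
    # Group observed states into categories and return the most frequent action
    # of the category the new problem state falls into.
    categories = defaultdict(list)
    for state, action in observed_examples:
        categories[categorize(state)].append(action)
    actions = categories.get(categorize(new_problem_state), [])
    return max(set(actions), key=actions.count) if actions else None

def distance(state_a, state_b):
    # Placeholder similarity measure between two states (assumed numeric vectors).
    return sum((a - b) ** 2 for a, b in zip(state_a, state_b))

def categorize(state):
    # Placeholder category key; a real system would use learned features.
    return tuple(round(x) for x in state)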
Results show that agents are able to learn in conditions where common supervised
learning algorithms fail, such as when agents do not know the results of their actions a
priori or when not all the effects of the actions are visible. The results also show that
our approach outperforms other learning methods because it requires shorter learning
periods.