Personal assistants need to allow the user to interact with the system in a flexible and adaptive way, such as through spoken language dialogue. This research is aimed at achieving robust and effective dialogue management in such applications. We focus on an application, the Smart Personal Assistant (SPA), in which the user can use a variety of devices to interact with a collection of personal assistants, each specializing in a task domain. The current implementation of the SPA contains an e-mail management agent and a calendar agent that the user can interact with through a spoken dialogue and a graphical interface on PDAs. The user-system interaction is handled by a Dialogue Manager agent.

We propose an agent-based approach that makes use of a BDI agent architecture for dialogue modelling and control. The Dialogue Manager agent of the SPA acts as the central point for maintaining coherent user-system interaction and coordinating the activities of the assistants. The dialogue model consists of a set of complex but modular plans for handling communicative goals. The dialogue control flow emerges automatically as the result of the agent's plan selection by the BDI interpreter. In addition, the Dialogue Manager maintains the conversational context, the domain-specific knowledge and the user model in its internal beliefs.

We also consider the problem of dialogue adaptation in such agent-based dialogue systems. We present a novel way of integrating learning into a BDI architecture so that the agent can learn to select the most suitable plan among those applicable in the current context. This enables the Dialogue Manager agent to tailor its responses according to the conversational context and the user's physical context, devices and preferences.

Finally, we report the evaluation results, which indicate the robustness and effectiveness of the dialogue model in handling a range of users.
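The abstract describes plan selection by a BDI interpreter, extended with learning so that the agent picks the most suitable plan among those whose context conditions hold. The following is a minimal illustrative sketch of that idea, not the SPA's actual implementation; all names (`Plan`, `DialogueAgent`, `select_plan`, the goal and plan labels) are hypothetical:

```python
# Sketch of BDI-style plan selection with a learned plan preference.
# Hypothetical names throughout; not the SPA's real API.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Plan:
    name: str                                  # e.g. "summarize/visual"
    context_condition: Callable[[dict], bool]  # is the plan applicable in this belief state?
    body: Callable[[dict], str]                # the plan's response action

@dataclass
class DialogueAgent:
    beliefs: dict = field(default_factory=dict)        # conversational context, user model
    plans: List[Plan] = field(default_factory=list)    # the plan library
    q: Dict[str, float] = field(default_factory=dict)  # learned plan utilities

    def select_plan(self, goal: str) -> Plan:
        # 1. As in a standard BDI interpreter, filter the plan library down to
        #    plans for this goal whose context condition holds in the beliefs.
        applicable = [p for p in self.plans
                      if p.name.startswith(goal) and p.context_condition(self.beliefs)]
        # 2. The learning extension: among applicable plans, prefer the one
        #    with the highest learned utility estimate.
        return max(applicable, key=lambda p: self.q.get(p.name, 0.0))

    def update(self, plan: Plan, reward: float, lr: float = 0.1) -> None:
        # Incremental update of the chosen plan's utility from feedback.
        old = self.q.get(plan.name, 0.0)
        self.q[plan.name] = old + lr * (reward - old)

# Usage: two plans for the same communicative goal; the learned utility
# steers selection toward the response this PDA user prefers.
agent = DialogueAgent(beliefs={"device": "pda"})
agent.plans = [
    Plan("summarize/spoken", lambda b: True, lambda b: "Reading your e-mail aloud."),
    Plan("summarize/visual", lambda b: b.get("device") == "pda", lambda b: "Showing a list."),
]
agent.q["summarize/visual"] = 1.0   # learned: this user prefers visual responses on a PDA
plan = agent.select_plan("summarize")
print(plan.body(agent.beliefs))
```

The point of the sketch is the two-stage choice: the context conditions preserve the modularity of the plan library, while the utility table is the only part that learning touches.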