With the advent of ambient intelligence, which introduces ubiquitous mobile systems and services in general and mobile code in particular, network latency becomes a critical factor, especially in wireless, low-bandwidth environments. This paper investigates anticipative mobility, a high-performance computing technique that exploits parallelism between the loading and execution of applications to reduce network latency. The technique sends the mobile application code anticipatively to the remote host, long before the actual migration is requested. At the actual migration, the complete application is no longer transferred; instead, only the delta between the current computational state and the already migrated computational state is sent. Our experiments show that some applications can migrate in 2% of their original migration time. This allows applications to migrate very quickly from host to host without a significant loss of execution time during the migration phase.
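The delta-based migration described above can be sketched as follows. This is a minimal, hypothetical illustration (the function and variable names are ours, not the paper's): a state snapshot is shipped ahead of time during the anticipative phase, and at migration time only the entries that have changed since that snapshot are transferred and merged on the remote host.

```python
def state_delta(shipped, current):
    """Return only the entries of `current` that differ from the
    snapshot already shipped to the remote host."""
    return {k: v for k, v in current.items()
            if k not in shipped or shipped[k] != v}

def apply_delta(shipped, delta):
    """On the remote host, reconstruct the current computational
    state from the pre-shipped snapshot plus the delta."""
    merged = dict(shipped)
    merged.update(delta)
    return merged

# Anticipative phase: snapshot sent long before migration is requested.
shipped = {"pc": 10, "counter": 0, "buffer": "abc"}

# Actual migration: only the changed fields travel over the network.
current = {"pc": 42, "counter": 7, "buffer": "abc"}
delta = state_delta(shipped, current)
# "buffer" is unchanged, so it is not retransmitted.
assert delta == {"pc": 42, "counter": 7}
assert apply_delta(shipped, delta) == current
```

In this toy model the network cost at migration time is proportional to the size of the delta rather than to the full application state, which is the effect the paper's measurements quantify.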