Workload balancing in parallel systems is essential to minimize the response time of submitted tasks. Data warehouse systems are highly complex with respect to system structure, data model, and the many mechanisms they employ, all of which strongly influence overall performance. In this paper we present a dynamic workload-balancing algorithm for a spatial telemetric data warehouse. We implement the HCAM data-partitioning scheme, which uses Hilbert curves for space ordering, modified so that the size of the dataset stored in each system node can be set. The presented algorithm iteratively computes the optimal size of the partitions loaded into each node by executing a series of aggregations on a test data set. We investigate situations in which the data both are and are not fragmented, and we test a range of fragment sizes. The performed system tests confirmed that the spatial telemetric data warehouse can be balanced by selecting the dataset size stored in each node. The project was implemented in the Java programming language using a set of available technologies.
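The core of the partitioning scheme described above can be sketched as follows: points are ordered by their Hilbert curve index and the ordered sequence is cut into per-node partitions of configurable sizes (the modification to plain HCAM, which would use equal shares). This is a minimal illustrative sketch, not the paper's actual implementation; the class and method names are assumptions, and the Hilbert index routine is the standard iterative rotate-and-flip conversion for a 2^k x 2^k grid.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class HilbertPartitioner {

    // Map a cell (x, y) of a side x side grid (side = 2^k) to its
    // position along the Hilbert curve. Standard iterative algorithm:
    // at each level, pick the quadrant, then rotate/flip the frame.
    static long hilbertIndex(int side, int x, int y) {
        long d = 0;
        for (int s = side / 2; s > 0; s /= 2) {
            int rx = ((x & s) > 0) ? 1 : 0;
            int ry = ((y & s) > 0) ? 1 : 0;
            d += (long) s * s * ((3 * rx) ^ ry);
            if (ry == 0) {               // rotate/flip the quadrant
                if (rx == 1) {
                    x = side - 1 - x;
                    y = side - 1 - y;
                }
                int t = x; x = y; y = t; // swap x and y
            }
        }
        return d;
    }

    // Order points along the Hilbert curve, then cut the sequence into
    // partitions whose sizes are set per node -- the tunable-size
    // variant of HCAM declustering described in the abstract.
    static List<List<int[]>> partition(List<int[]> points, int side, int[] sizes) {
        points.sort(Comparator.comparingLong(p -> hilbertIndex(side, p[0], p[1])));
        List<List<int[]>> result = new ArrayList<>();
        int idx = 0;
        for (int size : sizes) {
            int from = Math.min(idx, points.size());
            int to = Math.min(idx + size, points.size());
            result.add(new ArrayList<>(points.subList(from, to)));
            idx += size;
        }
        return result;
    }

    public static void main(String[] args) {
        List<int[]> pts = new ArrayList<>();
        for (int x = 0; x < 4; x++)
            for (int y = 0; y < 4; y++)
                pts.add(new int[]{x, y});
        // Three nodes with deliberately unequal shares: 8, 4 and 4 points.
        List<List<int[]>> parts = partition(pts, 4, new int[]{8, 4, 4});
        for (List<int[]> p : parts)
            System.out.println("partition size = " + p.size());
    }
}
```

Because Hilbert ordering keeps spatially close points close on the curve, each contiguous cut of the sorted sequence is a compact region, while the size array lets the balancing algorithm shrink or grow the share assigned to each node between iterations.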