Classical multivariate principal component analysis has been extended to functional data, where it is termed functional principal component analysis (FPCA), but most existing FPCA approaches do not accommodate covariate information. The goal of this thesis is to develop alternative approaches that incorporate covariate information into FPCA, and to develop specific approaches for dynamic positron emission tomography (PET) data. The thesis consists of two projects.

Two approaches are studied in the first project. The first focuses on the conditional distribution of the functional data given the value of a covariate Z, leading to a model in which both the mean and covariance functions depend on the covariate Z as well as on time. The second is motivated by a marginal approach that pools all the centered functional data into a single population, thereby averaging out the influence of the covariate. Both new approaches accommodate additional measurement error, and handle functional data sampled on regular time grids as well as sparse longitudinal data sampled at irregular time points. We develop general asymptotic theory for both approaches and provide numerical support through simulations. The two approaches are also compared numerically, through simulations and through a data set consisting of the egg-laying trajectories of 567 Mexican fruit flies.

The covariate-adjusted FPCA of the first project is adapted in the second project to dynamic PET data, which are collected in four dimensions, three spatial and one temporal; it is the time dimension that is of particular interest here. We take the viewpoint that the observed PET time-course data at each voxel are generated by a smooth random function measured with additional noise on a time grid. By borrowing information across space, and accounting for this pooling through a non-parametric covariate adjustment, it is possible to smooth the PET time-course data and thereby reduce the noise.
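As a rough illustration of the marginal approach described above, the following sketch centers each curve by a covariate-conditional mean, pools the centered curves, and eigendecomposes the pooled covariance. The simulated data, the Nadaraya-Watson smoother, the bandwidth `h`, and all grid sizes are illustrative assumptions, not the estimators developed in the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n curves on a common time grid; the mean depends on a scalar
# covariate z (hypothetical setup, for illustration only).
n, T = 200, 50
t = np.linspace(0.0, 1.0, T)
z = rng.uniform(0.0, 1.0, n)
phi1 = np.sqrt(2.0) * np.cos(2.0 * np.pi * t)              # true first component
scores = rng.normal(0.0, 1.0, n)
X = (1.0 + z[:, None]) * np.sin(2.0 * np.pi * t)[None, :] \
    + scores[:, None] * phi1[None, :] \
    + rng.normal(0.0, 0.2, (n, T))                          # measurement error

# Step 1: estimate the conditional mean mu(t, z) with a Nadaraya-Watson
# smoother over z, applied separately at each grid time (bandwidth assumed).
h = 0.15
W = np.exp(-0.5 * ((z[:, None] - z[None, :]) / h) ** 2)     # n x n kernel weights
mu_hat = (W @ X) / W.sum(axis=1, keepdims=True)

# Step 2: pool the centered curves into one population, averaging out the
# covariate influence (the "marginal" idea), and estimate the covariance.
R = X - mu_hat
G = R.T @ R / n                                             # T x T covariance estimate

# Step 3: eigendecompose to recover functional principal components.
evals, evecs = np.linalg.eigh(G)
evals, evecs = evals[::-1], evecs[:, ::-1]
fpc1 = evecs[:, 0] / np.sqrt(t[1] - t[0])                   # L2-normalized on [0, 1]
align = np.sign(fpc1 @ phi1)                                # eigenvectors fixed up to sign
mse_fpc1 = float(np.mean((align * fpc1 - phi1) ** 2))
print(round(mse_fpc1, 3))
```

With the covariate-dependent mean removed, the pooled covariance is driven by the common component `phi1`, so the leading estimated component closely tracks it.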
We found that a multiplicative nonparametric random-effects model accounts more accurately for the variation in the data. Using this model to smooth the data then allows subsequent analyses, by methods such as Spectral Analysis, to be dramatically improved in terms of their mean squared error.
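A minimal sketch of the multiplicative random-effects idea, under assumed simulated data rather than real PET measurements: each voxel's time-activity curve is a voxel-specific random scaling of a smooth common curve, so pooling across voxels recovers the common shape and denoises each voxel. The curve shape, noise level, and least-squares fit below are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical model: Y_v(t) = a_v * f(t) + noise, with f a smooth common
# time-activity curve and a_v a voxel-specific random multiplier.
V, T = 500, 40
t = np.linspace(0.1, 60.0, T)                               # minutes post-injection
f = t * np.exp(-t / 10.0)                                   # assumed smooth shape
a = rng.lognormal(0.0, 0.3, V)                              # random voxel multipliers
signal = a[:, None] * f[None, :]
Y = signal + rng.normal(0.0, 0.5, (V, T))                   # noisy observed curves

# Borrow strength across space: estimate the common shape by the mean curve,
# then fit each voxel's multiplier by least-squares projection onto it.
f_hat = Y.mean(axis=0)
a_hat = (Y @ f_hat) / (f_hat @ f_hat)                       # per-voxel scale estimates
Y_smooth = a_hat[:, None] * f_hat[None, :]                  # denoised reconstruction

mse_raw = float(np.mean((Y - signal) ** 2))                 # error of raw data
mse_fit = float(np.mean((Y_smooth - signal) ** 2))          # error after smoothing
print(mse_fit < mse_raw)                                    # pooling reduces the MSE
```

Because the reconstruction averages noise over all voxels when estimating the common curve, its mean squared error is far below that of the raw data, which is the mechanism by which downstream analyses improve.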