Motivation: Brain imaging genetics aims to reveal genetic effects on brain phenotypes. Most studies examine phenotypes defined on anatomical or functional regions of interest (ROIs), given their biologically meaningful interpretation and modest dimensionality compared with voxelwise approaches. However, typical ROI-level measures used in these studies are summary statistics of the voxelwise measures within a region, and thus do not make full use of individual voxel signals.
Results: In this article, we propose a flexible and powerful framework for mining regional imaging genetic associations via voxelwise enrichment analysis, which embraces the collective effect of weak voxel-level signals and integrates brain anatomical annotation information. Our proposed method achieves three goals simultaneously: (i) it increases statistical power by substantially reducing the burden of multiple-comparison correction; (ii) it employs brain annotation information to enable biologically meaningful interpretation; and (iii) it makes full use of fine-grained voxelwise signals. We demonstrate our method on an imaging genetic analysis using data from the Alzheimer's Disease Neuroimaging Initiative, where we assess collective regional genetic effects between voxelwise fluorodeoxyglucose positron emission tomography (FDG-PET) measures in 116 ROIs and 565 373 single-nucleotide polymorphisms (SNPs). Compared with traditional ROI-wise and voxelwise approaches, our method identified 2946 novel imaging genetic associations, in addition to 33 that overlapped with the two benchmark methods. In particular, two newly reported variants were further supported by transcriptomic evidence from region-specific expression analysis. This demonstrates the promise of the proposed method as a flexible and powerful framework for exploring imaging genetic effects on the brain.
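To make the enrichment idea concrete, below is a minimal R sketch of one way a voxelwise enrichment test could look for a single SNP-ROI pair, assuming voxelwise association p-values have already been computed. This is an illustration of the general approach, not the RIGEA implementation (see the Availability section for the actual code); the function name enrich_test, its arguments and the alpha threshold are all hypothetical.

# Illustrative sketch only (not the RIGEA implementation): test whether an ROI
# is enriched for voxels showing nominally significant SNP association, using
# a one-sided hypergeometric test. All names and thresholds are hypothetical.
enrich_test <- function(voxel_p, roi_idx, alpha = 0.001) {
  sig <- voxel_p < alpha                # nominally significant voxels brain-wide
  k   <- sum(sig[roi_idx])              # significant voxels inside the ROI
  K   <- sum(sig)                       # significant voxels in total
  n   <- length(roi_idx)                # ROI size in voxels
  N   <- length(voxel_p)                # total number of voxels
  # P(X >= k) under the hypergeometric null of no regional enrichment
  phyper(k - 1, K, N - K, n, lower.tail = FALSE)
}

# Toy usage: inject weak signal into the first 200 voxels and test that region
set.seed(1)
voxel_p <- runif(20000)                 # null p-values for 20000 voxels
voxel_p[1:200] <- voxel_p[1:200] / 50   # many weak, individually modest signals
enrich_test(voxel_p, roi_idx = 1:200)   # small p-value: collective effect found

In this toy example, no single injected voxel would survive brain-wide correction, yet the regional test detects their collective effect; aggregating voxel signals into one test per ROI is what reduces the correction burden from hundreds of thousands of voxel tests to 116 ROI tests per SNP.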
Availability and implementation: The R code and sample data are freely available at https://github.com/lshen/RIGEA.
Supplementary information: Supplementary data are available at Bioinformatics online.