Large N-body particle datasets present a unique challenge for analysis and visualization. As multi-terabyte datasets become increasingly common, performing large-scale analysis and visualization on them grows correspondingly more difficult.
In this talk we describe a new particle indexing scheme we have designed for yt, a Python toolkit for the analysis and visualization of 3D simulation data. By using compressed Morton bitmaps to index the locations of particles, we substantially reduce the overhead of spatial chunking. By rethinking the high-level yt API for N-body data to be more particle-centric, we can scale analysis and visualization to datasets containing very large numbers of particles, while also gaining performance improvements and reduced memory overhead when working with smaller datasets.
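The core idea behind a Morton (Z-order) index is to quantize each particle's 3D position onto an integer grid and interleave the bits of the three coordinates into a single key, so that particles close in space tend to have nearby keys; a bitmap over those keys can then answer "which files/chunks touch this region?" queries cheaply. The sketch below is illustrative only (the function names `morton_key_3d` and `quantize` are our own, not yt's API) and omits the compression step:

```python
import numpy as np

def morton_key_3d(ix, iy, iz, bits=21):
    """Interleave the bits of three integer coordinates into one
    Z-order (Morton) key: key = ... z1 y1 x1 z0 y0 x0."""
    key = 0
    for b in range(bits):
        key |= ((int(ix) >> b) & 1) << (3 * b)
        key |= ((int(iy) >> b) & 1) << (3 * b + 1)
        key |= ((int(iz) >> b) & 1) << (3 * b + 2)
    return key

def quantize(pos, left_edge, right_edge, bits=21):
    """Map a floating-point position in [left_edge, right_edge)
    onto a (2**bits)**3 integer grid."""
    frac = (np.asarray(pos) - left_edge) / (right_edge - left_edge)
    return (frac * (1 << bits)).astype(np.int64)

# Example: particles in a unit box. The first two particles are
# spatially close and receive nearby keys; the third lands in a
# distant octant and receives a very different key.
particles = np.array([[0.1, 0.2, 0.3],
                      [0.1, 0.2, 0.31],
                      [0.9, 0.9, 0.9]])
keys = [morton_key_3d(*quantize(p, 0.0, 1.0)) for p in particles]
```

In an index built this way, each data file records the set of Morton keys its particles occupy; a spatial query then reduces to set operations on (compressed) bitmaps rather than reading particle positions from disk.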