Both the measurement of crystal sizes and locations in three dimensions and the simulation of crystallization produce data sets that require characterization. Reduce3D is a new computer program for measuring spatial statistics of interest to kinetic studies of crystallization. The principal statistical measures of such interest are the L′ function, the pair correlation function, and the mark correlation function. Reduce3D can create a large number of pseudorandom crystal arrays designed to mimic the rock being studied, but with interface-controlled nucleation and growth, forming a null-hypothesis envelope. Values that exceed this envelope in the direction of ordering of crystal centers and/or growth suppression of nearby crystals may be interpreted as significantly ordered, which for metamorphic crystallization may imply diffusional control of nucleation and growth. Reduce3D, together with the utility programs GraphCFs for visualizing the relevant statistics and Render3D for visualizing the data set, forms a suite that makes such interpretation rapid and straightforward.


Porphyroblast or phenocryst crystallization results from the interplay of several atomic-scale processes involved in nucleation and growth. Because these cannot be directly observed, we must infer their relative importance and rates (i.e., kinetics) by reference to observable features in natural samples. One such avenue to understanding the kinetics of crystallization is through quantitative analysis of crystal textures (e.g., Kretz, 1969; Carlson, 1991). Quantitative textural analysis of rocks has grown in recent decades to become an important tool for understanding crystallization kinetics in both igneous and metamorphic petrology. Igneous petrology has focused primarily on the use of crystal size distributions (e.g., Marsh, 1988; Gualda, 2006) with other methods of textural analysis used as well, such as nearest-neighbor measures (e.g., Jerram et al., 1996) and grain-shape analysis (e.g., Mock and Jerram, 2005). Metamorphic petrology has used crystal size distributions (e.g., Cashman and Ferry, 1988; Carlson, 1989; Hirsch, 2007), radius-rate relations (e.g., Kretz, 1974; Carlson, 1989), as well as a variety of single-value (e.g., Kretz, 1969; Carlson, 1989) and scale-dependent (e.g., Daniel and Spear, 1999; Hirsch et al., 2000) measures of ordering and clustering to infer kinetic mechanisms. Both disciplines have also brought modeling techniques to bear on the problem (e.g., Carlson, 1991; Hersum and Marsh, 2006).

The data required for such analyses are by necessity three-dimensional. Such data may be calculated from easily collected two-dimensional data sets via stereological methods (Marsh, 1988; Higgins, 2000), but stereology entails additional uncertainty in the three-dimensional data set. Other methods that collect three-dimensional data directly are X-ray (or synchrotron) computed tomography (e.g., Denison et al., 1997; Gualda and Rivers, 2006), serial grinding (e.g., Spear and Daniel, 1998; Mock and Jerram, 2005), or dissection (e.g., Kretz, 1993). Computed tomographic techniques offer the fastest data collection and are nondestructive, but require a sufficient contrast in beam absorption among the phases of interest to discriminate them in the results.

Spatial correlation functions taken from the statistics literature and in use in the biological sciences have been proposed as promising tools for metamorphic textural analysis (Raeburn, 1996; Daniel and Spear, 1999). These scale-dependent measures were examined in Hirsch et al. (2000), which focused on their calculation, utility, and reliability, and proposed methods for their use. This study presents a computational implementation of these statistics and methods, Reduce3D (see footnote 1), together with a program, GraphCFs, to graphically display the statistical results, and Render3D, to visualize the data set.


Although the factors controlling nucleation and growth of porphyroblasts have been described elsewhere (e.g., Carlson, 1989, 2010; Hirsch et al., 2000), the rationale for the spatial statistics described in the following is closely linked to these factors, and I present a brief primer. Much of this description may apply equally well to crystallization in some igneous or diagenetic environments, but the suppression of nucleation and growth of nearby crystals by existing ones must be a possibility, and inhomogeneous movement of existing crystal centers relative to one another in a rock must not be possible (so this rules out many igneous crystallization regimes).

During crystallization of a porphyroblast (and presuming some degree of nucleation), any of the following factors can conceivably be rate limiting: transport of heat to or from the rock, dissolution of reactant phases, transport of chemical species to or from the growing porphyroblast, and attachment of the chemical species to the porphyroblast crystal lattice. The latter two are believed by most workers (e.g., Spear and Daniel, 1998) to be the most likely candidates for the rate-limiting factor and are here denoted diffusion control (in which the slow transport step would be diffusion rather than advection) and interface control (IC, because precipitation occurs at the porphyroblast-matrix interface). Which of these is the controlling factor can be inferred from the relative sizes and spatial distribution of porphyroblasts.

Diffusional control leads to ordering of crystal centers and size reductions in closely spaced crystals relative to either a random case or to interfacial control. As a crystal grows in a regime in which diffusion is slow relative to other processes, a halo depleted in chemical components required for crystal growth will develop around each porphyroblast. Within this halo, nucleation will be depressed or prohibited due to decreased reaction affinity (Fig. 1A). In addition, as the depletion haloes of neighboring crystals impinge, each crystal's growth will slow due to the reduced volume from which they derive chemical components (Fig. 1B). In contrast, under interfacial control, there is no depletion halo, and so new nucleation events may occur with equal probability in any portion of the rock not yet occupied by the porphyroblast phase (Fig. 1C). In addition, there should be no reduction in crystal growth rate for even very closely spaced crystals (Fig. 1D). Porphyroblasts that impinge upon each other will undergo a reduction in growth rate due to portions of their surface area being unavailable for continued interfacial growth processes.

These kinetic effects will in turn affect the sizes and locations of porphyroblasts in a rock. In diffusion-controlled crystallization, the suppression of nucleation will produce ordering in crystal centers, because the crystals will be more widely separated on average than in either a random point array or an interface-controlled regime. The slowing of growth as diffusional haloes impinge will lead to a reduction in size for closely spaced crystals, relative to a random crystal array or to the interface-controlled regime. Interfacial control will show ordering relative to a random point array, but only at the very smallest scales, reflecting the inability to nucleate a new crystal within an existing one. In addition, interfacial control will show size reduction relative to a random crystal array, but only for crystals that have impinged upon another.


The statistics used here have been described in detail in the literature (Carlson, 1989; Carlson and Denison, 1992; Hirsch et al., 2000), but a brief discussion is necessary to understand the operation of the software. There are two types of statistics calculated by the software: single-value statistics, and scale-dependent statistics.

Single-value statistics take into account all the porphyroblasts in the data volume. Some of these are trivial (e.g., mean crystal radius) and do not shed much light on crystallization kinetics, but others, such as the ordering index, the clustering index (also called the random point index), and the impingement index, have been used to discriminate among crystallization mechanisms. The ordering index compares the mean center-to-center distance for nearest-neighbor crystal pairs with that for a random disposition of the same set of crystals (i.e., a set of crystal centers with the same number density and size distribution as that of the data set); higher values suggest ordering and thus diffusion control. The clustering index examines a set of randomly placed points and compares the mean distance from each point to the nearest crystal center with that for a random disposition of the same set of crystals; lower values suggest ordering. The clustering index is essentially identical to the statistic termed “R” by Clark and Evans (1954). The impingement index (also called the Avrami ratio) compares the degree of crystal interpenetration with that for a random disposition of the same set of crystals. Note that the random arrays mentioned here are substantially different from the interface-controlled envelope arrays described in the following. Other single-value statistics are calculated and reported by the software, but are not displayed by GraphCFs and are not treated further here.
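To make the single-value indices concrete, the following is a minimal sketch (in C++, the language of Reduce3D's calculation code, though not the program's actual implementation) of an ordering index of this kind. For brevity it compares against freshly drawn uniform random points in a cubic volume, rather than the random disposition of the same set of crystals that the software uses; all names are hypothetical.

```cpp
// Hypothetical sketch of an ordering index: the observed mean
// nearest-neighbor distance divided by that of a random point array with the
// same number density. Values above 1 suggest ordering of crystal centers.
#include <algorithm>
#include <cmath>
#include <limits>
#include <random>
#include <vector>

struct Pt { double x, y, z; };

double meanNearestNeighbor(const std::vector<Pt>& c) {
    double sum = 0.0;
    for (size_t i = 0; i < c.size(); ++i) {
        double best = std::numeric_limits<double>::max();
        for (size_t j = 0; j < c.size(); ++j) {
            if (i == j) continue;
            double dx = c[i].x - c[j].x, dy = c[i].y - c[j].y,
                   dz = c[i].z - c[j].z;
            best = std::min(best, std::sqrt(dx*dx + dy*dy + dz*dz));
        }
        sum += best;
    }
    return sum / c.size();
}

double orderingIndex(const std::vector<Pt>& data, double boxSide) {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> u(0.0, boxSide);
    std::vector<Pt> randomPts(data.size());
    for (Pt& p : randomPts) p = {u(rng), u(rng), u(rng)};
    return meanNearestNeighbor(data) / meanNearestNeighbor(randomPts);
}
```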

Scale-dependent statistics are calculated for a specific length scale, which is varied over a range of interest. This allows these statistics to extract information about processes that might operate at a particular length scale, such as diffusional suppression of nucleation or growth, which should operate at length scales up to about the mean nearest-neighbor distance. Three statistics have been used to detect diffusional effects in rocks: the L′ function, the pair correlation function (PCF), and the mark correlation function (MCF). The L′ function compares the number of crystal centers separated by a distance up to the length scale of interest with the number expected for a random disposition of the same set of crystals (Fig. 2A). The PCF compares the number of crystal centers separated by a distance approximately equal to the length scale of interest with the number expected for a random disposition of the same set of crystals (Fig. 2B). The MCF compares the mean size of crystals separated by a distance approximately equal to the length scale of interest with that expected for a random disposition of the same set of crystals. The degree of approximation included in the PCF and MCF is governed by the Epanechnikov kernel bandwidth; larger values correspond to smoother PCF and MCF curves. For all of these functions, lower values imply ordering (L′, PCF) or reduced size (MCF) at the length scale of interest (and thus diffusional control), and higher values imply clustering (L′, PCF) or increased size (MCF) at that length scale.
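The role of the kernel bandwidth can be illustrated with a minimal sketch of a PCF estimate. This is not Reduce3D's implementation: it omits the edge corrections discussed below, and the function and type names are invented.

```cpp
// Naive PCF estimate at length scale r with an Epanechnikov kernel of
// bandwidth h; no edge correction. A value near 1 matches a random array,
// below 1 suggests ordering, above 1 suggests clustering at scale r.
#include <cmath>
#include <vector>

struct Pt { double x, y, z; };

double epanechnikov(double t, double h) {
    double u = t / h;
    return (std::fabs(u) <= 1.0) ? 0.75 * (1.0 - u * u) / h : 0.0;
}

double pcf(const std::vector<Pt>& c, double r, double h, double volume) {
    const double kPi = 3.14159265358979323846;
    double lambda = c.size() / volume;  // number density of crystal centers
    double sum = 0.0;
    for (size_t i = 0; i < c.size(); ++i)
        for (size_t j = 0; j < c.size(); ++j) {
            if (i == j) continue;
            double dx = c[i].x - c[j].x, dy = c[i].y - c[j].y,
                   dz = c[i].z - c[j].z;
            // Kernel-weighted count of pairs separated by roughly r.
            sum += epanechnikov(r - std::sqrt(dx*dx + dy*dy + dz*dz), h);
        }
    // Normalize by the expected pair density for a random array at scale r.
    return sum / (4.0 * kPi * r * r * lambda * lambda * volume);
}
```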

The implementation of the scale-dependent statistics in Reduce3D is that described in Hirsch et al. (2000), which includes a number of improvements relative to earlier implementations (Raeburn, 1996; Daniel and Spear, 1999). These include: (1) a correction to the PCF to prevent underestimation of the statistic at small length scales; (2) an improvement to the MCF to make it unbiased (denoted MCF′); and (3) a new implementation of interface-controlled null-hypothesis arrays to allow statistically rigorous conclusions.


Because the statistics described above compare values for the data set to those for a random array, they do not by themselves allow rigorous conclusions as to the controlling crystallization mechanism. Both diffusional and interfacial controls predict nucleation and growth suppression near existing crystals relative to a random array, although the scales involved differ substantially. Diffusional control suppresses nucleation and growth over length scales on the order of the nearest-neighbor distance, while interfacial control suppresses nucleation and growth over scales near the mean crystal radius. We therefore expect both interfacial and diffusional control to show the same type of signal (suppression of nucleation and growth) for these statistics. In order to infer a kinetic mechanism from these statistics, the results of each statistic measured on the data set must be compared to the expected value for an interface-controlled disposition of the same set of crystals.

To implement this, the software can create a set of semirandomized interface-controlled crystal arrays and measure the value of each statistic on each member of the set. The 95% (or 99%) confidence bounds of a set of interface-controlled crystal arrays are denoted the interface-control envelope for the data set. If the results for the data set differ sufficiently toward ordering or growth suppression such that they exceed the IC envelope, then we are justified in rejecting the null hypothesis of interface control with the specified confidence. Details of the envelope creation process are described in the following.


The software package Reduce3D implements the statistical measures described above; its operation comprises a number of parts: (1) load the data set representing the crystal array; (2) fit a bounding primitive (and in some cases a convex hull) around the data; (3) measure statistics on the data set; (4) create a number of interface-controlled null-hypothesis simulations and measure statistics on each; (5) export results to a text file for viewing directly or via GraphCFs. Each of these will be described in the following.

Input Data Format

Reduce3D can accommodate data files supplied as ASCII text, which must be in the following format; a hypothetical example file appears after the list. Note that this format implicitly treats all crystals as spheres, and thus would be unsuitable for modeling or measuring crystallization of staurolite, for example.

  1. A line giving the version number of the file and/or the program that produced it.

  2. A comment line that can contain anything, but if it contains any of the terms “Qd,” “diffusion,” “interface,” or “heat flow,” it is treated as a crystallization simulation produced by the Crystallize software (Carlson et al., 1995; Ketcham and Carlson, 2004).

  3. An optional secondary comment line that is appended to the first. The presence of such a line is indicated by the absence of the term “Number” at the beginning of the line.

  4. A line giving the number of crystals in the data set in the format: “Number of crystals:[tab] [number].”

  5. An optional line giving the bounds of the data volume as a rectangular prism or a cylinder in one of these formats: “Bounds:[tab] [lower bound in x] [lower bound in y] [lower bound in z] [upper bound in x] [upper bound in y] [upper bound in z]” for a rectangular prism or “Ctr/R/H:[tab] [center in x] [center in y] [center in z] [radius] [height]”. Note that a cylindrical data volume must be oriented such that the cylinder axis is parallel to the z axis.

  6. A set of lines, one per crystal, in the format: “[ordinal] [tab] [center in x] [tab] [center in y] [tab] [center in z] [tab] [radius] [tab] [numerical code] [tab] [numerical code].” The numerical codes are exported by some other programs in the data flow pathway such as Blob3D (Ketcham, 2005) and are not used by Reduce3D.
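For concreteness, a hypothetical input file in this format might look like the following (values invented; the whitespace between fields stands in for the tab characters the format requires):

```
Reduce3D file v1.0
Natural sample, garnet crystals, specimen 711
Number of crystals:	3
Ctr/R/H:	0.0	0.0	0.0	10.0	20.0
1	1.2	-3.4	5.6	0.31	0	0
2	-0.8	2.1	-4.4	0.27	0	0
3	4.5	0.3	1.9	0.22	0	0
```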

Bounding Primitive and Convex Hull

In order to perform many of the statistical tests, the boundaries of the data volume must be known or deduced. If the input data contain such information, it is simply adopted and no further calculation is required. However, if no such information is provided, then the user has two options: choose a primitive to fit to the data set, or wrap the data set with a convex hull.

If the general shape is known to be a rectangular prism or cylinder, the user may specify which, and the program will (at the user's discretion) either inscribe the largest primitive of that type within a convex hull wrapped around the data set (see the following for the convex hull description), or exscribe the smallest primitive of that type around the data set. Note that the inscribed primitive option is based on the convex hull, whereas the exscribed primitive option is based on the data set itself. The inscribe option will discard all crystals whose centers fall outside the primitive, while the exscribe option will preserve all crystals. However, the exscribe option can introduce error if the data set is not well modeled by the primitive. It also carries a bias: because the edges of the data volume must shrink to touch the outermost crystal centers, the number density is artificially increased beyond that likely found in the data source, which in turn affects the values of the correlation functions. To ameliorate this bias at the cost of reduced crystal numbers, the user may opt to shrink the exscribed primitive by a given distance and discard any crystals that fall outside the newly reduced volume.

If the data set is known to be different from the two primitives (e.g., spherical or irregular), then the convex hull may be used directly. Using a convex hull rather than a primitive to delineate the data boundary typically results in an increase in overall processing time of one to two orders of magnitude. The algorithm that wraps the convex hull around the data set is as follows (after Clarkson and Shor, 1988):

  1. Choose four random crystal centers. Make a tetrahedron out of these, and discard any points inside this tetrahedron (for the convex hull operation only). This is the initial candidate convex hull.

  2. Choose a new point that is not already a vertex of the candidate hull.

  3. Determine what facets of the current candidate hull can be “seen” by the point: those facets that can be connected to the point by a straight line that does not intersect any other facets.

  4. Discard the “seen” facets and make new facets by connecting their vertices to the point. This is the new candidate hull. Discard any points inside the hull.

  5. Repeat steps 2–4 until no more points remain to be chosen.

Because operations on a convex hull (e.g., finding a random point inside the hull) are so slow, a number of additional optimizations are performed. First, an exscribed cube (the smallest cube that will fit around the hull) is created. Second, an inscribed primitive is created: either a cube or, if that cube would be very small, an irregular octahedron whose six vertices lie just inside the convex hull along the orthogonal axes centered within the hull (Fig. 3). Finally, each facet of the hull is given a vector that points toward the center of the hull. Many of the hull-related calculations ask, in some way, whether a point is inside the hull. These optimizations speed up this type of query: if a point is inside the inscribed primitive (a rapid calculation), it is inside the hull; if it is outside the exscribed primitive (also rapid), it is outside the hull; and if it lies on the outer side of any facet (a calculation that is slow in total but rapid per facet, based on the dot product of two vectors), it is outside the hull.
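A sketch of this layered point-in-hull query is shown below; the types and names are hypothetical, not Reduce3D's own, and the bounding primitives are simplified to axis-aligned boxes.

```cpp
// Layered point-in-hull test: fast accept via the inscribed primitive, fast
// reject via the exscribed primitive, then the per-facet dot-product test.
#include <vector>

struct Vec { double x, y, z; };
struct Box { Vec lo, hi; };                  // axis-aligned bounding primitive
struct Facet { Vec pointOnPlane, inwardNormal; };

static double dot(const Vec& a, const Vec& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}
static bool inBox(const Box& b, const Vec& p) {
    return p.x >= b.lo.x && p.x <= b.hi.x && p.y >= b.lo.y &&
           p.y <= b.hi.y && p.z >= b.lo.z && p.z <= b.hi.z;
}

bool insideHull(const Vec& p, const Box& inscribed, const Box& exscribed,
                const std::vector<Facet>& facets) {
    if (inBox(inscribed, p)) return true;    // fast accept
    if (!inBox(exscribed, p)) return false;  // fast reject
    for (const Facet& f : facets) {          // slow path: test every facet
        Vec d{p.x - f.pointOnPlane.x, p.y - f.pointOnPlane.y,
              p.z - f.pointOnPlane.z};
        if (dot(d, f.inwardNormal) < 0.0) return false;  // outer side of plane
    }
    return true;
}
```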

Dealing with Volume Boundaries

When calculating the number of other crystals surrounding a given crystal, if the sphere defined by the given crystal's center and the length scale of interest intersects the edge of the data volume, then the calculation will be incorrect, due to uncounted crystals outside the data volume. This can be addressed in either of two ways: create a guard region (minus sampling), or adjust the calculation to take into account the fraction of the sphere outside the data volume (translation).

Minus-sampling method.

Minus sampling prohibits calculation for any crystal for which the examination region would intersect the sample boundary (Stoyan and Stoyan, 1994, p. 280). This method can be conceptualized as installing a guard region inside the sample boundary whose thickness is that of the current test distance (Fig. 4A). Crystals within this region will not have the examination region centered on them, but may still be counted if they fall within the examination region centered on another crystal not within the guard region. This method has the advantage of introducing the fewest artifacts into the calculation, but requires a large number of crystals, because many are excluded from the calculation, even at small test distances. In addition, there is a maximum test distance allowable with this method, although in most cases of interest, this maximum is greater than the typical test distances of interest for the statistic.
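For a rectangular data volume, the guard-region test reduces to a distance check against each face, as in this hypothetical sketch (not Reduce3D's own types or code):

```cpp
// Minus sampling: only crystals at least the test distance h from every
// boundary may serve as centers of the examination region.
#include <vector>

struct Crystal { double x, y, z, r; };
struct Box { double lox, loy, loz, hix, hiy, hiz; };

std::vector<Crystal> interiorCrystals(const std::vector<Crystal>& all,
                                      const Box& b, double h) {
    std::vector<Crystal> keep;
    for (const Crystal& c : all)
        if (c.x - b.lox >= h && b.hix - c.x >= h &&
            c.y - b.loy >= h && b.hiy - c.y >= h &&
            c.z - b.loz >= h && b.hiz - c.z >= h)
            keep.push_back(c);   // far enough from every face
    return keep;
}
```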

Translation method.

The translation method attempts to correct each summand for those crystals missed outside the sample boundary. It divides the summand for each pair of crystals by a factor that is the volume of intersection of the data volume with a copy offset by the vector from one crystal to the other (Fig. 4B). Dividing each summand in this way causes the summand to increase for crystals that are more widely separated, which tends to compensate for the fraction of the examination region that is outside the sample boundary. However, because the method does not take into account the degree to which the examination region actually is outside the sample boundary for any particular pair, it is possible in concept for the method to introduce artifacts into the statistics. Nevertheless, the translation method of edge correction produces values very close to those from the minus-sampling method (Hirsch et al., 2000), with the advantage of retaining larger numbers of crystals in the analysis. For this reason, the translation method is generally preferred.
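For a rectangular prism, the volume of intersection of the data volume with its offset copy has a simple closed form, sketched below (hypothetical names; cylinders and convex hulls, which Reduce3D must also handle, are more involved):

```cpp
// Translation weight: the volume of intersection of the box with a copy of
// itself offset by the vector t between the two crystal centers. Each pair's
// summand is divided by this value.
#include <algorithm>
#include <cmath>

struct Vec { double x, y, z; };

double translationWeight(const Vec& boxSize, const Vec& t) {
    double ox = std::max(0.0, boxSize.x - std::fabs(t.x));
    double oy = std::max(0.0, boxSize.y - std::fabs(t.y));
    double oz = std::max(0.0, boxSize.z - std::fabs(t.z));
    return ox * oy * oz;  // shrinks as separation grows, boosting the summand
}
```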

Monte Carlo Comparison to Null-Hypothesis Interface-Control Case

In order to produce a correct crystal array to create the interface-control envelope, the crystal array must have as much in common with the data set as possible, while reflecting the constraints of interfacial control. It is as if the same set of crystals had been formed under an IC crystallization regime: in particular, the number density of crystals, the size and shape of the sample volume, and the crystal size distribution should be identical to those of the data set. In addition, the IC crystal array must reflect any limitations on crystal observability that are present in the data set (Hirsch et al., 2000; Ketcham et al., 2005). The observability criterion must be tuned for the data-collection measures used to obtain the spatial data from the rock sample.

To actually implement this, the set of radii from the data set is placed randomly, subject to the IC growth criterion. Each radius, from largest to smallest, is placed at random locations until an allowable location is found. Locations that are not allowed are those for which the crystal would have nucleated inside an existing (already placed) crystal. However, the criterion does not use the full size of the existing crystal or that of the to-be-placed crystal. Because IC growth should proceed at a radially constant rate, each existing crystal can be shrunk by the radius of the to-be-placed crystal, which approximates its size at the time the new crystal would have “nucleated”; the candidate location is rejected if its center falls within any existing crystal at this reduced radius. In this way, all the crystals of the data set are placed randomly, subject to interface-controlled nucleation and growth criteria.
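The placement criterion can be sketched as follows (hypothetical names; the caller retries a candidate center at new random locations until this test passes):

```cpp
// IC placement criterion: radii are placed largest first; a candidate center
// is rejected if it falls inside any already-placed crystal after that
// crystal's radius is reduced by the candidate's radius -- its size at the
// moment the candidate would have nucleated under radially constant growth.
#include <vector>

struct Crystal { double x, y, z, r; };

bool allowedPlacement(const std::vector<Crystal>& placed,
                      double cx, double cy, double cz, double newR) {
    for (const Crystal& c : placed) {
        double reduced = c.r - newR;          // existing crystal's radius at
        if (reduced <= 0.0) continue;         // "nucleation" of the candidate
        double dx = cx - c.x, dy = cy - c.y, dz = cz - c.z;
        if (dx*dx + dy*dy + dz*dz < reduced * reduced)
            return false;                     // center inside reduced crystal
    }
    return true;
}
```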

Following the creation and statistical measurement of a user-specified number of IC crystal arrays, the algorithm collects the set of statistical measures for output. The output, at the user's discretion, can be the bounds of an explicit percentage of the runs (e.g., the inner 95% of the statistical measures), or it can be simply the mean and standard deviation of each statistical measure, allowing later processing software to calculate the n-sigma boundaries on the fly. These options are distinguished from each other in the text headers of the output file.
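A minimal sketch of assembling the envelope from the per-array results, assuming each statistic at each length scale is reduced independently, might look like this (hypothetical names; the percentile arithmetic is deliberately simple):

```cpp
// Two envelope outputs: the inner N% of the simulated values, or their mean
// and an n-sigma band for later processing software to expand on the fly.
#include <algorithm>
#include <cmath>
#include <vector>

struct Envelope { double lower, upper; };

// values: one statistic's results at one length scale, one per IC array.
Envelope innerPercent(std::vector<double> values, double percent /*e.g. 95*/) {
    std::sort(values.begin(), values.end());
    double tail = (1.0 - percent / 100.0) / 2.0;
    size_t lo = static_cast<size_t>(tail * (values.size() - 1));
    size_t hi = values.size() - 1 - lo;      // approximate symmetric bounds
    return {values[lo], values[hi]};
}

Envelope meanSigma(const std::vector<double>& values, double nSigma) {
    double mean = 0.0, sq = 0.0;
    for (double v : values) mean += v;
    mean /= values.size();
    for (double v : values) sq += (v - mean) * (v - mean);
    double sd = std::sqrt(sq / (values.size() - 1));
    return {mean - nSigma * sd, mean + nSigma * sd};
}
```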

Observability Limitations in Natural Data Sets

Another factor to consider in the production of crystal arrays that describe what the data set would look like if it had formed under an IC crystallization regime is that our ability to observe highly impinged pairs of crystals may be limited. Such crystals are generated in interface-controlled nucleation and growth simulations, and are permitted by the interface-controlled placement criterion described above. In natural samples, however, they might not be observed if the interpenetration produces shapes for the compound crystals that cannot be resolved as clusters and separated into individual crystals. This observability problem will arise during the analysis of sample data collected by many techniques, including the serial grinding and optical scanning technique (Raeburn, 1996; Daniel and Spear, 1999) and the technique of X-ray computed tomography (Carlson and Denison, 1992; Denison et al., 1997; Hirsch et al., 2000; Ketcham et al., 2005). Notably, however, it is not an issue in the analysis of data sets generated by nucleation and growth simulations. If the input file contains markers in the comment line that identify it as having been derived from a simulation, then the criteria described in the following are not used.

If pairs of highly intergrown crystals in a rock are misidentified as single, larger crystals, then the data set extracted from the rock will contain too few crystals, those crystals will have overly large sizes, and the mean nearest-neighbor distance will be too large. These errors will invalidate comparisons made to IC envelope crystal arrays in which all crystals are regarded as separately observable, regardless of how close to one another they may be. To remedy this problem, the user may choose to employ the following observability criterion in the creation of IC envelope crystal arrays (Hirsch et al., 2000).

The observability criterion has two parts; a pair of crystals must satisfy both parts in order to be considered separately observable. The first part of the criterion is

	d/dI ≥ a1,

where a1 is a tuned constant, d is the center-to-center distance between the two crystals, and dI is the distance from the center of the larger crystal to the plane that contains the circle of intersection of the (spherical) crystal surfaces (Fig. 5A). The second part of the criterion is

	l/rS ≥ a2,

where a2 is a tuned constant, l is the total length of the pair of crystals, and rS is the radius of the smaller crystal (Fig. 5B). If the pair fails either of these tests, then it is concluded that the two crystals cannot be distinguished as separate from one another, and the proposed placement is rejected in the same way as if it had failed the IC criterion described in the previous section.

The values of the numerical constants in these tests were obtained by a tuning process that subjected actual data sets, obtained from both CT imagery and time-explicit nucleation and growth simulations, to the observability criterion. In Hirsch et al. (2000), the value of a1 was determined to be 0.85, and the value of a2 was determined to be 3.0 for the high-resolution X-ray computed tomography (HRXCT) data sets used in that study. The numerical values were varied so that the number of crystal pairs classed as “inseparable on observation” was maximized in the simulations and minimized in the CT-derived data sets. This is clearly appropriate because all the crystals in the CT-derived data sets were, in fact, separately observed and must therefore be clearly distinct from their neighbors. Note that these values will vary with the quality of the data-acquisition process and the methods used for data processing. In addition, these values were obtained using a now-obsolete data analysis method for HRXCT data sets, superseded by Blob3D (Ketcham, 2005).

Work with Blob3D (Ketcham, 2005) suggests a more sophisticated method of tuning these values, based on the lower limits of (d/dI) and (l/rS) for each data set (Ketcham et al., 2005). Reduce3D incorporates this functionality, allowing the user to specify the observability criteria indirectly: the user selects the fraction of the impinged crystal pairs in the data set that should fail the criteria, and the program chooses the constants accordingly. For example, if the user selects 1% rejection, and 1% of the data have (l/rS) values below 3.3, then 3.3 will be the value chosen for a2.
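Assuming the ratio forms of the criteria given above, the observability test and its percentile-based tuning can be sketched as follows (hypothetical names; the geometry helper applies only to overlapping spheres):

```cpp
// Two-part observability test for a pair of overlapping spheres, plus a
// percentile-based tuning of one constant from the data set's own ratios.
#include <algorithm>
#include <vector>

// For spheres of radii rL >= rS whose centers are d apart, the intersection
// plane lies at dI = (d^2 + rL^2 - rS^2) / (2 d) from the larger center.
double planeDistance(double d, double rL, double rS) {
    return (d * d + rL * rL - rS * rS) / (2.0 * d);
}

bool separatelyObservable(double d, double rL, double rS,
                          double a1, double a2) {
    double dI = planeDistance(d, rL, rS);
    double l = d + rL + rS;               // total length of the pair
    return (d / dI >= a1) && (l / rS >= a2);
}

// Percentile tuning: pick the constant such that the chosen fraction of the
// data set's impinged pairs would fail the test (e.g., 0.01 -> the 1st
// percentile of the observed ratios becomes the threshold).
double tuneConstant(std::vector<double> ratios, double rejectFraction) {
    std::sort(ratios.begin(), ratios.end());
    size_t idx = static_cast<size_t>(rejectFraction * (ratios.size() - 1));
    return ratios[idx];
}
```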

Other Program Functions

A number of other functions have been added to Reduce3D. These include the ability to iteratively shave the data set, and the ability to account for voids within the data set.

Shave and Export and/or Reduce

In order to evaluate the effect of small numbers of crystals on the resulting statistical measures, the software incorporates the ability to shave a data set, reducing the size of the data volume and discarding crystals located outside the new, reduced boundaries. This process is used to gauge the sensitivity of the statistics to small numbers of crystals in the data set. Shaving requires the input file to have a line giving the explicit bounds of the data volume, as described above. The shaving is iterative, and each shave may be followed by either export of the data set to a new text file usable as input for Reduce3D, or direct statistical processing and output of results. Shaving may be performed in the x, y, or z direction (for cylinders, x and y are equivalent), or in all directions simultaneously, retaining the aspect ratio of the data. If shaving in the x, y, or z direction, the algorithm can remove crystals from the top, the bottom, or both equally. Shaving iterations continue until the data set reaches a specified minimum number of crystals, a specified minimum fraction of the original number of crystals, or a specified maximum aspect ratio, or until all three criteria are met.
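One shave step in the all-directions mode might be sketched as follows (hypothetical names; the shrink fraction f per step and the stopping tests would be managed by the caller):

```cpp
// One shave iteration: shrink the bounds toward the volume center by
// fraction f, preserving aspect ratio, and discard crystals whose centers
// fall outside the reduced volume.
#include <vector>

struct Crystal { double x, y, z, r; };
struct Box { double lox, loy, loz, hix, hiy, hiz; };

Box shrink(const Box& b, double f) {
    double cx = 0.5 * (b.lox + b.hix), cy = 0.5 * (b.loy + b.hiy),
           cz = 0.5 * (b.loz + b.hiz);
    return {cx + (b.lox - cx) * f, cy + (b.loy - cy) * f,
            cz + (b.loz - cz) * f, cx + (b.hix - cx) * f,
            cy + (b.hiy - cy) * f, cz + (b.hiz - cz) * f};
}

std::vector<Crystal> shaveOnce(const std::vector<Crystal>& in, Box& bounds,
                               double f) {
    bounds = shrink(bounds, f);
    std::vector<Crystal> kept;
    for (const Crystal& c : in)
        if (c.x >= bounds.lox && c.x <= bounds.hix &&
            c.y >= bounds.loy && c.y <= bounds.hiy &&
            c.z >= bounds.loz && c.z <= bounds.hiz)
            kept.push_back(c);
    return kept;
}
```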

Voids in the Data Set

Some data sets contain regions that should be excluded from analysis, for example, multiple porphyroblast phases in a metamorphic rock, or vesicles in an igneous rock. This complicates data processing. If, for example, one were examining biotite crystallization that was partially obscured by later garnet formation (e.g., Hirsch and Carlson, 2006), then naive collection of statistics for the biotite would be skewed by the voids in the biotite data set (the garnet volumes). Creation of appropriate IC crystal arrays would be impossible, because crystals would inevitably be placed within these voids, making the IC crystal arrays differ from the natural data set. To accommodate such cases, Reduce3D includes the ability to load auxiliary data sets containing spherical holes, i.e., regions of void space that should be taken into account when calculating statistics or placing crystals for the IC crystal arrays.
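The core of such a void test is a simple containment check, sketched below (hypothetical names; the holes would be loaded from the auxiliary data set):

```cpp
// Void test for IC-array placement or random-point generation: a candidate
// location is rejected if it falls inside any spherical hole.
#include <vector>

struct Sphere { double x, y, z, r; };

bool insideAnyHole(const std::vector<Sphere>& holes,
                   double px, double py, double pz) {
    for (const Sphere& h : holes) {
        double dx = px - h.x, dy = py - h.y, dz = pz - h.z;
        if (dx*dx + dy*dy + dz*dz < h.r * h.r) return true;
    }
    return false;
}
```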

Crystal Size Distributions

Traditional quantitative textural analyses have often focused on crystal size distributions (CSDs), and Reduce3D includes the ability to create these as well. Histograms representing five types of CSDs can be created (a minimal binning sketch follows the list):

  1. A typical CSD giving the number of crystals whose radii fall into each size class.

  2. A CSD of nearest-neighbor radii, giving the numbers of crystals whose nearest-neighbor's radii fall into each size class.

  3. A CSD giving the number of crystals whose radii (normalized to the mean radius in the data set) fall into each size class; can be cumulative or noncumulative.

  4. A CSD giving the number of crystals whose radii (normalized to the maximum radius in the data set) fall into each size class; can be cumulative or noncumulative.

  5. A cumulative CSD giving the number of crystals whose radii (normalized to the maximum radius and the size of the volume and plotted on a logarithmic scale) fall into each size class; this is the type used by Cashman and Ferry (1988).
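As promised above, the binning for a CSD of type 3 might look like the following hypothetical sketch (the bin count and the upper limit of the normalized range are arbitrary choices left to the caller):

```cpp
// CSD type 3: counts of crystals whose radii, normalized to the mean radius,
// fall into each size class; optionally converted to a cumulative histogram.
#include <vector>

std::vector<int> normalizedCSD(const std::vector<double>& radii, int nBins,
                               double maxNormalized, bool cumulative) {
    double mean = 0.0;
    for (double r : radii) mean += r;
    mean /= radii.size();

    std::vector<int> bins(nBins, 0);
    for (double r : radii) {
        int k = static_cast<int>((r / mean) / maxNormalized * nBins);
        if (k >= nBins) k = nBins - 1;   // clamp outliers into the top class
        ++bins[k];
    }
    if (cumulative)
        for (int k = 1; k < nBins; ++k) bins[k] += bins[k - 1];
    return bins;
}
```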


Use of Reduce3D is straightforward: it is a stand-alone, double-clickable graphical user interface–based program for Mac OS X. It includes detailed help files as well as help tags (tooltips) for the various user interface items in the settings window (Fig. 6). The input files typically derive from CT data sets via Blob3D, or from nucleation and growth simulations such as Crystallize, but could potentially be created in any other way that produced a correctly formatted input file. The user is prompted to select a large number of settings for program operation (Fig. 6), most of which are described above. The program will use the Mac OS X preferences system to store user settings between runs.

During operation, the program gives feedback on calculation progress: initially the progress for the natural data set calculations is displayed, and then the progression through the desired number of IC crystal arrays (envelope simulations). The user may choose to have the program provide audible feedback at completion. Following calculation, the program creates a text output file with the results of all the statistical measures. In addition, a log file may be created with details of the program's operation.

Some details of the program's construction may be relevant. Reduce3D was written in C++ (for the calculations) and Objective-C (for user interaction). The calculations run on a separate thread from the user interface to improve performance. Although portions of the program rely on Mac OS X application programming interfaces (user interface, threading, file input, and file output), isolation of the calculations into C++ makes porting the code to other platforms straightforward. Currently, Reduce3D is at version 3.30, and requires Mac OS X 10.4. In total Reduce3D comprises ∼3600 nontrivial lines of code.

Visualization Tools: GraphCFs and Render3D

GraphCFs is a companion program to Reduce3D, also written for Mac OS X, which translates the scale-dependent statistical measures and the three indices described above into graphical displays (Fig. 7). GraphCFs version 2.2 requires Mac OS X 10.6, although an older version exists (version 1.2) that requires only Mac OS X 10.1. The application metadata is set up such that the user should be able to double click on the output from Reduce3D and immediately view the data in GraphCFs.

GraphCFs takes the primary text output file from Reduce3D and parses it to extract the data that can be plotted. For the scale-dependent statistical measures, it plots the data, the null-hypothesis envelope, and the nearest-neighbor distance (±1 standard deviation). For the scale-independent statistical measures (the three indices), it plots the data and the null-hypothesis envelope only. If the Reduce3D output expresses the envelope as explicit limits, then these are displayed; if the output expresses the envelope as a mean and standard deviation, then the number of standard deviations represented by the plotted envelope is user customizable. Colors, fonts, line weights, and other details of the plots are also user customizable.

Render3D is a small application that displays a graphical rendering of the sizes and positions of all crystals in a volume (Fig. 8). It accepts the same input files as Reduce3D, and displays a three-dimensional rendering of the data volume that can be interactively rotated and resized. The purpose of the program is primarily to examine the data set for any outliers, and secondarily to gain an impression of the crystals in the data volume as compared to the natural specimen and any HRXCT-derived rendering.


As an illustration of the results that can be obtained from these programs, a rock from the Waterville Formation of Maine was analyzed. This specimen (number 711) has been described previously (Cashman and Ferry, 1988; Hirsch, 2007). It is a garnet-biotite schist metamorphosed to sillimanite zone conditions (650 ± 25 °C and 0.35 ± 0.03 GPa) (Hirsch, 2007). It was analyzed using HRXCT, and the data processed with Blob3D to identify individual garnets using a primitive-fitting algorithm (Ketcham, 2005). It contains 6370 garnets in a cylindrical data volume (Fig. 8).

Statistics for the resulting data set were measured with Reduce3D, and reveal evidence for diffusional ordering of crystal centers at scales up to approximately the nearest-neighbor distance. In particular, the PCF, L′ function, and ordering and clustering indices all show evidence for suppression of nucleation of nearby crystals, and the MCF and impingement index both show evidence for suppression of growth of nearby crystals (Fig. 7). In the scale-dependent statistics, these effects are observed at precisely those length scales over which one would expect diffusional suppression to operate, i.e., up to approximately the nearest-neighbor distance.


Reduce3D is a new program to calculate spatial statistics for crystal arrays, either from natural data sets or from crystallization models. It improves upon earlier methods by implementing revisions to the pair correlation and mark correlation functions (Hirsch et al., 2000). It is linked to companion programs for visualization of the correlation functions and of the crystal array. Most important, it includes rigorous production of interface-controlled crystal arrays that form a null-hypothesis envelope, enabling statistically rigorous determination of kinetic controls on nucleation and growth.

Although the development and testing of Reduce3D have been directed primarily at metamorphic crystal analysis, it could be applied more broadly. Perhaps growing crystals in a rhyolite act to suppress later crystallization nearby by raising the surrounding temperature via the heat of crystallization; this ought to produce nucleation and growth suppression of the sort that diffusional control exerts in metamorphic systems. Indeed, the technique might apply more broadly still, to phenomena such as the formation of nodules or concretions, or even to other fields such as biology. The only requirement is a set of sizes and locations in three dimensions for objects that can be well approximated as spheres.


The programs and source code are freely available for use and modification under the GNU General Public License v3.0 (http://www.gnu.org/licenses/gpl.html). The most recent version is available from my website (http://davehirsch.com), or by contacting me. The most recent version as of publication is available as an electronic appendix to this manuscript.

This work was supported in part by National Science Foundation grant EAR-9902682. Discussions with Rich Ketcham aided the early programming effort. I am grateful to Bill Carlson for an informal and very helpful review of the manuscript. Helpful comments from two anonymous reviewers served to improve the manuscript substantially.

1Reduce3D Software Archive. Zipped file containing three Mac OS X programs and demonstration data sets. You will need the Mac OS X operating system (version 10.4 or later) to use the archive. If you are viewing the PDF of this paper or reading it offline, please visit http://dx.doi.org/10.1130/GES00631.S1 or the full-text article on www.gsapubs.org to view the software archive.