# Investigation of Nonuniform Dose Voxel Geometry in Monte Carlo Calculations

## Abstract

… mm^{3}, which was further divided into smaller voxels. The smallest voxel size was 1.25 × 1.25 × 3 mm^{3}. We found that the simulation time per history for the nonuniform voxels is about 30% to 40% shorter than that for the uniform fine voxels (1.25 × 1.25 × 3 mm^{3}), while similar accuracy is maintained.

## Introduction

Monte Carlo (MC) simulation is widely regarded as the most accurate method for radiotherapy dose calculation.^{1–4} To increase the calculation speed, numerous variance reduction techniques, such as history repetition, Russian roulette, and photon splitting, are employed to improve the calculation efficiency.^{5}

In most dose engines for MC treatment planning, particles are tracked and energy is scored in uniform geometric voxels that are resampled from the original computed tomography (CT) image data at a decreased spatial resolution. The original CT resolution of 512 × 512 pixels is usually not used in MC simulations, mainly because the calculations would be very slow: an extremely large number of particles is required to reduce the statistical uncertainty of small voxels to an acceptable level. It has been shown^{6} that the central processing unit (CPU) time spent on voxel-to-voxel boundary crossing and energy loss deposition with the MC code VMC++ is proportional to the number of voxel boundaries crossed by the particle track and thus to the inverse of the voxel resolution. To obtain better dose resolution, a small voxel size should be used; however, small voxels lead to long calculation times. A large voxel size increases the calculation speed, since both the number of voxel boundary crossings and the number of particles required to achieve a given statistical uncertainty are reduced. The choice of voxel size in MC simulations is therefore a trade-off between precision and speed. The voxel size (eg, 5 or 2 mm) is chosen at the beginning of the simulation to uniformly discretize the dose volume.

One existing approach is to decouple energy scoring from particle transport with a separate scoring grid.^{7–9} The scoring grid is used for energy deposition only, whereas the CT data reside in a separate "geometry grid" used for particle transport. The voxels in the scoring grid are either groups of original CT voxels or separate superimposed spherical voxels; the size of the scoring voxels and the spacing between them are predefined by the user. Another approach that allows nonuniform voxel sizes is to use an octree as a compression algorithm to represent the CT data.^{10,11} An octree is a data structure in which each node has 8 subnodes; it is built by recursively subdividing a cubic volume into octants. This technique has been used in the general-purpose MC code GEANT4^{12} to reduce memory consumption as well as calculation time. Other approaches include the region-oriented approach^{13} and the non-voxelated approach.^{14,15} The region-oriented approach segments the CT image volume into regions of homogeneous composition and density and was implemented in the GEANT4 toolkit;^{12} the calculation time was reportedly decreased by up to a factor of 15. The non-voxelated approach isolates regions of uniform density and composition from the scoring grid to minimize the number of boundary crossings; it was implemented in DOSXYZnrc,^{16} with a reported speedup factor of up to 1.2 for CT phantoms. Yet another technique for increasing the calculation speed is the macro MC method, which applies a precalculated particle path library, generated in a well-defined simple local geometry, to a patient's heterogeneous global geometry; large steps are taken through the global geometry to accelerate the calculations.^{17}

In this work, we investigate nonuniform dose voxels generated by a technique similar to adaptive mesh refinement (AMR),^{18} which starts with a uniform coarse mesh; each coarse voxel is then recursively split into 2 subvoxels whenever the density variation within it exceeds a preset limit. The AMR technique is widely used in hydrodynamics to resolve the details of shock waves by adapting the grid spacing in those regions. In hydrodynamic simulations, the mesh changes dynamically over time as the resolution is refined; in our work, we use static mesh refinement in the sense that the nonuniform mesh is formed beforehand and stays the same during the calculation. All neighboring information for a voxel is stored in memory as a tree-like data structure; therefore, locating the voxel that contains a given particle position, as required by the tracking algorithm, is relatively quick.

## Materials and Methods

### Algorithm to Generate Nonuniform Voxels

The nonuniform voxels are generated as follows:

1. Discretize the dose volume into a uniform coarse mesh.
2. For each coarse voxel, compute the density difference Δ*D*_{x,y,z} in the X, Y, and Z directions as the difference between the average densities of the 2 halves of the voxel in that direction. The largest density difference and the associated direction are stored for each coarse voxel.
3. If the largest density difference (Δ*D*_{x}, Δ*D*_{y}, or Δ*D*_{z}) in the coarse voxel is greater than the predefined split criterion (eg, 0.5 g/cm^{3}), the voxel is divided in half along the associated direction; otherwise, the voxel remains intact. The average densities stored in step 2 are assigned to the 2 subvoxels.
4. Repeat the pass until no voxel is split or the minimum voxel size is reached.

For example, to go from an initial voxel size of 5 × 5 × 3 mm^{3} down to the smallest voxel size of 1.25 × 1.25 × 3 mm^{3}, 4 passes are needed, since each pass splits in only 1 direction. The generated voxels are stored in a file with an unstructured format, which includes information such as voxel indices, voxel neighbor connectivity, density, and material.
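As a concrete illustration, the per-direction splitting rule above can be sketched in a few lines. The following is a minimal 2D sketch with hypothetical names and a toy phantom (the paper works in 3D and writes the result to an unstructured file); a region is split along the axis with the largest half-to-half mean density difference whenever that difference exceeds the DGT:

```python
import numpy as np

def split_voxels(density, box, dgt, min_size):
    """Recursively split the box (x0, x1, y0, y1) of index bounds into
    the density grid whenever the density difference between its two
    halves along some axis exceeds the density gradient threshold (DGT)."""
    x0, x1, y0, y1 = box
    candidates = []
    if x1 - x0 >= 2 * min_size:  # splitting in x keeps both halves >= min_size
        mx = (x0 + x1) // 2
        candidates.append((abs(density[x0:mx, y0:y1].mean()
                               - density[mx:x1, y0:y1].mean()),
                           (x0, mx, y0, y1), (mx, x1, y0, y1)))
    if y1 - y0 >= 2 * min_size:
        my = (y0 + y1) // 2
        candidates.append((abs(density[x0:x1, y0:my].mean()
                               - density[x0:x1, my:y1].mean()),
                           (x0, x1, y0, my), (x0, x1, my, y1)))
    if candidates:
        best = max(candidates, key=lambda c: c[0])
        if best[0] > dgt:  # split along the direction of largest difference
            return (split_voxels(density, best[1], dgt, min_size)
                    + split_voxels(density, best[2], dgt, min_size))
    return [box]  # homogeneous enough: keep as one voxel

# Toy phantom: water (1.0 g/cm^3) with a dense insert (1.8) in one quadrant.
phantom = np.ones((8, 8))
phantom[4:, 4:] = 1.8
voxels = split_voxels(phantom, (0, 8, 0, 8), dgt=0.3, min_size=2)
```

On this toy phantom, the homogeneous half remains one large voxel while the dense insert is isolated into its own voxels, which is exactly the behavior that keeps the voxel count low in smooth regions.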

### Monte Carlo Tracking Algorithm

The particle transport follows DPM,^{19} a fast MC program for patient dose calculations. As in DPM, Compton scattering, photoelectric ionization, and pair production were considered for photon transport. These 3 processes were simulated in an analog fashion; that is, every interaction was modeled individually until the energy of the particle fell below a cutoff energy or the particle escaped the simulation volume. For other details of the implemented physics, readers may refer to the DPM paper;^{19} we only emphasize the boundary crossing algorithm here.
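The analog transport of a photon history can be sketched schematically as follows. All numbers here (attenuation coefficient, branching probabilities, energy loss per scatter) are illustrative stand-ins, not DPM's actual cross-section data, and the geometry is a bare slab:

```python
import math
import random

# Schematic values for illustration only (not DPM cross-section data).
MU_TOTAL = 0.07             # total attenuation coefficient, 1/cm (assumed)
P_COMPTON, P_PHOTO = 0.95, 0.04  # remainder stands in for pair production

def transport_photon(energy, z=0.0, depth=30.0, e_cut=0.05, rng=random.random):
    """Follow one photon history in analog fashion along one axis:
    every interaction is simulated individually until the photon falls
    below the cutoff energy or leaves the slab (0..depth cm)."""
    interactions = []
    while energy > e_cut:
        z += -math.log(1.0 - rng()) / MU_TOTAL  # exponential free path
        if z > depth:                  # photon escapes the simulation volume
            break
        xi = rng()
        if xi < P_COMPTON:             # Compton: photon scatters and loses energy
            interactions.append("compton")
            energy *= 0.5              # crude stand-in for the sampled energy loss
        elif xi < P_COMPTON + P_PHOTO: # photoelectric: photon is absorbed
            interactions.append("photoelectric")
            return interactions
        else:                          # pair production (physically, E > 1.022 MeV)
            interactions.append("pair")
            return interactions
    return interactions

random.seed(1)
history = transport_photon(energy=6.0)
```

Each history is a chain of individually sampled interactions, which is what "analog" means in this context; variance reduction techniques would modify this loop, for example by splitting photons or playing Russian roulette.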

### Statistics and Dose Evaluation

The efficiency ∊ of an MC calculation is defined^{20} as ∊ = (σ^{2}*T*)^{−1}, where σ^{2} is the variance and *T* is the time required to obtain this variance. The efficiency expresses how fast an MC algorithm can reach a desired level of statistical accuracy; this metric was introduced to compare MC packages with different variance reduction techniques, for which the number of simulated histories does not reflect the true performance. For the calculation of σ^{2}, the average uncertainty proposed by Rogers and Mohan^{20} is generally employed:

$$\overline{\sigma}^{2} = \frac{1}{N_{50}} \sum_{D_{i} > 0.5 D_{\max}} \left( \frac{\Delta D_{i}}{D_{i}} \right)^{2},$$

where *D*_{i} is the dose in voxel *i*, Δ*D*_{i} is its statistical uncertainty, and the average is taken over the *N*_{50} voxels with a dose greater than 50% of the maximum. Since σ^{2} is inversely proportional to the total number of simulated histories *N* and *T* is proportional to *N*, the efficiency is independent of *N* but depends on the simulated time per history.

The γ index^{21} is widely used to compare dose distributions. The γ index is defined for each reference dose point *r*_{r} as

$$\gamma(r_{r}) = \min_{r_{e}} \left\{ \Gamma(r_{e}, r_{r}) \right\},$$

where *r*_{e} refers to a dose point in the evaluation distribution and Γ is simply the Euclidean distance in the renormalized dose–distance space,

$$\Gamma(r_{e}, r_{r}) = \sqrt{ \frac{\left| r_{e} - r_{r} \right|^{2}}{\Delta d^{2}} + \frac{\left[ D_{e}(r_{e}) - D_{r}(r_{r}) \right]^{2}}{\Delta D^{2}} }.$$

In our work, we used Δ*d* = 3 mm as the acceptance criterion for the distance to agreement and Δ*D* = 3% as the acceptance criterion for the dose difference.^{22} Since all the dose distributions are evaluated with the fine-grid (1.25 mm) dose distribution as the reference, the acceptance criterion Δ*d* = 3 mm is reasonable. The criterion for an acceptable comparison is γ ≤ 1, and the passing rate is defined as the ratio of the number of voxels with γ ≤ 1 to the total number of voxels; the γ index averaged over voxels with a dose greater than 50% of the maximum is denoted γ_{avg}. The dose distribution from the fine uniform mesh was used as the reference, and the evaluation distribution in the γ index calculations consists of the dose points from the unstructured nonuniform mesh (at the voxel centers). All calculations were performed on a personal computer with a 3.3 GHz CPU and 3 GB of memory.
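A brute-force version of this γ evaluation for a 1D dose profile, with hypothetical function and variable names and the 3 mm/3% criteria, might look like:

```python
import numpy as np

def gamma_index(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
    """Brute-force 1D gamma: for each reference point, minimize the
    generalized distance Gamma over all evaluation points.
    dd: dose criterion as a fraction of the max reference dose; dta: mm."""
    d_norm = dd * ref_dose.max()
    gammas = []
    for r, d in zip(ref_pos, ref_dose):
        cap_gamma = np.sqrt(((eval_pos - r) / dta) ** 2
                            + ((eval_dose - d) / d_norm) ** 2)
        gammas.append(cap_gamma.min())  # minimum over the evaluation set
    return np.array(gammas)

# Sanity check: identical distributions give gamma = 0 everywhere,
# so the passing rate (fraction of voxels with gamma <= 1) is 100%.
x = np.linspace(0, 100, 51)   # positions in mm
dose = np.exp(-x / 50.0)      # toy depth-dose curve
g = gamma_index(x, dose, x, dose)
passing_rate = np.mean(g <= 1.0)
```

Practical implementations interpolate the evaluation distribution and restrict the search to a neighborhood of each reference point; the exhaustive minimum is shown here only to make the definition concrete.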

### Accuracy and Performance of the Algorithm Implementation

The accuracy of the implementation was verified with the ICCR benchmark slab phantom.^{20} The size of the slab phantom is 30.5 × 39.5 × 30 cm^{3}. For the photon test case, the ICCR phantom consists of water from 0 to 3 cm, aluminum from 3 to 5 cm, lung (ρ = 0.26 g/cm^{3}) from 5 to 12 cm, and water from 12 to 30 cm. For the electron test case, the phantom consists of water from 0 to 2 cm, aluminum from 2 to 3 cm, lung from 3 to 6 cm, and water from 6 to 30 cm. The beam size is 1.5 × 1.5 cm^{2} with a 100 cm source to surface distance. The statistical precision was set to 0.3%. The phantom was discretized into 5 × 5 mm^{2} voxels in the X and Y directions and 2 mm slices in the Z direction.

The timing behavior was studied with a beam of … cm^{2} field size originating from a point source. The 30 × 30 × 30 cm^{3} water phantom was discretized with voxel sizes of 1, 1.5, 2.5, 3.5, and 5 mm, respectively, and a total of 10^{6} particles were simulated. This study was used to demonstrate the dependence of the simulation time on the voxel size.

### Clinical Cases

Two clinical cases, a head and neck (H&N) case and a lung case, were studied. The source particles were sampled from the field fluence maps by using a Metropolis sampling algorithm.^{23} The norm error ‖p_{f} − p_{f}*‖_{2} was recorded during the simulations, where p_{f} and p_{f}* are the probability vectors of the simulated fluence map and the treatment planning system (TPS) fluence map, respectively. A total of 10^{9} particles were simulated to ensure that the simulated fluence map faithfully matched the TPS fluence map, with a norm error of less than 10^{−3}. A realistic clinical source model is under development; in this work, a simple point source was assumed, that is, the multi-leaf collimator and the detailed source model were not considered. Therefore, the calculated dose is not directly comparable with the TPS dose.
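The fluence-map consistency check described above can be sketched as follows. The 4 × 4 map and the sample count are made up for illustration, and NumPy's categorical `choice` sampler stands in for the Metropolis sampler; only the norm-error bookkeeping mirrors the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical TPS fluence map, normalized to a probability vector p_f*.
tps = np.array([[0., 1., 1., 0.],
                [1., 4., 4., 1.],
                [1., 4., 4., 1.],
                [0., 1., 1., 0.]])
p_tps = (tps / tps.sum()).ravel()

# Sample source particles (beamlet indices) from the fluence map ...
n = 100_000
idx = rng.choice(p_tps.size, size=n, p=p_tps)

# ... then compare the simulated fluence with the TPS fluence via the
# L2 norm error ||p_f - p_f*||_2 used in the text.
p_sim = np.bincount(idx, minlength=p_tps.size) / n
norm_error = np.linalg.norm(p_sim - p_tps)
```

As the number of samples grows, the norm error shrinks roughly as the inverse square root of the sample count, which is why 10^{9} histories suffice to push it below 10^{−3}.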

## Results

### Accuracy and Performance of the Algorithm Implementation

The simulation time per history was found to be a linear function of the inverse voxel size 1/Δ*x*. This result is consistent with the findings of Kawrakow,^{6} who suggested that the CPU time *T* for an MC code using algorithms that do not truncate steps at boundaries is

$$T = N \left( T_{0} + \frac{\alpha}{\Delta x} \right),$$

where *N* is the number of particle histories and *T*_{0} is the average time per history spent on the simulation of geometry-independent quantities. The term $\alpha/\Delta x$ represents the CPU time needed for voxel-to-voxel boundary crossing and energy loss deposition. We found that *T*_{0} = 37 and α = 47 (in μs per history, with Δ*x* in mm) fit the measured line very well in our simulations.
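A fit of this kind reduces to ordinary least squares on *T* per history versus 1/Δ*x*. The timing numbers below are synthetic values generated to mimic the reported *T*_{0} = 37 and α = 47, not measured data:

```python
import numpy as np

# Hypothetical measurements: voxel size (mm) vs time per history (us),
# constructed to follow t = T0 + alpha/dx with T0 = 37, alpha = 47 plus noise.
dx = np.array([1.0, 1.5, 2.5, 3.5, 5.0])
t_hist = np.array([84.2, 68.1, 55.9, 50.6, 46.3])

# The model is linear in 1/dx, so build the design matrix [1, 1/dx]
# and solve the least-squares problem for (T0, alpha).
A = np.vstack([np.ones_like(dx), 1.0 / dx]).T
(t0, alpha), *_ = np.linalg.lstsq(A, t_hist, rcond=None)
```

The intercept estimates the geometry-independent time per history and the slope estimates the boundary-crossing cost, so halving the voxel size adds roughly another α/Δ*x* microseconds per history.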

### Clinical Cases

For each clinical case, the dose was calculated on a uniform fine mesh with a voxel size of 1.25 × 1.25 × 3 mm^{3} (the slice thickness was 3 mm) and on several nonuniform voxel meshes. The dose distribution calculated on the fine uniform mesh was used as the reference for comparison and for the γ index evaluations. For the nonuniform meshes, the initial voxel size was set to 5 × 5 × 3 mm^{3}, and the allowed minimum voxel size was set to 1 × 1 × 3 mm^{3} so that no split occurred in the Z direction. By applying different split density gradient thresholds (DGTs) to the initial voxel size (5 × 5 × 3 mm^{3}), several nonuniform meshes were created. Figure 5A and B shows 1 slice of the generated nonuniform voxels for the H&N case and the lung case, respectively. Note that the nonuniform mesh depicts the patient structure, showing the boundary of the patient and the interfaces between soft tissue and bone. A cuboid region covering the PTV was defined as an input for calculating the maximum uncertainty of all the voxels in the region. With a DGT of 0.1, the PTV volume in the cuboid region was discretized into 1020 voxels of 5 mm, 20 voxels of 2.5 mm, and 5 voxels of 1.25 mm for the H&N case. For the lung case, the lung PTV volume was discretized into 17 520 voxels of 5 mm and 237 voxels of 2.5 mm; no voxels of 1.25 mm were generated, since the lung PTV is quite homogeneous and no smaller voxels were necessary.

As noted by Kawrakow,^{6} the goal of an MC simulation for a radiation treatment plan is to calculate the dose distribution to a prescribed statistical uncertainty accurately and efficiently. With the nonuniform voxels, the number of particles needed to reach the prescribed statistical uncertainty in the region above 50% of the maximum dose was reduced by a factor of approximately 6 compared with the uniform voxels; thus, an efficiency improvement of about 6 was obtained for these test cases. The other measure of speedup is the simulation time per history: the nonuniform voxel approach is about 30% to 40% faster per history than the uniform fine voxels (1.25 × 1.25 × 3 mm^{3}).

**Table 1.** Comparison of the calculations on the uniform fine mesh and the nonuniform meshes for the 2 clinical cases.^{a}

| Case | Mesh | DGT | # Voxels | γ_{avg} | PR, % | T, µs/history | Eff |
|---|---|---|---|---|---|---|---|
| H&N | A | | 2.03 × 10^{7} | | | 24.8 | 1.1 |
| | B | 0.1 | 1.87 × 10^{6} | 0.44 | 96.1 | 18.4 | 5.8 (5.3) |
| | | 0.3 | 1.62 × 10^{6} | 0.45 | 95.9 | 16.6 | 6.2 (5.6) |
| | | 0.5 | 1.53 × 10^{6} | 0.49 | 95.4 | 16.1 | 6.5 (5.8) |
| | | 0.7 | 1.48 × 10^{6} | 0.48 | 95.3 | 15.1 | 6.9 (6.2) |
| | | 1.0 | 1.44 × 10^{6} | 0.48 | 95.1 | 15.1 | 6.9 (6.2) |
| Lung | A | | 2.69 × 10^{7} | | | 29.2 | 0.6 |
| | B | 0.1 | 2.23 × 10^{6} | 0.45 | 99.9 | 21.5 | 3.1 (5.2) |
| | | 0.3 | 1.73 × 10^{6} | 0.45 | 99.9 | 18.7 | 3.5 (5.8) |
| | | 0.5 | 1.60 × 10^{6} | 0.33 | 100 | 17.3 | 3.9 (6.4) |
| | | 0.7 | 1.54 × 10^{6} | 0.45 | 99.8 | 16.9 | 4.0 (6.6) |
| | | 1.0 | 1.48 × 10^{6} | 0.46 | 99.8 | 16.7 | 4.1 (6.9) |

Abbreviations: A, uniform fine mesh with a voxel size of 1.25 × 1.25 × 3 mm^{3}; B, nonuniform meshes generated with the different DGTs used for voxel splitting; γ_{avg}, the γ index averaged over all voxels with a dose greater than 50% of the maximum; PR, the passing rate, that is, the percentage of voxels with a γ index less than 1; T, the calculation time in microseconds per history; Eff, efficiency.

^{a}The number in parentheses is the efficiency gain compared with the calculation on the uniform mesh.

## Discussion and Conclusion

…^{24,25} There are 2 kinds of approaches to handle boundary crossing. One approach forces all particles to stop at the boundary, where the particle direction is changed (GEANT4, EGS4/EGSnrc, and PENELOPE). As an example, 2 boundary crossing algorithms are implemented in EGSnrc, namely EXACT and PRESTA-I: within a user-specified distance from the boundary (the skin depth for the boundary crossing algorithm), either single elastic scattering (EXACT) or a multiple scattering event (PRESTA-I) is applied to calculate the changed direction. It has been demonstrated by David et al^{13} that the computation time can be decreased by a factor of 15 in GEANT4 by using their segmented volume approach to minimize the number of boundary crossings; the idea is to merge voxels of the same material into large volume segments so that fewer boundaries are encountered.

The achievable speed of an MC simulation has a theoretical upper limit determined by the fraction of time spent on geometry calculations.^{6} For example, VMC++,^{6} using the simultaneous transport of particle sets technique, reaches 90% of this maximum for 1 mm voxels. The DPM^{19} MC code spends 41% of its time on voxel-to-voxel boundary crossing and energy loss deposition, and the PMC,^{26} using the precalculated data technique, spends 56% of its time on ray tracing. Although a patient's CT is represented by millions of small voxels, many large homogeneous regions of the same tissue type exist, and boundary crossing considerations inside those regions are unnecessary. A natural approach to reducing the number of boundary crossings is therefore to create nonuniform voxels of various sizes, with larger voxels in smooth density regions and smaller voxels in regions with larger density gradients. The rationale is that a high dose gradient is often associated with a high density gradient, which is also what commonly differentiates materials. Note that in a fast dose algorithm that does not truncate steps at boundaries, the operation of calculating the distance to the voxel boundary is still required; therefore, as the number of boundaries encountered is reduced, the computing time spent on boundary crossings is reduced as well.

… mm^{3}). In addition, as a result of the nonuniform voxel decomposition, the average voxel is larger. A large voxel volume helps to reduce the statistical uncertainty, since more particles deposit energy in each voxel. For the target region, large voxels may be generated if the PTV is fairly homogeneous; fewer particles may then be needed to reach the prescribed statistical uncertainty than with the fine grid, without compromising much accuracy for the surrounding organs, as demonstrated by the DVHs. For heterogeneous areas, such as tissue–bone interfaces, small voxels are generated to keep a high spatial resolution where large dose gradients may occur. By using nonuniform voxels in this way, the number of voxels can be reduced by an order of magnitude, and the efficiency is improved because of the resulting larger average voxel size. For dose algorithms such as collapsed-cone convolution and superposition–convolution,^{27,28} a regular uniform mesh may be crucial, since those algorithms exploit the regularity of the mesh to increase the calculation speed; for MC simulation, however, the regularity of the mesh is unimportant because of the randomness of the particle trajectories. In addition to the common variance reduction techniques, geometric division into nonuniform voxels can thus be a useful technique to further accelerate MC simulations.
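The distance-to-boundary operation discussed above, the step whose count the nonuniform mesh reduces, can be sketched for a uniform grid as follows (a minimal axis-aligned version with hypothetical names; production codes also track the voxel index incrementally):

```python
def distance_to_boundary(pos, direction, voxel_size):
    """Distance from pos to the nearest voxel face along `direction`
    in a uniform grid with spacing voxel_size (same length units as pos).
    This is the per-step geometry operation whose total count a
    nonuniform mesh reduces."""
    best = float("inf")
    for p, u, h in zip(pos, direction, voxel_size):
        if u > 0:                       # next grid plane above p
            t = ((p // h + 1) * h - p) / u
        elif u < 0:                     # next grid plane below p
            t = ((p // h) * h - p) / u  # p//h*h <= p and u < 0, so t >= 0
        else:
            continue                    # moving parallel to these planes
        best = min(best, t)
    return best

# A particle at x = 1.2 mm moving along +x in 5 x 5 x 3 mm voxels:
# the next x-face sits at 5 mm, so the distance is 3.8 mm.
d = distance_to_boundary((1.2, 0.5, 0.0), (1.0, 0.0, 0.0), (5.0, 5.0, 3.0))
```

With larger voxels the returned distances are larger, so fewer such evaluations (and fewer step interruptions) are needed per history.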

## Abbreviations

- MC: Monte Carlo
- MRS: multi-resolution subdivision
- AMR: adaptive mesh refinement
- DTA: distance to agreement
- PC: personal computer
- SSD: source to surface distance
- IMRT: intensity modulated radiation therapy
- H&N: head and neck
- DVH: dose volume histogram
- TPS: treatment planning system
- PTV: planning target volume
- DGT: density gradient threshold
- PDD: percentage depth dose
- MLC: multi-leaf collimator
- CSDA: continuous slowing down approximation
- ICCR: International Conference on the Use of Computers in Radiation Therapy
- CT: computerized tomography


## References

*Med Phys*. 2004;31(1):142–153.

*Med Phys*. 2007;34(12):4818–4853.

*Phys Med Biol*. 2000;45(8):2163–2183.

*Advanced Monte Carlo for Radiation Physics, Particle Transport Simulation and Application: Proceedings for the Monte Carlo 2000 Meeting Lisbon*. Berlin: Springer Berlin Heidelberg; 2000:229–236.

*Med Phys*. 2001;28(7):1322–1337.

*Phys Med Biol*. 2004;49(14):N235–N241.

*Phys Med Biol*. 2012;57(11):3273–3280. doi:10.1088/0031-9155/57/11/3273.

*Med Phys*. 2006;33(8):2819–2831.

*IEEE Trans Med Imaging*. 2009;28(12):1894–1901. doi:10.1109/TMI.2009.2021615.

*Nucl Instrum Methods A*. 2003;506:250–303. doi:10.1016/S0168-9002(03)01368-8.

*Med Phys*. 2008;35(4):1452–1463. doi:10.1118/1.2884854.

*Med Phys*. 2008;35(9):4106–4111. doi:10.1118/1.2968094.

*Med Phys*. 2008;35(2):633–644.

*NRC Report PIRS 794. Ionizing Radiation Standards*. Ottawa, Ontario: NRC; 2005.

*Phys Med Biol*. 1992;37(1):107–125. doi:10.1088/0031-9155/37/1/007.

*J Comput Phys*. 1989;82(1):64–84. doi:10.1016/0021-9991(89)90035-1.

*Phys Med Biol*. 2000;45(8):2263–2291. doi:10.1088/0031-9155/45/8/315.

*Proceedings of the 13th ICCR: The Use of Computers in Radiation Therapy, XIIIth International Conference*. Heidelberg: Springer-Verlag; 2000:120–122.

*Med Phys*. 1998;25(5):656–661. doi:10.1118/1.598248.

*Med Phys*. 2010;37(9):4868–4873. doi:10.1118/1.3480964.

*J Chem Phys*. 1953;21:1087–1092. doi:10.1109/5992.814660.

*Nucl Instr Methods Phys Res B*. 1987;18:165–181. doi:10.1016/S01680583X(86)80027.

*Monte Carlo Transport of Electrons and Photons*. New York: Plenum; 1988:115–137.

*Med Phys*. 2009;36(2):530–540.

*Med Phys*. 1989;16(4):577–592. doi:10.1118/1.596360.

*Med Phys*. 2011;38(3):1150–1161. doi:10.1118/1.3551996.


#### Published In

**Article first published online**: September 14, 2014

**Issue published**: August 2015


#### History

**Manuscript received**: January 14, 2014

**Revision received**: May 15, 2014

**Manuscript accepted**: May 20, 2014


This article was published in *Technology in Cancer Research & Treatment*.
