Abstract
Quantification of annular dark field (ADF) scanning transmission electron microscopy (STEM) images in terms
of composition or thickness often relies on probe-position integrated scattering cross sections (PPISCS). In
order to compare experimental PPISCS with theoretically predicted ones, expensive simulations are needed for
a given specimen, zone axis orientation, and a variety of microscope settings. The computation time of such
simulations can be on the order of hours on a single GPU card. ADF STEM simulations can be efficiently
parallelized using multiple GPUs, as the calculation of each pixel is independent of other pixels. However, most
research groups do not have the necessary hardware, and, at best, the simulation time is reduced only in
proportion to the number of GPUs used. In this manuscript, we take a learning-based approach and present a
densely connected neural network that performs real-time ADF STEM PPISCS predictions as a function of atomic
column thickness, root-mean-square displacement, and microscope parameters for the most common face-centered
cubic (fcc) crystals (i.e., Al, Cu, Pd, Ag, Pt, Au, and Pb) along the [100] and [111] zone axis orientations.
The proposed architecture is parameter-efficient and yields accurate PPISCS predictions over the wide range of
input parameters commonly used on aberration-corrected transmission electron microscopes.
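
As a rough, illustrative sketch of such a densely connected regression network (not the authors' released
implementation), the following PyTorch snippet maps a vector of inputs such as atomic column thickness,
root-mean-square displacement, and microscope parameters to a predicted PPISCS value; the feature count,
layer widths, and class name are assumptions chosen only for illustration.

    import torch
    import torch.nn as nn

    class DenselyConnectedMLP(nn.Module):
        # Each hidden layer receives the concatenation of the input and all
        # previous hidden activations (dense connectivity), which keeps the
        # parameter count low for a given depth.
        def __init__(self, n_inputs=8, growth=32, n_layers=4, n_outputs=1):
            super().__init__()
            self.layers = nn.ModuleList()
            width = n_inputs
            for _ in range(n_layers):
                self.layers.append(nn.Sequential(nn.Linear(width, growth), nn.ReLU()))
                width += growth  # features accumulate through concatenation
            self.head = nn.Linear(width, n_outputs)

        def forward(self, x):
            feats = x
            for layer in self.layers:
                feats = torch.cat([feats, layer(feats)], dim=-1)
            return self.head(feats)

    # Hypothetical usage: the 8 input features could encode column thickness,
    # rms displacement, acceleration voltage, convergence angle, detector
    # inner/outer angles, etc.
    model = DenselyConnectedMLP(n_inputs=8)
    ppiscs = model(torch.randn(4, 8))  # batch of 4 parameter sets -> 4 predicted PPISCS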