In the human hand, high-density contact information provided by afferent neurons is essential for many human grasping and manipulation capabilities. In contrast, robotic tactile sensors, including the state-of-the-art SynTouch BioTac, are typically used to provide low-density contact information, such as contact location, center of pressure, and net force. Although useful, these data do not convey or leverage the rich information content that some tactile sensors naturally measure. This research extends robotic tactile sensing beyond reduced-order models through (1) the automated creation of a precise experimental tactile dataset for the BioTac over a diverse range of physical interactions, (2) a 3D finite-element (FE) model of the BioTac, which complements the experimental dataset with high-density, distributed contact data, (3) neural-network-based mappings from raw BioTac signals to not only low-dimensional experimental data, but also high-density FE deformation fields, and (4) inverse mappings from the FE deformation fields back to the raw signals themselves. The high-density data streams can provide a far greater quantity of interpretable information for grasping and manipulation algorithms than previously accessible. Datasets, CAD files for the experimental testbed, FE model files, and videos are available at https://sites.google.com/nvidia.com/tactiledata.