Because luminescent detectors are commonly used for dose measurements in space and at hadron therapy facilities, knowing their efficiency over a wide range of particles and energies is of fundamental importance. However, experimental limitations often make it impossible to irradiate detectors with very high energies, less common isotopes, or exotic particles. Furthermore, efficiency determination at low energies suffers from large associated uncertainties in range, linear energy transfer, and dose. This paper presents the recently developed Microdosimetric d(z) Model, which assesses the relative efficiency of thermoluminescent detectors to different radiation qualities by relating the simulated dose probability distribution of specific energy in nanometric targets to an experimentally determined response function. The model was tested for LiF:Mg,Ti (MTS) and LiF:Mg,Cu,P (MCP) thermoluminescent detectors exposed to charged particles from 1H to 132Xe in the energy range 3–1000 MeV/u. Comparison with experimentally determined efficiencies showed very good agreement for calculations performed with a simulated target size of 40 nm. The validated model can be used to assess detector efficiency for exotic particles, for radiation qualities and energies unavailable at ground-level accelerators, and for complex mixed fields. The assumptions behind the model, its methodology, and its results are discussed in detail. Furthermore, a systematic investigation of the effect of simulation parameters on the calculated efficiency values is included in the manuscript.
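The weighting of a simulated specific-energy distribution by a response function, as described above, can be sketched numerically. This is an illustrative example only, not the authors' implementation: the distributions d(z) and r(z) below are hypothetical placeholders, and the efficiency is approximated as the response-weighted integral of d(z), a general form used in microdosimetric detector models.

```python
# Hedged sketch of a microdosimetric efficiency calculation:
# eta ~ integral of d(z) * r(z) dz, normalised by integral of d(z) dz,
# where d(z) is the dose probability density of specific energy z in a
# nanometric target and r(z) an experimentally determined response function.
# Both functions here are hypothetical placeholders for illustration.
import numpy as np

def relative_efficiency(z, d_z, r_z):
    """Approximate eta via trapezoidal integration on the grid z."""
    numerator = np.trapz(d_z * r_z, z)
    denominator = np.trapz(d_z, z)
    return numerator / denominator

# Placeholder inputs (arbitrary units, not from the paper):
z = np.linspace(1e-3, 10.0, 2000)                 # specific-energy grid
d_z = np.exp(-(np.log(z))**2 / 0.5) / z           # log-normal-like density
r_z = 1.0 - np.exp(-z / 2.0)                      # saturating response

eta = relative_efficiency(z, d_z, r_z)            # relative efficiency in (0, 1)
```

Because the placeholder response saturates below unity, the resulting efficiency is below 1; with a real d(z) from track-structure simulation and a measured r(z), the same weighting yields the detector-specific relative efficiency.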