The sub-grain size, d, during steady-state dislocation creep of polycrystalline metals is theoretically formulated to be inversely proportional to the dislocation density, ρ, which is defined as the number of dislocations swept out of a sub-grain divided by the cross-sectional area of the sub-grain. This dislocation density differs from the typically observed dislocation density inside a sub-grain after unloading, ρ_ob. In the current work, the ρ_ob values inside sub-grains in steadily crept specimens of Al, Cu, Fe, Fe–Mo alloy, austenitic stainless steel, and high-Cr martensitic steel reported in the literature were used to evaluate the relation ρ_ob = η·ρ. It was confirmed that η ≈ 1 for pure metals (regardless of the type of metal) crept at high temperatures and low stresses or for long durations, and η > 1 for Mo-containing alloys and martensitic steel crept at low temperatures and/or high stresses. Moreover, it is suggested that the condition η > 1 corresponds to a state of excess immobile dislocations inside the sub-grain. The theoretical relation d_ob (≈ d) ∝ η·ρ_ob^(-1), where d_ob is the observed sub-grain size, essentially differs from the well-known empirical relation d_ob ∝ ρ_ob^(-0.5).
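The final proportionality follows directly from combining the two stated relations; a brief sketch of the substitution (using only the symbols defined above) is:

```latex
% Theoretical relation: sub-grain size is inversely proportional
% to the swept-dislocation density \rho
d \propto \rho^{-1}
% Link between the swept density and the observed density
\rho_{\mathrm{ob}} = \eta \, \rho
% Substituting \rho = \rho_{\mathrm{ob}} / \eta gives
d_{\mathrm{ob}} \; (\approx d) \; \propto \; \eta \, \rho_{\mathrm{ob}}^{-1}
% which has exponent -1 in \rho_{\mathrm{ob}}, in contrast to the
% empirical scaling d_{\mathrm{ob}} \propto \rho_{\mathrm{ob}}^{-0.5}
```

Note that the exponent on ρ_ob, not the prefactor η, is what distinguishes the theoretical relation from the empirical one.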