In a previous paper the absorption of γ-rays in the K-X-ray levels of the atom in which they are emitted was calculated according to the Quantum Mechanics, supposing the γ-rays to be emitted from a doublet of moment $f(t)$ at the centre of the atom. The non-relativity wave equation derived from the relativity wave equation for an electron of charge $-\varepsilon$ moving in an electro-magnetic field of vector potential $\mathbf{K}$ and scalar potential $V$ is
$$h^{2}\nabla^{2}\phi + 2\mu\left(ih\,\frac{\partial}{\partial t} + \varepsilon V + \frac{ih\varepsilon}{\mu c}\,(\mathbf{K}\cdot\mathrm{grad})\right)\phi = 0. \tag{1}$$

Suppose, however, that $\mathbf{K}$ involves the space co-ordinates. Then $(\mathbf{K}\cdot\mathrm{grad})\,\phi \neq (\mathrm{grad}\cdot\mathbf{K})\,\phi$, and the expression $(\mathbf{K}\cdot\mathrm{grad})\,\phi$ is not Hermitic. Equation (1) cannot therefore be the correct non-relativity wave equation for a single electron in an electro-magnetic field, and we must substitute
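The Hermitic character of the symmetrized combination can be verified by partial integration — a short sketch, assuming the wave functions vanish at infinity and $\mathbf{K}$ is real (assumptions not stated explicitly in the text):

```latex
% Integration by parts over all space, surface terms vanishing:
\int \phi_1^{*}\,(\mathrm{grad}\cdot\mathbf{K})\,\phi_2 \, d\tau
  \;=\; -\int \phi_2\,(\mathbf{K}\cdot\mathrm{grad})\,\phi_1^{*}\, d\tau .
% Hence the adjoint of (K . grad) is -(grad . K) and vice versa, so the
% combination i[(K . grad) + (grad . K)] is self-adjoint (Hermitic),
% whereas i(K . grad) alone is Hermitic only when div K = 0.
```

This is why the operator must enter equation (2) in the symmetrized form $(\mathbf{K}\cdot\mathrm{grad}) + (\mathrm{grad}\cdot\mathbf{K})$.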
$$h^{2}\nabla^{2}\phi + 2\mu\left(ih\,\frac{\partial}{\partial t} + \varepsilon V\right)\phi + \frac{ih\varepsilon}{c}\left((\mathbf{K}\cdot\mathrm{grad})\,\phi + (\mathrm{grad}\cdot\mathbf{K})\,\phi\right) = 0. \tag{2}$$
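As a consistency check (not part of the original argument), expanding the symmetrized term shows that equation (2) reduces to equation (1) whenever $\mathrm{div}\,\mathbf{K} = 0$:

```latex
% Since (grad . K) phi = div(K phi) = (div K) phi + (K . grad) phi,
(\mathbf{K}\cdot\mathrm{grad})\,\phi + (\mathrm{grad}\cdot\mathbf{K})\,\phi
  \;=\; 2\,(\mathbf{K}\cdot\mathrm{grad})\,\phi + (\mathrm{div}\,\mathbf{K})\,\phi ,
% so for div K = 0 the last term of (2) becomes
% (2 ih eps / c)(K . grad) phi = 2 mu (ih eps / mu c)(K . grad) phi,
% and (2) coincides term by term with (1).
```

The two equations therefore differ only for potentials with non-vanishing divergence.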