(1) Background: In the nervous system, information is conveyed by sequences of action potentials called spike trains. As MacKay and McCulloch proposed, spike trains can be represented as bit sequences generated by Information Sources. Previously, we studied the relations between the Information Transmission Rate (ITR) carried by spikes, their correlations, and their frequencies. Here, we concentrate on the problem of how spike fluctuations affect the ITR. (2) Methods: The Information Theory method developed by Shannon is applied. Information Sources are modeled as stationary stochastic processes; we assume these sources to be two-state Markov processes. As a measure of spike-train fluctuations, we consider the standard deviation σ, which in fact measures the average fluctuation of spikes around the average spike frequency. (3) Results: We found that the character of the relation between the ITR and signal fluctuations depends strongly on the parameter s, defined as the sum of the transition probabilities from the no-spike state to the spike state and vice versa. It turned out that for small s (s < 1) the quotient ITR/σ has a maximum and, depending on the transition probabilities, can tend to zero, while for s large enough (s > 1) the quotient ITR/σ is separated from 0 for every such s. Similar behavior was also observed when we replaced the Shannon entropy terms in the Markov entropy formula by their polynomial approximations. We also show that the quotient of the ITR by the variance behaves in a completely different way. (4) Conclusions: Our results show that for a large transition parameter s, the quotient of the Information Transmission Rate by σ never decreases to zero. Specifically, for 1 < s < 1.7 the ITR always stays above the level of fluctuations, independently of the transition probabilities that form this s; i.e., in this case σ < ITR. Thus, we conclude that in a noisier environment, to achieve adequate reliability and efficiency of transmission, Information Sources with a higher tendency to transition between the no-spike and spike states should be applied.
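
To make the quantities above concrete, here is a minimal sketch of the ITR/σ computation, assuming the two-state Markov model described in the Methods: states 0 (no spike) and 1 (spike), hypothetical transition probabilities p01 (no spike → spike) and p10 (spike → no spike) with s = p01 + p10, the ITR taken as the entropy rate of the chain in bits per time bin, and σ taken as the per-bin standard deviation of the binary spike variable under the stationary distribution. These conventions are illustrative assumptions; the paper's exact definitions may differ.

```python
import numpy as np

def binary_entropy(p):
    """Binary Shannon entropy H(p) in bits, with H(0) = H(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)

def itr_over_sigma(p01, p10):
    """ITR/sigma for a two-state (no-spike/spike) Markov source.

    p01 : transition probability no-spike -> spike
    p10 : transition probability spike -> no-spike
    s = p01 + p10 is the transition parameter discussed above.
    """
    # Both probabilities must be positive so the stationary
    # distribution is non-degenerate and sigma > 0.
    assert 0.0 < p01 and 0.0 < p10
    s = p01 + p10
    # Stationary distribution of the two-state chain.
    pi1 = p01 / s            # probability of the spike state
    pi0 = 1.0 - pi1          # probability of the no-spike state
    # Entropy rate of the Markov chain, taken here as the ITR (bits/bin).
    itr = pi0 * binary_entropy(p01) + pi1 * binary_entropy(p10)
    # Per-bin standard deviation of the binary spike variable.
    sigma = np.sqrt(pi0 * pi1)
    return itr / sigma

print(itr_over_sigma(0.6, 0.6))    # s = 1.2 (> 1)
print(itr_over_sigma(0.01, 0.09))  # s = 0.1 (< 1)
```

Evaluating the sketch at, e.g., p01 = p10 = 0.6 (s = 1.2) versus p01 = 0.01, p10 = 0.09 (s = 0.1) illustrates the qualitative contrast described in the Results: the quotient is well separated from zero in the first case and small in the second.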