When a regression model is estimated by the least squares method, one of the prerequisites is the absence of autocorrelation in the regression residuals. Autocorrelation in the residuals renders the least-squares estimates inefficient and the standard errors of those estimates unreliable. Quantitatively, autocorrelation in the residuals of a regression model has traditionally been assessed with the Durbin-Watson statistic, the ratio of the sum of squared differences of consecutive residuals to the sum of squared residuals. Unfortunately, this analytical form of the Durbin-Watson statistic does not allow it to be incorporated, as a set of linear constraints, into the problem of selecting informative regressors, which is essentially a mathematical programming problem. The task of selecting informative regressors is to extract, from a given set of possible regressors, a prescribed number of variables according to some quality criterion.

The aim of the paper is to develop and study new criteria for detecting first-order autocorrelation in the residuals of regression models that can later be integrated, in the form of linear constraints, into the problem of selecting informative regressors. To this end, the paper proposes modular autocorrelation statistics for which the ranges of possible values and the limit values were first determined experimentally with the Gretl package, depending on the value of the sample autoregression coefficient. These results were then confirmed by model experiments using the Monte Carlo method. A disadvantage of the proposed modular statistics is that their dependencies on the sample autoregression coefficient are not even functions.
To overcome this, double modular autocorrelation criteria are proposed which, using special techniques, can serve as linear constraints in mathematical programming problems for selecting informative regressors in regression models.
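For reference, the classical Durbin-Watson statistic mentioned above can be computed directly from its definition. The following is a minimal sketch using NumPy; the function name is illustrative and not part of the paper's proposed criteria:

```python
import numpy as np

def durbin_watson(residuals):
    """Durbin-Watson statistic: the sum of squared differences of
    consecutive residuals divided by the sum of squared residuals.
    Values near 2 indicate no first-order autocorrelation; values
    near 0 (resp. 4) indicate positive (resp. negative) autocorrelation."""
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Alternating residuals exhibit strong negative autocorrelation,
# so the statistic lies well above 2:
print(durbin_watson([1.0, -1.0, 1.0, -1.0]))  # → 3.0
```

The nonlinearity of this ratio in the residuals is precisely what prevents it from entering a regressor-selection program as a linear constraint, which motivates the modular criteria developed in the paper.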