optimal sample size
Recently Published Documents

TOTAL DOCUMENTS: 115 (FIVE YEARS: 26)
H-INDEX: 13 (FIVE YEARS: 2)

Kybernetes
2021, Vol. ahead-of-print (ahead-of-print)
Author(s): Yang Liu, Yi Chen, Kefan Xie, Jia Liu

Purpose: This research aims to determine whether the pool testing method of SARS-CoV-2 for COVID-19 is effective and what the optimal sample size in one bunch is. Additionally, since the infection rate is unknown at the beginning of an outbreak, this research proposes a multiple sampling approach that enables the pool testing method to be applied successfully.

Design/methodology/approach: The authors verify, using probabilistic modeling, that the pool testing method of SARS-CoV-2 for COVID-19 is effective when nucleic acid detection kits are in short supply. In this method, several samples are tested together as one bunch. If the test result of the bunch is negative, none of the cases in the bunch is infected with the novel coronavirus. If the test result of the bunch is positive, the samples are tested one by one to determine which cases are infected.

Findings: If the infection rate is extremely low, the expected number of cases that can be tested with a given number of detection kits is far larger under the pool testing method than under one-by-one testing. The pool testing method is effective only when the infection rate is less than 0.3078; the higher the infection rate, the smaller the optimal sample size in one bunch. If N samples are tested by the pool testing method with a bunch size of G, the number of detection kits required lies in the interval (N/G, N).

Originality/value: This research proves that the pool testing method is suitable not only when detection kits are scarce but also for overall or sampling-based detection of a large population. More importantly, it calculates the optimal sample size in one bunch for different infection rates. Additionally, a multiple sampling approach is proposed in which the whole testing process is divided into several rounds with different bunch sizes, and the actual infection rate is estimated with increasing precision by sampling inspection in each round.
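The findings above follow the logic of two-stage (Dorfman-style) pooled testing. The Python sketch below is not the authors' exact model; it simply computes, for an assumed infection rate p and bunch size G, the expected number of tests per sample under a standard two-stage scheme (one pooled test, then G individual tests if the pool is positive) and searches for the G that minimizes it. The break-even infection rate it produces is close to, but not necessarily identical with, the 0.3078 threshold reported in the abstract.

```python
# Minimal sketch of two-stage (Dorfman-style) pooled testing, assuming
# independent infections with rate p. Not the paper's exact model.

def expected_tests_per_sample(p: float, G: int) -> float:
    """Expected tests per sample with bunch size G.

    One pooled test is always used; if the pool is positive
    (probability 1 - (1 - p)**G), all G samples are retested individually.
    """
    return 1.0 / G + (1.0 - (1.0 - p) ** G)

def optimal_bunch_size(p: float, max_G: int = 100) -> tuple[int, float]:
    """Bunch size G >= 2 minimizing the expected number of tests per sample."""
    best = min(range(2, max_G + 1), key=lambda G: expected_tests_per_sample(p, G))
    return best, expected_tests_per_sample(p, best)

if __name__ == "__main__":
    for p in (0.001, 0.01, 0.05, 0.1, 0.3):
        G, cost = optimal_bunch_size(p)
        # Pooling helps only while the expected tests per sample stay below 1.
        print(f"p={p:.3f}  optimal G={G:3d}  expected tests/sample={cost:.3f}")
```

Running the sketch shows the optimal G shrinking as p grows, consistent with the Findings, and the expected tests per sample exceeding 1 once p passes roughly 0.3, where pooling stops paying off.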


Symmetry
2021, Vol. 13 (6), pp. 926
Author(s): Eliardo Costa, Manoel Santos-Neto, Víctor Leiva

The fatigue-life or Birnbaum–Saunders distribution is an asymmetrical model that has been widely applied in several areas of science and mainly in reliability. Although diverse methodologies related to this distribution have been proposed, the problem of determining the optimal sample size when estimating its mean has not yet been studied. In this paper, we derive a methodology to determine the optimal sample size under a decision-theoretic approach. In this approach, we consider symmetric and asymmetric loss functions for point and interval inference. Computational tools in the R language were implemented to use this methodology in practice. An illustrative example with real data is also provided to show potential applications.
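The paper's computational tools are implemented in R. As a rough illustration of the general decision-theoretic idea only (not the authors' Birnbaum–Saunders methodology), the hypothetical Python sketch below balances a per-observation sampling cost against an expected posterior loss for estimating a mean under a simple normal approximation, and selects the sample size that minimizes the total cost.

```python
# Hypothetical sketch of a decision-theoretic sample-size choice (not the
# authors' Birnbaum-Saunders methodology). Assumptions: squared-error loss
# for the mean, a normal approximation in which the expected posterior
# variance after n observations is sigma2 / (n + n0), a loss weight k, and
# a cost c per observation.

def total_cost(n: int, sigma2: float, n0: float, k: float, c: float) -> float:
    """Sampling cost plus expected posterior loss for sample size n."""
    return c * n + k * sigma2 / (n + n0)

def optimal_sample_size(sigma2: float, n0: float, k: float, c: float,
                        n_max: int = 10_000) -> int:
    """Sample size n in [1, n_max] minimizing the total cost."""
    return min(range(1, n_max + 1),
               key=lambda n: total_cost(n, sigma2, n0, k, c))

if __name__ == "__main__":
    # Illustrative numbers only.
    n_opt = optimal_sample_size(sigma2=4.0, n0=1.0, k=1000.0, c=0.5)
    print("optimal n:", n_opt,
          "total cost:", round(total_cost(n_opt, 4.0, 1.0, 1000.0, 0.5), 3))
```

The same trade-off structure carries over when the loss is asymmetric or interval-based, as in the paper, but the expected posterior loss term then has to be derived for the Birnbaum–Saunders model rather than the normal stand-in used here.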


2021, Vol. 48 (3), pp. 3037-3045
Author(s): Kentaro Inoue, Bernard E. Sietman, Stephen E. McMurray, J. Scott Faiman, David T. Zanatta

Author(s): Agnese Barbensi, Dimos Goundaroulis

Recent studies classify the topology of proteins by analysing the distribution of their projections using knotoids. The approximation of this distribution depends on the number of projection directions that are sampled. Here, we investigate the relation between knotoids differing only by small perturbations of the direction of projection. Since such knotoids are connected by at most a single forbidden move, we characterize forbidden moves in terms of equivariant band attachment between strongly invertible knots and of strand passages between θ-curves. This allows for the determination of the optimal sample size needed to produce a well-approximated knotoid distribution. Based on that and on topological properties of the distribution, we probe the depth of knotted proteins with the trefoil as the predominant knot type without using subchain analysis.
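As generic context for how such a projection distribution is approximated in practice (this is an illustration, not the authors' knotoid machinery), the Python sketch below samples projection directions uniformly on the sphere and tracks how the empirical distribution of a per-direction classification stabilizes as the number of sampled directions grows; `classify_projection` is a hypothetical placeholder standing in for an actual knotoid computation.

```python
# Generic sketch: estimate how many projection directions are needed before
# an empirical distribution of projection classes stabilizes. The classifier
# below is a hypothetical placeholder, NOT an actual knotoid computation.
import math
import random
from collections import Counter

def random_direction() -> tuple[float, float, float]:
    """Uniform random unit vector on the sphere (normalized Gaussian trick)."""
    x, y, z = (random.gauss(0, 1) for _ in range(3))
    norm = math.sqrt(x * x + y * y + z * z)
    return x / norm, y / norm, z / norm

def classify_projection(direction) -> str:
    """Placeholder for classifying the projection along `direction`."""
    return "class_A" if direction[2] > 0.5 else "class_B"

def empirical_distribution(n: int) -> dict:
    counts = Counter(classify_projection(random_direction()) for _ in range(n))
    return {k: v / n for k, v in counts.items()}

def total_variation(p: dict, q: dict) -> float:
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

if __name__ == "__main__":
    previous = empirical_distribution(100)
    for n in (200, 400, 800, 1600, 3200):
        current = empirical_distribution(n)
        # When successive estimates stop moving, the sample size is "enough".
        print(n, round(total_variation(previous, current), 4))
        previous = current
```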


Author(s): Pavel Ukrainskiy

A promising fast method for estimating land cover areas from satellite imagery is random point sampling. This method yields area values without spatially continuous mapping of the territory, and the accuracy of the area estimate depends on the sample size. The present work describes a method for empirically finding the optimal sample size. To use it, a key site with an existing reference land cover map must be selected. For the key site, samples of different sizes are generated repeatedly, land cover areas are estimated from each sample, and comparison of the estimated areas with the reference areas gives the measurement error. Analysis of the mean and range of the errors across sample sizes identifies the point at which the error stops decreasing appreciably as the sample size grows; this sample size is optimal. We tested the proposed method on the Kalach Upland. Sample sizes from 100 to 3000 points per key site were analyzed, increasing in steps of 100 points, and 1000 samples of each size were created. We then analyzed the effect of sample size on the overall relative error of the area estimates. The analysis showed that for the investigated key site the optimal sample size is 1000 points (1.1 points/km²). At this sample size, the overall relative error in determining areas was 4.0% on average, with a maximum of 9.9%. Similar accuracy can be expected at the same sample size for other uplands in the forest-steppe and steppe zones of the East European Plain.
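The procedure described above can be mimicked on synthetic data. The Python sketch below (an illustration under assumed inputs, not the author's GIS workflow) treats a reference land cover map as a grid of class labels, estimates class area shares from random point samples of increasing size, and reports how the mean and maximum overall relative error behave as the sample size grows.

```python
# Sketch of the empirical sample-size search described above, run on a
# synthetic "reference" land cover grid. Illustrative only.
import random
from collections import Counter

random.seed(0)

# Synthetic reference map: a 200 x 200 grid with three land cover classes.
WIDTH, HEIGHT = 200, 200
reference = [["forest" if x < 80 else "crop" if y < 120 else "grass"
              for x in range(WIDTH)] for y in range(HEIGHT)]

def true_shares(grid) -> dict:
    counts = Counter(cell for row in grid for cell in row)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def sampled_shares(grid, n_points: int) -> dict:
    """Estimate class area shares from n random sample points."""
    counts = Counter(grid[random.randrange(HEIGHT)][random.randrange(WIDTH)]
                     for _ in range(n_points))
    return {k: v / n_points for k, v in counts.items()}

def overall_relative_error(estimate, truth) -> float:
    """Mean relative error of the estimated shares over all classes."""
    return sum(abs(estimate.get(k, 0.0) - v) / v for k, v in truth.items()) / len(truth)

if __name__ == "__main__":
    truth = true_shares(reference)
    for n in range(100, 3001, 100):           # sample sizes 100, 200, ..., 3000
        errors = [overall_relative_error(sampled_shares(reference, n), truth)
                  for _ in range(200)]         # repeated samples per size
        print(n, round(sum(errors) / len(errors), 4), round(max(errors), 4))
```

Plotting the two error columns against n and picking the point where they flatten out reproduces, on synthetic data, the stopping rule the abstract describes.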


2020, Vol. 40 (6), pp. 797-814
Author(s): Michael Fairley, Lauren E. Cipriano, Jeremy D. Goldhaber-Fiebert

Purpose. Health economic evaluations that include the expected value of sample information support implementation decisions as well as decisions about further research. However, just as decision makers must consider portfolios of implementation spending, they must also identify the optimal portfolio of research investments.

Methods. Under a fixed research budget, a decision maker determines which studies to fund; additional budget allocated to one study to increase the study sample size implies less budget available to collect information to reduce decision uncertainty in other implementation decisions. We employ a budget-constrained portfolio optimization framework in which the decisions are whether to invest in a study and at what sample size. The objective is to maximize the sum of the studies' population expected net benefit of sampling (ENBS). We show how to determine the optimal research portfolio and study-specific levels of investment. We demonstrate our framework with a stylized example to illustrate solution features and a real-world application using 6 published cost-effectiveness analyses.

Results. Among the studies selected for nonzero investment, the optimal sample size occurs at the point at which the marginal population ENBS divided by the marginal cost of additional sampling is the same for all studies. Compared with standard ENBS optimization without a research budget constraint, optimal budget-constrained sample sizes are typically smaller but allow more studies to be funded.

Conclusions. The budget constraint for research studies directly implies that the optimal sample size for additional research is not the point at which the ENBS is maximized for individual studies. A portfolio optimization approach can yield higher total ENBS. Ultimately, there is a maximum willingness to pay for incremental information that determines optimal sample sizes.
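To make the equal-marginal-return condition concrete, the Python sketch below solves a toy version of a budget-constrained research portfolio by greedy marginal allocation: each sample-size increment goes to the study whose next step adds the most ENBS per dollar, stopping when the budget is exhausted or a dollar of sampling buys less than a dollar of ENBS. The ENBS curves and costs are invented for illustration and are not the article's published analyses or its exact algorithm.

```python
# Toy budget-constrained research portfolio: greedily allocate sample-size
# increments to the study with the highest marginal ENBS per unit cost.
# The ENBS curves and costs below are invented for illustration only.
import math

# Per-study parameters: concave population ENBS(n) = a * (1 - exp(-n / b)),
# a fixed start-up cost, and a cost per enrolled participant (hypothetical).
STUDIES = {
    "study_1": {"a": 5_000_000, "b": 400, "fixed": 100_000, "per_n": 500},
    "study_2": {"a": 2_000_000, "b": 150, "fixed": 50_000, "per_n": 800},
    "study_3": {"a": 8_000_000, "b": 900, "fixed": 200_000, "per_n": 300},
}
STEP = 50           # sample-size increment per allocation step
BUDGET = 1_500_000  # total research budget

def enbs(params, n):
    return params["a"] * (1.0 - math.exp(-n / params["b"])) if n > 0 else 0.0

def cost(params, n):
    return params["fixed"] + params["per_n"] * n if n > 0 else 0.0

def allocate(budget):
    sizes = {name: 0 for name in STUDIES}
    spent = 0.0
    while True:
        best, best_ratio = None, 0.0
        for name, p in STUDIES.items():
            n = sizes[name]
            extra_cost = cost(p, n + STEP) - cost(p, n)
            if spent + extra_cost > budget:
                continue
            gain = enbs(p, n + STEP) - enbs(p, n)
            if gain > 0 and gain / extra_cost > best_ratio:
                best, best_ratio = name, gain / extra_cost
        if best is None or best_ratio <= 1.0:  # stop: a dollar buys < $1 of ENBS
            break
        spent += cost(STUDIES[best], sizes[best] + STEP) - cost(STUDIES[best], sizes[best])
        sizes[best] += STEP
    return sizes, spent

if __name__ == "__main__":
    sizes, spent = allocate(BUDGET)
    for name, n in sizes.items():
        print(name, "n =", n, "ENBS =", round(enbs(STUDIES[name], n)))
    print("budget spent:", round(spent))
```

At the greedy optimum, funded studies end up with (approximately) equal marginal ENBS per dollar, which is the equalization condition stated in the Results; studies whose first increment never clears the threshold receive no investment.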

