Dynamic and Statistical Operability of an Experimental Batch Process

Processes ◽  
2021 ◽  
Vol 9 (3) ◽  
pp. 441
Author(s):  
Willy R. de Araujo ◽  
Fernando V. Lima ◽  
Heleno Bispo

The operability approach has traditionally been applied to measure the ability of a continuous process to achieve desired specifications, given physical or design restrictions and considering expected disturbances at steady state. This paper introduces a novel dynamic operability analysis for batch processes based on classical operability concepts. In this analysis, all sets and statistical region delimitations are quantified using mathematical operations involving polytopes at every time step. A statistical operability analysis centered on multivariate correlations is employed for the first time to evaluate desired output sets during the transition, which serve as references to be followed to achieve the final process specifications. A dynamic design space for a batch process is thus generated through this analysis and can be used in practice to guide process operation. A probabilistic expected disturbance set is also introduced, whereby the disturbances are described by pseudorandom variables, so that disturbance scenarios other than the worst-case scenarios considered in traditional operability methods can be addressed. A case study corresponding to a pilot batch unit is used to illustrate the developed methods and to build a process digital twin that generates large datasets by running an automated digital experimentation strategy. As the primary data source of the analysis is built in a time-series database, the developed framework can be fully integrated into a plant information management system (PIMS) and an Industry 4.0 infrastructure.
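The set-based idea behind this kind of analysis can be sketched in a few lines. The model below is a hypothetical one-state linear batch model (coefficients, bounds, and disturbance statistics are all illustrative assumptions, not values from the paper): inputs are sampled from an available input set and disturbances from a probabilistic expected disturbance set, and the extremes of the resulting outputs approximate the one-step achievable output set.

```python
import random

# Minimal sketch (hypothetical model): estimate the achievable output set of a
# single-state batch model x[k+1] = a*x[k] + b*u[k] + d[k] by sampling the
# available input set (AIS) and a probabilistic expected disturbance set (EDS).
random.seed(0)

a, b = 0.9, 0.5          # assumed model coefficients
x0 = 1.0                 # initial state
u_lo, u_hi = -1.0, 1.0   # available input set (AIS)

def achievable_outputs(n_samples=10000, d_sigma=0.05):
    """Sample inputs uniformly and disturbances from a Gaussian EDS, then
    return the min/max of the one-step achievable output set (AOS)."""
    outs = []
    for _ in range(n_samples):
        u = random.uniform(u_lo, u_hi)
        d = random.gauss(0.0, d_sigma)   # pseudorandom disturbance
        outs.append(a * x0 + b * u + d)
    return min(outs), max(outs)

lo, hi = achievable_outputs()
print(f"estimated AOS at k=1: [{lo:.2f}, {hi:.2f}]")
```

Repeating this at every time step, and intersecting with the desired output set, traces out a dynamic design space of the kind the paper constructs with polytope operations.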

2001 ◽  
Vol 64 (12) ◽  
pp. 2083-2087 ◽  
Author(s):  
R. Y. MURPHY ◽  
L. K. DUNCAN ◽  
E. R. JOHNSON ◽  
M. D. DAVIS ◽  
R. E. WOLFE ◽  
...  

Fully cooked chicken breast strips were surface inoculated to contain 9 log10 (CFU/g) Salmonella Senftenberg or Listeria innocua. The inoculated products were vacuum packaged in 0.2-mm-thick barrier bags (241 by 114 mm) and then steam pasteurized at 88°C for 26 to 40 min in a continuous process or for 33 to 41 min in a batch process. After the treatments, the products were analyzed for surviving Salmonella or Listeria. Models were developed to correlate the survival rates of Salmonella and Listeria with cooking time for both the continuous and batch processes. A cooking time of 34 min was needed to achieve a 7-log reduction in the batch process; achieving the same log reduction required a cooking time 6 min longer in the batch process than in the continuous process. The results from this study will be useful for processors evaluating postcooking treatment procedures for ready-to-eat meat products.
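The standard log-linear survival kinetics underlying such time-versus-reduction models can be sketched as follows. The D-value here is back-calculated from the 34-min / 7-log batch figure purely for illustration; it is not a parameter fitted in the study.

```python
# Minimal sketch of the log-linear survival model commonly used for thermal
# inactivation: log10(N/N0) = -t/D, where D is the decimal reduction time
# (minutes per 1-log reduction at a constant temperature).
def required_time(log_reduction, D):
    """Cooking time needed for a given log10 reduction at constant D."""
    return log_reduction * D

def survivors(n0_log, t, D):
    """Remaining log10 (CFU/g) after heating for time t."""
    return max(n0_log - t / D, 0.0)

D_batch = 34 / 7   # illustrative D-value implied by 7 logs in 34 min
print(required_time(7, D_batch))    # ~34 min for a 7-log reduction
print(survivors(9, 34, D_batch))    # ~2 log10 (CFU/g) remaining from 9 logs
```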


Processes ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 1074
Author(s):  
Federico Zuecco ◽  
Matteo Cicciotti ◽  
Pierantonio Facco ◽  
Fabrizio Bezzo ◽  
Massimiliano Barolo

Troubleshooting batch processes at a plant-wide level requires first finding the unit causing the fault, and then understanding why the fault occurs in that unit. Whereas case studies discussing the latter issue abound in the literature, little attention has been given so far to the former, which is complex for several reasons: the processing units are often operated in a non-sequential way, with unusual series-parallel arrangements; holding vessels may be required to compensate for lack of production capacity, and reacting phenomena can occur in these vessels; and the evidence of batch abnormality may be available only from the end unit and at the end of the production cycle. We propose a structured methodology to assist the troubleshooting of plant-wide batch processes in data-rich environments where multivariate statistical techniques can be exploited. Namely, we first analyze the last unit, wherein the fault manifests itself, and we then step back across the units through the process flow diagram (according to the manufacturing recipe) until the fault can no longer be detected by the available field sensors. That enables us to isolate the unit from which the fault originates. Interrogation of multivariate statistical models for that unit, coupled with engineering judgement, allows identification of the most likely root cause of the fault. We apply the proposed methodology to troubleshoot a complex industrial batch process that manufactures a specialty chemical, where productivity was originally limited by unexplained variability of the final product quality. Correction of the fault allowed a significant increase in productivity.
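The backward, unit-by-unit isolation loop can be sketched as below. All data are invented, and a simple univariate z-score stands in for the multivariate (e.g., PCA-based) monitoring statistic the authors use; the structure of the walk is the point.

```python
# Minimal sketch (invented data): walk the recipe from the last unit toward
# the first, and stop at the first unit whose monitoring statistic no longer
# flags the faulty batch. The last flagged unit is the fault's origin.
def flags_fault(measurement, mean, std, limit=3.0):
    """True if the unit's reading lies outside the control limit."""
    return abs(measurement - mean) / std > limit

# Units in reverse recipe order: (name, faulty-batch reading, NOC mean, NOC std)
units = [
    ("dryer",   7.20, 5.0, 0.40),   # fault visible here
    ("reactor", 3.90, 3.0, 0.20),   # fault visible here
    ("premix",  1.02, 1.0, 0.05),   # within normal operating conditions
]

origin = None
for name, x, mu, sigma in units:
    if flags_fault(x, mu, sigma):
        origin = name    # fault still detectable; keep stepping back
    else:
        break            # first "clean" unit upstream: stop the walk
print("fault originates in:", origin)
```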


2001 ◽  
Vol 73 (6) ◽  
pp. 631-631
Author(s):  
R. Oliveira ◽  
J. Peres ◽  
S. Feyo de Azevedo ◽  
M. J. Gonçalves

Cerâmica ◽  
2018 ◽  
Vol 64 (370) ◽  
pp. 176-182 ◽  
Author(s):  
C. I. Torres ◽  
N. M. Rendtorff ◽  
M. Cipollone ◽  
E. F. Aglietti ◽  
G. Suárez

Abstract Qualitative and quantitative properties of a clay-based ceramic are presented in this work. Four different shaping methods and several sintering temperatures were used to understand their influence on the final properties of a ceramic material formulated from kaolinite clay and calcined alumina. This material can be used as a structural ceramic for different applications, and there is no pre-established relation between the forming method and the final sintered properties. The forming methods used to prepare the samples were uniaxial pressing (a batch process applicable to dry samples), extrusion (a continuous process that requires moisture), slip casting (a process that allows complex ceramic ware to be shaped), and lamination (a batch process that requires moisture). Sintering temperatures ranged from 1100 to 1400 °C. To compare how the properties change with shaping method and sintering temperature, textural properties, shrinkage, porosimetry, phase composition, and mechanical strength were evaluated and analyzed. Scanning electron microscopy and microtomography were employed to analyze and compare the developed microstructures. Differences in the resulting properties are explained in terms of the developed crystalline phases and microstructure.


2015 ◽  
Vol 9 (7) ◽  
pp. 8 ◽  
Author(s):  
Tri Widjaja ◽  
Ali Altway ◽  
Arief Widjaja ◽  
Umi Rofiqah ◽  
Rr Whiny Hardiyati Erlian

One form of economic development effort for waste utilization in rural communities is to use sorghum stems to produce food-grade ethanol. Sorghum stem juice with a sugar concentration of 150 g/L was fermented using a conventional batch process and a continuous process with cells immobilized in κ-carrageenan as the supporting matrix. The microorganisms used were mutated Zymomonas mobilis, a mixture of Saccharomyces cerevisiae and Pichia stipitis, and a mixture of mutated Zymomonas mobilis and Pichia stipitis. Ethanol in the fermentation broth was separated in a packed distillation column, with steel wool used as the packing. The distillate, which still contained water and other impurities, was fed into a molecular sieve for dehydration and then into an activated carbon adsorption column to remove the remaining impurities and meet the food-grade ethanol specification. In batch fermentation, the combination of Saccharomyces cerevisiae and Pichia stipitis produced the best result: an ethanol concentration of 12.07%, with a yield of 63.49% and a productivity of 1.06 g/L·h. In continuous fermentation, the best result, an ethanol concentration of 9.02% with a yield of 47.42% and a productivity of 174.27 g/L·h, was also obtained with the combination of Saccharomyces cerevisiae and Pichia stipitis. This combination produced a higher ethanol concentration, yield, and productivity than the other microorganisms. Distillation, molecular sieve dehydration, and adsorption were quite successful in generating sufficiently pure ethanol with a relatively low level of impurities.
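The yield and productivity figures quoted above follow standard definitions, sketched below. The 0.511 g-ethanol/g-sugar factor is the theoretical maximum from glucose-to-ethanol stoichiometry; the example inputs are illustrative assumptions, not values taken from the study.

```python
# Minimal sketch of the standard fermentation performance metrics.
THEORETICAL_YIELD = 0.511  # g ethanol per g sugar (stoichiometric maximum)

def ethanol_yield(ethanol_g_per_L, sugar_g_per_L):
    """Yield as a fraction of the theoretical stoichiometric maximum."""
    return ethanol_g_per_L / (THEORETICAL_YIELD * sugar_g_per_L)

def productivity(ethanol_g_per_L, time_h):
    """Volumetric productivity in g/(L*h)."""
    return ethanol_g_per_L / time_h

# Illustrative batch run: ~48.7 g/L ethanol from 150 g/L sugar in ~46 h
print(round(ethanol_yield(48.7, 150), 3))   # fraction of theoretical yield
print(round(productivity(48.7, 46), 2))     # g/(L*h)
```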


2021 ◽  
Vol 905 (1) ◽  
pp. 012122
Author(s):  
H P Pramana ◽  
S Hastjarjo ◽  
Sudarmo

Abstract This study explains millennial and Gen-Z attitudes, perspectives, and behaviors in implementing the eco-office concept. As a qualitative study, it uses primary data from semi-structured interviews and secondary data collected from reports, public/private publications, and census results, with the Yin case study model used for data analysis. The results reveal that the biggest challenge in implementing new policies is individual mindset. Leaders, as change agents, play an essential role in delivering messages that prompt people to act pro-environmentally. A reward system will be very effective, especially one that provides the satisfaction of self-actualization. The findings of this study serve as input for policymakers. For example, the results show that social media plays a crucial role in increasing environmental awareness. In addition, simple shifts, such as using electronic media at work, will cut the chain of paper files, making work more effective and favored by young people. They understand the consequences of their actions on the environment and have the education, motivation, and social awareness to participate in the green movement. However, their beliefs and actions are not fully integrated, and investigating and understanding their behavior and unique needs in the workplace will lead employees to integrate and succeed together in supporting the environment.


1992 ◽  
Vol 8 (02) ◽  
pp. 77-88
Author(s):  
S. Madden ◽  
H. H. Vanderveldt ◽  
J. Jones

Computer Aided Process Planning (CAPP) integrated with Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) will form the basis of engineering/planning systems of the future. These systems will have the capability to operate in a paperless environment and provide highly optimized process operation plans. The WELDEXCELL System is a prototype of such a system for welding in shipyards. The paper discusses three significant computer technology advances which have been incorporated into the WELDEXCELL prototype. The first is a computerized system for allowing multiple knowledge sources (expert systems, humans, data systems, etc.) to work together to solve a common problem (the weld plan). This system is called a "blackboard." The second is a methodology for the blackboard to communicate with the human user. This interface includes fully interactive graphics integrated with CAD, as well as data searches and automatic completion of routine engineering tasks. The third is artificial neural networks (ANNs), which are based on biological neural networks (such as the human brain) and which can perform reasoning tasks on difficult problems. ANNs offer the opportunity to model highly complex multivariable and nonlinear processes (for example, welding) and provide a means for an engineer to quantitatively assess the process and its operation.
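The kind of nonlinear multivariable mapping described above can be illustrated with a minimal feedforward network. Everything here is invented for illustration: the weights, the choice of inputs (say, weld current and travel speed), and the output (say, bead width) are assumptions, not the paper's trained model.

```python
import math

# Minimal sketch (made-up weights) of a feedforward network: two inputs
# mapped through one hidden tanh layer to a single linear output.
def forward(x, W1, b1, W2, b2):
    """One hidden tanh layer, linear output neuron."""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return sum(w * hi for w, hi in zip(W2, h)) + b2

W1 = [[0.5, -0.3], [0.2, 0.8]]   # hidden-layer weights (illustrative)
b1 = [0.1, -0.1]                 # hidden-layer biases
W2 = [1.2, -0.7]                 # output weights
b2 = 0.05                        # output bias

# e.g. normalized weld current = 1.0, normalized travel speed = 0.5
print(forward([1.0, 0.5], W1, b1, W2, b2))
```

In practice the weights would be fitted to welding data by a training procedure; the forward pass above only shows why such a network can represent nonlinear input-output relationships.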


Author(s):  
João Sousa ◽  
José Ferreira ◽  
Carlos Lopes ◽  
João Sarraipa ◽  
João Silva

Abstract The continuous drive for workplace safety, customer satisfaction, and increasing profits has led companies to numerous manufacturing and management strategies. One of the most promising strategies nowadays is Zero Defects, which focuses on the elimination of defective parts in manufacturing processes. The benefits of Zero Defects implementation in the manufacturing industry are mainly related to the reduction of scrap material and of everything that adds no value to the product. The result is a reduction of the company's expenditure on dealing with defective products. Although the concept is not new, the practical application of such strategies was limited by technological constraints and high investment costs. With the Industry 4.0 evolution, some Zero Defects concepts have become more accessible thanks to the availability of sensors and data-related techniques such as Machine Learning and Big Data, although a lot of work is still required on component integration to enhance the capability of the heterogeneous technologies. In the steel tube industry considered here, the quality of the tubes is evaluated by sampling and relies on the expertise of the operators checking for nonconformities. When a defect is detected, the process parameters are adjusted based on prior experience. However, since this is a continuous process, the delay between the appearance of a defect and its detection leads to a considerable amount of scrap material. In the worst case, the defective product can be delivered to the customer, damaging the customer's trust and leading to additional replacement costs. This paper addresses the application of the Zero Defects approach to the steel tube manufacturing industry. The approach is part of the Zero Defects Manufacturing Platform EU project, which is based on a Service Oriented Architecture and a microservices approach capable of building, running, and managing specific use-case-oriented software applications called zApps.
The Zero Defects methodology used to design a zApp based on key criteria for the steel tube industry is described. Additionally, the envisioned zApps to monitor all produced steel tubes during the manufacturing process are detailed. The inspection system uses a scanning camera and a laser profile scanner to capture steel tube defects during manufacturing and prior to packaging. Although the ultimate goal is to eliminate the cause of the defective products, the objective of the zApp is to increase the detection rate of defective products based on industry standards and reduce the amount of generated scrap material.
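The in-line monitoring idea can be sketched as a simple tolerance check applied to every measurement, so a defect is flagged immediately rather than discovered by downstream sampling. The feature names and tolerance values below are hypothetical, not taken from the project.

```python
# Minimal sketch (hypothetical tolerances): check each laser-profile
# measurement of a steel tube against dimensional tolerances in-line.
TOL = {
    "diameter_mm": (50.0, 0.3),  # (nominal, +/- tolerance), illustrative
    "wall_mm":     (2.0, 0.1),
}

def inspect(profile):
    """Return the list of out-of-tolerance features for one measurement."""
    defects = []
    for feature, (nominal, tol) in TOL.items():
        if abs(profile[feature] - nominal) > tol:
            defects.append(feature)
    return defects

print(inspect({"diameter_mm": 50.1, "wall_mm": 2.25}))  # wall out of tolerance
```

In a microservices setting such a check would run as its own service, consuming the scanner's measurement stream and publishing defect events to the rest of the zApp.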


2019 ◽  
Vol 42 (6) ◽  
pp. 1204-1214
Author(s):  
Wei Guo ◽  
Tianhong Pan ◽  
Zhengming Li ◽  
Shan Chen

Multi-model/multi-phase modeling algorithms have been widely used to monitor product quality in complicated batch processes. Most multi-model/multi-phase modeling methods hinge on the structure of a linearly separable space or a combination of different sub-spaces. However, it is impossible to accurately assign samples in overlapping regions to different operating sub-spaces using unsupervised learning techniques. A Gaussian mixture model (GMM) using temporal features is proposed in this work. First, the number of sub-models is estimated using the maximum-interval process trend analysis algorithm. Then, the GMM parameters, constrained by the temporal values, are identified using the expectation maximization (EM) algorithm, which minimizes confusion in the overlapping regions of different Gaussian components. A numerical example and a penicillin fermentation process demonstrate the effectiveness of the proposed algorithm.
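The building block the paper extends is plain EM for a Gaussian mixture; a minimal one-dimensional, two-component version on synthetic data is sketched below. The temporal constraints that distinguish the proposed method are not included here: this shows only the unconstrained E-step/M-step cycle.

```python
import math, random

# Minimal sketch: unconstrained EM for a 1-D two-component Gaussian mixture
# on synthetic data drawn from two well-separated "phases".
random.seed(1)
data = [random.gauss(0.0, 1.0) for _ in range(300)] + \
       [random.gauss(5.0, 1.0) for _ in range(300)]

def pdf(x, m, s):
    """Univariate Gaussian density."""
    return math.exp(-(x - m) ** 2 / (2 * s ** 2)) / (s * math.sqrt(2 * math.pi))

mu, sigma, pi = [0.5, 4.0], [1.5, 1.5], [0.5, 0.5]   # initial guesses
for _ in range(50):
    # E-step: responsibility of each component for each data point
    r = [[pi[k] * pdf(x, mu[k], sigma[k]) for k in range(2)] for x in data]
    r = [[rk / sum(row) for rk in row] for row in r]
    # M-step: re-estimate weights, means, and standard deviations
    for k in range(2):
        nk = sum(row[k] for row in r)
        pi[k] = nk / len(data)
        mu[k] = sum(row[k] * x for row, x in zip(r, data)) / nk
        sigma[k] = math.sqrt(sum(row[k] * (x - mu[k]) ** 2
                                 for row, x in zip(r, data)) / nk)

print([round(m, 1) for m in mu])   # means should approach 0.0 and 5.0
```

When two phases overlap in value, responsibilities near 0.5 make this assignment ambiguous; constraining the mixture with the time index, as the paper does, is what resolves that ambiguity.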

