cost information
Recently Published Documents


TOTAL DOCUMENTS: 608 (last five years: 182)
H-INDEX: 33 (last five years: 7)

2022 · Vol 9
Author(s): Houyin Long, Hong Zeng, Xinyi Lin

The Chinese government has adopted many policies to save energy and electricity in the chemical industry by improving technology and reforming its electricity market. However, the improved electricity efficiency and the electricity reform may indirectly reduce the expected energy and electricity savings by lowering the effective electricity price and the marginal cost of electricity services. To analyze these issues, this paper employs the Morishima Elasticity of Substitution derived from the electricity cost share equation, which is estimated by the dynamic OLS (DOLS) method. The results show that: 1) a rebound effect exists in the Chinese chemical industry, and it is quite large because the electricity price is controlled by the government; 2) the reform of the electricity market reduces the rebound effect to 73.85%, as the electricity price begins to reflect cost information to some extent; 3) there is still considerable room for the reform to improve, and the rebound effect could be reduced further once the electricity price is adjusted to transmit market information more accurately. To save electricity and reduce the rebound effect in the chemical industry, policy implications are provided from the perspectives of improved energy efficiency and the electricity pricing mechanism.
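For orientation, the quantities named in the abstract can be written in a standard textbook form; the following is a sketch of the usual translog cost share framework, not necessarily the exact specification estimated in the paper:

\varepsilon_{ij} = \frac{\gamma_{ij} + S_i S_j}{S_i}, \qquad \varepsilon_{jj} = \frac{\gamma_{jj} + S_j^2 - S_j}{S_j}, \qquad MES_{ij} = \varepsilon_{ij} - \varepsilon_{jj}

where S_i is the cost share of input i, \gamma_{ij} are the share equation parameters (here estimated by DOLS), and \varepsilon_{ij} are price elasticities. On this reading, a rebound effect of 73.85% means that roughly 73.85% of the expected electricity savings from an efficiency improvement are offset by increased electricity use, i.e. \text{rebound} = 1 - \text{actual savings}/\text{expected savings}.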


2022 · Vol 6 (1) · pp. 18
Author(s): James Clarke, Alistair McIlhagger, Dorian Dixon, Edward Archer, Glenda Stewart, et al.

Lack of cost information is a barrier to the acceptance of 3D woven preforms as reinforcements for composite materials, compared with 2D preforms. A parametric, resource-based technical cost model (TCM) was developed for 3D woven preforms based on a novel relationship equating manufacturing time and 3D preform complexity. Manufacturing time, and therefore cost, was found to scale with complexity for seventeen bespoke manufactured 3D preforms. Two sub-models were derived, one for a Weavebird loom and one for a Jacquard loom. For each loom, there was a strong correlation between preform complexity and manufacturing time. For a large, highly complex preform, the Jacquard loom is more efficient, so the preform cost will be much lower than for the Weavebird. Provided production is continuous, learning, either by human agency or by an autonomous loom control algorithm, can reduce preform cost for one or both looms to a commercially acceptable level. The TCM framework could incorporate appropriate learning curves with digital twin/multivariate analysis so that the cost per preform of bespoke 3D woven fabrics for customised products with low production rates may be predicted with greater accuracy. A more accurate model could highlight resources such as tooling, labour and material for targeted cost reduction.
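As a rough illustration of how a resource-based TCM of this kind can be structured, the sketch below assumes a linear time-complexity relationship and invented per-loom parameters (setup_h, h_per_complexity, rate); it is not the authors' fitted model.

```python
def preform_cost(complexity, loom):
    """Estimate the cost of one 3D woven preform on a given loom."""
    # Hypothetical per-loom parameters: fixed setup time [h], weaving time per
    # unit of complexity [h], and a combined labour + machine rate [GBP/h].
    params = {
        "weavebird": {"setup_h": 4.0, "h_per_complexity": 0.8, "rate": 60.0},
        "jacquard": {"setup_h": 8.0, "h_per_complexity": 0.3, "rate": 75.0},
    }
    p = params[loom]
    time_h = p["setup_h"] + p["h_per_complexity"] * complexity
    material = 120.0  # hypothetical material cost per preform [GBP]
    return time_h * p["rate"] + material

# For a large, highly complex preform, the Jacquard loom's lower time per unit
# of complexity outweighs its higher setup time and hourly rate, mirroring the
# qualitative finding reported above.
for loom in ("weavebird", "jacquard"):
    print(loom, round(preform_cost(complexity=50, loom=loom), 2))
```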


2022 · Vol 12 (1)
Author(s): Mehdi Rezaee, Khosro Keshavarz, Sadegh Izadi, Abdosaleh Jafari, Ramin Ravangard

Abstract Background Multiple Sclerosis (MS) is a chronic debilitating disease that imposes a heavy socioeconomic burden on societies. This study aimed to determine the economic burden of MS on patients using the first-line (CinnoVex and ReciGen) and second-line (Fingolimod and Natalizumab) drug therapies. Methods This cost-of-illness study was an economic evaluation carried out as cross-sectional research in 2019 in southern Iran. A total of 259 patients were enrolled across the two lines of drug therapy (178 patients on the first line and 81 on the second). The prevalence-based approach was used to collect cost information, and the bottom-up approach was used to calculate the costs from the societal perspective. The human capital approach was applied to calculate indirect costs. A researcher-made data collection form was used to gather the required data, which were obtained from the patients' medical records and insurance invoices as well as their self-reports or those of their companions. Results The results showed that the annual costs of MS per patient in the first and second lines of drug therapy were $1,919 and $4,082 in purchasing power parity (PPP) terms, respectively, and $2,721 PPP overall in 2019. The highest mean costs in both lines were direct medical costs, of which purchasing the main medicines accounted for the largest share. Conclusion Considering these findings, and in order to reduce the burden of the disease, the following suggestions are offered: providing the facilities needed to produce MS drugs domestically; distributing neurologists properly and equitably; expanding the provision of home care services; and using Internet-based technologies, such as WhatsApp, to follow up on MS patients' treatment.
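A minimal sketch of the bottom-up, societal-perspective cost aggregation described in the Methods is given below; the cost categories and values are hypothetical, since the study's actual figures were compiled from medical records, insurance invoices and self-reports.

```python
def annual_cost_per_patient(direct_medical, direct_nonmedical,
                            lost_work_days, daily_wage):
    """Bottom-up annual cost per patient from the societal perspective."""
    indirect = lost_work_days * daily_wage  # human capital approach
    return direct_medical + direct_nonmedical + indirect

# Hypothetical first-line and second-line patient profiles (PPP dollars).
first_line = annual_cost_per_patient(direct_medical=1400, direct_nonmedical=200,
                                     lost_work_days=20, daily_wage=15)
second_line = annual_cost_per_patient(direct_medical=3300, direct_nonmedical=300,
                                      lost_work_days=30, daily_wage=15)
print(first_line, second_line)
```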


2022 · Vol 2146 (1) · pp. 012004
Author(s): Chen Chen, Kankan Chen, Xiaoli Chen, Xuemei Zhu, Ye Ke

Abstract With the reform of China's power system, power transmission and transformation projects (hereinafter referred to as PTATP) are gradually developing towards integration, informatization, large scale and systematization. The traditional approach to project costing can therefore no longer meet society's needs, which calls for project costing based on big data (hereinafter referred to as BD) technology. Through an information platform (hereinafter referred to as IPF), a large amount of information can be collected, including a policies and regulations database, a labour and machine price information database, a project cost index database and an industry information database, which provides important support for project costing. Project cost informatization will solve the problems of a low information sharing rate, low information value and high information cost, allowing the cost of PTATP to be determined more scientifically. Based on BD technology, the cost information data of PTATP can be collected, organized and analyzed, fully exploiting the value of the data. This paper first analyzes the main algorithms needed for project costing and then constructs a PTATP cost IPF based on BD analysis, which will provide accurate countermeasures.


2021
Author(s): Saniya Karnik, Supriya Gupta, Jason Baihly, David Saier

Abstract Recent advancements in natural language processing (NLP) and machine learning have made it possible to ingest decades of field history and heterogeneous production records. This paper proposes an analytics workflow that leverages artificial intelligence to process thousands of historical workover reports (handwritten and electronic), extract important information, learn patterns in production activity, and train machines to quantify workover impact and derive best practices for field operations. Natural language processing libraries were developed to ingest and catalog gigabytes of field data, identify rich sources of workover information, and extract workover and cost information from unstructured reports. A clustering-based architecture was developed and trained to categorize documents based on the free text describing the activities found in the reports. This machine learning model learned the pattern and context of repeating words and was able to cluster documents with similar content together, enabling the user to instantly find a category of documents, e.g. workover intervention reports. Statistical models were built to determine the return on investment from workovers and to rank them based on production improvement and payout time. Today, 80% of an oilfield expert's time can be spent manually organizing data. When processing decades of historical oilfield production data spread across both structured (production time series) and unstructured records (e.g., workover reports), experts face two major challenges: 1) how to rapidly analyze field data comprising thousands of historical records, and 2) how to use the rich historical information to generate effective insights and take the proper actions to optimize production. In this paper, we analyzed multiple field datasets in a heterogeneous file environment with 20 different file formats (PDF, Excel, and others), 2,000+ files, and production history spanning 50+ years across 2,000+ producing wells. Libraries were developed to extract files from complex folder hierarchies, and machine learning architectures assisted in finding the workover reports among the myriad documents. Information from the reports was extracted through Python libraries and optical character recognition technology to build a master data source with production history, workover and cost information. The resulting dataset was then used to analyze episodic workover activity by well and compute key performance indicators (KPIs) to identify well candidates for production enhancement. The building blocks included quantifying production upside and calculating return on investment for various workover classes. O&G companies have vast volumes of unstructured data yet use less than 1% of it to uncover meaningful insights about field operations. Our workflow describes a methodology to ingest both structured and unstructured documents, capture knowledge, quantify production upside, understand capital spending, and learn best practices in workover operations through an automated process. This process helps optimize forward operating expense (OPEX) plans with a focus on cost reduction and shortened turnaround time for decision making.
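The document-clustering step described above can be illustrated with a minimal sketch; scikit-learn's TF-IDF vectorizer and k-means are used here as stand-ins, since the paper does not name the specific libraries or model, and the report snippets are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Invented free-text snippets standing in for report contents.
documents = [
    "workover: replaced ESP pump, rig time 3 days, cost 45k",
    "drilling daily report: 12-1/4in section, mud weight 10.2 ppg",
    "workover: acid stimulation job, production uplift observed",
    "facility inspection report: separator pressure test passed",
]

# Vectorize the free text and group documents with similar vocabulary so that,
# e.g., workover reports end up in the same cluster.
X = TfidfVectorizer(stop_words="english").fit_transform(documents)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

for doc, label in zip(documents, labels):
    print(label, doc[:50])
```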


2021 · Vol 2021 · pp. 1-11
Author(s): Yi Jiang

The traditional budget quota pricing model seems incompatible with the continuous development of the market economy. A low degree of visualization, coarse resource management, difficulty in collecting construction data, lagging control work, and the separation of cost control from project management have become the main problems in current construction project cost control. This paper applies data mining technology to improve the networked system for on-site project cost verification, builds a project cost verification system that can be used for on-site verification, and constructs a cost information platform. Moreover, it makes full use of computer information technology, using local area networks and the Internet to transform the vertical transmission of cost information into horizontal transmission, so as to ensure effective communication and exchange of cost information. The case study analysis shows that the data-mining-based on-site cost verification method proposed in this paper is effective and that the proposed system can be used in actual projects.


2021 · Vol 13 (23) · pp. 13425
Author(s): Kwanho Suk, Triza Mudita

Charities face a common problem: donors tend to avoid charities with high overhead rates. This overhead aversion forces charities to suppress their overhead spending, which prevents them from performing at their best. Substantial research has attempted to mitigate overhead aversion by removing donors' need to cover overhead expenses. The present work takes a different approach and presents a method to reduce overhead aversion and improve attitudes toward the charity by providing donors with details of the overhead costs. Study 1 demonstrates that disclosing the overhead costs improves donors' attitude toward the charity, and that the effect is mediated by the donors' attitude toward the overhead. Study 2 shows that presenting cost information is more effective than the methods proposed in the existing literature (e.g., presenting a message that addresses the importance of overhead). The research contributes to the literature by demonstrating how to communicate with donors to increase their evaluations of the charity.


2021 · Vol 935 (1) · pp. 012034
Author(s): E A Ivanov, L Yu Malinina, N N Pushkarenko, A V Korotkov

Abstract As one of the leading segments of modern agriculture in the Russian Federation, hop production is currently on the rise and expands its activities every year, largely facilitated by strong government financial support. The purpose of this study is to examine the main theoretical and methodological aspects of organizing appropriate production accounting in order to provide common approaches to cost justification when hop farms file applications for grants. To reveal the main scientific provisions, techniques and methods such as observation, induction and deduction, analysis and synthesis, and comparison were used. The findings of the study indicate that no updated regulatory framework for cost accounting in hop farms is available, and that the structure and content of the document carrying incurred cost information have not been fully elaborated within the government grant issuing mechanism. The small number of international and Russian studies determining the essential characteristics of hops as a biological asset has a negative impact on the organization of the accounting process. The article suggests a methodology for organizing cost accounting by the main agro-technological stages of hop management and cultivation.


2021
Author(s): Karl Schuhmacher, Michael Burkert

Accurate cost information is critical to effective decision making within organizations. Cost computations often rely on subjective judgments by employees regarding the amount of time that different tasks consume. In an experimental setting, we examine the accuracy of two common approaches to eliciting the subjective time estimates vital for accurate cost information. Specifically, we compare estimation error when employees estimate (i) the total time for all iterations of a task (the pool approach) versus (ii) the average time for one iteration of a task (the unit approach). These two approaches have attracted interest from both practitioners and researchers and are at the heart of the difference between conventional activity-based costing (ABC) and time-driven ABC. While the approaches are mathematically equivalent, we hypothesize and find that they evoke different cognitive processes that lead to differences in estimation error. Relative to the unit approach, the pool approach produces larger error in the allocation of time among different tasks, but only when the number of iterations per task varies across tasks. Further, the pool approach results in overestimation of productive time, whereas the unit approach leads to underestimation of productive time. Our findings are robust to different response modes for the pool approach (estimates in absolute time units and in percentages). This study is relevant for designers and users of cost and performance-measurement systems in that allocation errors lead to cost cross-subsidization and poor resource-allocation decisions, while overall errors undermine capacity-utilization decisions. This paper was accepted by Brian Bushee, accounting.
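To make the equivalence claim concrete (the notation here is ours, not the paper's): for a task k performed n_k times, the pool estimate \hat{T}_k (total time) and the unit estimate \hat{t}_k (time per iteration) imply the same allocated cost only if \hat{T}_k = n_k \hat{t}_k. With a capacity cost rate r per unit of productive time, the cost allocated to task k is

C_k = r \, T_k = r \, n_k t_k ,

so any systematic divergence between \hat{T}_k and n_k \hat{t}_k appears as allocation error across tasks, while bias in the implied total \sum_k \hat{T}_k appears as over- or underestimation of productive time.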

