Innovations in Database Design, Web Applications, and Information Systems Management
Latest Publications


TOTAL DOCUMENTS: 16 (five years: 0)

H-INDEX: 1 (five years: 0)

Published by: IGI Global

ISBN: 9781466620445, 9781466620452

Author(s):  
Sami Bhiri ◽  
Walid Gaaloul ◽  
Claude Godart ◽  
Olivier Perrin ◽  
Maciej Zaremba ◽  
...  

Web services are defined independently of any execution context. Due to their inherent autonomy and heterogeneity, it is difficult to examine the behaviour of composite services, especially in the case of failures. This paper addresses the reliability of composite services. A reliable composition is defined as one in which every instance execution is correct from both a transactional and a business point of view. The authors propose a transactional approach for ensuring reliable Web service compositions that integrates the expressive power of workflow models with the reliability of Advanced Transactional Models (ATM). This method offers designers the flexibility to specify their requirements in terms of control structure, using workflow patterns, and execution correctness. Contrary to ATM, the authors start from the designers’ specifications to define the appropriate transactional mechanisms that ensure correct executions according to their requirements.
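To illustrate the kind of transactional reasoning such an approach builds on, here is a minimal Python sketch of one classic ATM-style acceptance rule for a sequential composition. The rule (compensatable services before a single pivot, retriable services after it) and the service names are illustrative assumptions, not the paper's exact criteria.

```python
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    retriable: bool = False      # guaranteed to succeed after finitely many retries
    compensatable: bool = False  # effects can be semantically undone

def is_reliable_sequence(services):
    """ATM-style check for a sequential composition: at most one 'pivot'
    (neither retriable nor compensatable) is allowed; every service before
    the pivot must be compensatable (backward recovery is possible) and
    every service after it must be retriable (forward recovery is possible)."""
    pivots = [i for i, s in enumerate(services)
              if not s.retriable and not s.compensatable]
    if len(pivots) > 1:
        return False
    p = pivots[0] if pivots else len(services)
    return (all(s.compensatable for s in services[:p])
            and all(s.retriable for s in services[p + 1:]))
```

For example, `[book_hotel (compensatable), pay (pivot), notify (retriable)]` passes the check, while a sequence with two pivots does not.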


Author(s):  
Jesús Pardillo ◽  
Jose-Norberto Mazón ◽  
Juan Trujillo

To customize a data warehouse, many organizations develop concrete data marts focused on a particular department or business process. However, the integrated development of these data marts is an open problem for many organizations due to the technical and organizational challenges involved in designing these repositories as a complete solution. In this article, the authors present a design approach that employs user requirements to build both corporate data warehouses and data marts in an integrated manner. The approach links information requirements, elicited using goal-oriented requirements engineering, to specific data marts; these requirements are then automatically translated into the implementation of the corresponding data repositories by means of model-driven engineering techniques. The authors provide two UML profiles that integrate the design of both data warehouses and data marts, and a set of QVT transformations with which to automate this process. The advantage of this approach is that user requirements are captured from the early development stages of a data-warehousing project and automatically translated into the entire data-warehousing platform, considering the different data marts. Finally, the authors provide screenshots of the CASE tools that support the approach, and a case study to show its benefits.
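The requirements-to-repository idea can be sketched as a toy model-to-model transformation in Python, standing in for the QVT step: goal-oriented requirements (here, hypothetical dictionaries naming a target data mart, measures, and dimensions) are merged into one star schema per data mart. The field names and sample requirements are illustrative assumptions.

```python
# Hypothetical goal-oriented requirements: each names the data mart it
# targets and the measures/dimensions its goal needs.
requirements = [
    {"data_mart": "Sales", "goal": "increase revenue",
     "measures": ["amount"], "dimensions": ["Product", "Time"]},
    {"data_mart": "Sales", "goal": "track discounts",
     "measures": ["discount"], "dimensions": ["Customer", "Time"]},
]

def to_star_schemas(reqs):
    """Model-to-model step: merge all requirements targeting the same
    data mart into a single star schema (one fact table's measures plus
    the union of the dimensions the goals refer to)."""
    marts = {}
    for r in reqs:
        m = marts.setdefault(r["data_mart"],
                             {"measures": set(), "dimensions": set()})
        m["measures"].update(r["measures"])
        m["dimensions"].update(r["dimensions"])
    return marts
```

In a real MDE pipeline the target would be a UML/relational model rather than dictionaries, but the merge-by-target structure of the transformation is the same.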


Author(s):  
Peter Aiken ◽  
Mark Gillenson ◽  
Xihui Zhang ◽  
David Rafner

Data management (DM) has existed in conjunction with software development and the management of the full set of information technology (IT)-related components. However, it has been more than two decades since research into DM as it is practiced has been published. In this paper, the authors compare aspects of DM across a quarter-century timeline, obtaining data using comparable sets of subject matter experts. Using this information to observe the profession’s evolution, the authors have updated the understanding of DM as it is practiced, giving additional insight into DM, including its current responsibilities, reporting structures, and perceptions of success, among other factors. The analysis indicates that successfully investing in DM presents current, real challenges to IT and organizations. Although DM is evolving away from purely operational responsibilities toward higher-level responsibilities, perceptions of success have fallen. This paper details the quarter-century comparison of DM practices, analyzes them, and draws conclusions.


Author(s):  
Pnina Soffer ◽  
Maya Kaner

This paper investigates the need for complementing automated verification of business process models with a validity analysis performed by human analysts. As business processes become increasingly automated through process-aware information systems, the quality of process design becomes crucial. Although verification of process models has gained much attention, their validation, relating to the reachability of the process goal, has hardly been addressed. The paper investigates the need for model validation both theoretically and empirically. The authors present a theoretical analysis, relating to different aspects of process model quality, which shows that process model verification and validation are complementary in nature, and an empirical evaluation of the effectiveness of validity criteria in validating a process model. The empirical findings corroborate the effectiveness of validity criteria and indicate that a systematic, criteria-supported validity analysis improves the identification of validity problems in process models.
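Goal reachability, the core of validation as described here, can be sketched as a simple search over a process model's state space. The state names and transition relation below are toy assumptions; real process models would be mined from the model's control flow.

```python
from collections import deque

def goal_reachable(transitions, start, goal_states):
    """Validity check sketch: is some state satisfying the process goal
    reachable from the start state?  `transitions` maps each state to the
    list of states its outgoing transitions lead to (BFS over the graph)."""
    seen, queue = {start}, deque([start])
    while queue:
        s = queue.popleft()
        if s in goal_states:
            return True
        for nxt in transitions.get(s, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False
```

A model can pass verification (no deadlocks, proper termination) and still fail this kind of check, which is why the paper argues the two analyses are complementary.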


Author(s):  
Geert Poels

In this paper, the author investigates the effect on understanding of using business domain models that are constructed with Resource-Event-Agent (REA) modeling patterns. First, the author analyzes REA modeling structures to identify the enabling factors and the mechanisms by means of which users recognize these structures in a conceptual model and in the description of an information retrieval and interpretation task. Based on this understanding, the author hypothesizes positive effects on model understanding for situations where REA patterns can be recognized in both task and model. An experiment is then conducted to test whether models with REA patterns are better understood than informationally equivalent models without REA patterns. The results of this experiment indicate that REA patterns can be recognized with minimal prior patterns training and that the use of REA patterns leads to models that are easier to understand for novice model users.
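For readers unfamiliar with REA, the core pattern pairs a decrement economic event (give) with an increment event (take) between two agents, each event tied to a resource. The Python dataclasses below are a minimal illustrative rendering of that structure, not the paper's notation.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    name: str

@dataclass
class Agent:
    name: str

@dataclass
class EconomicEvent:
    name: str
    resource: Resource   # the resource whose stock the event changes
    provider: Agent
    receiver: Agent

@dataclass
class Exchange:
    """REA duality: a decrement (give) event paired with an increment (take)."""
    decrement: EconomicEvent
    increment: EconomicEvent

# A sale exchange: the enterprise gives a product, takes cash in return.
enterprise, customer = Agent("Enterprise"), Agent("Customer")
sale = Exchange(
    decrement=EconomicEvent("Sale", Resource("Product"), enterprise, customer),
    increment=EconomicEvent("CashReceipt", Resource("Cash"), customer, enterprise),
)
```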


Author(s):  
Dinesh Batra ◽  
Debra VanderMeer ◽  
Kaushik Dutta

The article evaluates the feasibility of extending agile principles to larger, dynamic, and possibly distributed software development projects by uncovering the theoretical basis for agile values and principles for achieving agility. The extant literature focuses mainly on one theory – complex adaptive systems – to support agile methods, although recent research indicates that control theory and adaptive structuration theory are also applicable. This article proposes that at least three other theories are highly relevant: transaction cost economics, social exchange theory, and expectancy theory. By employing these theories, a rigorous analysis of the Agile Manifesto is conducted. Certain agile values and principles find theoretical support and can be applied to enhance agility in dynamic projects regardless of size; some agile principles find no theoretical support, while others find limited support. Based on the analysis and the ensuing discussion, the authors propose a framework with five dimensions of agility: process, design, people, outcomes, and adaptation.


Author(s):  
Anat Aharoni ◽  
Iris Reinhartz-Berger

Situational methods are approaches to the development of software systems that are designed and constructed to fit particular circumstances, which often refer to project characteristics. One common way to create situational methods is to reuse method components, the building blocks of development methods. For this purpose, method components must be stored in a method base and then retrieved and composed specifically for the situation at hand. Most approaches in the field of situational method engineering require the expertise of method engineers to support the retrieval and composition of method components; furthermore, this is usually done in an ad-hoc manner and for pre-defined situations. In this paper, the authors propose an approach, supported by a tool, that creates situational methods semi-automatically. The approach refers to structural and behavioral considerations and a wide variety of characteristics when comparing method components and composing them into situational methods. The resultant situational methods are stored in the method base for future usage and composition. Based on an experimental study of the approach, the authors show that it provides correct and suitable draft situational methods, which human evaluators have assessed as relevant for the given situations.
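Retrieval of method components against situation characteristics can be sketched naively as overlap scoring over a toy method base. The component names, characteristics, and scoring rule are illustrative assumptions, far simpler than the structural and behavioral matching the paper describes.

```python
# Hypothetical method base: each component declares the project
# characteristics it is suited for.
method_base = [
    {"name": "sprint-planning", "characteristics": ["small-team", "iterative"]},
    {"name": "formal-review", "characteristics": ["safety-critical", "documented"]},
    {"name": "daily-standup", "characteristics": ["small-team", "co-located"]},
]

def match_score(component, situation):
    """Count how many of the situation's characteristics the component covers."""
    return len(set(component["characteristics"]) & set(situation))

def retrieve(components, situation, top=2):
    """Return the best-matching components for the situation at hand."""
    return sorted(components, key=lambda c: match_score(c, situation),
                  reverse=True)[:top]
```

A real method base would also check that the retrieved components compose into a coherent method, which is where most of the engineering effort lies.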


Author(s):  
Dickson K.W. Chiu ◽  
Qing Li ◽  
Patrick C. K. Hung ◽  
Zhe Shan ◽  
S. C. Cheung ◽  
...  

Service-Oriented Computing (SOC) has recently gained attention both within industry and academia; however, the challenges it raises cannot be easily addressed with existing distributed computing technologies. Composition and interaction issues have been the central concerns, because SOC applications are composed of heterogeneous and distributed processes. To tackle the complexity of inter-organizational service integration, the authors propose a methodology to decompose complex process requirements into different types of flows, such as control, data, exception, and security. The subset of each type of flow necessary for the interactions with each partner can be determined in each service. These subsets collectively constitute a process view, based on which interactions can be systematically designed and managed for system integration through service composition. The authors illustrate how the proposed SOC middleware, named FlowEngine, implements and manages these flows with contemporary Web services technologies. An experimental case study in an e-governmental environment further demonstrates how the methodology can facilitate the design of complex inter-organizational processes.
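The process-view idea, projecting the full process onto the subset of flows each partner participates in, can be sketched as a simple filter-and-group step. The flow names, types, and partner labels below are illustrative assumptions, not FlowEngine's actual model.

```python
# Hypothetical inter-organizational process: each flow element records its
# flow type and the partners involved in it.
flows = [
    {"name": "send-order", "type": "control", "partners": {"supplier"}},
    {"name": "order-doc", "type": "data", "partners": {"supplier", "bank"}},
    {"name": "payment-failed", "type": "exception", "partners": {"bank"}},
    {"name": "sign-request", "type": "security", "partners": {"bank"}},
]

def process_view(flows, partner):
    """Project the full process onto one partner: keep only the flow
    elements that partner participates in, grouped by flow type
    (control, data, exception, security)."""
    view = {}
    for f in flows:
        if partner in f["partners"]:
            view.setdefault(f["type"], []).append(f["name"])
    return view
```

Each partner then only sees (and integrates against) its own view, which is the basis for designing the pairwise interactions.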


Author(s):  
Marco Crasso ◽  
Alejandro Zunino ◽  
Marcelo Campo

Service discovery grows in importance as Service-Oriented Computing (SOC) becomes a widely adopted paradigm. SOC’s most popular materializations, namely Web services technologies, face various challenges related to service discovery, and many approaches have been proposed in response. Because these approaches differ, one solution may suit certain requirements better than another, which makes choosing a service discovery system a hard task. To ease this choice, this paper proposes eight criteria, based on the requirements for discovering services within common service-oriented environments, that allow discovery systems to be characterized. These criteria cover both functional and non-functional aspects of approaches to service discovery. The paper also presents a characterization of 22 contemporary approaches and potential research directions for the area.
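A criteria-based characterization naturally yields a selection matrix: score each candidate system against the criteria, weight the criteria by your requirements, and pick the best fit. The criteria names, the 0/1 scores, and the weighted-sum rule below are toy assumptions, not the paper's actual eight criteria or its characterization data.

```python
# Hypothetical subset of discovery criteria and candidate systems
# (1 = criterion satisfied, 0 = not satisfied).
criteria = ["registry-based", "semantic-matching", "qos-aware"]
systems = {
    "SystemA": [1, 0, 1],
    "SystemB": [1, 1, 0],
}

def best_for(weights, systems):
    """Rank candidate discovery systems by a weighted sum over the criteria
    and return the best one; `weights` encodes how much each criterion
    matters for the requirements at hand."""
    return max(systems,
               key=lambda name: sum(w * v
                                    for w, v in zip(weights, systems[name])))
```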


Author(s):  
Hyunjung Park ◽  
Sangkyu Rho ◽  
Jinsoo Park

The information space of the Semantic Web has different characteristics from that of the World Wide Web (WWW). One main difference is that in the Semantic Web, the direction of Resource Description Framework (RDF) links does not have the same meaning as the direction of hyperlinks in the WWW, because the link direction is determined not by a voting process but by a specific schema. Considering this fundamental difference, the authors propose a method for ranking Semantic Web resources independently of link directions and show the convergence of the algorithm and experimental results. The method focuses on classes rather than properties; property weights are assigned according to the relative significance of each property to the resource importance of its class. It solves some problems reported in prior studies, including the Tightly Knit Community (TKC) effect, and achieves higher accuracy and validity than existing methods.
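A direction-agnostic, property-weighted ranking can be sketched as power iteration over an undirected weighted graph: each resource spreads its score to all incident links in proportion to their property weights, regardless of which way the RDF triple points. This is a minimal sketch of the general idea, not the authors' exact algorithm; the graph and weights are toy data.

```python
def rank(edges, iters=50):
    """PageRank-like power iteration that ignores link direction.
    `edges` is a list of (a, b, weight) undirected links, where weight
    would come from the property connecting the two resources."""
    adj = {}
    for a, b, w in edges:
        adj.setdefault(a, []).append((b, w))
        adj.setdefault(b, []).append((a, w))
    scores = {n: 1.0 / len(adj) for n in adj}
    for _ in range(iters):
        new = dict.fromkeys(adj, 0.0)
        for n, nbrs in adj.items():
            total = sum(w for _, w in nbrs)   # node's total outgoing weight
            for m, w in nbrs:
                new[m] += scores[n] * w / total
        s = sum(new.values())                 # renormalize each round
        scores = {n: v / s for n, v in new.items()}
    return scores
```

On a small graph with uniform weights this converges to scores proportional to (weighted) degree; unequal property weights shift importance toward resources reached via significant properties.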

