A FIRST ORDER PREDICATE LOGIC FORMULATION OF THE 3D RECONSTRUCTION PROBLEM AND ITS SOLUTION SPACE

Author(s):  
MARTIN ROBINSON ◽  
KURT KUBIK ◽  
BRIAN LOVELL

This paper defines the 3D reconstruction problem as the process of reconstructing a 3D scene from numerous 2D visual images of that scene. It is well known that this problem is ill-posed, and numerous constraints and assumptions are used in 3D reconstruction algorithms in order to reduce the solution space. Unfortunately, most constraints only work in a certain range of situations, and constraints are often built into the most fundamental methods (e.g. Area Based Matching assumes that all the pixels in the window belong to the same object). This paper presents a novel formulation of the 3D reconstruction problem, using a voxel framework and first order logic equations, which does not contain any additional constraints or assumptions. Solving this formulation for a set of input images gives all the possible solutions for that set, rather than picking a solution that is deemed most likely. Using this formulation, this paper studies the problem of uniqueness in 3D reconstruction and how the solution space changes for different configurations of input images. It is found that it is not possible to guarantee a unique solution, no matter how many images are taken of the scene, their orientation, or even how much color variation is in the scene itself. Results of using the formulation to reconstruct a few small voxel spaces are also presented. They show that the number of solutions is extremely large for even very small voxel spaces (a 5 × 5 voxel space gives 10 to 10^7 solutions). This shows the need for constraints to reduce the solution space to a reasonable size. Finally, it is noted that because of the discrete nature of the formulation, the solution space size can be easily calculated, making the formulation a useful tool to numerically evaluate the usefulness of any constraints that are added.
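The combinatorial explosion reported above is easy to reproduce at toy scale. The following sketch is not the authors' first-order-logic formulation; the grid size, colours, and orthographic two-camera layout are assumptions chosen for illustration. It brute-forces every colouring of a tiny 2D voxel grid and counts how many colourings reproduce the same two views, showing that even noise-free images leave many consistent scenes.

```python
# Minimal illustrative sketch (not the paper's encoding): enumerate all voxel
# scenes consistent with two orthographic projections of a tiny 2D grid.
# Assumptions: each cell is empty (0) or one of two colours (1, 2); one camera
# looks along each row from the left, one along each column from the top, and
# each pixel records the first non-empty voxel on its ray.

from itertools import product

N = 3                      # grid size (3**(N*N) candidate scenes, kept tiny)
COLOURS = (0, 1, 2)        # 0 = empty
BACKGROUND = 0

def render(scene):
    """Return the two 1D images (left view, top view) of a scene."""
    left = tuple(next((v for v in scene[r] if v != 0), BACKGROUND) for r in range(N))
    top = tuple(next((scene[r][c] for r in range(N) if scene[r][c] != 0), BACKGROUND)
                for c in range(N))
    return left, top

# A "true" scene and the images it produces.
true_scene = ((1, 0, 0),
              (0, 2, 0),
              (0, 0, 1))
observed = render(true_scene)

# Count every scene that reproduces exactly the same images.
solutions = 0
for cells in product(COLOURS, repeat=N * N):
    scene = tuple(tuple(cells[r * N + c] for c in range(N)) for r in range(N))
    if render(scene) == observed:
        solutions += 1

print("scenes consistent with the two views:", solutions)
```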

2021 ◽  
Vol 348 ◽  
pp. 01011
Author(s):  
Aicha Allag ◽  
Redouane Drai ◽  
Tarek Boutkedjirt ◽  
Abdessalam Benammar ◽  
Wahiba Djerir

Computed tomography (CT) aims to reconstruct the internal distribution of an object from projection measurements. When only a limited number of projections is available, the reconstruction problem becomes severely ill-posed, and reconstruction algorithms play a crucial role in overcoming this. With missing or incomplete data, improving the quality of the reconstructed image calls for sparse regularization, obtained by adding an ℓ1 norm term; the reconstruction problem is then solved using proximal operators. We are interested in the Douglas-Rachford method and employ total variation (TV) regularization. An efficient technique based on these concepts is proposed in this study, with the primary goal of achieving high-quality reconstructed images in terms of PSNR and relative error. The numerical simulation results demonstrate that the suggested technique reduces noise and artifacts while preserving structural information. The results are encouraging and indicate the effectiveness of the proposed strategy.
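A minimal sketch of the Douglas-Rachford splitting idea is given below. It is not the authors' algorithm: the TV regularizer is replaced by a simple ℓ1 penalty so that both proximal maps have closed forms, and the toy operator A, data b, and parameters are assumptions.

```python
# Hedged sketch of Douglas-Rachford splitting for the toy problem
#   min_x 0.5*||A x - b||^2 + lam*||x||_1
# (l1 stands in for the TV regularizer used in the paper).

import numpy as np

rng = np.random.default_rng(0)

m, n = 40, 100                               # toy "projection" operator (assumed)
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, 8, replace=False)] = rng.standard_normal(8)
b = A @ x_true

lam, gamma = 0.1, 1.0                        # penalty weight and prox step (assumed)

# prox of gamma * 0.5*||A x - b||^2: solve (I + gamma A^T A) x = v + gamma A^T b
M = np.eye(n) + gamma * (A.T @ A)
Atb = gamma * (A.T @ b)

def prox_data(v):
    return np.linalg.solve(M, v + Atb)

def prox_l1(v):
    """prox of gamma * lam * ||x||_1: soft thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - gamma * lam, 0.0)

z = np.zeros(n)
for _ in range(300):                         # Douglas-Rachford iterations
    x = prox_data(z)
    z = z + prox_l1(2 * x - z) - x

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```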


2018 ◽  
Vol 26 (6) ◽  
pp. 799-820 ◽  
Author(s):  
Lingli Zhang ◽  
Li Zeng ◽  
Chengxiang Wang ◽  
Yumeng Guo

Abstract Restricted by the practical applications and radiation exposure of computed tomography (CT), the obtained projection data is usually incomplete, which may lead to a limited-angle reconstruction problem. However, reconstructing an object from limited-angle projection views is a challenging and ill-posed inverse problem. Fortunately, regularization methods offer an effective way to deal with it. Recently, several researchers have focused on ℓ1 regularization to address this problem, but it has difficulty suppressing the limited-angle slope artifacts that appear around edges because of the incomplete projection data. In this paper, in order to surmount the ill-posedness, a non-smooth and non-convex method based on ℓ0 and ℓ1 regularization is presented to better deal with the limited-angle problem. Firstly, the splitting technique is utilized to construct the presented approach, called LWPC-ST-IHT. Afterwards, some propositions and a convergence analysis of the presented approach are established. Numerical implementations show that our approach is more capable of suppressing the slope artifacts than classical and state-of-the-art iterative reconstruction algorithms.
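For orientation, the sketch below shows plain iterative hard thresholding for an ℓ0-constrained least-squares problem, the simplest member of the thresholding-iteration family this kind of method builds on. It is not the LWPC-ST-IHT algorithm itself; the operator, data, sparsity level, and step size are assumptions.

```python
# Hedged sketch: iterative hard thresholding (IHT) for
#   min_x ||A x - b||^2  subject to  ||x||_0 <= k.

import numpy as np

rng = np.random.default_rng(1)
m, n, k = 50, 120, 6
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, 2)        # unit spectral norm keeps the unit gradient step stable
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x_true

def hard_threshold(v, k):
    """Keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

x = np.zeros(n)
for _ in range(200):
    x = hard_threshold(x + A.T @ (b - A @ x), k)   # gradient step, then projection

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```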


2020 ◽  
Vol 28 (6) ◽  
pp. 829-847
Author(s):  
Hua Huang ◽  
Chengwu Lu ◽  
Lingli Zhang ◽  
Weiwei Wang

Abstract The projection data obtained using the computed tomography (CT) technique are often incomplete and inconsistent owing to the radiation exposure and practical environment of the CT process, which may lead to a few-view reconstruction problem. Reconstructing an object from few projection views is typically an ill-posed inverse problem. Regularization is an effective technique for solving such problems, in which the ill-posed problem is approximated by a family of neighboring well-posed problems. In this study, we considered ℓ1/2 regularization to solve such ill-posed problems. Subsequently, the half thresholding algorithm was employed to solve the ℓ1/2 regularization-based problem. A convergence analysis of the proposed method was performed, and the error bound between the reference image and the reconstructed image was clarified. Finally, the stability of the proposed method was analyzed. The results of numerical experiments demonstrated that the proposed method can outperform classical reconstruction algorithms in terms of noise suppression and preservation of detail in the reconstructed image.
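A rough sketch of an iterative half-thresholding scheme is shown below. The closed-form half-threshold operator follows the form commonly reported in the ℓ1/2 thresholding literature; its constants, as well as the toy operator, data, and parameters, are assumptions and should be checked against the paper rather than taken as its algorithm.

```python
# Hedged sketch of iterative half thresholding for
#   min_x ||A x - b||^2 + lam * ||x||_{1/2}^{1/2}.

import numpy as np

rng = np.random.default_rng(2)
m, n = 60, 150
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, 2)            # unit spectral norm keeps the gradient step stable
x_true = np.zeros(n)
x_true[rng.choice(n, 7, replace=False)] = rng.standard_normal(7)
b = A @ x_true

lam, mu = 0.05, 1.0                  # regularization weight and step size (assumed)

def half_threshold(z, t):
    """Componentwise half-thresholding operator for the l_{1/2} penalty
    (closed form as reported in the l_{1/2} thresholding literature; assumed)."""
    out = np.zeros_like(z)
    thresh = (54 ** (1 / 3) / 4) * t ** (2 / 3)
    big = np.abs(z) > thresh
    phi = np.arccos((t / 8) * (np.abs(z[big]) / 3) ** (-1.5))
    out[big] = (2 / 3) * z[big] * (1 + np.cos(2 * np.pi / 3 - 2 * phi / 3))
    return out

x = np.zeros(n)
for _ in range(300):
    x = half_threshold(x + mu * A.T @ (b - A @ x), lam * mu)

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```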


1962 ◽  
Vol 27 (1) ◽  
pp. 58-72 ◽  
Author(s):  
Timothy Smiley

Anyone who reads Aristotle, knowing something about modern logic and nothing about its history, must ask himself why the syllogistic cannot be translated as it stands into the logic of quantification. It is now more than twenty years since the invention of the requisite framework, the logic of many-sorted quantification. In the familiar first-order predicate logic, generality is expressed by means of variables and quantifiers, and each interpretation of the system is based upon the choice of some class over which the variables may range, the only restriction placed on this ‘domain of individuals’ being that it should not be empty.
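To make the contrast concrete (the rendering below is illustrative, not quoted from the paper), the A-proposition "Every B is an A" can be written in the familiar one-sorted logic, where the subject term becomes a predicate, or in many-sorted logic, where the variable ranges over the non-empty class B itself:

```latex
% One-sorted rendering: a single domain of individuals, B as a predicate.
\forall x\,\bigl(B(x)\rightarrow A(x)\bigr)

% Many-sorted rendering, closer to the syllogistic surface form:
% the variable ranges only over the (non-empty) sort B.
\forall x{:}B\;\, A(x)
```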


1999 ◽  
Vol 9 (4) ◽  
pp. 335-359 ◽  
Author(s):  
HERMAN GEUVERS ◽  
ERIK BARENDSEN

We look at two different ways of interpreting logic in the dependent type system λP. The first is by a direct formulas-as-types interpretation à la Howard, where the logical derivation rules are mapped to derivation rules in the type system. The second is by viewing λP as a Logical Framework, following Harper et al. (1987) and Harper et al. (1993). The type system is then used as the meta-language in which various logics can be coded. We give a (brief) overview of known (syntactical) results about λP. Then we discuss two issues in some more detail. The first is the completeness of the formulas-as-types embedding of minimal first-order predicate logic into λP. This is a remarkably complicated issue, a first proof of which appeared in Geuvers (1993), following ideas in Barendsen and Geuvers (1989) and Swaen (1989). The second issue is the minimality of λP as a logical framework. We will show that some of the rules are actually superfluous (even though they contribute nicely to the generality of the presentation of λP). At the same time we will attempt to provide a gentle introduction to λP and its various aspects, presupposing little specialist knowledge.
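For readers new to the topic, the direct formulas-as-types interpretation can be summarized schematically as follows (a paraphrase of the standard mapping, not a quotation from the paper):

```latex
% Minimal first-order predicate logic over a domain A, mapped into \lambda P:
\begin{aligned}
  [\![\varphi \supset \psi]\!]    &= [\![\varphi]\!]\rightarrow[\![\psi]\!] && \text{(non-dependent product)}\\
  [\![\forall x{:}A.\,\varphi]\!] &= \Pi x{:}A.\,[\![\varphi]\!]            && \text{(dependent product)}\\
  [\![P(t_1,\dots,t_n)]\!]        &= P\,t_1\cdots t_n                       && \text{(atoms as applied type families)}
\end{aligned}
% A derivation of \varphi becomes a term of type [\![\varphi]\!]; completeness asks
% whether every closed inhabitant of [\![\varphi]\!] corresponds to a derivation.
```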


1986 ◽  
pp. 155-183
Author(s):  
Igor Aleksander ◽  
Henri Farreny ◽  
Malik Ghallab

2013 ◽  
Vol 78 (3) ◽  
pp. 837-872 ◽  
Author(s):  
Łukasz Czajka

Abstract We show a model construction for a system of higher-order illative combinatory logic, thus establishing its strong consistency. We also use a variant of this construction to provide a complete embedding of first-order intuitionistic predicate logic with second-order propositional quantifiers into the system of Barendregt, Bunder and Dekkers, which gives a partial answer to a question posed by these authors.


Author(s):  
Scott C. Chase

Abstract The combination of the paradigms of shape algebras and predicate logic representations, used in a new method for describing designs, is presented. First-order predicate logic provides a natural, intuitive way of representing shapes and spatial relations in the development of complete computer systems for reasoning about designs. Shape algebraic formalisms have advantages over more traditional representations of geometric objects. Here we illustrate the definition of a large set of high-level design relations from a small set of simple structures and spatial relations, with examples from the domains of geographic information systems and architecture.
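As a flavour of how higher-level design relations can be built from a few simple spatial ones (the shapes, relation names, and the "courtyard" rule below are hypothetical illustrations, not definitions from the paper):

```python
# Illustrative sketch: composing a higher-level design relation from simple
# spatial predicates over axis-aligned rectangles.

from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

def contains(a: Rect, b: Rect) -> bool:
    """b lies entirely inside a."""
    return a.x0 <= b.x0 and a.y0 <= b.y0 and b.x1 <= a.x1 and b.y1 <= a.y1

def disjoint(a: Rect, b: Rect) -> bool:
    """a and b share no interior points."""
    return a.x1 <= b.x0 or b.x1 <= a.x0 or a.y1 <= b.y0 or b.y1 <= a.y0

def courtyard(space: Rect, site: Rect, buildings: list[Rect]) -> bool:
    """Higher-level relation defined from the simple ones: an open space
    contained in the site and disjoint from every building (hypothetical rule)."""
    return contains(site, space) and all(disjoint(space, b) for b in buildings)

site = Rect("site", 0, 0, 20, 20)
house = Rect("house", 0, 0, 10, 20)
patio = Rect("patio", 12, 5, 18, 15)
print(courtyard(patio, site, [house]))     # True under these toy coordinates
```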


1992 ◽  
Vol 71 (3_suppl) ◽  
pp. 1091-1104 ◽  
Author(s):  
Peter E. Langford ◽  
Robert Hunting

480 adolescents and young adults between the ages of 12 and 29 years participated in an experiment in which they were asked to evaluate hypotheses from quantified first-order predicate logic specifying that certain classes of event were necessarily, possibly, or certainly not included within a universe of discourse. Results were used to test a two-stage model of performance on hypothesis evaluation tasks that originated in work on the evaluation of conditionals. The two-stage model, unlike others available, successfully predicted the range of patterns of reply observed. In dealing with very simple hypotheses, subjects in this age range tended not to make use of alternative hypotheses unless these were explicitly or implicitly suggested to them by the task. This tells against hypothesis complexity as an explanation of the reluctance to use alternative hypotheses when evaluating standard conditionals.

