Browsing by Author "Spencer, Bruce"
Now showing 1 - 14 of 14
Item: Ancestor Reduction in Binary Resolution Trees
Authors: Spencer, Bruce; Horton, Joseph D.
It is shown how the operation of ancestor reduction, found in tableaux calculi, can be applied to the resolution calculus. An efficient algorithm tells how to reorder some of the previous resolution steps in a binary resolution tree, and thus enables an additional factoring step that removes a literal from a clause. This ancestor reduction operation has a small search space, constrained by the size of the proof tree. If the additional search is successful, the new operation can eliminate literals and reduce the overall time to find a proof. A calculus is defined combining atom (literal) ordered resolution and ancestor reduction. Atom ordered resolution restricts search, but may generate large proofs for some atom orderings. The proposed theorem prover is slightly less restrictive in that an additional search for some ancestor reductions is required, but it generates smaller proof trees under many atom orderings.

Item: Assimilation in Plan Recognition via Truth Maintenance with Reduced Redundancy (1990)
Authors: Spencer, Bruce
Plan recognition is the recognition of an agent's plan from the agent's actions. Recognizing the agent's plan can help predict the agent's next action; in a natural language setting it can help to form a cooperative response. Most artificial intelligence research into plan recognition relies on a complete set of pre-stored plans, a form of the closed world assumption. Faced with a novel plan, these systems simply fail. Our approach to giving up this assumption entails (1) providing new planning information on demand, and (2) incorporating the new information into the candidates that are proposed as the agent's current plan. We focus on task (2). Most plan recognition settings require timely responses, so as new information is provided, the candidates should be repaired rather than recalculated.
We found that existing truth maintenance systems, such as de Kleer's ATMS, were unsuitable for candidate repair. They introduce extra search and redundancy to handle the disjunctions that arise in plan recognition. We provide a refinement of linear resolution that reduces redundancy in general. Based on this refinement we provide a new truth maintenance system that does not introduce extra search or redundancy. We then use this truth maintenance system as the basis for a plan recognition system which incorporates novel information through candidate repair.

Item: Bottom up procedures to construct each minimal clause tree once (1997)
Authors: Horton, J. D.; Spencer, Bruce
Bottom up procedures are introduced for constructing only minimal clause trees. Two general methods are proposed. The first performs surgery on each non-minimal clause tree; the second ignores non-minimal clause trees and retains only minimal ones. By ranking open leaves and activating/deactivating leaves according to the ranking, the second method constructs each minimal clause tree exactly once. Methods for incorporating subsumption are described, and a prototype system for propositional logic is compared against OTTER.

Item: Bottom-up procedures for minimal clause trees (1996)
Authors: Horton, J. D.; Spencer, Bruce
Clause trees provide a basis for reasoning procedures that use binary resolution. In this paper binary resolution trees are used as the data structure for implementing these procedures. Several properties of these procedures are explored: size, stages, minimality, disequalities, activity and rank. Procedures for uniquely building all minimal clause trees are introduced. Ordinary subsumption within these procedures does not preserve completeness, but contracting subsumption, a new restricted form of subsumption that relates to clause trees, can be integrated.
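For reference, the binary resolution step that all of these procedures restrict can be sketched in propositional form. This is an illustrative sketch, not code from any of the papers; the clause and literal encodings are my own:

```python
# Propositional binary resolution sketch. A literal is an (atom, sign) pair;
# a clause is a frozenset of literals. Resolving two clauses on an atom that
# occurs positively in one and negatively in the other yields the union of
# the remaining literals (the resolvent).

def resolve(c1, c2, atom):
    """Return the resolvent of c1 and c2 on `atom`, or None if they do not
    contain complementary occurrences of it."""
    pos, neg = (atom, True), (atom, False)
    if pos in c1 and neg in c2:
        return frozenset((c1 - {pos}) | (c2 - {neg}))
    if neg in c1 and pos in c2:
        return frozenset((c1 - {neg}) | (c2 - {pos}))
    return None

# Example: resolving {p, q} with {~p, r} on p gives {q, r}.
c1 = frozenset({("p", True), ("q", True)})
c2 = frozenset({("p", False), ("r", True)})
print(resolve(c1, c2, "p") == frozenset({("q", True), ("r", True)}))  # True
```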
The minimal restriction and unique construction address the problem of redundancy with binary resolution.

Item: Clause Trees and Factor Paths (1994)
Authors: Horton, J. D.; Spencer, Bruce
The concept of a clause is generalized to that of a clause tree, which shows how the clause can be proved from a set of input clauses. Procedures are provided to find factorizations and tautologies at build time using the internal structure of the tree. Model Elimination and SLI are specializations of this technique. Other resolution-based proof procedures, including bottom-up ones, could include these concepts to make better procedures. One example top-down procedure, ALPOC, has been implemented as part of Stickel's PTTP system. Content areas: automated reasoning, theorem proving, disjunctive logic programming.

Item: Clause trees: a tool for understanding and implementing resolution in automated reasoning (1995)
Authors: Horton, J. D.; Spencer, Bruce
A new methodology/data structure, the clause tree, is developed for automated reasoning based on resolution in first order logic. A clause tree T on a set S of clauses is a 4-tuple (N, E, L, M), where N is a set of nodes, divided into clause nodes and atom nodes; E is a set of edges, each of which joins a clause node to an atom node; and L is a labeling of N ∪ E which assigns to each clause node a clause of S, to each atom node an instance of an atom of some clause of S, and to each edge either + or −. An instance of a clause C in S is represented by an elementary clause tree, which consists of a single clause node labeled by C and joined to atom nodes labeled by the atoms of the instance. The edge joining a clause node to an atom node is labeled by the sign of the corresponding literal in the clause. A resolution is represented by unifying two atom nodes of different clause trees which represent complementary literals. In T an atom node can be adjacent either to one clause node, or to two clause nodes if it has been resolved on.
The merge of two identical literals is represented by placing the path joining the two corresponding atom nodes, which are leaves of the tree, into the set M of chosen merge paths. The tail of the merge path becomes a closed leaf, while the head remains an open leaf which can be resolved on. The clause cl(T) that T represents is the set of literals corresponding to the labels of the open leaves, modified by the signs of the incident edges. The fundamental purpose of a clause tree T is to show that cl(T) can be derived from S using resolution. Loveland's model elimination (ME), the selected literal procedure (SL), and Shostak's graph construction procedure (GC) are explained in a unified manner using clause trees. The condition required for choosing a merge path whose head is not a leaf is given. This allows a clause tree to be built in one way (the build time) but justified as a proof in another (the proof time). The ordered clause set restriction and the foothold score restriction are explained using the operation on clause trees of merge path reversal. A new procedure called ALPOC, which combines ideas from ME, GC and OC to form a procedure tighter than any of the top down procedures above, is developed and shown to be sound and complete. Another operation on clause trees called surgery is defined, and used to define a minimal clause tree. Any non-minimal clause tree can be reduced to a minimal clause tree using surgery, thereby showing that non-minimal clause trees are redundant. A sound procedure MinALPOC that produces only minimal clause trees is given, but whether it is complete is not known. Mergeless clause trees are shown to be equivalent to each of linear resolution, unit resolution and relative Horn sets, thereby giving short proofs of some known results.
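The clause tree definition above can be made concrete with a small propositional sketch. The encoding below is my own illustration, not the authors' implementation: an elementary clause tree has one clause node joined to one signed atom node per literal, and cl(T) is read off the open atom leaves.

```python
# Illustrative sketch of the clause tree 4-tuple (N, E, L, M) in the
# propositional case: clause nodes and atom nodes, signed edges, and a set
# of chosen merge paths whose tails are closed leaves. cl(T) collects the
# open-leaf atom labels, signed by their incident edges.

class ClauseTree:
    def __init__(self):
        self.clause_nodes = []   # ids of clause nodes
        self.atom_nodes = {}     # atom node id -> atom label
        self.edges = {}          # (clause_node, atom_node) -> '+' or '-'
        self.merge_paths = []    # chosen merge paths as (tail, head) pairs
        self.closed = set()      # closed atom leaves (tails of merge paths)

    @classmethod
    def elementary(cls, clause_id, literals):
        """Elementary clause tree: one clause node joined to one signed atom
        node per literal. `literals` is a list of (atom, sign) pairs."""
        t = cls()
        t.clause_nodes.append(clause_id)
        for i, (atom, sign) in enumerate(literals):
            node = (clause_id, i)
            t.atom_nodes[node] = atom
            t.edges[(clause_id, node)] = '+' if sign else '-'
        return t

    def open_leaves(self):
        """Atom nodes adjacent to exactly one clause node and not closed."""
        degree = {}
        for (_, a) in self.edges:
            degree[a] = degree.get(a, 0) + 1
        return [a for a, d in degree.items() if d == 1 and a not in self.closed]

    def cl(self):
        """The clause cl(T): open-leaf labels signed by incident edges."""
        leaves = set(self.open_leaves())
        return frozenset((self.atom_nodes[a], sign == '+')
                         for (_, a), sign in self.edges.items() if a in leaves)

# For the elementary tree of a clause, cl(T) is the clause itself:
t = ClauseTree.elementary('c1', [('p', True), ('q', False)])
print(t.cl() == frozenset({('p', True), ('q', False)}))  # True
```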
Many other new proof procedures using clause trees are discussed briefly, leaving many open questions.

Item: Combining Rank/Activity with Set of Support, Hyperresolution and Subsumption
Authors: Horton, J. D.; Spencer, Bruce
The rank/activity (R/A) restriction of binary resolution specifies that literals in clauses have ranks, and are either active (can be resolved) or inactive. When a literal is resolved, literals in the new clause that were of lower rank in the parent clause than the resolved literal become inactive in the child clause. Literals that merge or factor with literals from other clauses become active. Two derivations are equivalent if they consist of the same resolutions, possibly in a different order, but resulting in the same clause. It is known that with R/A, no two equivalent derivations are found (uniqueness), but one derivation from each equivalence class is found. In this paper, R/A is combined with the set of support strategy, with hyperresolution, and also with subsumption. In the first two cases, both completeness and uniqueness are maintained. In the subsumption case, completeness and uniqueness can be retained if full rank/activity is kept, but subsumption of small clauses by bigger clauses is not done. With full subsumption, and a restricted version of R/A, completeness is kept but uniqueness is lost.

Item: Disjunctive deductive databases (1996)
Authors: Obimbo, Charlie; Spencer, Bruce

Item: Efficient classification of complex ontologies (University of New Brunswick, 2015)
Authors: Song, Weihong; Du, Weichang; Spencer, Bruce
Description logics (DLs) are knowledge representation languages that provide the theoretical underpinning for modern ontology languages such as OWL and serve as the basis for the development of ontology reasoners. The task of ontology classification is to compute the subsumption relationships between all pairs of atomic concepts in an ontology, which is the foundation for other ontology reasoning problems.
There are two types of mainstream reasoners that perform ontology classification: (1) tableau-based reasoners usually support very expressive DLs, but they may not be efficient for large and highly cyclic ontologies; (2) consequence-based reasoners are typically significantly faster than tableau-based reasoners, but they support less expressive DLs, and it is difficult to extend them directly to support more expressive DLs. In the present thesis, we propose a weakening and strengthening based approach for ontology classification, which aims to extend the capability of an efficient reasoner for a less expressive base language Lb (Lb-reasoner) to support a more expressive language. The approach approximates the target ontology by a weakened version and a strengthened version in Lb, whose subsumptions are respectively a subset and a superset of the subsumptions of the target ontology. There are two cases: (1) the subsumptions of the strengthened ontology are the same as those of the target ontology; (2) there may be more subsumptions in the strengthened ontology, which is therefore unsound. In case (1), which we call soundness-preserved strengthening, we classify only the strengthened ontology with the Lb-reasoner to get the final classification results. In case (2), which we call soundness-relaxed strengthening, a hybrid approach is employed: we first classify both the weakened and strengthened ontologies with the Lb-reasoner, and then use a full-fledged (hyper)tableau-based assistant reasoner to check whether the subsumptions implied by the strengthened ontology are also implied by the target ontology. We first study the general principles of applying weakening and strengthening to extend an Lb-reasoner for a DL language that has one more constructor than Lb, i.e., single extension. Then we study the combination of several single extensions for multiple extended constructors for the reasoner, i.e., multiple extension.
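The soundness-relaxed hybrid scheme described above can be sketched schematically. This is a sketch under my own assumptions, not WSClassifier's API: the three "reasoners" are hypothetical stand-ins, simulated here as functions over plain sets of subsumption pairs.

```python
# Schematic sketch of soundness-relaxed strengthening: classify the weakened
# and strengthened approximations with the fast base-language reasoner, then
# verify only the extra subsumptions (the gap between the two) against the
# target ontology with a slow full reasoner. All three callables are
# hypothetical stand-ins for real reasoner interfaces.

def hybrid_classify(classify_weak, classify_strong, entails_target):
    subs_weak = classify_weak()      # sound but possibly incomplete
    subs_strong = classify_strong()  # complete but possibly unsound
    result = set(subs_weak)
    for s in subs_strong - subs_weak:
        if entails_target(s):        # expensive check, done only on the gap
            result.add(s)
    return result

# Toy run with subsumptions as (subconcept, superconcept) pairs.
target = {("A", "B"), ("B", "C")}                 # true subsumptions
weak = {("A", "B")}                               # subset of target
strong = {("A", "B"), ("B", "C"), ("C", "A")}     # superset, one spurious
result = hybrid_classify(lambda: weak, lambda: strong, lambda s: s in target)
print(result == target)  # True: the spurious ("C", "A") is filtered out
```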
Based on the general principles, we investigate two single extensions from the ALCH description language to ALCH(D)⁻ and ALCHI with soundness-preserved strengthening, as well as a single extension from ALCH to ALCHO with soundness-relaxed strengthening. Then we show how to combine them into multiple extensions from ALCH to ALCHI(D)⁻, ALCHOI, ALCHO(D)⁻, and ALCHOI(D)⁻. The soundness and completeness of all the single and multiple extensions are proved. We also develop a prototype ALCHOI(D)⁻ reasoner, WSClassifier, based on the proposed approach. We experiment with and evaluate WSClassifier on large and highly cyclic real-world ontologies such as the FMA and Galen ontologies, whose required languages are beyond the capability of the current efficient consequence-based reasoners. The experiments show that on most of these ontologies, WSClassifier outperforms or significantly outperforms conventional (hyper)tableau-based reasoners.

Item: Experiments with the ALPOC theorem prover (1995)
Authors: Spencer, Bruce; Horton, J. D.; Francis, Kelsey
A system for selecting and preparing a batch of data files, and running a selected computer program with each data file, is described. Another facility collects the experimental results from several different batches and summarizes the results in tabular form. A theorem prover, ALPOC, is implemented that combines Shostak's C-literal resolution steps with Stickel's PTTP compiler, and uses Spencer's Ordered Clause set restriction. A series of experiments is run that compares ALPOC with PTTP, using the batch system. The results are summarized and compared with Stickel's PTTP implemented in Prolog. The results show that ALPOC is slower than PTTP by a factor of at most 4, but is frequently much faster.
On the problems where ALPOC is faster, the number of extension steps in the ALPOC proof is less than in the PTTP proof, which leads the iterative deepening search method to explore fewer levels.

Item: Merge Path Improvements for Minimal Model Hyper Tableaux (1998)
Authors: Baumgartner, Peter; Horton, J. D.; Spencer, Bruce
We combine techniques originally developed for refutational first-order theorem proving within the clause tree framework with techniques for minimal model computation developed within the hyper tableau framework. This combination generalizes well-known tableaux techniques like complement splitting and folding up/down. We argue that this combination allows for efficiency improvements over previous, related methods. It is motivated by application to diagnosis tasks; in particular, the problem of avoiding redundancies in the diagnoses of electrical circuits with reconvergent fanouts is addressed by the new technique. As our main contribution, we develop in a more general way a sound and complete calculus for propositional circumscriptive reasoning in the presence of minimized and varying predicates.

Item: Reducing search with minimal clause trees (1995)
Authors: Horton, J. D.; Spencer, Bruce
The smallest and most efficient resolution based proof of a theorem is represented by a minimal clause tree, proposed by Horton and Spencer. A clause tree T is minimal if and only if it contains no legal tautology paths and no legal unchosen merge paths. A characterization of minimal clause trees is given in terms of derivations. A subsumption relationship between resolution based procedures is defined, and all such procedures are shown to be subsumed by clause tree procedures. Three classes of procedures are proposed which produce only minimal clause trees. The first performs surgery on any clause tree that is produced to reduce it to a minimal clause tree.
The second avoids any resolution that produces a clause tree on which surgery can be performed, but leaf to internal node paths are allowed to be chosen. The third, and most restrictive, avoids any resolution that produces a non-minimal clause tree. Keywords: theorem proving, reasoning (resolution), search.

Item: Semi-supervised Bayesian learning (University of New Brunswick, 2012)
Authors: Guo, Yuanyuan; Zhang, Huajie; Spencer, Bruce
In traditional supervised learning, classifiers are learned from a number of training data examples, where each example has a label showing the category that the example falls into. Bayesian network classifiers, representing the joint probability and the conditional independencies among a set of random variables, have been widely used in traditional supervised classification. As a special case of Bayesian networks, Naive Bayes has also been popular in many applications such as text classification. However, in many real-world machine learning applications, it may be expensive or time consuming to obtain sufficient labeled data, because categorizing data requires human effort. On the other hand, it is easier to collect a large amount of unlabeled data. Learning from a small number of labeled examples may not generate good classifiers. Semi-supervised learning is one method to deal with this problem, by utilizing the unlabeled data to help learn better classifiers. Although various semi-supervised methods have been studied, some problems remain open in semi-supervised learning. One of these is the performance of different Bayesian network classifiers in the semi-supervised learning scenario; an extensive study is needed to get a general picture of the performance of these methods. A second problem is their application to cost-sensitive learning, where misclassifying different classes incurs unequal costs.
For example, misclassifying a person with cancer as healthy could be more serious than misclassifying a healthy person as cancerous. In particular, classification performance may degrade when the datasets have skewed class distributions. The insufficiency of labeled data makes it more difficult to build a good Bayesian classifier to identify classes having different misclassification costs, and specific techniques are required to make optimal cost-sensitive classification decisions. In supervised learning, many techniques have been applied to solve the cost-sensitive problem; however, only a little work has been done in semi-supervised learning, and many questions remain open in this research area. A third problem is learning Bayesian network structures in the semi-supervised learning scenario. Sufficient training examples are needed to learn a good Bayesian network structure, and it is a big challenge to learn good Bayesian networks when labeled data is scarce. This thesis first systematically studies the performance of several commonly used semi-supervised learning methods when different Bayesian network classifiers are used as the underlying classifiers. Several novel and interesting observations emerge from the experiments and the performance analysis. Motivated by these observations, an instance selection method is presented to improve classification accuracy when using Naive Bayes in self-training and co-training methods. The presented method can roughly prevent adding errors to the training data, so as to prevent the degradation of accuracy. Then, three cost-sensitive semi-supervised learning methods are presented to improve the performance of Naive Bayes when labeled data is scarce and different misclassification errors are associated with different costs. Misclassification costs are incorporated into the three methods to generate cost-sensitive classifiers.
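The generic cost-sensitive decision rule underlying such classifiers (sketched here independently of the thesis's specific methods) predicts the class with minimum expected cost, argmin over i of the sum over j of P(j|x)·C(i,j). A minimal sketch, with a toy cost matrix of my own:

```python
# Minimum-expected-cost decision: given class posteriors P(j|x) and a cost
# matrix cost[(i, j)] = cost of predicting class i when the true class is j,
# choose the prediction i minimizing sum_j P(j|x) * cost[(i, j)].

def min_cost_class(posteriors, cost):
    """posteriors: dict class -> P(class|x); cost: dict (pred, true) -> cost."""
    classes = list(posteriors)
    def expected_cost(pred):
        return sum(posteriors[true] * cost[(pred, true)] for true in classes)
    return min(classes, key=expected_cost)

# Toy example: "cancer" is the less probable class, but missing it costs 20x
# a false alarm, so the cost-sensitive decision differs from max-posterior.
posteriors = {"healthy": 0.8, "cancer": 0.2}
cost = {("healthy", "healthy"): 0, ("healthy", "cancer"): 20,
        ("cancer", "healthy"): 1, ("cancer", "cancer"): 0}
print(min_cost_class(posteriors, cost))  # prints cancer
```

Here the expected cost of predicting "healthy" is 0.2 × 20 = 4, versus 0.8 × 1 = 0.8 for predicting "cancer", so the cost-sensitive prediction flips relative to the most probable class.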
Experimental results show that the new methods obtain lower total misclassification costs than their corresponding counterparts. Finally, a new method is designed to learn Bayesian network classifiers with more complex graph structures for classification in semi-supervised learning. The experiments show that the proposed method achieves higher accuracy while obtaining better Bayesian network structures to represent the relationships among the variables.

Item: Support Ordered Resolution
Authors: Spencer, Bruce; Horton, J. D.
In a binary tree representation of a binary resolution proof, rotating some tree edge reorders two adjacent resolution steps. When rotation is not permitted to disturb factoring, and thus does not change the size of the tree, it is invertible and defines an equivalence relation on proof trees. When one resolution step is performed later than another after every sequence of such rotations, we say that the one resolution supports the other. For a given ordering on atoms, or on atom occurrences, a support ordered proof orders its resolution steps so that the atoms are resolved consistently with the given order without violating the support relation between nodes. Any proof, including the smallest proof tree, can be converted to a support ordered proof by rotations. For a total order, the support ordered proof is unique, up to rotation equivalence. The support ordered proof is also a rank/activity proof where atom occurrences are ranked in the given order. Procedures intermediate between ordered resolution and support ordered resolution are considered. One of these, 1-weak support ordered resolution, allows a resolution on a non-maximal literal only if it is immediately followed by both a factoring and a resolution on some greater literal. In a constrained experiment where ordered resolution solves only six of 408 TPTP problems with difficulty between 0.11 and 0.56, 1-weak support ordered resolution solves 75.