

Empirical Comparison of Preprocessing and Lookahead Techniques for Binary Constraint Satisfaction Problems. Zheying Jane Yang & Berthe Y. Choueiry. Constraint Systems Laboratory • Department of Computer Science and Engineering, University of Nebraska-Lincoln • {zyang|choueiry}@cse.unl.edu.





Presentation Transcript


Abstract
• Constraint Satisfaction Problems (CSPs) are in general NP-complete. They are usually solved using backtrack (BT) search, which is an exponential-time procedure. Constraint Processing explores several approaches to improving the performance of BT search, such as:
• Constraint propagation as a preprocessing step to search, which filters the CSP and reduces its size without losing solutions.
• Lookahead techniques, which intertwine constraint propagation with search.
• Ordering heuristics during search (i.e., Dynamic Variable Ordering (DVO)) to direct search as quickly as possible towards a solution.
• Several propagation and lookahead techniques are known in the literature.
• The goal of this research is to verify empirically the results claimed by the community and to find the best combinations on randomly generated problems.
• Our results disprove some widely believed but imprecise claims. We make explicit the rare conditions under which AC2001 [1] (a preprocessing technique) outperforms AC3 [2], and under which Maintaining Arc Consistency (MAC) [3] (a lookahead strategy) outperforms Forward Checking (FC) [4]. We also show that the cost of Neighborhood Inverse Consistency (NIC) [5] is prohibitive.

[Diagram: Preprocessing (AC3, AC2001, NIC) followed by Search (FC_DVO, MAC_DVO)]

Results: preprocessing
Figure 1 illustrates the cost of the preprocessing techniques by showing the number of constraint checks needed to filter the CSP.
• Although not shown here, AC2001 is more costly than AC3 in terms of CPU time.
• NIC is consistently more costly (in terms of constraint checks and CPU time) than either arc-consistency algorithm (AC3 and AC2001).
• For high values of constraint probability, the cost of NIC becomes prohibitive.
• For low values of constraint probability (i.e., sparse CSPs), NIC is preferable to any arc-consistency algorithm (both AC3 and AC2001).

Definitions
• A CSP is defined by a set of variables with their domains, and a set of constraints that restrict the acceptable combinations of values for the variables. A solution is an assignment that satisfies all constraints. We focus on finite, binary CSPs.
• Arc consistency: an arc (Vi, Vj) is arc consistent if ∀x ∈ Di, ∃y ∈ Dj such that (x, y) is allowed by the constraint Cij, and vice versa.
• Constraint propagation: arc consistency is achieved by repeatedly deleting every value from each domain Di that fails this condition, until quiescence.
• Neighborhood Inverse Consistency (NIC): ensures that a given value for a variable appears in at least one solution of the sub-problem induced by the variable and its neighborhood. Search is required to achieve NIC.

Preprocessing
We test 5 preprocessing (PPi) algorithms:
• PP1 = AC3
• PP2 = AC2001
• PP3 = NIC_FC_DVO
• PP4 = NIC_(MAC_DVO_AC3)
• PP5 = NIC_(MAC_DVO_AC2001)
We test 2 lookahead strategies, Forward Checking (FC) with DVO and Maintaining Arc Consistency (MAC) with DVO:
• Forward Checking (FC): a partial lookahead strategy that ensures arc consistency between the current variable and each of the future variables with which it shares a constraint.
• Maintaining Arc Consistency (MAC): a full lookahead strategy that ensures arc consistency among all future variables.
We order the variables dynamically according to the least-domain heuristic.

Figure 2 illustrates the effectiveness of the preprocessing techniques tested by showing the CSP size after filtering: NIC reduces the search space significantly more than either arc-consistency algorithm (AC3 and AC2001).

Results: search
Figure 3 illustrates the cost of solving the CSP using the hybrid search strategies tested.
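The arc-consistency machinery defined above can be made concrete with a short sketch. This is a generic AC-3 in Python, not the authors' implementation; the function name `ac3` and the encoding of constraints as sets of forbidden value pairs are assumptions for illustration. It also counts constraint checks, the cost measure reported for the preprocessing experiments.

```python
def ac3(domains, conflicts):
    """AC-3 constraint propagation over a binary CSP.
    domains: {var: set of values}, filtered in place.
    conflicts: {(vi, vj): set of forbidden (x, y) value pairs}.
    Returns (arc_consistent, constraint_checks)."""
    def allowed(vi, vj, x, y):
        if (vi, vj) in conflicts:
            return (x, y) not in conflicts[(vi, vj)]
        return (y, x) not in conflicts[(vj, vi)]

    # Every binary constraint yields two directed arcs.
    arcs = set(conflicts) | {(vj, vi) for (vi, vj) in conflicts}
    queue, checks = list(arcs), 0
    while queue:  # revise arcs until quiescence
        vi, vj = queue.pop()
        revised = False
        for x in list(domains[vi]):
            # x survives only if it has a support y in D(vj).
            supported = False
            for y in domains[vj]:
                checks += 1
                if allowed(vi, vj, x, y):
                    supported = True
                    break
            if not supported:
                domains[vi].discard(x)
                revised = True
        if revised:
            if not domains[vi]:
                return False, checks  # domain wipe-out: no solution
            # Deletions in D(vi) may break supports of arcs into vi.
            queue.extend((vk, vl) for (vk, vl) in arcs
                         if vl == vi and vk != vj)
    return True, checks
```

For example, with variables A and B over {1, 2, 3} and the constraint A < B, propagation deletes 3 from D(A) and 1 from D(B).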
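Forward checking with the least-domain DVO heuristic, as defined above, can be sketched as follows. Again this is a generic illustration rather than the FC_DVO code used in the experiments; the name `fc_dvo` and the forbidden-pair constraint encoding are assumptions.

```python
def fc_dvo(domains, conflicts):
    """Forward checking (FC) with least-domain dynamic variable ordering (DVO).
    domains: {var: set of values}; conflicts: {(vi, vj): set of forbidden
    (x, y) value pairs}. Returns one solution dict, or None."""
    variables = list(domains)

    def forbidden(vi, vj, x, y):
        if (vi, vj) in conflicts:
            return (x, y) in conflicts[(vi, vj)]
        if (vj, vi) in conflicts:
            return (y, x) in conflicts[(vj, vi)]
        return False  # unconstrained pair of variables

    def solve(assignment, live):
        if len(assignment) == len(variables):
            return dict(assignment)
        # DVO: pick the unassigned variable with the smallest live domain.
        var = min((v for v in variables if v not in assignment),
                  key=lambda v: len(live[v]))
        for value in sorted(live[var]):
            assignment[var] = value
            # Forward check: filter each future domain against `value`.
            future = {v: {y for y in live[v] if not forbidden(var, v, value, y)}
                      for v in variables if v not in assignment}
            if all(future.values()):  # no future domain wiped out
                sol = solve(assignment, {**live, **future})
                if sol is not None:
                    return sol
            del assignment[var]
        return None

    return solve({}, {v: set(dom) for v, dom in domains.items()})
```

Unlike MAC, this only enforces consistency between the current variable and its uninstantiated neighbors; no propagation is done among the future variables themselves.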
Search strategies
We generate seven hybrid search strategies (Sj) by combining the above 5 preprocessing algorithms, 2 lookahead strategies, and a dynamic variable ordering heuristic (DVO):
• S1 = PP1 + FC_DVO
• S2 = PP1 + (MAC_DVO_AC3)
• S3 = PP2 + FC_DVO
• S4 = PP2 + (MAC_DVO_AC2001)
• S5 = PP3 + FC_DVO
• S6 = PP4 + (MAC_DVO_AC3)
• S7 = PP5 + (MAC_DVO_AC2001)
Table 1: Search strategies tested.
• FC-based hybrids outperform MAC-based hybrids, regardless of the techniques used in preprocessing.
• MAC-based algorithms perform consistently poorly where it matters most, around the peak of the phase transition.

Experiments
We designed a generator for random binary CSPs that guarantees the existence of at least one solution, and generated over 3000 problem instances. All problems have 50 variables and a domain size of 10. We vary the constraint probability between 0.05 and 0.3; for each probability value, we vary the constraint tightness between 0.05 and 0.95. We measured the number of constraint checks and the CPU time, and averaged the results over 30 instances per data point.

Lessons
• Unless the constraint graph is very sparse and the constraints are very loose, we should use FC and avoid MAC.
• When a constraint check is a cheap operation and the constraint graph is sparse, we should use NIC; otherwise, FC is sufficient.
• The best overall combination is strategy S5:
• For preprocessing: NIC combined with FC.
• For search: FC combined with DVO.

References
[1] C. Bessière and J.-Ch. Régin. Refining the basic constraint propagation algorithm. In IJCAI 2001, pages 309–315.
[2] A.K. Mackworth and E.C. Freuder. The complexity of some polynomial network consistency algorithms for constraint satisfaction problems. Artificial Intelligence, pages 65–74, 1985.
[3] D. Sabin and E.C. Freuder. Contradicting conventional wisdom in constraint satisfaction. In ECAI 1994, pages 125–129.
[4] R.M. Haralick and G.L. Elliott.
Increasing tree search efficiency for constraint satisfaction problems. Artificial Intelligence, pages 268–277, 1980.
[5] E.C. Freuder and C. Elfe. Neighborhood inverse consistency preprocessing. In AAAI 1996, pages 201–208.

Acknowledgment
Experiments were carried out on Prairiefire.unl.edu.
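As a closing sketch of the experimental setup: one standard way to build a generator of random binary CSPs that guarantees at least one solution is to "plant" a solution and never forbid its value pairs. The poster does not describe the authors' actual construction, so the planting scheme, the name `random_binary_csp`, and the forbidden-pair constraint encoding below are all assumptions.

```python
import itertools
import random

def random_binary_csp(n=50, d=10, p=0.2, tightness=0.5, seed=0):
    """Random binary CSP with a planted solution.
    n variables with domains {0..d-1}; each pair of variables is
    constrained with probability p; each constraint forbids roughly a
    fraction `tightness` of the d*d value pairs, but never the planted
    pair, so at least one solution always exists.
    Returns (domains, conflicts, planted)."""
    rng = random.Random(seed)
    planted = {v: rng.randrange(d) for v in range(n)}  # the hidden solution
    conflicts = {}
    for vi, vj in itertools.combinations(range(n), 2):
        if rng.random() < p:
            # All value pairs except the planted one are candidates.
            candidates = [(x, y) for x in range(d) for y in range(d)
                          if (x, y) != (planted[vi], planted[vj])]
            rng.shuffle(candidates)
            conflicts[(vi, vj)] = set(candidates[:int(tightness * d * d)])
    domains = {v: set(range(d)) for v in range(n)}
    return domains, conflicts, planted
```

The parameters mirror the poster's regime (n = 50, d = 10, probability 0.05–0.3, tightness 0.05–0.95); because the planted pair is excluded, the effective tightness is marginally below the nominal value.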
