4. Problem Solving and Search, 4.1. Intelligent Search

4. Problem Solving and Search, 4.1. Intelligent Search
Lecture, Winter Semester 2004/2005
Prof. Dr. Bernhard Thalheim
Systems Engineering Group, Computer Science Institute, Kiel University, Germany

Lecture programme: search is the fundamental problem.
Search spaces: find good algorithms based on appropriate data structures.
Reduction: for higher feasibility.
The search problem in general: towards a unified theory.
Techniques: classical techniques of searching.
Optimization: intelligent search is informed, heuristic search.
Summarizing.

Search Space
To solve a problem, we must formalize it in a precise way.
States: descriptions of the world.
Operators: actions that transform one state into another.
Given: an initial state S and a set of goal states G.
Goal: a sequence of operator applications leading from S to a state in G.
Different conceptualizations of states, goals and operators are possible.

Example: the water jug problem.
Given: two jugs (3 l and 4 l) and a pump. Question: how do you get exactly two liters into the four-liter jug?
Step 1: choose a representation for the state space: pairs (x, y) with 0 ≤ x ≤ 4 (the 4 l jug) and 0 ≤ y ≤ 3 (the 3 l jug).
Step 2: represent the operators, that is, the transition rules between states:
1. (x, y), x < 4 → (4, y)  (fill x)
2. (x, y), y < 3 → (x, 3)  (fill y)
3. (x, y), x > 0 → (0, y)  (dump x)
4. (x, y), y > 0 → (x, 0)  (dump y)
5. (x, y), x + y ≥ 4, y > 0 → (4, y − (4 − x))  (pour from y into x until x is full)
6. (x, y), x + y ≥ 3, x > 0 → (x − (3 − y), 3)  (pour from x into y until y is full)
7. (x, y), x + y ≤ 4, y > 0 → (x + y, 0)  (pour all water from y into x)
...
One solution: fill y; pour all from y into x; fill y; pour from y into x until x is full; dump x; pour all from y into x. As a rule sequence:
(0,0) →2 (0,3) →7 (3,0) →2 (3,3) →5 (4,2) →3 (0,2) →7 (2,0)

Solution through a production system:
(1) a set of rules, each with an applicability condition and a description of the action to be taken (how to move from one state to another);
(2) a database with appropriate information;
(3) a control strategy that decides which rule to use in a given situation; it should cause motion and be systematic, often by means of heuristic control strategies.
Rich: "A heuristic is a technique that improves the efficiency of a search process, possibly by sacrificing claims of completeness."
Often the solution is obtained via Horn formulas over a database consisting of facts (atomic formulas).

Problem characteristics:
(1) Is the problem decomposable?
(2) Can solution steps be ignored or undone?
(3) Is the universe predictable?
(4) Is a good solution immediately recognizable, or is it only relatively good compared with other solutions?
(5) Is the information base internally consistent?

Types of production systems:
(1) monotonic: if rules a and b are both applicable, applying a does not keep b from being applied;
(2) partially commutative (Church-Rosser, confluent): if a; b; c transforms X into Y, then b; a; c also transforms X into Y (provided b; a; c is legal);
(3) commutative: both monotonic and partially commutative.
Monotonic systems are good for ignorable problems, e.g., theorem proving. Non-monotonic, reversible: the order doesn't matter, e.g., the 8-puzzle. Non-monotonic, irreversible: e.g., chemical synthesis.

Problems can be represented by states and transitions that move between states; a solution to the problem is a transition sequence between initial and goal states.
[Figure: search tree for the water jug problem, branching from (0,0) to (4,0) and (0,3), then on to (4,3), (1,3), (3,0), ...]
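A solution like the one above can be found mechanically by breadth-first search over the state space; the sketch below (my own helper names, with the slide's pour rules 5 to 7 collapsed into two min-based pour operators) returns a shortest state sequence.

```python
from collections import deque

def water_jug(goal=2, cap_x=4, cap_y=3):
    """Breadth-first search over (x, y) states: x litres in the 4 l jug,
    y litres in the 3 l jug. Returns a shortest state sequence that ends
    with `goal` litres in the 4 l jug."""
    def successors(state):
        x, y = state
        yield (cap_x, y)                 # rule 1: fill x
        yield (x, cap_y)                 # rule 2: fill y
        yield (0, y)                     # rule 3: dump x
        yield (x, 0)                     # rule 4: dump y
        p = min(y, cap_x - x)            # rules 5/7: pour y into x
        yield (x + p, y - p)
        p = min(x, cap_y - y)            # rule 6 and its mirror: pour x into y
        yield (x - p, y + p)

    start = (0, 0)
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1][0] == goal:
            return path                  # shortest, since BFS explores by depth
        for nxt in successors(path[-1]):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(water_jug())   # a shortest solution takes 6 fill/dump/pour steps
```

The search confirms that six operator applications are optimal; there are two symmetric six-step solutions, the one on the slide and one that starts by filling the four-liter jug.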
Reduction of the search space for resolution strategies
Deletion strategies (1): eliminate clauses before they are ever used.
Pure-literal elimination: remove any clause containing a pure literal, i.e., a literal that has no complementary instance in the database. Example: in {¬P ∨ ¬Q ∨ R, P ∨ S, Q ∨ S, P, Q, ¬R}, S is a pure literal; for the purposes of refutation, the clauses containing S will not help.
Tautology elimination: eliminate clauses that contain complementary literals (tautologies), e.g., P(G(a)) ∨ ¬P(G(a)) and P(x) ∨ Q(y) ∨ R(z) ∨ ¬Q(y).
Subsumption elimination: a clause α subsumes a clause β if there is a substitution σ such that σ(α) ⊆ β. Example: P(x) ∨ Q(y) subsumes P(a) ∨ Q(v) ∨ R(w) for σ = {x/a, y/v}. The subsumed clause β may be deleted; for refutation, α suffices.

Deletion strategies (2)
Unit resolution: at least one of the clauses being resolved at every step is a unit clause, i.e., one containing a single literal; the resolvent then contains one literal fewer than the non-unit parent. Unit resolution is not refutation complete: the unsatisfiable set {p ∨ q, ¬p ∨ q, p ∨ ¬q, ¬p ∨ ¬q} contains no unit clause at all. A Horn clause is a clause with at most one positive literal. There is a unit resolution refutation of a set of Horn clauses if and only if that set is unsatisfiable.
Linear resolution: at least one of the clauses being resolved at every step is either in the initial database or is an ancestor of the other clause. For the set above: p ∨ q with ¬p ∨ q gives q; with p ∨ ¬q this gives p; with ¬p ∨ ¬q this gives ¬q; with the ancestor q this gives the empty clause 0. Linear resolution is refutation complete.
Input resolution: at least one of the clauses being resolved at every step is a member of the initial (i.e., input) database. Input refutation is complete for Horn clauses but incomplete in general.

Ordering and reduction strategies (1): restrict derivations to a certain extent.
Background: from α ∨ β and ¬β ∨ δ, resolution yields α ∨ δ; resolving α against ¬α yields the empty clause 0.
Ordering of clauses: linear resolution with ordered clauses is an extension that is still complete.
Ordered resolution: each clause is treated as a linearly ordered set.
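The completeness claims above are easy to check mechanically for the propositional case. A minimal sketch (my own helper, not from the lecture; literals are encoded as signed integers, so 1 stands for p and -1 for ¬p):

```python
def unit_refutes(clauses):
    """Unit resolution: every step resolves against a unit clause.
    Returns True if the empty clause is derivable, else False."""
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        units = [next(iter(c)) for c in clauses if len(c) == 1]
        for lit in units:
            for c in clauses:
                if -lit in c:
                    r = c - {-lit}       # resolvent: one literal fewer
                    if not r:
                        return True      # empty clause: refutation found
                    new.add(r)
        if new <= clauses:
            return False                 # saturated without a refutation
        clauses |= new

# Horn set {p, ¬p ∨ q, ¬q}: unit resolution refutes it
print(unit_refutes([{1}, {-1, 2}, {-2}]))                   # True
# {p ∨ q, ¬p ∨ q, p ∨ ¬q, ¬p ∨ ¬q}: unsatisfiable, but no unit clauses exist
print(unit_refutes([{1, 2}, {-1, 2}, {1, -2}, {-1, -2}]))   # False
```

The second call illustrates the incompleteness result: the set is unsatisfiable, yet unit resolution never gets off the ground because no unit clause is available.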
Resolution is permitted only on the first literal of each clause. Literals in the resolvent preserve the order of their parent clauses, with the literals of the positive parent clause followed by those of the negative parent clause. Refutation by ordered resolution is complete for Horn clauses but incomplete in general (just like unit resolution and input resolution).

Ordering and reduction strategies (2): restrict derivations to a certain extent.
Directed resolution: use ordered resolution on a database of directed clauses, i.e., Horn clauses written with their positive literal either at the beginning or at the end of the clause:
forward rule form:  α1 ∧ ... ∧ αn → β  (positive literal at the end);
backward rule form: β ← α1 ∧ ... ∧ αn  (positive literal at the beginning);
pure goal clauses:  ¬α1 ∨ ... ∨ ¬αn.
Example (forward directed):
1. ¬M(x) ∨ P(x)
2. M(a)
3. ¬P(z)  (goal)
4. P(a)   from (M(a), ¬M(x) ∨ P(x))
5. 0      from (¬P(z), P(a))
The same problem in backward directed resolution:
1. P(x) ∨ ¬M(x)
2. M(a)
3. ¬P(z)  (goal)
4. ¬M(z)  from (¬P(z), P(z) ∨ ¬M(z))
5. 0      from (¬M(z), M(a))

Ordering and reduction strategies (3): restrict derivations to a certain extent.
Lock resolution: an arbitrary number (index) is associated with each literal in a clause; any application of resolution must be upon literals of least index in each clause; literals in resolvents inherit the numbers of their parents; when duplicate literals arise, the copy with the higher number is removed.
Example: {p ∨ q, ¬p ∨ q, p ∨ ¬q, ¬q ∨ ¬p} with indices:
1. p[1] ∨ q[2]
2. ¬p[2] ∨ q[1]
3. p[1] ∨ ¬q[2]
4. ¬q[1] ∨ ¬p[2]
Cutting clauses 1 and 3 on q[2] and ¬q[2] is not allowed, since q is not of least index in either clause. Clauses 2 and 4 may be cut on q[1] and ¬q[1], yielding ¬p[2] ∨ ¬p[2], i.e., ¬p[2] after duplicate removal.
Lock resolution is refutation complete.
Reduction of the search space for resolution strategies
Semantical concepts (1)
Idea: divide and conquer; divide the set of clauses into a positive and a negative part so that resolution is only allowed between opposite parts.
Semantical clashes: instead of deriving formula after formula sequentially, we can derive in one step, e.g., {p, q, ¬p ∨ ¬q ∨ α} ⊢ α.
Semantical clash + ordered resolution: hyperresolution.

Semantical concepts (2)
Set-of-support resolution: a subset M of a set N is called a set of support for N if N \ M is satisfiable; at least one of the clauses being resolved at every step is selected from the set of support M.
Example: {p, ¬p ∨ q, ¬p ∨ ¬q ∨ r, ¬r}; the first three clauses are satisfiable, so M = {¬r} is a set of support.
Resolution: (3) with (4) gives ¬p ∨ ¬q (5), added to M; (2) with (5) gives ¬p (6), added to M; (1) with (6) gives 0.
Set-of-support resolution is refutation complete. Often M is chosen to be the set of clauses derived from a negated goal (provided the initial database is satisfiable); the search can then be seen as working backwards from the goal.

Connection methods (1)
Connection-graph procedure: represent a set of clauses as a graph; nodes are clauses, and an edge connects two literals whenever they are unifiable, labelled with their most general unifier.
(1) Delete all edges whose resolvents give a tautology.
(2) Delete all nodes that have no outgoing edge.
(3) Choose an edge and change the graph by replacing the two nodes with their resolvent, adjusting the other mgu labels accordingly.
Example: given {Q(b), ¬Q(z) ∨ R(z), Q(a), ¬P(x, f(x)), ¬R(y) ∨ P(y, f(y))}, there are edges Q(b) -z/b- ¬Q(z) ∨ R(z), R(z) -z/y- ¬R(y) ∨ P(y, f(y)), and ¬Q(z) -z/a- Q(a).
One reduction step yields the graph for {Q(b), ¬Q(z) ∨ R(z), Q(a), ¬R(x)}, and a further step the graph for {R(b), ¬R(x)}. Note that ¬Q(z) ∨ R(z) has edges to both Q(b) and Q(a); one of these could be deleted. In some cases a subgraph is derived together with 0. The connection-graph method is refutation complete.
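The set-of-support example above can be replayed in code. A minimal propositional sketch (my own helper names; literals as signed integers, with the clause set {p, ¬p ∨ q, ¬p ∨ ¬q ∨ r} and support {¬r} as on the slide):

```python
def resolvents(c1, c2):
    """All propositional resolvents of two clauses (signed-int literals)."""
    return [frozenset((c1 - {l}) | (c2 - {-l})) for l in c1 if -l in c2]

def sos_refutes(database, support):
    """Set-of-support resolution: every step uses at least one clause from
    the (growing) set of support. Returns True on refutation."""
    database = {frozenset(c) for c in database}
    support = {frozenset(c) for c in support}
    while True:
        new = set()
        for s in support:                      # one parent always from M
            for c in database | support:
                for r in resolvents(s, c):
                    if not r:
                        return True            # empty clause derived
                    new.add(r)
        if new <= support:
            return False                       # saturated: no refutation
        support |= new

# the slide's example: {p, ¬p∨q, ¬p∨¬q∨r} is satisfiable, M = {¬r}
print(sos_refutes([{1}, {-1, 2}, {-1, -2, 3}], [{-3}]))   # True
```

The derivation the function finds mirrors the slide: ¬p ∨ ¬q, then ¬p, then the empty clause, each step supported by a clause descending from ¬r.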
Connection methods (2)
Connection-matrix method: rewrite the set of clauses as a matrix

L1,1 L1,2 ... L1,m1
L2,1 L2,2 ... L2,m2
...
Ln,1 Ln,2 ... Ln,mn

where row i contains the literals of clause i. Then construct vertical paths through the matrix (one literal from each row). If a path contains two opposite literals, the path is complementary; if all vertical paths are complementary, the matrix is complementary. A formula α is unsatisfiable if its matrix Mα is complementary.
Extension to FOPL formulas: a path is potentially complementary if there is at least one pair of potentially complementary (unifiable) literals on it; the matrix is refutable if each path is potentially complementary and the mgu of at least one pair of potentially complementary literals on each path is compatible with the substitutions computed for the other paths. Herbrand universes are used to replace the original rows by instances of them.
There is a variant for the proof of tautologies that uses a similar method on disjunctive normal forms.

The Search Problem
Parameters of the search problem:
(1) direction, (2) topology, (3) node representation, (4) selecting rules, (5) heuristic functions.
Direction: search forward from the initial state I to the goal G, or backward from G to I. The same rules are used in either direction, but the way we use them changes. Sometimes it is easier to search forward, sometimes backward:
(1) Are there more start states or goal states? We want to move towards the larger set.
(2) What is the branching factor? We want to move in the direction with the lower branching factor.
(3) Do we need to interact with a human and explain the reasoning? One direction may be more natural than the other.
Bi-directional search is also possible.
Topology: graphs versus trees.
Representation of nodes:
1. How are the objects and facts represented? (ontology)
2. How is a complete state represented?
3. How are sequences of states represented?
Frame problem: when moving between states, which facts change and which facts do not?
Selecting rules:
1. efficient hashing (indexing) schemes;
2. matching with variables;
3. building so much into the rules that matches are automatic;
4. resolving conflicts (e.g., choosing the most specific rule).
Heuristic functions: assign to each state a measure of desirability; they help guide the search, but are not used in every search technique.

OR graphs and AND/OR graphs
OR graphs: trees whose nodes represent objects and whose branches are interpreted as OR-choices. Example: the traveling salesman problem, represented by a tree through full unfolding.
AND/OR graphs: allow a better representation of some problems. A solution to an AND/OR graph is a subgraph all of whose leaf nodes are in the goal set. AND/OR graphs can be used to represent grammars (context-free grammar productions); the resulting parse tree is an AND/OR graph.
Problem: lack of independence. In AND/OR graphs, individual paths from node to node cannot be considered in isolation from other paths through other nodes (connected to the first by an AND arc). Longer paths may be better; in OR graphs this is never true.

Search techniques: depth-first search
Dive into the search tree as far as you can, backing up only when you reach the end of a branch.
(1) Form a one-element queue consisting of the root node.
(2) Until the queue is empty or the goal has been reached, determine whether the first element in the queue is the goal.
(a) If it is, stop.
(b) If not, remove it and add its children (if any) to the front of the queue.
(3) If the goal is found, announce success; if not, failure.
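The three steps above translate almost line for line into Python; a sketch with my own helper names, using a queue of paths whose first element is examined and whose children go to the front:

```python
def depth_first_search(root, is_goal, children):
    """Depth-first search as on the slide: examine the first queue element,
    push its children onto the FRONT of the queue."""
    queue = [[root]]                            # (1) one-element queue
    while queue:                                # (2) until empty or goal found
        path = queue.pop(0)
        if is_goal(path[-1]):
            return path                         # (3) success
        kids = [path + [c] for c in children(path[-1]) if c not in path]
        queue[:0] = kids                        # children to the front
    return None                                 # (3) failure

# a tiny invented tree for illustration
tree = {'S': ['A', 'B'], 'A': ['C'], 'B': ['G'], 'C': ['G'], 'G': []}
print(depth_first_search('S', lambda n: n == 'G', lambda n: tree[n]))
# ['S', 'A', 'C', 'G']  (dives left through A and C before ever trying B)
```

Note how the result illustrates the slide's point: depth-first commits to the leftmost branch even though a two-step path to G exists through B.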
Depth-first search can be very inefficient, e.g., when the goal is the second child of the root. The complexity is O(b^d) time and O(d) space for search spaces represented by trees of depth d and branching factor b; the bidirectional variant still takes O(b^(d/2)) time.

Search techniques: hill climbing
Using a heuristic function, make the next best local move.
A heuristic is a rule of thumb or judgemental technique that leads to a solution some of the time but provides no guarantee of success; it may in fact end in failure. Heuristics nevertheless play an important role in search strategies because of the exponential nature of most problems: they help to reduce the number of alternatives from an exponential to a polynomial number and thereby obtain a solution in a tolerable amount of time. Example: for the traveling salesman problem, always taking the nearest unvisited neighbor requires only O(n^2) steps. Search with heuristic functions is informed search.
(1) Form a one-element queue consisting of the root node.
(2) Until the queue is empty or the goal has been reached, determine whether the first element in the queue is the goal.
(a) If it is, stop.
(b) If not, remove it, sort its children (if any) by estimated remaining distance, and add them to the front of the queue.
(3) If the goal is found, announce success; if not, failure.
Problems with hill climbing:
foothills (local maxima or peaks): the children have less promising goal distances than the parent node, and there is no indication of the goal direction;
plateaus (flat areas): all neighboring nodes have the same value;
ridges (e.g., snow-flake trees): several adjoining nodes have higher values than the surrounding nodes.

Search techniques: breadth-first search
(1) Form a one-element queue consisting of the root node.
(2) Until the queue is empty or the goal has been reached, determine whether the first element in the queue is the goal.
(a) If it is, stop.
(b) If not, remove it and add its children (if any) to the back of the queue.
(3) If the goal is found, announce success; if not, failure.
Breadth-first search can be expensive in both time and space: O(b^d) time and space complexity for search spaces represented by trees of depth d and branching factor b.

Search techniques: beam search
Like breadth-first search, but use a heuristic function: proceed level by level, but expand only the best k nodes at each level, ignoring the rest.

Search techniques: best-first search
Press forward from the best unexpanded node so far, regardless of where it is in the tree.
(1) Form a one-element queue consisting of the root node.
(2) Until the queue is empty or the goal has been reached, determine whether the first element in the queue is the goal.
(a) If it is, stop.
(b) If not, remove it and add its children (if any) to the queue. Sort the entire queue by estimated remaining distance.
(3) If the goal is found, announce success; if not, failure.
Problem: the result depends heavily on the heuristic function. Best-first search will always find good paths to a goal, even when local anomalies are encountered; all that is required is a good measure of goal distance.

Some informal criteria for choosing a search technique:
Depth-first: when blind alleys aren't too deep.
Breadth-first: when the branching factor isn't too big.
Hill climbing: when a good heuristic function exists and local choices lead to the final goal.
Beam search: when a good heuristic function exists and good-looking partial paths at each level lead to a goal.
Best-first: when a good heuristic function exists and a good path may look bad at shallow levels.
Some search methods give us an optimal path to a goal; others give us just some path.
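Best-first differs from the depth-first and breadth-first schemes only in its queue discipline: the whole queue is kept sorted by estimated remaining distance. A sketch using a priority queue in place of explicit sorting (the graph and heuristic values are invented for illustration):

```python
import heapq

def best_first_search(root, is_goal, children, h):
    """Best-first search: always expand the unexpanded node with the
    smallest estimated remaining distance h, wherever it sits in the tree."""
    frontier = [(h(root), [root])]        # priority queue == sorted queue
    while frontier:
        _, path = heapq.heappop(frontier)
        if is_goal(path[-1]):
            return path
        for c in children(path[-1]):
            if c not in path:             # avoid cycling back along the path
                heapq.heappush(frontier, (h(c), path + [c]))
    return None

graph = {'S': ['A', 'B'], 'A': ['G'], 'B': ['G'], 'G': []}
h = {'S': 2, 'A': 1, 'B': 3, 'G': 0}.__getitem__
print(best_first_search('S', lambda n: n == 'G', lambda n: graph[n], h))
# ['S', 'A', 'G']  (A looks closer to the goal than B, so it is expanded first)
```

Swapping the priority queue for front-insertion or back-insertion of children recovers depth-first and breadth-first search, which is exactly the family of procedures the slides describe.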
British Museum procedure: exhaustive search.

Optimal search techniques: branch and bound
Trick: pay attention to path length and always expand the shortest path so far. No heuristic is used, just the actual elapsed distance, and the result is still optimal.
A more interesting heuristic function: with g the distance traveled so far and h the estimated distance remaining, use f = g + h. Bad estimates of the remaining distance can cause extra work, and overestimates can cause us to fail to find an optimal path. With an underestimating h, however, we are guaranteed to find an optimal path, because with underestimates the actual distance from any unexpanded node to the goal is at least what the heuristic function guesses. The heuristic function must be developed from prior knowledge of the domain, not by searching to a solution.
Functions used for the 8-puzzle:
1. the number of tiles in the wrong place (7 in the example);
2. the sum of the distances of each tile from its home position (12 in the example); ...

Optimal search techniques: dynamic programming
In graph searches there are sometimes redundant paths. Discarding redundant paths (dynamic programming) can improve efficiency: when looking for an optimal path from S to G, all paths from S to an intermediate node I, other than the minimum-length path from S to I, can be ignored.
Branch and bound + dynamic programming corresponds to h = 0: expand the shortest path so far, but discard redundant paths that are too long.

Optimal search techniques: the A* algorithm
Branch and bound + an estimate of the remaining distance + dynamic programming.
(1) Form a queue of partial paths, initially consisting of the zero-length, zero-step path from the root node to nowhere.
(2) Until the queue is empty or the goal has been reached, determine whether the first path reaches the goal.
(a) If so, stop.
(b) If not, then:
(i) remove the queue's first path;
(ii) form new paths by extending it one step;
(iii) add the new paths to the queue;
(iv) sort the queue using f = g + h, with the least-cost path in front;
(v) if two or more paths reach a common node, delete all of them except the one that reaches the common node with minimum cost.
(3) If the goal is found, announce success; if not, failure.
Termination condition (to guarantee that the path found is optimal): even after a goal is found (at cost c), expand unexplored paths until each of their g + h values is greater than or equal to c.

Applications of the A* algorithm and its ingredients:
small search tree: the British Museum algorithm;
bad paths turn distinctly bad quickly: branch and bound;
good lower-bound estimate of the remaining distance: branch and bound with a guess;
many paths converge on the same place: dynamic programming;
large search tree with all of the above: the A* algorithm.
Other applications:
(A) Robot path planning (finding a collision-free path):
1. redescribe the problem in another, simpler representation;
2. build a fence for one point of the robot (configuration-space transformation);
3. if the fences are not dense, then there exists a path;
4. point representation (full graph; an edge (e, e') if there is direct adjacency: the visibility graph);
5. guess function: direct distance to the end point;
6. apply A*.
(B) Configuration-space generalizations: the moving object may rotate; several configuration spaces; other movement trajectories.

Optimal search techniques: the AO* algorithm (for AND/OR graphs)
Branch and bound + an estimate of the remaining distance + dynamic programming.
(1) Traverse the graph from the initial node, following the best current path.
(2) Pick one of the unexpanded nodes on that path and expand it. Add its successors to the graph and compute f for each of them (using only h, not g).
(3) Change the expanded node's f value to reflect its successors, and propagate the change up the graph. Reconsider the current best path.
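The A* procedure above, including the dynamic-programming step of keeping only the cheapest known path to each node, can be sketched as follows (the example graph and heuristic are invented; h is an underestimate, so the first goal reached is optimal):

```python
import heapq

def a_star(start, goal, neighbors, h):
    """A*: expand the cheapest partial path by f = g + h; keep only the
    cheapest known path to each node (redundant-path deletion)."""
    frontier = [(h(start), 0, [start])]          # (f, g, path)
    best_g = {start: 0}
    while frontier:
        f, g, path = heapq.heappop(frontier)
        node = path[-1]
        if node == goal:
            return g, path                       # optimal for admissible h
        for nxt, step in neighbors(node):
            ng = g + step
            if ng < best_g.get(nxt, float('inf')):   # discard redundant paths
                best_g[nxt] = ng
                heapq.heappush(frontier, (ng + h(nxt), ng, path + [nxt]))
    return None

graph = {'S': [('A', 1), ('B', 4)], 'A': [('B', 1), ('G', 5)],
         'B': [('G', 1)], 'G': []}
h = {'S': 2, 'A': 2, 'B': 1, 'G': 0}.__getitem__   # admissible underestimate
print(a_star('S', 'G', lambda n: graph[n], h))      # (3, ['S', 'A', 'B', 'G'])
```

With an admissible h and the best-path bookkeeping above, testing the goal at expansion time replaces the slide's explicit termination condition: no unexplored path can beat the cost at which the goal is first popped.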
Optimal search techniques: trees and adversarial search
How to play board games (checkers and chess: one choice produces another tree of choices), but now with interdigitated choices of two adversaries. The new issue is competition. The key technique is min-max search with alpha-beta pruning, which reduces the search by stopping work on guaranteed losers, combined with progressive (heuristic) deepening.
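The min-max search with alpha-beta pruning mentioned here can be sketched as follows (the tiny game tree and its leaf values are invented for illustration):

```python
def alphabeta(node, alpha, beta, maximizing, children, value):
    """Minimax with alpha-beta pruning: stop work on branches that are
    guaranteed not to influence the choice at the root."""
    kids = children(node)
    if not kids:
        return value(node)                   # leaf: static evaluation
    if maximizing:
        best = float('-inf')
        for c in kids:
            best = max(best, alphabeta(c, alpha, beta, False, children, value))
            alpha = max(alpha, best)
            if alpha >= beta:
                break                        # cut-off: MIN will avoid this line
    else:
        best = float('inf')
        for c in kids:
            best = min(best, alphabeta(c, alpha, beta, True, children, value))
            beta = min(beta, best)
            if alpha >= beta:
                break                        # cut-off: MAX will avoid this line
    return best

tree = {'R': ['a', 'b'], 'a': ['a1', 'a2'], 'b': ['b1', 'b2']}
leaf = {'a1': 3, 'a2': 5, 'b1': 2, 'b2': 9}
print(alphabeta('R', float('-inf'), float('inf'), True,
                lambda n: tree.get(n, []), leaf.get))    # 3
```

In this example the leaf b2 is never evaluated: once b1 shows that branch b is worth at most 2, which is below the 3 already guaranteed by branch a, the pruning condition cuts the rest of b off.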