
Presentation Transcript


  1. IWALT 2000 - International Workshop on Advanced Learning Technologies, 4-6 December 2000, Palmerston North, New Zealand. Adaptation to Nonstationary Environments – Learning and Evolution. Paul CRISTEA, “Politehnica” University of Bucharest, Spl. Independentei 313, 77206 Bucharest, Romania, Phone: +40-1-411 44 37, Fax: +40-1-410 44 14, e-mail: pcristea@dsp.pub.ro

  2. Paper Outline – Adaptation to Nonstationary Environments – Learning and Evolution:
  1. INTRODUCTION
  2. EVOLUTIONARY INTELLIGENT AGENT CONCEPT
  3. EVOLUTIONARY INTELLIGENT AGENT MODEL
  4. ONTOLOGY AND ARCHITECTURE
  5. CONCLUSIONS

  3. INTRODUCTION – Intelligent Agents
  • Capability to:
    • learn,
    • communicate,
    • establish complex, yet flexible organizational structures,
    • operate in dynamic and uncertain environments.
  • Robust and scalable software systems.
  • Agent-based computation allows improved:
    • modeling,
    • design,
    • implementation.

  4. INTRODUCTION – Evolutionary Systems
  • Capability to evolve by changing the gene pool of a population from generation to generation by such processes as:
    • mutation, genetic drift, gene flow,
    • crossing-over,
    • selection.
  • Adapting the behavior to the environment.
  • Able to address real-world problems involving:
    • chaos,
    • randomness,
    • complex nonlinear dynamics.

  5. INTRODUCTION – Evolutionary Systems. The adaptive challenge is determined by the population, the environment and the interactions between and within them. Reductionist models stress either the role of the population or that of the environment, and usually take into account only evolution through selection, while ignoring learning and competition. In such studies, simple reactive agents have been considered, with the behavior described by a sensorimotor map, i.e., a table of behavioral rules of the form: IF <environment feature E_i is sensed> THEN <do behavior B_j>. This approach has the advantage of keeping the model simple enough for directly deriving quantitative results about the efficiency of accomplishing the adaptive task at the level of the population, but cannot be used to investigate the effects of the more complex cognitive capabilities of the agents.
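
A minimal sketch of such a sensorimotor map as a rule table; the feature and behavior names are hypothetical placeholders, not taken from the paper.

```python
# Minimal sketch of a reactive agent driven by a sensorimotor map:
# a table of IF <feature sensed> THEN <behavior> rules.
# Feature/behavior names are illustrative, not the paper's.

SENSORIMOTOR_MAP = {
    "wall_ahead":        "turn_left",
    "seed_ahead":        "move_forward",
    "nest_ahead_loaded": "drop_load",
    "nothing_sensed":    "move_forward",
}

def react(sensed_features):
    """Return the behaviors triggered by the currently sensed features."""
    return [SENSORIMOTOR_MAP[f] for f in sensed_features if f in SENSORIMOTOR_MAP]

if __name__ == "__main__":
    print(react(["seed_ahead"]))          # ['move_forward']
    print(react(["wall_ahead", "foo"]))   # ['turn_left']
```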

  6. INTRODUCTION – Evolutionary Computation + Intelligent Agents → Evolutionary Intelligent Agents
  • Bring together the two main forces of adaptation:
    • learning – occurring at the level of each agent and at the time scale of agent life,
    • evolution – taking place at the level of the population and unfolding at the time scale of successive generations.

  7. INTRODUCTION – Evolutionary Intelligent Agents
  • EIAs are Intelligent Agents provided with a genotype that controls their capability to carry out various tasks, i.e., their phenotype.
  • EIAs can adapt efficiently to their environment by using synergetically both learning and evolution.
  • EIAs can address the problem of adaptation to nonstationary environments, i.e., to real-life complex and non-predictable environments such as today's worldwide computer networks, or user-friendly learning/teaching systems.
  • Current applications of the concept:
    • Multiresolutional Conceptual Learning [A. Meystel, 2000],
    • EIA based Information Retrieval [F. B. Pereira and E. Costa, 1999, 2000],
    • EIA based Personalized Web Learning/Teaching [A. Cristea, T. Okamoto, P. Cristea, 2000],
    • Genetic Estimation of Competitive Agents Behavior [A. M. Florea, 2000],
    • Intelligent Signal and Image Processing [P. Cristea, 2000].

  8. E I A CONCEPT – The behavior of an EIA is not a mere automatic response to stimuli from the environment, but is governed by its knowledge about the world.
  [Diagram: the external world (environment and other agents) reaches the agent through sensory input and communication, yielding a sensory representation and a cognitive representation of the world. The agent's cognitive resources comprise modalities to represent and process information, modalities to infer new knowledge (from existing knowledge, from interaction with the environment, and from communication with other agents), and modalities to behave, which produce the agent's actions.]

  9. [Diagram: two agents, Agent j and Agent k, acting in the same environment (the external reality). Each agent builds its own sensorial reality through perception of the sensory input, and its own cognitive reality through its cognitive resources (modalities to represent and process information, to infer new knowledge from existing knowledge, from interaction with the environment and from communication with other agents, and modalities to behave, producing actions). "Linguistic" communication connects the agents' cognitive realities, whereas "telepathic" communication would connect their sensorial realities directly.]

  10. E I A CONCEPT Learning occurs mainly at the level of individuals that modify their current knowledge by using the outcome of their own experience. Learning can also have a cooperative dimension, the agents communicating through a certain language. The successful representation of the environment or the successful behavioral rules can thus be shared within the population. The decision to accept received knowledge remains with each individual; new knowledge is appropriated only if it fits the existing knowledge of that individual, or if the agent rates its own current knowledge as unsatisfactory (i.e. incomplete, uncertain or contradictory).

  11. E I A CONCEPT Evolution occurs at the scale of the population and involves genetic mechanisms that act over successive generations. Both reactive and cognitive features of the agents can be genetically controlled. An agent's genotype is expressed in its phenotype -- the entirety of its capabilities. No interactions within the genome are considered; every gene encodes a unique feature in the phenotype. Some genes are of binary type, controlling the dichotomy existence - nonexistence of some capabilities. Other genes specify quantitatively the value of some parameters that determine the intensity of agent features. Reproduction is asexual, meaning that all agents have similar roles in reproduction. However, along with single-parent duplication, i.e., cloning, perturbed/enriched by low-probability small random mutations, crossing-over -- a two-parent operator -- is also considered.
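
A minimal sketch, not from the paper, of such a genotype: binary genes switch capabilities on or off, while integer genes set the intensity of quantitative attributes. The gene names and bit widths are assumptions.

```python
# Sketch of a genotype with binary (capability on/off) and quantitative genes.
# Gene names and bit widths are illustrative assumptions, not the paper's encoding.
from dataclasses import dataclass

@dataclass
class Genotype:
    can_communicate: bool   # binary gene: capability exists or not
    can_remember: bool      # binary gene
    speed: int              # quantitative gene (0..7 here)
    visual_range: int       # quantitative gene (0..7 here)

    def to_bits(self) -> str:
        """Encode the genotype as a bit string (1 + 1 + 3 + 3 bits here)."""
        return (f"{int(self.can_communicate)}{int(self.can_remember)}"
                f"{self.speed:03b}{self.visual_range:03b}")

    @classmethod
    def from_bits(cls, bits: str) -> "Genotype":
        return cls(bits[0] == "1", bits[1] == "1",
                   int(bits[2:5], 2), int(bits[5:8], 2))

if __name__ == "__main__":
    g = Genotype(True, False, speed=3, visual_range=5)
    assert Genotype.from_bits(g.to_bits()) == g
    print(g.to_bits())   # '10011101'
```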

  12. E I A CONCEPT
  • The cognitive resources of an agent can be genetically transmitted, i.e., inherited from its parent(s):
    • essential data,
    • basic rules,
    • mappings.
  • The cognitive resources are continuously evolving during the life of the agent, both by accumulation of sensory input and by learning/refining processes at various levels.
  • Baldwin effect: some of these acquired cognitive resources can also be genetically transmitted, under certain circumstances.
    J. M. Baldwin, A new factor in evolution, American Naturalist, 30, 1896, 441–451.
  • The sensorial and the cognitive maps of the parent(s) can be inherited by the offspring.

  13. [Figure: advance of a population in the feature space under the effect of learning – the initial population (1) moves toward better fitness, becoming the trained population (2).]

  14. [Figure: advance of a population in the feature space under the effect of evolution – after initialization of the initial population (1), random reproduction yields the initial population + offspring (2), and selection "of the fittest" produces the evolved population (3) with better fitness; the cycle continues with the next genetic step.]

  15. [Figure: advance of a population in the feature space under the combined effect of learning and evolution – the initial population (1) is trained by learning into the trained population (2), which evolves into the evolved population (3) with better fitness, and the next cycle begins.]

  16. E I A MODEL A prototype of the EIA system has been implemented for study purposes, to experimentally investigate the EIA concept. The model is quite simple, but illustrates the basic features of an EIA system. According to the concept, the EIAs have not only a reactive behavior, but also cognitive features. A sensorimotor type of agents has been considered, evolving in a two-dimensional world and performing several simple tasks.

  17. E I A MODEL
  The system comprises one or more agent populations - teams. Agents from different teams interact only by acting in the same environment. Agents from a team may also interact directly, e.g., through message exchange, genetic interactions, etc. All the agents move synchronously and make at most one movement at each step.
  The world is a rectangular lattice with strong boundary conditions. Any location is considered adjacent to its eight surrounding neighboring locations. An agent may move into a neighboring location, if accessible. Walls and domain margins are permanently inaccessible locations. Two agents cannot be in the same location at the same time. If two agents attempt to occupy the same location, there is a collision and only one of the agents succeeds, according to the agents' push strength.
  Some of the grid nodes contain a certain amount of resources - seeds. Each population has assigned some special locations on the grid - nests. The task of an agent is to pick up the resources and carry them to the nests. This specific task is a pre-programmed objective of the agent.
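
A minimal sketch of how such a lattice with 8-neighborhood adjacency and push-strength collision resolution could be modeled; the function names and the tie-breaking rule are assumptions, not the prototype's actual implementation.

```python
# Sketch of the lattice world: 8-neighborhood moves, walls, and
# push-strength collision resolution. Names and tie-breaking are assumptions.
NEIGHBORS = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]

def resolve_moves(walls, occupied, requests, push_strength):
    """requests: {agent_id: target_cell}; returns {agent_id: granted_cell_or_None}."""
    granted = {}
    by_target = {}
    for agent, target in requests.items():
        if target in walls or target in occupied:
            granted[agent] = None            # blocked by a wall or a standing agent
        else:
            by_target.setdefault(target, []).append(agent)
    for target, contenders in by_target.items():
        winner = max(contenders, key=lambda a: push_strength[a])  # strongest pusher wins
        for agent in contenders:
            granted[agent] = target if agent == winner else None
    return granted

if __name__ == "__main__":
    walls, occupied = {(0, 1)}, set()
    print(resolve_moves(walls, occupied,
                        {"a": (1, 1), "b": (1, 1), "c": (0, 1)},
                        {"a": 2, "b": 5, "c": 1}))
    # {'c': None, 'a': None, 'b': (1, 1)}
```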

  18. E I A MODEL
  • An agent holds subjective, partial information about the environment, at two levels of world representation:
    • sensorial level – depicted in a sensorial map constructed with tactile and visual inputs,
    • cognitive level – in a cognitive map, based on the information in the sensorial map, modified and enriched through some heuristic processing and with the information received by communicating with other agents in the same team.
  • The agent decides what actions to undertake based on the subjective information in the cognitive map and on previous knowledge expressed in behavior rules.
  • It sends the movement requests to the environment and updates its knowledge base knowing the results of these requests.
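
A minimal sketch of how a cognitive map could be updated from the agent's own sensorial map and from maps communicated by teammates; the map representation and cell labels are assumptions, since the slides do not specify the format.

```python
# Sketch of the two-level world representation: the cognitive map starts from
# the agent's own sensorial map and is enriched with teammates' information.
# The cell labels ('wall', 'space', 'seed', 'nest') are assumptions.

def update_cognitive_map(cognitive, sensorial, received_maps):
    """All maps are dicts {(row, col): label}; own observations take precedence."""
    for cell, label in sensorial.items():
        cognitive[cell] = label                      # trust own senses first
    for teammate_map in received_maps:
        for cell, label in teammate_map.items():
            cognitive.setdefault(cell, label)        # fill only unknown cells
    return cognitive

if __name__ == "__main__":
    cog = {(0, 0): "space"}
    sens = {(0, 1): "wall"}
    shared = [{(0, 1): "space", (2, 2): "seed"}]
    print(update_cognitive_map(cog, sens, shared))
    # {(0, 0): 'space', (0, 1): 'wall', (2, 2): 'seed'}
```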

  19. E I A MODEL
  • The fitness of an individual agent is quantified by its energy.
  • The agent starts with an initial energy.
  • There is an energy cost associated with each action and an energy bonus at the completion of a task.
  • The existence, behavior and reproduction of an agent depend on its energy:
    IF <the energy falls below a threshold> THEN <the agent can be destroyed>,
    IF <the energy rises above a threshold> THEN <the agent can replicate and new agents are created>.

  20. ONTOLOGY AND ARCHITECTURE
  • An agent is described by:
  • State attributes – can change at every step with the state of the agent:
    • Position – the location of the agent in the grid that forms the world,
    • Orientation – one of the eight neighboring locations,
    • Load – the amount of resources carried by the agent.
  • Permanent attributes – specified when the agent is created and changed only by genetic operations:
    • Actuator attributes – determine directly the agent action results:
      • Speed – number of movements an agent can make in a given time interval,
      • Capacity – the maximum amount of resources an agent can acquire,
      • Push strength – determines the agent that wins in a collision.
    • Sensor attributes – determine the agent's sensorial capabilities:
      • Visual Range – sets the depth of the visual field.
    • Behavior attributes – internal attributes of the agent, without direct influence on the environment, not visible to the environment and other agents:
      • Memory Size – limits the amount of information retained by an agent,
      • Weighting parameters – for target selection from multiple potential targets.
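
A minimal sketch of this attribute ontology as plain Python dataclasses; the field types and default values are illustrative assumptions.

```python
# Sketch of the agent ontology: state attributes change every step,
# permanent attributes change only through genetic operations.
# Field types and defaults are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Tuple

@dataclass
class PermanentAttributes:
    speed: int = 1            # actuator: movements per time interval
    capacity: int = 5         # actuator: maximum resources carried
    push_strength: int = 1    # actuator: decides collisions
    visual_range: int = 5     # sensor: depth of the visual field
    memory_size: int = 100    # behavior: information retained
    target_weights: Tuple[float, float] = (1.0, 1.0)  # behavior: target selection

@dataclass
class AgentState:
    position: Tuple[int, int] = (0, 0)
    orientation: Tuple[int, int] = (0, 1)   # one of the eight neighboring offsets
    load: int = 0                           # resources currently carried

@dataclass
class Agent:
    permanent: PermanentAttributes = field(default_factory=PermanentAttributes)
    state: AgentState = field(default_factory=AgentState)
```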

  21. ONTOLOGY AND ARCHITECTURE The visual field for Visual Range = 5 and for two different orientations of an agent

  22. ONTOLOGY AND ARCHITECTURE
  • The locations in the grid are of four different types:
    • Walls – not accessible to the agents, used to create a maze configuration in which the agents evolve and search for their targets: resources and nests. The borders of the grid are also marked as walls.
    • Nests – where the agents of a team have to deliver resources. No agent picks up resources from a nest.
    • Spaces – contain a non-negative amount of non-renewable resources. An agent passing through a space location consumes the resources and increases its Carried Seeds value until the amount of resources in that location becomes zero or the agent reaches its Capacity.
    • Generators – model renewable resources. An agent entering a generator location consumes the available resources as in a space location, but after a certain delay the amount of resources in that location is incremented by a preset step, until a preset maximum resource amount is reached. If the delay is set to zero, the resource amount is constant, i.e., non-exhaustible.
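
A minimal sketch of how a generator location could replenish its resources; the class and parameter names are assumptions.

```python
# Sketch of a generator location: consumed resources are replenished after
# a delay, by a preset step, up to a preset maximum. Names are assumptions.
class Generator:
    def __init__(self, amount, maximum, step, delay):
        self.amount, self.maximum, self.step, self.delay = amount, maximum, step, delay
        self.countdown = delay

    def consume(self, capacity_left):
        """An agent entering the cell picks up as much as it can still carry."""
        taken = min(self.amount, capacity_left)
        self.amount -= taken
        return taken

    def tick(self):
        """Called once per simulation step to replenish the resources."""
        if self.amount >= self.maximum:
            return
        if self.countdown > 0:
            self.countdown -= 1
            return
        self.amount = min(self.maximum, self.amount + self.step)
        self.countdown = self.delay

if __name__ == "__main__":
    g = Generator(amount=3, maximum=5, step=1, delay=2)
    print(g.consume(capacity_left=10))   # 3 seeds taken
    for _ in range(6):
        g.tick()
    print(g.amount)                      # partially replenished (2)
```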

  23. ONTOLOGY AND ARCHITECTURE
  • The architecture of the EIA system comprises two kinds of components that communicate over an IP network:
    • a Server – managing the environment;
    • Clients – hosting the agent populations.
  • Several clients can connect simultaneously to the server, modeling several agent populations acting together in the same environment.
  • The clients can be different applications running different agent control algorithms, as long as they respect the communication protocol.
  • The server implements the world model. It manages the environment in which the agents are acting and controls the state of the agents.
  • The information about the world stored by the server is objective, complete and up-to-date.
  • The agents send movement requests to the server, which:
    • analyses all the requests,
    • estimates the possible interactions between the agents,
    • determines the resulting configuration of the world.
  • The feedback from the server provides the agents with tactile (contact) item identification capabilities.
  • The server also establishes the visual information received by each agent in accordance with its sensorial attributes and dispatches that information to the corresponding agent.
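
A minimal sketch of the kind of request/feedback exchange such a protocol could use; the message fields and the JSON framing are assumptions, since the actual protocol is not given in the slides.

```python
# Sketch of the client/server exchange: the client sends movement requests,
# the server answers with the outcome and the agent's visual field.
# Message fields and JSON framing are assumptions, not the actual protocol.
import json

def make_move_request(agent_id, target):
    return json.dumps({"type": "move_request", "agent": agent_id, "target": list(target)})

def make_server_feedback(agent_id, granted, contact, visible_cells):
    return json.dumps({
        "type": "feedback",
        "agent": agent_id,
        "granted": granted,          # whether the move succeeded
        "contact": contact,          # tactile identification of the blocking item
        "visible": visible_cells,    # visual field, per the agent's Visual Range
    })

if __name__ == "__main__":
    print(make_move_request("a7", (3, 4)))
    print(make_server_feedback("a7", granted=False, contact="wall",
                               visible_cells={"3,4": "wall", "3,5": "seed"}))
```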

  24. ONTOLOGY AND ARCHITECTURE – [Diagram: architecture of an EIA system – the environment runs on a server machine, while each agent population runs on its own client machine, the agents of every population communicating with the environment server.]

  25. ONTOLOGY AND ARCHITECTURE
  • A single client machine hosts an entire agent population (team), to facilitate the implementation of population-level features such as establishing a certain level of agent collaboration or implementing genetic interactions between the agents.
  • The agents remain quasi-autonomous, their actions being decided at the individual agent level, not at the population level.
  • The client application comprises two modules:
    • one implementing the intelligent agents and
    • another implementing a population manager.
  • An agent decides what movements to make based on:
    • the information in its cognitive map,
    • the behavior rules.
  • The agent sends movement requests to the environment and updates its knowledge base knowing the results of these requests.
  • The communication between the agents in the same team takes place by exchanging information at the level of the cognitive map.
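
A minimal sketch of one decision step of an individual agent on the client side; the target scoring and the helper names are assumptions, not the prototype's behavior rules.

```python
# Sketch of one agent decision step: pick a target from the cognitive map,
# then make one move toward it. Scoring and helper names are assumptions.

def choose_target(cognitive_map, position, loaded):
    """Head for a nest when loaded, otherwise for the nearest known seed."""
    wanted = "nest" if loaded else "seed"
    candidates = [cell for cell, label in cognitive_map.items() if label == wanted]
    if not candidates:
        return None
    return min(candidates, key=lambda c: abs(c[0] - position[0]) + abs(c[1] - position[1]))

def step_toward(position, target):
    """One move on the 8-neighborhood lattice toward the target."""
    dr = (target[0] > position[0]) - (target[0] < position[0])
    dc = (target[1] > position[1]) - (target[1] < position[1])
    return (position[0] + dr, position[1] + dc)

if __name__ == "__main__":
    cog = {(2, 2): "seed", (5, 5): "nest", (0, 1): "wall"}
    pos = (0, 0)
    target = choose_target(cog, pos, loaded=False)
    print(target, step_toward(pos, target))   # (2, 2) (1, 1)
```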

  26. ONTOLOGY AND ARCHITECTURE
  • The population manager acts as a middle layer between the agents in the population and the environment. It:
    • computes the energy value for the agents in the population, rewarding or taxing them according to their actions;
    • destroys the low-energy agents and replicates the high-energy ones;
    • performs the evolutionary operations, implementing the genetic interaction and applying mutations to individuals of the same population.
  [Figure: probability of agent destruction and replication.]

  27. ONTOLOGY AND ARCHITECTURE The client process computes the energy according to the results received from the server. The energy parameters have the same value for all the agents in the team and are set by the user when initializing the client. The current energy E of an agent is a positive value. Each agent starts with an energy E_initial. The energy decreases by a fixed amount E_step for each step made by the agent. There is an additional energy cost for a lost conflict (collision). When the agent succeeds in delivering resources to a nest of the team, it receives a fixed amount E_bonus for each resource unit (seed) it delivers.

  28. ONTOLOGY AND ARCHITECTURE If the energy falls below a threshold E_d, the agent may be destroyed with a given probability. If the energy rises above another threshold E_r, the agent may replicate. After replication, a new child agent is created with the energy E_initial. The energy of the parent agent decreases by the same amount E_initial. The parent can replicate again as long as its energy remains above E_r. The probability for replication has been chosen in a similar way.
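
The transcript does not reproduce the actual probability formulas, so the following sketch assumes simple linear ramps below E_d and above E_r; the functional forms, parameter names and numeric values are all assumptions.

```python
# Sketch of the energy bookkeeping and of destruction/replication decisions.
# The linear-ramp probabilities below are assumptions: the actual formulas
# used in the EIA prototype are not reproduced in the transcript.
import random

E_INITIAL, E_STEP, E_BONUS = 100.0, 1.0, 10.0
E_D, E_R = 20.0, 200.0          # destruction and replication thresholds

def update_energy(energy, moved=False, lost_collision=False, seeds_delivered=0,
                  collision_cost=2.0):
    if moved:
        energy -= E_STEP
    if lost_collision:
        energy -= collision_cost
    return energy + E_BONUS * seeds_delivered

def destruction_probability(energy):
    """Assumed: grows linearly from 0 at E_d to 1 at zero energy."""
    return 0.0 if energy >= E_D else min(1.0, (E_D - energy) / E_D)

def replication_probability(energy, e_max=2 * E_R):
    """Assumed: grows linearly from 0 at E_r to 1 at some maximum energy."""
    return 0.0 if energy <= E_R else min(1.0, (energy - E_R) / (e_max - E_R))

if __name__ == "__main__":
    e = update_energy(E_INITIAL, moved=True, seeds_delivered=2)   # 119.0
    print(e, destruction_probability(10.0), replication_probability(300.0))
    if random.random() < replication_probability(e):
        print("replicate: child starts with", E_INITIAL, "and the parent pays", E_INITIAL)
```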

  29. ONTOLOGY AND ARCHITECTURE The genotype is encoded in a bit string. The genotype includes the permanent attributes specific to an agent population. During each simulation step, there is a low probability that a mutation occurs to an agent chosen randomly in the population. A mutation randomly flips one of the bits of the encoded genotype. A crossover operation can occur between two agents from the same population, if they happen to be placed in adjacent locations. A double-point crossover operator over all the attributes encoded in the genotype is used. The probabilities for crossover and mutation are user-modifiable parameters. When an agent replicates, it creates a new agent having a copy of its genotype, except for possible mutations. The agent knowledge may or may not be genetically transmitted: the new agent can either start with blank maps or inherit the maps from its parent. The user selects the desired behavior for the whole population before the simulation begins. Genetic transmission of acquired features leads to the Baldwin effect.
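
A minimal sketch of the bit-flip mutation and double-point crossover operators described above, applied to genotype bit strings; the string length is arbitrary.

```python
# Sketch of the genetic operators on a genotype bit string:
# single bit-flip mutation and double-point (two-point) crossover.
import random

def mutate(genotype: str) -> str:
    """Flip one randomly chosen bit of the genotype."""
    i = random.randrange(len(genotype))
    flipped = "1" if genotype[i] == "0" else "0"
    return genotype[:i] + flipped + genotype[i + 1:]

def double_point_crossover(parent_a: str, parent_b: str):
    """Swap the segment between two random cut points of the two genotypes."""
    assert len(parent_a) == len(parent_b)
    i, j = sorted(random.sample(range(len(parent_a) + 1), 2))
    child_a = parent_a[:i] + parent_b[i:j] + parent_a[j:]
    child_b = parent_b[:i] + parent_a[i:j] + parent_b[j:]
    return child_a, child_b

if __name__ == "__main__":
    random.seed(0)
    print(mutate("10011101"))
    print(double_point_crossover("11111111", "00000000"))
```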

  30. Information Retrieval
  • IR basic stages:
    • formulating queries;
    • finding documents;
    • determining relevance.
  • Traditional IR systems:
    • static and centralized collections of directly accessible documents;
    • concerned only with formulating queries and determining relevance.
  • Finding documents on the Web:
    • millions of documents, distributed on many independent servers;
    • dynamic nature of the environment, updating of information;
    • structured as a graph in which documents are connected by hyperlinks.
  • AltaVista and Yahoo use indexing databases storing efficient representations of a large number of documents.
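
A minimal sketch of treating the Web as a hyperlink graph that an agent traverses while looking for relevant documents; the tiny in-memory graph and the relevance test are illustrative assumptions (no real fetching).

```python
# Sketch of the Web as a hyperlink graph explored by a search agent.
# The in-memory graph and the relevance test are illustrative assumptions.
from collections import deque

LINKS = {                       # document -> documents it links to
    "start.html": ["a.html", "b.html"],
    "a.html": ["c.html"],
    "b.html": ["c.html", "d.html"],
    "c.html": [],
    "d.html": [],
}
TEXT = {"c.html": "evolutionary agents", "d.html": "weather report"}

def find_relevant(start, query, max_docs=10):
    """Breadth-first walk of the hyperlink graph, keeping documents matching the query."""
    seen, relevant, frontier = {start}, [], deque([start])
    while frontier and len(seen) <= max_docs:
        doc = frontier.popleft()
        if query in TEXT.get(doc, ""):
            relevant.append(doc)
        for linked in LINKS.get(doc, []):
            if linked not in seen:
                seen.add(linked)
                frontier.append(linked)
    return relevant

if __name__ == "__main__":
    print(find_relevant("start.html", "agents"))   # ['c.html']
```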

  31. Information Retrieval – [Figure after Francisco Pereira and Ernesto Costa, 2000.]

  32. Information Retrieval – [Figure after Francisco Pereira and Ernesto Costa, 2000.]

  33. Information Retrieval – [Figure after Francisco Pereira and Ernesto Costa, 2000.]

  34. CONCLUSIONS
  • The paper presents preliminary results in investigating the concept of Evolutionary Intelligent Agents (EIA).
  • This concept brings together features of Intelligent Agents and of the Evolutionary / Genetic Algorithms and Genetic Programming approaches.
  • There are already strong enough reasons to believe that this new idea allows addressing highly complex real-life problems - ones involving chaotic disturbances, randomness, and complex nonlinear dynamics - that traditional algorithms have been unable to handle.
  • The EIAs have the potential to use the two main forces of adaptation: learning and evolution.
  • There are already several successful applications of EIA to problems such as:
    • Multiresolutional Conceptual Learning,
    • EIA based Web Information Retrieval,
    • EIA based Personalized Web English Language Teaching,
    • Intelligent Signal and Image Processing.
