
Vicki Allan 2008

Looking for students for two NSF-funded grants


Funded Projects 2008-2011

  • CPATH – Computing Concepts

    • Educational curriculum development

    • Looking for help in the creation of a new introductory course – USU 1360

  • COAL – Coalition Formation

    • Research in Multi-agent systems


CPATH

  • There is a need for more computer science graduates.

  • There is a lack of exposure to computer science.

  • Introductory classes are unattractive to many.

  • Women are not being attracted to computer science, despite factors that should appeal to them – good pay, flexible hours, interesting problems.


Create a library of multi-function Interactive Learning Modules (ILMs)

  • Showcase computational thinking

  • De-emphasize programming


The balls on the left are to be exchanged with the balls on the right by a sequence of moves. Any ball can move into an adjacent empty slot. Any ball can jump over a single neighbor to an empty slot.

  • Complexity

  • Algorithm design

  • Abstraction – general-purpose rules
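
To make the algorithm-design angle concrete, here is a minimal sketch (my own, not part of the course materials) that solves the ball-exchange puzzle with breadth-first search, assuming n balls of each color around a single empty slot:

```python
from collections import deque

def solve(n=3):
    """Breadth-first search for a shortest sequence of moves that exchanges
    n 'L' balls with n 'R' balls around a single empty slot '_'.
    A ball may slide into the adjacent empty slot, or jump over exactly one
    neighbor into the empty slot."""
    start = "L" * n + "_" + "R" * n
    goal = "R" * n + "_" + "L" * n
    prev = {start: None}
    frontier = deque([start])
    while frontier:
        state = frontier.popleft()
        if state == goal:
            path = []                       # reconstruct the move sequence
            while state is not None:
                path.append(state)
                state = prev[state]
            return path[::-1]
        empty = state.index("_")
        # A ball one or two slots away from the gap may move into it.
        for src in (empty - 2, empty - 1, empty + 1, empty + 2):
            if 0 <= src < len(state):
                board = list(state)
                board[empty], board[src] = board[src], "_"
                nxt = "".join(board)
                if nxt not in prev:
                    prev[nxt] = state
                    frontier.append(nxt)
    return None

for board in solve(3):
    print(board)
```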


Need Students

  • Good programmers to build the interactives, using Java or Flash

  • Ideas for how to revitalize undergraduate education

  • A TA for next semester to help with USU 1360


COAL

The second project involves research in multi-agent systems.



Monetary Auction

  • Object for sale: a dollar bill

  • Rules

    • Highest bidder gets it

    • Highest bidder and the second highest bidder pay their bids

    • New bids must beat old bids by 5¢.

    • Bidding starts at 5¢.

  • What would your strategy be?
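
As an aside (my own illustration, not from the slides), a small simulation of two stubborn bidders who each keep raising by the 5¢ minimum up to a personal limit shows why this auction is a trap: both the winner and the runner-up can end up paying more than the dollar is worth.

```python
def dollar_auction(limit_a, limit_b, prize=100, increment=5):
    """Simulate two stubborn bidders in the dollar auction (amounts in cents).
    Each keeps topping the other's bid by the minimum increment until the next
    bid would exceed a personal limit. Both the highest and the second-highest
    bidder pay their own bids; only the highest bidder receives the prize."""
    bids = {"A": 0, "B": 0}
    limits = {"A": limit_a, "B": limit_b}
    bidder, other = "A", "B"
    current = 0
    while current + increment <= limits[bidder]:
        current += increment
        bids[bidder] = current
        bidder, other = other, bidder
    # 'bidder' just declined to raise, so 'other' holds the highest bid.
    return {other: prize - bids[other], bidder: -bids[bidder]}

# Two bidders willing to go to $1.50 and $1.45: both lose money.
print(dollar_auction(limit_a=150, limit_b=145))   # {'A': -45, 'B': -140}
```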


Give Away

  • Bag of candy to give away

  • If everyone in the class says “share”, the candy is split equally.

  • If exactly one person says “I want it”, that person gets all the candy.

  • If more than one person says “I want it”, I keep the candy.
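
For completeness, a tiny function (my own, with an arbitrary bag_size parameter) that encodes the give-away rules:

```python
def give_away(responses, bag_size=1.0):
    """Outcome of the give-away game: everyone shares -> equal split;
    exactly one claimer -> that person gets the whole bag;
    more than one claimer -> the instructor keeps the candy."""
    claimers = [name for name, choice in responses.items() if choice == "I want it"]
    if not claimers:
        return {name: bag_size / len(responses) for name in responses}
    if len(claimers) == 1:
        return {name: bag_size if name == claimers[0] else 0.0 for name in responses}
    return {name: 0.0 for name in responses}

print(give_away({"Ann": "share", "Bo": "share", "Cy": "I want it"}))   # Cy gets everything
print(give_away({"Ann": "share", "Bo": "share", "Cy": "share"}))       # equal split
```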


The point?

  • You are competing against others who are as smart as you are.

  • If there is a “weakness” that someone can exploit to their benefit, someone will find it.

  • You don’t have a central planner who is making the decision.

  • Decisions happen in parallel.


Cooperation

  • We are hiring a new professor this year.

  • A committee of three people makes the decision.

  • The field has been narrowed to four candidates.

  • Each committee member has a different ranking of the candidates.

  • How do we make a decision?


Binary Protocol

One voter ranks c > d > b > a

One voter ranks a > c > d > b

One voter ranks b > a > c > d

In the binary protocol, alternatives are paired off and the majority winner of each pairing advances to meet the next alternative. Different pairing orders give different results:

winner(c, winner(a, winner(b, d))) = a

winner(d, winner(b, winner(c, a))) = d

winner(d, winner(c, winner(a, b))) = c

winner(b, winner(d, winner(c, a))) = b

Surprisingly, the order of pairing determines the winner!
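
A short script (my own, not from the slides) that reproduces the four outcomes above by running pairwise elimination over the three voters' rankings in different pairing orders:

```python
# The three voters' rankings, best first.
RANKINGS = [
    ["c", "d", "b", "a"],
    ["a", "c", "d", "b"],
    ["b", "a", "c", "d"],
]

def pairwise_winner(x, y):
    """Majority winner of a head-to-head vote between x and y."""
    votes_for_x = sum(1 for r in RANKINGS if r.index(x) < r.index(y))
    return x if 2 * votes_for_x > len(RANKINGS) else y

def binary_protocol(pairing_order):
    """Pairwise elimination: the winner of each contest meets the next alternative."""
    champion = pairing_order[0]
    for challenger in pairing_order[1:]:
        champion = pairwise_winner(champion, challenger)
    return champion

# The same electorate, four pairing orders, four different winners.
for order in (["b", "d", "a", "c"],   # winner(c, winner(a, winner(b, d))) = a
              ["c", "a", "b", "d"],   # winner(d, winner(b, winner(c, a))) = d
              ["a", "b", "c", "d"],   # winner(d, winner(c, winner(a, b))) = c
              ["c", "a", "d", "b"]):  # winner(b, winner(d, winner(c, a))) = b
    print(order, "->", binary_protocol(order))
```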


If you only wanted to find the first-place winner, could you just count the number of times each candidate was ranked first?

  • a > b > c > d
  • a > b > c > d
  • a > b > c > d
  • a > b > c > d
  • b > c > d > a
  • b > c > d > a
  • b > c > d > a

Counting first places gives a = 4 and b = 3, so a would win. But summing rank points (4 for first, 3 for second, and so on) gives a = 19, b = 24, c = 17, d = 10 – b is the more broadly preferred candidate.

Just counting first ranks isn’t enough.


Borda Protocol

Assigns an alternative |O| points for the highest preference, |O|-1 points for the second, and so on (|O| is the number of alternatives).

  • The counts are summed across the voters and the alternative with the highest count becomes the social choice.


Reasonable???


Borda Paradox

  • a > b > c > d
  • b > c > d > a
  • c > d > a > b
  • a > b > c > d
  • b > c > d > a
  • c > d > a > b
  • a > b > c > d

    Borda counts: a = 18, b = 19, c = 20, d = 13

Is this a good way?

Candidate d is the clear loser.


Borda Paradox – remove loser (d), winner changes

  • a > b > c
  • b > c > a
  • c > a > b
  • a > b > c
  • b > c > a
  • c > a > b
  • a > b > c

    Borda counts: a = 15, b = 14, c = 13

  • a > b > c > d
  • b > c > d > a
  • c > d > a > b
  • a > b > c > d
  • b > c > d > a
  • c > d > a > b
  • a > b > c > d

    Borda counts: a = 18, b = 19, c = 20, d = 13

When the loser (d) is removed, the previous second-worst alternative (a) becomes the winner!
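
A short check (my own code, not from the slides) of the Borda counts above, showing the winner flipping from c to a once the clear loser d is dropped:

```python
def borda(rankings):
    """Borda count: with |O| alternatives, first place earns |O| points,
    second place |O|-1, and so on; points are summed over all voters."""
    n = len(rankings[0])
    scores = {}
    for ranking in rankings:
        for position, alternative in enumerate(ranking):
            scores[alternative] = scores.get(alternative, 0) + (n - position)
    return scores

profile = [
    ["a", "b", "c", "d"], ["b", "c", "d", "a"], ["c", "d", "a", "b"],
    ["a", "b", "c", "d"], ["b", "c", "d", "a"], ["c", "d", "a", "b"],
    ["a", "b", "c", "d"],
]

print(borda(profile))      # {'a': 18, 'b': 19, 'c': 20, 'd': 13} -> c wins, d is the clear loser
without_d = [[x for x in r if x != "d"] for r in profile]
print(borda(without_d))    # {'a': 15, 'b': 14, 'c': 13} -> with d removed, a becomes the winner
```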


Conclusion

  • Finding the correct mechanism is not easy


Who Works Together in Agent Coalition Formation?

Vicki Allan – Utah State University

Kevin Westwood – Utah State University

Presented at CIA 2007, September 2007, Netherlands

(Work also presented in Hong Kong, Finland, Australia, California)


Overview

  • Tasks: Various skills and numbers

  • Agents form coalitions

  • Agent types - Differing policies

  • How do policies interact?


Multi-Agent Coalitions

  • “A coalition is a set of agents that work together to achieve a mutually beneficial goal” (Klusch and Shehory, 1996)

  • Reasons an agent would join a coalition

    • Cannot complete task alone

    • Complete task more quickly


Skilled Request For Proposal (SRFP) Environment

Inspired by RFP (Kraus, Shehory, and Taase 2003)

  • Provide a set of tasks T = {T1…Ti…Tn}

    • Divided into multiple subtasks

    • In our model, each subtask requires a skill at a given level

    • Has a payment value V(Ti)

  • Service agents, A = {A1…Ak…Ap}

    • Associated cost f_k of providing the service

    • In the original model, the ability to do a task is determined probabilistically – no two agents are alike.

    • In our model, each agent has a skill and a level

    • A higher skill level is more flexible (can do any task requiring a lower level of the skill)

Why this model?

  • Enough realism to be interesting

    • An agent with specific skills has realistic properties.

    • That a more skilled agent can work on more tasks, but is also more expensive, is realistic.

  • Not so much realism that it harms analysis

    • An agent can’t work on several tasks at once

    • An agent can’t alter its cost


Auctioning Protocol

  • Variation of a reverse auction

    • One “buyer”, lots of sellers

    • Agents compete for the opportunity to perform services

    • Efficient way of matching goods to services

  • Central manager (ease of programming) – sketched below

    1) Randomly orders the agents

    2) Each agent gets a turn

      • Proposes or accepts a previous offer

    3) Coalitions are awarded tasks

  • Multiple rounds {0,…,rz}

    Agent Costs by Level

    General upward trend

    • Agent cost

      • Base cost derived from skill and skill level

      • Agent costs deviate from the base cost

    • Agent payment

      • Cost + a proportional portion of the net gain

    Only change in coalition
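
A small numeric illustration of the payment rule, with my own numbers; the slides don't say what the share is proportional to, so proportionality to each member's cost is an assumption.

```python
def payments(task_value, costs):
    """Each coalition member is paid its cost plus a share of the net gain.
    The share is assumed proportional to the member's cost (the slides only
    say 'a proportional portion of the net gain')."""
    total_cost = sum(costs.values())
    net_gain = task_value - total_cost
    return {agent: cost + net_gain * cost / total_cost
            for agent, cost in costs.items()}

# A task paying 12 split among agents whose costs are 2, 3 and 1.
print(payments(12.0, {"A1": 2.0, "A2": 3.0, "A3": 1.0}))
# {'A1': 4.0, 'A2': 6.0, 'A3': 2.0}  -- the payments total the task value
```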


    How do I decide what to propose?


    The setup

    • Each task to choose from lists the skills needed and the total pay

    • List of agents – (skill, cost)

    • Which task will you choose to do?


    Decisions

    If I make an offer…

    • What task should I propose doing?

    • What other agents should I recruit?

      If others have made me an offer…

    • How do I decide whether to accept?


    Coalition Calculation Algorithms

    • Calculating all possible coalitions

      • Requires exponential time

      • Not feasible in most problems in which tasks/agents are entering/leaving the system

    • Divide into two steps

      1) Task selection

      2) Other agents selected for the team

      • Each step uses a polynomial-time algorithm


    Task Selection – 4 Agent Types

    • Individual Profit – obvious, greedy approach

      Competitive: best for me

      Why not always be greedy?

      • Others may not accept – your membership is questioned

      • Individual profit may not be your goal

    • Global Profit

    • Best Fit

    • Co-opetitive


    Task Selection – 4 Agent Types

    • Individual Profit

    • Global Profit – somebody should do this task

      I’ll sacrifice

      Wouldn’t this always be a noble thing to do?

      • Task might be better done by others

      • I might be more profitable elsewhere

    • Best Fit – uses my skills wisely

    • Co-opetitive


    Task Selection – 4 Agent Types

    • Individual Profit

    • Global Profit

    • Best Fit – Cooperative: uses skills wisely

      Perhaps no one else can do it

      Maybe it shouldn’t be done

    • Co-opetitive


    4th Type: Co-opetitive Agent

    • Co-opetition

      • Phrase coined by business professors Brandenburger and Nalebuff (1996) to emphasize the need to consider both competitive and cooperative strategies.

    • Co-opetitive task selection

      • Select the best-fit task if its profit is within P% of the maximum profit available (see the sketch below)
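
A sketch of the co-opetitive selection rule under my own assumptions about the inputs: each candidate task carries the agent's expected profit and a fit score, and "within P%" is read as "at least (1 − P) times the best available profit".

```python
def coopetitive_choice(candidate_tasks, p=0.10):
    """Co-opetitive task selection: among the tasks the agent could propose,
    prefer the best-fitting one as long as its profit is within p of the
    maximum profit available. Each candidate is (task_name, my_profit,
    fit_score); a higher fit_score means the task uses my skills more exactly."""
    best_profit = max(profit for _, profit, _ in candidate_tasks)
    acceptable = [t for t in candidate_tasks if t[1] >= (1 - p) * best_profit]
    # Among acceptably profitable tasks, pick the one that fits my skills best.
    return max(acceptable, key=lambda t: t[2])

tasks = [("T1", 10.0, 0.4), ("T2", 9.5, 0.9), ("T3", 6.0, 1.0)]
print(coopetitive_choice(tasks))   # T2: nearly as profitable as T1, much better fit
```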


    What about accepting offers?

    Melting – the same deal may be gone later

    • Compare to what you could achieve with a proposal of your own

    • Compare the best proposal with the best offer

    • Use a utility function based on agent type


    Some amount of compromise is necessary…

    We term the fraction of the total possible you demand the compromising ratio.
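
A minimal sketch of the acceptance test implied by the compromising ratio; the threshold form is my reading of "the fraction of the total possible you demand", and the utility itself would depend on the agent type.

```python
def accept_offer(offer_utility, ideal_utility, compromising_ratio):
    """Accept an offer if it delivers at least the demanded fraction (the
    compromising ratio) of the best utility the agent believes it could get
    from its own ideal proposal. The utility depends on the agent type
    (individual profit, global profit, best fit, co-opetitive)."""
    return offer_utility >= compromising_ratio * ideal_utility

print(accept_offer(offer_utility=7.0, ideal_utility=10.0, compromising_ratio=0.8))  # False
print(accept_offer(offer_utility=8.5, ideal_utility=10.0, compromising_ratio=0.8))  # True
```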


    Resources Shrink

    • Even in a task-rich environment, the number of tasks an agent has to choose from shrinks

      • Tasks get taken

    • The number of available agents shrinks as others are assigned


    Task Rich: 2 tasks for every agent

    My tasks parallel the total tasks


    Scenario 1 – Bargain Buy

    • Store “Bargain Buy” advertises a great price

    • 300 people show up

    • 5 in stock

    • Everyone sees the advertised price, but it just isn’t possible for all to achieve it


    Scenario 2 – Selecting a Spouse

    • Bob knows all the characteristics of the perfect wife

    • Bob seeks out such a wife

    • Why would the perfect woman want Bob?


    Scenario 3 – Hiring a New PhD

    • Universities ranked 1,2,3

    • Students ranked a,b,c

      Dilemma for the second-tier university:

    • An offer to an “a” student is likely to be rejected

    • While delaying for an acceptance, the “b” students are gone


    Effect of Compromising Ratio

    • equal distribution of each agent type

    • Vary compromising ratio of only one type (local profit agent)

    • Shows profit ratio = profit achieved/ideal profit (given best possible task and partners)


    Note how profit is affected by load

    Achieved/theoretical best


    Profit only of scheduled agents

    Only Local Profit agents change their compromising ratio, yet the others’ profit slightly increases too.


    Note

    • Demanding Local Profit agents reject the proposals of others.

    • They are blind about whether they belong in a coalition.

    • They are NOT blind to the attributes of others.

    • Their own proposals are fairly good.


    For every agent type, the most likely proposer was a Local Profit agent.


    No reciprocity: Coopetitive agents are eager to accept Local Profit proposals, but Local Profit agents do not accept Coopetitive proposals especially well.


    For every agent type, Best Fit is a strong acceptor, perhaps because it isn’t accepted well as a proposer.


    Load balance seems to affect roles

    Coopetitive agents function better as proposers to Local Profit agents in a balanced or task-rich environment.

    • When they have more choices, they tend to propose coalitions that Local Profit agents like

    • More tasks give a Coopetitive agent a better sense of its own profit potential

    Coopetitive agents look at fit as long as it isn’t too bad compared to profit.


    Agent rich: 3 agents/task

    In agent-rich environments, Coopetitive agents accept most proposals from agents like themselves.


    • Do agents generally want to work with agents of the same type?

      • Would seem logical as agents of the same type value the same things – utility functions are similar.

      • Coopetitive and Best Fit agents’ proposal success is stable with increasing percentages of their own type and negatively correlated to increasing percentages of agents of other types.



    What happens as we change the relative percentages of each agent type?

    • Interesting correlation with profit ratio.

    • Some agents do better and better as their dominance increases. Others do worse.


    Best Fit does better and better as it becomes more dominant in the set

    (Shows the relationship if all types are in equal percentages)

    Local Profit does better when it isn’t dominant


    So who joins and who proposes?

    • Agents with a wider range of acceptable coalitions make better joiners.

    • Fussier agents make better proposers.

    • However, the joiner/proposer roles are affected by the ratio of agents to work.


    Conclusions

    • Some agent types are very good in selecting between many tasks, but not as impressive when there are only a few choices.

    • In any environment, choices diminish rapidly over time.

    • Agents naturally fall into role of proposer or joiner.


    Future Work

    • Lots of experiments are possible

    • All agents are similar in what they value. What would happen if agents deliberately proposed bad coalitions?

