
Community of Practice



Presentation Transcript


  1. Community of Practice. Melanie Barwick, Ph.D., C.Psych., Health Systems Scientist, SickKids, Toronto. KTE CoP, September 25, 2008

  2. Highlights from the Literature: Value • In the short term, CoPs benefit the organization by: • Facilitating the identification of individuals with specific expertise • Fostering knowledge sharing across organizational and geographic boundaries • Improving the rate of implementation/uptake of evidence-based practices [1] • Improving the quality of research and practice • The long-term value to organizations includes: • Leveraging strategic plans • Increased retention of talent • Increased capacity for knowledge development • Knowledge-based partnerships [1] Barwick, Peters, Barwick, & Boydell (unpublished, February 26, 2008). Do ‘Communities of Practice’ Support Practice Change? Findings from a Pilot Study. 21st Annual Research Conference: A System of Care for Children’s Mental Health: Expanding the Research Base, Tampa, Florida.

  3. Highlights from the Literature: Value • In the short term, CoPs benefit the individual by: • Providing a safe environment for sharing problems • Reducing learning curves • Improving topical knowledge • Fostering interaction between junior and senior practitioners • Improving the quality of research and practice • The long-term value to individuals includes: • Providing a forum for expanding skills and expertise • Networking for staying up to date in the field • Enhanced professional reputation • Increased marketability and employability • Strengthened professional identity

  4. Structural Elements • CoPs can be small or large, long-lived or short-lived, co-located or distributed, homogeneous or heterogeneous, inside or across boundaries, spontaneous or intentional (purposeful), and unrecognized or organizationally endorsed. • They all share: • A domain • A community • A shared practice

  5. Key Organizational Design Factors • Clarity of purpose and core membership • Healthy infrastructure • Leadership • Organizational culture • Information systems • Human resource management • Community-building process • Results measurement (metrics)

  6. The CoP Model • A distributed community of practice • Overcomes barriers of time, geography, affiliation, culture • Can be both virtual (web-based) and situated

  7. Readiness for CoP • Is there top-level sponsorship? • Is there an existing sense of community within the targeted CoP? • Is there a sense of energy and passion around the community? • Is there a recognized need that the community can meet, thus providing value to the members and their organizations? • Is there a significant or critical issue facing the community that knowledge sharing can positively impact? (This implies that there is significant interest or urgency around the issue; these tend to focus on specific process topics.) • Are there resources (i.e., money and people) to support the community?

  8. Roles, Responsibility, Supports Roles, responsibilities and supports also need to be articulated and put in place at the front end. A support team is needed to provide the operational infrastructure, procedural guidelines, technical support, user support, and community support for the community. The support team provides training, deployment, and startup functions, as well as process and infrastructure support for communities.

  9. 14-Step Model • Initial concept formation • Core planning meeting • Draft community charter • Establish community structure • Inventory of knowledge assets • Organize the content within the community • Identify and develop new content • Identify content editors • Train content editors • Manage the content • Facilitate interaction among members • Market the community • Keep content current and relevant • Determine the effectiveness of the community Defense Acquisition University (2005). Community of Practice Implementation Guide.

  10. Sample Charter Template [1] • Community Name: (Identify the name of the community, e.g., Cancer Prevention CoP.) • Community Membership/Audience: (Identify the audiences/stakeholders the community is targeting or trying to attract, i.e., narrow down the list.) • Community Purpose/Intent: (Identify the purpose/intent of the community, e.g., the community is focused on documenting, sharing, and transferring best practices in cancer prevention.) • Type of Community or Knowledge Area: (Identify the type of virtual space that best supports the community’s purpose.) • Community Objectives: (Identify the community objectives, i.e., the specific areas/issues that the community is interested in addressing.) • Community Roles: (Identify by name the individuals who are filling roles.) Sponsor __________________________ Leader __________________________ Content Editor __________________________ • Critical Business Issues: (Identify the critical business issues faced by the community.) • Resources: (Identify the resources required to support the community, i.e., the organic resources that are available, the contractor support that is required, and any performance-engineered content that needs to be developed.) • Measures of Success: (List the measures of success as determined by the community during the workshop.) [1] Adapted from DAU 2005.

  11. Activity Metrics (Quantitative) • Website page views • New website accounts • New topics • New knowledge objects • New discussion forums • Member logins • Community page views • Number of times a knowledge object is viewed • Most-viewed knowledge objects • Membership growth trends • Contribution growth trends • How often users interact (face-to-face meetings, virtual discussions, etc.)
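
  As a rough illustration only (not part of the original deck), the sketch below tallies a few of these activity metrics from a hypothetical event log; the log format, field names, and event types are assumptions.

    # Minimal sketch: tallying a few activity metrics from a hypothetical event log.
    # The (user, event, timestamp) record format and event names are assumptions.
    from collections import Counter
    from datetime import datetime

    events = [  # hypothetical CoP website event log
        ("alice", "login",        "2008-08-03T09:15:00"),
        ("alice", "page_view",    "2008-08-03T09:16:00"),
        ("bob",   "contribution", "2008-08-10T14:02:00"),
        ("carol", "login",        "2008-09-01T11:30:00"),
        ("carol", "contribution", "2008-09-02T08:45:00"),
    ]

    member_logins = sum(1 for _, event, _ in events if event == "login")
    community_page_views = sum(1 for _, event, _ in events if event == "page_view")

    # Contribution growth trend: contributions per calendar month.
    contributions_per_month = Counter(
        datetime.fromisoformat(ts).strftime("%Y-%m")
        for _, event, ts in events
        if event == "contribution"
    )

    print("Member logins:", member_logins)
    print("Community page views:", community_page_views)
    for month, count in sorted(contributions_per_month.items()):
        print(f"Contributions in {month}: {count}")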

  12. Performance Metrics (Qualitative) Performance metrics indicate the value of the web tool to community members: • Usability: unsolicited, through on-line CoP feedback tools • Testimonials and other user feedback (e.g., examples of specific mistakes or problems that were avoided or solved, time saved, etc.): unsolicited, through CoP feedback tools, or solicited through various mechanisms: emails targeted at specific communities of the workforce; conference surveys; phone calls; in-person meetings; written forms; interviews; workshops; group meetings; focus groups of users (i.e., ask the users how the community has helped them) • Community of Practice Early Progress Checklist • Storytelling (e.g., anecdotes, insights, lessons learned, and actions)

  13. Community of Practice Early Progress Checklist • Does the community have a common purpose? Is the purpose compelling to leadership, prospective members, and their functional managers? • Is the common purpose aligned with sponsor and organizational strategies? • Is the right sponsorship in place, i.e., a respected leader who is willing to contribute to the community? • Does the Functional Sponsor(s) agree with the community’s scope, purpose, and membership? • Are Core Group Members and the Community Leader enthusiastic, content experts, and able to develop the community? • Do members’ Functional Managers agree that time away from the job is valuable? • Does the community have the right content experts to provide perspective and meaning to its membership? • Does the community have enough members to stay alive? • Are collaborative tools in place and easily accessible? Are members willing and able to use them? • Are needed resources available, e.g., meeting rooms, participation in conferences, travel dollars, conference fees, etc.?

  14. Metrics: Performance Metric Examples • Satisfaction of specific knowledge goals; • Reduction in hours needed to solve problems; • Reduction in planned or actual schedule hours; • Reduction in learning time; • Reduction in rework; • Improvement in speed of response; • Increase in innovative and breakthrough ideas; • Increase in reach to customers; • Reduction in the cost of supporting collaborative workspaces; • Transfer of best practices (tacit knowledge) from one member to another; • Adoption of best practices or innovations that were “not invented here”; • Reduction in redundancy of effort among members; • Avoidance of costly mistakes; • Reduction of specific costs due to superior knowledge resources or shared knowledge; • Increase in the productivity of knowledge workers; • Improvement in the quality of decision making; • Increase in user satisfaction with the ability to access knowledge.

  15. Engagement Jakob Nielsen describes the ratio of on-line participation as a 90-9-1 rule: • 90% of users are lurkers (i.e., they read or observe but don't contribute). • 9% of users contribute from time to time, but other priorities dominate their time. • 1% of users participate a lot and account for most contributions: it can seem as if they don't have lives, because they often post just minutes after whatever event they're commenting on occurs. • Wikipedia contributions and general Internet participation comply roughly with this rule. This breakdown does seem to be congruent with our observations regarding on-line communities. http://howardlenos.blogspot.com/2008/06/90-9-1-rule.html

  16. Engagement Re-label the participants: • Call the 1% "Knowledge Champions": people who excel at sharing knowledge. • Call the 9% "Knowledge Agents": people who readily connect people to information and are proactive in responding and interacting with knowledge flows. • The rest, the 90%, we'll label "Knowledge Users": valuable community participants who convert explicit information into solutions, products, and value.
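
  As a rough illustration of this segmentation (not something the presentation prescribes), the sketch below ranks members by contribution count and cuts at the top 1% and next 9%; the contribution data and the percentile thresholds are assumptions.

    # Sketch: label members as Knowledge Champions (top 1% of contributors),
    # Knowledge Agents (next 9%), and Knowledge Users (remaining 90%).
    # The cut-offs mirror the 90-9-1 rule; the data is hypothetical.
    def label_members(contributions):
        """contributions: dict mapping member name -> number of contributions."""
        ranked = sorted(contributions, key=contributions.get, reverse=True)
        n = len(ranked)
        champions_cut = max(1, round(0.01 * n))               # top 1%
        agents_cut = max(champions_cut + 1, round(0.10 * n))  # next 9%
        labels = {}
        for rank, member in enumerate(ranked):
            if rank < champions_cut:
                labels[member] = "Knowledge Champion"
            elif rank < agents_cut:
                labels[member] = "Knowledge Agent"
            else:
                labels[member] = "Knowledge User"
        return labels

    # Hypothetical contribution counts for a 100-member community.
    counts = {f"member{i:03d}": max(0, 60 - i) for i in range(100)}
    labels = label_members(counts)
    print(sum(v == "Knowledge Champion" for v in labels.values()))  # 1
    print(sum(v == "Knowledge Agent" for v in labels.values()))     # 9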

  17. Engagement – Dunbar’s Number Dunbar's number is the supposed cognitive limit to the number of individuals with whom any one person can maintain stable social relationships: the kind of relationships that go with knowing who each person is and how each person relates socially to every other person. [1] Proponents assert that group sizes larger than this generally require more restrictive rules, laws, and enforced policies and regulations to maintain stable cohesion. No precise value has been proposed for Dunbar's number, but a commonly cited approximate figure is 150. Dunbar's number was first proposed by British anthropologist Robin Dunbar, who theorized that "this limit is a direct function of relative neocortex size, and that this in turn limits group size ... the limit imposed by neocortical processing capacity is simply on the number of individuals with whom a stable inter-personal relationship can be maintained." On the periphery, the number also includes past colleagues, such as high school friends, with whom a person would want to reacquaint themselves if they met again. [2] [1] Gladwell, Malcolm (2000). The Tipping Point: How Little Things Make a Big Difference. [2] Dunbar, Robin. Grooming, Gossip, and the Evolution of Language.

  18. Engagement • The 90-9-1 rule holds true across sites. • Beyond the Dunbar number, trust is hard to maintain. • Have different strategies for different participation levels and group sizes. http://www.socialtext.net/ocu2008/index.cgi?

  19. Engagement Stimulate community engagement by: • Mapping the social network to identify the Knowledge Champions and Knowledge Agents. • Optimizing support and communications structures around the Knowledge Champions; they are the "collaboration core" of the community. • Empowering the Knowledge Agents by making sure they are solidly connected into the community and have full visibility and convenient contribution mechanisms. • Finally, providing the Knowledge Users with very low-barrier interaction mechanisms that align with their working contexts.
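
  One rough way to sketch the "map the social network" step is to build an interaction graph (for example, who replied to whom) and rank members by how many distinct people they interact with; the interaction data and the use of simple degree counts here are assumptions for illustration only, not the method the presentation describes.

    # Sketch: a crude social-network map from pairwise interactions (e.g., replies),
    # ranking members by degree (number of distinct interaction partners) as one
    # proxy for spotting candidate Knowledge Champions and Agents. Data is hypothetical.
    from collections import defaultdict

    interactions = [  # (member, member) pairs, e.g., who replied to whom
        ("alice", "bob"), ("alice", "carol"), ("alice", "dave"),
        ("bob", "carol"), ("eve", "alice"), ("dave", "bob"),
    ]

    neighbours = defaultdict(set)
    for a, b in interactions:
        neighbours[a].add(b)
        neighbours[b].add(a)

    by_degree = sorted(neighbours.items(), key=lambda kv: len(kv[1]), reverse=True)
    for member, partners in by_degree:
        print(f"{member}: {len(partners)} distinct interaction partners")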

  20. Metrics • Measuring for Success: A successful community generally has two hallmarks: a high level of interaction between the participants, and a growing body of valuable content. That's a wonderful end-state, but how do we assess the current state of collaboration? Here are some criteria critical to success: • Discovery - How easy is it for others to see what your community is currently doing or intends to do? • Participation - How easy is it for others to contribute to the community? • Promotion - How do you help others connect with your community and stay informed? • Production - How valuable are the contributions of the community? http://howardlenos.blogspot.com/search/label/Metrics

  21. Engagement • Christopher Allen, The Numbers Behind Trust, Online Community Unconference, June 18, 2008, Mountain View, California http://www.socialtext.net/ocu2008/index.cgi? • Key Takeaways: • 1) The 90-9-1 rule holds true across sites; participation is unequal. • 2) Beyond the Dunbar number, trust is hard to maintain. • 3) Have different strategies for different participation levels and group sizes. • Jakob Nielsen, in his article "Participation Inequality: Encouraging More Users to Participate", describes the ratio of on-line participation as a 90-9-1 rule: • 90% of users are lurkers (i.e., they read or observe but don't contribute). • 9% of users contribute from time to time, but other priorities dominate their time. • 1% of users participate a lot and account for most contributions: it can seem as if they don't have lives, because they often post just minutes after whatever event they're commenting on occurs. • He then goes on to describe how Wikipedia contributions and general Internet participation comply roughly with this rule. Although not mathematically conclusive, this breakdown does seem to be congruent with our observations regarding on-line communities. http://howardlenos.blogspot.com/2008/06/90-9-1-rule.html

  22. Engagement • Re-label the participants: • Call the 1% "Knowledge Champions": people who excel at sharing knowledge and evangelizing ideas and content. • Call the 9% "Knowledge Agents": people who readily connect people to information and are proactive in responding and interacting with knowledge flows. • The rest, the 90%, we'll label "Knowledge Users": valuable community participants who convert explicit information into solutions, products, and value. • Stimulate community engagement by: • Mapping the social network to identify the Knowledge Champions and Knowledge Agents. • Optimizing support and communications structures around the Knowledge Champions; they are the "collaboration core" of the community. • Empowering the Knowledge Agents by making sure they are solidly connected into the community and have full visibility and convenient contribution mechanisms. • Finally, providing the Knowledge Users with very low-barrier interaction mechanisms that align with their working contexts.

  23. Metrics Measuring for Success: A successful community generally has two hallmarks: a high level of interaction between the participants, and a growing body of valuable content. That's a wonderful end-state, but how do we assess the current state of collaboration? Here are some criteria critical to success: • Discovery - How easy is it for others to see what your community is currently doing or intends to do? • Participation - How easy is it for others to contribute to the community? • Promotion - How do you help others connect with your community and stay informed? • Production - How valuable are the contributions of the community? http://howardlenos.blogspot.com/search/label/Metrics
