Enterprise Information Architecture: Because Users Don’t Care About Your Org Chart

    • Fall 2007
  • Louis Rosenfeld
  • www.louisrosenfeld.com
About Me
  • Independent IA consultant and blogger (www.louisrosenfeld.com)
  • Founder, Rosenfeld Media, UX publishing house (www.rosenfeldmedia.com)
  • Work primarily with Fortune 500s and other large enterprises
  • Co-author, Information Architecture for the World Wide Web (1998, 2002, 2006)
  • Founder and past director, the Information Architecture Institute (www.iainstitute.org) and User Experience Network (www.uxnet.org)
  • Background in librarianship/information science
Seminar Agenda
  • Welcome/Introduction
  • Top-Down Navigation
  • Bottom-Up Navigation
  • Search
  • EIA and the Organization
    • Research methods
    • Governance and more
  • Discussion
Introduction: IA in one slide
  • Definition: the art and science of structuring, organizing and labeling information to help people find and manage information
    • Balances characteristics and needs of users, content and context
    • Top down (questions) & bottom up (answers)
Introduction: Only one IA rule
  • Pareto’s Principle (“the 80/20 rule”)
    • 20% of content satisfies 80% of users’ needs
    • 20% of possible IA options address 80% of content
    • 20% of IA options address 80% of users’ needs
  • IA’s goal: figure out which 20%
  • No other rules, just guidelines
What an Enterprise Is
  • Large, distributed, decentralized organization made up of multiple business units
  • Distributed
    • Functionally in many different “businesses” (e.g., HR vs. communications, or hardware vs. software)
    • Geographically
  • Decentralized
    • Large degree of authority and responsibility resides in hands of business units in practice (if not officially)
    • Business units often own significant infrastructure (technical, staff, expertise)
IA and EIA: The differences
  • The “enterprise challenge”: providing centralized access to information in a large, decentralized, distributed environment
  • Information often organized by business function (e.g., “org chart”), not in ways users think
  • Not “textbook” IA; highly dependent on business context
The Challenge of EIA: Competing trends
  • Trend toward autonomy
    • Cheap, easy-to-use democratizing technology
    • Human tendency toward autonomy
  • Trend toward centralization
    • Users’ desire for single-point of access
    • Management’s desire to control costs and communications
  • These tend to cancel each other out, getting us nowhere
  • Result: content “silos” and user confusion
Indicators of Problematic EIA: Intranet glitches
  • “How come I didn’t know your department was developing a product similar to ours?”
  • “Why couldn’t we find any relevant case studies to show that important prospect?”
  • “Why do our sales and support staff keep giving our customers inconsistent information?”
Indicators of Problematic EIA: External-facing site glitches
  • “Our customers think we’re still in the widget business; after all these M&As, why don’t they realize that we’ve diversified?”
  • “We have so many great products that go together; why don’t we cross-sell more?”
  • “Customers keep asking for product support through our sales channel; why don’t they use the site’s FAQs and tech support content?”
So How Do We Get There?
  • Let it go
    • There is no single solution
    • Redemption lies in phased, modular, evolving approaches that respect the 80/20 rule
  • Your friends
    • Straw men
    • Your colleagues and professional networks
  • This seminar provides straw men for
    • EIA design
    • EIA methods
    • EIA team design and governance
Top-Down Navigation Roadmap

  • Main page
  • Site hierarchy
  • Site map
  • Site index
  • Selective navigation
Top-Down Challenges
  • Top-down IA
    • Anticipates questions that users arrive with
    • Provides overview of content, entry points to major navigational approaches
  • Issues
    • What do we do about main pages?
    • Portals: the answer?
    • Other ways to navigate from the top down
    • The dangers of taxonomies
Portal Solutions: Why they fail 1/2
  • Organizational challenges
    • Fixation on cosmetic, political
    • Inability to enforce style guide changes, portal adoption
    • Lack of ownership of centralizing initiatives, or ownership in wrong hands (usually IT)
  • Information architecture challenges
    • Taxonomy design required for successful portal tool implementation
      • Always harder than people imagine
      • Taxonomies break down as they get closer to local content (domains become specialized)
Portal Solutions: Why they fail 2/2
  • Challenges for users
    • Portals are shallow (only one or two levels deep)
    • Poor interface design
    • Users don’t typically personalize
  • More in James Robertson’s “Taking a business-centric approach to portals” (http://www.steptwo.com.au/papers/kmc_businessportals/index.html)
Top-Down Navigation: Design approaches
  • Main pages
  • Supplementary navigation
    • Tables of contents
    • Site indices
    • Guide pages
  • Taxonomies for browsing
    • Varieties: product, business function, topical
    • Topic pages
Top-Down Navigation: Main pages
  • Often, 80% of EIA discussion is devoted to the main page
    • Important real estate
    • But there are other important areas
      • Navigational pages
      • Search interface
      • Search results
      • Page design (templates, contextual navigation)
  • Divert attention from main pages by creating alternatives, new real estate: supplementary navigation
Top-Down Navigation: Supplementary navigation
  • Examples
    • Site maps/TOC
    • Site indices
  • Benefits:
    • Create new real estate
    • Can evolve and drive evolution from org-chart centered design to user-centered design
    • Relatively low cost to initially implement
  • Drawbacks:
    • Often unwieldy for largest enterprises (not at IBM, Microsoft, failure at Vanguard)
Top-Down Navigation: Site maps
  • Condensed versions of site hierarchy
    • Hierarchical list of terms and links
    • Primarily used for site orientation
    • Indirectly cut across subsites by presenting multi-departmental content in one place
    • But still usually reflects org chart
  • Alternative plan
    • Use site map as test bed for migration to user-centric design
    • Apply card sorting exercises on second and third level nodes
    • Result may cut across organizational boundaries
Site Map: State of Nebraska

Majority of links reflect org chart

Site Map: State of Kentucky

Evolving toward more user-centered, topical approach

Top-Down Navigation: Site indices
  • Flat (or nearly flat) alpha list of terms and links
  • Benefits
    • Support orientation and known-item searching
    • Alternative “flattened” view of content
    • Can unify content across subsites
  • Drawbacks
    • Require significant expertise, maintenance
    • May not be worth the effort if table of contents and search are already available
  • Specialized indices may be preferable (shorter, narrower domain, focused audience)
Site Index: Am. Society of Indexers example
  • Full site index
    • ~1,000 entries for a smallish site
    • Too large to easily browse
    • Replace with search?
Specialized Site Index: CDC example
  • Not a full site index
  • Focuses on health topics
    • Narrow domain
    • Specialized terminology
    • Possibly still too large to browse
Specialized Site Index: PeopleSoft example
  • Product focus
    • A large undertaking at PeopleSoft
    • High value to users
Top-Down Navigation: Guides
  • Single page containing selective set of important links embedded in narrative text
  • Address important, common user needs
    • Highlight content for a specific audience
    • Highlight content on a specific topic
    • Explain how to complete a process
      • Can work as FAQs (and FAQs can serve as interface to guides)
  • Benefits
    • Technically easy to create (single HTML page)
    • Cut across departmental subsites
    • Gap fillers; complement comprehensive methods of navigation and search
    • Can be timely (e.g., news-oriented guides, seasonal guides)
    • Minimize political headaches by creating new real estate
Top-Down Navigation: Topic Pages
  • “Selective taxonomy improvement”
    • Portions of a taxonomy that expand beyond navigational value
    • Help knit together enterprise content deeper down in taxonomy
  • New “real estate” can be used by
    • Individual business units (to reduce pressure on main page) or…
    • Cross-departmental initiatives
Topic Pages: CDC example

Subtopics now comprise only a small portion of page

Top-Down Navigation: Taxonomies & portals
  • Can a single taxonomy unify an enterprise site?
    • First: can one be built at all?
    • Software tools don’t solve problems (see metadata discussion)
  • Approaches
    • Multiple taxonomies that each cover a broad swath of enterprise content: audience, subject, task/process, etc.
    • “Two-step” approach:
      • Build shallow, broad taxonomy that will answer “where will I find the information I need?”
      • Rely on subsite taxonomies to answer “where in this area will I find the information I need?”
Top-Down Navigation: Impacts on the enterprise
  • Potential of “small steps” around which to build more centralized enterprise efforts
    • Site map and site index creation and maintenance
    • Guide and topic page creation and maintenance
    • Large editorial role, minimal technical requirements for both
  • May be preferable to tackle more ambitious areas much later
    • Developing and maintaining top-level taxonomy
    • Connecting high-level and low-level taxonomies
Top-Down Navigation Roadmap

  • Main page
  • Site hierarchy
  • Site map
  • Site index
  • Selective navigation
Top-Down Navigation Takeaways
  • Main pages and portals: Bypass for now, add guides over time
  • Site hierarchy/taxonomy: Start shallow, "simple" (e.g., products); add progressively harder taxonomies (work toward faceted approach)
  • Site map/ToC: Use as a staging ground for a more topical approach
  • Site index: Move from generalized to specialized around a single topic, or augment with frequent search queries/best bets work
  • Guides: Start with a handful, then expand and rotate based on seasonality or other criteria of relevance
Bottom-Up Navigation Roadmap

  • Content modeling
  • Metadata development
  • Metadata tagging
Bottom-Up Navigation: The basics
  • Focuses on extracting answers from content
    • How do I find my way through this content?
    • Where can I go from here?
  • Goals
    • Answers “rise to the surface”
    • Leverage CMS for reuse and syndication of content across sites and platforms
    • Improve contextual navigation
    • Increase the effectiveness of search
Content Modeling: The heart of bottom-up navigation
  • Content models
    • Used to convey meaning within select, high-value content areas
    • Accommodate inter-connectedness
  • Same as data or object modeling? Absolutely not!
    • Many distinctions between data and semi-structured text
    • Text makes up majority of enterprise sites
Content Modeling: The basics
  • Based on patterns revealed during content inventory and analysis
  • What makes up a content model?
    • Content objects
    • Metadata (attributes and values)
    • Contextual links
  • Applies to multiple levels of granularity
    • Content objects
    • Individual documents
Content Modeling: We’re already doing it at page level

album page = title/artist/release + tracks + cover image
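To make the model concrete, here is a minimal sketch in Python (not from the original deck; the field names are illustrative) of the album page as structured content objects:

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    number: int
    title: str

@dataclass
class AlbumPage:
    # album page = title/artist/release + tracks + cover image
    title: str
    artist: str
    release_year: int
    tracks: list = field(default_factory=list)
    cover_image: str = ""

page = AlbumPage(
    title="Kind of Blue",
    artist="Miles Davis",
    release_year=1959,
    tracks=[Track(1, "So What"), Track(2, "Freddie Freeloader")],
    cover_image="covers/kind-of-blue.jpg",
)
print(f"{page.title} by {page.artist} ({len(page.tracks)} tracks)")
```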

Content Modeling: Content analysis reveals patterns

  • album pages
  • artist bios
  • artist descriptions
  • album reviews
Content Modeling: Answer some questions
  • What contextual navigation should exist between these content objects? (see Instone’s “Navigation Stress Test”: http://user-experience.org/uefiles/navstress/)
  • Are there missing content objects?
  • Can we connect objects automatically?

Content Modeling: Fleshing out the model

  • concert calendar
  • album pages
  • artist descriptions
  • TV listings
  • album reviews
  • discography
  • artist bios
Content Modeling: Problematic borders

(Diagram: the same seven content objects as above, with problematic borders between some of them.)
Content Modeling: When to use
  • Use only for high value content
  • High value content attributes based on users, content, context, including
    • High volume
    • Highly dynamic
    • Consistent structure
    • Available metadata
    • Available content management infrastructure
    • Willing content owners
  • Much content can and will remain outside formal content models
Content Modeling: Steps for developing a model
  • Determine key audiences (who’s using it?)
  • Perform content inventory and analysis (what do we have?)
  • Determine document and object types (what are the objects?)
  • Determine metadata classes (what are the objects about?)
  • Determine contextual linking rules (where do the objects lead us to next?)
Content Modeling: Content object types 1/2
  • List known object types
  • For each audience:
    • Are there types that don’t fit?
      • Examples: company executive bios, Q&A columns
      • Venue reviews may be part of a separate content model
Content Modeling: Content object types 2/2
  • For each audience (continued):
    • Gap analysis: are there types missing that users might expect?
      • Examples: Gig reviews, Buy the CD, Links to music in the same genre
    • Which types are most important to each audience?
      • Fans of the band: Interviews with the band members
      • Casual listener: Samples of the CD tracks
Content Modeling: Metadata 1/2
  • Determine which objects would benefit from metadata
  • Develop three types of metadata
    • Descriptive
    • Intrinsic
    • Administrative
Content Modeling: Metadata 2/2
  • Aim to balance utility and cost
    • Answer most important questions: who, what, where, why, when, how?
    • Cost-benefit analysis
    • Development and maintenance costs of controlled vocabularies/thesauri
    • Ability of in-house staff to apply properly
Content Modeling: Contextual linking rules
  • Are there specific objects for which these questions arise again and again?
    • Where would I go from here?
    • What would I want to do next?
    • How would I learn more?
  • You have a rule if
    • The questions apply consistently
    • The answers work consistently
    • Metadata can be leveraged to connect questions and answers
  • Unidirectional links or bidirectional?
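As a rough illustration (hypothetical data, not the deck's own example code), a contextual linking rule might connect content objects wherever a shared metadata value answers "where would I go from here?":

```python
# Hypothetical content objects, each tagged with an "artist" metadata value.
album_pages = [
    {"type": "album page", "title": "Kind of Blue", "artist": "Miles Davis"},
]
album_reviews = [
    {"type": "album review", "title": "Review: Kind of Blue", "artist": "Miles Davis"},
    {"type": "album review", "title": "Review: A Love Supreme", "artist": "John Coltrane"},
]

def contextual_links(source_objects, target_objects, key="artist"):
    """Link rule: from each source object, point to every target object
    that shares the same metadata value (here: the artist)."""
    links = {}
    for src in source_objects:
        links[src["title"]] = [
            tgt["title"] for tgt in target_objects if tgt[key] == src[key]
        ]
    return links

# Unidirectional: album page -> matching reviews
print(contextual_links(album_pages, album_reviews))
```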
Content Modeling: Impacts on the enterprise
  • Content models are a means for tying together content across business unit boundaries
  • Content modeling is modular; over time, content models can be connected across the enterprise
  • Major benefits to users who get beyond main page
  • Can help justify CMS investments
  • Not all content areas and owners are appropriate to work with
CMS Selection: EIA needs
  • Support metadata management (e.g., Interwoven)
  • Support shared metadata workflow
    • Author creation/submission/tagging (distributed)
    • Editorial tagging (centralized)
    • Editorial review (centralized)
  • Ability to support contextual linking logic
Metadata: What is metadata?
  • Data about data
  • Information that describes a document, a file, or a CD
  • Common metadata
    • CD information: title, composer, artist, date
    • MS Word document properties: time last saved, company, author
Metadata: Three types
  • Intrinsic: metadata that an object holds about itself (e.g., file name or size)
  • Descriptive: metadata that describes the object (e.g., subject, title, or audience)
  • Administrative: metadata used to manage the object (e.g., time last saved, review date, owner)
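A minimal sketch (made-up values) of the three metadata types attached to one document:

```python
# The three metadata types for one document; all values are invented.
document_metadata = {
    "intrinsic": {          # what the object holds about itself
        "file_name": "q3-report.doc",
        "file_size_kb": 412,
    },
    "descriptive": {        # what the object is about and who it is for
        "subject": "quarterly sales",
        "title": "Q3 Sales Report",
        "audience": "managers",
    },
    "administrative": {     # how the object is managed
        "last_saved": "2007-09-14",
        "review_date": "2008-03-01",
        "owner": "finance",
    },
}
print(document_metadata["descriptive"]["subject"])
```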
Metadata: Common sources
  • Vocabularies from other parts of your organization (e.g., research library)
  • Competitors
  • Commercial sources (see www.taxonomywarehouse.com)
  • Your site’s users
    • Search analytics
    • Folksonomies
    • User studies (e.g., free listing, card sorting)
Metadata: Value for the Enterprise 1/2
  • Search: cluster or filter the search by metadata, like title or keyword
  • Browse: create topical indexes by aggregating pages with the same metadata
  • Personalization and customization: show content to an employee based on their role or position in the company, e.g. engineer or manager
Metadata: Value for the Enterprise 2/2
  • Contextual linking: create relationships between individual or classes of content objects (e.g., cross-marketing on llbean.com)
  • The purpose is to connect
    • Content to content
    • Users to content
  • To provide value, metadata requires consistency (structural and semantic)
Metadata: Scaling problems
  • Barriers to enterprise metadata development:
    • Volume of metadata vocabularies/silos
    • Complexity of semantic relationships (beyond synonyms)
Metadata: Structural consistency
  • Standard formats and approaches enable interoperability, which enables sharing of metadata.
  • Examples
    • RDF (Resource Description Framework)
    • Topic Maps
    • Dublin Core
    • OAI (Open Archives Initiative)
  • Sources
    • Academia/scholarly publishing world
    • Little from data management world
Metadata: RDF (Resource Description Framework)
  • A syntax for expressing semantic relationships
  • Basic components
    • Resource
    • Property type
    • Value
    • Property (a specific resource/property type/value statement)

From Andy Powell: http://www.ukoln.ac.uk/metadata/presentations/ukolug98/paper/intro.html
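As a sketch of the underlying model (URIs and values are made up for illustration), each RDF statement can be thought of as a resource/property type/value triple:

```python
# Each RDF statement pairs a resource with a property type and a value.
# URIs and values below are illustrative only.
triples = [
    ("http://example.com/intro.html", "DC.Creator", "Andy Powell"),
    ("http://example.com/intro.html", "DC.Title", "An introduction to metadata"),
]
for resource, property_type, value in triples:
    print(f'<{resource}> {property_type} "{value}"')
```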

Metadata: Topic Maps
  • Potential syntax for content modeling, semantic webs
  • Most simply, made up of topics (e.g., “Lucca”, “Italy”), occurrences (e.g., “map”, “book”), and associations (e.g., “…is in…”, “…written by…”)
  • Source: Tao of Topic Maps, Steve Pepper (http://www.ontopia.net/topicmaps/materials/tao.html)

Metadata: The Dublin Core
  • A schema for expressing semantic relationships
  • Can use HTML or RDF syntax
  • Useful tool (or model) for creating document surrogates (e.g., Best Bet records)
  • A standard, but not a religious one
    • Selecting fewer attributes may be a necessity in enterprise environment
    • Attribute review can be useful as an enterprise-wide exercise
Metadata: Dublin Core elements 1/2
  • Title: A name given to the resource
  • Creator: An entity primarily responsible for making the content of the resource
  • Subject: A topic of the content of the resource
  • Description: An account of the content of the resource
  • Publisher: An entity responsible for making the resource available
  • Contributor: An entity responsible for making contributions to the content of the resource
  • Date: A date of an event in the lifecycle of the resource
Metadata: Dublin Core elements 2/2
  • Type: The nature or genre of the content of the resource
  • Format: The physical or digital manifestation of the resource
  • Identifier: An unambiguous reference to the resource within a given context
  • Source: A reference to a resource from which the present resource is derived
  • Language: A language of the intellectual content of the resource
  • Relation: A reference to a related resource
  • Coverage: The extent or scope of the content of the resource
  • Rights: Information about rights held in and over the resource
Metadata: Dublin Core in HTML
  • Dublin Core elements identified with “DC” prefix

From Andy Powell: http://www.ukoln.ac.uk/metadata/presentations/ukolug98/paper/intro.html
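The slide's screenshot is not reproduced here; as a stand-in, this sketch prints Dublin Core elements in their conventional HTML <meta> form (values invented for illustration):

```python
# Dublin Core elements carried in HTML <meta> tags, using the "DC" prefix.
# Element values are invented for illustration.
dc_elements = {
    "DC.Title": "Enterprise Information Architecture",
    "DC.Creator": "Louis Rosenfeld",
    "DC.Subject": "information architecture; enterprise",
    "DC.Date": "2007",
}
for name, content in dc_elements.items():
    print(f'<meta name="{name}" content="{content}">')
```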

Metadata: Dublin Core and RDF
  • Syntax and schema combination is useful
  • But where are the metadata values?

From Andy Powell: http://www.ukoln.ac.uk/metadata/presentations/ukolug98/paper/intro.html

Metadata: OAI and metadata harvesting
  • OAI: Open Archives Initiative
    • Comes from academic publishing world
    • Provides means for central registration of “confederate repositories”
    • Repositories use Dublin Core; requests between service and data providers via HTTP; replies (results) encoded in XML
  • Metadata harvesting
    • Enables improved searching across compliant distributed repositories
    • Does not address semantic merging of metadata (i.e., vocabulary control)
Metadata: Semantic consistency 1/2
  • Provided through controlled vocabularies.
  • What is a controlled vocabulary?
    • A list of preferred and variant terms
    • A subset of natural language
  • Why control vocabulary?
    • Language is ambiguous
    • Synonyms, homonyms, antonyms, contronyms, etc. (e.g., truck, lorry, semi, pickup, ute)
Metadata: Semantic consistency 2/2
  • Control vocabulary…so your users don’t have to!
Metadata: Semantic relationships
  • Three types
    • Equivalence: Variant terms with same meaning (e.g., abbreviations and synonyms)
    • Hierarchical: Broader term, narrower term relationships
    • Associative: Terms related to each other in ways that are neither equivalent nor hierarchical
Metadata: Synonym rings
  • Used in many search engines to expand the number of results
  • Words that are similar to each other are linked together
  • Example for a multinational company
    • Annual leave (Australia), the holidays (US), public holidays (Australia, US), vacation (US), bank holidays (UK), holiday (Australia and UK), personal leave (all)
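A minimal sketch of how a search engine might use such a ring to expand a query (terms from the example above; the implementation is illustrative):

```python
# Synonym ring: if a query matches any term, search on all of them.
ring = {"annual leave", "the holidays", "public holidays", "vacation",
        "bank holidays", "holiday", "personal leave"}

def expand_query(query: str) -> set:
    """Return the full ring when the query is a member, else the query alone."""
    q = query.lower().strip()
    return set(ring) if q in ring else {q}

print(expand_query("vacation"))  # all seven variants are searched
```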
Metadata: Authority files
  • Pick list of the authorized words to use in a field
  • Can have some equivalence relationships
  • Example using authors
    • Poe, Edgar Allan--USE FOR Poe, E.A.
    • Poe, E.A.--USE Poe, Edgar Allan
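The same idea as a lookup table (a minimal sketch; entries taken from the example above):

```python
# Authority file: variant forms map to the single authorized term.
use = {"Poe, E.A.": "Poe, Edgar Allan"}   # USE relationships

def authorized_form(name: str) -> str:
    """Return the authorized term for a variant, or the name unchanged."""
    return use.get(name, name)

print(authorized_form("Poe, E.A."))         # -> Poe, Edgar Allan
print(authorized_form("Poe, Edgar Allan"))  # already authorized
```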
Metadata: Classification schemes
  • Classification
    • Systematic arrangement of knowledge, usually hierarchical
    • Placement of objects into a scheme which makes sense to the user and relates them to other objects
  • Two types of classification schemes
    • Enumerative classification: hierarchical organization into which objects are placed
    • Faceted classification: organization by facets or attributes that describe the object
Metadata: Enumerative classification
  • Well suited to classifying small numbers of objects, or objects that can live in only one place
  • Provides good browsing structure
  • Can be polyhierarchical, where objects live in many places
  • Best known: the taxonomy of life, Dewey Decimal Classification, Library of Congress Classification
  • Most familiar on the Web: Yahoo!, Open Directory
Metadata: Faceted classification 1/2
  • Describes the object with numerous facets or attributes
  • Each facet could have a separate controlled vocabulary of its own
  • Can mix and match the facets to create a browsing structure
  • Easier to manage the controlled vocabularies
Metadata: Faceted classification 2/2
  • Facets for a roast chicken recipe
    • Preparation: Roast / bake
    • Main ingredient: Chicken
    • Course: Main dish
  • Drawbacks of faceted classification
    • Too many facets attached to an object can make indexing hard to do
    • Browsing facets may not be as clear as browsing a hierarchy; many paths to the same object
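A minimal sketch (hypothetical data) of mix-and-match browsing over the recipe facets above:

```python
# Faceted classification: recipes described by facets; browsing mixes filters.
recipes = [
    {"name": "Roast chicken", "preparation": "roast/bake",
     "main_ingredient": "chicken", "course": "main dish"},
    {"name": "Chicken soup", "preparation": "simmer",
     "main_ingredient": "chicken", "course": "starter"},
]

def browse(**facets):
    """Return recipe names matching every requested facet value."""
    return [r["name"] for r in recipes
            if all(r.get(f) == v for f, v in facets.items())]

print(browse(main_ingredient="chicken", course="main dish"))  # ['Roast chicken']
```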
Metadata: What is a thesaurus?

Traditional use

  • Dictionary of synonyms (Roget’s)
  • From one word to many words

Information retrieval context

  • A controlled vocabulary in which equivalence, hierarchical, and associative relationships are identified for purposes of improved retrieval
  • From many words to one word
Enterprise Metadata: Challenges
  • Two barriers to enterprise metadata:
    • Structural: interoperability
    • Semantic: merging, which enables controlled vocabularies to work as a whole
  • Interoperability must come before merging (merging requires knowing which vocabularies to merge)
  • Few standards in use
Enterprise Metadata: Structural approaches
  • If directly marking up documents, this approach is probably impractical in the enterprise
  • Better uses:
    • Limited high value documents (e.g., content models)
    • Document surrogates (e.g., Best Bet records)
Enterprise Metadata: Merging vocabularies
  • Extremely difficult, and currently rare
  • Mostly found in libraries, academia, scholarly publishing, and other resource-poor environments
  • Examples, hard to hardest
    • Cross-walking vocabularies
    • Switching vocabularies
    • Meta-thesaurus
    • Single thesaurus
Merging Vocabularies: Vocabulary cross-walking
  • Map terms peer-to-peer between individual vocabularies
    • Primarily handles synonyms, not relationships
    • Can be handled manually or through automated means (pattern-matching)
  • Doesn’t scale well beyond two or three vocabularies
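A rough sketch of a peer-to-peer cross-walk (made-up terms), with a note on why it scales poorly:

```python
# Cross-walk: map terms directly between two vocabularies (synonyms only,
# not hierarchical or associative relationships). Terms are made up.
hr_to_comms = {
    "personnel": "staff",
    "remuneration": "pay",
}

def crosswalk(term: str) -> str:
    return hr_to_comms.get(term, term)

print(crosswalk("personnel"))  # -> staff
# Scaling problem: n vocabularies need on the order of n*(n-1)/2 pairwise maps,
# which is why this approach breaks down beyond two or three vocabularies.
```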
Merging Vocabularies: Switching vocabulary
  • A single vocabulary that maps to existing vocabularies (primarily synonyms)
  • Similar to cross-walking, but better at handling translation when there are more than two or three vocabularies to connect
Merging Vocabularies: Meta-thesaurus
  • A switching vocabulary which also includes thesaural relationships (essentially a thesaurus of thesauri)
  • Example: National Library of Medicine’s UMLS (Unified Medical Language System)
    • Merges over 100 vocabularies
    • Describes fairly homogeneous domain (medical literature) for fairly homogeneous audience (health science professionals)
Merging Vocabularies: Single unified thesaurus
  • Highly impractical in enterprise context
Enterprise Metadata: Impacts on the enterprise 1/2
  • Requires coordinated strategy to ensure:
    • Structural interoperability from the start
    • Semantic mergability over time
    • Vocabulary control and maintenance through both manual and automated means
    • A workflow model and policies to support:
      • Decentralized tagging and vocabulary updating (through suggestions of new terms)
      • Centralized review and maintenance
Enterprise Metadata: Impacts on the enterprise 2/2
  • “Serious metadata” is beyond the means of most enterprises
    • Encourage local (e.g., departmental) vocabulary development
    • Provides organizational learning and local benefit
    • Enterprise-wide, start with “easier” vocabularies; work your way to harder ones over time; suggested sequence:
      • Business functions
      • Products
      • Topics
Bottom-Up Navigation Roadmap

  • Content modeling
  • Metadata development
  • Metadata tagging
Bottom-Up Navigation Takeaways 1/3

Content models

  • Use to support contextual navigation
  • Apply only to homogeneous, high-value content
  • Won't transfer easily across silos and will require significant metadata development
Bottom-Up Navigation Takeaways 2/3

Metadata development

  • Distinguish attributes (and structural interoperability) from values (and semantic merging)
  • Costs and value both increase as these increase:
    • Complexity of relationships between terms (equivalence=>hierarchical=>associative)
    • Level of control (synonym rings=>authority files=>classification schemes=>thesauri)
  • Think small: facets instead of a single taxonomy
Bottom-Up Navigation Takeaways 3/3

Metadata tagging

  • Make choices based on actual needs (e.g., content models) rather than exhaustive indexing
  • Consider costs of application and upkeep
    • Need for professional expertise
    • Metadata is a moving target that matches other moving targets (users and content)
EIA and Search
  • Search systems are a natural enterprise IA tool
    • Automated
    • Crawls what you tell it to
    • Doesn’t care about politics
  • Problems with shrink-wrapped search tools
    • Default settings, IT ownership minimize customization to fit the enterprise’s needs
    • Results often not relevant, poorly presented
  • Customization is the answer
    • Within the realm of your team’s abilities
    • … and if IT will allow it!
Enterprise Search Design: Potential improvements

Our focus:

  • Clear interface
  • Enhanced queries
  • Improved results (relevance & presentation)

(Diagram: basic search system components)

Enterprise Search Roadmap

  • Search interface
  • Search queries
  • Search results
Search Interface Design: The “Box”
  • The “Box” unifies IBM.com
  • Consistent:
    • Placement
    • Design
    • Labeling
    • Functionality
Search Interface Design: Combine interfaces when possible
  • Two boxes bad, one box good, usually…
  • Will users understand?
Search Interface Design: The role of “advanced search” 1/2

Not a likely starting point for users who are searching

Search Interface Design: The role of “advanced search” 2/2
  • Suggestions
    • Use for specialized interfaces
    • Reposition as “Revise Search”
    • Don’t bother
Search Interface and Queries: Functionality and visibility
  • Hide functionality? Consider the “Google Effect,” human nature, and the LCD (lowest common denominator)
  • Don’t hide it?
    • Not if users expect it
      • Legacy experience (e.g., Lexis-Nexis users)
      • Specialization (e.g., patent searchers)
    • Not if content allows/requires it
      • Specialized content and applications (e.g., staff directory)
The Query: Query language considerations
  • Natural language
    • Usually don’t show up in search logs
    • Low priority, but nice to support
  • Operators (Booleans, proximity, wild cards)
    • Booleans: use default “AND” for multi-term queries
      • Less forgiving than treating as phrase, more selective than “OR”
      • Most retrieval algorithms will find results for just one term
      • Rely on other approaches (e.g., filtering, clustering, Best Bets) to reduce search results overload
    • Low priority: Proximity operators (e.g., “enterprise (W3) architecture”), wild cards (e.g., “wom*n”)
The Query: Query building considerations
  • Large potential benefits to improving “intelligence” behind search queries
    • Adding semantic richness to queries allows for stronger searches without “touching” content
    • Overrides “enterprise bias” embedded in content
    • A centralized (enterprise-wide) process
  • Query building approaches
    • Spell checking: can be automated
    • Stemming: can be automated
    • Concept searching: requires manual effort
    • Synonyms (via thesaurus): requires manual effort, but no need to be comprehensive
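A minimal sketch of such a query-building pipeline (toy data and a deliberately naive stemmer; a real system would use a production spell checker, a Porter-style stemmer, and a maintained thesaurus):

```python
# Query building: spell-check, stem, and expand with thesaurus synonyms
# before the query reaches the search engine. All data is illustrative.
spelling = {"focis": "focus"}
synonyms = {"truck": ["lorry", "semi", "pickup"]}

def stem(term: str) -> str:
    # Toy stemmer for illustration only.
    for suffix in ("ing", "es", "s"):
        if term.endswith(suffix) and len(term) > len(suffix) + 2:
            return term[: -len(suffix)]
    return term

def build_query(raw: str) -> list:
    terms = []
    for word in raw.lower().split():
        word = spelling.get(word, word)       # spell checking
        word = stem(word)                     # stemming
        terms.append(word)
        terms.extend(synonyms.get(word, []))  # thesaurus expansion
    return terms

print(build_query("trucks"))  # -> ['truck', 'lorry', 'semi', 'pickup']
```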
Stemming: IBM example

IBM uses Fast Search

Enterprise Search Interface: Guidelines
  • Hide functionality on initial enterprise-wide search
  • Cast the net widely: rely on query builders to generate larger, higher quality result sets
  • Use filtering/clustering to narrow
  • Use Best Bets to ensure strong initial results
Individual Search Results: Goals
  • Enable users to quickly understand something about each document represented
  • That “something”: confirm that a known-item has been found, or distinguish from other results
  • Align to searching behaviors (determined through user testing, persona/scenario analysis, local site search analytics)
    • Known-item
    • Open-ended/exploratory
    • Comprehensive research
Individual Search Results: Approaches
  • Basic approaches
    • Document titling
    • Displaying appropriate elements for each result
  • These approaches have value in any context, but especially useful in enterprise setting
Document Titling: DaimlerChrysler example
  • What do these document titles tell you?
  • And what do they tell you about DaimlerChrysler?
Document Titling: Ford example
  • Descriptive document titles provide clear value
  • …but rely upon highly centralized authoring procedures and style guide
Displaying Appropriate Elements: 1) Determine common elements
  • Develop table of available elements (including metadata) for disparate documents and records
    • Comes after content inventory and analysis
  • Develop table of common elements
    • Collapse similar elements (e.g., creator derived from author, artist, source…)
    • Consider Dublin Core as model
    • Include bare minimum elements (e.g., title and description)
Displaying Appropriate Elements: 2) Select appropriate elements
  • Choose common elements which match most common searching behaviors
    • Known-item
    • Open-ended
    • Comprehensive research
    • Etc.
  • Considerations
    • Which components are decision or action based?
    • Which components are of informational value only?
  • Display these elements for each search result
Individual Search Results: Columbia University example
  • Long display for open-ended searchers…
  • …shorter display for known-item searchers
Individual Search Results: What happens next?
  • Augment with “next step” actions per result
    • Open in separate window
    • Get more like this
    • Print
    • Save
    • Email
  • Determine next steps through contextual inquiry
Presenting Search Result Groups: Ranked results
  • Difficulties with relevance ranking
    • Depends on consistent elements across documents
    • Term frequency-dependent approaches create an “apples and oranges effect” on ranking
    • Google effect: benefits of popularity make less sense in enterprise context than in open web
  • Consider alternatives
    • Clustering and filtering
    • Manually-derived results (aka “Best Bets”)
Presenting Search Result Groups: Clustering & filtering

“Our user studies show that all Category interfaces were more effective than List interfaces even when lists were augmented with category names for each result” —Dumais, Cutrell & Chen

Consider using clustered results rather than list results

Presenting Search Result Groups: Methods of clustering and filtering
  • Use existing metadata and other distinctions (easier)
    • Document type (via file format or CMS)
    • Source (author, publisher, and business unit)
    • Date (creation date? publication date? last update?)
    • Security setting (via login, cookies)
  • Use explicit metadata (harder)
    • Language
    • Product
    • Audience
    • Subject/topic
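A minimal sketch (hypothetical results) of clustering a flat result list by an existing metadata field:

```python
# Cluster flat search results by a metadata field, as an alternative
# to one long ranked list. Results are invented for illustration.
from collections import defaultdict

results = [
    {"title": "Holiday policy", "doc_type": "PDF", "source": "HR"},
    {"title": "Holiday request form", "doc_type": "Word", "source": "HR"},
    {"title": "Holiday press release", "doc_type": "HTML", "source": "Comms"},
]

def cluster(results, field="doc_type"):
    groups = defaultdict(list)
    for r in results:
        groups[r[field]].append(r["title"])
    return dict(groups)

print(cluster(results))             # cluster by document type
print(cluster(results, "source"))   # or filter/cluster by business unit
```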
Clustering by Topic: LL Bean example

Category matches displayed rather than individual results

Filtering by Source: BBC example

Selecting a tab filters results

Clustering by Content Type: c|net example

Results clustered in multiple content types (cf. content modeling)

Clustering by Language Example: PeopleSoft Netherlands

Result clusters for Dutch and English

The Zipf Curve: Consistent and telling

Zipf distribution from Michigan State University search logs (derived from local site search analytics)

  • From http://netfact.com/rww/write/searcher/rww-searcher-msukeywords-searchdist-apr-jul2002.gif
“Best Bets”: By popular demand
  • Recommended links
    • Ensure useful results for top X (50? 100?) most popular search queries
    • Useful resources for each popular query are manually determined (guided by documented logic)
    • Useful resources manually linked to popular queries; automatically displayed in result page
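A minimal sketch (invented entries) of Best Bets as a manually maintained lookup consulted ahead of regular engine results:

```python
# Best Bets: a manually curated query -> recommended-links table,
# merged ahead of the search engine's ranked results. Data is made up.
best_bets = {
    "computing": ["IT Service Desk", "Research Computing"],
    "france": ["Country profile: France", "BBC French service"],
}

def search(query: str, engine_results: list) -> list:
    bets = best_bets.get(query.lower().strip(), [])
    return bets + engine_results   # recommended links shown first

print(search("France", ["(ranked result 1)", "(ranked result 2)"]))
```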
“Best Bets” Example: BBC
  • Logic for BBC Best Bets
    • Is query a country name? (yes)
    • Then do we have a country profile? (yes)
    • Then do we have a language service? (yes)
“Best Bets”: In the enterprise context
  • Who does the work?
    • Difficult to “assign” queries to different business units (e.g., “computing” means different things to different business units)
    • Can serve as impetus for centralized effort
  • Operational requirements
    • Logic based on users’ needs (e.g., queries) and business rules
    • Policy that assigns responsibilities, negotiates conflicts (e.g., who owns “computing”)
  • Opportunity to align Best Bets to user-centric divisions (e.g., by audience: a “computing” best bet for researchers, another for IT staff)
Enterprise Search: Impacts on the enterprise
  • Designs
    • Simple query builders (spell checker, stemming)
    • Search-enhancing thesaurus
  • Policies
    • Best Bets design and selection
    • Style guide (result titling, search interface implementation)
  • Staffing needs
    • Content inventory and analysis
    • Interface design
    • Work with IT on spidering, configuration issues
    • Ongoing local site search analytics
    • Editorial (e.g., Best Bets creation)
Search Tool Selection: EIA needs 1/2
  • To basic evaluation criteria (from SearchTools.com)…
    • Price
    • Platform
    • Capacity
    • Ease of installation
    • Maintenance
Search Tool Selection: EIA needs 2/2
  • …add:
    • Ability to crawl deep/invisible web
    • Ability to crawl multiple file formats
    • Ability to crawl secure content
    • API for customizing search results
    • Work with CMS
    • Duplicate result detection/removal
    • Ability to tweak algorithms for results retrieval and presentation
    • Federated search (merge results from multiple search engines/data sources)
Enterprise Search Roadmap

  • Search interface
  • Search queries
  • Search results
Enterprise Search Takeaways
  • Search interface and queries
    • Consistent location and behavior
    • Keep as simple as possible
    • Use "refine search" interface instead of "advanced search"
    • Soup up users’ queries (e.g., spell checking)
  • Search results
    • Feature appropriate elements for individual results
    • Consider clustered results, especially if explicit, topical metadata are available
    • Best bets results for top X common queries
EIA Research Methods: Learn about these three areas
  • Content, users and context drive:
    • IA research
    • IA design
    • IA staffing
    • IA education
    • …and everything else
EIA Research Methods: Sampling challenges
  • How do you achieve representative samples in the face of these difficulties?
    • Awareness: Who and what are out there?
    • Volume: How much is there? Can we cover it all?
    • Costs: Can we afford to investigate at this order of magnitude?
    • Politics: Who will work with us? And who will try to get in the way?
EIA Research Methods: Reliance on alternative techniques
  • Standard techniques may not work in enterprise settings
  • Alternatives often incorporate traditional methods and new technologies
    • Web-based surveys (e.g., SurveyMonkey)
    • Remote contextual inquiry and task analysis (via WebEx)
    • Web-based “card” sorting (e.g., EZsort)
    • Auto-categorization, auto-classification tools (e.g., Semio)
    • Log analysis tools (e.g., WebTrends)
EIA Research Methods: A closer look
  • Content-oriented methods
    • Content inventories
    • Content value tiers
  • Context-oriented methods
    • Sampling stakeholders
    • Departmental scorecard
  • User-oriented methods
    • 2-D scorecard
    • Automated metadata development
    • Freelisting
    • Local site search analytics
Content Inventory: Enterprise context
  • Issues
    • Even greater sampling challenges
    • Content research is even more critical: serves as a cross-departmental exercise
  • Approaches
    • Balancing breadth and depth
    • Talking to the right people
    • Value-driven
Multidimensional Inventory: Incomplete yet rich

  • EIA requires balanced, iterative sampling (whereas CMS implementation may require an exhaustive inventory)
  • Balance scope (breadth) with granularity (depth)
  • Extend the inventory to all discernible areas of content and functionality:
    • Portals and subsites
    • Applications (including search systems)
    • Supplemental navigation (site maps, indices, guides)
    • Major taxonomies
    • Structured databases
    • Existing content models
    • Stakeholders
Content Migration Strategy: Value Tier Approach
  • Determine value tiers of content quality that make sense given your users/content/context
    • Answer “what content is important to the enterprise?”
    • Help determine what to add, maintain, delete
  • How to do it?
    • Prioritize and weight quality criteria
    • Rate content areas
    • Cluster into tiers
    • Score content areas while performing content analysis
Value Tier Approach: Potential quality criteria
  • Select appropriate criteria for your business context, users, and content
    • Authority
    • Strategic value
    • Currency
    • Usability
    • Popularity/usage
    • Feasibility (i.e., “enlightened” content owners)
    • Presence of quality existing metadata
Assessing Stakeholders: What to learn from them
  • Strategic
    • Understanding of business mission and goals, and fit with larger enterprise mission and goals
      • Theory
      • Practice
    • Culture: tilt toward centralization or autonomy
    • Political entanglements
  • Practical
    • Staff: IT, IA, design, authoring, editorial, usability, other UX (user experience)
    • Resources: budget, content, captive audiences
    • Technologies: search, portal, CMS
Stakeholder Interviews: Triangulate your sample

  • Org chart: business unit representatives
    • Will provide a strategic overview of content and whom it serves
    • May have some knowledge of content; more importantly, they know the people in their units who do
    • Additionally, there is political value in talking with unit reps
  • Functional/audience-centered
    • Subject Matter Experts (SMEs): represent power users; valuable for pointing out content that addresses major information needs
    • Audience advocates (e.g., switchboard operators): can describe content with high-volume usage
Stakeholder Interviews: Finding the low-hanging fruit
  • Assessment should reveal degree of “enlightenment”
    • Early adopters
    • Successful track records visible within the enterprise
    • Understand/have experience with enterprise-wide initiatives
    • Willingness to benefit the enterprise as a whole
    • They just plain “get it”
  • You’ve got to play to win: lack of interest and availability mean loss of influence
Stakeholder Interviews: Indicators of enlightenment
  • Technology assessment: who has/uses the “classic 3”?
    • Portal
    • Search engine
    • CMS
  • Staff review: who has relevant skills/expertise on their staff?
  • IA review: what areas of enterprise site have strong architectures?
  • These areas may indicate redundant costs, targets for centralization
“Safe” User Sampling: The 2D Scorecard
  • Combines alternative, apolitical methods for determining segments to sample, e.g.:
    • Role-based segmentation
    • Demographic segmentation
  • Distracts stakeholders from “org chart-itis,” to purify sampling
  • Enables evaluation methods (e.g., task analysis, card sorting)
The 2D Scorecard: Role-based segmentation
  • Roles cut across political boundaries
    • Profile core enterprise-wide business functions
      • Why does the enterprise exist?
      • Examples: Sell products, B2B or B2C activities, manufacture products, inform opinion, etc.
    • Determine major “actors” in each process
The 2D Scorecard: Demographic segmentation
  • Standard, familiar measure; also cuts across political boundaries
    • Gender
    • Geography
    • Age
    • Income level
    • Education level
  • Your marketing department probably has this data already
The 2D Scorecard: Incorporating contextual bias
  • Role/demographic “scorecard” is pure
    • Serves as a structure that doesn’t have to change substantially
    • But how to incorporate stakeholder bias?
  • Stakeholder bias can be accommodated
    • Poll/interview stakeholders to determine how cell values should change
    • Axes and totals stay mostly the same
    • Distraction is our friend
Maintaining a User Pool: Build your own for fun and power
  • Through automated surveys, a lower-level information architect built an enterprise-wide pool of 1,500 users
    • Prescreened by demographics and skills
    • Provided him with substantial leverage with others who wanted access to users
    • He just got there first and did the obvious
  • More information: http://louisrosenfeld.com/home/bloug_archive/000408.html
Metadata Development: Conventional techniques
  • Techniques
    • Open card-sorting to gather terms
    • Closed card-sorting to validate terms
    • Can be difficult to carry out in enterprise environment (scope of vocabulary, subject sampling)
  • Modifications for enterprise setting
    • Use remote tools (e.g. IBM’s EZsort)
    • Apply in “stepped” mode: test subsections of taxonomy separately
    • Drawback: lack of physical cards may diminish value of data
Metadata Development: Classification scheme analysis
  • Review existing schemes, looking for:
    • Duplication of domain
    • Overlapping domains
    • Consistency or lack thereof
  • Can some vocabularies be reused? Improved? Eliminated?
Automated Metadata Development: Two classes of tools
  • Auto-categorization tools
    • Can leverage pattern-matching and cluster-analysis algorithms to automatically generate categories (e.g., Autonomy, Interwoven)
    • Can also use rules (i.e., concepts) to generate categories (e.g., Inktomi, Verity, Entrieva/Semio)
  • Auto-classification tools
    • Apply indexing to existing categories
    • Require controlled vocabularies (generally manually-created) to index content
Automated Metadata Development: Pros and cons
  • Benefits
    • Apolitical applications that disregard org chart
    • May be a necessary evil in a large enterprise environment
  • Drawbacks
    • Limited value in heterogeneous, multi-domain environment
    • Perform better with rich text than with database records and other brief documents
Automated Metadata Development: Semio example
  • At best, an 80% solution; none truly “automated”
    • Significant manual proofing of the 80% of content indexed
    • Significant manual indexing of the 20% not indexed

“E-commerce”: A human would collapse many of these categories

Finding Metadata: Free listing
  • Simple technique:
    • “List all of the terms you associate with ______”
    • Perform pair analysis (co-occurrence) on results (see the sketch after this list)
  • Benefits
    • Harvests terms associated with a concept or domain
    • Can be done in survey form with many subjects, multiple audiences
    • Supports card sorting
    • Possible alternative to local site search analytics
  • Drawback: less useful for structuring relationships between terms
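A minimal sketch of the pair analysis mentioned above (toy responses; it counts how often two terms are listed together):

```python
# Pair analysis (co-occurrence) over free-listing results.
# Each inner list holds the terms one participant associated with "leave".
from collections import Counter
from itertools import combinations

responses = [
    ["vacation", "holiday", "annual leave"],
    ["vacation", "personal leave", "holiday"],
    ["bank holiday", "holiday"],
]

pairs = Counter()
for terms in responses:
    for a, b in combinations(sorted(set(terms)), 2):
        pairs[(a, b)] += 1       # count how often two terms co-occur

print(pairs.most_common(3))      # strongest associations first
```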
Local Site Search Analytics: What does this data tell us?

Sample log entries (query; result count; timestamp; IP address):
  • Keywords: focis; 0; 11/26/01 12:57 PM; XXX.XXX.XXX.2
  • Keywords: focus; 167; 11/26/01 12:59 PM; XXX.XXX.XXX.2
  • Keywords: focus pricing; 12; 11/26/01 1:02 PM; XXX.XXX.XXX.2
  • Keywords: discounts for college students; 0; 11/26/01 3:35 PM; XXX.XXX.XXX.59
  • Keywords: student discounts; 3; 11/26/01 3:35 PM; XXX.XXX.XXX.59
  • Keywords: ford or mercury; 500; 11/26/01 3:35 PM; XXX.XXX.XXX.126
  • Keywords: (ford or mercury) and dealers; 73; 11/26/01 3:36 PM; XXX.XXX.XXX.126
  • Keywords: lorry; 0; 11/26/01 3:36 PM; XXX.XXX.XXX.36
  • Keywords: “safety ratings”; 3; 11/26/01 3:36 PM; XXX.XXX.XXX.55
  • Keywords: safety; 389; 11/26/01 3:36 PM; XXX.XXX.XXX.55
  • Keywords: seatbelts; 2; 11/26/01 3:37 PM; XXX.XXX.XXX.55
  • Keywords: seat belts; 33; 11/26/01 3:37 PM; XXX.XXX.XXX.55
Local Site Search Analytics: Instructions
  • Sort and count queries (see the sketch after this list)
  • Identify and group similar queries (e.g., “cell phones” and “mobile phones”)
  • Understand users’ query syntax (e.g., use of single or multiple terms, Boolean operators) and semantics (e.g., use of lay or professional terms)
  • Determine most common queries
    • Identify content gaps through 0 result queries
    • Build “Best Bets” for common queries
    • Map common queries to audiences through IP or login analysis
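A minimal sketch of the sort-and-count step over entries like the sample shown earlier (log format assumed: query; result count; timestamp; IP):

```python
# Sort/count search queries and flag 0-result queries as content gaps.
from collections import Counter

log_lines = [
    "focis; 0; 11/26/01 12:57 PM; XXX.XXX.XXX.2",
    "focus; 167; 11/26/01 12:59 PM; XXX.XXX.XXX.2",
    "safety; 389; 11/26/01 3:36 PM; XXX.XXX.XXX.55",
    "safety; 12; 11/27/01 9:01 AM; XXX.XXX.XXX.19",
]

counts = Counter()
zero_hits = []
for line in log_lines:
    query, hits, _timestamp, _ip = [part.strip() for part in line.split(";")]
    counts[query.lower()] += 1
    if int(hits) == 0:
        zero_hits.append(query)       # candidate content gaps

print(counts.most_common(3))          # most common queries -> Best Bets
print(zero_hits)                      # 0-result queries -> gaps/synonyms
```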
Local Site Search Analytics: Benefits for interface development
  • Identifies “dead end” points (e.g., 0 hits, 2000 hits) where assistance could be added (e.g., revise search, browsing alternative)
  • Syntax of queries informs selection of search features to expose (e.g., use of Boolean operators, fielded searching)


Local Site Search Analytics: Benefits for metadata development
  • Provides a source of terms for the creation of vocabularies
  • Provides a sense of how needs are expressed
    • Jargon (e.g., “lorry” vs. “truck”)
    • Syntax (e.g., Boolean, natural language, keyword)
  • Informs decisions on which vocabularies to develop/implement (e.g., thesaurus, spell-checker)
Local Site Search Analytics: Benefits for content analysis
  • Identifies content that can’t be found
  • Identifies content gaps
  • Creation of “Best Bets” to address common queries
Local Site Search Analytics: Pros and cons
  • Benefits
    • Data is real, comprehensive, available (usually)
    • High volume
    • Can track sessions
    • Non-intrusive
  • Drawbacks
    • Lack of good commercial analysis tools
    • Lack of standards makes it difficult to merge multiple search logs, or to combine them with other logs (e.g., server logs)
    • Doesn’t tell you why users did what they did
Local Site Search Analytics: Enterprise context
  • Makes case for EIA; usually demonstrates that users are requesting things that aren’t tied to departmental divisions (e.g., policies, products)
  • Informs “Best Bets”
  • Informs synonym creation
  • Limited value if not analyzing merged logs
EIA Research Methods Takeaways
  • Challenges
    • Many traditional methods can be adapted to the enterprise environment
    • But sampling, geography, volume and politics force a less scientific, more pragmatic approach
    • Also force greater reliance on automated tools
  • We need new methods
    • Focus on minimizing politics and geographic distribution
    • Most are untested
    • Information architects need to be willing to experiment, innovate, and live with mistakes
EIA and the Enterprise: Phased, modular model
  • Phasing is not just about roll-out and timing
  • Should be overarching philosophy for EIA initiatives
    • We can phase in whom we work with
    • We can phase in whom we hire to do EIA work
    • We can modularize what types of EIA we do
    • We can phase in what degree of centralization we can support
Why a Phased Model? Because mandates don’t work
  • “Just do it!”…
    • …all (e.g., all subsites)
    • …now (e.g., in 3-6 months)
    • …with few resources and people (e.g., one sad webmaster)
    • …in a way that minimizes organizational learning (e.g., hire an outside consultant or agency)
  • Results of the mandated “solution”: completely cosmetic, top-down information architecture
The EIA Framework: Seven issues
  • EIA governance: how the work and staff are structured
  • EIA services: how work gets done in an enterprise environment
  • EIA staffing: who handles strategic and tactical efforts
  • EIA funding model: how it gets paid for
  • EIA marketing and communications: how it gets adopted by the enterprise
  • EIA workflow: how it gets maintained
  • EIA design and timing: what gets created and when
The EIA Framework: Critical goals
  • Re-balance the enterprise’s in-house IA expertise to support an appropriate degree of centralization
  • Enable slow, scalable, sustainable growth of internal EIA expertise
  • Create an ownership/maintenance mechanism for enterprise-wide aspects of IA (currently orphaned)
  • Ensure institutional knowledge is retained
EIA Governance: Questions
  • What sort of individuals or group should be responsible for the EIA?
  • Where should they be located within the organization? How should they address strategic issues? Tactical issues?
  • Can they get their work done with carrots, sticks, or both as they try to work with somewhat autonomous business units?
EIA Governance: A separate business unit 1/2
  • Logical outgrowth of
    • Web or portal team
    • Design or branding group
    • E-services, e-business or e-commerce unit
  • Goals
    • Ensure that IA is the primary goal of the unit
    • Retain organizational learning
    • Avoid political baggage
    • Maintain independence
EIA Governance: A separate business unit 2/2
  • Ambitious, foolhardy, unrealistic? Necessary!
    • Successful new organizational efforts often start as separate entities
    • Alternatives (none especially attractive):
      • Be a part of IT or information services
      • Be a part of marketing and communications
      • Be a part of each business unit
EIA Governance: Balancing strategic and tactical
  • Strategic: Model on Board of Directors
    • Represent key constituencies
    • Track record with the organization’s prior centralization efforts, both successes and mistakes
    • Mix of visionaries and people who understand money
  • Tactical: Start with staff who “do stuff”
    • Extend as necessary by outsourcing
    • Enables logical planning of hiring and use of consultants and contractors
EIA Governance: Board of directors 1/2
  • Goals
    • Understand the strategic role of information architecture within the enterprise
    • Promote information architecture services as a permanent part of the enterprise’s infrastructure
    • Align the group and its services with those goals
    • Ensure the group’s financial and political viability
    • Help develop the group’s policies
    • Support the group’s management
  • Makeup
    • Draw first from effective leaders
    • Then from major units that would be strategic partners
EIA Governance: Board of directors 2/2
  • Qualities
    • Experience and duration in the enterprise
    • Wide visibility and extensive network
    • Can draw on institutional memories and experiences
    • Track record of involvement with successful initiatives
    • Entrepreneurial (can read and write a business plan)
    • Experienced with centralization efforts
    • Does not shy away from political situations
    • Can “sell” a new concept and find internal funding
    • Is like the people you need to “sell” to
    • Has experience with consulting operations
    • Has experience negotiating with vendors
EIA Governance: Caterpillar’s boards
  • Strategic board (quarterly; ~10 members)
    • “Owners” of enterprise site
    • Decide on major policies
    • Settle conflicts
  • Stakeholder board (monthly; 15-20)
    • Ensure broad participation
    • Ensure two-way communication
    • Make recommendations re: policy to strategic board
  • User advocacy board (meets as needed; 5-10)
    • Represent major user groups
    • Maintain pool of sample users
EIA Services: Questions
  • What should a team responsible for EIA actually do?
  • How do their “services” fit with work that happens within business units? Or with outside contractors and consultants?
  • What kind of people should manage these efforts?
  • How do IA generalists and specialists fit together?
EIA Services: Modular service plan
  • Avoid “monolithic” approach: “Hi, we’re the EIA team and we’re here to help… and we’re going to centralize all of your information…”
  • Break IA and CM into digestible, non-threatening tasks and sell those
    • Allows you to divide and conquer clients…
    • …and helps you understand IA challenges better (e.g., applying metadata in a centralized environment)
EIA Services: Potential service offerings 1/3
  • Client workflow-oriented (map to content publication process)
    • Content authoring and acquisition
    • Metadata development
    • Content titling
    • Content tagging
    • Content review (voice, accuracy, etc.)
    • Content formatting
    • Formatting review
    • Search engine optimization
    • Publication
EIA Services: Potential service offerings 2/3
  • User-oriented
    • Persona and scenario development
    • User testing and task analysis
    • Search and server log analysis
  • Content-oriented
    • Content inventory and analysis
    • Content evaluation and assessment
    • Content model design
    • Content development policy (creation, maintenance)
    • Content weeding, ROT (redundant, outdated, trivial) removal, and archiving
    • Content management tool (acquisition, maintenance)
    • Metadata development
    • Metadata maintenance
    • Manual tagging
    • Automated categorization and classification
EIA Services: Potential service offerings 3/3
  • Context-oriented
    • Business metrics development and analysis
    • Internal marketing strategy and implementation
    • Stakeholder and decision-maker interviews
    • Business rules development (for Best Bets, content models, etc.)
  • Production/Maintenance
    • Template design and application
    • Training
    • Policy/procedure/standards development and acceptance
    • Publicity of new/changed content
    • Tool analysis/acquisition (CMS, search, portal)
    • Quality control and editing
    • Link checking
    • HTML validation
    • Liaison with visual design staff, IT staff, vendors
EIA Services: Basic & premium levels
  • Free services can lead to fee services
EIA Staffing: Questions
  • Who should be involved: in-house, consultant, contractor? What type of specialization should the staff have?
  • Should they be centralized or located within business units or both?
EIA Staffing: Tactical team basics 1/2
  • Goals
    • Delivers IA services to the enterprise in content, users, and context areas
    • Implements the strategic team’s policies
    • Works directly with clients to understand their needs and develop new services to meet those needs
EIA Staffing: Tactical team basics 2/2
  • Make-up driven by “market demand” and existing resources
  • “Vertical” IA generalists: split between the EIA project and enterprise business units
  • “Horizontal” IA specialists: “consultants” for both groups of generalists
    • Tools (e.g., search, portal, CMS)
    • Metrics
    • Evaluation
    • Metadata development
    • XML and other markup languages
EIA Staffing: Tactical team qualities
  • Entrepreneurial mindset
  • Ability to consult (i.e., do the work, justify IA, and navigate difficult political environments)
  • Willingness to acknowledge ignorance and seek help
  • Ability to communicate with people from other fields
  • Sensitivity to users’ needs
  • …and know about IA and related fields
EIA Staffing: Tactical team backgrounds/skills
  • Human-Computer Interaction
  • Cognitive Psychology
  • Librarianship (reference)
  • Marketing
  • Branding
  • Merchandising
  • Librarianship (tech. services)
  • Information Science
  • Journalism
  • Technical Communication
  • Computer Science
  • Graphic Design
  • Organizational Psychology
  • Business Management
  • Operations Engineering
  • Social Network Analysis
  • Ethnography
  • Economics
EIA Funding Model: Questions
  • How should this group be funded?
  • How should other expenses (e.g., software licenses) be covered? Charge-back fees for individual services? A flat “tax” paid by business units? Coverage out of general administration’s budget? Some hybrid of these?
  • Should certain services be performed gratis, while others require payment?
EIA Funding Model: Looking for inspiration
  • Study the successes/failures of the enterprise’s other centrally funded services
  • Possible plan
    • Initially: “tax” on business units and/or “seed capital” from senior management
    • Ultimately: self-funding (models: IT, HR, special projects)
  • Key: funding should come from a central group (e.g., senior management) or be self-funded; otherwise there is too much dependency on business units
EIA Funding Model: Ensuring independence
  • Potential models already in existence in the enterprise
    • Charge-back
    • Tax on business units
    • Money from general fund
    • Hybrids
  • Charge-back model is attractive
    • Increases the perceived value of IA by charging fees
    • Compares well with duplicated expenses incurred by business units
EIA Marketing & Communications: Questions
  • How to position this work and the group that supports it: IA? User Experience? Web Design? How do these terms affect the scope of the work/charter of the group?
  • How does a plan like this get “sold,” and to whom?
  • Whose support is needed, and what tactics are useful in convincing them to support EIA work?
  • How to prioritize which business units around the enterprise to work with?
EIA Marketing & Communications: Positioning the EIA initiative
  • Approaching “clients”
    • No carrot or stick
    • Offer services and consulting that save money, reduce tedium
  • Branding: choose the term that
    • Is hottest
    • Has the least baggage
    • Steps on the fewest toes
EIA Marketing & Communications: Selling IA
  • Concrete
    • We can make work easier and save money for individual business units
    • We can improve the user experience and build brand loyalty among customers, organizational loyalty among employees
    • We can minimize the enterprise’s habit of purchasing redundant licenses and services
EIA Marketing & Communications: One unit at a time
  • Start with low-hanging fruit
    • Killer content
    • Plentiful or influential users
    • Strategic value (business context)
  • Determine current status of the “client”
    • What are they doing now?
    • What expertise is in-house?
    • What relevant tools do they own (extend licenses)?
    • Are they enlightened?
EIA Marketing & Communications: Illustrating the concept
  • Select an initial model for centralized approach that’s familiar, accessible
  • Staff directory often the best
    • Serves all enterprise users
    • Useful, highly structured content with significant metadata, searching, and browsing capabilities (see the sketch after this list)
    • Has high value in context of the enterprise’s daily operations
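A sketch of why the staff directory works so well as a model: every record shares the same structure, so any metadata field can drive enterprise-wide search or a browsable facet. The records and field names here are hypothetical.

```python
from collections import defaultdict

people = [
    {"name": "J. Doe",   "unit": "Communications", "location": "Brooklyn",
     "expertise": ["metadata", "taxonomy"]},
    {"name": "A. Smith", "unit": "IT",             "location": "Chicago",
     "expertise": ["search", "CMS"]},
]

# Any field becomes a browsable facet: group the same records by location...
by_location = defaultdict(list)
for p in people:
    by_location[p["location"]].append(p["name"])
print(dict(by_location))  # {'Brooklyn': ['J. Doe'], 'Chicago': ['A. Smith']}

# ...or index by expertise for enterprise-wide lookup.
by_expertise = defaultdict(list)
for p in people:
    for term in p["expertise"]:
        by_expertise[term].append(p["name"])
print(by_expertise["search"])  # ['A. Smith']
```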
EIA Design/Timing: Questions
  • An EIA design is an overwhelmingly large undertaking; how might it be broken into more digestible pieces?
  • How should they be sequenced: what makes sense to take on now, later, or perhaps not at all?
EIA Design/Timing: 3-6 years, not months
  • Use early successes as models
  • Anticipate greater centralization among and within business units over time
  • Support different levels of centralization concurrently (Neanderthals coexist with Space Agers)
EIA Workflow: Questions
  • How does the content authoring and publishing process work now?
  • Who and how many are involved?
  • How can the group support that work, and determine the best mix of centralized and autonomous responsibilities within that workflow?
EIA Workflow: Supporting variation, evolution
  • Build around business units’ demand
  • Use as driver for CMS selection
EIA Framework Takeaways
  • Be entrepreneurial
    • Market and sell services to internal clients
    • Become self-sustaining by diversifying revenue streams
  • Offer modular services
    • Specific services, not full package
    • A logical migration path accommodates customers at all stages along the centralization/autonomy axis
  • Do what can be done in baby steps
    • Start with projects that are low-hanging fruit
    • Selective roll-out
Contact Information
  • Louis Rosenfeld, LLC
  • 705 Carroll Street, #2L
  • Brooklyn, NY 11215 USA
  • lou@louisrosenfeld.com
  • www.louisrosenfeld.com
  • +1.718.306.9396 voice
  • +1.734.661.1655 fax