05-10-2012, 03:20 PM
Expert Finding Systems for Organizations: Problem and Domain Analysis and the DEMOIR Approach
Abstract
Computer systems that augment the process of finding the right expert for a given problem, within
an organization or world-wide, are becoming more feasible than ever before, thanks to the prevalence of
corporate intranets and the Internet. This paper investigates such systems in two parts. We first explore
the expert finding problem in depth, review and analyze existing systems in this domain, and suggest a
domain model that can serve as a framework for design and development decisions. Based on our analyses
of the problem and solution spaces, we then bring to light the gaps that remain to be addressed.
Finally, we present our approach called DEMOIR, which is a modular architecture for expert finding
systems that is based on a centralized expertise modeling server while also incorporating decentralized
components for expertise information gathering and exploitation.
Introduction
Motivated by advances in information technology, organizations are placing more emphasis on
capitalizing on the growing mass of knowledge they accumulate in the course of their business.
However, as noted by Stewart [1], the attempt to put all corporate knowledge on one huge
server, in the style of the 18th-century French encyclopedists, is doomed to fail. Stewart goes on
to assert that the real value of information systems lies rather in connecting people to people, so
they can share whatever expertise and knowledge they have at the moment, given that the cutting
edge is always changing. Studies into the information-seeking behavior of people working in
information-intensive fields also show that people searching for information commonly explore
personal communications prior to using formal sources [2][3].
Thus, if technology is to foster the effective utilization of the whole range of knowledge in organizations,
it has to be able to support not only access to explicitly documented knowledge but,
most importantly, tacit knowledge held by individuals. By enhancing the visibility and traceability
of such knowledge, technology can help catalyze collaboration and knowledge sharing
among its holders, both within and between organizations. Moreover, the ability to quickly find
information on people's expertise can play a critical role in fostering the formation and sustenance
of virtual organizations/enterprises, communities of practice, expertise networks, and the
like.
What prompts expert seeking: information need and expertise need
Based on interviews we conducted with researchers at a major research institution, as well as
on observations extrapolated from the relevant literature, we identified two main motives for
seeking an expert, namely as a source of information and as someone who can perform a given
organizational or social function. This categorization, though fuzzy and sometimes overlapping,
proves to be useful in analyzing the goals of automated expert finders as we will discuss later in
this paper.
People may seek an expert as a source of information to complement or replace other sources
such as documents and databases in various ways.
Internal versus external expert seeking
In organizations, expert seeking (and, consequently, the benefits of finding experts) can also be
viewed from an internal and an external point of view. There are a number of reasons for wanting
to know, as quickly as possible, who knows what within an organization, including
knowledge sharing, team formation, project launching, etc. An organization also benefits if
external entities can easily discern the expertise of its staff, as this fosters collaboration,
cross-organizational networking, a better image, etc. For example, many organizations can deliver efficient
customer help services if the customers, or their contact points in the organization, can
easily trace and direct their queries to the appropriate expert. Likewise, academic and research
institutes want industry, the public, and potential research sponsors and collaborators
to know about and make use of their staff's expertise.
Automated Support: Traditional Approaches
Whatever their motives, seekers of experts need a range of information about people's
expertise. They need to know whether a person who can answer their queries or meet their criteria
exists, how extensive his/her knowledge or experience is, whether there are other persons
who could serve the same purpose, how s/he compares with others in the field, how the person
can be accessed (contacted), etc. This, in turn, calls for a mechanism that gathers and makes
such information accessible. However, doing this manually is obviously a time-consuming and
laborious undertaking, making automated aids invaluable.
One way of providing automated assistance is the development of expert databases (aka "knowledge
directories" or "knowledge maps") through manual entry of expertise data. This is exactly
what many organizations commonly did in the past. Microsoft's SPUD [16], Hewlett-Packard's
CONNEX, and the SAGE People Finder are examples of this approach. Similarly, manual data
entry is employed in skill inventory systems like Skillview, which are common in the
knowledge and human resource management domains.
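At its core, such a manually maintained expert database reduces to a keyword lookup over self-reported skill profiles. The following minimal sketch illustrates the idea; all names and skill entries are invented for illustration and are not drawn from SPUD, CONNEX, or Skillview:

```python
# Minimal sketch of a manually populated expert directory.
# All people and skills below are illustrative, not real data.

expert_db = {
    "Alice Chen": {"information retrieval", "machine learning"},
    "Bob Okafor": {"databases", "information retrieval"},
    "Carla Ruiz": {"human resources", "skill assessment"},
}

def find_experts(skill: str) -> list[str]:
    """Return everyone whose self-reported profile lists the given skill."""
    return sorted(name for name, skills in expert_db.items()
                  if skill in skills)

print(find_experts("information retrieval"))  # -> ['Alice Chen', 'Bob Okafor']
```

The simplicity of the lookup also exposes the approach's weakness noted above: the directory is only as good as the manual entry behind it, and profiles go stale unless their owners keep updating them.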
Automatic Expert Finders
The shortcomings of the above approaches, coupled with the availability of large electronic
repositories of organizational and personal records, have led to the suggestion of more helpful
systems known as expert finders or expert/expertise recommenders. These systems aim to mitigate
those shortcomings by automatically discovering up-to-date expertise information
from implicit/secondary sources instead of relying solely on experts and/or other human sources,
although experts and their proxies can still complement and refine the automatically generated
expertise information.
Attempts to develop systems that exploit implicit evidence of expertise to augment the process of
finding the right expert date back at least to the visionary work of Maron and his colleagues
[17]. Their experimental system, called HelpNet, accepts requests for information and responds
with names of people ranked by the probability that each will provide a satisfactory
answer. This probability is computed using probabilistic models of information retrieval, which
combine an estimate of a person's expertise in answering questions on a topic with the probability
that a given user would be satisfied with the response provided by that source. To do this,
the system constructs a profile by asking each person to select topics of expertise from a list,
along with a probability estimate of his/her ability to provide a satisfactory answer to
questions on each topic. Maron and his colleagues envisioned such systems enabling the emergence
of “a large, active and fruitful future network of informationally rich people providing
help to one another”.
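The ranking scheme described above can be sketched as follows. Note that the profile numbers and the simple product used to combine the two probabilities are illustrative assumptions for this sketch, not Maron et al.'s exact model:

```python
# Sketch of HelpNet-style probabilistic ranking (illustrative only).
# Each person self-reports, per topic, an estimated probability that
# they can answer questions on that topic satisfactorily.

profiles = {
    "Dana":  {"unix": 0.9, "latex": 0.4},
    "Emre":  {"unix": 0.6, "latex": 0.8},
    "Fiona": {"latex": 0.7},
}

def rank_candidates(topic: str,
                    p_satisfied: dict[str, float]) -> list[tuple[str, float]]:
    """Rank people by P(can answer topic) * P(user satisfied by this source).

    p_satisfied holds an (assumed) per-source probability that the
    asking user would be satisfied with an answer from that person;
    sources without an entry default to 1.0.
    """
    scores = {
        name: topics[topic] * p_satisfied.get(name, 1.0)
        for name, topics in profiles.items()
        if topic in topics
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Dana scores 0.9 * 0.8 and Emre 0.6 * 1.0, so Dana ranks first.
ranked = rank_candidates("unix", {"Dana": 0.8, "Emre": 1.0})
```

Ranking by a combined probability, rather than returning a flat match list, is what lets such a system answer the seeker's comparative questions (who is best, who else could serve) noted earlier.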
Positioning Automatic Expert Finders
Before going on to the domain analysis where we consider the functionality of expert finders, let
us first survey the potential application contexts for these systems, with particular reference to
other related organizational systems and services. This survey outlines the bounds of the expert
finders’ domain and crudely represents the ‘context analysis’ phase required prior to domain
analysis.
As part of organizational information systems, expert finders can either stand on their own or
form part of broader organizational systems. We believe that their full potential is
unleashed only when they are integrated with other organizational information systems, namely
knowledge management systems, recommender systems, CSCW systems, and electronic markets
for human expertise.
As mentioned above, expert finding capabilities form an important part of knowledge management
systems, whose aim is to provide access to knowledge in all forms, including knowledge held
by people. Davenport [37] called this the "hybrid approach to knowledge management". Kautz et
al. [7] also discuss the importance of integrating both the "ask a program/document" and "ask a
person" paradigms into information seeking. These two approaches are mostly used in an
interdependent manner, i.e. one is used to find the other [2]. Organizational memory systems like
question answering and routing systems (e.g. Answer Garden [30]) can be cited as one realization
of this paradigm, which necessarily calls for expert finding capabilities.