Networks: Basic Concepts. Centrality.
PowerPoint Slideshow about 'Networks: Basic Concepts' - Thomas
In this discussion, we’ll outline some basic concepts of network analysis, focusing on centrality.
We’ll also fold into this discussion an overview of UCINET, a software program commonly used for network analysis. While it does not handle some of the more recent ways in which networks can be analyzed (such as longitudinal or cross-sectional ERGM methods), it is a very user-friendly way to obtain network-related measures and to create visual depictions of networks. UCINET offers a free 30-day trial.
Much of the discussion of UCINET is drawn from the tutorial created by Robert Hanneman at UC Riverside.
One is in-degree centrality: an actor who receives many ties is characterized as prominent. The basic idea is that many other actors seek to direct ties to that actor, so in-degree may be regarded as a measure of importance.
The other is out-degree centrality. Actors who have high out-degree centrality may be relatively able to exchange with others, or disperse information quickly to many others. (Recall the strength of weak ties argument.) So actors with high out-degree centrality are often characterized as influential.
So, node 7 has an in-degree centrality absolute value of 9 (there are 9 other nodes connected to node 7). The normalized value is 100 (all possible other nodes are connected to node 7). The out-degree centrality has an absolute value of 3 (node 7 is connected out to nodes 2, 4, and 5), and a normalized value of 33.33 (3 nodes is 33.33% of the possible 9 nodes to which node 7 could extend out.)
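The in- and out-degree arithmetic above can be sketched in code. This is an illustrative example, not the actual Knoke data: a hypothetical 10-node directed graph is built so that node 7 receives ties from all 9 others and sends ties out to nodes 2, 4, and 5, matching the numbers in the text.

```python
# Hypothetical 10-node directed graph (labels 1..10), built to mirror the
# worked example: node 7 receives ties from all 9 others and sends ties
# to nodes 2, 4, and 5.  adj[i][j] = 1 means node i+1 directs a tie to node j+1.

def degree_centrality(adj):
    """Return in-degree, out-degree, and their normalized (percent) versions."""
    n = len(adj)
    out_deg = [sum(row) for row in adj]
    in_deg = [sum(adj[i][j] for i in range(n)) for j in range(n)]
    pct = lambda d: round(100.0 * d / (n - 1), 2)  # share of the n-1 possible ties
    return in_deg, out_deg, [pct(d) for d in in_deg], [pct(d) for d in out_deg]

n = 10
adj = [[0] * n for _ in range(n)]
for i in range(n):
    if i != 6:                 # every other node sends a tie to node 7
        adj[i][6] = 1
for j in (1, 3, 4):            # node 7 sends ties out to nodes 2, 4, and 5
    adj[6][j] = 1

in_deg, out_deg, in_pct, out_pct = degree_centrality(adj)
print(in_deg[6], in_pct[6], out_deg[6], out_pct[6])   # 9 100.0 3 33.33
```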
The average outdegree is 4.9 (which means that each node has, on average, connections out to 4.9 other nodes); the average indegree is also 4.9. Normalized, both measures are 54.44 percent (that is, 4.9 / 9).
One can also calculate network indegree and outdegree centralization. These network measures represent the degree of inequality or variance in our network as a percentage of that in a perfect “star network” – the most unequal type of network.
A depiction of a star network is on the next slide. Note that the peripheral nodes are each connected only to one central node, and that central node is connected to all of the others.
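The centralization idea can be made concrete with a short sketch. This follows the Freeman-style formula for directed degree centralization (an assumption about the exact formula UCINET uses): sum the gaps between the largest degree and every other degree, then divide by (n−1)², the total those gaps would reach in a perfect directed star network.

```python
def degree_centralization(degrees):
    """Freeman-style centralization for directed in- or out-degree: the sum
    of (max - actual) degree differences, as a fraction of the (n-1)**2
    total that a perfect directed star network would yield."""
    n = len(degrees)
    c_max = max(degrees)
    return sum(c_max - d for d in degrees) / float((n - 1) ** 2)

# A 10-node star (one hub receiving ties from all 9 others) scores 100%:
star_in_degrees = [9] + [0] * 9
print(degree_centralization(star_in_degrees))   # 1.0

# Perfectly equal degrees show no inequality at all:
print(degree_centralization([5] * 10))          # 0.0
```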
When calculating the Bonacich power measures, the “attenuation factor” represents the weight given to the connections of one’s neighbors: a positive attenuation factor (between 0 and 1) means that one’s power is enhanced by being connected to well-connected neighbors.
Alternatively, one could argue that actors who are well-connected to individuals who are not themselves well-connected are powerful, because those others are “dependent” on them. In this case, one would use a negative “attenuation factor” (between 0 and -1) to compute power accordingly.
Recall the graph presented above, in which actors #5 and #2 were the most central. Calculating out Bonacich measures suggests that actors #8 and #10 are also central—they don’t have many connections, but they have the “right” connections.
However, taking the second approach (using a negative attenuation factor) identifies actors 3, 7, and 9 as being strong – because they have weak neighbors (who are “dependent” on them).
As with all quantitative methods, it’s important to think about what you as a researcher are trying to measure before using the methods. In your particular context, are actors connected with other well-connected actors the most powerful? Or is it actors that are connected with those who are very dependent on them who are more powerful?
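One common formulation of the Bonacich score (a sketch, not UCINET’s exact implementation) is the series c = A·1 + βA²·1 + β²A³·1 + …, which converges when |β| is below the reciprocal of the adjacency matrix’s largest eigenvalue:

```python
def bonacich_power(adj, beta, iterations=100):
    """Truncated-series sketch of Bonacich power: c = sum_k beta**k * A**(k+1) * 1.
    A positive beta (0 < beta < 1) boosts actors tied to well-connected
    neighbors; a negative beta (-1 < beta < 0) boosts actors tied to
    poorly connected ('dependent') neighbors."""
    n = len(adj)
    term = [float(sum(row)) for row in adj]            # A * 1
    score = term[:]
    for _ in range(iterations):                        # add beta * A * (previous term)
        term = [beta * sum(adj[i][j] * term[j] for j in range(n)) for i in range(n)]
        score = [s + t for s, t in zip(score, term)]
    return score

# Undirected 3-node path 1-2-3: the middle node has the highest power.
path = [[0, 1, 0],
        [1, 0, 1],
        [0, 1, 0]]
scores = bonacich_power(path, beta=0.1)
print(scores[1] > scores[0])   # True
```

Flipping the sign of beta implements the second interpretation above: connections to weakly connected neighbors then raise, rather than lower, an actor’s score relative to its peers.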
Closeness is a measure of the degree to which an individual is near all other individuals in a network. It is the inverse of the sum of the shortest distances between each node and every other node in the network.
Closeness is the reciprocal of farness.
Closeness can also be standardized by norming it against the minimum possible farness for a graph of the same size and connectedness.
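As a sketch, closeness for an unweighted, connected, symmetric network can be computed with breadth-first search; the normalized score below is (n−1)/farness, so an actor adjacent to everyone scores 1.0.

```python
from collections import deque

def closeness(adj):
    """Normalized closeness: (n - 1) divided by farness, where farness is
    the sum of geodesic (shortest-path) distances to every other node.
    Assumes an unweighted, connected graph given as a 0/1 adjacency matrix."""
    n = len(adj)
    scores = []
    for source in range(n):
        dist = [None] * n
        dist[source] = 0
        queue = deque([source])
        while queue:                                  # breadth-first search
            u = queue.popleft()
            for v in range(n):
                if adj[u][v] and dist[v] is None:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        farness = sum(dist)                           # dist[source] is 0
        scores.append((n - 1) / farness)
    return scores

# 3-node path 1-2-3: the middle node is closest to everyone.
path = [[0, 1, 0],
        [1, 0, 1],
        [0, 1, 0]]
print([round(s, 3) for s in closeness(path)])   # [0.667, 1.0, 0.667]
```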
Closeness can also be calculated as a measure of inequality in the distribution of distances across the actors.
These measures rely on the sum of the geodesic distances from each actor to all the others. However, in complicated graphs, this can be misleading.
An actor can be very close to a relatively closed subset of a network—or moderately close to every actor in a large network—and receive the same closeness score. In reality, the two are very different.
Another way to think of closeness is to move away from thinking just about the geodesic or most efficient (shortest) path from one node to another—but to also think about all connections of ego (that is, the one node in question) to all the others.
There are several such measures: Hubbell, Katz, Taylor, Stephenson, and Zelen.
Hubbell and Katz methods count the total number of connections between actors (and do not distinguish between directed and non-directed data), but use an attenuation factor to discount longer paths. The two measures are very similar; the Katz measure uses an identity matrix (each node is connected to itself) while the Hubbell measure does not.
The Taylor measure also uses an attenuation factor, but is more useful for measuring the balance of in- versus out-ties in directed graphs. Positive values indicate relatively more out-ties than in-ties.
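The shared machinery behind the Hubbell and Katz scores can be sketched as the attenuated walk-count matrix (I − βA)⁻¹ = I + βA + (βA)² + …, where longer walks are discounted by powers of β. Following the text’s convention (which measure keeps the identity term varies across sources), keeping the identity gives Katz and dropping it gives Hubbell:

```python
def attenuated_paths(adj, beta, iterations=100):
    """Truncated series for (I - beta*A)^(-1) = I + beta*A + (beta*A)**2 + ...,
    counting walks of every length between each pair of actors, with longer
    walks discounted by powers of beta.  Under the convention in the text,
    keeping the identity term gives the Katz matrix; dropping it gives Hubbell."""
    n = len(adj)
    identity = [[float(i == j) for j in range(n)] for i in range(n)]
    term = [row[:] for row in identity]
    katz = [row[:] for row in identity]
    for _ in range(iterations):                       # term <- beta * term * A
        term = [[beta * sum(term[i][k] * adj[k][j] for k in range(n))
                 for j in range(n)] for i in range(n)]
        katz = [[katz[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    hubbell = [[katz[i][j] - identity[i][j] for j in range(n)] for i in range(n)]
    return katz, hubbell

# The two matrices differ only by the identity (self-connection) term.
path = [[0, 1, 0],
        [1, 0, 1],
        [0, 1, 0]]
katz, hubbell = attenuated_paths(path, beta=0.1)
print(round(katz[0][0] - hubbell[0][0], 6))   # 1.0
```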
One can also identify levels of hierarchy. If one eliminates all the actors with no betweenness (that is, the “subordinates”), some of the remaining actors will then have 0 betweenness; they are at the second level of the hierarchy. We can continue to remove actors and count how many levels of hierarchy exist in the network or system.
Note that the Knoke data presented above is not very hierarchical.
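The level-counting procedure above can be sketched as follows, using Brandes’ shortest-path betweenness algorithm (an assumption on my part; UCINET’s internals may differ) on a directed graph stored as a successor dictionary:

```python
from collections import deque

def betweenness(graph):
    """Brandes' algorithm: shortest-path betweenness for a directed,
    unweighted graph given as {node: [successor, ...]}."""
    bc = {v: 0.0 for v in graph}
    for s in graph:
        stack, queue = [], deque([s])
        pred = {v: [] for v in graph}
        sigma = {v: 0.0 for v in graph}; sigma[s] = 1.0   # shortest-path counts
        dist = {v: -1 for v in graph}; dist[s] = 0
        while queue:
            v = queue.popleft(); stack.append(v)
            for w in graph[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; pred[w].append(v)
        delta = {v: 0.0 for v in graph}                   # dependency accumulation
        while stack:
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

def hierarchy_levels(graph):
    """Count levels by repeatedly stripping zero-betweenness 'subordinates'."""
    graph = {v: list(ws) for v, ws in graph.items()}
    levels = 0
    while graph:
        bc = betweenness(graph)
        zero = [v for v, b in bc.items() if b == 0.0]
        levels += 1
        if len(zero) == len(graph):    # everyone left is at the top level
            break
        for v in zero:
            graph.pop(v)
        for ws in graph.values():
            ws[:] = [w for w in ws if w in graph]
    return levels

# Directed chain a -> b -> c: a and c are subordinates, b sits one level up.
chain = {'a': ['b'], 'b': ['c'], 'c': []}
print(betweenness(chain)['b'], hierarchy_levels(chain))   # 1.0 2
```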
What if two actors want to have a relationship, but the path between them is blocked by a reluctant intermediary? Another pathway, even if it is longer, offers an alternative route and resource. The flow approach to centrality assumes that actors will use all the pathways that connect them. For each actor, the measure reflects the number of times the actor is in a flow (any flow) between all other pairs of actors (generally, as a ratio of the total flow betweenness that does not involve the actor).