3-Simpson’s Paradox

The previous post (2-Simpson’s Paradox) explained how every department of a university could give preferential admissions to females, and yet males could achieve a higher admission rate than females in the university as a whole. This post presents further alternative hidden causal structures for the same data set, each of which could radically alter the interpretations given earlier.

The central point we are trying to make here is that understanding data REQUIRES understanding the deeper causal structures which generate the data, and these causal structures are not DIRECTLY observable. A large set of philosophers of science have committed Kant’s Blunder of forbidding us to investigate the unobservables. For example, Hume wanted to consign speculative writings to the flames. Wittgenstein dogmatized that “whereof one cannot speak, thereof one MUST be silent”. This prohibition has strongly inhibited exploration and discussion of causal structures in econometrics. Nonetheless, understanding causal chains is essential for correct data analysis. In this post, we will explore a number of variant causal structures which give similar observable outcomes but have radically different policy implications. In the previous post, we discussed the causal structure posited by Berk to explain the Simpson’s Paradox in Berkeley admissions. Better understanding is achieved by positing an even simpler causal structure which leads to similar paradoxical results:

[Causal chain diagram: Gender → Department → Admissions]

The above causal diagram depicts a case where there is no discrimination by gender. Suppose that the Engineering Department admits 60% of all applicants, regardless of whether they are male or female. Similarly, Humanities admits 30% of all applicants, regardless of whether they are male or female. So the admit rate depends on the department, but not on gender. Now suppose that 90% of females apply to Humanities and 10% apply to Engineering. Then the overall admit rate for females will be 10% x 60% + 90% x 30% = 33%. Suppose that 90% of the males apply to Engineering, and only 10% to Humanities. Then the overall admit rate for males will be 90% x 60% + 10% x 30% = 57%. So it will appear that admissions are heavily biased in favor of males: 57% of the males who apply get in, while only 33% of the females get in. In the causal chain diagrammed above, Gender affects Department choice, and the Department chosen affects the admission rate. So Gender affects Admissions through the channel of Department choice. In this situation, we can learn that gender plays no direct role in admissions by conditioning on the Department. The link between Gender and Admissions is broken (or blocked) when we condition on Department: after conditioning on Department, we will find that Gender is independent of Admissions. Conditioning means doing the analysis separately for each department, holding department choice constant.
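The arithmetic above is easy to verify directly. The following minimal sketch uses the admit rates and application shares from the example to compute each gender’s overall admit rate:

```python
# Admit rates depend only on the department, not on gender.
admit_rate = {"Engineering": 0.60, "Humanities": 0.30}

# Shares of each gender's applications going to each department
# (the numbers from the example in the text).
applications = {
    "Female": {"Engineering": 0.10, "Humanities": 0.90},
    "Male":   {"Engineering": 0.90, "Humanities": 0.10},
}

for gender, shares in applications.items():
    # Overall admit rate = weighted average of department admit rates.
    overall = sum(share * admit_rate[dept] for dept, share in shares.items())
    print(f"{gender}: overall admit rate = {overall:.0%}")

# Conditioning on Department: within each department both genders face the
# same admit rate, so Gender is independent of Admission given Department.
```

Within each department the two genders face identical admit rates, so the 57% vs. 33% gap is produced entirely by department choice.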

There are many other causal sequences which can create outcomes resembling the original ones, but with causes entirely different from those discussed until now. For example, suppose that the admissions process is completely mechanical, and depends only on SAT scores. Suppose that SAT scores vary from 20 to 80, and the admissions process is such that those with SAT score x have an x% chance of admission. We consider an artificial example first, because it is very easy to understand. Suppose that females get scores of only 80 and 40, while males get scores of only 60 and 20. Suppose that all females with a score of 80 apply to Engineering and all females with a score of 40 apply to Humanities. Similarly, suppose that all males with a score of 60 apply to Engineering and all males with a score of 20 apply to Humanities. To match the proportions of the initial example, let 200 females with SAT score 80 apply to Engineering, while 1800 males with SAT score 60 apply to Engineering; similarly, 1800 females with SAT score 40 apply to Humanities and 200 males with SAT score 20 apply to Humanities. This will create data identical to the first table. However, neither Gender nor Department has any effect on admissions, which are determined purely by the SAT scores.
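This artificial example can also be checked by computation. A minimal sketch, using the group sizes and scores given above, with admissions taken as expected values under the x% rule:

```python
from collections import defaultdict

# Each applicant group: (gender, department, SAT score, number of applicants).
# Admission chance is score% -- purely mechanical, blind to gender and department.
groups = [
    ("Female", "Engineering", 80,  200),
    ("Female", "Humanities",  40, 1800),
    ("Male",   "Engineering", 60, 1800),
    ("Male",   "Humanities",  20,  200),
]

applied = defaultdict(int)
admitted = defaultdict(float)
for gender, dept, score, n in groups:
    # Tally both overall (by gender) and within each department.
    for key in (gender, (gender, dept)):
        applied[key] += n
        admitted[key] += n * score / 100  # expected admits at score% chance

for key in applied:
    print(key, f"admit rate = {admitted[key] / applied[key]:.0%}")
```

Within each department females are admitted at a higher rate (80% vs. 60% in Engineering, 40% vs. 20% in Humanities), yet overall males come out ahead (56% vs. 44%), even though the admission rule never looks at gender or department.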

Here Gender affects SAT scores: females get higher scores. Gender also affects choice of department: females apply overwhelmingly to Humanities, which is more difficult to get into. However, Gender does not affect admission rates. Admission rates depend only on the SAT scores – not on gender, nor on department. The causal picture in this situation looks like this:

[Causal diagram DAG4: Gender → SAT Score, Gender → Department, SAT Score → Department, SAT Score → Admissions]

Both Gender and SAT scores affect the choice of Department, since females with high scores opt for Engineering, while those with low scores opt for Humanities. Admissions depend ONLY on the SAT scores: they are blind to Gender and unaffected by Department. However, the data will be exactly the same as that analyzed in the original data table illustrating the Simpson’s Paradox. Gender and Department both appear to be strongly related to admit rates, with Engineering having easier admissions and females being favored by both departments. All these misleading relationships disappear when we draw the right causal diagram, as above. We can test the validity of this diagram by conditioning on the SAT score. If this diagram is valid, then after conditioning on the SAT score, Gender and Department will be independent of the admit rates. The observable structures DO NOT reveal the hidden underlying causal chains, and create a misleading picture. This is why an empirical approach, which refuses to go beyond the observables, is bound to fail.
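The conditioning test can be illustrated with a small simulation. This is a hypothetical sketch, not the artificial data above (where each score occurs in only one gender–department group): here we let both genders apply at every score, so that within each SAT stratum the two can actually be compared:

```python
import random

random.seed(0)
scores = [20, 40, 60, 80]
N = 100_000  # applicants per gender per score stratum

# Hypothetical pool where both genders appear at every score.
# Each applicant is admitted with probability score%, regardless of gender.
results = {}  # (score, gender) -> observed admit rate
for score in scores:
    for gender in ("Female", "Male"):
        admits = sum(random.random() < score / 100 for _ in range(N))
        results[(score, gender)] = admits / N

for score in scores:
    f = results[(score, "Female")]
    m = results[(score, "Male")]
    # Within each stratum the two rates agree up to sampling noise.
    print(f"SAT {score}: female {f:.1%}, male {m:.1%}")
```

Within every SAT stratum the two admit rates agree up to sampling noise, which is exactly the conditional independence the diagram predicts.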

The point is that a superficial analysis, which only looks at the numbers without attempting to assess the underlying causal structures, cannot lead to a satisfactory data analysis. David Freedman said that we must expend shoe leather in order to understand causality. That is, we must go out into the real world and look at the structural details of how events occur. To find out whether or not discrimination occurs, we should examine how the admissions committee works – who is on the committee, what their opinions are regarding gender-blind admissions, and the procedures used to make admissions decisions. The idea that the numbers by themselves can provide us with causal information is false. It is also false that a meaningful analysis of data can be done without taking any stand on the real-world causal mechanism. Each of the diverse causal mechanisms discussed above has radically different implications regarding whether or not there is discrimination by gender, and how we should go about rectifying the problem, if there is a problem to rectify. These issues are of extreme importance with reference to Big Data and Machine Learning. Machines cannot expend shoe leather, and enormous amounts of data cannot provide us knowledge of the causal mechanisms in a mechanical way. However, a small amount of knowledge of real-world structures used as causal input can lead to substantial payoffs in terms of meaningful data analysis. The problem with current econometric techniques is that they do not have any scope for the input of causal information – the language of econometrics does not have the vocabulary required to talk about causal concepts.

In the remaining portion of this discussion, we will look at the same data set in some other real-world contexts, where the same numbers lead to radically different conclusions. See 4-Simpson’s Paradox: Baseball Batting Averages and 5-Simpson’s Paradox: Drugs and Recovery Rates. This goes against the standard assumption that statistical analysis can be done by looking at numbers, while the “real-world” context and interpretation can be left up to the field expert. We cannot separate the data from its real-world meanings and context.

Postscript: For a summary of all five posts, with links to each one, see RWER Blog Post on Simpson’s Paradox.
