Posted on: Thursday, Oct 10, 2019
There exist a number of frequently mentioned regression effects which are conceptually different but share much in common when seen purely statistically (see, e.g., the paper "Equivalence of the Mediation, Confounding and Suppression Effect" by David MacKinnon et al., or the Wikipedia articles):

Mediator: IV which conveys the effect (wholly or partly) of another IV to the DV.
Confounder: IV which constitutes or precludes, wholly or partly, the effect of another IV on the DV.
Moderator: IV whose varying level governs the strength of the effect of another IV on the DV. Statistically, it is known as an interaction between the two IVs.
Suppressor: IV (conceptually, a mediator or a moderator) whose inclusion strengthens the effect of another IV on the DV.

I'm not going to discuss to what extent some or all of them are technically similar (for that, read the paper linked above). My aim is to try to show graphically what a suppressor is. The above definition, that "a suppressor is a variable whose inclusion strengthens the effect of another IV on the DV", seems to me potentially too broad because it does not say anything about the mechanism of such enhancement. Below I'm discussing one mechanism - the only one I consider to be suppression. If there are other mechanisms as well (as of right now, I haven't tried to think of any others), then either the above "broad" definition should be considered imprecise or my definition of suppression should be considered too narrow.
Definition (in my understanding)
A suppressor is an independent variable which, when added to the model, raises the observed R-square mostly due to its accounting for the residuals left by the model without it, and not due to its own association with the DV (which is comparatively weak). We know that the increase in R-square in response to adding an IV is the squared part (semipartial) correlation of that IV in the new model. This way, if the part correlation of the IV with the DV is greater (in absolute value) than the zero-order r between them, that IV is a suppressor.
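To make this formal criterion concrete, here is a minimal Python sketch. The variable names and the data-generating setup are my own, purely illustrative: x2 is nearly uncorrelated with the DV, yet adding it raises R-square by far more than its squared zero-order correlation, because the increase equals its squared part correlation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative setup (my own, not from the post): 'signal' drives the DV,
# 'noise' contaminates the IV of interest (x1), and the suppressor (x2)
# tracks that noise while being unrelated to the DV.
signal = rng.normal(size=n)
noise = rng.normal(size=n)
y = signal + rng.normal(size=n)          # DV
x1 = signal + noise                      # IV of interest, contaminated by noise
x2 = noise + 0.3 * rng.normal(size=n)    # suppressor: correlates with the noise only

def r_square(predictors, dv):
    """R-square of an OLS regression of dv on the given predictors (intercept included)."""
    X = np.column_stack([np.ones(len(dv)), *predictors])
    resid = dv - X @ np.linalg.lstsq(X, dv, rcond=None)[0]
    return 1 - resid.var() / dv.var()

r2_without = r_square([x1], y)      # reduced model, roughly 0.25 here
r2_with = r_square([x1, x2], y)     # full model, roughly 0.46 here

print("zero-order r(x2, y)^2:", np.corrcoef(x2, y)[0, 1] ** 2)  # near 0
print("squared part correlation of x2:", r2_with - r2_without)  # about 0.21: suppression
```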
So, a suppressor mostly "suppresses" the error of the reduced model, being weak as a predictor itself. The error term is the complement to the prediction. The prediction is "projected on" or "shared between" the IVs (via the regression coefficients), and so is the error term ("complementing" the coefficients). The suppressor suppresses these error components unevenly: more for some IVs, less for others. For those IVs whose components it suppresses strongly, it lends considerable facilitating aid by actually raising their regression coefficients.
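Continuing the same illustrative setup (again, the variable names are hypothetical), the sketch below shows that "facilitating aid" in action: the coefficient of x1 grows considerably once the suppressor enters the model, while the suppressor itself receives a sizeable negative coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Same illustrative data-generating setup as in the previous sketch.
signal, noise = rng.normal(size=n), rng.normal(size=n)
y = signal + rng.normal(size=n)
x1 = signal + noise
x2 = noise + 0.3 * rng.normal(size=n)

def slopes(predictors, dv):
    """OLS slopes (intercept dropped) of dv regressed on the predictors."""
    X = np.column_stack([np.ones(len(dv)), *predictors])
    return np.linalg.lstsq(X, dv, rcond=None)[0][1:]

print("b(x1) without the suppressor:", slopes([x1], y))    # about 0.5
print("b(x1), b(x2) with it:", slopes([x1, x2], y))         # about 0.92 and -0.85
```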
Weak suppression effects occur often and widely (an example on this site). Strong suppression is typically introduced deliberately. A researcher looks for a characteristic which correlates with the DV as weakly as possible and at the same time correlates with something in the IV of interest that is considered irrelevant, prediction-void, with respect to the DV. He enters it into the model and obtains a considerable increase in that IV's predictive power. The suppressor's coefficient is typically not interpreted.
I could summarize my definition as follows [building on @Jake's answer and @gung's comments]:

Formal (statistical) definition: a suppressor is an IV whose part correlation is larger (in absolute value) than its zero-order correlation with the DV (a quick check of this criterion is sketched after this list).
Conceptual (practical) definition: the above formal definition plus the condition that the zero-order correlation is small, so that the suppressor is not a sound predictor in its own right.
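The formal criterion can be checked directly from zero-order correlations. A small sketch (the function name and the numeric inputs are my own, chosen to roughly match the simulation above): the part (semipartial) correlation of x2 in the two-predictor model is sr2 = (r_y2 - r_y1 * r_12) / sqrt(1 - r_12^2), and x2 acts as a suppressor when |sr2| > |r_y2|.

```python
import numpy as np

def is_suppressor(r_y1, r_y2, r_12):
    """Part (semipartial) correlation of x2 in the model y ~ x1 + x2,
    and whether x2 meets the formal suppressor criterion |sr2| > |r_y2|."""
    sr2 = (r_y2 - r_y1 * r_12) / np.sqrt(1 - r_12 ** 2)
    return abs(sr2) > abs(r_y2), sr2

# Illustrative numbers, roughly matching the simulation sketched earlier:
# x2 is almost unrelated to the DV but strongly related to x1.
flag, sr2 = is_suppressor(r_y1=0.50, r_y2=0.02, r_12=0.68)
print(flag, round(sr2, 3))   # True, about -0.44: x2 suppresses despite r_y2 = 0.02
```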

"Suppessor" is a role of a IV in a specific model only, not the characteristic of the separate variable. When other IVs are added or removed, the suppressor can suddenly stop suppressing or resume suppressing or change the focus of its suppressing activity.
Posted by: Rohit Shinde