**Course Signals’ Retention Claim**

Course Signals is an early-warning alert system for identifying at-risk students developed at Purdue University and made available commercially by Ellucian. It has been claimed that use of Course Signals “boosts graduation rate by 21 percent”. The claim is suspect and continues to be repeated without scrutiny. I wrote a simulation to test the claim.

My conclusion from looking at the simulation data: the direct causality attributed to Course Signals is erroneous. In fact, the causation runs in the reverse direction from the one claimed. Students who take Course Signals courses are *not more likely* to graduate than non-Course Signals students (at least not directly, and not at the rates suggested); rather, students who graduate are more likely to take Course Signals courses. Recall Euthyphro’s dilemma.

This is a classic example of correlation being used to make claims about causality. X correlates with Y. Therefore, X causes Y. An obvious fallacy. In this case, X = students taking two or more Course Signals courses and Y = increased graduation rates.

What the simulation shows, first of all, is that if X indeed caused Y, then we could get the same retention effect by giving students chocolates. The chocolates don’t even have to be many or all that expensive; a few Hershey’s Kisses will do. But, of course, we *can’t* improve retention rates by giving a couple of Hershey’s Kisses to students. What the simulation also shows is that if we were to give Hershey’s Kisses to students at random, those who graduate would be more likely to have more chocolates.

Note: I owe the insight to Mike Caulfield who pointed out anomalies in the data and re-framed, correctly I believe, how the Course Signals data should be viewed. The aim of the simulation is to provide some data to back up Caulfield’s insight.

### Simulation Design and Approach

The simulation is based on a model that tracks a cohort of students over four years. We begin with a sample cohort (e.g. 10,000 students) who enter the university as freshmen. Each year a randomly chosen subset of the students (e.g. 20%) takes courses where they are given chocolates; this is the analog of taking Course Signals courses. Each year a randomly chosen subset of students (e.g. 10%) drops out of the institution. The base parameters of the model (e.g. cohort size, percentage of chocolate-dispensing courses, dropout rate) can be changed by the user.

The model tracks the chocolates dispensed and compares retention rates among students who received no chocolates, students who received at least one chocolate, and students who received two or more chocolates. The simulation demonstrates that students who receive two or more chocolates consistently have significantly higher retention rates than students who received no chocolates. It would nevertheless be erroneous to conclude from the simulation data that we can significantly improve retention rates merely by giving chocolates to students. The data illustrate that students who graduate are more likely to receive more chocolates, and not the reverse.
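A minimal sketch of such a simulation, assuming the model described above (this is my own illustrative code, not the author’s notebook; the function names, parameter names, and default values are assumptions), might look like this:

```python
import random

def simulate(cohort_size=10_000, chocolate_fraction=0.20,
             dropout_rate=0.10, years=4, seed=42):
    """Track a freshman cohort for four years. Each year a random 20%
    of enrolled students take a chocolate-dispensing course, and a
    random 10% drop out -- independently of chocolates received."""
    rng = random.Random(seed)
    chocolates = [0] * cohort_size        # chocolates received per student
    enrolled = set(range(cohort_size))
    for _ in range(years):
        pool = sorted(enrolled)
        # chocolate courses are assigned at random among enrolled students
        for s in rng.sample(pool, int(chocolate_fraction * len(pool))):
            chocolates[s] += 1
        # dropouts are drawn at random, with no regard to chocolate count
        enrolled -= set(rng.sample(pool, int(dropout_rate * len(pool))))
    return chocolates, enrolled           # still enrolled after 4 years = "graduates"

def retention(chocolates, graduates, predicate):
    """Retention rate within the group whose chocolate count satisfies `predicate`."""
    group = [s for s in range(len(chocolates)) if predicate(chocolates[s])]
    return sum(s in graduates for s in group) / len(group)

chocolates, graduates = simulate()
for label, pred in [("no chocolates", lambda c: c == 0),
                    ("at least one ", lambda c: c >= 1),
                    ("two or more  ", lambda c: c >= 2)]:
    print(f"{label}: {100 * retention(chocolates, graduates, pred):.1f}% retained")
```

Even though dropout here is, by construction, completely independent of chocolates, the two-or-more group shows markedly higher retention than the zero-chocolate group, simply because students who stay longer accumulate more yearly chances to receive a chocolate.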

### Comment

The simulation, and the model on which it is based, has not been verified; the assumptions, the code, and the conclusions might therefore be erroneous. The code is available in my GitHub repository in the form of an IPython notebook for review and criticism. It should also be noted that the simulation is not intended as a general criticism of the Course Signals software, which was groundbreaking in learning analytics. No doubt the Course Signals software has many benefits, including improving course-level grades. The simulation is offered as a test of the claim that Course Signals *directly* leads to *significant* gains in retention.

### Sample Results from the Simulation

The following are some results from the simulation. The first row displays retention rates for students who received no chocolates. The second row displays retention rates for students who received at least one chocolate. The last row shows students who received two or more chocolates. Why track students who received two or more chocolates? Because the authors of the study claim that two is the “magic number” where significant retention gains kick in.

The simulation data show that the retention gain for students is not a real (i.e. causal) gain but an artifact of the simple fact that students who stay longer in college are more likely to receive more chocolates. So the answer to the question we started off with is “No.” You can’t improve retention rates by giving students chocolates.
