The Wingspan

Centennial High School's Daily Online News Source

Why November? The Truth Behind College Board’s Decision

Words: Caleb McClatchey

Photo: Adithi Soogoor

For the world of secondary education, the College Board’s seemingly innocent February 6 press release contained a bombshell. In it, the educational non-profit announced it would be making two major changes to the Advanced Placement (AP) program for the 2019-20 school year. The first, giving teachers access to new online resources designed to help them better prepare their students for the AP exams, elicited largely positive responses from educators. The second, however, instantly sparked nationwide criticism.

This second change involved moving exam registration from the spring to the fall. Students must now sign up for AP exams by November 15, rather than in March, to pay only the base exam fee of $94. Students who sign up for exams between November 16 and March 13 will have to pay a $40 late fee on top of the base exam fee. If a student decides to cancel an exam after November 15, they will still have to pay $40 of the $94 base exam fee.

According to the College Board, the decision to adopt fall registration was inspired by policies already in place in some AP schools. The College Board claims that some form of fall registration is already a “best practice” at over half of schools offering AP courses. However, it is unclear which specific policies the College Board counts as fall registration or how strict a policy must be to qualify as a “best practice.”

As the College Board explains, it learned that students in the schools which already offered fall registration were “more engaged and less likely to give up.” This increased commitment, the organization says, meant they were “more likely to earn a score that [would] translate to college credit.”

During the 2017-18 school year, the College Board conducted a pilot program to study the effects of fall registration and its alleged benefits. The organization implemented fall registration, among other changes, at 14 school districts across four different states. Combined, over 100 schools and 40,000 students participated in the pilot.

Although the College Board has provided minimal information on the nature of the pilot program or its results, it has relied heavily on anecdotes and highly limited data from the pilot to support its claims. A video on the “2019-20 Changes to AP” page of the group’s website, for instance, shows teachers and students from pilot schools describing how they were initially skeptical of fall registration but came to realize that, as one teacher put it, “[It] really makes a difference.” Next to the video, the College Board explains that “We’ve heard words like, ‘engaged,’ ‘confident’ and ‘less likely to give up’ when students register in the fall, and that commitment translates into more students taking the exam and earning college credit.”

Beyond anecdotal evidence, the College Board boasts that “Scores of 3+ increased across student groups” in its pilot program. A 3 is considered a passing score on the exams and is typically the minimum score required by colleges to award credit. What the College Board puts the greatest emphasis on, however, is the effect that fall registration had on groups it deems traditionally underrepresented in the AP program. According to the College Board, underrepresented minorities (African Americans, Latinos, and Native Americans), low-income students, and female STEM students fall under this category. The College Board claims that while fall registration “made a difference across the board,” it “had the strongest effect” on these students. Accompanying this claim, often in graph form, is essentially the only data from the pilot program which the College Board has currently made readily available to the public.

Chart contributed by: the College Board

The data, which shows the percentage increases in scores of 3+ across different student groups, reveals that underrepresented groups saw significantly higher relative increases in passing scores than their counterparts. The total number of scores of 3 or higher increased by 12% for underrepresented minority students compared to 5% for White/Asian students. Likewise, passing scores increased by 20% for low-income students and only 4% for moderate/high-income students. The same trend occurred with female STEM students, who achieved a 14% increase compared to a 5% increase for their male counterparts.

These results indicate that fall registration will help these student groups, who have historically had lower participation and passing rates, move closer to equitable representation within the AP Program. In fact, the College Board touts that, during one year of fall registration, “schools sped up the work of AP Equity, the share of AP Exam registrations for students of color, by seven years.”

However, the minimal data the College Board is currently providing, and the corresponding claims it makes, are meaningless when taken out of the context of the rest of the pilot program data. Earlier this year, the College Board itself released somewhat more detailed data from the pilot on its website. Although it has since taken down that web page, screenshots exist, and the graphs the College Board used on the page are still hosted on its website. These graphs displayed the raw numbers of total exam takers, underrepresented minority exam takers, low-income exam takers, and passing scores by low-income students within the pilot districts for the 2016, 2017, and 2018 AP exams.

This data, which the College Board has taken down for unknown reasons, is essential for putting the minimal data it is currently trumpeting into context. The deleted data shows that while the total number of low-income exam takers increased by 33.5% from 2016-17 to 2017-18, the total number of moderate/high-income exam takers increased by only 3.9%. Given this fact, the graph showing a 20% increase in passing scores for low-income students compared to a 4% increase for moderate/high-income students is somewhat misleading. The number of low-income students taking exams simply increased at a much higher rate than the number of moderate/high-income students taking exams. As a result, the number of exams passed by low-income students increased at a much higher rate as well. Relative to the increases in exam takers, low-income students did not see nearly as significant an increase in performance compared to moderate/high-income students as the 20%-4% comparison suggests at first glance. The same pattern holds for underrepresented minorities and their counterparts.
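
A quick back-of-the-envelope calculation (our own arithmetic, not the College Board’s methodology) makes the point concrete. Treating the pass rate as passing scores per exam taker, the relative change in pass rate follows directly from the two growth figures above:

```python
# Back-of-the-envelope check using the growth figures from the College
# Board's deleted pilot data. Pass rate = passing scores / exam takers,
# so the relative change in pass rate follows from the two growth rates.
# This arithmetic is our own illustration, not the College Board's method.

def relative_pass_rate_change(passing_growth: float, taker_growth: float) -> float:
    """Relative change in pass rate implied by growth in passing scores
    and growth in exam takers."""
    return (1 + passing_growth) / (1 + taker_growth) - 1

# Low-income students: passing scores +20%, exam takers +33.5%
low_income = relative_pass_rate_change(0.20, 0.335)
# Moderate/high-income students: passing scores +4%, exam takers +3.9%
mod_high_income = relative_pass_rate_change(0.04, 0.039)

print(f"Low-income:           {low_income:+.1%}")       # about -10%
print(f"Moderate/high-income: {mod_high_income:+.1%}")  # about +0.1%
```

Taken at face value, the two sets of figures would even imply a relative decline in the low-income pass rate, underscoring how much the headline 20%-4% comparison overstates the performance gains.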

To gain a more complete understanding of the program’s results, The Wingspan tracked down five of the fourteen school districts that participated in the 2017-18 pilot. Through public information requests, The Wingspan obtained previously unpublished AP data from four of these districts: Klein ISD in Texas, San Antonio ISD in Texas, Amarillo ISD in Texas, and Jefferson County Public Schools in Kentucky. It should be noted that, despite The Wingspan’s best efforts, the data obtained for San Antonio ISD is limited to 11th and 12th graders.

Overall, the total number of exams taken increased by 7.7% across these four districts in the pilot’s first year. Since the total number of passing scores increased by a nearly identical 8%, the overall passing rate rose by a marginal 0.11 percentage points.
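
To see how nearly identical growth in exams and passes produces such a small shift in the pass rate, consider a purely illustrative calculation. The counts below are hypothetical, chosen only to match the reported growth figures; the districts’ actual base pass rate was not part of the data The Wingspan obtained:

```python
# Illustrative only: hypothetical counts chosen to match the reported
# growth rates (+7.7% exams taken, +8% passing scores). The assumed
# 39% starting pass rate is our own; it is not a reported figure.

exams_before, passes_before = 10_000, 3_900    # assumed 39.0% pass rate
exams_after = round(exams_before * 1.077)      # +7.7% exams taken
passes_after = round(passes_before * 1.080)    # +8.0% passing scores

rate_before = passes_before / exams_before
rate_after = passes_after / exams_after

print(f"Pass rate before: {rate_before:.2%}")                     # 39.00%
print(f"Pass rate after:  {rate_after:.2%}")                      # 39.11%
print(f"Change: {(rate_after - rate_before) * 100:+.2f} points")  # +0.11
```

When exams taken and exams passed grow at nearly the same rate, the pass rate barely moves, which is exactly the pattern in the four districts’ aggregate data.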

For the three districts which reported results by economic status, the number of exams taken by students considered economically disadvantaged/eligible for free or reduced lunches (Eco-Dis/FRL) increased by 12.2%. Meanwhile, the total number of passing scores for these students increased by 17%. This translates to a 0.8 percentage point increase in pass rate. In comparison, the total number of exams taken, the total number of exams passed, and the pass rate for students not economically disadvantaged/not eligible for free or reduced lunches (Non-Eco-Dis/FRL) increased by 2.2%, 6.2%, and 2.0 percentage points, respectively.

Across all four districts, the total number of exams taken by African American and Latino students increased by 10.36%. At the same time, the total number of exams passed increased by 15.6% and the pass rate increased by 0.99 percentage points. For all other students, the number of exams taken, number of exams passed, and pass rate increased by only 5.8%, 5.9%, and 0.05 percentage points, respectively.

In the aggregate, the detailed data obtained by The Wingspan appears to tell the same story as the College Board’s deleted data. It seems that the changes implemented by the pilot did increase equity with regard to access. Underrepresented groups saw a much higher percentage increase in exams taken than their overrepresented counterparts. However, the pilot appears to have done little to close the performance gap between underrepresented and overrepresented groups. In the three districts which reported results by economic status, the passing rate for Non-Eco-Dis/FRL students was 30.8 percentage points higher than the passing rate for Eco-Dis/FRL students in the 2016-17 school year. In the pilot’s first year, this gap actually increased to 32.0 percentage points.

It is important to note that the results of the pilot program varied significantly between districts. How the pilot affected an individual district often differed from how it affected the four districts as a whole. Although the number of exams taken by Eco-Dis/FRL students increased by 12.2% overall, this number increased by a staggering 116% in Amarillo ISD and actually declined by 0.68% in Jefferson County. Furthermore, despite the gap in passing rates between African American/Latino students and other students decreasing by 0.94 percentage points overall, this gap increased in three of the four districts.

These differences in results shed light on a frustrating aspect of analyzing the pilot program data: there are simply so many variables at play. The previous AP registration policy, the cost low-income students paid for exams, the quality of AP instruction, and any changes in enrollment all influenced how a district’s AP results changed during the pilot. Since these factors vary significantly by state and school district throughout the country, one should not expect the universal adoption of fall registration to have a universal effect.

Further complicating a true evaluation of the pilot program’s results is the nature of the pilot itself. As it turns out, instituting a fall registration deadline was just one of many changes the College Board implemented as part of its 2017-18 AP Pilot. Most notably, all participating school districts received access to a new support system of online resources. According to the “AP Full Year Model Implementation Plan” attached to the pilot participation agreement between the College Board and Jefferson County Public Schools, these resources were meant to enable “yearlong, college-level practice and instruction in AP classrooms.” Chief among these resources was an AP Question Bank available for all AP courses. The pilot agreement describes this as a “comprehensive repository of AP released and practice exam questions indexed to unit content and skills, including reports highlighting student knowledge and skill achievements and gaps.” Teachers could use these questions to build custom quizzes for each unit, students could practice with them online or on paper, and administrators could access “year-round performance and usage data.” Furthermore, AP Calculus and World History teachers received access to additional resources, including scoring training, unit quizzes, and student-directed practice.

Although the pilot schools and their teachers were free to use these resources as they wished, the College Board provided them with the clear intent that they be used. In fact, the aforementioned implementation plan, written by the College Board, states that “The College Board encourages District’s utilization of these resources.” The plan explains that this will “enable the College Board to learn about usage patterns.”

While it is impossible to quantify the exact impact of these resources, it is highly likely that they increased student performance to some extent. Kevin Rasco, District Coordinator of Advanced Placement for San Antonio ISD, described these resources as being “very heavily used,” especially for Calculus and World History. In Amarillo ISD, Director of Counseling/College and Career Readiness Tracy Morman said the resources were utilized to varying extents by different teachers but on the whole were “very beneficial.” Both Morman and Rasco emphasized how the resources allowed teachers and students to track students’ progress throughout the year. This gave students added confidence and teachers the ability to assess the effectiveness of their instruction throughout the course.

Megan Shadid, an AP Economics and World History teacher from one of the pilot districts, echoed these sentiments in an interview with USA Today. “It’s been a game changer for me in terms of how I teach,” she explained.

If the College Board wanted to “further study the effects of moving exam registration to the fall,” as its website says, why introduce another variable into the study in the form of these highly beneficial online resources? Even ignoring all of the other factors influencing a district’s AP exam results, it is now impossible to say to what extent the results of the pilot reflect the effects of fall registration and to what extent they reflect the benefits of the online resources. Given that this uncertainty was created by the way the College Board designed the pilot, it is curious that the organization makes virtually no mention of the online resources in the information it has released about the program.

Despite questions about the College Board’s representation of the pilot program and its results, there seems to be plenty of support for the fall registration deadline among those involved with the pilot.

Regarding the decision to implement fall registration nationwide, Rasco stated, “I’m behind it. I believe in it… For one reason: you commit early to the full AP experience.”

Rasco believes that kids thrive on structure and high expectations. When students must register early, teachers know they have a classroom full of kids committed to taking the exams. According to Rasco, this causes a “dramatic change in the way a teacher conducts their class.”

Like Rasco, Morman thinks the decision is a great move, calling it “a win-win for everybody.” She emphasized that the fall registration deadline and online resources were very beneficial for her district and believes that they are what’s best for students in general.

As long as there are skeptics of the College Board, there will always be controversy surrounding the decisions it makes. The move to a fall registration deadline for AP exams is no different. The inconvenient truth, for both the College Board and its critics, is that no clear narrative appeared to emerge from The Wingspan’s investigation of the 2017-18 AP Pilot Program. Although the changes implemented by the program seemed to increase equity with regard to access, their effect on equity with regard to performance seemed to be minimal. While some of those involved in the pilot, like Rasco and Morman, have expressed their support of fall registration, the College Board’s limited and somewhat misleading representation of the pilot program and its results raises questions. Unfortunately, there is, and in all likelihood will be, no final verdict, no definitive answers. The truth, much like the pilot program and the College Board itself, is complicated.


To listen to a behind-the-scenes interview with the author, Caleb McClatchey, click here!

To view the entire November issue, click here!

For more breaking news and photos, follow The Wingspan on Instagram and Twitter @CHSWingspan.
