Evolution of after-school programming & next steps
It was only a matter of time – enormous sums of money have been poured into after-school programming over the past 15 years, and some important studies have shown minimal results. That doesn’t mean, though, that after-school programs don’t work – it means that a program is only as good as what it actually does. After-school programs aren’t good simply because they exist – they’re good if they do the right things.
This article by Kathryn Baron (“As School Day Grows, Ties Deepen Between Schools, Providers”) on edweek.org highlights one strategy that makes some after-school programs better than others – collaboration with schools. Historically, this idea has received plenty of lip service from program providers for years, along with a number of other strategies. However, Baron highlights what looks to be actual, meaningful collaboration between after-school programs and schools.
First, a bit of background context: In short, after-school programs became really popular in the late 90s and early 2000s. Initiatives like 21st Century Community Learning Centers (a huge federal grant program) fanned the flames, and before long we went from having a few YMCAs and Boys & Girls Clubs to having after-school programs in almost every school, church, and community organization. The first few large-scale evaluations of 21st CCLC, though, didn’t look good – in short, programs didn’t deliver, and in some cases made things worse.
Since then, with initiatives such as after-school alliances, more research, school accountability, and the general professionalization of the field, programs have started to improve. We’re seeing more and more programs come along that add more to the mix than just sports and childcare.
So, what’s the take-home lesson for program providers from this article? The obvious, “teed up” answer is collaboration – the more youth workers talk to each other, the better they can coordinate and provide consistent support across each child’s environment. Everything from behavior to reading fluency benefits from this, if done right. More broadly, though, I’d argue that the bigger lesson from Baron’s article is meat, or substance. In other words, we need to audit each of our strategies – from overt, structured programs to latent, background processes – and ask ourselves, in a data-driven way, whether what we’re doing works. Collaboration, after all, may not work in some environments – some schools and teachers aren’t able or willing to participate. So, program providers can’t rely on a pre-made template or checklist for what to do – providers need to become critical thinkers and self-evaluators, identifying their own best practices from their own data and their own interpretation of the research, applied in their local program context.