Employment Outcomes & Fulfilling Promises


Bootcamp training programs like Turing are supposed to train you for a great job. Employment outcomes are the ultimate measuring stick to demonstrate the quality and effectiveness of a program.

This is the first of a three-part series.

Career Success

I decided to create Turing as a non-profit to ensure that we would always stay focused on our mission: "to unlock human potential by training a diverse, inclusive student body to succeed in high-fulfillment technical careers."

What is career success? It varies from person to person. Generally, I consider an alum's career a "success" when:

  • They're employed in the field
  • They're using skills they learned at Turing, or skills they built on top of those they learned at Turing
  • They're able to progress into more senior positions
  • When wanted or needed, they're able to find a new employer
  • OR, they do all those things and, after some period of time, decide they want to do something completely different.

Career success really means economic empowerment – that there are good options open to you and you get to decide which to take.

Measuring Outcomes

All that is kind of difficult to define and measure. If you're a prospective student, you really want to know "is this going to work for me?" The real answer is unknowable, but we can start to look at some probabilities.

Over the years, I helped define the outcomes reporting standards for NESTA (New Economy Skills Training Association), then for CIRR (Council on Integrity in Results Reporting), and we've built our own outcomes reports. I believe I'm an expert in outcomes reporting in this industry, and yet...

When I read a CIRR report or our own quarterly reports, you know what goes through my mind? "This is confusing as shit!" I know how all the measurements are done and why they're this way, but one piece doesn't quite connect to another, and at the end of it, it's hard to draw any meaningful conclusions. If all the data points were dreadful, you'd conclude that the program's students are not doing well. If all the data points are good, you'd conclude that it's working for many people – but are those people you?

We get distracted by the granularity – the average salaries trending up and down, the time-to-hire fluctuations, and all that. You can often get very different numbers by changing exactly which cohorts are included, or by slicing by demographics, locations, or backgrounds. It's been particularly difficult since the start of 2022, when any observer of the tech market would tell you that past employment results are not predictive of future possibilities.
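To make that slicing problem concrete, here's a minimal sketch with made-up numbers (not real Turing data, and not how any official report is calculated) showing how the same set of graduate records can produce noticeably different headline "placement rates" depending on which graduates you include:

```python
# Hypothetical graduate records: cohort, location, and in-field employment.
# All values are illustrative only, not actual outcomes data.
grads = [
    {"cohort": "2107", "location": "Denver", "employed": True},
    {"cohort": "2107", "location": "Remote", "employed": False},
    {"cohort": "2110", "location": "Denver", "employed": True},
    {"cohort": "2110", "location": "Remote", "employed": True},
    {"cohort": "2202", "location": "Denver", "employed": False},
    {"cohort": "2202", "location": "Remote", "employed": False},
]

def placement_rate(records):
    """Share of the included graduates who are employed in the field."""
    if not records:
        return 0.0
    return sum(r["employed"] for r in records) / len(records)

# Same data, three different (equally "true") headline numbers:
all_grads   = placement_rate(grads)                                             # 50%
pre_2022    = placement_rate([g for g in grads if g["cohort"] < "2202"])        # 75%
denver_only = placement_rate([g for g in grads if g["location"] == "Denver"])   # ~67%

print(f"All cohorts: {all_grads:.0%}")
print(f"Excluding 2022 cohorts: {pre_2022:.0%}")
print(f"Denver only: {denver_only:.0%}")
```

None of those three numbers is wrong; they just answer different questions, which is exactly why a single headline figure is so easy to misread.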

Understanding Timelines

Even with an accelerated program like Turing, the time from when someone decides to attend to the point where they're job hunting is likely a year or more. And looking at outcomes data means considering students who graduated 6+ months ago. The gap between their outcome and your hoped-for future is probably over 18 months, and the market has proven that it moves faster than that.

Outcomes data is a lot like economics – you can use it to explain what happened in the past and to inform some guesses about the future. But it's far from a guarantee. I would argue that, especially in this market, the fine-grained details really don't matter. If someone got an awesome $100K salary 18 months before your job hunt, it doesn't mean you will. If someone struggled to find a role 18 months before you're actually looking, it doesn't mean you will.

Fulfilling Promises

And yet, we need to measure and reflect on these outcomes. Those students were made promises. Market swings or not, they were told they would learn, they would build skills, they would collaborate, and they would become job-ready. Given the right support and guidance, if they put in the work, they should find high-quality, in-field employment. If that's not happening at a high rate, then some things need to change.

When you look at outcomes of a training program, don't try to extrapolate what it means for your possible future. Instead, ask "were the promises fulfilled?"

We've been digging into the data in new ways to try to help people answer two simple questions:

  1. Were the promises to past students fulfilled?
  2. What does it mean for me as a prospective student?

Next week I'm going to begin releasing and explaining data I've been gathering on our alumni. Every data point is going to lead to more questions, so I welcome your thoughts and feedback along the way. In the end, I hope you can see that Turing makes big promises to its students, and then we do our best to fulfill them.