For example, we used them for our first outcomes report and paid extra to have them "verify" it, but they literally never opened the Google Drive file we sent them.
I think it was a great idea set up by well-meaning people, but the self-governing aspect and the comparisons it created produced weird incentives that ultimately made it fall apart.
The review sites are perhaps marginally better, but the positivity of reviews is almost 100% correlated with how hard schools work to farm for positive reviews, and their business model is selling leads to the schools, so the incentive isn't for objectivity there either.
Honestly the best way, though it requires more work, is to find a handful of recent grads on LinkedIn and ask them about their experience.
The stats on the CIRR site across schools always did seem a little... odd to me, with differences in outcomes too big to believe at times. It sounds like I would have found the same thing if I had looked at any individual school over time, as the rules and practices changed.
... And suddenly it seems so obvious! Thanks, fixed!