Since I'm inside the university system, and we are assessed to death, I would say yes, there is empirical evidence, from everything involving faculty achievement and research (which drives teaching), to student skill assessments (frequent, all too frequent), and student exit exam assessments. These are compiled by non-profit foundation research groups who spend weeks on every campus once every 5 years. It's boiled down into a data pack for each school that is about the size of a thick textbook.
That being said, my research for when my children are of age and apply is to look at the ratio of resources spent on administration versus teaching, adjunct faculty versus tenure-track faculty, and the flexibility of the curriculum. Those are meat-and-potatoes issues for every parent, but the data is unfortunately only available from the DOE, so you have to dig.
I believe there is value to saying that the faculty actively and currently involved in knowledge production are also the best faculty to learn from. This is the community college versus university dilemma, because community colleges make it very hard for faculty to be involved in knowledge production. Most students might not see the connection, and most might be satisfied with the knowledge presented in community colleges, but if you're the type of student who sees yourself as a future knowledge producer (inside or outside academia), you're going to want to be around people who are doing it now, not those who may have been doing it when they were grad students.
This is exactly the type of information that, for instance, the Carnegie Foundation compiles in order to assess every department at every university.