Assessment and Accountability

New College Success Group in Arizona

The Arizona College Scholarship Foundation has merged with the Arizona College Access Network to form a new group that will serve as a "statewide voice for college access and success," according to a written statement. The group will add supports to an existing network of 200 college access programs across the state, which provide standards, training and tools aimed at helping more students graduate from college. The scholarship foundation has a particular focus on first-generation college students who are members of minority groups.

Study of Graduation Rates of Returning Students

Five organizations -- higher education associations and others -- are collaborating on a study of the retention and graduation rates of five million students who are not first-time college students. The American Council on Education, InsideTrack, NASPA -- Student Affairs Administrators in Higher Education, the University Professional and Continuing Education Association, and the National Student Clearinghouse last month announced the research project, which is slated to be wrapped up this fall.

The research is intended to give a broader, more accurate view of student completion trends than federal graduation rates, which track only first-time, full-time students. This approach fails to capture the large number of adult students who enroll in college multiple times. The Clearinghouse database, however, can provide information on almost all students. The new study will look at aggregate enrollment patterns, breaking out data by institution type, age of students, gender, geographic location, enrollment intensity and the type of degree pursued. 

4 states' project to share data sheds light on college outcomes

Western consortium says four states' sharing of data on college and workforce outcomes improved their understanding of how their citizens fared after high school.

ABA Approves Changes in Law School Accreditation

The American Bar Association's governing council has approved changes in the ABA rules for accrediting law schools, The National Law Journal reported. The changes will require that law schools have students gain experience in clinics or other real-world settings, and will shift an emphasis from the qualifications of entering students to measures of learning and placement rates. The ABA and law schools have been criticized in the past for not doing enough about law schools that enroll students who may have little chance at employment in jobs that pay enough to repay their loans.

Legislative Panel Votes to Censure U. of Texas Regent

The Texas House Transparency Committee voted Monday to censure Wallace Hall, a member of the University of Texas System Board of Regents, for “misconduct, incompetency in the performance of official duties, or behavior unbefitting a nominee for and holder of a state office," The Texas Tribune reported. Hall has engaged in lengthy investigations, complete with mammoth document requests, of the University of Texas at Austin, and many have accused him of a witch hunt to try to force the ouster of President Bill Powers. Hall issued a statement saying that "the committee's findings are based on distortions, untruths, and intentional misrepresentations."

New workforce fund in Louisiana ties money to jobs and private donations

Louisiana's two-year colleges get the backing of business -- and more state funding -- thanks to workforce focus and program cuts.

Syracuse Tops Princeton Review’s ‘Party School’ List

Syracuse University is the nation’s top party school, according to the Princeton Review’s annual college rankings, which were released Monday.

The ranking dismayed Syracuse officials. “Syracuse University has a long-established reputation for academic excellence with programs that are recognized nationally and internationally as the best in their fields,” university officials said in a statement. “We do not aspire to be a party school.”

The Princeton Review surveyed 130,000 students across the country – an average of 343 students per campus – to develop its rankings. The “party school” rankings come from survey questions on alcohol and drug use, the number of hours students spend studying each day and the Greek system’s popularity.

Syracuse has fretted about college rankings in the past. Nancy Cantor, Syracuse’s former chancellor, disdained rankings. She quipped that the U.S. News & World Report rankings “may sell magazines,” but not much else. Syracuse slid down rankings lists during Cantor’s tenure as the university admitted more low-income and at-risk students.

Its current chancellor, Kent Syverud, who took office in January, pledged to pay more attention to rankings. The dubious honor of “top party school” is likely not what he had in mind.

“With new leadership, we are very focused on enhancing the student experience, both academically and socially,” Syracuse officials said in response to the party-school designation. “Students, parents, faculty and the full Syracuse University community should expect to see important and positive changes in the year ahead that will improve and enhance the student environment in every aspect.”

Officials said the rankings came from a “two-year-old survey of a very small portion of our student body” – a claim that is slightly misleading.

The Princeton Review conducts formal surveys of colleges once every three years. But the company also offers an online survey, which students can complete any time.

“Surveys we receive from students outside of their schools’ normal survey cycles are always factored into the subsequent year’s ranking calculations, so our pool of student survey data is continuously refreshed,” Princeton Review editors wrote in their 2015 “Best 379 Colleges” guidebook.

Brigham Young University ranked number one among “Stone Cold Sober” universities – a title it has captured for 17 years in a row. To celebrate, the university posted an image on its Facebook page of what may be its preferred celebratory beverage: reduced-fat chocolate milk.

White House Talks College Success With Education Leaders from 10 Cities

The White House summoned officials from higher education, K-12 and business in 10 cities to a meeting Thursday at the U.S. Department of Education. The group was brought together to discuss collaborative strategies on college completion, according to a brief written statement from the department. It was a follow-up to the college "summit" the White House held earlier this year. One area of focus was improving college preparedness and remedial success rates, sources said.

The represented cities and counties were Albany, New York; Baltimore County, Maryland; Camden, New Jersey; Denver, Colorado; Kansas City, Missouri; Minneapolis, Minnesota; Providence, Rhode Island; Rio Grande Valley and McAllen, Texas; Riverside County, California; and Spartanburg County, South Carolina.

Bar Exam Technology Disaster

New law graduates in many states experienced a technology snafu at the worst possible time Tuesday night: as they were attempting to upload bar examinations just before deadlines in their states. Many reported spending hours trying and failing to upload their answers. ExamSoft, a company that manages the bar test submission process in many states, acknowledged the "slowness or difficulty" experienced by many test-takers and apologized for the problems. The company, working with various state bar associations, announced 17 deadline extensions by states, so that people who couldn't submit their exams would not be penalized.

The legal blog Above the Law posted some of the emails and social media messages from angry law graduates. The blog said that the situation "appears to be the biggest bar exam debacle in history."

Many bar exams continue today, so the frustrated test-takers who were up late, some fearing that they may have failed by not submitting their day's results, face another stressful day, many of them with less sleep than they would otherwise have had. One comment on the ExamSoft page on Facebook said: "This is unbelievably disrespectful. I don't think you quite understand the pressure we are all under. We understand technical issues happen (although you are supposed to be a tech company), but your 'support staff' is a joke and you should at the VERY least had updates for each of the states BEFORE their respective deadlines. Now we are wondering, HOURS before a second day of grueling testing if any of it will matter. Please answer the states with past or remaining deadlines. Or get someone to answer the phone, chat or email--> have been trying all three methods for 4 hours. Thanks."

One law blogger, Josh Blackman, wondered what would happen if failure rates are higher this year. He explained: "And for crying out loud, this is serious business. Failing the bar in this economy is a 6-month sentence of unemployment. Somewhere, a plaintiff’s lawyer is putting together a class-action suit for those who used ExamSoft and failed."

Let's differentiate between 'competency' and 'mastery' in higher ed (essay)

"Competency-based” education appears to be this year’s answer to America’s higher education challenges, judging from this week's news in Washington. Unlike MOOCs (last year’s solution), there is, refreshingly, greater emphasis on the validation of learning. Yet, all may not be as represented.

On close examination, one might ask whether competency-based education (CBE) programs are really about “competency” or whether they are concerned with something else. Perhaps what is being measured is more closely akin to subject matter “mastery.” The latter can be determined in a relatively straightforward manner, using examinations, projects and other forms of assessment.

However, an understanding of theories, concepts and terms tells us little about an individual’s ability to apply any of these in practice, let alone doing so with the skill and proficiency which would be associated with competence.

Deeming someone competent, in a professional sense, is a task that few competency-based education programs address. While doing an excellent job, in many instances, of determining mastery of a body of knowledge, most fall short in the assessment of true competence.

In the course of their own education, readers can undoubtedly recall the instructors who had complete command of their subjects, but who could not effectively present to their students. The mastery of content did not extend to their being competent as teachers. Other examples might include the much-in-demand marketing professors who did not know how, in practice, to sell their executive education programs. Just as leadership and management differ one from the other, so too do mastery and competence.

My institution has been involved in assessing both mastery and competence for several decades. Created by New York’s Board of Regents in the early 1970s, it is heir to the Regents’ century-old belief in the importance of measuring educational attainment (New York secondary students have been taking Regents Exams, as a requirement for high school graduation, since 1878).

Building on its legacy, the college now offers more than 60 subject matter exams. These have been developed with the help of nationally known subject matter experts and a staff of doctorally prepared psychometricians. New exams are field tested, nationally normed and reviewed for credit by the American Council on Education, which also reviews the assessments of ETS (DSST) and the College Board (CLEP). Such exams are routinely used for assessing subject matter mastery.

In the case of the institution’s competency-based associate degree in nursing, a comprehensive, hands-on assessment of clinical competence is required as a condition of graduation. This evaluation, created with the help of the W.K. Kellogg Foundation in 1975, takes place over three days in an actual hospital, with real patients, from across the life span -- pediatric to geriatric. Performance is closely monitored by multiple, carefully selected and trained nurse educators. Students must demonstrate skill and ability to a level of defined competence within three attempts or face dismissal or transfer from the program.

In developing a competency-based program as opposed to a mastery-based one, there are many challenges that must be addressed if the program is to have credibility. These include:

  • Who specifies the elements to be addressed in a competency determination? In the case of nursing, this is done by the profession. Other fields may not be so fortunate. For instance, who would determine the key areas of competency in the humanities or arts?
  • Who does the assessing, and what criteria must be met to be seen as a qualified assessor of someone’s competency?
  • How will competence be assessed, and is the process scalable? In the nursing example above, we have had to establish a national network of hospitals, as well as recruit, train and field a corps of graduate-prepared nurse educators. At scale, this infrastructure is limited to approximately 2,000 competency assessments per year, far fewer than the number taking the college’s computer-based mastery examinations.
  • Who is to be served by the growing number of CBE programs? Are they returning adults who have been in the workplace long enough to acquire relevant skills and knowledge on the job, or is CBE thought to be relevant even for traditional-aged students?

(It is difficult to imagine many 22-year-olds as competent within a field or profession. Yet, there is little question that most could show some level of mastery of a body of knowledge for which they have prepared.)

  • Do prospective students want this type of learning/validation? Has there been market research that supports the belief that there is demand? We have offered two mastery-based bachelor’s degrees (each for less than $10,000) since 2011. Demand has been modest because of uncertainty about how a degree earned in such a manner might be viewed by employers and graduate schools (this despite the fact that British educators have offered such a model for centuries).
  • Will employers and graduate schools embrace those with credentials earned in a CBE program? Institutions that have varied from the norm (dropping the use of grades, assessing skills vs. time in class) have seen their graduates face admissions challenges when attempting to build on their undergraduate credentials by applying to graduate schools. As for employers, a backlash may be expected if academic institutions sell their graduates as “competent” and later performance makes clear that they are not.

The interest in CBE has, in large part, been driven by the fact that employers no longer see new college graduates as job-ready. In fact, a recent Lumina Foundation report found that only 11 percent of employers believe that recent graduates have the skills needed to succeed within their work forces. One CBE educator has noted, "We are stopping one step short of delivering qualified job applicants if we send them off having 'mastered' content, but not demonstrating competencies." 

Or, as another put it, somewhat more succinctly, “I don't give a damn what they KNOW. I want to know what they can DO.”

The move away from basing academic credit on seat time is to be applauded. Determining levels of mastery through various forms of assessment -- exams, papers, projects, demonstrations, etc. – is certainly a valid way to measure outcomes. However, seat time has rarely been the sole basis for a grade or credit. The measurement tools listed here have been found in the classroom for decades, if not centuries.

Is this a case of old wine in new bottles? Perhaps not. What we now see are programs being approved for Title IV financial aid on the basis of validated learning, not for a specified number of instructional hours; whether the process results in a determination of competence or mastery is secondary, but not unimportant.

A focus on learning independent of time, while welcome, is not the only consideration here. We also need to be more precise in our terminology. The appropriateness of the word competency is questioned when there is no assessment of the use of the learning achieved through a CBE program. Western Governors University, Southern New Hampshire, and Excelsior offer programs that do assess true competency.

Unfortunately, the vast majority of the newly created CBE programs do not. This conflation of terms needs to be addressed if employers are to see value in what is being sold. A determination of “competency” that does not include an assessment of one’s ability to apply theories and concepts cannot be considered a “competency-based” program.

To continue to use “competency” when we mean “mastery” may seem like a small thing. Yet, if we of the academy cannot be more precise in our use of language, we stand to further the distrust which many already have of us. To say that we mean “A” when in fact we mean “B” is to call into question whether we actually know what we are doing.

John F. Ebersole is the president of Excelsior College, in Albany, N.Y.
