February 1, 2007
Assessment of student learning has been a recent focus of departmental activity, both because of its inherent value in improving our students' learning and because of its value in our accreditation process. Many individual departments have devised creative ways of assessing student learning in the context of their disciplines, and this is your opportunity to "look over their shoulders" and learn from their successes! Ginger Vaughan and Meredith Neuman from English, Deb Martin from Geography, Nina Kushner from History, and Deb Robertson from Biology will share highlights of their assessment plans and processes.
This workshop included outcomes assessment materials from the following departments.
I. An Introduction
In my two years at Clark, I've been making the rounds of departments, evangelizing about assessment. There are many kinds and levels of assessment; for the purposes of this discussion, I am talking about assessment of academic programs, i.e., at the department level. I fundamentally and passionately believe that assessment is the right thing to do, because it gives us information that we can use to improve our students' learning. Yes, accreditors require assessment, but that's incidental to why we should do it; it just gives us a timeline.
As I've traveled around to departments, many of you have told me that it would be helpful to see examples of assessment work from other departments, hence this session. I've picked four departments that are approaching assessment in different ways and from different disciplinary contexts, and have invited representatives of those departments to give you a snapshot of their work. My hope is that you will see some ways in which the assessment activities of these four departments, and of the others that come up in discussion, can be adapted to the assessment work in your own department.
I want to start with an overview of program assessment. The point of assessment is to find out whether and how well, as a result of a Clark education, students are learning what we want them to learn. Once we know what students are and aren't learning, we have information that we can use to improve our educational program. It's important to note that assessment does not involve evaluation of any individual's performance, either faculty or student. It's about program improvement.
The attached handout summarizes the assessment cycle.
One turn of this cycle generally takes about two academic years. It doesn't need to be onerous; you can pick one or two outcomes at a time to work on, but it needs to be ongoing. We had a NEASC visit in fall 2005, at which point almost all departments had begun planning assessment activities. Although accreditation is not the most important reason to do this, in summer 2010 we owe NEASC a written report that shows that we have made progress in assessment of learning outcomes. So that means it's time to get started now, and I am ready, willing, and able to work with departments to get your assessment plans rolling. I think you will hear in this session that assessment is its own reward: it can help you achieve what you value.
II. Clark University Biology Department Learning Goals and Objectives
Goal 1: To provide students with a broad conceptual background in the biological sciences.
Objective 1.1: Students will demonstrate an understanding of Mendelian and molecular genetics, cell structure, cell physiology, and molecular processes of cells.
Objective 1.2: Students will demonstrate an understanding of organismal form, function and diversity.
Objective 1.3: Students will demonstrate an understanding of the principles of evolution and ecology.
Goal 2: To provide students with technical and analytical skills used in modern biological research.
Objective 2.1: Students will demonstrate proper and safe laboratory practice, proper use of equipment, and the ability to use basic techniques in several areas and advanced techniques in at least one area.
Objective 2.2: Students will demonstrate the ability to perform appropriate quantitative analysis of experimental data and draw valid conclusions from their analyses.
Objective 2.3: Students will demonstrate the ability to work effectively with computers to acquire and analyze experimental data.
Goal 3: To provide students with the ability to develop hypotheses and design approaches to evaluate them, as well as access and critically evaluate information in biology.
Objective 3.1: Students will demonstrate the ability to develop testable hypotheses, design appropriate experiments, and present reasoned analyses and interpretations of results.
Objective 3.2: Students will demonstrate the ability to effectively use electronic media to access biological information.
Objective 3.3: Students will demonstrate the ability to critically evaluate a journal article from the primary literature.
Goal 4: To provide students with the ability to effectively communicate the findings of biological research and incorporate these findings into the existing body of knowledge in biology.
Objective 4.1: Students will demonstrate the ability to report the results of their experiments in a written paper that conforms to the scientific conventions of that field and incorporate the findings into the existing body of knowledge in biology.
Objective 4.2: Students will demonstrate the ability to orally communicate the findings of their experiments or the work of others.
Goal 5: To prepare students to undertake careers in the biological sciences.
Objective 5.1: Graduates will demonstrate the ability to use their degrees to undertake careers in biology or to gain admittance to graduate or professional school.
III. English Major Outcomes Assessment
Outcomes for English Courses (handout)
English Department Overview
Four years ago, the English Department began assessing whether senior English majors had actually attained the outcomes and objectives listed here. Each senior English major is required to take a capstone seminar in the fall of his or her senior year. The seminar culminates in a capstone project of the student's choosing. We ask students to turn in a hard copy of their project to the professor but at the same time to submit it as an electronic attachment, with their name removed, to the English Department secretary. The attachments are offloaded onto a CD, and the following year, after the students have graduated, a member of the department, fondly dubbed the "Assessor Professor," picks 15 or so of the projects and evaluates them according to these goals and objectives. After reviewing the student projects, the Assessor Professor reports back to the department on the students' strengths and weaknesses, and we adjust our curriculum in ways we hope will improve the major. As a result of this circular process, we have beefed up aspects of our courses and articulated more clearly in our Majors Handbook just what our goals and expectations are.
Tales from an Assessor Professor
So as not to go too crazy, and in the interest of bringing a different kind of measurement to the process, I devised an informal way to rank writing skills. Below is my very non-objective "matrix" for assessing writing competence. (This is just one of many skills to consider, but it proves a particularly knotty one to quantify.)
IV. Geography Major Outcomes Assessment
GES Learning Synopsis Chart
Poster Outcome Evaluation Form
GES Learning Synopsis Assessment Plan
The poster assessment was the first assessment tool we easily agreed upon as a department, because all of our majors must complete an original research project as part of the geography major and present a poster of it at Clark or at a regional or national geography conference. The assessment categories were developed by a committee, focusing on the elements that would apply across geography's diverse subfields. The assessment was field-tested in the first year, and the current version reflects changes to the assessment matrix. We are still working on gathering sufficient data to determine what this learning outcome assessment tells us about our program; we expect to spend the next three years collecting data on this measure before we can fully evaluate its success as a learning outcome.
The second assessment tool we have been working on is called the learning synopsis, which relies upon a "SWOT" (strengths, weaknesses, opportunities, threats) evaluation. We designed this tool as a modified portfolio, but with a focus on student summaries of their learning rather than intensive evaluation of materials produced in our courses. Part of the goal of the synopsis is to produce a document that helps students "market" their expertise, while also providing the department with critical feedback on the strengths and weaknesses of the program as measured against our stated goals. This tool will go into effect in spring 2008 and will be required of all graduating majors. Students are asked to prepare the synopsis in consultation with their advisors. GES students are expected to have two GES faculty sign off on it, since GES is a more disparate major than Geography.
V. History Major Outcome Assessment
Rubric for Capstone Paper
Rubric for Writing History Final Paper
We have identified two points in our program at which assessment is most practical and practicable. These include the final project for Writing History (History 120) and the Capstone Paper.
Points of Assessment: The history department requires majors to take a methods course called Writing History. We encourage our students to take this course in their sophomore year. The course's purpose is twofold: to get students thinking about the nature and purpose of history and to introduce them to the techniques of writing history. The course's main requirement is a research proposal on a topic of the student's choosing. Constructing this proposal requires a familiarity with the basic elements of historical scholarship, skills which will be further developed over the course of the major.
The Capstone Paper, a semester-long research project written in the senior year, strengthens and demonstrates students' command of these skills. The Capstone Paper is not generated from a designated course, although most of our majors write them as part of research seminars.
The Assessment Process: At the end of the academic year, a member of the department, dubbed the "Assessor Professor," will assess about two-thirds of the Writing History and Capstone Papers, selected at random. These papers are assessed using rubrics (see attached) that enumerate the various elements of historical scholarship. Given that students take Writing History in their sophomore year and the Capstone in their senior year, we plan to go through this process every other year, in an attempt to measure the same students.
Our expectation is that students will not have mastered all of the skills after Writing History, but will understand the types of skills valued in the discipline. In assessing their work we can 1) see where the most problematic points are and 2) develop a baseline for comparison in the second assessment.