Assessing General Education at Boston University’s College of General Studies


Our assessment project at Boston University’s College of General Studies uses ePortfolio to evaluate student learning in our general education program.

 


John Regan presents at the AAEEBL International Conference on Assessment

Summary

At the College of General Studies (CGS), Boston University (BU), we have completed the second year of a two-year assessment project funded by the Davis Educational Foundation. The project has helped us establish an ongoing model for Outcomes Assessment at our college. It involves a team of 11 faculty members assessing two years of general education work posted by 100+ students on their ePortfolios. (Students post work from each of the general education courses they take during their two years in our college.) In assessing this work, faculty use a rubric informed by the AAC&U VALUE model to gauge students’ levels of competency in areas such as critical thinking and perspective taking, writing skills, and awareness of historical and rhetorical contexts. We are pleased to have encouraging quantitative and qualitative data from our first two summers of assessing 106 ePortfolios, but we continue to face challenges with faculty and student buy-in: some students are not posting work from each of their classes, and some faculty are not encouraging them to do so. We also face challenges in translating assessment data into curricular and pedagogical change.

Authors:

John Regan, Senior Lecturer, Department of Rhetoric, CGS (Author, Lead Editor and Web Designer); Natalie McKnight, Dean of the College of General Studies (CGS); Amod Lele, Instructional Technologist, Boston University; Gillian Pierce, Assistant Professor, Dept. of Rhetoric, CGS.

Overview of ePortfolio-related Outcomes Assessment on Our Campus

Part I: Setting the Stage:  Outcomes Assessment on Our Campus

Outside of professional programs such as the Dental and Public Health schools, Boston University has not had widespread outcomes assessment, and we’ve had very little in general education in particular. The Writing Program at the College of Arts and Sciences, which oversees required writing courses for most undergraduates at BU, has instituted a hard-copy portfolio-review assessment of its program, but that effort is still in its early stages. Recently BU has moved toward university-wide assessment led by the Associate Provost for Undergraduate Affairs. NEASC (the New England Association of Schools and Colleges) is our accrediting agency, and when it is time for BU to be re-accredited, the Provost appoints someone to manage the process (last time it was the Dean of our college, who was working for the Provost at that point; next time it will probably be the Associate Provost for Undergraduate Affairs, though the process involves input from Deans, Chairs, and other academic leaders as well). While some faculty still bristle at the idea of assessment, many are coming to realize that it is a necessity, an imperative that is not going to go away.

Part II: Outcomes Assessment Developmental Story

Since the former Dean of CGS, Linda Wells, oversaw BU’s last reaccreditation, she was well aware of our need to develop a system for assessing the impact of our program (the lack of sufficient assessment processes was identified as a weakness in the last accreditation report). She formed a committee to explore assessment options and to write a grant to support our assessment work. Through this inquiry process, the committee discovered sources indicating the usefulness of ePortfolios for assessment. At the same time, the Associate Provost for Undergraduate Education had begun an ePortfolio initiative across campus, for similar reasons. The integration of these two projects led to our first successful grant application to the Davis Educational Foundation, which paid faculty stipends to attend ePortfolio workshops and training sessions. (For more details, see our “Scaling Up” story and our “Professional Development” narrative under “Polished Practices” in our BU ePortfolio.) A faculty committee with representatives from each division in the college also worked on a rubric that would best reflect the goals of our program and could be used for programmatic assessment.

CGS Rubric

For a readable version of the rubric, click here.

 

By the end of 2010, all students at CGS had ePortfolios in which they were archiving work from each of their courses at CGS. In 2011, we submitted another grant application, this time to pay faculty stipends to use the rubric to assess the work archived in 100+ student ePortfolios in the summer of 2012. (The grant also included money for travel to conferences to present our findings.) The Assessment Committee in charge of this project, made up of 11 faculty members, met once a month for a year to assess student ePortfolios as a group in order to establish inter-rater reliability before conducting assessments independently over the summer. At the end of the summer of 2012, we collected assessment score sheets from the members of the Assessment Committee and entered the data into a FileMaker database in order to compute averages and standard deviations. We followed a similar method of data collection in the summer of 2013.
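For readers curious about the mechanics of this step, here is a minimal sketch of the averages-and-standard-deviations computation. Our actual workflow used a FileMaker database; the Python below is illustrative only, assuming (hypothetically) that the score sheets were combined into a single CSV with one row per ePortfolio and one column per rubric outcome, and the file name and outcome labels are placeholders rather than our actual rubric categories.

```python
import csv
import statistics
from collections import defaultdict

# Hypothetical layout: score sheets combined into one CSV, one row per
# ePortfolio, one column per rubric outcome (scores on a 1-4 scale).
# The file name and outcome labels are illustrative, not the actual
# CGS rubric categories.
OUTCOMES = ["critical_thinking", "writing", "rhetorical_awareness"]

scores = defaultdict(list)
with open("term1_scores.csv", newline="") as f:
    for row in csv.DictReader(f):
        for outcome in OUTCOMES:
            if row.get(outcome):  # skip blank/missing scores
                scores[outcome].append(float(row[outcome]))

# Report the per-outcome mean and (sample) standard deviation.
for outcome, values in scores.items():
    mean = statistics.mean(values)
    stdev = statistics.stdev(values) if len(values) > 1 else 0.0
    print(f"{outcome}: n={len(values)} mean={mean:.2f} stdev={stdev:.2f}")
```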

Part III: Conceptual Framework

Our work in Outcomes Assessment incorporates the principles of Inquiry, Reflection, and Integration:

Inquiry:

Our use of ePortfolio for Outcomes Assessment has encouraged faculty to design more student-centered, inquiry-based assignments. Given how easily students can incorporate visual/audio/video components into their work on the Digication platform, and, more importantly, how the level of student engagement increases when assignments require these components, faculty are creating new assignments and tweaking existing ones to deepen student learning.

For example, in our Natural Science course, Professors Sally Sommers Smith and Kari Lavalli ask students to visit the Museum of Fine Arts and describe a landscape painting from a scientific perspective. Below is an image from a response by Adri Alcivar (BU’13):

[Image: excerpt from Adri Alcivar’s response]

Moreover, in some sections of Social Science, students have focused on the geography of Boston neighborhoods, gathering images from specific neighborhoods and then analyzing those images in the context of sociological theories of urbanization.

Faculty are encouraged to “spread the word” about their successful assignments in departmental meetings, college-wide faculty meetings, and faculty development events sponsored by our Faculty Development, Student Research and Writing, and Grants and Assessment Committees.

Reflection:

In addition to encouraging faculty to use more reflection-based activities in their courses, we have instituted important reflection assignments at the end of both the freshman and sophomore years. At the end of the first year, students are asked to reflect upon their growth (or lack thereof) in our seven learning outcomes; click here to learn more about this activity. We have also added a reflection component to our existing sophomore-year Capstone project. This activity asks students to reflect on the relationship between their work in the Capstone group project and their two years of study in our general education program; click here to learn more about Capstone and its reflection component.

Integration:

Given that we now have two years of data on student learning, we are beginning to work toward “closing the loop” and using our ePortfolio-based assessment as part of our approach to curricular and institutional improvement. To cite one example, we have noticed a lack of student work in quantitative methods in the first year (Terms 1 and 2). This leads directly to questions about our curriculum: Is the work students do with quantitative methods in their sophomore-year science courses sufficient, or should more be done in the first year? If more work in quantitative methods is needed, which faculty and courses should provide it, and to what extent? While the paucity of work in quantitative methods in Terms 1 and 2 might lead us to question whether quantitative methods is in fact a key learning outcome at our college, at this point we believe that it is. We have begun to discuss the situation at the departmental level, and while no consensus has emerged regarding a course of action, some first-year faculty do seem to recognize that students may benefit from more work in quantitative methods during their first two terms. Indeed, there seems to be a growing recognition nationally that students would benefit from more engagement with quantitative methods. As Princeton economist Alan Blinder recently observed, “America is shamefully inadequate at teaching statistics. A student can travel from kindergarten to a Ph.D. without ever encountering the subject. Yet statistics are ubiquitous in life, and so should be statistical reasoning” (New York Times, Jan. 5, 2014, http://www.nytimes.com/2014/01/05/books/review/inside-the-list.html?_r=0).

Evidence

We assessed student performance in seven key learning outcomes over four terms in our program. The following data represents the assessment of over 100 student ePortfolios in 2012:


CGS 2012 Assessment Data: Term 1 vs. Term 4

We are now in the process of discussing how to interpret this data and translate it into improvements in our program. We are also considering our results in relation to other studies of student performance:

  • In Academically Adrift, Richard Arum and Josipa Roksa report on data they collected from thousands of students at 24 four-year colleges who took the CLA (Collegiate Learning Assessment) test in their first term and at the end of their fourth term (the same points we focus on in our project).
  • In their study, in the important areas of written communication and critical thinking, students showed only a “seven percentile point gain, meaning that an average-scoring student in the fall of 2005 would score seven percentile points higher in the spring of 2007” (35). (Ours: 27%)
  • In Making Progress? What We Know about the Achievement of Liberal Education Outcomes, Ashley Finley shows that the Educational Testing Service Proficiency Profile and the Collegiate Assessment of Academic Proficiency (CAAP) also reveal similarly low rates of student progress.

Why are our assessment results so much better?

  • Is the College of General Studies simply a phenomenally successful program?
  • Or does an ePortfolio assessment system provide a much more authentic, relevant, nuanced, and comprehensive picture of student progress?
  • Both?

We are also continuing to assess ePortfolios together in order to maintain inter-rater reliability in preparation for future assessment. Having completed our second round of assessment in the summer of 2013, we are now moving to conduct this assessment work during the academic year; we may at that point make some changes in the rubric and in our means of conducting the assessment, but those decisions will be made next year.

Our results from our second year of assessment are as follows:

Assessment Data: Summer 2013 (Second Year of Data) 


In comparing the data from the 2012 and 2013 assessments, six of the seven rubric areas are similar in their rate of increase from Term 1 to Term 4. Regarding quantitative methods, the difference between the two assessments may be attributable to the limited amount of work in quantitative methods on student ePortfolios in Terms 1 and 2. (This lack has led to a broader discussion at our college about the role of quantitative methods in our curriculum.)

Connections to Other Sectors of the Catalyst

Pedagogy:  The impact of ePortfolio is enhanced by integrative, reflective, and social pedagogies. Please see the Conceptual Framework section above for a more detailed discussion.

Scaling Up:  Click here to read our full Scaling Up story.

Professional Development: Click here to read about our professional development activities.

Technology:  Click here to read our full Technology story.     

Next Steps

Since we won’t always have grant money to fund assessment work in the summer, we need to make assessment part of the normal committee work of the academic year. With that in mind, we might simplify the process somewhat. We could, for instance, assess 100 portfolios once every two years. During the year we are not formally assessing, we can still hold monthly meetings to train newcomers to the committee and to continue establishing inter-rater reliability. In that case, faculty assigned to the committee would need to make a two-year commitment. We might also move to evaluating work only from the first and fourth terms, instead of from each term. That would reduce the labor involved in assessing while still, we hope, capturing a sense of a student’s growth (or lack thereof). We may also decide that some outcomes in our rubric are not really reflective of our program, in which case we would need to tweak the rubric. Conversely, we may find that the rubric-based assessment highlights areas of our program that need improvement, and we would then need to find ways of making those improvements.