Assessment Work Group Meetings 2008-2009

The Assessment Work Group (AWG), which includes assessment professionals from across campus and selected staff and faculty, focuses on improving communication, cooperation, and processes related to assessment at NC State University.

February 10, 2009

Welcome Teri Kaasa, Ph.D., Assistant Director of Assessment, DUAP.

Return on Investment Activity: Are there patterns of results across the university? Trey Standish presented data on identifying “at risk” students using data mining and statistical techniques.  Joni shared data on one cohort to show how it discriminated between those who graduated and those who did not.  A lively discussion ensued about next steps and whether more variables should be added to the First Year Survey to improve the statistical models.  The group asked Trey to go back and add the current First Year Survey variables to determine whether any improved the model.  The group also discussed what would be gained if more variables were identified.  Many concluded that much data is already available and that the priority should be following up on the data that already exists.

Next Meeting:  Discuss Success Stories, as this item was not discussed at the February Meeting.

December 18, 2008

Return on Investment Activity: Overview of the major activities of the ACCESS group related to assessment.  Are there patterns of results across the university? Carrie Zelna shared what ACCESS has been doing.  Several issues related to assessment activities were discussed, including developing outcomes for first-year students and the stop-out survey.  The ACCESS group has developed a set of outcomes that can be used to inform students of what is expected of them during the first year.  These may be piloted during 2009, and some may be assessed. Carrie has been leading the development and administration of a survey sent to students who do not return the following semester.  Two administrations have occurred.  The first administration was between fall 2007 and spring 2008: students who were in the census as of fall 2007 but not in the census as of spring 2008 were asked to complete a survey about why they did not return. The second administration was between spring 2008 and fall 2008. Carrie shared the results with the group.

The AWG was excited about the possibilities of this type of data gathering.  Dianne said that she and Trey Standish (UPA) had developed a model to predict retention and have been finding interesting results.  Allen Dupont and Pam Stienke had also gathered data on non-cognitive variables that Pam had begun to study.  Joni shared that she had developed a model to discriminate retention based on non-cognitive data from the engineering cohort of 2002.  Nancy suggested that some of these non-cognitive items might be collected via the Freshmen survey, as it was going online in fall 2009.  The AWG decided to discuss these models and ideas for Freshmen survey questions.  Trey Standish and possibly an engineering faculty member would be invited to share models and join the discussion.  Joni will set up this meeting.

Return on Investment Activity: Each person brings an example of what they think a “success story” might look like.  To be decided during the meeting: What do we want?  Who is the audience?  What would the outline or criteria for each story entail? How should it be conveyed/formatted – website, newsletter, written, verbal (podcast), multimedia?

Members shared “success stories” of their clients using assessment data for decision making.  We discussed that these are good stories and should be presented in the format of a story:  state the reason for the story (e.g. faculty dissatisfied with student learning), give the story an arc (discover, do something, find something), and end with a conclusion that matches the reason for the story (we can do this better). We discussed that having a picture of the person, a chart of the process, or some other visual image would be important.  It is fine to do this in mixed media, e.g. YouTube clips or written stories. We discussed that we should decide on the themes of the stories we want and then find those stories.  We could have them match the Assessment Principles and Best Practices.  We could list a series of themes, e.g. “don’t want to do assessment, but found it useful,” “when dollars are scarce, how assessment data helped,” “technology usage,” etc.  Our next task, then, is to decide on these themes.

October 16, 2008

The majority of the meeting was spent deciding what AWG wanted to accomplish by summer 2009, based on the discussions from the last two meetings.  Following are the ideas that AWG will focus on during 2008-2009 in relation to the “Return on Investment @ Institutional Level And At Program/Unit Levels” activity area.

  • Identify “success stories”.  Success means that a unit/program can show evidence it used to make decision(s).  We discussed methods of presenting these stories, such as a website or multimedia, and who the audience for this information should be.  At the next meeting each person will bring an example to help with the discussion.  AWG needs to decide on the audience and the outline or criteria for each story.
  • Because of SACS, UPA will lead a “review of SACS criteria” during 2008-2009, including 3.3.1 (institutional effectiveness).  Karen Helm will develop a team that will include some members of AWG plus faculty, staff, and administrative representatives.  As part of this review, AWG thought it would be important not only to evaluate the assessment process (assessment of assessment), but also to know who is using the results and how.  Through this process the review team could identify success stories and successful strategies, and help define efficiencies in processes.  The findings from the review of the SACS criteria could be used to increase awareness of assessment support, surveys, experts, and expertise at NCSU.
  • Defining “university outcomes” – both student learning outcomes and other expected outcomes.  We discussed that, at a minimum, AWG could gather a list of various outcomes, see how they relate, and see if any efficiencies could be defined.  AWG could develop a concept map or umbrella of university outcomes that have been defined from the bottom up, rather than from the top down. More discussion is needed about AWG’s role on this topic.

September 11, 2008

TOPICS included further discussion of Return on Investment, the Staff Well-Being Survey, CLA results, and the CHEA award process. Staff Well-Being Survey results are posted on the web. Questions related to assessment: section D – question 3d, section E – questions 7 & 8, section G – question 16q. Allen Dupont briefly discussed the process and results of the 2007-2008 CLA administration for NCSU; UNC-GA is not requiring it for 2008-2009. Allen Dupont also discussed the process for applying for the CHEA award.  AWG decided it would be in better shape to respond to this next year, after reviewing the assessment process for SACS during 2008-2009.

The majority of the meeting was focused on developing a better understanding of the Return on Investment activity area (see July 2008 notes). The group brainstormed ideas, examples, and definitions related to “Return on Investment” and ways AWG could implement these ideas.

Examples of Return on Investment included some of these ideas:

  • Examining assessment evidence and using results for decisions
    • Surveys (e.g. UPA conducted surveys); use of longitudinal data for decisions about policies, programming, etc
    • LITRE evidence
    • Closing the loop – many AWG members send out reports and evidence to many others, but get no response (negative example)
    • Closing the loop – faculty write reports and get no response from Deans or Provosts (negative example) or faculty write up reports and get response from Deans and Provost (e.g. graduate school external review).
    • Evidence that this is part of the culture at NCSU would include discussion at meetings around campus, use of assessment evidence/data in compact planning at the department, college, and institutional levels, and evidence that changes in curriculum are based on assessment data.
  • Replicating studies
    • LITRE – LITRE projects are replicated in other courses/areas; adaptations are made based on assessment
    • Sharing results of assessment project – “market” that the results showed X
    • AWG members use examples of others’ work as success stories. In general, success in using tools/approaches can spur others to adopt assessment approaches
  • Making the connections between assessment and teaching (learning and teaching)
    • LITRE – discussing connection between assessment and teaching
    • Faculty are finding that assessment helps them improve their teaching and student learning
    • Evidence that this is part of culture at NCSU – more discussion about teaching and learning
  • Ensuring process is useful and used
    • Graduate School External Review – useful because it is used by administration, and results and decisions are followed up every year (e.g. action plans)
  • Identifying student needs and targeting resources
  • Valuing assessment
  • Identifying ways to break down institutional silos – foster collaboration
  • Defining cost of assessment data gathering vs benefit of results
  • Documenting the increase in those doing assessment over time – which shows return on investment of the effort of assessment across campus

What can AWG do to enable/influence Return on Investment? Some ideas:

  • Develop process for disseminating UPA administered surveys.
    • Nancy Whelchel works with AWG to help us understand the data collection process and how to interpret data.
    • AWG members go to their clients and help clients use the data appropriately.
    • AWG members give feedback to Nancy about what is useful and what is not, and about needs regarding survey items.
  • Similarly, lead/set up discussions of available university data points and performance measures (AWG gathers the information that would be of most use)
    • Include dept heads, student affairs, DUAP, assoc deans, etc. in both discussion and identifying what data they need for development of plans; what data they will share, etc.
    • Make data available in executive summary format
  • To improve culture/climate, discuss assessment evidence and results, and how they were used.
    • Have faculty who have assessment successes in the classrooms share with other faculty
    • Reframe assessment successes into teaching practices.
    • Present LITRE results to Dept Heads meetings
    • Develop forum for programs to hear and discuss revisions to curriculum and link assessment results to curriculum changes
    • When discussing evidence, note the source and methodology of the evidence.
  • Develop a database of data sources, which groups need to know about them, how the data is being used, etc. Develop a “clearing house” of who is gathering what type of data, with a calendar.
  • Write news releases about how data was used and what was learned, with a different focus for students, staff, and faculty.
  • Develop better mechanisms to report “closing the loop” (e.g. which curriculum was changed due to assessment evidence, what that evidence was, whether the change “worked”, etc.)
  • Improve the perceived value of assessment and the credibility of AWG members
  • For accountability – discuss and influence what measures should be used
  • Develop best practice models – use a “model” of effective ROI, understand what elements made it effective, and apply it to other levels and units. (e.g. developing action plans after reports for the Grad School Review has been shown to be effective ROI)
  • During meetings, reinforce needs-driven/data-driven decisions by asking for evidence.
  • Provide written criteria for reporting – help with writing up results
  • Emphasize not just the assessment results but their impact on learning and teaching
  • Develop ways to help others assess “cost” vs “benefit” of various assessment methods.

July 29, 2008

Using a series of exercises, Kevin Rice facilitated the group’s discussion to identify its mission, vision, and purpose for the next year. The first step was to brainstorm using the Fishbone Technique; the group then narrowed the topics down to the top three for the coming year.

Return On Investment (ROI) @ Institutional Level And At Program/Unit Levels

  • What is the evidence that assessment is making a difference?
  • Patterns of results across programs/units, colleges?
  • Coaching our clients on “how to use results”, how to interpret results

Develop Collaborative Projects Among AWG Members

  • Drive collaborative projects/topics as we see the need for them. For example: portfolios, student participation in activities
  • Look for “gaps” and “overlaps” of AWG members’ activities

Info/Advice about AWG members’ work/responsibilities at macro and micro levels

  • Macro level: for example: SACS, VSA, University Undergraduate Outcomes
  • Micro level: for example: specific projects, “how would I….?”