Outcomes Data


Context Statement

Assessment procedures for LAEP are defined by two predominant factors:

  1. As a nationally accredited professional program, and one based upon highly visual skills, the program is governed by reviews (both of the Program and of its students) that continually assess performance. Those reviews are established by the governing body, the Landscape Architectural Accreditation Board (LAAB), and by studio-based pedagogical traditions.
  2. LAEP has seen unprecedented turnover in its sole administrative position, the Department Head, in recent years. From 1999 until 2008, LAEP had six changes in department leadership, and of the six individuals who filled the position during those years, only two remain on the department faculty. The consistent use of varied assessment procedures, and national accreditation in particular, has meant that LAEP has been continuously examined for how successfully it achieves the goals it sets out to achieve. In contrast, one result of the upheaval in department leadership has been a highly inconsistent assessment process: data recording, follow-up, and enthusiasm for the benefits of assessment have been particularly harmed by leadership changes. The Outcomes Data below should be read within this context.

LAAB Professional Accreditation Standards

In 2005, LAEP was visited by an LAAB review team, and the program’s full accreditation was renewed for the maximum period (six years). The program will next be assessed for accreditation in spring 2011. The 2005 review team’s report cited the following assessments:


    Category                                        Assessment
    Program Mission and Objectives                  Met with Recommendations
    Governance/Administration                       Met
    Professional Curriculum                         Met
    Faculty                                         Met
    Students                                        Met
    Alumni                                          Met
    Practitioners                                   Met
    Relation to the University and the Community    Met
    Facilities and Equipment                        Met

The team’s lone recommendation—for Program Mission and Objectives—included the following steps:

  • “Revisit the Program’s mission and its supporting goals and objectives to reflect current realities and future aspirations”
  • “Evaluate the curriculum and individual course content for their ability to implement the goals and objectives of the mission”

Advisory Board Review

As an independent body, the Advisory Board operates outside the oversight of the Program. As such, the Board has not historically kept a consistent record of findings and recommendations. Instead, the Board has reported its conclusions at the annual meeting, and the Program has been responsive to those recommendations in making appropriate changes. An important change to this process will be formal reporting on Board recommendations. The Board’s turnover in members, coupled with the rapid change in LAEP’s administration, means that these processes will evolve when the Board next convenes in 2009. Doing so will improve the capture of outcomes data from the Advisory Board.

Public Juries

Public juries of student work provide an ongoing assessment point for LAEP. However, data from those juries is not systematically collected. Instead, guest critics provide input to the instructor for the course and to each student on his or her individual work. This project-by-project communication loop serves a distinct function, but it does not tend to produce recorded outcomes data.

Portfolio Review

Since 2006, the average number of sophomores seeking to matriculate into the upper division of the BLA has been approximately 33. Of those applicants, each of whom submits a personal portfolio, 27 on average have been accepted. In contrast, some 45 students, on average, begin the sophomore year in pursuit of the BLA. Consequently, the portfolio review remains a key assessment tool for annually examining the quality of work being produced by students in the lower division of the Program. Given the consistent nature of the portfolio requirements, each year’s students respond to the same expectations, and vie for the same number of slots against a similar number of classmates.

Internal Curriculum Review

In 2006, following an extensive self-assessment process, LAEP outlined a set of curriculum changes. Weaknesses in the curriculum identified in 2006 included:

  • LEED
  • Emphasis on presentation and presentation evaluation
  • Graphic Design (page/poster design)
  • BMPs – bioengineering
  • Local History
  • Lighting

The review, prompted by the 2005 LAAB accreditation team’s recommendation (see above), was instrumental in identifying these six weaknesses. The subsequent course-by-course competency review resulted in the Learning Objectives matrix reported in this Assessment.

Updating Outcomes


Rubric

In 2009, LAEP developed a rubric to aid in assessing learning outcomes during the junior year of the BLA. The first data to be collected from the rubric will be available during the 2009/2010 academic year.

Exit Assessments

In 2009, LAEP conducted exit surveys and exit interviews for the first time. Despite steps to increase participation, a relatively low 30% of seniors (n=8) took part, and some failed to submit the written survey. Data from those surveys will be compiled and analyzed during the 2009/2010 academic year.