Impact Measures (CAEP Standard 4)

1. Impact on P-12 learning and development (Component 4.1)

The unit implemented a new component of its initial programs, the Planning and Assessment Signature Assessment, to address impact on P-12 student learning and development. The Planning and Assessment Signature Assessment immerses candidates in activities most representative of the field. Candidates develop assessments aligned with College and Career Ready Standards and administer those assessments both before and after planned instruction. The data collected are used to analyze P-12 student learning and instructional effectiveness. Candidates receive feedback and support from instructors as they complete this Signature Assessment twice prior to internship, which culminates with the completion of EdTPA. Collecting P-12 student data before and after instruction will help candidates make a case for their impact on P-12 student learning and development, and thus allow the unit to report on that impact. An Excel workbook to collect and ease the analysis of student data was implemented in Spring 2018, and data will be reported in the 2019 CAEP Annual Report.

2. Indicators of teaching effectiveness (Component 4.2)

Teaching effectiveness data was collected using the unit's current Form F observation tool. This data was aggregated and disaggregated by program for the first time in response to CAEP Annual Reporting Measure #2 for the 2018 Annual Report. It is important to note that the Coordinator of Clinical Experiences retired at the close of the 2016-17 academic year, and the data was therefore compiled by a newly appointed Coordinator of Clinical Experiences. As a result, there are some gaps in the data, which are noted in the linked data table. The data collection process has since been streamlined to avoid the loss of data in the future.

The unit has determined that the current observation tool does not provide particularly useful data for the purpose of program improvement. To better address CAEP Component 4.2, the unit will replace the current observation tool (Form F) with the Danielson Observation Tool, which will be implemented in Fall 2018.

Acknowledging the limitations of the current data, some observations worth consideration are as follows:
•    There is some inconsistency in the number of observations completed by both unit supervisors and cooperating teachers, across and within programs. This inconsistency should be eliminated to improve the value of the data, and it can be corrected through training and regular tracking of observation data.
•    Candidates' ratings of unit supervisors and cooperating teachers are consistent across all programs and academic terms.
•    On the other hand, cooperating teachers consistently rate candidates lower in all areas when compared to the ratings of unit supervisors. This pattern indicates that the unit likely needs to provide systematic and standardized training with respect to the use of the observation tool and unit expectations to ensure greater inter-rater reliability.
•    The data also indicates that Alternative A candidates are typically rated higher than candidates enrolled in Bachelor programs by both unit supervisors and cooperating teachers. This pattern may simply reflect that Alternative A program candidates are often older and have some work experience that better equips them for immediate success in the field.

3.    Satisfaction of employers and employment milestones (Component 4.3 | A.4.1)

The unit, like all other EPPs in the state of Alabama, has a gap in the collection of employer satisfaction and employment milestone data. All EPPs in the state hold membership in the Alabama Association of Colleges for Teacher Education (ALACTE). Due to the absence of state-reported information about employer satisfaction or new teacher effectiveness, the membership, in collaboration with the Alabama State Department of Education (ALSDE), developed, validated, and piloted an employer satisfaction survey. The ALSDE, for its part, was to administer the survey annually, encourage employer survey responses, and disaggregate the data by EPP. The ALSDE reported that the response rate for the pilot administration was too small to merit providing data to the organization or to individual EPPs. Moving forward, the survey will be disseminated through the New Teacher Mentoring program beginning in Spring 2018. An update received from the ALSDE in March included the following information: "As of Monday, March 19th, 1,119 new teachers and 745 employers responded to their respective surveys. You will receive raw data for the new teachers who completed one of your teaching field Class B or Alternative Class A programs." This data, not yet received, pertains to the 2017-18 school year and will be included in the 2019 CAEP Annual Report.

In the meantime, the unit has obtained a copy of the ALACTE-developed employer and alumni surveys along with their validation information. The unit is in the process of developing a tracking system for its graduates so that satisfaction data can be collected beyond the first year of teaching. There is also a need to develop a similar survey for advanced program graduates and employers, to track satisfaction data for those changing roles due to the completion of a credentialing program.

4.    Satisfaction of completers (Component 4.4 | A.4.2)

The unit had 65 initial program completers, who were asked to respond to an exit survey to gather program satisfaction data. Of the 65 completers, 31 (22 Alternative A and 9 Undergraduate) submitted the survey, for a return rate of 47.7%. Overall, initial program completers were satisfied with the preparation provided by their programs. The data does show a dip in satisfaction scores in two areas: internship seminars and advising. Further investigation into completers' views of these two areas will be conducted in Spring 2018 through focus group interviews to determine what aspects might be the source of the discontent, so that steps can be taken to ensure candidate needs are being met.
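The return-rate figure above works out as follows (a minimal arithmetic check using only the numbers reported in this section):

```python
# Exit-survey return rate, using the counts reported above.
completers = 65          # initial program completers asked to respond
responses = 22 + 9       # 22 Alternative A + 9 Undergraduate = 31 surveys returned
return_rate = responses / completers

# 31 / 65 = 0.4769..., i.e. the 47.7% return rate reported in the narrative.
print(round(return_rate * 100, 1))
```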

Additional qualitative data was collected via an open-ended question at the close of the survey. Most comments were positive and focused on the helpfulness of unit supervisors, the high quality of cooperating teachers, and praise for Dr. Melina Vaughn, who assisted those participating in the EdTPA pilot program. The only negative comments recorded related to the placement process itself. Several completers logged complaints about the lack of communication from the Coordinator of Clinical Experiences, who retired at the close of the 2016-17 academic year. A newly appointed Coordinator of Clinical Experiences took over in Summer 2017, has streamlined the placement process, and is working to replace paper forms with digital forms to ensure more timely communication with candidates seeking field experience and internship placements.

Beginning in Spring 2018, a revised satisfaction survey will be utilized. This survey was developed by an ALACTE collaborative committee in response to CAEP Standard 4. It has been validated and is also being utilized to collect data from new in-service teachers.

In order to increase the return rate with initial program completers, the survey will be presented to candidates when they attend the third and final internship seminar near the close of the internship, and candidates will be encouraged to complete the survey before leaving the seminar. Currently, satisfaction data is not being collected from advanced program completers, but beginning in Fall 2018 the same satisfaction survey will be tied to the graduation application process for teacher education advanced program completers. Similar surveys to collect completer satisfaction data from advanced program completers in the areas of school counseling, library media, instructional leadership, and teacher leader still need to be developed and validated.

Outcome Measures

5.    Graduation Rates (initial & advanced levels)


The graduation rates for both initial and advanced programs were calculated based on 2012 cohorts. For initial programs, any candidate accepted to the Teacher Education Preparation Program (TEP) was included. For advanced programs, any candidate accepted to and enrolled in an advanced program was included. For initial programs, the unit had a total of 119 potential graduates. Of those, 22 never completed a program, and 11 more completed a program, but not a program supported by the unit, and thus they cannot be counted among the unit's graduates. The unit's calculated graduation rate for the 2012 initial programs' cohort is 72.3%. For advanced programs, the unit had a total of 934 potential graduates. Of those, 375 never completed a program, and 3 more completed one program and began a second program that was not completed. These three candidates counted toward the unit's graduation rate once and against it once; therefore, they were counted twice in the Accepted & Enrolled column but only once in the Graduated column. The unit's calculated graduation rate for the 2012 advanced programs' cohort is 59.8%. The unit has addressed the retention rate through the use of an "attention alert" system in recent years. Any candidate appearing to struggle with attendance, timely work submission, or class participation was flagged, and the appropriate personnel reached out to provide support in an effort to help the candidate maintain an active program status. With the addition of the Learning House partnership, additional retention watches and efforts have been implemented to support candidates.
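The cohort arithmetic above, including the double-counting of the three advanced candidates in the denominator, can be sketched as follows (a minimal check using only the figures reported in this section):

```python
# Initial programs: 119 accepted to TEP; 22 never completed a program;
# 11 completed a program not supported by the unit, so they remain in the
# denominator but are not counted as unit graduates.
initial_accepted = 119
initial_never_completed = 22
initial_completed_elsewhere = 11
initial_graduates = initial_accepted - initial_never_completed - initial_completed_elsewhere  # 86
initial_rate = initial_graduates / initial_accepted  # 86 / 119 = 0.7226..., i.e. 72.3%

# Advanced programs: 934 in the Accepted & Enrolled column (the 3 candidates
# who finished one program and started an uncompleted second are counted once
# per enrollment); 375 entries never led to a completed program, which also
# covers those 3 unfinished second enrollments.
advanced_enrolled = 934
advanced_not_graduated = 375
advanced_graduates = advanced_enrolled - advanced_not_graduated  # 559
advanced_rate = advanced_graduates / advanced_enrolled  # 559 / 934 = 0.5985..., i.e. the reported 59.8%

print(initial_graduates, advanced_graduates)
```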

6.    Ability of completers to meet licensing (certification) and any additional state requirements; Title II (initial & advanced levels)


In order for a candidate to complete any program supported by the unit, they must meet all licensing requirements outlined by the Alabama State Department of Education. As a result, 100% of the unit's completers meet licensing requirements. The following chart outlines the unit's completer numbers for the 2016-17 academic year.

7.    Ability of completers to be hired in education positions for which they have prepared (initial & advanced levels)


The University’s Counseling and Career Services Department collected data from graduates using the National Association of Colleges and Employers’ Final Destination Survey.  Among other data, the survey collected employment information, which is reported above.  The data does not provide specific information related to employment field, but it does provide information about the employment status of some graduates.

The reported graduate numbers do not match the unit's completer numbers, and there are gaps in reporting. Locating graduates and successfully collecting survey data after graduation is challenging. Although a large number of graduates reported current employment, many graduates did not respond to the survey, and the list of graduates contacted is incomplete. It is also not possible to state whether unit completers are employed in their field. This data further supports the need for the unit to develop a graduate tracking system so that employment data, in addition to satisfaction data, can be collected.

8.    Student loan default rates and other consumer information (initial & advanced levels)

The University's Official Default Notification Letter was provided by the financial aid office. The data is based on the cohort of loans that began repayment in 2014 and defaulted by September 30, 2016, but the data is not disaggregated by college or by programs within a college. The current UWA loan default rate is 8.5%.