Teaching Resources

Teaching Philosophy

 

 

Teaching Portfolio Resources

A teaching portfolio is a cohesive collection of documents that communicates your philosophy of teaching, summarizes your teaching activity, provides evidence of your effectiveness, highlights student mentorship, and demonstrates a commitment to professional development. (TLTC)

Teaching portfolios are now required for all faculty, both tenure-track (TTK) and professional-track (PTK), wishing to be considered for promotion.

Signature Programs


A signature program in UME is one that:

  • fits the mission of UME;
  • meets critical clientele needs;
  • incorporates research, evaluation, and scholarship;
  • demonstrates the ability to be replicated;
  • incorporates a marketing and communication component;
  • establishes public value; and
  • demonstrates sustainability through ongoing UME funding.

Explicit details of these components can be found in the UME Program Assessment Tool (PAT).

Action teams, in conjunction with Program Leaders, determine when a program should be submitted for signature program status.

Programs submitted for signature program status will be reviewed by an ad hoc committee made up of the:

  • UME Associate Dean/Director,
  • UME Assistant Directors, and
  • one representative of each professional association that is recognized by JCEP.

Committee members will be appointed by their respective associations and will serve two-year terms. The committee will meet twice a year, on the second Tuesday of April and October, during the regularly scheduled program leadership team meeting from 9:00 a.m. until 12:00 noon.

Committee members who must travel to this meeting will have all expenses covered by the University of Maryland Extension administration.

Assessment Procedures:

  1. Action team members or groups of faculty members should schedule a time to work together to review and assess the development level of the program, using the PAT Worksheet as a guide. Applicants are encouraged to work with their respective Program Leaders during the review process; Program Leaders can deepen applicants’ understanding of the review criteria and offer suggestions to strengthen the application.

    Note that not every item in the PAT and the PAT Worksheet, across all of the categories, must be present in the program. In other words, not all boxes have to be checked. However, the action team members or faculty members conducting the review must believe there is enough evidence to warrant consideration for signature program status. This form, with notes about the program, is to be submitted as part of the application.
     
  2. The application packet will include a cover letter, the PAT Worksheet completed by the applicant(s), the application narrative, and program documents. The applicant narrative will describe how the program meets the PAT criteria, with evidence, and will follow the PAT category headings (no more than 12 pages). Program documentation could include needs assessments, logic models, evaluation reports, brochures, websites, website analytics, IRB documents, marketing materials, journal articles, and other items that can be used by the committee to gain an understanding and appreciation of the program’s accomplishments.

Submittal Procedures:

  1. The Program Assessment Worksheet and program documentation (which together constitute the application) should be submitted to the respective Program Leader two months prior to the April and October regularly scheduled program leader meetings (held on the second Tuesday of the month). The review cycle follows this calendar (the key dates are also illustrated in the sketch following this list):

     • Second Tuesday of February or August: Application is given to the respective Program Leader.
     • One week after that date: The Program Leader decides whether or not the application will go forward for committee review. The Program Leader should believe, after review, that the program meets expectations for signature status before sending it forward.
     • Two weeks after that date: The Program Leader forwards the application to the review committee. The committee reviews the application and prepares for discussion and a decision at the next program leader meeting.
     • Second Tuesday of April or October: The committee meets, in person or via distance technology, to discuss the application and make its decision.

  2. The Program Leader submitting the application on behalf of the action team will take notes about both strengths and weaknesses according to the PAT criteria.

  3. At least two-thirds of the committee must accept the program as a UME signature program.

  4. If two-thirds of the committee is not in agreement about signature program status, the deficiencies and associated remedies are to be noted in writing by the Program Leader and returned to the Action Team with instructions to resubmit after the items have been addressed.

  5. Action team members are notified of the decision by the respective Program Leader.
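The calendar above anchors every step to the second Tuesday of the submission month. The following minimal Python sketch (illustrative only, not an official UME tool) shows how the key dates for one review cycle can be derived from that anchor:

    from datetime import date, timedelta

    def second_tuesday(year: int, month: int) -> date:
        """Return the second Tuesday of the given month."""
        first = date(year, month, 1)
        days_to_first_tuesday = (1 - first.weekday()) % 7  # Monday=0, Tuesday=1
        return first + timedelta(days=days_to_first_tuesday + 7)

    def review_cycle(year: int, cycle: str = "spring") -> dict:
        """Key dates for one signature-program review cycle, per the calendar above."""
        submit_month, meeting_month = (2, 4) if cycle == "spring" else (8, 10)
        submitted = second_tuesday(year, submit_month)
        return {
            "application to Program Leader": submitted,
            "Program Leader decision": submitted + timedelta(weeks=1),
            "forwarded to review committee": submitted + timedelta(weeks=2),
            "committee discussion and decision": second_tuesday(year, meeting_month),
        }

    if __name__ == "__main__":
        for step, when in review_cycle(2024, "spring").items():
            print(f"{step}: {when:%B %d, %Y}")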

Recognition Procedures:

  1. When the committee has made its decision, the members of the action team submitting the request will be notified by their respective program leader.
     
  2. A formal announcement will be sent from the Associate Dean/Director and respective program leader to the UME organization.
     
  3. A formal presentation to the program team members will be made at an appropriate AGNR and/or UME event, such as the AGNR Awards Luncheon, UME Conference, or UME Program Summit.
     
  4. The action/program team who developed the signature program will receive a program funding award for future work.* The funds will be transferred to the associated UME KFS account. Funds cannot be transferred to any account that exists outside of the UME financial system.

    *Amount is subject to change depending on funding availability. The award amount for 2022 will be $2,500. A maximum of two programs will be awarded each year.

Curriculum Development

Teaching Effectiveness

Teaching Effectiveness Overview (video, 18:49; January 24, 2023)

Teaching Effectiveness Survey

Teaching Effectiveness results are available in real time as clientele, peers, and administrators complete the online survey. To view your results, go to ume.az1.qualtrics.com and log in using CAS (Directory ID and Password). When you log in to Qualtrics for the first time, you are automatically self-enrolled. If you do not see the Teaching Effectiveness Dashboard in your list of projects, please contact Dee Dee Allen (dallen3@umd.edu) for assistance.

General Teaching Effectiveness

The General Teaching Effectiveness form is to be used when teaching a class to the general public, ages 14 and up. All efforts should be made to collect the data using the survey link. For audiences who have no access to technology, paper forms can be used; the educator may then enter the information directly into Qualtrics via the appropriate survey link.

Teaching effectiveness ratings are a required element of Annual Faculty Reviews (AFR) as well as the promotion and tenure process. UME Administration suggests collecting a minimum of 40 to 60 forms per year from 3 to 5 separate teaching events. Draw on the results to improve your teaching approach, in addition to meeting university-required collection processes.

Each program area has a paper-based form, an online survey URL, and supporting resources:

  • 4-H Teaching Effectiveness: 4-H form (pdf); online: go.umd.edu/4HTEACH; resources: 4-H QR Code Image (png), 4-H Postcard w/QR code & instructions (docx)
  • AgFS Teaching Effectiveness: AgFS form (pdf); online: go.umd.edu/AgFSTEACH; resources: AgFS QR Code Image (png), AgFS Postcard w/QR code & instructions (docx)
  • ENR Teaching Effectiveness: ENR form (pdf); online: go.umd.edu/ENRTEACH; resources: ENR QR Code Image (png), ENR Postcard w/QR code & instructions (docx)
  • FCS Teaching Effectiveness: FCS form (pdf); online: go.umd.edu/FCSTEACH; resources: FCS QR Code Image (png), FCS Postcard w/QR code & instructions (docx)
  • SNAP-ED Teaching Effectiveness: SNAP-ED form (pdf); online: go.umd.edu/SNAP-ED-TEACH; resources: SNAP-ED QR Code Image (png), SNAP-ED Postcard w/QR code & instructions (docx)

Administrative/Peer

Three (3) administrative/peer evaluations are needed each year, completed by three different reviewers and covering at least two (2) different classes.

The 2-sided form or online survey can be completed by administrators, colleagues, or equivalent peers in other agencies. Please be sure to print both sides when providing this form in hard copy to a reviewer. It is the responsibility of the faculty member to enter the data in Qualtrics.
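As a rough illustration of the annual requirement above (at least three evaluations, from three different reviewers, for at least two different classes), the following sketch checks a list of completed evaluations; the field names are hypothetical and not part of the Qualtrics survey.

    from typing import NamedTuple

    class PeerEvaluation(NamedTuple):
        reviewer: str      # administrator, colleague, or equivalent peer
        class_title: str   # class that was observed

    def meets_annual_requirement(evals: list[PeerEvaluation]) -> bool:
        """At least 3 evaluations, 3 different reviewers, and 2+ different classes."""
        return (
            len(evals) >= 3
            and len({e.reviewer for e in evals}) >= 3
            and len({e.class_title for e in evals}) >= 2
        )

    # Three reviewers across two different classes satisfies the requirement.
    evals = [
        PeerEvaluation("Reviewer A", "Nutrient Management Basics"),
        PeerEvaluation("Reviewer B", "Nutrient Management Basics"),
        PeerEvaluation("Reviewer C", "4-H Club Leadership"),
    ]
    print(meets_annual_requirement(evals))  # True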

Hard copy:
  • UME PEER: Download fillable (pdf)
  • SNAP-ED PEER: Download fillable (pdf)

Online survey:
  • SNAP-ED PEER: https://go.umd.edu/SNAP-ED-PEER
  • 4-H PEER: https://go.umd.edu/4H-PEER
  • FCS PEER: https://go.umd.edu/FCS-PEER
  • ENR PEER: https://go.umd.edu/ENR-PEER
  • AgFS PEER: https://go.umd.edu/AgFS-PEER

Do you need your name to be listed in more than one discipline? For example, you may teach both agriculture and 4-H classes, or agriculture and environmental and natural resources classes. If so, email dallen3@umd.edu and indicate the program areas to be included.

University of Maryland Extension Program Assessment Tool (PAT)

Program Assessment Tool

Programs are the foundation of Extension’s educational strategies. Yet, despite the literature and expertise that exist, Extension faculty and administrators often find it difficult to assess the development stage of a program. Extension organizations and educators often describe a program as “signature” without criteria for what that term means. It is also difficult to take an objective view of a program and decide whether further resources are warranted to move it from a good idea with limited applicability to a statewide effort that meets a critical public need or issue. A national environmental and literature scan of Extension resources did not produce a tool that established criteria for making informed program assessments. For these and other reasons, the UME Program Assessment Tool (PAT) was developed.

The PAT is based on two well-known and widely used educational tools: rubrics and logic models. Like a rubric, the PAT provides criteria that can be used to help make decisions or judgments about where a program stands in the development process. Like a logic model, the PAT can be read from left to right: starting with the emerging and developing stages on the left, where Extension efforts are more focused on outputs, and moving to the right, where the focus is on signature and evidence-based programs and outcomes.

Impact Teams will use this tool to decide which programs will be sent forward for peer review for signature status, and to determine which emerging and developing programs will be priorities for further investment. Some programs will also be reviewed for evidence-based status.

Definitions of Terms

Many terms in this tool could be interpreted in multiple ways. For purposes of use of the PAT, we’ve provided a short list of terms and our definitions.

  • Curriculum - A specific learning program with targeted learners, goals and objectives, learning activities and materials.
  • Educational Intervention - The programming done by Extension salaried and volunteer faculty and staff.
  • Evaluation Methods - The evaluation strategies that will be used to determine program outcomes.
  • Evaluation Use - What type of data will be collected and how it will be used.
  • Needs Assessment - “A systematic way … for identifying education and training problems, needs, issues, and the like” (Caffarella, 2002, p. 123).
  • Programs
    • Informational - A UME-branded program that delivers research-based information.
    • Developing - A UME-branded program in early stages of demonstrating its public value.
    • Signature - A UME-branded, research-based program known for its demonstrated public value.
    • Evidence-Based - A UME-branded program that can be replicated with similar outcomes, based on scientific measures of effect, and judged by external reviews to meet standardized assessments.
  • Program Scholarly Outputs - Products that document the educational intervention, including theory, findings, and effectiveness measures. Refereed review is the gold standard for judging the quality of educational interventions.
  • Research Base - The science of the curriculum content, delivery, and evaluation.

Instructions for Using PAT

A. Individuals and Impact or other teams should use the PAT under these conditions:

  1. When assessing a current program for the extent to which it meets the criteria and deciding
     a. what to do to strengthen the program so it remains in that category, or
     b. whether it is time to end the program or hand it off to a non-Extension entity.
  2. When determining what would need to be done to advance the program into the next category.

B. In some cases, stakeholders and partners should be included in completing the assessment. In other cases, an external review may be helpful.

C. ALL boxes need to be checked for a program to meet the requirements of its category.
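Conceptually, the PAT pairs each category with a checklist of criteria for each of the four stages (Informational, Developing, Signature, Evidence-Based), and item C above means a stage is met only when every box for that stage is checked. A minimal sketch of that idea, using abbreviated, hypothetical criterion wording rather than the full tool:

    from dataclasses import dataclass, field

    STAGES = ("Informational", "Developing", "Signature", "Evidence-Based")

    @dataclass
    class CategoryAssessment:
        """One PAT category assessed against the criteria for a chosen stage."""
        category: str                 # e.g. "Needs Assessment"
        stage: str                    # one of STAGES
        criteria: dict = field(default_factory=dict)  # criterion text -> checked?

        def meets_stage(self) -> bool:
            # Item C above: every box for the stage must be checked.
            return bool(self.criteria) and all(self.criteria.values())

    needs = CategoryAssessment(
        category="Needs Assessment",
        stage="Signature",
        criteria={
            "Represents a UME priority": True,
            "Sufficient evidence of impact": True,
            "Included in multiple IEPs": False,   # one box left unchecked
        },
    )
    print(needs.meets_stage())  # False: does not yet meet the Signature criteria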

We recognize that programs are constantly evolving and go through cycles, perhaps moving forward and backward among the four categories we have established. Programs need to change as the needs of the individuals and communities we serve change. Program evaluations often bring forth evidence that program changes are needed. This understanding is best described in the Cornell Office for Research on Evaluation (CORE)’s The Guide to the Systems Evaluation Protocol (2012):

“Each iteration of a program is related to the program’s history but is also shaped by decisions based on new information about how and how well the program works, and about what is needed by the target audiences or community; and by purely external factors like funding availability. The process of evolution involves learning, changing, and ultimately strengthening the larger system as a program is run, evaluated and revised and re-run over time” (p. 18).

University of Maryland Extension Program Assessment Tool

Each category below is assessed at four stages of program development: Informational, Developing, Signature, and Evidence-Based.

Needs Assessment: Fit with UME Mission (Program Design)

Informational:
▢ Represents an emerging public issue or need that could be addressed by UME.
▢ Based on some evidence of the issue and/or need.
▢ Included in at least one IEP.
▢ Not yet included in a TEP.
▢ Minimal or no specific UME funding or other resources dedicated to addressing the emerging issue or need through a formal UME program.

Developing:
▢ Represents a developing public issue or need that can be addressed by UME.
▢ Based on substantive evidence of the public issue or need AND the capacity of UME to make an impact.
▢ Included in multiple IEPs.
▢ Included in at least one TEP for development.
▢ Start-up UME funding or other resources committed to addressing the issue or need through a formal program.

Signature:
▢ Represents a priority of UME based on identified public issues and/or needs of the people of the state.
▢ Provides sufficient evidence of impact to justify commitment of resources to conduct the program.
▢ Defines the distinctiveness of UME from other organizations in addressing the public issue and/or particular need of the people of the state.
▢ Included in multiple IEPs across multiple disciplines.
▢ Identified as a signature program in at least one TEP.
▢ Adequate funding and other resources from UME and others to have an impact on the issue or need through a program that is known outside of UME among public decision-makers and the people of the state.

Evidence-Based:
▢ Represents an ongoing priority(ies) of UME based on identified public issues and needs of the people of the state.
▢ Provides sufficient evidence to justify commitment of resources needed to substantially address the issue or need over time.
▢ Documents the distinctiveness of UME from other organizations to address the public issue and/or particular needs of the people of the state or beyond.
▢ Included in multiple IEPs across multiple disciplines.
▢ Included as a signature program in at least one TEP.
▢ Adequate and sustained funding and other resources from UME and others, including states that replicate the program, to address the national issue or need and provide scientifically rigorous evidence of impact.
 

Educational Intervention: Meets Critical Clientele Needs (Program Development)

Informational:
▢ Exchange of information to answer questions and address concerns.
▢ Information is transferred to client for immediate use.
▢ Information is research-based.

Developing:
▢ Exchange of information is for immediate use and could lead to change over time in an individual’s knowledge, attitude, skills, and aspirations (KASA).
▢ Information and methods of teaching/learning are research and theory-based.
▢ Contact time with client is of a short-to-medium duration and may be face-to-face and/or through different types of media.
▢ May involve key partners or stakeholders.

Signature:
▢ Exchange of information leads to documented change in an individual’s knowledge, attitude, skills, and aspirations (KASA).
▢ Exchange of information is used to aid in the solution of a public issue or need of individuals, families, and communities.
▢ Information and methods of teaching/learning are research and theory-based.
▢ Contact time with client is of a medium-to-long duration and uses multiple methods of contact, including face-to-face and different types of media.
▢ Involves key partners and stakeholders.

Evidence-Based:
▢ Exchange of information leads to scientifically rigorous, documented change in an individual’s knowledge, attitude, skills, and aspirations (KASA) over time.
▢ Exchange of information is used to aid in the solution of a public issue or need of individuals, families, and communities.
▢ Information and methods of teaching/learning are research and theory-based.
▢ Contact time with client is of a medium-to-long duration and uses multiple methods of contact, including face-to-face and different types of media.
▢ Involves key partners and stakeholders.
▢ Uses program strategies that have been scientifically tested and proven successful for public issues and needs of people.
 

Curriculum:

Informational:
▢ No curriculum.

Developing:
▢ Program curriculum under development is tested based on the UME Extension Curriculum Assessment Tool (CAT) and, when appropriate, the Materials Assessment Tool (MAT).
▢ Program curriculum changes have been made based on the UME Extension CAT and, when appropriate, the MAT.
▢ Curriculum has been pilot-tested using appropriate testing methods.
▢ If curriculum is adapted from another source, it is subjected to the CAT and, if appropriate, the MAT, and pilot-tested for appropriateness in the state and modified as needed.

Signature:
▢ Program curriculum developed using the UME Curricula Assessment Tool (CAT) review guidelines.
▢ Program curriculum adapted from another state has been peer-reviewed using the UME Extension CAT and, when appropriate, the MAT, and modified to meet Maryland needs.
▢ Curriculum has been both internally and externally peer-reviewed.
▢ Curriculum has been published with a UME signature-program endorsement.
▢ Curriculum is available for other states to use and adapt.

Evidence-Based:
▢ Program curriculum developed using the UME Curricula Assessment Tool (CAT) review guidelines.
▢ Program curriculum adapted from another state has been peer-reviewed using the UME CAT and, when appropriate, the MAT.
▢ Curriculum produces evidence-based results.
 

Research Base: Research & Scholarship (Program Development & Delivery)

Informational:
▢ Uses research-based information.

Developing:
▢ Theory and research-based information is explicitly explained and incorporated into the development of the program.

Signature:
▢ Theory and research-based information are used to explain impact measures and outcomes.
▢ Provides information that can be used to build additional intervention strategies and research questions.

Evidence-Based:
▢ Theory, research-based information, and empirical evidence are explicitly integrated in the explanation of program intervention impacts on intended outcomes.
▢ Program research results provide evidence to build additional theoretical models.
▢ Program research results provide evidence that allows further research study funds to be generated.
 

Program Scholarly Outputs:

Informational:
▢ Program activities cited in CVs and annual faculty reports for merit review.

Developing:
▢ Program activities cited in CVs and annual faculty reports for merit review.
▢ Conference and professional association posters.
▢ Conference and professional association workshops and presentations based on preliminary data.
▢ Contributions to eXtension Communities of Practice (COP).
▢ UME peer-reviewed Extension Briefs and/or Factsheets.

Signature:
▢ Program scholarship findings cited in CV and annual faculty reports for merit reviews.
▢ Program scholarship findings used in promotion and tenure packages for decisions about Senior or Principal Agent advancement and for merit reviews.
▢ Program results presentations at professional association meetings, workshops, panels, and other types of delivery methods, both refereed and non-refereed.
▢ Invited presentations and articles about program results.
▢ Contributions to eXtension Communities of Practice (COP).
▢ Refereed articles in subject-based journals.
▢ UME peer-reviewed Extension Briefs, Factsheets, Bulletins, Manuals, and Curricula.

Evidence-Based:
▢ Program scholarship findings cited in CV and annual faculty reports for merit reviews.
▢ Program scholarship findings used in promotion and tenure packages for decisions about Senior or Principal Agent advancement and for merit reviews.
▢ Invited presentations and articles about program results from other states, regions, and countries.
▢ Evaluation results add to a national evidence-based database.
▢ Primary authorships in eXtension Communities of Practice (COP).
▢ Journal editorial board memberships.
▢ Refereed articles in highly acclaimed journals.
▢ UME peer-reviewed Extension Briefs, Factsheets, Bulletins, Manuals, and Curricula.
▢ Books or book chapters.
 

Evaluation Use: Program Evaluation

Informational:
▢ Data collected and evaluated to determine participant knowledge gain and satisfaction level with the interaction experience.
▢ Evaluation results are used to communicate the reach of the Educator’s work.

Developing:
▢ Data collected and evaluated to determine participants’ short-term KASA outcomes and clientele satisfaction level with the interaction experience.
▢ Evaluation results used to determine program effectiveness and to communicate the effectiveness of the Educator’s work to meet clientele needs.

Signature:
▢ Data collected and evaluated to determine medium-term outcomes achieved that benefit clientele and/or the community.
▢ Evaluation results used to communicate UME’s value in addressing societal, economic, and environmental needs.
▢ Evaluation results used to communicate the effectiveness of the Educator’s work to meet clientele needs in Maryland.

Evidence-Based:
▢ Data collected and evaluated to determine long-term outcomes achieved that benefit clientele.
▢ Evaluation results used to communicate UME’s impact on compelling societal, economic, and environmental issues in Maryland.
▢ Evaluation results used to communicate state and national impacts on compelling societal, economic, and environmental issues.
 

Evaluation Methods:

Informational:
▢ End-of-session instruments used to determine client satisfaction.
▢ No IRB approval required if client satisfaction results will not be published.

Developing:
▢ Basic logic model developed.
▢ End-of-session instruments used for program improvement.
▢ Paired or unmatched pretest and posttest assessments for KASA changes.
▢ Qualitative methods incorporated where appropriate (structured observations, interviews).
▢ IRB approved.

Signature:
▢ Logic model is fully developed.
▢ End-of-session instruments used for program improvement.
▢ Paired or unmatched pretests and posttests for assessment of KASA changes.
▢ Qualitative methods incorporated where appropriate (structured observations, interviews).
▢ Follow-up survey research used to assess medium-term outcomes.
▢ Control and comparison groups used where appropriate.
▢ Findings are used to improve programs.
▢ Findings are peer-reviewed and published when appropriate.
▢ IRB approved.

Evidence-Based:
▢ Logic model is fully developed and tested for utility over time.
▢ Results of evaluations have been subject to critical peer review.
▢ Empirical evidence exists about program effectiveness.
▢ Program results grounded in rigorous evaluations using experimental or quasi-experimental studies with randomized control groups.
▢ Program can be replicated by other states with confidence in program effectiveness.
▢ Findings are published in peer-reviewed journals and other publications.
▢ IRB approved.
 
Adoption & Replication (Program Dissemination)

Informational:
▢ Potential for adoption and replication unknown.

Developing:
▢ Has potential to become a program that can be replicated by Extension or others in the state.

Signature:
▢ Recognized by respected agencies and organizations as an effective program.
▢ Adopted by other organizations or Extension services.

Evidence-Based:
▢ Program is promoted and adopted nationally as an empirically tested intervention with identified short-, medium-, and long-term outcomes.
▢ Program materials (curriculum, protocols, evaluation instruments) exist that make adoption and replication possible.

Marketing & Communication (Program Dissemination)

Informational:
▢ No formal marketing plan, but program is advertised at the local level through flyers, newspaper articles, newsletters, or word-of-mouth.

Developing:
▢ No formal marketing plan, but advertising has extended beyond the local community.

Signature:
▢ Formal marketing plan in place and evaluated for effectiveness.

Evidence-Based:
▢ Effective components of a formal marketing plan are used.

Public Value (Program Dissemination)

Informational:
▢ Program value is evident to the individual participants using information.

Developing:
▢ Program value is evident to the individual participants using information and participating in the program.

Signature:
▢ Program’s value is evident to individuals, families, and the community-at-large.

Evidence-Based:
▢ Program’s value is evident to individuals, families, and the community-at-large.
▢ Program’s public value is determined by people or agencies outside of UME, using this assessment tool or a standardized tool and/or process used by an agency for judging value.

Sustainability (Organizational Commitment)

Informational:
▢ Minimum resources are required to initiate elements of a program.
▢ Internal resources used to launch the program.

Developing:
▢ Short-term resources committed from Impact Teams to assist the program in developing into a signature program.
▢ Short-term external funding secured to assist in developing the program.
▢ Potential partners identified.

Signature:
▢ Medium-term resources committed to supporting the program from the UME budget pending evidence of potential for impact.
▢ External funders may be involved in ongoing support of the program.
▢ Partners involved in program when appropriate.

Evidence-Based:
▢ Long-term funding in the UME budget due to evidence of impact.
▢ External, long-term funding or partners secured to maintain programming.
▢ National partners involved in program when appropriate.

References:

  • Boyle, P. (1981). Planning better programs. New York: McGraw-Hill.
  • Caffarella, R. S. (2002). Planning programs for adult learners. San Francisco: Jossey-Bass.
  • Cornell Office for Research on Evaluation. (2012). The guide to the systems evaluation protocol. Ithaca, NY: Cornell Digital Print Services. Available from https://core.human.cornell.edu/research/systems/protocol/index.cfm
     

Acknowledgements:

The UME Program Assessment Tool (PAT) was developed by Teresa McCoy, Assistant Director, Evaluation & Assessment, and Dr. Bonnie Braun, Professor and Extension Family Policy Specialist, with the assistance of Nicole Finkbeiner, M.S., Graduate Research Assistant. This tool is based, in part, on the Curriculum Assessment Tool (CAT) and the Materials Assessment Tool (MAT) created by Bonnie Braun and Nicole Finkbeiner, November 2012. All three assessment tools are contained in the Extension Education Theoretical Framework Manual, to be published in 2013 by the University of Maryland Extension.

The PAT was reviewed as part of a formative evaluation by the following members of the UME Health Smart Team: Karen Aspinwall, Virginia Brown, Nancy Lewis and Elizabeth Maring.

The PAT was also reviewed by the UME program leadership team of Dr. Patsy Ezell, Assistant Director, Family & Consumer Sciences; Dr. Jeff Howard, Assistant Director, 4-H Youth Development; Dr. Andy Lazur, Assistant Director, Agriculture & Natural Resources; Dr. Doug Lipton, Director, Maryland Sea Grant Program; and Tom Miller, Assistant Director of Operations. The need for this tool was identified during the leadership of Dr. Nick Place, Associate Dean/Associate Director of UME, now the Dean and Director, University of Florida Institute for Food and Agricultural Sciences.

Program Assessment Worksheet

For each category below, review the Signature-level criteria and indicate whether the program meets them (Yes, Marginal, or No), adding comments as needed.

Needs Assessment: Fit with UME Mission (Program Design)

▢ Represents a priority of UME based on identified public issues and/or needs of the people of the state.
▢ Provides sufficient evidence of impact to justify commitment of resources to conduct the program.
▢ Defines the distinctiveness of UME from other organizations in addressing the public issue and/or particular need of the people of the state.
▢ Included in multiple IEPs across multiple disciplines.
▢ Adequate funding and other resources from UME and others to have an impact on the issue or need through a program that is known outside of UME among public decision-makers and the people of the state.

Meets Criteria: ▢ Yes ▢ Marginal ▢ No

Educational Intervention: Meets Critical Clientele Needs (Program Development)

▢ Exchange of information leads to documented change in an individual’s knowledge, attitude, skills, and aspirations (KASA).
▢ Exchange of information is used to aid in the solution of a public issue or need of individuals, families, and communities.
▢ Information and methods of teaching/learning are research and theory-based.
▢ Contact time with client is of a medium-to-long duration and uses multiple methods of contact, including face-to-face and different types of media.
▢ Involves key partners and stakeholders.

Meets Criteria: ▢ Yes ▢ Marginal ▢ No

Curriculum:

▢ Program curriculum developed using the UME Curricula Assessment Tool (CAT) review guidelines.
▢ Program curriculum adapted from another state has been peer-reviewed using the UME Extension CAT and, when appropriate, the MAT, and modified to meet Maryland needs.
▢ Curriculum has been both internally and externally peer-reviewed.
▢ Curriculum has been published with a UME signature-program endorsement.
▢ Curriculum is available for other states to use and adapt.

Meets Criteria: ▢ Yes ▢ Marginal ▢ No

Research Base: Research & Scholarship (Program Development & Delivery)

▢ Theory and research-based information are used to explain impact measures and outcomes.
▢ Provides information that can be used to build additional intervention strategies and research questions.

Meets Criteria: ▢ Yes ▢ Marginal ▢ No

Program Scholarly Outputs:

▢ Program scholarship findings cited in CV and annual faculty reports for merit reviews.
▢ Program scholarship findings used in promotion and tenure packages for decisions about Senior or Principal Agent advancement and for merit reviews.
▢ Program results presentations at professional association meetings, workshops, panels, and other types of delivery methods, both refereed and non-refereed.
▢ Invited presentations and articles about program results.
▢ Contributions to eXtension Communities of Practice (COP).
▢ Refereed articles in subject-based journals.
▢ UME peer-reviewed Extension Briefs, Factsheets, Bulletins, Manuals, and Curricula.

Meets Criteria: ▢ Yes ▢ Marginal ▢ No

Evaluation Use: Program Evaluation

▢ Data collected and evaluated to determine medium-term outcomes achieved that benefit clientele and/or the community.
▢ Evaluation results used to communicate UME’s value in addressing societal, economic, and environmental needs.
▢ Evaluation results used to communicate the effectiveness of the Educator’s work to meet clientele needs in Maryland.

Meets Criteria: ▢ Yes ▢ Marginal ▢ No

Evaluation Methods:

▢ Logic model is fully developed.
▢ End-of-session instruments used for program improvement.
▢ Paired or unmatched pretests and posttests for assessment of KASA changes.
▢ Qualitative methods incorporated where appropriate (structured observations, interviews).
▢ Follow-up survey research used to assess medium-term outcomes.
▢ Control and comparison groups used where appropriate.
▢ Findings are used to improve programs.
▢ Findings are peer-reviewed and published when appropriate.
▢ IRB approved.

Meets Criteria: ▢ Yes ▢ Marginal ▢ No

Adoption & Replication (Program Dissemination)

▢ Recognized by respected agencies and organizations as an effective program.
▢ Adopted by other organizations or Extension services.

Meets Criteria: ▢ Yes ▢ Marginal ▢ No

Marketing & Communication (Program Dissemination)

▢ Formal marketing plan in place and evaluated for effectiveness.

Meets Criteria: ▢ Yes ▢ Marginal ▢ No

Public Value (Program Dissemination)

▢ Program’s value is evident to individuals, families, and the community-at-large.

Meets Criteria: ▢ Yes ▢ Marginal ▢ No

Sustainability (Organizational Commitment)

▢ Medium-term resources committed to supporting the program from the UME budget pending evidence of potential for impact.
▢ External funders may be involved in ongoing support of the program.
▢ Partners involved in program when appropriate.

Meets Criteria: ▢ Yes ▢ Marginal ▢ No
 



For further information, contact Dr. Debasmita Patra, Program Director, Evaluation and Assessment, University of Maryland Extension, 0322B Symons Hall, College Park, MD, 20742, 301-405-0929, dpatra@umd.edu.