The sites included in this document provide tools, aids, tip sheets, and other resources to support the planning, management, implementation, and analysis of evaluations and evaluation data.
Bureau of Justice Assistance, Center for Program Evaluation and Performance Measurement. Useful Links.
This site contains links to other web-based resources arranged by category including, for example: assessment, measurement, and instruments; data and statistics; journals and on-line publications; on-line how-to documents; state agency contacts; and university-affiliated research/evaluation centers. http://www.ojp.usdoj.gov/BJA/evaluation/links/index.htm 
Fetterman, David (Ed.). Collaborative, Participatory & Empowerment (CP&E) Evaluation.
This is the webpage of the CP&E Evaluation Topical Interest Group of the American Evaluation Association. It links to a wide range of information and tools, including books and publications, guides, tech tools, and videos. http://www.davidfetterman.com/empowermentevaluation.htm 
The Harvard Family Research Project.
This website provides links to a wide range of resources to help users develop and evaluate strategies to promote the well-being of children, youth, families, and their communities. HFRP produces a free quarterly periodical, The Evaluation Exchange, that contains lessons and emerging strategies for evaluating programs and policies focused on children, families, and communities. Articles are written by prominent evaluators and practitioners in the field and address current issues facing evaluators at all levels. Each issue includes take-away ideas designed to help users in their current work. http://www.hfrp.org/ 
Innovation Network. Practical Tools for Planning, Evaluation, and Action.
Innovation Network is a nonprofit evaluation, research, and consulting firm. It provides a searchable database of resources for evaluation and capacity building that contains over 300 reports, articles, tools, and tip sheets, including resources related to evaluation planning, data collection, and analysis. It also provides access to a step-by-step Logic Model Builder for articulating and connecting goals, resources, activities, outputs, and outcomes. The companion Evaluation Plan Builder transfers key data from the Logic Model Builder and guides users through identifying evaluation questions, indicators, and data collection strategies for evaluating implementation and outcomes. http://www.innonet.org/index.php?section_id=4&content_id=16 
Miller, Delbert and Salkind, Neil. 2002. Handbook of Research Design and Social Measurement (6th Ed.). Thousand Oaks, CA: Sage Publications, Inc.
This handbook provides procedures and guidance for three major types of research: basic, applied, and evaluation. It addresses topics such as research design, qualitative research, data collection, statistical analysis, and scales and indexes, and includes a guide to federal and private funding and to the publication of research reports. Extensive bibliographies accompany each major section of the handbook.
Penn State Cooperative Extension. Program Evaluation.
This site provides information on how to design and implement a program evaluation to improve a program, compare delivery methods, respond to stakeholders, advocate, or prepare for promotion. Included are links to almost 100 tip sheets on a variety of topics including question wording and ordering, types and sources of data and information, and program evaluation techniques. Links also are provided to a series of webinars/PowerPoint presentations on Evaluation for Statewide Programs. The topics include: what to evaluate, data collection methods, creating questions and items for measurement, paper surveys, and analysis and use of results. http://extension.psu.edu/evaluation 
SRI International. Online Evaluation Resource Library (OERL).
The OERL is supported by the Division of Research, Evaluation and Communication of the National Science Foundation and was developed for professionals seeking to design, conduct, document, or review project evaluations. The Library provides a large collection of plans, reports, and instruments from past and current evaluations that have proven to be sound and representative of current evaluation practices; guidelines for improving evaluation design and practice; and a discussion forum for stimulating ongoing dialogue in the evaluation community. Although the materials pertain primarily to NSF projects, the Library is also intended to be useful to evaluators outside the NSF community. http://oerl.sri.com 
StatSoft, Inc. 2011. Electronic Statistics Textbook. Tulsa, OK: StatSoft.
This on-line text begins with an overview of relevant elementary concepts and continues with a more in-depth discussion of specific areas of statistics. Topics are organized into “modules” and accessible through buttons representing classes of analytic techniques. A glossary of statistical terms and a list of references for further study are included. http://www.statsoft.com/Textbook 
Trochim, Bill. Web Center for Social Research Methods.
This website is designed for people engaged in applied social research and evaluation. The Knowledge Base is an online hypertext textbook that covers the entire research process, including formulating research questions, sampling, measurement, research design, data analysis, and reporting. Also included is a link to Selecting Statistics, an interactive on-line statistical advisor: the user answers a series of questions about the characteristics of the data and the intent of the analysis, and the tool suggests an appropriate statistical approach. http://www.socialresearchmethods.net/ 
University of Kansas. Community Tool Box.
The Tool Box is a service of the Work Group for Community Health and Development at the University of Kansas. It offers more than 7,000 pages of practical, step-by-step skill-building guidance for building healthy communities. The 46 chapters include links to nearly 300 sections, including basic methods and tools for effective program evaluation. Each section typically includes a description of the task, the advantages of performing it, step-by-step guidelines, examples, checklists of points to review, and, where available, training materials and summary slides. http://ctb.ku.edu/en 
University of Wisconsin-Extension, Cooperative Extension. (2008). Building Capacity in Evaluating Outcomes: A Teaching and Facilitating Resource for Community-Based Programs and Organizations. Madison, WI: UW-Extension, Program Development and Evaluation.
Building Capacity provides 93 activities and materials for practitioners working in and with community-based programs to use in building the capacity of individuals, groups, and organizations in evaluating outcomes. Included are eight units that cover core evaluation topics: getting ready; planning; engaging stakeholders; focusing the evaluation; collecting data; analyzing data; using data; and managing an evaluation. Each unit contains hands-on activities, handouts and a slide presentation. A Facilitator’s Guide also is provided. http://www.uwex.edu/ces/pdande/evaluation/bceo/pdf/bceoresource.pdf 
University of Wisconsin-Extension, Program Development and Evaluation (PD&E).
The evaluation page of the PD&E site includes many valuable resources. Two key resources are the Planning a Program Evaluation worksheet, which can help evaluators identify stakeholders, the type of evaluation needed, the information needed, methodology, interpretation, and communication of evaluation results; and the Enhancing Program Performance with Logic Models on-line course, which focuses on what a logic model is and how to use one for planning, implementing, evaluating, or communicating about a program. Building Capacity in Evaluating Outcomes (discussed in a separate entry above) is an extensive document that covers the core topics of evaluation. Also provided are links to over 40 Quick Tips, one- or two-page documents that address topics pertaining to planning an evaluation, data collection, analyzing and interpreting information, communicating results, improving evaluation quality, and retrospective post-then-pre designs. http://www.uwex.edu/ces/pdande/evaluation/index.html 
Western Michigan University, The Evaluation Center.
The Evaluation Center was founded by Daniel Stufflebeam at The Ohio State University in 1963 and moved to Western Michigan University in 1973. The Center’s webpage contains resources intended to further its mission to advance the theory, practice, and utilization of evaluation. Included are links to publications, presentations, and video lectures/presentations on a wide range of evaluation topics. Of particular interest are the Evaluation Checklists for designing, budgeting, contracting, staffing, managing, and assessing evaluations of programs; collecting, analyzing, and reporting evaluation information; and determining merit, worth, and significance. Each checklist is a distillation of valuable lessons learned from practice. http://www.wmich.edu/evalctr/ 
Developed under the Substance Abuse and Mental Health Services Administration’s (SAMHSA) Center for the Application of Prevention Technologies contract. Reference #HHSS277200800004C. For training and/or technical assistance purposes only.