Friday 2 January 2009

Evaluation

http://www.usnews.com/articles/business/best-careers/2008/12/11/11-best-kept-secret-careers.html
...

The 11 Best-Kept-Secret Careers are:


...
*
http://www.usnews.com/articles/business/best-careers/2008/12/11/best-careers-2009.html

Best Careers 2009

What's new in 2009 ... and some advice on picking a career

Posted December 11, 2008

...

So how did we select the Best Careers of 2009?
We scored hundreds of careers on five criteria:

  1. Job outlook, which took into consideration the above three factors
  2. Average job satisfaction
  3. Difficulty of the required training
  4. Prestige
  5. Pay

...
In the Ahead-of-the-Curve Careers section, we describe 13 careers with a bright future that are too narrow or still too small to be considered a Best Career. Examples: data miner, wellness coach, and, new this year, solar energy system technician.
...
The Overrated Careers section profiles ... teacher, and chef. ... we added farmer because growing numbers of people have romanticized visions of what it's like to run a small organic farm. For each overrated career, we suggest an alternative. For example, instead of being an attorney, consider being a mediator.
...
A Bit of Career Advice
...
the Occupational Outlook Handbook consists of authoritative, if dry, few-page descriptions of more than 250 popular careers.
...

When you've narrowed down to one or two candidate careers, visit a few practitioners at their workplace. Check out the feel of the place. Could you see yourself happy there? Also, ask probing questions like: "Would you walk me through your career from the moment you chose it up to today? What's good and bad about the career that might not appear in print? In the end, what's key to being good at this career? Why do people leave this career?" For a final check on your No. 1 candidate career, volunteer to work alongside someone in this career for at least a week.

If you're still excited about that career, you've probably found one in which you'll be happy, be successful, and make a contribution to society. Congratulations!

*

http://www.usnews.com/articles/business/best-careers/2008/12/11/best-kept-secret-career-program-evaluator.html

Best-Kept-Secret Career:

Program Evaluator

Posted December 11, 2008

...

Learn more: Basic Guide to Program Evaluation

*

http://www.managementhelp.org/evaluatn/fnl_eval.htm

...

Key Considerations:

Consider the following key questions when designing a program evaluation.
1. For what purposes is the evaluation being done, i.e., what do you want to be able to decide as a result of the evaluation?

2. Who are the audiences for the information from the evaluation, e.g., bankers, funders, board, management, staff, customers, clients, etc.?

3. What kinds of information are needed to make the decision you need to make and/or enlighten your intended audiences, e.g., information to really understand the process of the product or program (its inputs, activities, and outputs), the customers or clients who experience the product or program, strengths and weaknesses of the product or program, benefits to customers or clients (outcomes), how the product or program failed and why, etc.

4. From what sources should the information be collected, e.g., employees, customers, clients, groups of customers or clients and employees together, program documentation, etc.

5. How can that information be collected in a reasonable fashion, e.g., questionnaires, interviews, examining documentation, observing customers or employees, conducting focus groups among customers or employees, etc.

6. When is the information needed (so, by when must it be collected)?

7. What resources are available to collect the information?
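
The seven questions above are essentially a design checklist: an evaluation design is complete when each question has an answer. Purely as an illustration (the guide prescribes no software; all names here are invented), that checklist could be tracked like this:

```python
from dataclasses import dataclass, fields

@dataclass
class EvaluationDesign:
    """One field per key question above; field names are invented."""
    purpose: str = ""              # 1. what you want to be able to decide
    audiences: str = ""            # 2. who will use the information
    information_needed: str = ""   # 3. what information supports the decision
    sources: str = ""              # 4. where the information comes from
    collection_methods: str = ""   # 5. questionnaires, interviews, etc.
    deadline: str = ""             # 6. when the information is needed
    resources: str = ""            # 7. what's available to collect it

def unanswered(design: EvaluationDesign) -> list:
    """Return the names of the key questions not yet answered."""
    return [f.name for f in fields(design) if not getattr(design, f.name)]

d = EvaluationDesign(purpose="Decide whether to expand the tutoring program",
                     audiences="board, funders")
print(unanswered(d))
```

The point is only that the questions are a checkable whole: skipping any one of them leaves a visible gap in the design.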

...

Some Major Types of Program Evaluation

...

Goals-Based Evaluation

...

Process-Based Evaluations

Process-based evaluations are geared to fully understanding how a program works -- how it produces the results that it does. These evaluations are useful when programs are long-standing and have changed over the years, when employees or customers report a large number of complaints about the program, or when there appear to be large inefficiencies in delivering program services. They are also useful for accurately portraying to outside parties how a program truly operates (e.g., for replication elsewhere).

There are numerous questions that might be addressed in a process evaluation. These questions can be selected by carefully considering what is important to know about the program. Examples of questions to ask yourself when designing an evaluation to understand and/or closely examine the processes in your programs are:
1. On what basis do employees and/or the customers decide that products or services are needed?
2. What is required of employees in order to deliver the product or services?
3. How are employees trained about how to deliver the product or services?
4. How do customers or clients come into the program?
5. What is required of customers or clients?
6. How do employees select which products or services will be provided to the customer or client?
7. What is the general process that customers or clients go through with the product or program?
8. What do customers or clients consider to be strengths of the program?
9. What do staff consider to be strengths of the product or program?
10. What typical complaints are heard from employees and/or customers?
11. What do employees and/or customers recommend to improve the product or program?
12. On what basis do employees and/or customers decide that the product or services are no longer needed?

Outcomes-Based Evaluation

...

Four Levels of Evaluation:

There are four levels of evaluation information that can be gathered from clients, including getting their:
1. reactions and feelings (feelings are often poor indicators that your service made a lasting impact)
2. learning (enhanced attitudes, perceptions or knowledge)
3. changes in skills (applied the learning to enhance behaviors)
4. effectiveness (improved performance because of enhanced behaviors)

Usually, the farther down the list your evaluation information gets, the more useful your evaluation is. Unfortunately, it is quite difficult to reliably get information about effectiveness. Still, information about learning and skills is quite useful.
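
These four levels form an ordered scale: information from a deeper level is usually more useful. A minimal, purely illustrative sketch of that ordering (the names are invented, not from the guide):

```python
from enum import IntEnum

class EvalLevel(IntEnum):
    """The four levels above, ordered so a higher value is the
    deeper (usually more useful) kind of evaluation information."""
    REACTIONS = 1      # reactions and feelings
    LEARNING = 2       # enhanced attitudes, perceptions, knowledge
    SKILLS = 3         # learning applied to change behaviors
    EFFECTIVENESS = 4  # improved performance from changed behaviors

def deepest(levels):
    """Given the levels of information actually collected, report the
    deepest one -- the best the evaluation can claim to show."""
    return max(levels)

collected = [EvalLevel.REACTIONS, EvalLevel.LEARNING, EvalLevel.SKILLS]
print(deepest(collected).name)
```

An evaluation that only gathered reactions tops out at level 1, which is why the guide cautions that feelings alone are a poor indicator of lasting impact.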

...

Basic analysis of "qualitative" information (respondents' verbal answers in interviews, focus groups, or written commentary on questionnaires):
1. Read through all the data.
2. Organize comments into similar categories, e.g., concerns, suggestions, strengths, weaknesses, similar experiences, program inputs, recommendations, outputs, outcome indicators, etc.
3. Label the categories or themes, e.g., concerns, suggestions, etc.
4. Attempt to identify patterns, or associations and causal relationships, in the themes, e.g., all people who attended programs in the evening had similar concerns, most people came from the same geographic area, most people were in the same salary range, respondents experienced similar processes or events during the program, etc.
5. Keep all commentary for several years after completion in case it is needed for future reference.
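
The steps above amount to a simple coding pass over free-text comments: bucket each comment into labeled themes, then look for associations across a respondent attribute. As an illustration only (the guide prescribes no tooling; the theme keywords and sample comments below are invented):

```python
from collections import defaultdict

# Hypothetical keyword lists for steps 2-3 (organize and label categories).
THEMES = {
    "concerns": ["worried", "concern", "unclear"],
    "suggestions": ["should", "suggest"],
    "strengths": ["helpful", "great", "liked"],
}

def code_comments(comments):
    """Assign each comment to every theme whose keywords it mentions."""
    coded = defaultdict(list)
    for comment in comments:
        text = comment["text"].lower()
        for theme, keywords in THEMES.items():
            if any(k in text for k in keywords):
                coded[theme].append(comment)
    return coded

def pattern_by(coded, theme, attribute):
    """Step 4: look for associations, e.g. do evening attendees
    share the same concerns?"""
    counts = defaultdict(int)
    for comment in coded.get(theme, []):
        counts[comment[attribute]] += 1
    return dict(counts)

comments = [
    {"text": "I was worried the schedule was unclear", "session": "evening"},
    {"text": "The staff were helpful and great", "session": "daytime"},
    {"text": "You should offer more evening sessions", "session": "evening"},
    {"text": "Parking is a concern", "session": "evening"},
]

coded = code_comments(comments)
print(pattern_by(coded, "concerns", "session"))
```

In practice the categories come from reading the data itself (step 1), not from a fixed keyword list; the sketch only shows the mechanical shape of steps 2 through 4.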

...

Contents of an Evaluation Report -- Example

An example of evaluation report contents is included later in this document, in the section Contents of an Evaluation Plan, but don't forget to look at the next section, "Who Should Carry Out the Evaluation".

...

Contents of an Evaluation Plan

Develop an evaluation plan to ensure your program evaluations are carried out efficiently in the future. Note that bankers or funders may want or benefit from a copy of this plan.

Ensure your evaluation plan is documented so you can regularly and efficiently carry out your evaluation activities. Record enough information in the plan so that someone outside of the organization can understand what you're evaluating and how. Consider the following format for your report:
1. Title Page (name of the organization that is being, or has a product/service/program that is being, evaluated; date)
2. Table of Contents
3. Executive Summary (one-page, concise overview of findings and recommendations)
4. Purpose of the Report (what type of evaluation(s) was conducted, what decisions are being aided by the findings of the evaluation, who is making the decision, etc.)
5. Background About Organization and Product/Service/Program that is being evaluated
a) Organization Description/History
b) Product/Service/Program Description (that is being evaluated)
i) Problem Statement (in the case of nonprofits, description of the community need that is being met by the product/service/program)
ii) Overall Goal(s) of Product/Service/Program
iii) Outcomes (or client/customer impacts) and Performance Measures (that can be measured as indicators toward the outcomes)
iv) Activities/Technologies of the Product/Service/Program (general description of how the product/service/program is developed and delivered)
v) Staffing (description of the number of personnel and roles in the organization that are relevant to developing and delivering the product/service/program)
6. Overall Evaluation Goals (e.g., what questions are being answered by the evaluation)
7. Methodology
a) Types of data/information that were collected
b) How data/information were collected (what instruments were used, etc.)
c) How data/information were analyzed
d) Limitations of the evaluation (e.g., cautions about findings/conclusions and how to use the findings/conclusions, etc.)
8. Interpretations and Conclusions (from analysis of the data/information)
9. Recommendations (regarding the decisions that must be made about the product/service/program)
Appendices: content of the appendices depends on the goals of the evaluation report, e.g.:
a) Instruments used to collect data/information
b) Data, e.g., in tabular format, etc.
c) Testimonials, comments made by users of the product/service/program
d) Case studies of users of the product/service/program
e) Any related literature

...