Wednesday, November 11, 2009

Feedback on the survey

Of the six people I contacted to provide feedback, four have responded - lucky me :). I will go through the feedback I received and comment, section by section, on whether or not I changed the survey, how, and why.

Page 1 - demographics
Feedback indicated confusion because the groupings had overlapping numbers (age 16-18 or 18-25, for example). I thought I had watched for this, but obviously not - easy fix :)
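As an aside, the fix is easy to sanity-check in code. A minimal sketch (the bucket boundaries below are hypothetical, not the survey's actual groupings) showing how inclusive, non-overlapping bounds put each age in exactly one group:

```python
# Hypothetical non-overlapping age groupings for the demographics page;
# the survey's actual buckets may differ.
AGE_GROUPS = [(16, 18), (19, 25), (26, 35), (36, 45), (46, 55), (56, 120)]

def age_group(age: int) -> str:
    """Return the label of the single group an age falls into."""
    for low, high in AGE_GROUPS:
        if low <= age <= high:
            return f"{low}-{high}"
    return "under 16"

# With inclusive, non-overlapping bounds, an 18-year-old lands in
# exactly one bucket instead of two.
```

With the original overlapping groupings (16-18, 18-25), an 18-year-old could honestly tick either box; with these bounds the answer is unambiguous.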

I accepted the suggestion to change question 7 to, "What branch do you live in?" I had deliberately phrased it as "belong to" because I wanted participants to identify the branch where they participate (which sometimes isn't connected to where they live), but I then realized that the information I was looking for will be captured in another section, where they indicate their level of activity in their "local branch". Changing this question clarifies it and will give richer information when considered alongside the level-of-activity questions.

While none of the test subjects :) commented, I am going to change question 5. The information I am trying to gather is whether or not participants in the society tend to have memberships, and why. Changing question 5 to, "Do you let your membership lapse?" will provide much more meaningful information, especially if they address it in the open-ended question that immediately follows.

Page 2 - activities
I was expecting to have a few additional activities suggested. I was surprised there were no comments.

Page 3 - frequency of activity
One test subject suggested that using the term "weekly" in the first sentence may be confusing if the local branch doesn't have weekly activities, and suggested simply dropping it. I have decided to leave it in because my understanding is that every branch has some kind of weekly activity (it isn't always the same activity, though), and because I am trying to capture whether or not the survey respondents are participating on a weekly basis, rather than trying to determine if they are participating with their local group. What I will do to clarify is change the answers to 'weekly, monthly, quarterly, and annually' and then add a couple more questions asking about the frequency of participation in other groups. I think that is more likely to provide the information I was hoping for - and though it does make the survey a little longer, I don't think the difference will be significant.

There were no comments from my test subjects :), but I am going to add a couple of open-ended questions here. The existing questions will capture some of the frequency and distances involved in participation, but will not provide any insight into why people participate the way they do. I will add the questions, "What would entice you to travel to participate in the activities of another branch?" and "What would entice you to travel outside your current comfort zone to attend an event?"

Page 4 - opinions
This section prompted the most feedback, as I anticipated. As I mentioned before, these are hot-button items for many people, and I deliberately presented them in a closed format to force participants to consider the basic level of answer, without conditions. Based on their feedback, I have decided to alter the wording of question 14 to "Avacal should become its own kingdom." This removes the indication of immediacy, which was, I think, the basis of the need for conditions and the need to comment on that particular question. I have also decided to add a box at the bottom of the page allowing participants to make any comments they wish, not connected to a particular question. Responses might be interesting and will likely only come from those who feel the strongest. One of my commenters felt that allowing justifications might increase the response rate.

Page 5 - awards
Based on specific feedback, I will change the structure of the questions to, "Identify the level of recognition that you have received in the ... Check all that apply: peerage, kingdom, principality, baronial, other, none yet" followed by, "If you wish, please describe the recognition you received". The first question is clearer, I think. While the second may be a little more ambiguous, it may provide richer information.

Based on conversations that ensued when reviewing these questions, I have decided that there are more questions to ask. This information will provide a factual snapshot of where people are at, but there is an underlying emotional issue that I would like to get at. I will add an opinion statement to the previous page, "The award system in the SCA effectively recognizes individual efforts." I will also add the open-ended question, "Please describe if and how recognition affects your enjoyment of participating in the SCA."

Page 6 - recruitment
There was no feedback specific to this section and I can see no reason to make changes :)


I will post the link to the survey when I am finished making all the changes so you can see the final product. I will also be sending the link and an invitation to fill out the survey to as many mailing lists within the principality as I can find. The reading material indicates that I should keep the survey open for three weeks, with reminders sent out at one-week intervals - I will do that and see how many responses I get. There are roughly 360 members, but I have no idea how many are participating.
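The three-week window with weekly reminders is easy to turn into concrete dates. A small sketch (the launch date below is hypothetical, for illustration only):

```python
from datetime import date, timedelta

def survey_schedule(launch: date, weeks_open: int = 3) -> dict:
    """Compute reminder and close dates for a survey that stays open
    `weeks_open` weeks, with reminders at one-week intervals."""
    reminders = [launch + timedelta(weeks=w) for w in range(1, weeks_open)]
    return {"launch": launch,
            "reminders": reminders,
            "close": launch + timedelta(weeks=weeks_open)}

# Hypothetical launch date, chosen only to illustrate the arithmetic.
schedule = survey_schedule(date(2009, 11, 16))
```

For a November 16 launch this gives reminders on November 23 and 30 and a close on December 7 - handy for drafting the reminder emails ahead of time.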

The pre-survey

Assignment 3 - practising survey creation

I think it would be helpful if I provided some background for the survey I'm about to share with you. I am involved in a historical re-creation group called the Society for Creative Anachronism. It is a group that studies and then attempts to re-create the middle ages (generously considered to be 600ish-1600ish). I chose this venue to practise survey-taking because it gives me access to different people for testing the survey and an easy place to put the survey into practise (plus I'm genuinely interested in the results :). Just to clarify a couple of terms within the survey (that won't need to be clarified for the participants): like most international organizations, we are organized as branches. My local branch is called a Barony and includes Saskatoon with the surrounding area (most of the northern half of SK). The next level is called a Principality and encompasses SK, AB, plus the northern and western parts of BC. The Kingdom to which we belong includes the rest of BC, Washington, and Oregon, with the northern tip of Idaho. If you are curious about the organization, please feel free to ask me, visit http://www.sca.org/, or, to view my personal pictures of activities, visit http://picasaweb.google.com/raouldelaroche.

The purpose of the survey is mostly to gather information about the opinions of those who are participating in the activities of the society within the principality. There are several topics within this that could be considered for a program evaluation type of situation. As with most administrations of large groups of people, there have been many attempts (few of them organized programs) to do a multitude of things, ranging from educating the membership to developing specific activities to recruiting and retaining members, etc. As an involved member, I have been involved in many of these "programs" and am curious about their effectiveness - thus the connection to program evaluation :). These thoughts were the primary consideration in choosing questions and topics for the survey. In addition, however, I also contacted a couple of other people in the leadership of the principality to see if there were questions and/or topics that they would like to see included in the survey.

Detailed purposes of each section:
PG1 - demographics & membership
This section is pretty self-explanatory. Its purpose is to gather demographic information and determine whether or not the participant is a member. There is interest in membership because in our society there are only a few activities that require membership to participate. There is a debate as to whether or not this should change. Many people say that 75% of our active participants do not have memberships. I do not believe that to be the case in our principality, and I am hoping to gather some information about both the number of members and the reasons why participants do or do not have a membership. There have been some previous surveys done, and it would be of interest to compare data and any activity (campaigning) done to convince people on either side of the debate.
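The 75% claim could eventually be checked against the survey's counts with a simple one-sample proportion test. A sketch using made-up numbers (the real counts would come from the survey responses):

```python
import math

def proportion_z(successes: int, n: int, p0: float) -> float:
    """One-sample z statistic for an observed proportion vs. a claimed
    proportion p0 (normal approximation; fine for n in the hundreds)."""
    p_hat = successes / n
    se = math.sqrt(p0 * (1 - p0) / n)
    return (p_hat - p0) / se

# Made-up numbers for illustration: suppose 90 of 200 respondents
# report having no membership, tested against the claimed 75%
# non-member rate.
z = proportion_z(90, 200, 0.75)
# An |z| well above 1.96 would suggest the 75% figure doesn't hold
# for this principality's respondents.
```

With those illustrative counts (45% non-members observed vs. 75% claimed), z comes out far below -1.96, which is the kind of evidence that would speak against the folk figure - though of course the survey's actual counts are what matter.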

PG2 - activities
There is an extremely wide range of activity in the SCA. I find it interesting to see what activities people are involved in and to see if there are any trends or themes that stand out.

PG3 - travel
In the last 5-10 years there seems to have been a change in the travel patterns of people in the principality. During many discussions, no explanation has been offered that seems reasonable to very many people. I would like to try to discover what some of the current patterns are and gain some insight into why.

PG4 - opinions
A few of these statements are 'hot button' items during discussion. There are some previous surveys asking similar questions, and I would like to gauge changes in the general opinions of the participants. Questions regarding the transparency of decision making were the topic requested by someone sharing the leadership role at the principality level. Most of these ideas are quite complex and would be difficult to answer in such a straightforward way, but I chose to present them this way deliberately - almost like taking a temperature - I'm hoping to gain a general feel for each issue, not all of the context (that would be a different survey :).

PG5 - awards
I am mostly curious about how this section cross-references with other questions.

PG6 - recruitment and retention
This is a topic of conversation that has become more and more frequent of late. I am hoping that this survey will provide information from a wide cross-section of participants and some good insight that will be useful.

Once I designed the initial questions, I sent them to six people for feedback and comments. I selected a mixture of geographic locations (Saskatoon, Calgary, Edmonton, & Medicine Hat), both genders, and several age ranges. All of these people have had several years of experience in the society, would be interested in seeing the survey collect useful information (I think), and have at some point been involved in administrative matters to varying degrees and in various ways. In hindsight, all of these people are professionals (one is a student), and I probably should have attended to that. As well, I should have asked at least one person who is fairly new to the society to review the questions.

Thursday, October 15, 2009

Wednesday, October 14, 2009

Comments on Logic Model

I will keep trying (and also watching to see if anyone has advice for me) to upload the actual document/chart of my logic model, or a readable image of it, or something, but in the meantime, here are some comments and explanations.

This is a snapshot of the CLS summer school as I see it. I have been involved in the design and implementation since the beginning. The bulk of the logic model comes from my own perceptions of what we have been trying to accomplish, though I have also talked with two of the directors who have also been involved since the beginning (to varying degrees) to see if I am on the right path - more so for input into the evaluation plan that will be posted shortly (in text form!).

What is primarily missing from the chart is the inputs, since those made the most sense to me (considering we are immersed in a facility-wide strategic planning panic right now, they are thus fresh in my brain). So... listing the inputs is easy: staff, money, venues for meetings/lectures, beamlines for practical sessions, social venues, and time.

To be a little more specific, there is staff input regarding the logistics of event planning (room bookings, hospitality, registrations, budgeting, etc.); staff who develop the scientific program in collaboration with the SR community; and collaboration between the aforementioned staff and the user community for promotion. Some of the class documents mention that the list of activities should be specific in listing who does each activity. I chose not to do that because most would be listed as "a collaboration of administrative and scientific staff and user community". Most of the other inputs are pretty self-explanatory.

So, in a nutshell, I am using this snapshot of the program to design an evaluation. There was no needs assessment at the beginning, so I have only documents, notes, and correspondence with which to formulate a concept of what was initially intended (to compare to what we 'remember' we intended to do :). There has been much formative evaluation in the form of feedback surveys and debriefing meetings, so it seems time to do a formal, summative evaluation :) - but that will be the next blog ...
I don't know if it actually helps but I've uploaded the word document here: http://www.lightsource.ca/files/details.php?id=2060 so you can at least look at it.

Logic Model

So yes, if you were paying attention :), I changed my mind. During the last class I said I was interested in evaluating the tours program. Well, I decided that would not be nearly as interesting or as useful as evaluating the facility's summer school (of which I am one of the Co-Organizers annually). It is our fifth school so I think it's time for an outcomes/outputs formative/summative evaluation. Are we doing what we set out to do (summative)? What are we missing that we need to attend to (formative)? And yes, I have the facility's support to implement this (or at least a version of this) even though I'm not registered for the class next term - only because I have all the required credits and I need to focus on the thesis :).

Now just to figure out how to post a document.

I can add a video or picture ...
A picture it is ... can you read it?

Saturday, September 19, 2009

Assignment two

Upon reading through the Student Services Program Description for Programming for Children with Severe Disabilities with an eye towards developing a plan for program evaluation, I found myself asking many questions. As with most programs, there are so many opportunities for evaluation that several questions really should be asked before a model for evaluation is chosen. One of the primary questions is, “What do they want evaluated?”, but others that immediately went through my head included: Do they want a summative or a formative evaluation? Do they want to know if their funding was spent appropriately? Do they want to know if the children designated for this program were properly assessed? Are they wondering if the program was delivered according to the criteria established? Are they wondering if the objectives set out by the teacher were met for each child in the program? Interestingly, my own question, based on this very procedural description of the program (number of visits, who visits, eligibility criteria, etc.), was whether or not the children in the program were having their needs met. I rapidly realized that the answer to this question would depend heavily on who was doing the asking, since parents would be interested in different information than would the provincial funding department or, presumably, the Director of Special Education, for example. The purpose of the evaluation would also be an important aspect to consider in choosing the model, as some models are better suited to making judgements and others to program improvement.

So, for the purposes of this assignment, I am going to make a few assumptions about the context within which this evaluation would take place. I am going to assume that this is a new program that Alberta Education is implementing, that they are at the planning stages, and that they want to ensure the program continually improves and evolves so that the children receive the best services possible - so they are including program evaluation in the initial plan. Another assumption I am making is that, since evaluation is being included in the initial planning and throughout implementation, there is a modest budget included to allow for effective evaluation (though, noting that it is within a public education system, it likely wouldn't be a robust budget).

Given this context, the model I would choose is Stufflebeam’s CIPP. The CIPP framework emphasises the collection of evaluative data whose purpose is to help decision-makers. It allows formative evaluation to occur during implementation, with adjustments made accordingly so that processes improve, as well as summative evaluation to evaluate the product of the program. With information gathered in each of the four areas (context, input, process and product), this evaluation should be able to provide a broad-based, comprehensive picture of the program as it is implemented and, eventually, of the outcomes (I assumed this was important since each program is individualized for a child’s specific needs). Such descriptive information would be important to the many stakeholders of a public education program and would also be useful for decision-makers. Understanding that catering formative evaluation strictly to decision-makers’ questions and needs might garner criticism from other stakeholders in a public education program, I would develop a participative approach to the planning, including decision-makers from various audiences (representatives from government funding, program developers, teachers and other education personnel delivering the program, and parents) in a focus group type of setting. This would have to be done very carefully to ensure that the group did not become so large that it lost focus, and that the plan for evaluation did not become so complex as to render the outcomes of the evaluation process useless.
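One way to keep such a plan from becoming too complex is simply to keep the guiding questions organized by CIPP component. A minimal sketch (the questions below are illustrative examples of mine, not the actual evaluation plan):

```python
# Illustrative only: an evaluation plan organized by the four CIPP
# components (context, input, process, product). The questions are
# examples, not the program's real guiding questions.
cipp_plan = {
    "context": ["What needs is the program meant to address?"],
    "input": ["Are budget and staffing adequate for the planned services?"],
    "process": ["Is the program being delivered as designed?"],
    "product": ["Were each child's individualized objectives met?"],
}

def question_checklist(plan: dict) -> list:
    """Flatten the plan into a review checklist, tagged by component."""
    return [(component, q) for component, qs in plan.items() for q in qs]
```

Keeping each question tagged by component makes it easy to check, at each consultation with stakeholders, that all four areas are still covered and that no one component has ballooned.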

I think that the cyclical nature of the “process of delineating, obtaining and providing useful information to decision-makers, with the overall goal of programme or project improvement” (Robinson, 2002, p. 1), carried out in periodic consultation with representatives of the major stakeholders, means that using the CIPP model to evaluate this programme would certainly provide for a stronger educational program for these children - primarily because the program description does not outline, to my satisfaction, what the purpose of the program is, and this model begins with an evaluation of that context, which would, in my opinion, be valuable since everything else stems from that.

Note that additional material, including the quote, comes from:
Robinson, Bernadette. The CIPP approach to evaluation. COLLIT project: A background note from Bernadette Robinson. 4 May 2002. Retrieved from the Commonwealth of Learning Discussion Area web site 2009-09-19. http://hub.col.org/2002/collit/att-0073/01-The_CIPP_approach.doc

Stufflebeam, Daniel L. CIPP Evaluation Model Checklist: A tool for applying the Fifth Installment of the CIPP Model to assess long-term enterprises. June 2002. Retrieved from the Western Michigan University: The Evaluation Center web site 2009-09-19. http://www.wmich.edu/evalctr/checklists/cippchecklist.htm

Wednesday, September 16, 2009

Assignment 1

Wonderwise Women in Science Learning had developed activity kits designed for formal classroom use. The Nebraska State Museum and 4H wanted to adapt these kits for informal use, develop a few new ones, and disseminate all of them across 10 states. They made their implementation plans very carefully, including this evaluation of the process. The purpose of the report is very clearly a summative evaluation of the effectiveness of the dissemination process - not of the kits or program itself, but of the process it took to spread the kits across the 10 states, with high use by 4H clubs at the end of three years, resulting in recommendations for other curriculum developers. The process was broken into five phases with four specific goals (a record of the process, documentation of the different strategies between states, an assessment of the effectiveness of each, and recommendations for others). The collaborative evaluators (the Nebraska State Museum and the Centre for Instructional Innovation) identified guiding questions and described the qualitative (telephone interviews) and quantitative (demographic surveys) methods of gathering information from the various sources needed to evaluate each phase, carefully considering the scope and diversity of sites and situations involved in the program and comparing and assessing the effectiveness of the different processes. The flexibility built into the process - allowing differing states, people, and informal situations to tailor the process to suit them - turned out to be the most significant limiting factor in the evaluation but, ironically, was identified as a strength in the dissemination of the program, accentuating a couple of challenges in program evaluation (dealing with politics and balancing scientific soundness with practicality).

Like most things in the real world, this evaluation displays elements of more than one model of program evaluation. It is decidedly a formalized evaluation, purposefully prepared and carried out. The qualitative data gathered is not merely anecdotal, but descriptive. The first set of interviews was structured, but the second was semi-structured, and the timing was adjusted to suit the state and situation of dissemination in that location - this necessary adaptation of the evaluation process is itself descriptive data. All of this information was summarized for the purpose of assessing its effectiveness in the process (a judgement) and producing a recommendation for adapting future dissemination processes. Although specific intentions were not stated in the report, descriptions of communication issues and differences in participant expectations indicate attention to the congruence between what was intended and what was observed (both some failure in theory or planning as well as in consistent implementation of the plan created). Finally, the fact that differences in processes used from state to state were identified as successes and failures indicates a recognition that there is a connection between variables. All of these characteristics point towards the evaluators following a Stake-Countenance model of evaluation. The only exception is the expectation that the observations would be compared to a standard. I did not get the sense that the evaluators had a defined standard they were comparing their information to, merely that some processes worked better than others given the state and/or situation the program was in.

Although, I could easily make a case that these evaluators were following Provus’ Discrepancy model. The opening sentence of the Conclusions is, “It would be nice at this point to be able to offer a recipe for success from this project that could be applied to other similar dissemination projects.” It could be argued that the purpose of this evaluation was to improve existing models of dissemination and establish better programs. The stated purpose within the report identifies several purposes and the audiences the information is being gathered for: a record for the Principal Investigators of 4H; documentation of different strategies for 4H leaders; evaluative information for the funding sponsors; and recommendations for informal educators - all of which sounds much like accountability and efforts to help administrators make wise decisions. The report is a description, a summary of differences, and an assessment laid out step-by-step.
If I had to choose, I would say that Frerichs and Spiegel agree with Provus’ Discrepancy model.

Frerichs, S. W., Spiegel, A. N. Dissemination of the Wonderwise 4H Project: An Evaluation of the Process. Retrieved from: http://www.hfrp.org/out-of-school-time/ost-database-bibliography Thursday, September 10th, 2009.

Tuesday, September 15, 2009

A starting point

For the purposes of this assignment and my own learning, I thought I would try to find what might be an exemplary evaluation of a program so that I could see ‘how it could be done’. In my search I stumbled upon a resource that I thought I should share. The Harvard Graduate School of Education has a project called the Harvard Family Research Project (http://www.hfrp.org/), where they help “stakeholders develop and evaluate strategies to promote the wellbeing of children, youth, families and their communities.” They have developed a searchable Out-of-School Time Program Research and Evaluation Database and Bibliography. For an out-of-school-time educator and masters student studying the same, like me, there is a wealth of information here. It is from this database that I found the exemplary evaluation I was interested in.