Expanding our Evaluation Practice: Community Development in 2017

Hi from Kathy Haynie, co-PI for CSONIC and Director of Haynie Research and Evaluation. I want to tell you about what we’ve been up to in the CSONIC community so you can understand who we are and get excited about what we have to offer!

CSONIC: YEAR 1

This is a story about a networked improvement community (Bryk, Gomez, & Grunow, 2011) that began in spring 2016 when Tom McKlin (The Findings Group) had the idea of connecting Computer Science evaluators to build individual and collective evaluation capacity. Tiffany Berry (Claremont Graduate University) joined our team, and we were funded by NSF in September 2016.

The first and most critical pieces were to define our community and to learn more about existing evaluation capacity and needs. We started with a list of about 85 evaluators from CS10K projects and conducted a needs analysis study (n=50). Maybe you completed our survey?

For a long list of evaluation practices, we wanted to know whether people typically engaged in each practice, their reasons for not doing so, and their level of interest in having more resources and support for improving it. We asked about describing a program, measuring implementation, measuring student outcomes, and justifying conclusions/ensuring evaluation use.

The results were quite compelling, particularly when parsed by evaluator experience. Here’s the report if you are curious:

Needs Analysis

Here’s an example—what we found in the area of evaluating student skill development:

  • 54% of respondents said they collect information on this
    • 60% of those with 10+ years of experience do, versus 38% of those with less than 10 years
  • 46% don’t, citing lack of time (50%), budget (50%), or low confidence in measuring it (43%)
  • 74% are interested in additional resources/support to improve this practice

We learned that evaluators with different levels of experience are interested in professional learning opportunities and resources, particularly for assessing student learning and classroom instructional strategies. We came out of this process with more developed ideas for facilitating collective learning: nurturing collaboration and resource sharing among the CSONIC community, curating a measurement repository, and continuing to hone CSONIC strategies through engaging evaluators in dialogue.

In March 2017, there was an opportunity to pull together an Empowerment Evaluation (Fetterman, Kaftarian & Wandersman, 2015) session at the CISE Broadening Participation and Education in Computing PI and Community Meeting. We were very lucky to have David Fetterman lead this meeting, and Jason Ravitz co-facilitate along with Tom McKlin and myself. Thirty-seven participants (mostly CS10K evaluators) joined us for a two-hour conversation brainstorming and defining our collective mission, current status, and goals.

[Photo: David Fetterman]

The major outcome of this session was five Core Strategies in support of the collective mission:

  1. Organize groups to lead knowledge-building (via workshops, webinars…)
  2. Establish and nurture relationships among evaluators, social scientists, and PIs
  3. List, curate, and encourage the use of measurement repositories
  4. Build an interactive community site for hosting webinars, pointing to repositories, curating instruments and approaches, and communicating the CSONIC mission
  5. Engage leaders in the field (in evaluating implementation, in assessing content knowledge, institutional change, etc.) to build and deliver “instructables” through the interactive community site.

To read more about this meeting and process, check out our session report:

Empowerment Evaluation Session Report

So far so good, but what next? The EWC had been going for over a year and was already working with various organizations to create a repository of evaluation measures and to carefully build evaluator capacity through a community-based approach. In addition to CS10K evaluators, the EWC gave CSONIC access to a network of evaluators from multiple initiatives. EWC members included Ann Martin and Kim Kelly from the AEA STEM TIG; Kelly and Ann had secured a commitment from Oak Ridge Associated Universities (ORAU) to build a STEM measures repository. So I became more involved with the EWC, to be a “voice” for the CSONIC community in planning the repository.

Soon after the CSONIC Empowerment Evaluation session at CISE, David and Jason led a second EE session – this time with the EWC. The resulting goals and strategies were similar to those of the CSONIC session. A blended mission statement was vetted by the EWC that included both STEM and CS evaluation. Members of the EWC organized into subgroups to work in these areas: (1) designing the STEM repository, (2) collecting STEM and CS measures, (3) building a network of corporate stakeholders, and (4) educating the CS community about evaluation. Results of our process are listed below:

Our purpose is to improve STEM and CS education through evaluation. Our current activities include:

  • designing and building a repository of STEM and CS evaluation instruments, tools, guides, and related resources
  • creating a centralized hub to help STEM and CS evaluators locate resources and engage in dialogue about them
  • building a network of corporate stakeholders
  • teaching about effective measures
  • educating the CS community about evaluation

Video of Session:

We are partnering with many organizations and projects on these activities, with the aim of building evaluator capacity at all levels of practice and, in the process, improving the quality of STEM and CS education.

From about June through October, I facilitated biweekly meetings of the STEM repository group. We developed a functionality and specifications document that we discussed and honed for about two months. This included a diagram in RealtimeBoard, a collaborative whiteboard tool that team members could edit synchronously during our meetings.

The design called for particular input/search/output fields and supported the idea that the repository would have multiple “front ends” – landing pages (like this one!) for the various organizations within the community.
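To make the design concrete, here is a minimal sketch of what a repository record and a per-organization search filter might look like, written in TypeScript. Every field name, type, and function here is my own assumption for illustration; the real input/search/output fields were defined in our specifications document.

```typescript
// A minimal, hypothetical sketch of a repository record and search
// filter. Field names and types are illustrative assumptions only;
// the actual specification lived in the EWC functionality document.

type ResourceType =
  | "instrument"          // instruments, measures, scales, protocols
  | "report"              // reports or deliverables
  | "planning-resource"   // evaluation planning resources
  | "repository"          // other relevant repositories or databases
  | "intro-resource"      // introductory resources for STEM/CS evaluation
  | "community";          // existing communities, listservs, groups

interface RepositoryRecord {
  title: string;
  contributor: string;
  resourceType: ResourceType;
  discipline: "STEM" | "CS" | "both";
  description: string;
  url: string;
  keywords: string[];
}

// Each organization's "front end" could be a landing page that applies
// its own default filter over the same shared collection of records.
function search(
  records: RepositoryRecord[],
  query: { discipline?: "STEM" | "CS"; type?: ResourceType; text?: string }
): RepositoryRecord[] {
  return records.filter((r) => {
    if (query.discipline && r.discipline !== query.discipline && r.discipline !== "both") {
      return false;
    }
    if (query.type && r.resourceType !== query.type) {
      return false;
    }
    if (query.text) {
      const needle = query.text.toLowerCase();
      const haystack = [r.title, r.description, ...r.keywords].join(" ").toLowerCase();
      if (!haystack.includes(needle)) {
        return false;
      }
    }
    return true;
  });
}

// Example: a CS-focused landing page might default to CS instruments.
// const csInstruments = search(allRecords, { discipline: "CS", type: "instrument" });
```

The key idea this sketch tries to capture is that all front ends query one shared collection, so each organization can brand its own landing page without fragmenting the underlying data.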

One challenge was the timing of the repository development. ORAU made it clear that the build would not begin until Summer 2018. In the meantime, we had a community poised for action! We decided to start collecting measures for the repository, even if the final “home” was not yet ready. In fact, some members of the EWC, including Ann Martin in connection with the STEM TIG, had already been collecting measures.

The STEM repository group, spearheaded by Ann Martin (and yours truly), developed an input form, which you can find by going here or checking out the Access Instruments section of this site.

Our group decided to include a variety of resources in the repository:

  • Reports or deliverables
  • Evaluation planning resources
  • Other relevant repositories or databases
  • Introductory resources for STEM or CS evaluation
  • Existing communities, listservs, professional groups
  • Evaluation tools (instruments, measures, scales, protocols)

We’ve also created an interim Awesome Table interface (until the searchable database is built), which allows you to see what has been contributed so far. If you’ve read this far, please contribute a quality STEM resource!

You will undoubtedly have questions about CSONIC and this expanded EWC evaluation community (or “coalition,” as Jason calls it). I encourage you to jump on the Let’s Work Together section of this site to ask questions on our discussion board. Or you can always email me at kchaynie@stanfordalumni.org, and I will add you to the EWC group if you are interested.

What is next? Well, refer to our mission statement and core strategies for a sense of our current directions. Or better yet, join us and help write the next chapters!