PLEASE NOTE DATE CHANGE: Save the Date – Hallie Preskill is coming to RTPE on June 14!!

RTPE is thrilled to announce that our June workshop, led by Hallie Preskill, will focus on strategies that can be used to facilitate intentional group learning. In this hands-on session we will experience and learn how to use strategies such as data placemats, chalk talks, and gradients of agreement from FSG's new guide. In recent years, RTPE has focused on evaluation capacity building, stakeholder engagement, and data visualization. This session is designed to build on these topics by providing hands-on training with activities you can use with your clients and colleagues.

As most of you probably know, Hallie is a former President of AEA. She has published multiple books on evaluation and is one of the field's leading contributors on evaluation capacity building and appreciative inquiry. Hallie is currently a managing director at FSG, where she oversees the firm's Strategic Learning and Evaluation practice. What some of you may not know is that Hallie moved to the RTP area within the last year, and she just joined RTPE! Whether you have the chance to meet her at the workshop in June or some other time soon, we hope you will help welcome Hallie to North Carolina!

This year's session will be on June 14 at Duke's Bullpen entrepreneurial space in Durham. The workshop will run from 9am to 2pm (lunch will be provided). Seating is limited to 45 attendees; as always, members will have priority. Registration will open March 1, so please mark your calendars and stay tuned!

Hope to see you all at this month’s meeting.

Data Visualization Meeting Recap

We had a great meeting last Friday. As one of the presenters, I want to thank everyone in the audience for your attention and great questions.

One thing that stood out for me in the Q&A portion was the focus on the ethical issues and practical complications associated with data visualization. As evaluators, I think we're uniquely qualified to really delve into these issues. If you were there, what takeaways did you bring back? Let us know in the comments.

Full Video

Guess what? The whole presentation was captured on video and put up on YouTube. You can thank Megan Williams for this. The room wasn't set up well enough (especially the lighting) to get anything high quality, but the sound is good.

I've provided all the slide decks and resources below so you can follow along. I've also added the general starting points for each presentation.

Amy Germuth – Data Visualization Implications for Evaluation

Here is Amy's presentation. Amy kicks off the video embedded above; her presentation starts at about the 4-minute mark if you want to play along right here…

Chris Lysy – Recent Developments in Quantitative Data Visualization

My presentation kicks off about 22 minutes into the video above. Here is the slide deck if you want a closer look.

Felix Blumhardt – Data Visualization Tips

Felix starts at about the 42-minute mark. No PowerPoint; instead, Felix provided a couple of handouts. Here they are…

NetLEAD Infographic

Want to create one of your own in PowerPoint? Felix provided a handout with instructions, which you can download here: Steps for Creating an Infographic in PowerPoint

She also provided a list of resources and tips.
Data Vis Resources

Q&A

The Question and Answer portion starts at about the 53-minute mark. The discussion can continue below in the comments section, so let us know what other questions you have or points that stuck out for you.

Additional Resources

Here is a post from Robert Kosara’s blog where you can find Jason Moore’s Hippocratic Oath.

Stephanie Evergreen's book was mentioned a few times; here's a trailer I worked with Stephanie to create.

I mentioned The Internet Archive's Wayback Machine, Tableau, Windows Live Movie Maker, NodeXL, Ann K Emery, and ThumbsUpViz.

Felix's instructions are based on this guide: The Marketer's Simple Guide to Creating Infographics in PowerPoint [+Templates]

I'm sure I'm missing things. What else would you like links to? Let me know in the comments.

Data Visualization: New Directions, Implications, and Infographics

Mark your calendars!

The next meeting will be held on Friday, November 15, 2013 at the RTP Marriott, beginning at 11 am.

And at this meeting there will be not one, not two, but three presentations! Each one addresses the very popular topic of data visualization and its implications for evaluation.

Meeting Agenda

  • 11:00am – 11:10am Announcements and Introduction
  • 11:10am – 12:15pm Data Visualization: New Directions, Implications, and Infographics
  • 12:15pm – 1:00pm Optional networking lunch at Newton’s Restaurant located in the hotel

Amy A. Germuth, Ph.D. on the Implications of Data Visualization History for Evaluation

Amy A. Germuth is the founder and president of EvalWorks, LLC, which designs, manages, and conducts formative and summative evaluations of PreK-16 education initiatives at the local, state, and national levels. Much of her work involves evaluating science, technology, engineering, and math (STEM) initiatives funded by the US Department of Education and the National Science Foundation. Other current and past clients include The Bill and Melinda Gates Foundation, The Pew Charitable Trusts, the New York State Education Department, Chicago Public Schools, and multiple colleges, universities, and school systems.

As a committed member of the American Evaluation Association, Amy is the outgoing chair of the Data Visualization and Reporting TIG and former chair of the Independent Consulting TIG.

Amy will be presenting her New Directions for Evaluation chapter about the history of data visualization and its impact on and implications for evaluation, including what data visualization is, what data visualizations should do, and what they can't do.

Chris Lysy on Interactivity and Approachable Data

Chris Lysy has 10 years of experience in research and evaluation, working in for-profit and non-profit settings; he currently holds a position as an analyst at Westat. Chris is committed to helping evaluators effectively use the web.

His professional interests in social media and data visualization have led to the creation of several blogs, including freshspectrum.com, evalcentral.com, and rtpeval.org. In the upcoming year he plans to take his experience and apply it directly to practical challenges faced by evaluators.

Chris will be discussing his New Directions for Evaluation chapter: Developments in Quantitative Data Display and Their Implications for Evaluation. Specifically, he will focus on new opportunities provided by the web that allow evaluators to analyze and present large and sometimes complex datasets in an approachable manner.

Felix Blumhardt, Ph.D. on Creating a “Sustainable” Infographic

Felix Blumhardt is the Regional Manager of the Carolinas for The Evaluation Group, a small southeastern evaluation company with offices in Columbia, SC and Atlanta, GA. The Evaluation Group conducts utilization-focused, participatory evaluations for large and small education and human service grants.

Prior to working with The Evaluation Group, Felix conducted research and evaluation for The Center for Training, Evaluation, and Research of the Pacific Rim at the University of Hawaii – Manoa. When Felix is not sifting through data and engaging stakeholders, you may find her white-knuckling her car seat as her 15-year-old learns how to drive.

Felix will present on infographics. Infographics are a trendy option for reporting to stakeholders. They can be catchy and palatable. But, as Paul Harvey would ask, what is the rest of the story? Felix will demonstrate how to create an infographic that is "sustainable" and briefly discuss the pros and cons of its use.

To Attend

Please RSVP to Amy Germuth (RTPE President) at agermuth @ gmail . com

Notice: Our next meetings will be held February 21, 2014 and May 16, 2014.

Previewing Evaluation 2013

The American Evaluation Association’s Evaluation 2013 kicks off this week in Washington DC with professional development workshops starting today (October 14, 2013) and the main portion beginning on Wednesday afternoon (October 16, 2013).

Here are some of the RTP Evaluators you will find presenting.

  • Kristin Bradley-Bull
  • Tobi Lippin
  • Karen Peterman
  • Holli Gottschall Bayonas
  • Sally Bond
  • Amy Germuth
  • Joy Sotolongo
  • Myself (Chris Lysy)

For more on what they will be presenting, and a few cartoons, continue on.  Hope to see you in DC!

Kristin Bradley-Bull and Tobi Lippin

(Cartoon: improvisation)

Who Knows?  Engaging Laypeople in Meaningful, Manageable Data Analysis and Interpretation

Professional Development Workshop to be held in Columbia Section 8 on Wednesday, Oct 16,  8am-11am

How can evaluators simultaneously support high-quality data analysis and interpretation and meaningful participation of “laypeople” such as program participants and staff?

This workshop offers a practical look at some of the key strategies developed over a decade of facilitating these processes. Learn how to provide targeted, hands-on data analysis and interpretation training and support; develop accessible intermediate data reports; and carefully craft meeting agendas that succeed in evoking high-quality participation and analysis. This workshop will provide many take-home tools and give you a bit of hands-on experience.

Who will get the most out of your presentation?

Anyone who values – and maybe even uses — participatory approaches to evaluation (or anything data-related) but who hasn’t yet figured out how to apply these approaches specifically to the stage of analyzing and interpreting data. We are all about “how to” and will make sure people leave with concrete approaches they can apply right away.

Can you give me a little teaser?

The recipe might be: a cup each of facilitation, analysis, and training skills; a pint of trust in a group of committed people to move a piece of collective work forward in a meaningful way; and “improvisation/responsiveness” to taste.

What’s special about your presentation?

More than half the workshop will be spent with rolled-up sleeves working on exactly what we are talking about. (Did I mention we are very practical here?!)

Where would you like to refer people interested in your presentations?

We love to share what we’ve learned over, maybe, 15 years thus far of developing (and, sometimes, fumbling with) various approaches to stakeholder-engaged analysis and interpretation. Thank goodness our varied groups of stakeholders have been both flexible and willing to give a lot of feedback on their experiences! We have learned – and continue to learn – so much from them. Our website has various resources for people interested in this work: www.newperspectivesinc.org

Facilitation: An Essential Ingredient in Evaluation Practice

Think Tank Session 83 to be held in Columbia Section 11 on Wednesday, Oct 16, 6:10 PM to 6:55 PM

There are many intersections between evaluation and facilitation. In evaluation, facilitation can play a role in helping groups map a theory of change, in data collection through focus groups or other dialogues, and in analysis by involving stakeholders in making meaning of the findings. While each of these steps is described in evaluation texts and the literature, less attention is given to describing facilitation approaches and techniques. Even less is written about evaluating facilitation practices, which are integral to organizational development and collaborative decision-making. Choices of facilitation methods depend on the client, context, and priorities of the work, as well as the practitioner's skill, confidence, and philosophy. This think tank brings together a group of evaluators and facilitators collaborating on a publication about these complementary practices. We hope to spark a deeper conversation and reflections among participants about the role of facilitation in evaluation and of evaluation in facilitation.

Who will get the most out of your presentation?

First, let us say that we were invited to join this think tank, so our perspectives may be somewhat different from the fabulous people who are convening all of us: Dawn Hanson Smart, Rita Fierro, and Alissa Schwartz. That said, this think tank will be particularly interesting to evaluators already intentionally using facilitation in their work.

Can you give me a little teaser?

How about a couple? We’re all thinking deeply about how facilitation is applied in an evaluation context and the implications. We are also thinking about how the field of evaluation has growth opportunities as viewed through a facilitation lens.

What’s special about your presentation?

We love AEA conferences for many reasons – among them the number of sessions and other spaces that promote dialogue among the many, many interesting and engaged people who attend. This think tank will be in World Café style. The session is taking place because all of us – facilitators and evaluators — under the able editorial leadership of Dawn, Rita, and Alissa, are working on developing an upcoming New Directions for Evaluation issue on the intersection(s) of evaluation and facilitation. We welcome AEA folk to come help shape this conversation.

Karen Peterman

(Cartoon: lever)

Getting Ahead of the Curve: Evaluation Methods that Anticipate the Next Generation Science Standards (NGSS)

Panel Session 63 to be held in Piscataway on Wednesday, Oct 16, 6:10 PM to 6:55 PM

The state of STEM evaluation practice in the early 21st Century is in transition as we await the final release of the Next Generation Science Standards (NGSS), which will guide research and evaluation in science education in the coming years. This panel features STEM evaluation methods that have been developed and field-tested in anticipation of the NGSS. The panelists are both evaluating supplemental education programs that promote instructional practices consistent with the NGSS in middle and high school classrooms, and have taken advantage of the opportunity to explore evaluation methods that will reveal teaching and learning consistent with NGSS concepts and practices. Each panelist will share specific methods and field test results, including performance-based assessments and a classroom observation protocol supported with virtual student artifacts generated as part of a technology-supported science curriculum. With feedback from the audience, the panelists will reflect on the merits and challenges of the work.

Who will get the most out of your presentation?

STEM evaluators who are interested in NGSS and methods that anticipate this policy shift

What’s special about your presentation?

This panel is sponsored by the new STEM TIG. This is our first AEA with STEM-sponsored sessions through the TIG, and so the panel is special for that reason. Beyond that, Kim and I are excited about sharing our work and then talking with others about whether this is the direction that STEM evaluators’ work should be heading or whether there are other compelling new directions for those who specialize in STEM evaluation.

Reframing: The Fifth Value of Evaluators’ Communities of Practice

Panel Session 20 to be held in Columbia Section 1 on Wednesday, Oct 16, 4:30 PM to 6:00 PM

If You Aren’t Part of the Solution… Reframing the Role of the Evaluator

Evaluators can play a key role in changing the complex adaptive systems in which our work is embedded, but taking this responsibility necessitates reframing ideas about what it means to be an “external evaluator,” as well as the scope of our work. The presenter, a seasoned STEM evaluator, will share the ways her perspectives on the evaluator’s role have evolved as part of the ECLIPS community of practice, as well as the impact that this shift has had on her evaluation practice. Examples include (a) using participatory evaluation methods, (b) consulting with clients to create a fuzzy logic model, (c) positioning conclusions and recommendations in relation to the complex system itself; and (d) challenging clients to think about system-level change and how their projects really can make a difference.  The presenter will also explore whether/how evaluators consider themselves to be active and contributing members of the complex systems they evaluate.

Who will get the most out of your presentation?

Anyone who is interested in systems theory and the impacts it can have on evaluation as a practice and/or evaluators as professionals

Can you give me a little teaser?

Evaluators should start envisioning ourselves as levers: small changes that can produce large changes in a system. What the heck does that mean? Come to the session and find out!

What’s special about your presentation?

I think all of the panelists have studied p2i pretty extensively. We have worked hard to make our presentations beautiful and thought provoking.

Holli Gottschall Bayonas

(Cartoon: other specify)

Evaluating a Spanish Basic Language Program at a Mid-sized Southeastern University

Poster Presentation 101 to be held Wednesday, Oct 16, 7:00 PM to 8:30 PM

The author and the Director of Language Instruction at a mid-sized public university are designing an evaluation of the Basic Language Program in Spanish. The Director has engaged in informal evaluation of the program since assuming the position in Fall 2009, but a formal evaluation must work within the contextual factors of the university, such as adhering to regional accreditation requirements, focusing on student learning outcomes, and the traditional program review process led by the university's Institutional Research office and the office of the Chancellor. The presentation will document the steps involved in designing the evaluation, the design itself, and some preliminary results. Of particular interest will be the integration of the Student Learning Outcomes within the logic model.

Ignite Your Education: Evaluation of Teaching and Learning and Schools

Ignite Session 253 to be held in Lincoln West on Thursday, Oct 17, 1:00 PM to 2:30 PM

The Open-Response Rating Question: A Jerry Rigged Question-type in Qualtrics™ to get at What and How Much of School Initiatives

This ignite presentation will share how the author used a variation of the matrix table question-type in Qualtrics™ to ask survey respondents about other programs that may be affecting the climate in their schools. The presentation will include the context for the question, how to create the question on a survey, visuals of how the data appears on reports, and how the author has used the question type to show the plethora of education initiatives that exist in K-12 schools.

Sally Bond

(Cartoon: late 20th century)

Using Learning and Mentoring to Build Evaluation Capacity in 21st Century

Multipaper Session 622 to be held in Suite 2101 on Friday, Oct 18, 2:40 PM to 4:10 PM

Adult Learning Theory and Theories of Change in Evaluation Capacity Building Initiatives

Increasing demand for program accountability and results, combined with limited resources for external evaluators, puts pressure on internal program staff to assume more responsibility for program evaluation. Evaluation capacity building (ECB) initiatives target these program staff to increase their knowledge and skill at evaluating the implementation and outcomes of the programs they operate.  ECB initiatives often reference the use of adult learning *principles* in specific activities; however, the theories of change underlying these initiatives rarely make use of more complex adult learning *theories.*  ECB practitioners can benefit from a deeper understanding of adult learning in order to design interventions that promote deeper learning of evaluation principles and methods.  This paper examines (a) the extent to which adult learning theories currently imbue theories of change in ECB projects and (b) the ways in which selected adult learning theories might inform ECB practice.

The State of Evaluation Practice in the Early 21st Century: How Has the Theme of Evaluation 2013 Influenced Our Beliefs?

Plenary Session 994 to be held in International East on Saturday, Oct 19, 4:30 PM to 5:30 PM

Over the past few days we have been conversing with our colleagues on topics that speak to our expertise, our interests, and our curiosities. Throughout, one of the filters we have applied has been our conference theme, "The State of Evaluation Practice in the Early 21st Century". How has this filter influenced what we believe about ourselves as evaluators, about our field, and about what we can do for the groups we serve? The closing plenary will challenge us to address this question by giving us answers from a disparate group of evaluators. The panel represents variety with respect to tenure in the field, domain expertise, employment sector, and personal background. As a spur to collective discussion, panel members will spend a few minutes sharing their thoughts about the conference theme.

Amy Germuth

(Cartoon: tufte)

Data Visualization in the 21st Century

Multipaper Session 296 to be held in International Center on Thursday, Oct 17, 2:40 PM to 4:10 PM

The History and Future of Data Visualization and its Impact on and Implications for Evaluation

This paper sets the stage for the other papers by exploring the history of data visualization, including its roots in cartography, statistics, data, visual thinking, and technology, and its impact on social sciences and society. Next, attention is given to current trends in data visualization, followed by predictions for its future, with a focus on implications for evaluators and evaluation. Discussion centers on how data visualization will result in a) greater expectations among the public for transparency and data-informed decision-making, b) greater involvement of stakeholders in data mining and analysis, c) greater needs for evaluators to create systems that incorporate measurement and real-time reporting to drive the data-informed culture, and d) greater recognition by evaluators of the value of building the capacity of stakeholders to identify data needs, understand available data, and know their limitations in both analysis and interpretation, driving more serious thought regarding effective data visualization and reporting.

Independent Consultants at the Crossroad – What Independent Consultants Report as Trends and Challenges in Evaluation and Evaluation Consulting

Business Meeting Session 392 to be held in Columbia Section 7 on Thursday, Oct 17, 6:10 PM to 7:00 PM

This presentation will: (1) provide an overview of independent consulting within the broader sphere of evaluators; (2) present the results of a web survey conducted in December 2012 to better understand the perspectives of independent consultants, as identified via membership in the AEA ICTIG (responses from 140 members were qualitatively analyzed to identify what they perceive as future trends in evaluation and evaluation consulting over the next five years, as well as the challenges they currently face); (3) report on ways in which the ICTIG might better support independent evaluation consultants, including what training ICTIG members reported would be most useful; and (4) conclude with a summary and discussion of the implications of the survey responses.

Joy Sotolongo

An Apple a Day Keeps the Evaluator Away? Engaging Health Care Providers in Evaluation of Community-Wide Teen Pregnancy Prevention Initiatives

Panel Session 423 to be held in Kalorama on Friday, Oct 18, 8:00 AM to 9:30 AM

A Gathering of Unusual Suspects: One County’s Evaluation of Increased Access to Teen Health Services

This presentation will describe contributions from an array of multi-disciplinary partners for an evaluation of the President’s Teen Pregnancy Prevention Initiative demonstration project in Gaston County, North Carolina. In addition to the usual suspects (health care staff), unusual suspects (advertising professionals, county planners, youth development programs) and teens themselves conduct evaluation activities. The presentation will describe the role of each contributor, the type of data they bring to the table, and how their multiple perspectives provide a more complete and interesting picture of the project’s experiences with increasing teen access to health services. Examples of data from the multi-disciplinary perspectives will include social marketing web analytics; detailed census tract maps; pre/post survey results; and depiction of teen experiences in their own voices.

Chris Lysy

Evaluation Blogging: Improve Your Practice, Share Your Expertise, and Strengthen Your Network

Think Tank Session 770 to be held in International Center on Saturday, Oct 19, 9:50 AM to 10:35 AM

Want to start blogging about evaluation, but not sure where to start? Started, but want to know what to expect (or what to do next, or how to keep it going)? Ready to take your independent consulting practice to the next level? Or just want to have fun with a new way of communicating with fellow evaluators? In this Think Tank session, you will hear from bloggers with varying degrees of blogging experience who blog through a variety of channels and formats — personal blogs, blogging on behalf of an employer, writing for AEA365, blogging through cartoons and videos, or blogging by guest-posting or co-authoring blog posts. Facilitators will share strategies for success and address potential concerns relevant to both novice and veteran bloggers in an interactive format with break-out groups and opportunities for participants to ask specific questions. We’ll end with a discussion of collaboration across the blogging community.

Who will get the most out of your presentation?

Evaluators who are blogging, or thinking about blogging.

What’s special about your presentation?

The range of experience. Sheila Robinson (http://sheilabrobinson.com/) has been blogging for about a year, Ann Emery (http://emeryevaluation.com) for a couple of years, and myself (freshspectrum.com and evalcentral.com) and Susan Kistler (http://aea365.org/) for more than a few years. The session will be discussion focused and only loosely structured, so come with questions!

Communication in Evaluation: a Q&A with Kelci Price and August meeting preview

On Friday, August 16 we’ll be meeting at the usual spot (RTP Marriott) for our August meeting.  Book club will start at 10 AM, professional development session at 11 AM, and the networking lunch in Newton’s Restaurant at 12 PM.

Send an RSVP to Amy Germuth (agermuth @ gmail . com) if you plan to attend.

This month we'll be doing something new: beaming in Kelci Price via webinar to lead our professional development session. I have a little pre-meeting Q&A down below; here's the preview.

Communication in Evaluation: Professional Development Preview

Communication in evaluation involves translating complex data into understandable information for stakeholders. It sets the stage for the utilization of evaluation findings, so it is a foundational skill for all evaluators. In recent years there has been a renewed focus on how to better communicate evaluation findings in ways that engage stakeholders, improve their understanding, and inspire them to take action.

The key elements of good communication include considerations of content structure and visual communication (e.g., data visualization). This presentation will demonstrate how these concepts can be addressed in both written deliverables and presentations. The goal is to provide both novice and veteran evaluators with innovative ideas and concrete strategies for creating reports and presentations that will communicate their messages, engage audiences, and inspire stakeholders to action. Real-life examples are included, as well as resources for evaluators to use in their own practice.

A little Q&A with Kelci Price

Kelci was kind enough to answer a few of my pre-meeting questions.

You will be our first ever webinar. When the webinar starts, what will the group see: your face, your slides, or both?

“I’ll admit that I’d just assumed I’d put my slides up first, and I don’t have a webcam on my computer. But now that you mention it, I should start with a picture of me!”

Now let’s go beyond preview, how about a little teaser? What is one of the strategies or ideas you plan to address?

“There are some really simple changes you can make to amp up the utility of a report. One that I love is using your headings differently. Instead of using a generic heading, have the heading reflect a take-home message (e.g., “Successful clinics organize their staff differently”). A simple but powerful way to draw in the audience!”

Instead of using a generic heading, have the heading reflect a take-home message

It seems like the TED conference and the loads of experts found on the web are having an impact on the way we approach presentations. It also seems to be ratcheting up audience expectations. Are the days numbered for the bullet-point-laden, clip-art-filled presentations of the past?

“I certainly hope so. It seems like we might be increasing our understanding that presentations should be informative and inspiring – even the ones about evaluation! The audience should expect more, but presenters should also demand more of themselves. We presenters need to give presentations the respect they deserve as vehicles of information and change. Hopefully we can do away with throwing together a few slides the night before with text that was copied directly from the report.”

In the past couple of years the American Evaluation Association has been making a big push to improve presentations with things like the potent presentations initiative. They even brought in a little fun with the Ignite presentations. You were at last year’s conference, did you notice any change in the quality of presentations?

“I am so excited by the attention that AEA has paid to this – I’ll admit that I jumped for joy when AEA announced the Data Visualization and Reporting TIG. They have really done an amazing job educating folks and providing resources. The changes I have seen people make so far have mostly focused on good slide design.

I admit that I’m concerned that we’re not paying enough attention to the structure of the story we tell – slide design and fancy infographics are one thing, but the narrative matters even more. I could stand in front of a black screen and hold you in rapt attention with no slides if I have a great narrative. We need to remember that knowing how to tell the story is key.”

We need to remember that knowing how to tell the story is key.

Personally, my favorite presenter is Hans Rosling (he ended a presentation about data by swallowing a sword!). Who is your favorite?

“Hans Rosling is definitely fabulous. I love Nancy Duarte for her storytelling prowess – her book Resonate was a turning point for me (she has a great TED talk). I love searching out new speakers with passion and a great story because it reminds me that you don’t have to be a professional speaker to give an incredible presentation! TED talks are my go-to place for this.”

Prior to the webinar, what else should we know about Kelci? You're an applied social psychologist; I took a few classes in undergrad, and they were always so much fun. And you're in Colorado: are you an avid outdoors kind of person? Do you ski?

“I have a significant nerdy side and love to teach myself new things. My husband and I have been known to spend our Friday nights coming up with neat Excel formulas. I am also an avid wet felter – a little known art form that involves creating textiles from raw wool. On the weekend I can generally be found felting handbags and wall hangings. If this whole evaluation thing doesn’t pan out, I will probably get a farm and raise alpacas. And yes, I love to hike the mountains of Colorado!”

More about Kelci Price

Kelci Price has 10 years of experience conducting evaluations and supporting organizations in their use of evaluation in strategy and learning. She currently serves as the Director of Research and Evaluation for the Colorado Health Foundation, a grant-making organization with over $2.3 billion in assets that seeks to make Colorado the healthiest state in the nation. Previously, Kelci served as an internal evaluator and Director of Data Program Management for the Chicago Public Schools, and as a senior evaluator with The Evaluation Center at the University of Colorado Denver. She holds a PhD in Applied Social Psychology.

Kelci has a passion for good communication in evaluation, having discovered early in her career that how information is presented is a critical component of whether evaluation findings will be used. Her focus is on ways to communicate evaluation findings so that stakeholders engage with the evaluation, understand the findings, and are inspired to take action. She has found that the key elements of good communication include considerations of content structure and visual communication.