
So What? One Question Evaluations at the DMA


Almost all museums have programs for the public, in one form or another. These programs may fall under the fields of education, visitor engagement, or interpretation, and may take the form of drop-in workshops, ticketed lectures, in-gallery interactives, or scheduled tours. Even within a single museum, these programs are diverse in their scope, varied in the spaces where they take place, and wide-ranging in their scale. Furthermore, they differ in their intended outcomes, but more on that later.

Importance has been placed on growing different types of programs for the public, as these can garner more attention from the community, increase overall attendance, and develop audiences. Oftentimes, this growth is approached by expanding programmatic offerings: increasing lecture topics, presenting more classes and workshops, or providing additional interactives in galleries. In these instances, growth is measured by quantity: “Well yes, we serve our community; look how many programs we offer!” But I question whether this type of growth is positive for museums. Is expansion without reflection a good idea? I am inevitably reminded of the adage “Less is more,” and wonder how this concept can be manifested in museum education programs.

The process of reflection and evaluation is an often overlooked step in program creation. Activities are designed, implemented, and either repeated (perhaps next week, or next month) or archived (if they’re lucky) for posterity. But rarely does someone stop and ask, “So what?” This question, seemingly harsh and unforgiving to some, or unimportant to others, is an invaluable asset to evaluation in my mind. It is a question that harks back to my time as a graduate student in the Art Education department at The University of Texas at Austin. My thesis advisor and Assistant Chair of the department, Dr. Paul E. Bolin, situated this question as an integral part of all academic research. Students would approach him with research proposals and he would answer, “Well yes, that does sound like an interesting study of this-and-that at the so-and-so institution, but so what?” This simple question produced its fair share of frustration, but also some very fruitful discussion about the field of art education. It caused us to reflect on how our intended research would impact or advance the discipline. It wasn’t enough for the topic to simply garner interest; it had to have purpose and intention, and it needed to be able to stand on its own and answer, “This is why I exist.” Now, I am not proposing that every public program initiated by a museum or art institution should be held to the bar of furthering the field of art education (this would be rather difficult), but I do feel that every program should be accountable in providing a significant answer to this critical question.

As a department, the DMA’s Education team has spent many hours creating a mission statement that encapsulates our departmental practice, inevitably answering the So What question for ourselves. And during this process we have hit upon some key concepts that fuel our programs, the chief one being engagement. As educators, we aim to broaden and deepen engagement, and recognize that the DMA can influence the depth of one’s engagement, not by pitching more programs at our community, but by facilitating meaningful experiences in our current educational endeavors. But how can we know that these experiences are meaningful, unless we ask?

Our Education team is undertaking the evaluation process by implementing a series of one question studies that aim to pinpoint specific queries we as educators have about our public programs. The restricted format of this evaluation exercise is key, because evaluation can be a daunting task if approached too broadly. The one question design ensures that we concentrate on a single point—one program or interactive, one outcome, one bit of information that is important for us to determine. Just as the questions vary, so too do the methods of collecting data from visitors, ranging from written surveys to a post-it note response wall to a voting system using colorful pony beads. Below are preliminary looks into two different one question case studies we’ve begun.

CASE STUDY #1: YOUNG LEARNERS GALLERY (contributed by Jessica Fuentes)


The Young Learners Gallery, within the Center for Creative Connections (C3), is a space designed for children ages 5-8 and their caregivers. Over the past three years while much of C3 has changed—with the introduction of new artworks, art-making materials and gallery interactives—the Young Learners Gallery has gone untouched, because making changes to that space requires a complete redesign. Before undertaking such a task, the C3 staff want to learn more about families’ anticipated and actual experiences at the Museum.

We started with the prompt, “I bring my child(ren) to the Dallas Museum of Art because…” posted on a wall in the Young Learners Gallery near a small table equipped with post-it notes and pencils. Unlike a survey, this method allows for open-ended responses that can later be categorized and analyzed while retaining the individual visitor’s voice. This analog system has been brought into the 21st century through the development of the Post-It Plus app. With this app, instead of sifting through responses and later transcribing them in digital form, we can simply photograph the post-it notes and organize the digitized notes on a virtual board. The board can then be exported in a variety of formats, including PowerPoint, Excel, and PDF.

We posted our question for a month and received 107 responses. The responses ranged from children’s drawings to eloquent statements expressing a desire to expose children to a broader world view. Because the purpose of this question was to gauge the caregiver’s motivations, we set aside the 26 children’s responses and the 4 irrelevant responses; however, we plan to use future questions to gather children’s input as well. The top three categories for why caregivers bring their child(ren) to the DMA are to get inspiration or foster creativity, to provide exposure to different cultures or broaden their world view, and because it’s fun.
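
Purely to illustrate what that categorizing and counting step can look like once the notes are digitized, here is a minimal Python sketch. Everything in it is an assumption made for illustration: the file name, the column name, and the keyword lists. In practice our categories were developed by reading the responses themselves, not by keyword matching.

```python
import csv
from collections import Counter

# Hypothetical keyword lists; the real categories were assigned by hand.
CATEGORIES = {
    "inspiration / creativity": ["inspir", "creativ", "imagin"],
    "cultures / world view": ["culture", "world"],
    "fun": ["fun", "play", "enjoy"],
}

def categorize(text):
    """Return the first category whose keywords appear in a response."""
    lowered = text.lower()
    for label, keywords in CATEGORIES.items():
        if any(word in lowered for word in keywords):
            return label
    return "other"

# "responses.csv" and its "response" column are assumptions about the export.
with open("responses.csv", newline="") as f:
    responses = [row["response"] for row in csv.DictReader(f)]

tally = Counter(categorize(r) for r in responses)
for label, count in tally.most_common():
    print(f"{label}: {count} of {len(responses)} responses")
```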

As we plan the new Young Learners Gallery, we are keeping these findings in mind. For example, we have decided to include works of art in the space, which is currently an activity area. The works of art selected will have a strong emphasis on culture and creativity. We also plan to create hands-on activities that address the developmental milestones of children aged 5-8 and provide opportunities for children and their caregivers to play, draw, and talk together about the works of art in front of them.

Now that we understand why caregivers bring their children to the DMA, we are in the process of posing more questions to learn what families actually do in the Young Learners Gallery. Understanding both expectations and experiences will help us develop a space that will meet a wider spectrum of caregivers’ and children’s needs.

CASE STUDY #2: ART TO GO FAMILY TOTE BAGS

The DMA first offered activity-filled tote bags to families around this time last year, premiering during our January 2013 Late Night event. Each tote bag contains a variety of activities that encourage families to write, talk, play, or make while exploring the galleries together, the idea being to have fun with the art as a family. Since their public introduction last year, our Art To Go Family tote bags have grown to include many different themes: Senses, to help explore art through the five senses; Color, to explore art while thinking about colors; Family Fun, with activities designed by a family who frequently visits the Museum; and Arturo’s Library totes, designed for children under five years of age, which focus on a single work of art with an accompanying book and hands-on activity. These tote bags are available for check-out at our Family Fun Cart, located at the main entrance to the Museum, and are free for families to use anytime the Museum is open.

Initially, the Family & Access programs staff sought feedback on the tote bags through individual paper surveys, presented to families once they returned tote bags to the Family Fun Cart. We found that very few of these surveys were returned, or even taken in the first place. During busy times at the Museum, those who coordinated checking bags in and out rarely had time to hand out this extra survey to families, who, for their part, were usually rushing to leave the Museum and rarely could spend extra time answering a two-page questionnaire. In this case, the one question evaluation was ideal because of its simplicity, accessibility, and directness. Unlike the Young Learners Gallery project described above, for the tote bags we were not looking for open-ended answers, at least not yet. While it is absolutely valuable to know whether visitors feel that the tote bags encourage a playful attitude toward looking at art, or whether they increase visitors’ confidence in looking at art with children, that information is best gathered in a second stage of evaluation, when we examine the specific effects of each activity and decide whether it should be modified. At this still early stage in the life of the tote bags, our team is interested in the simple question of whether families are indeed using the tote bags during their visit, and where in the Museum they are being used. (This is our So What question.)

We designed our one question evaluation as a multiple choice prompt, which was added to the tote bag check-out sheet. We asked, “On Which Floor Did You Use the Tote Bag?” and invited visitors to check the box next to each area in which they used the bags—Level 1, Level 2, Level 3, Level 4—or, if they didn’t in fact use the activities in the bag, to check Did Not Use. Our hope was that including our one question evaluation on the check-out sheet, something that families were already using and thus familiar with, would increase the amount of feedback we received.

The updated check-out sheet has been in rotation for two months, and we have collected data for November and December 2014. On average, 52% of the people who checked out tote bags during that time responded to our evaluation, and of that total only 6% did not use the tote bag activities at all. While in the Museum, respondents said they preferred to use the tote bags most on Level 1 (28%) and least on Level 4 (16%). Is this choice based on physical access (Level 1 is the same level on which the tote bags are offered, while Level 4 is farthest away) or related to the works of art available on each level (Level 1 is Contemporary art, Level 4 is American)? Now that we are beginning to better understand how much the tote bags are being utilized by visitors, and where in the Museum galleries they are being taken, we can start to pose these kinds of ancillary questions that tap into deeper inquiries about visitor engagement.
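
For readers curious about the arithmetic behind figures like these, here is a minimal sketch using made-up counts. The actual November and December tallies, and the exact denominators behind each percentage, are not published here, so every number in the code is an assumption and the output will not match the figures above exactly; it only shows how such percentages are derived from a stack of check-out sheets.

```python
# Hypothetical raw counts, used only to show how a response rate, a
# "Did Not Use" share, and per-level preferences would be computed.
checkouts = 200            # tote bags checked out over the period (assumed)
completed_sheets = 104     # sheets with the one question answered (assumed)

floor_boxes = {            # boxes checked; visitors may check several (assumed)
    "Level 1": 70,
    "Level 2": 65,
    "Level 3": 55,
    "Level 4": 40,
}
did_not_use = 6            # respondents who checked "Did Not Use" (assumed)

print(f"Response rate: {completed_sheets / checkouts:.0%}")
print(f"Did not use the activities: {did_not_use / completed_sheets:.0%}")

total_boxes = sum(floor_boxes.values())
for level, count in floor_boxes.items():
    print(f"{level}: {count / total_boxes:.0%} of all floors marked")
```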


The insight provided by these two studies hopefully demonstrates the importance of evaluation to program development and growth. These are just preliminary looks into initial studies, and we hope to have more one question studies, as well as more data, on the horizon. In order to increase the effectiveness of our programs and spaces, museum educators need the input of our audiences to better understand the level and scope of their engagement. Reflection and evaluation, whether in the style of these one question studies or in other formats, can facilitate this exchange of ideas in a positive and productive manner, providing a strong foundation for educators to answer the So What question for themselves, their institutions, and their communities.

Danielle Schulz
Teaching Specialist

