Hadley, Bree Jamila, and Sandra Gattenhof. "Measurable Progress? Teaching Artsworkers to Assess and Articulate the Impact of Their Work." M/C Journal 14, no. 6 (22 November 2011). http://dx.doi.org/10.5204/mcj.433.
Abstract:
The National Cultural Policy Discussion Paper—drafted to assist the Australian Government in developing the first national Cultural Policy since Creative Nation nearly two decades ago—envisages a future in which arts, cultural and creative activities directly support the development of an inclusive, innovative and productive Australia. "The policy," it says, "will be based on an understanding that a creative nation produces a more inclusive society and a more expressive and confident citizenry by encouraging our ability to express, describe and share our diverse experiences—with each other and with the world" (Australian Government 3). Even a cursory reading of this Discussion Paper makes it clear that the question of impact—in aesthetic, cultural and economic terms—is central to the Government's agenda in developing a new Cultural Policy. Hand in hand with the notion of impact comes the process of measuring progress. The Discussion Paper notes that progress "must be measurable, and the Government will invest in ways to assess the impact that the National Cultural Policy has on society and the economy" (11). If progress must be measurable, this raises questions about what arts, cultural and creative workers do, whether it is worth it, and whether they could be doing it better. In effect, the Discussion Paper pushes artsworkers ever closer to a climate in which they have to be skilled not just at making work, but at making the impact of this work clear to stakeholders. The Government, in its plans for Australia's cultural future, is clearly most supportive of artsworkers who can do this, and of the scholars, educators and employers who can best train the artsworkers of the future to do so.

Teaching Artsworkers to Measure the Impact of Their Work: The Challenges

How do we train artsworkers to assess, measure and articulate the impact of what they do? How do we prepare them to work in a climate that will—as the National Cultural Policy Discussion Paper makes clear—emphasise measuring impact and communicating it across aesthetic, cultural and economic categories? For educators delivering training in this area, the Discussion Paper has made this already compelling question even more pressing as we work to develop the career-ready graduates the Government seeks. Our program, the Master of Creative Industries (Creative Production & Arts Management), offered in the Creative Industries Faculty at Queensland University of Technology in Brisbane, Australia, is, like most programs in arts and cultural management in the US, UK, Europe and Australia, a three-semester postgraduate program that allows students to develop the career-ready skills required to work as managers of arts, cultural or creative organisations. That we need to train our graduates to work not just as producers of plays, paintings or recordings, but as entrepreneurial arts advocates who can measure and articulate the value of their programs to others, is not news (Hadley, "Creating" 647-48; cf. Brkic; Ebewo and Sirayi; Beckerman; Sikes). Our program—which offers training in arts policy, management, marketing and budgeting followed by training in entrepreneurship and a practical project—is already structured around this necessity. The question of how to teach students this diverse skill set is, however, still a subject of debate; and the question of how to teach students to measure the impact of this work is even more difficult.
There is, of course, a body of literature on the impact, value and evaluation of arts, cultural and creative activities that has developed over the past decade, particularly through landmark reports like Matarasso's Use or Ornament? The Social Impact of Participation in the Arts (1997) and the RAND Corporation's Gifts of the Muse: Reframing the Debate about the Benefits of the Arts (2004). There are also emergent studies in an Australian context: Madden's "Cautionary Note" on using economic impact studies in the arts (2001); case studies on arts and wellbeing by consultancy firm Effective Change (2003); case studies by DCITA (2003); the Asia Pacific Journal of Arts and Cultural Management (2009) issue on "value"; and Australia Council publications on arts, culture and economy. As Richards has explained, "evaluation is basically a straightforward concept. E-value-ation = a process of enquiry that allows a judgment of amount, value or worth to be made" (99). What makes arts evaluation difficult is not the concept, but the measurement of intangible values—aesthetic quality, expression, engagement or experience.

In the literature, discussion has been plagued by debate about what is measured, what method is used, and whether subjective values can in fact be measured. Commentators note that in current practice, questions of value are still deferred because they are too difficult to measure (Bilton and Leary 52), discussed only in terms of economic measures such as market share or satisfaction which are statistically quantifiable (Belfiore and Bennett "Rethinking" 137), or addressed through unrigorous surveys that draw only ambiguous, subjective or selective responses (Merli 110). According to Belfiore and Bennett:

Public debate about the value of the arts thus comes to be dominated by what might best be termed the cult of the measurable; and, of course, it is those disciplines primarily concerned with measurement, namely, economics and statistics, which are looked upon to find the evidence that will finally prove why the arts are so important to individuals and societies. A corollary of this is that the humanities are of little use in this investigation. ("Rethinking" 137)

Accordingly, Ragsdale states:

Arts organizations [still] need to find a way to assess their progress in …making great art that matters to people—as evidenced, perhaps, by increased enthusiasm, frequency of attendance, the capacity and desire to talk or write about one's experience, or in some other way respond to the experience, the curiosity to learn about the art form and the ideas encountered, the depth of emotional response, the quality of the social connections made, and the expansion of one's aesthetics over time.

Commentators are still looking for a balanced approach (cf. Geursen and Rentschler; Falk and Dierking): one which evaluates aesthetic practices, business practices, audience response, and results for all parties in tandem; which evaluates intrinsic impacts, instrumental impacts, and the way each enables the other, with an emphasis not on the numbers but on whether we are getting better at what we are doing; and which, of course, allows evaluators of arts, cultural and creative activities to use creative arts methods—sketches, stories, bodily movements and relationships and so forth—to provide data to inform the assessment, so they can draw not just on statistical research methods but on arts, culture and humanities research methods.
Teaching Artsworkers to Measure the Impact of Their Work: Our Approach

As a result of this contested terrain, our method for training artsworkers to measure the impact of their programs has emerged not just from these debates—which tend to conclude by declaring the need for better methods without providing them—but from a research-teaching nexus in which our own trial-and-error work as consultants to arts, cultural and educational organisations looking to measure the impact of, or improve, their programs has taught us what is effective. Each of us has worked as a manager of professional associations such as Drama Australia and the Australasian Association for Theatre, Drama and Performance Studies (ADSA), as a member of boards or committees for arts organisations such as Youth Arts Queensland and Young People and the Arts Australia (YPAA), and as a consultant to major cultural organisations like the Queensland Performing Arts Centre and the Brisbane Festival. The methods for measuring impact we have developed via this work are based not just on surveys and statistics, but on our own practice as scholars and producers of culture—and are therefore based in arts, culture and humanities approaches. As scholars, we investigate the way marginalised groups, particularly groups marked by age, gender, race or ability, tell stories using community, contemporary and public space performance practices (cf. Hadley, "Bree"; Gattenhof).

What we have learned by bringing this sort of scholarly analysis into dialogue with a more systematised approach to articulating impact to government, stakeholders and sponsors is that there is no one-size-fits-all approach. What is needed, instead, is a toolkit, which incorporates central principles and stages, together with qualitative, quantitative and performative tools to track aesthetics, accessibility, inclusivity, capacity-building, creativity and so on, as appropriate on a case-by-case basis. Whatever the approach, it is critical that the data track the relationship between the experience the artists, audience or stakeholders anticipated the activity should have, the aspects of the activity that enabled that experience to emerge (or not), and the effect of that (or not) for the arts organisation, their artists, their partners, or their audiences. The combination of methods needs to be selected in consultation with the arts organisation, and the negotiations typically need to include detailed discussion of what should be evaluated (aesthetics, access, inclusivity, or capacity), when it should be evaluated (before, during or after), and how the results should be communicated (including the difference between evaluation for reporting purposes and evaluation for program improvement purposes, and the difference between evaluation and related processes like reflection, documentary-making, or market research).

Translating what we have learned through our cultural research and consultancy into a study package for students relies on an understanding of what they want from their study. This, typically, is practical, career-ready skills. Students want to produce their own arts, or produce other people's arts, and most have not imagined themselves participating in meta-level processes in which they argue the value of arts, cultural and creative activities (Hadley, "Creating" 652). Accordingly, most have not thought of themselves as researchers, using cultural research methods to create reports that inform how the Australian government values, supports, and services the arts.
The first step in teaching students to operate effectively as evaluators of arts, cultural and creative activities is, then, to re-orient their expectations to include this in their understanding of what artsworkers do, what skills artsworkers need, and where they deploy these skills. Simply handing over our own methods, as "the" methods, would not enable graduates to work effectively in a climate where one size will not fit all, and where methods for evaluating impact need to be negotiated afresh for each new context.

1. Understanding the Need for Evaluation: Cause and Effect

The first step in encouraging students to become effective evaluators is asking them to map their sector, the major stakeholders, the agendas, alignments and misalignments in what the various players are trying to achieve, and the programs, projects and products through which the players are trying to achieve it. This starting point is drawn from Program Theory—which, as Joon-Yee Kwok argues in her evaluation of the SPARK National Mentoring Program for Young and Emerging Artists (2010), is useful in evaluating cultural activities. The Program Theory approach starts with a flow chart that represents relationships between activities in a program, allowing evaluators to unpack some of the assumptions the program's producers have about which activities have which sorts of effect, and then test whether they are in fact having that effect (cf. Hall and Hall). It could, for example, start with a flow chart representing the relationship between a community arts policy, a community arts organisation, a community-devised show it is producing, and a blog it has created on the assumption that the blog will make the public more interested in the show the participants are creating; the evaluator can then unpack the assumptions about the sort of effect the blog is supposed to have, and test whether it is in fact having that effect. Masterclasses, conversations and debate with peers and industry professionals about the agendas, activities and assumptions underpinning programs in their sector allow students to look for elements that may be critical in their programs' ability to achieve (or not) an anticipated impact. In effect, they start asking about "the way things are done now, […] what things are done well, and […] what could be done better" (Australian Government 12).

2. Understanding the Nature of Evaluation: Purpose

Once students have been alerted to the need to look for cause-effect assumptions that can determine whether or not their program, project or product is effective, they are asked to consider what data they should be developing about this, why, and for whom. Are they evaluating a program to account to government, stakeholders and sponsors for the money they have spent? To improve the way it works? To use that information to develop innovative new programs in future? In other words, who is the audience? Being aware of the many possible purposes and audiences for evaluation information can allow students to be clear not just about what needs to be evaluated, but about the nature of the evaluation they will do—a largely statistical report, versus a narrative summary of experiences, emotions and effects—which may differ depending on the audience.
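As a minimal illustration of how a student might record a Program Theory style cause-effect map once they have drawn their flow chart and settled on a purpose and audience, the following Python sketch may help. It is our illustrative sketch only, not a formal part of the program's toolkit or of Kwok's method: the activity, assumed effect and indicators shown are hypothetical, and a real map would be negotiated with the organisation being evaluated.

```python
# A minimal, hypothetical sketch of a Program Theory style cause-effect map.
# The activity, assumed effect and indicators below are invented for illustration;
# a real map would be negotiated with the arts organisation being evaluated.

from dataclasses import dataclass, field


@dataclass
class Link:
    """One step in the assumed cause-effect chain of a program."""
    activity: str          # what the program does
    assumed_effect: str    # what the producers assume this achieves
    indicators: list[str]  # evidence that would show the effect occurring (or not)


@dataclass
class EvaluationPlan:
    program: str
    audience: str          # who the evaluation is for (funder, board, the team itself)
    purpose: str           # reporting, improvement, or innovation
    chain: list[Link] = field(default_factory=list)

    def report_outline(self) -> str:
        """List what will be tracked, and for whom, before any data is collected."""
        lines = [f"{self.program} ({self.purpose} evaluation for {self.audience})"]
        for link in self.chain:
            lines.append(f"- {link.activity} -> {link.assumed_effect}")
            for indicator in link.indicators:
                lines.append(f"    track: {indicator}")
        return "\n".join(lines)


# Hypothetical example loosely based on the community-devised show discussed above.
plan = EvaluationPlan(
    program="Community-devised show",
    audience="funding body",
    purpose="reporting",
    chain=[
        Link(
            activity="participants publish a rehearsal blog",
            assumed_effect="the public becomes more interested in the show",
            indicators=["blog comments and shares", "ticket enquiries citing the blog"],
        ),
    ],
)
print(plan.report_outline())
```

The point of such a map is simply that each assumed effect is written down next to the evidence that would confirm or disconfirm it, and next to the audience and purpose of the evaluation, before any data is gathered.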
3. Making Decisions about What to Evaluate: Priorities

When setting out to measure the impact of arts, cultural or creative activities, many people try to measure everything, measure for the purposes of reporting, improvement and development using the same methods, or gather a range of different sorts of data in the hope that something in it will answer questions about whether an activity is having the anticipated effect, and, if so, how. We ask students to be more selective, making strategic decisions about which anticipated effects of a program, project or product need to be evaluated, whether the evaluation is for reporting, improvement or innovation purposes, and what information stakeholders most require. In addition to the concept of collecting data about critical points where programs succeed or fail in achieving a desired effect, and different approaches for reporting, improvement or development, we ask students to think about the different categories of effect that may be more or less interesting to different stakeholders. This is not an exhaustive list, or a list of things every evaluation should measure. It is a tool to demonstrate to would-be evaluators points of focus that could be developed, depending on the stakeholders' priorities, the purpose of the evaluation, and the critical points at which desired effects need to occur to ensure success. Without such framing, evaluators are likely to end up with unusable data, which become a difficulty to deal with rather than a benefit for the artsworkers, arts organisations or stakeholders.

4. Methods for Evaluation: Process

To be effective, methods for collecting data about how arts, cultural or creative activities have (or fail to have) anticipated impact need to include conventional survey, interview and focus group style tools, as well as creative or performative tools such as discussion, documentation or observation. We encourage students to use creative practice to draw out people's experience of arts events—for example, observation; still image, video or audio documentation; or the facilitated development of sketches, stories or scenes about an experience can all be used to register and record people's feelings. These sorts of methods can capture what Mihaly Csikszentmihalyi calls the "flow" of experience (cf. Belfiore and Bennett, "Determinants" 232)—for example, photos of a festival space at hourly intervals, or the colours a child uses to convey memory of a performance, can capture the flow of movement, engagement and experience for spectators more clearly than statistics. These, together with conventional surveys or interviews that comment on the feelings expressed, allow for a combination of quantitative, qualitative and performative data to demonstrate impact. The approach becomes arts- and humanities-based, using arts methods to encourage people to talk, write or otherwise respond to their experience in terms of emotion, connection, community, or expansion of aesthetics. The evaluator still needs to draw out the meaning of the responses through content, text or discourse analysis, and teaching students how to do a content analysis of quantitative, qualitative and performative data is critical at this stage. When teaching students how to evaluate their data, our method encourages them not just to focus on the experience, or the effect of the experience, but on the relationship between the two—the things that act as "enablers" or "determinants" (White and Hede; Belfiore and Bennett, "Determinants" passim) of effect.
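Content analysis of mixed data can feel abstract to students, so a toy example sometimes helps. The sketch below is purely illustrative and is not a tool the program prescribes: the themes, keywords and responses are invented, and real content analysis is interpretive work rather than keyword counting. It simply shows how coded free-text or transcribed performative responses might be tallied so they can be read alongside conventional survey scores.

```python
# A toy illustration (not a prescribed tool) of coding transcribed responses
# against a small set of themes, so qualitative and performative data can be
# summarised alongside survey statistics. Themes and responses are invented.

from collections import Counter

THEMES = {
    "connection": ["together", "community", "shared", "belong"],
    "emotion": ["moved", "cried", "joy", "felt"],
    "aesthetics": ["beautiful", "image", "colour", "sound"],
}

responses = [
    "I felt part of something shared with strangers",        # audience interview
    "The colours and sound were beautiful",                   # child's drawing, transcribed
    "We cried, then talked about it for an hour afterwards",  # facilitated story circle
]


def code_responses(texts: list[str]) -> Counter:
    """Tally how many responses touch each theme (one count per response per theme)."""
    tally: Counter = Counter()
    for text in texts:
        lowered = text.lower()
        for theme, keywords in THEMES.items():
            if any(word in lowered for word in keywords):
                tally[theme] += 1
    return tally


print(code_responses(responses))
# Counter({'emotion': 2, 'connection': 1, 'aesthetics': 1})
```

In practice the coding frame, like the methods themselves, would be negotiated with the organisation and refined against the enablers and determinants the evaluation is tracking.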
This approach allows the evaluator to use a combination of conventional and creative methods to describe not just what effect an activity had, but, more critically, what enabled it to have that effect, providing a firmer platform for discussing the impact, and how it could be replicated, developed or deepened next time, than a list of effects and numbers of people who felt those effects alone.

5. Communicating Results: Politics

Often arts, cultural or creative organisations can be concerned about the image of their work an evaluation will create. The final step in our approach is therefore to alert students to the professional, political and ethical implications of evaluation. Students learn to share their knowledge with organisations, encouraging them to see the value of reporting both correct and incorrect assumptions about the impact of their activities as part of a continuous improvement process. Then we assist them in drawing the results of this sort of cultural research into planning, development and training documents which may assist the organisation in improving in the future. In effect, it is about encouraging organisations to take the Australian Government at its word when, in the National Cultural Policy Discussion Paper, it says that measuring impact is about measuring progress—what we do well, what we could do better, and how, not just success statistics about who is most successful—as it is this that will ultimately be most useful in creating an inclusive, innovative, productive Australia.

Teaching Artsworkers to Measure the Impact of Their Work: The Impact of Our Approach

What, then, is the impact of our training on graduates' ability to measure the impact of their work? Have we made measurable progress in our efforts to teach artsworkers to assess and articulate the impact of their work? The MCI (CP&AM) has been offered for three years, and our approach is still emergent and experimental. We have, though, identified a number of impacts of our work. First, our students are less fearful of becoming involved in measuring the value or impact of arts, cultural and creative programs. This is evidenced by the number who choose to do some sort of evaluation for their Major Project, a 15,000-word individual project or internship which concludes their degree. Of the 50 or so students who have reached the Major Project in three years—35 completed and 15 in planning for 2012—about a third have incorporated evaluation into their Major Project. This includes evaluation of sector, business or producing models (5), youth arts and youth arts mentorship programs (4), audience development programs (2), touring programs (4), and even other arts management training programs (1). Indeed, this work—aligned with the Government's interest in improving the training of young artists, touring, audience development, and economic development—has become one of the most popular Major Project options after internships in programming or producing roles. This has enabled students to work with a range of arts, cultural and creative organisations, and to share their training—their methods, their understanding of what their methods can measure, when, and how—with Industry. Second, this Industry-engaged training has helped graduates in securing employment.
This is evidenced by the fact that graduates have gone on to be employed by organisations they interned with as part of their Major Project, or by other organisations, including some of Brisbane's biggest cultural organisations—local and state government departments, Queensland Performing Arts Centre, Brisbane Festival, Metro Arts, Backbone Youth Arts, and Youth Arts Queensland, amongst others. Third, graduates' contribution to local organisations and industry has increased the profile of a relatively new program. This is evidenced by the fact that it enrols 40 to 50 new students a year across the Graduate Certificate / MCI (CP&AM) programs, typically two thirds domestic students and one third international students from Canada, Germany, France, Denmark, Norway and, of course, China. Indeed, some students are now disseminating this work globally, undertaking their Major Project as an internship or industry project with an organisation overseas.

In effect, our training's impact emerges not just from our research, or our training, but from the fact that our graduates disseminate our approach to a range of arts, cultural and creative organisations in a practical way. We have, as a result, expanded the audience for this approach, and the number of people and contexts via which it is being adapted and made useful. Whilst few of our students come into the program with a desire to do this sort of work, or even a working knowledge of the policy that informs it, on completion many consider it a viable part of their practice and career pathway. When they realise what they can achieve, and what it can mean to the organisations they work with, they do incorporate research, research consultancy and government roles into their career portfolio, and thus make a contribution to the strong cultural sector the Government envisages in the National Cultural Policy Discussion Paper. Our work as scholars, practitioners and educators has thus enabled us to take a long-term, processual and grassroots approach to reshaping agendas for this form of cultural research, as our practices are adopted and adapted by students and industry stakeholders. Given the challenges commentators have identified in creating and disseminating effective evaluation methods in the arts over the past decade, this, for us—though by no means work that is complete—does count as measurable progress.

References

Australian Government. "National Cultural Policy Discussion Paper." Department of Prime Minister and Cabinet – Office for the Arts, 2011. 1 Oct. 2011 ‹http://culture.arts.gov.au/discussion-paper›.
Beckerman, Gary. "Adventuring Arts Entrepreneurship Curricula in Higher Education: An Examination of Present Efforts, Obstacles, and Best Practices." The Journal of Arts Management, Law, and Society 37.2 (2007): 87-112.
Belfiore, Eleonora, and Oliver Bennett. "Determinants of Impact: Towards a Better Understanding of Encounters with the Arts." Cultural Trends 16.3 (2007): 225-75.
———. "Rethinking the Social Impacts of the Arts." International Journal of Cultural Policy 13.2 (2007): 135-51.
Bilton, Chris, and Ruth Leary. "What Can Managers Do for Creativity? Brokering Creativity in the Creative Industries." International Journal of Cultural Policy 8.1 (2002): 49-64.
Brkic, Aleksandar. "Teaching Arts Management: Where Did We Lose the Core Ideas?" Journal of Arts Management, Law and Society 38.4 (2009): 270-80.
Csikszentmihalyi, Mihaly. "A Systems Perspective on Creativity." Creative Management. Ed. Jane Henry. London: Sage, 2001. 11-26.
Ebewo, Patrick, and Mzo Sirayi. "The Concept of Arts/Cultural Management: A Critical Reflection." Journal of Arts Management, Law and Society 38.4 (2009): 281-95.
Effective Change and VicHealth. Creative Connections: Promoting Mental Health and Wellbeing through Community Arts Participation. 2003. 1 Oct. 2011 ‹http://www.vichealth.vic.gov.au/en/Publications/Social-connection/Creative-Connections.aspx›.
Effective Change. Evaluating Community Arts and Community Well Being. 2003. 1 Oct. 2011 ‹http://www.arts.vic.gov.au/Research_and_Resources/Resources/Evaluating_Community_Arts_and_Wellbeing›.
Falk, John H., and Lynn D. Dierking. "Re-Envisioning Success in the Cultural Sector." Cultural Trends 17.4 (2008): 233-46.
Gattenhof, Sandra. "Sandra Gattenhof." QUT ePrints Article Repository. Queensland University of Technology, 2011. 1 Oct. 2011 ‹http://eprints.qut.edu.au/view/person/Gattenhof,_Sandra.html›.
Geursen, Gus, and Ruth Rentschler. "Unravelling Cultural Value." The Journal of Arts Management, Law and Society 33.3 (2003): 196-210.
Hadley, Bree. "Bree Hadley." QUT ePrints Article Repository. Queensland University of Technology, 2011. 1 Oct. 2011 ‹http://eprints.qut.edu.au/view/person/Hadley,_Bree.html›.
———. "Creating Successful Cultural Brokers: The Pros and Cons of a Community of Practice Approach in Arts Management Education." Asia Pacific Journal of Arts and Cultural Management 8.1 (2011): 645-59.
Hall, Irene, and David Hall. Evaluation and Social Research: Introducing Small Scale Practice. London: Palgrave Macmillan, 2004.
Kwok, Joon-Yee. When Sparks Fly: Developing Formal Mentoring Programs for the Career Development of Young and Emerging Artists. Masters Thesis. Brisbane: Queensland University of Technology, 2010.
Madden, Christopher. "Using 'Economic' Impact Studies in Arts and Cultural Advocacy: A Cautionary Note." Media International Australia, Incorporating Culture & Policy 98 (2001): 161-78.
Matarasso, François. Use or Ornament? The Social Impact of Participation in the Arts. Bournes Green, Stroud: Comedia, 1997.
McCarthy, Kevin F., Elizabeth H. Ondaatje, Laura Zakaras, and Arthur Brooks. Gifts of the Muse: Reframing the Debate about the Benefits of the Arts. Santa Monica: RAND Corporation, 2004.
Merli, Paola. "Evaluating the Social Impact of Participation in Arts Activities." International Journal of Cultural Policy 8.1 (2002): 107-18.
Muir, Jan. The Regional Impact of Cultural Programs: Some Case Study Findings. Communications Research Unit - DCITA, 2003.
Ragsdale, Diane. "Keynote - Surviving the Culture Change." Australia Council Arts Marketing Summit. Australia Council for the Arts, 2008.
Richards, Alison. "Evaluation Approaches." Creative Collaboration: Artists and Communities. Melbourne: Victorian College of the Arts, University of Melbourne, 2006.
Sikes, Michael. "Higher Education Training in Arts Administration: A Millennial and Metaphoric Reappraisal." Journal of Arts Management, Law and Society 30.2 (2000): 91-101.
White, Tabitha, and Anne-Marie Hede. "Using Narrative Inquiry to Explore the Impact of Art on Individuals." Journal of Arts Management, Law, and Society 38.1 (2008): 19-35.