GPSA Knowledge Platform

Are we Really Learning? Making Grant-Making Practices more Conducive to Grantee Learning


This topic contains 53 replies, has 24 voices, and was last updated by Lucia 2 years, 3 months ago.

  • Author
    Posts
  • #4045
    Brendan
    @brendan-halloran
    United States

    Hi everyone and welcome to this E-forum entitled “Are we really learning? Making grant-making practices more conducive to grantee learning”. I’m Brendan Halloran, from the Transparency and Accountability Initiative, and I will be co-facilitating this forum along with Charlotte Ornemark from the GPSA.

    As the title makes clear, this forum will focus on how to strengthen organizational learning, particularly looking at the kind of funding systems and practices that can encourage and support learning. There is increasing consensus that the complexity and uncertainty of working for improved services, governance reform, social change, or equality and rights necessitates flexible and adaptive approaches that are based on continuous learning. Funders are increasingly asking grant recipients to prioritize and demonstrate learning. However, too often external funding can force organizations towards rigid and linear projects, with little scope for adaptation and little incentive (or support) to concentrate on learning. Both funders and practitioners need to get better at encouraging and implementing real organizational learning.

    In this forum we hope to explore organizational learning and how it can be best supported. We will draw on a recent study that delves into these issues and suggests important lessons. Hopefully many of you will have had the chance to review the document and/or join the recent webinar featuring Jenny Ross, the author of the report. If not, please take a look at these resources when you are able:

    Report on supporting learning: http://www.transparency-initiative.org/news/funding-learning-and-impact-how-do-grant-making-practices-help-and-hinder-real-grantee-learning.
    Webinar on supporting learning: http://gpsaknowledge.org/events/funding-learning-and-impact-how-do-grant-making-practices-help-and-hinder-real-grantee-learning/.

    To get the forum started, we would love to hear from you about how you have embedded learning in your work in new and innovative ways. Here are a couple of questions that we would like you to reflect upon and share some of your practices and experiences:
    • What are some examples of innovative learning practices you have been a part of in your organization?
    • Why (and how) did this learning get prioritized and implemented?
    • What kind of learning did it enable? (e.g. improved activities/outputs, revised theory of change, better understanding of changing context, an organizational culture that encourages learning).

    Please share your perspectives related to the above questions or any other thoughts you have about the importance of organizational learning and how learning can become part of organizational practice and culture.

  • #4050
    Scott
    @scott
    United States

    Hi everyone–

    It’s a pleasure to participate in this highly germane conversation. As with many of you, I have experience on both sides of the coin – working for donors and for organizations implementing donor-funded projects. Currently one of the hats I wear is that of a GPSA capacity-building advisor to grantees in Europe and Central Asia.

    As we know, too often the incentives structuring the relationship between donors and grantees inadvertently undermine the prospect for adaptive learning. Donors are accountable to someone (their bosses, taxpayers, parliaments, etc.) for bringing about the changes promised in their grant agreements. Grantees are accountable to their donors for delivering what they promised in their proposals. Once grant agreements are signed the incentive for introducing changes, assuming new risks, and learning from failure is generally low.

    This shouldn’t be the case. Anyone who’s worked in the field for some time knows how nonlinear development work is. Externally, conditions are constantly in flux. Within projects, each activity generates learning that can be ingrained into subsequent activities, or can point to the need for substantive rethinking of project assumptions or goals.

    GPSA is working on ways to encourage adaptive learning and has published a discussion note on the topic. It specifically prods prospective grantees to elaborate how their past learning has shaped the articulation of their current proposals. Additionally, it offers multiyear grants partly to ensure sufficient time for learning and adaptation.

    GPSA has also been conveying to each of its grantees the importance it places on adaptive learning. Sometimes grantees find this hard to believe. They can be skeptical whether this is genuinely the case or some form of lip service. Often they are unaccustomed to a donor pushing for much more than pre-agreed outputs and outcomes. But emphasizing adaptive learning has recently led, for example, to a refining of methodological tools and approaches in Moldova and a reconsideration of certain project goals in Kyrgyzstan. Other projects, with support from GPSA, are in the process of analyzing their efforts to identify important learning and how that new knowledge can feed into their work going forward.

    There are various ways to foster increased adaptive learning. The people who work most intensely on a project are obviously those who know it best. But sometimes that closeness can also hinder their ability to assess developments from a healthy (and unemotional) distance. Projects that include regular (annual or biannual) reflection and learning meetings, ideally facilitated by a trusted external person, may be more likely to grasp the larger meaning of things and identify useful improvements or strategic shifts. This is something I encourage, for instance, with all the grantees I work with. Knowing that the donor is receptive to project changes, if based on clear learning and analysis, also creates an enabling environment for that learning to take place.

    One of GPSA’s roles could be to carefully assess how it encourages adaptive learning, how the grantees perceive this approach, and, over time, how it impacts the efficacy of its interventions.

    I look forward to a lot of learning from this exchange of ideas.

    Scott

  • #4051
    Janet Oropeza
    @janet-oropeza

    Hi! This is Janet from Fundar, Mexico. I would like to share our experience with learning. I would say that adopting a learning culture within a civil society organization, even a consolidated one, is a long process that requires cultural change. Due to some structural problems with funding, such as short-term grants (one or two years) and the need to report outcomes on a yearly basis, there is not much time left to learn or reflect; most time is spent on implementing. However, within Fundar’s Knowledge Exchange Area, the team I work for, we have tried to learn in various ways.

    First, every semester or year we assess some of our projects. We do this not only for M&E purposes, but to extract lessons (what went well, what did not, and why). Based on that, we decide what changes need to be implemented or what capacities we need to build within the team. We have found that some of our donors have been very flexible when we have shared with them the need to adjust some activities, or even objectives, as a result of our learning.

    Second, we have started to include in our proposals an output related to learning. For example, in a recent knowledge exchange project we got funding for, we included as an output a report on the learning we collect as we implement the project.

    Third, our area works with Fundar’s researchers from all areas and helps them systematize lessons learned from their projects. We have developed a methodology for getting them to reflect and identify lessons in a facilitated session. The lessons that come out of that session are then put on paper and become a brief.

    Regarding the first two strategies, we have seen openness from donors to adjust activities or outcomes as a result of our learning efforts. However, the need to report on a yearly basis and demonstrate outcomes and impacts periodically still constrains our efforts to allocate enough time to learn.

  • #4053
    Alan
    @alanhudson
    USA - but UK really

    Great to see the issue of learning – adaptive learning – increasingly part of the governance agenda and, including through “Doing Development Differently” and related endeavours, the wider development agenda.

    Thanks to GPSA for hosting this discussion, and to the Hewlett Foundation, T/AI and INTRAC for their contributions to what I think is an excellent report (although I thought it focused a bit too much on how grantees can do a better job of letting funders know “what works”, rather than on how grantees can support learning that informs action and makes a more direct difference at country level).

    I’m currently leading the process of reviewing and revising Global Integrity’s overall strategy. We’re heading in the direction of a focus on Adaptive Learning and Open Governance. Here’s a recent short summary – comments welcome!

    https://docs.google.com/document/d/1E5IPEBl1KYKHbIF6c7b6j7ZYTc9cm4WIGM_xqZRzv-0/edit#

    We’re very keen to be part of the conversation both about how organizations can support external adaptive learning, and how organizations can implement adaptive learning to inform their own behaviour.

    It would be great to share and document experience. This note from the GPSA is well worth a look:

    https://www.thegpsa.org/sa/Data/gpsa/files/field/documents/gpsa_note_5-adaptive_learning.pdf

    As part of our strategy process, I’ve just put together a first go at a table contrasting simple and adaptive learning. I’m trying to define an approach to learning that Global Integrity will take and apply through its various projects. Comments very welcome!

    https://docs.google.com/document/d/1wKWG2YSMBtD5ghnp85KB1tTxTTN9Z4gVq1Xcj-ala7o/edit

    Looking forward to learning about learning together!

    best wishes,
    Alan

  • #4054
    Brendan
    @brendan-halloran
    United States

    Thanks for the important contributions Scott, Janet and Alan! And for breaking the ice :)

    A couple of quick points raised by these posts:

    1. The importance of reflection, but also the need for time, space, and often a (trusted) critical friend to help provide new perspectives. Reflection can inform practices and strategies, and guide internal capacity building efforts, but documenting these reflections and lessons learned is critical.

    2. Not all learning is created equal. Implementing ‘simple’ learning practices that are weakly connected to informing practice, within an organizational culture that does not prioritize learning, will probably not improve organizational effectiveness. But implementing real learning (and shifting organizational culture) is a long road; leadership is required, as Alan is demonstrating!

    To add another resource to the conversation, check out this short video about social learning from learning gurus Bev and Etienne Wenger-Trayner: https://www.youtube.com/watch?v=qvighN3BDmI

    Great start and please join the conversation with more experiences about putting learning into practice!

  • #4055
    Saad
    @saad
    Morocco - USA

    Hi everyone, I really appreciate this initiative and the level of discussions and exchange on this topic. I would like to see some grantees contribute with their own experiences on this.

    In my past roles I have experienced some of these aspects from both perspectives: as a recipient of funding, and as a reviewer/contributor helping to select grantees and monitor their implementation of projects.

    I think it’s very important to recognize that this is a relatively new, openly stated strategy: a shift in the “paradigm” of grant making where we discuss the learning component openly and put “adaptive learning” on the table as one of the central components and objectives of funded projects. In my prior experiences with a range of international NGOs and UN agencies, and also in the field in developing countries, the learning component is not developed and presented as a central component. This is something that has to become transverse at all levels and be very clear from the beginning. It will definitely require a shift in how funding organizations and grant recipients perceive each other, and a serious investment in trust, incentives and continuous dialogue between stakeholders.
    The role of the “critical friend” that Scott and Brendan refer to above becomes central in the design of the project, especially when differences of language, culture and context add complexity to the interaction between the funding organization and the grant recipient. It is key to ensure a continuous exchange of information and the possibility of changing the course of the project when needed. The majority of existing frameworks, based on a logical framework/matrix developed ex ante of project implementation, considerably reduce the possibilities for adaptation.

    On this, I think the GPSA is very innovative in making knowledge and learning a central component of projects, in addition to monitoring and evaluation and capacity building. We also have to recognize that this is not achievable in the short term by small organizations, because it takes time and resources to redesign the grant-making architecture and internalize the learning component at all levels. On the other hand, it takes time and energy to make grantees understand that there is space for experimentation and that they should talk openly about what works and what doesn’t. The culture of “best practices” that has been cultivated for so many decades works against this new orientation, which encourages us to accept mistakes and work collaboratively to learn from those experiences to advance the whole system.

    I look forward to continuing to learn from your experiences, and I will invite the GPSA grantees I work with to bring their experiences and share their ideas here.

    Thanks a lot for this initiative.

  • #4056

    Hi everyone. Governance work is really challenging, and very different from any other programmatic approach. The power structure changes when elected bodies change with a new political set-up, and government officials also transfer from one place to another. When we facilitate accountability processes that bring these people together with community leaders, change happens in service delivery and resource allocation to target citizens. In Bangladesh, national elections and local government elections occur about 2.5 years apart, so the electoral power structure between national and local government shifts in between. Though local government is formally politically neutral, it has political backing. So if political identities are aligned between national and local government, then service delivery and initiating accountability are easier. Since public authorities, political leaders, and local government councils change within a short time, the accountability processes that a project brings are confronted with the challenge of sustainability. The impacts depend on how the power structure changes.
    Usually, civil society organizations adopt tested and proven strategies in project design, but within a dynamic power structure these cannot produce uniform changes. We capture learning when a project activity or intervention ends: we do process reconstruction, looking at what went well, what did not, and why. We also engage consultants to provide a third-party evaluation of the model, how it contributes to social change, and how well it works in the change process. We also study the local context at intervals to see how social change has happened. So we have a learning strategy within projects, and also within the organization, and we adopt changes in our programs and projects. But the impact we desire depends on how conducive the context is during the project period in governance work.

  • #4065
    Brendan
    @brendan-halloran
    United States

    Thanks for the great contributions to this conversation thus far. We will soon turn to a new set of questions related to learning, but first I want to try to summarize some of the key points made thus far.

    Murad bin Aziz reminded us why learning is important to begin with: advancing responsive and accountable governance is inherently political, and politics is dynamic, shifting through formal and informal processes. If we aren’t learning about these contextual changes, as well as the effectiveness of our approaches, we will be implementing strategies that are not adapted to reality on the ground.

    But if learning is important, the question remains: what does learning look like? What kinds of learning are most useful? Alan Hudson usefully contrasts ‘simple’, isolated learning practices that are weakly connected to informing strategy and practice, with a more embedded culture of learning that is tightly linked to how an organization does its work in a highly adaptive way. There is a danger that the new emphasis on learning will incentivize externally-funded organizations to implement lots of learning activities, but in ways that don’t really add up to meaningful changes in how the organization approaches its work.

    However, pursuing an organizational culture and orientation around learning is a long and challenging road. Janet Oropeza outlined some of the ways that Fundar is seeking to implement learning, highlighting both the need to carve out spaces for reflection and the need to capture those insights through documentation. She mentioned that while donors have been open to adaptation based on learning, the overall structure of funding is still based on short-term time frames for reporting and implementation, incentivizing organizations to focus on outputs rather than learning and adaptation.

    Thus, Scott and Saad emphasized the need for funders to support an enabling environment for learning. This recognizes that the current ways of doing business are deeply entrenched, and once money has been invested in an organization, the incentives – on both sides – to take learning seriously, and to use it to adapt strategies and activities, are low. Funders cannot simply say ‘now add learning’ within the context of otherwise rigid funding, M&E, reporting and other requirements. What they may get is lots of ‘simple’ learning. Fortunately, donors are shifting some of their practices (as highlighted by the report by Jenny Ross) and investing in spaces for dialogue, reflection, ‘critical friends’ and other ways of challenging the status quo and shifting towards organizational cultures and systems that encourage learning. This needs to be matched by longer-term, flexible, core funding if an enabling environment for learning organizations is to be strengthened.

    This is a long road, and grantee organizations are right to be wary of learning as the newest development ‘fad’. But if this is really the ‘new normal’, and there can be further discussion about what shifts are required by both funders and organizations to really prioritize learning (not just accumulating lots of ‘simple’ learning practices that don’t really change how an organization works), then we are moving in a promising direction. I look forward to your further insights in this discussion.

  • #4090
    Abhijit
    @adaschsj
    India

    Hello Everyone,
    This is an interesting question, and from my perspective it needs to integrate three different domains: the field of social accountability, the issue of learning, and the priorities of funders. I have been a grounded practitioner for over 25 years and will make my observations from that vantage point. I bring the perspective of a grantee exposed to the vagaries of shifting donor priorities, and of a southern practitioner balancing global evidence reviews, meta-analyses and best practice against community reality and personal insight derived from practice.
    Social accountability is a field which is becoming popular, and as it does, the number of players involved in implementation is increasing. The ‘de jure’ implementer is often a large NGO (sometimes an INGO bringing in international expertise) or even the government (at least in India); then there are one or two layers of subcontracted ‘de facto’ implementer organisations; and finally there are the community leaders and citizens who, we expect, are at the cutting edge of the negotiation process.
    If we now move to the field of learning, I am not exactly sure who ‘we’ is when we say “are we really learning?” If we look at the second proposition – making grant-making practices conducive to grantee learning – the focus moves to the grantee. The grantee again is a many-layered institution with desk-based managers, field supervisors and field facilitators who work with the community – which, strictly speaking, doesn’t form part of the ‘grantee’ organogram. In such situations most of the learning, or should I say capacity building, moves from the top down. Formal knowledge production and dissemination mechanisms (formal being what is accepted as rigorous and ‘scientific’ by the grant-making community) are specialised functions requiring experts who can produce a ‘review of literature’ or ‘summary of evidence’, which is accessible only to those in certain places and with access to certain languages and technologies.
    However, social accountability is a dynamic new field which entails understanding and negotiating power relationships at the community level. Many among the poor and marginalised intuitively navigate these complex fields as a matter of daily survival, and social accountability processes provide a somewhat formal framework for these negotiations in a structured manner. There are many innovations taking place at this cutting edge of practice, and many lessons to be learned, but the challenge is how do we do so? Many innovative community leaders and facilitators are excellent speakers but very reluctant to write down their experiences. Often they are most conversant in a language which has little international resonance. Finally, there is the challenge of getting such experiences recognised as credible ‘knowledge’ in the absence of comparison groups and structured information gathering and analysis by ‘experts’.
    When we refer to ‘grantee learning’, do we mean learning by the leadership of the grantee organisation, who can synthesise ‘state of the art’ knowledge for effective application downstream through elaborate logic models? Or do we refer to the ability of grantee organisations to turn the results of their own experience into papers in peer-reviewed journals? I wonder whether the creation of a lower-level (closest to the community) learning loop is a high priority for many higher-level (national or international NGO) organisations. I am possibly sounding cynical, but that is the result of years of closely observing donor–grantee–community relationships in which top-down capacity building and extractive learning processes do not sufficiently enrich the bottom of the pyramid.
    COPASAH is a Community of Practitioners on Accountability and Social Action in Health (www.copasah.net), and among its members are those working closely with communities, building capacity and voice to engage assertively with health systems. One of the things we have been trying to do is stimulate a process of learning and sharing that builds upon contextually relevant, grounded experience. I will briefly summarise some of the steps we have taken and some challenges that continue.
    Face-to-face meeting, talking and sharing are powerful ways of learning, especially for those who share a language and are not the most adept at manipulating the written word. However, distances and even the diversity of languages within a country pose a challenge. We started using face-to-face meetings along with internet-based communication methods like listservs and web platforms for facilitated thematic discussions among practitioners, much like this platform itself. In India we have started a process of facilitated learning exchanges in which a group of practitioners visit each other’s work areas for a field-visit-based learning interaction. We have conducted eight such events in the last year and a half and have reached out to over a hundred and fifty practitioners who are now part of a group that communicates more actively with each other over the internet. We are encouraging practitioners to tell their stories through pictures, and the current (10th) edition of the COPASAH newsletter (http://issuu.com/copasah6/docs/copasah_communiqu___10th_edition) is exclusively based on such photo-stories. Each of these processes requires patient persuasion, but we feel this encouragement is necessary to provide an opportunity for grassroots-based knowledge to enter some kind of formal knowledge space. We are currently trying to develop a multilingual communication platform, not based on machine translation, which will allow people across different language competencies to communicate with each other in the South Asian region.
    As we move ahead with this process, because of our conviction, we are also facing some challenges, both anticipated and unanticipated. The first is to stimulate what we are calling the ‘story telling’ process, where grassroots practitioners are encouraged to put out their stories in written, oral or pictorial form. This requires much more persuasion than we imagined. The process of discussing the stories or case studies requires courage from the practitioners, who open their field areas to enquiry during the facilitated learning exchange (FLE) process; we have been humbled by the openness of practitioners to feedback during these visits. The entire process requires much more facilitation and persistent encouragement than we had initially anticipated. Grassroots practitioners are far less open in their communication on ICT platforms than we expected, considering the rapid penetration of internet and mobile telephony in a place like India. At the other end, we are struggling with the challenge of finding ways to create some ‘credible knowledge’ from such community-based practices. For example, will photo-stories and case studies in internet-based newsletters be considered valid evidence in a field which reifies the expert-led randomised control trial?
    Here I would like to come to the field of grant-making and address what I think are some of its current anxieties. The globalisation of development aspirations through processes like the MDGs (and proposed SDGs) has been an excellent means of trying to reduce global disparities and aim at some common aspiration for peoples across different realities. However, it has also brought with it the anxieties of scalability and sustainability, which seem to be at the core of all grant-making concerns. Everyone is keen to see demonstrated results, within reasonably short timeframes, of solutions which can be scaled up. This approach necessarily favours groups with the demonstrated capacity to conduct intervention research and publish it in journals. The emphasis is on finding that ‘intervention’ which works across different contexts, and I fear different social accountability ‘tools’ are often viewed in that manner. However, social accountability processes fundamentally call for a shift in the locus of power from state ‘absolutism’ to shared control. It is fundamentally a political process, comprising a complex set of social interactions taking place within a political context – some of it long-standing, but some of which can change with the presence or absence of one powerful individual. While there are some predictabilities in the overall process, there are many uncertainties as well. My question for donors is whether they are willing to invest in a longer-term learning cycle, because there is no certainty that long-standing power relations between providers and citizens can be substantively changed within a three-year project cycle. And then what do we know about the continuity of this process? Often sustainability is assumed to be a steady-state output/outcome of a development intervention, but my long engagement with communities and groups has taught me that the very essence of society is change. A successful intervention is one which introduces a social process that keeps adapting to change in a way in which the core values introduced by the intervention abide. This is neither easy nor impossible, but it requires careful observation and probably continued tending to guard against dissipation. Are grant-makers willing to invest in these longer-term maintenance and learning functions?
    One way that we at CHSJ have tried to continue this long-term learning function has been by maintaining relationships and continuing collaborations with our partners. This is possible because we are a country-based resource organisation with a pool of partners with whom we have multi-dimensional and long-term relationships. It might have been difficult if we were a ‘prime’ in a large grant working with many ‘subs’ (subsidiary implementers).
    I would like to close my submission with a few suggestions for grant-makers and the larger ‘grantees’ and that is to invest in developing a learning community.
    • Theory-driven implementation models with ongoing review which includes implementers across the chain, from grassroots implementer to project manager
    • Multiple-language, cross-contextual communication platforms, using ICT platforms while being cognizant of existing digital divides
    • Capacity building for grassroots practitioners in developing and analysing their own stories/case studies; the stories should also reflect upon the programme theory and the changes/lessons drawn during implementation.
    • Praxis based learning opportunities for practitioners; opportunity for interaction between grassroots practitioners, project managers and academics in a spirit of mutual learning.
    • Building long-term in-country, regional and global relationships/alliances among implementers at different levels which link learning with advocacy, since social accountability is fundamentally a form of advocacy action. Implementers who do not have a core interest in the advocacy action will not drive the process with the ‘passion’ which is a hallmark of political change processes.
    Much of what I am suggesting will lead to field building, and may not be possible through individual situation-specific grants. However, since this is a platform looking for creative solutions, I don’t think it is inappropriate. I look forward to your feedback,
    With best wishes,
    Abhijit

    • #4123
      Brendan
      @brendan-halloran
      United States

      In response to the earlier comment posted by Dr. Das:

      Very valid and insightful points. Just a few quick follow-up thoughts, emphasizing some specific things you have proposed in this post.

      Thank you for highlighting the need to be much more specific and inclusive in our discussion of learning. You raise key questions: learning for whom? Just upwards, to donors? Just for large/formal/international NGOs? What about learning for grassroots organizations, frontline facilitators and citizen leaders engaged directly in social accountability processes?

      As you note, this suggests a related question about what kind of learning ‘counts’. Does learning just mean formal research or impact evaluations? These can be very useful tools to develop meaningful insights for both funders and practitioners. However, they are often inaccessible (and unintelligible) to the vast majority of social accountability practitioners. Worse, they are often used selectively by decision makers to justify pre-existing or simplistic ideas about ‘what works’. Indeed, the ‘what works’ question is inherently problematic, as it too often suggests a ‘magic bullet’ mentality wedded to projects/interventions, rather than longer-term campaigns or other forms of engagement.

      Some may be familiar with Jonathan Fox’s reframing of the evidence of social accountability (available in several languages here: http://gpsaknowledge.org/knowledge-repository/social-accountability-what-does-the-evidence-really-say-2/#.VX_16_lVjKM). In this paper, Dr. Fox demonstrates the limitations of isolated social accountability ‘projects’, and proposes a set of guiding principles that can be applied in contextually-grounded ways by diverse organizations.

      Coming back to the question of learning for grassroots practitioners and citizens, you raise the question of how to facilitate their learning, given that much of the learning going on in the sector may not speak to their experiences or inform their daily engagement. Learning with and for frontline social accountability activists is challenging, and it is encouraging to hear of the innovative work that COPASAH is leading on in this area. I make some related points in discussing building political analysis for social accountability here: https://politicsgovernancedevelopment.wordpress.com/2015/06/09/political-analysis-for-citizen-led-accountability/

      Finally, and relatedly, I appreciate your call to look beyond learning to broader thinking about social accountability practice. As noted above, you are right to question some of the underlying assumptions of the new enthusiasm for learning, which may seek to answer the question: what (tool) will work for our next social accountability project? Citizen-led accountability needs to be reframed away from tools and methodologies (which can be useful, but not as a starting point) and grounded in a relational and political understanding of accountability dynamics. I suggest some of these points in a post here: https://politicsgovernancedevelopment.wordpress.com/2014/01/20/thinking-politically-about-social-accountability/.

      Many thanks again for sharing your insights!

      Regards,

      Brendan

      • #4143
        Profile photo of Charlotte
        Charlotte
        @charlotteornemark
        USA (Sweden)

        This discussion really resonates with the arguments put forth strongly at Sida’s Development Talks in Nov. 2012, where a main point was that “the power of monitoring needs to be given back to the development practitioners in the field.” (External evaluations will always need an arm’s length of distance for the sake of objectivity in order to challenge pre-existing mental models and to ensure accountability, but monitoring – in particular – needs to be learning-oriented and endogenously owned). See: http://www.sida.se/english/press/current-topics-archive/20121/results-discussion-at-a-high-level/

        As far as grant-makers go, Sida has been at the forefront of questioning and being self-reflective, both in relation to the ‘results agenda’ and in how best to monitor and track support to complex social change. The findings from a 3-year initiative of supporting Sida-funded CSOs in the Western Balkans (followed by 3 years of similar long-term coaching of Sida-funded CSOs in Turkey) were combined with facilitating a learning process within Sida on what ‘results’ could really be expected in longer-term social/democratic change processes. This in turn led to a new way of framing both reporting and monitoring as tools for learning. A process of facilitated joint learning between grantee and grant-maker was key to reaching a common understanding and developing a sense of being part of the same ‘system’ of results generation.

        Clearly we were operating in areas of complex and contested social change, often highly politicised, and not necessarily driven just by development “needs” in countries such as Serbia, Albania and later on Turkey. Long-term social change modelling as a basis for ToCs in a politicised climate, rather than vertically constructed LogFrames, was critical to creating a mindset for ‘investigative monitoring’. Investigative monitoring was actually a term we used (it should be inherent, but just to make the point): comparable to investigative journalism, but done more objectively and over a longer period of time against set objectives, backed by clear evidence…

        In this sense, Theories of Change (ToCs) are powerful, because they show us clearly what is assumed. And what is assumed needs to be monitored in order to know whether change interventions work and drive processes forward in a given context…

        Some key lessons (or reflections) at the same Sida Dev Talks panel were made by the World Bank’s Michael Woolcock as he presented his approach to problem-driven iterative adaptation and change trajectories. In his scenario, the question of ‘who’ learns (how) is just as important as ‘what’ is being learned…. See presentations also at: http://www.sida.se/globalassets/global/contact-us/seminars_and_conferences/presentation-dev-talks-08nov12.pdf

        Charlotte

    • #4135
      Profile photo of Lucia
      Lucia
      @lucia-nass
      Ethiopia

      Hi @adaschsj, thanks for your contribution – longer funding cycles are very important indeed. I would like to add that longer-term processes are often disrupted by the need to “tender” (for the grant-making function). This throws all good grant-making experience and understanding of context, relationships etc. out of the window. I can understand tendering for large infrastructure development programs (there are standards that everyone can adhere to), but it clashes with working in complex systems (where relationships are essential to being influential). What do others think about this? Would it be possible to avoid tendering and enable more continuity (based on performance, of course!)?

  • #4091
    Profile photo of Charlotte
    Charlotte
    @charlotteornemark
    USA (Sweden)

    Thanks to Abhijit Das for kicking off our second week of discussions where we will follow-up on some issues raised so far, digging deeper into what organizational processes and systems are really conducive for learning. Key questions are:

    • Are we using existing systems and processes in a more learning-oriented way? (e.g. through learning-oriented monitoring and evaluation)
    • Or does an increased emphasis on learning incentivize externally-funded organisations to start implementing knowledge and learning activities in parallel with and/or separately from other ongoing implementation processes?
    • How can grant-makers ensure that learning needs are endogenous to the organisations and change processes they support?

    In other words: Are we really learning?

    Abhijit has challenged us to think of who ‘we’ are, pointing out that ‘grantees’ often are multi-layered institutions (from global INGO contractors to sub-contracted units and grassroots leaders). Formal knowledge production and dissemination, and even capacity-building, usually happen from the top down. Yet there is scope and need to learn from community practitioners and grassroots leaders: those who are closest to the dynamic and political negotiations that the poorest “intuitively have to navigate” on a day-to-day basis “as a matter of survival”.

    This makes sense, but it is hard. He points to some interesting practices in this field but also to the challenges of getting grassroots story-telling recorded and fed into the broader knowledge streams.

    Abhijit and others on this Forum have pointed out that pressures to take working solutions to scale can ironically be a disincentive for funding longer-term learning cycles and/or processes of continuous learning. The incentives are for organisations to test models, peer review them, publish findings as policy recommendations, and engage in rolling them out, rather than engaging in a continuous process of ‘messy’ learning that requires ongoing questioning of existing strategies and mental models.

    • Or does ‘scale’ in the context of social accountability mean scaling up ‘how’ we tap into more relevant evidence from the local contexts and transform it into streams of more accessible and actionable knowledge for a diverse set of stakeholders?

    • And if we need to learn ‘how’ to learn in addition to ‘what’ to learn, how does that affect our existing organizational and project management processes in place?

    We are looking forward to hearing from more participants on these and other issues you want to raise!

    Charlotte

  • #4092
    Profile photo of Elena
    Elena
    @bingvt
    Philippines

    I am relatively new to community development and social accountability, having first completed a career in corporate management, project management, information technology and training and development at a multinational corporation. I have personal experience with ISO certification and Total Quality Management (TQM).

    The objective of learning within a project is to improve project design, operations and/or people skills. In business, this would be called a continuous improvement process (CIP), a concept pioneered by W. Edwards Deming. The core of the effort is to gather feedback from the process and customer/beneficiary and evaluate these against organizational goals.

    Feedback may be triggered anytime a hindrance or difficulty is experienced during project implementation by anyone in the team. A Feedback Form may be devised containing: (1) the difficulty or hindrance encountered, (2) what was done to correct the experience, and (3) what should be done to prevent recurrence of the hindrance or difficulty.

    Integrating such a Feedback Form in the process of project implementation will not pose a huge burden on the people involved. The Forms are submitted at any time in the project life cycle, then analyzed and reviewed periodically by project management, adopting likely solutions and adjusting the project design or process accordingly. Periodic sharing and discussion of the Feedback Forms result in immediate learning within the team.

    At any time in the project life cycle and at the end of the project, a compendium of these Feedback Forms and the actions resulting from them will form the Lessons Learned.
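    The feedback cycle described above can be sketched in code. This is a minimal illustrative sketch only; the `FeedbackForm` class and `compile_lessons_learned` function are hypothetical names invented here, not part of any actual project system:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FeedbackForm:
    """One feedback entry, mirroring the three fields proposed above."""
    hindrance: str   # (1) the difficulty or hindrance encountered
    correction: str  # (2) what was done to correct the experience
    prevention: str  # (3) what should be done to prevent recurrence

def compile_lessons_learned(forms: List[FeedbackForm]) -> List[str]:
    """Periodic review step: turn the compendium of forms into a
    Lessons Learned list for sharing and discussion with the team."""
    return [
        f"Hindrance: {f.hindrance} | Fix applied: {f.correction} | "
        f"Prevention: {f.prevention}"
        for f in forms
    ]

# Example usage with one hypothetical entry
log = [
    FeedbackForm(
        hindrance="Community meetings poorly attended",
        correction="Rescheduled to market days",
        prevention="Consult community calendars during project design",
    ),
]
lessons = compile_lessons_learned(log)
```

    The point of the sketch is simply that the forms accumulate throughout the project life cycle and the review step is a cheap, repeatable aggregation, which is why the process need not add much burden or cost.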

    If I were a grant maker, I would require that the above continuous improvement process be part of project implementation. I don’t think the above would greatly add to the cost of the project and may even save costs over the term of the project.

    • #4134
      Profile photo of Lucia
      Lucia
      @lucia-nass
      Ethiopia

      Thanks @bingvt for your contribution. In my experience it is important to involve stakeholders around a project in the learning as well. So not just the grantees, but also citizens, service providers, local government, development partners (donors), other NGOs. In the case of a business – it would mean not just getting feedback from the users of a product, but also from the shops that sell it, the policy makers that govern the product. I am wondering if there is more we can learn from the private sector in this regard?

      This “learning beyond the project” is very important because social accountability often requires deep change from all stakeholders. SA projects trigger such change – but this kind of change only materializes in society after many years (e.g. it took 15 years in Indonesia from piloting to full-fledged government practice with participatory community development). Going to scale with SA needs to be prepared at the project stage, and might require mass communication strategies (to “educate” the public and the public service about SA). Again, an area where we might learn from the private sector.

  • #4099
    Profile photo of Varja
    Varja
    @vlipovsek
    Tanzania, East Africa, World

    I’ve been following this thread… and thinking for some days now how to engage. Twaweza, where I work, was one of the organizations included in the report that was presented, and many thoughtful things on organizational learning have been posted here.
    But I am turning something else around in my head… about taking ownership of our own learning.
    Something happened in the institutionalization of M&E (a move which I overall support) that took the fun away from it. To me, the reason to do M&E is really out of curiosity: it’s to ask all those questions about how, where, when, why, among whom something works… or doesn’t. Curiosity equals a lot of fun – because what drives you is not this-or-that requirement, but your own thirst to find something out. That’s the premise of learning.
    I struggle with this question a lot: how do we make questioning, inquiry an integral part of how my organization works, and how to make it fun in the process?
    One way we found that really makes a big difference is through hands-on participation (whether you call it immersion, or human-centered design, or something else). I can tell you that nothing beats learning like getting out there – to the capillaries of where your organization, your project, is supposed to reach. You aspire for rural women to use a mobile phone to report water pump problems? Then you better spend some time with rural women, and really listen to the honest critique you will get about your idea. You want the ministry of education to re-think its latest budget? Then you better hang out in the ministerial corridors of power (or as close as you can get – and don’t dismiss the lowly bureaucrat; they possess a wealth of information on how things get done). Don’t send a consultant, go yourself, and go often. Not because it’s gimmicky and not because you will get “hard data” out of it – but because it will challenge most of your own assumptions. So you will learn. Require this in your organization, of all your core staff, and do it routinely.
    We ourselves, at Twaweza, do some of this – but we could do better.
    And, we have also facilitated this kind of learning for some of our donors/grant-givers (those that were interested); the eye-opening experience for them was invaluable.
    This is not to say that the formal learning and reflection structures are not essential, that grant-giving parameters and practices can’t be improved, etc. But these structures and tools will be so much richer and meaningful if we engage in active, participatory learning.

  • #4102
    Profile photo of Florencia
    Florencia
    @florencia
    Argentina/Brazil

    Hi all, Like Varja I’ve been following the thread and thinking how to engage – a bit of a challenge as I am in the middle of a learning journey across Brazil. So, I’ll share some scattered thoughts to, hopefully, provide more food for thought for threads above:

    1. Thanks Scott, Alan, and Jenny for the shout out to the GPSA’s adaptive learning paper. Lily Tsai, Maria Poli and I worked hard and benefited from many colleagues’ insights to introduce adaptive learning, integrated capacity building and LME into the GPSA’s core grant-making documents.
    Still, there are some concrete points in the GPSA’s RF that I hope will get more notice/action. If you are curious, search the document for key expressions such as “can grantees explain”, penalize, customize.
    http://issuu.com/thegpsa/docs/gpsa_results_framework/19?e=11259783/8037627

    We knew it wouldn’t be perfect or easy to implement in practice, but did we really have another choice, given our best collective knowledge at the time, if we wanted to give grantees a chance? In turn, the shout out should go to the GPSA’s Steering Committee who encouraged us and, ultimately, took the risk.

    2. The Ford Foundation just launched its new grant-making approach. Check it out here.
    http://www.fordfoundation.org/equals-change/post/whats-next-for-the-ford-foundation
    There is a bold focus in the strategy on changing grant-making practices in ways many in the field have been asking for (see Jenny’s report). I know first-hand that it’s taken a lot of work from many colleagues at the Foundation to move in this direction. Even more, as Darren Walker says, it won’t be easy to turn Ford Forward’s blueprint into practice.
    In this context, those of us who believe in the relevance of learning for action, have a very concrete challenge: we’ll need to move from advocating for learning environments, to more pragmatic efforts to make the most of windows that are opening up to put ideas and ideals into action and results.
    How are processes structured and relationships built, etc., so that they will actually help us learn to get better at achieving our ambitious goals? Darren’s letter notes that Ford is not alone in the effort. COPASAH, as Abhijit mentions, reflects a pioneering space opened and sustained by OSF’s AMHI and partners (a broader effort is now underway at OSF). I already mentioned the GPSA. All in all, perhaps the field is moving quicker than many have envisioned; are we learning from these and other examples, and ready to innovate and adapt towards concrete solutions to old and new problems?

    3. I agree with many of Abhijit’s suggestions for moving forward. U4 is supporting me and colleagues at UDESC in Brazil to think about learning in social accountability through a related but different approach: reflecting with colleagues who have been opening local contracts for 5+ years, capturing their under-the-radar learning about strategies and tactics over time, and contextualizing insights. We are making the effort to conceptualize, translate and bridge: from Portuguese to English, but just as importantly thinking about how to research and communicate what is relevant to colleagues making macro- and micro-level decisions, and to colleagues approaching the issue globally or locally, strategically or tactically. It’s an experiment and a lot of work that we rarely account for in our LME plans and TORs. Stay tuned for a forthcoming paper with the results of our effort!

    Warm regards and looking forward to ongoing conversations here or in other places we may cross paths,
    Florencia
    @guerzovich

  • #4105
    Profile photo of Anowarul Haq
    Anowarul Haq
    @anowarulhaq

    Hi everyone, it is very interesting to follow the discussion thread. “Are we really learning?” – the question is a critical one. From my own experience, I would say that we often fail to learn what we should really learn. From a grantee’s perspective, we always develop our projects or initiatives following the logframe or results framework that the donors want us to follow. Through this, we start to create “an artificial world” where the project and its activities become more “real” than the “real world”. We often create systems and structures to capture changes around logframe/results framework indicators. The field staff become very busy collecting information to satisfy the forward stakeholders. Most of the time, we are happy with the “intended outcomes” of a project, but the real learning often comes from the “unintended outcomes” which happen because of our engagement with the impact population, communities and various stakeholders. The conventional learning mechanism fails to capture this critical learning – what is the context, how do our interventions intersect with the context, and what changes (both positive and negative, both intended and unintended) do our interventions bring to the context?
    Now, if we want to engage with the “real world” for learning, we need to change our organizational culture, which is more hierarchic in nature, to a more empowering one. We need to empower and build the capacities of field staff so that they can learn from communities and stakeholders about changes. The change of mindset is key here: from a mindset that we know everything (about the project and how changes should take place) to one that recognizes that people in the communities know the most about their context, livelihoods, institutions and the changes happening around them. It is not only about the field staff’s mindset change, but also about managers changing their mindset to believe that field staff know the context better than those sitting in the office. Managers are required to create an enabling environment for field staff to learn from the “real world” and share both “intended and unintended outcomes”. What I am describing here is creating a reflective culture, which requires investment from the grantee. Grantees are often constrained not only by the short, time-bound nature of projects, but also by the disinterest of donors in investing in staff capacity building. So donors’ interest and willingness to invest in learning from the “real world” is key here. On the other hand, grantees should also have an organizational mandate to “really” learn and to change.
    Finally, we need to invest further in developing processes and tools to learn from both “intended and unintended outcomes” – traditional baselines and M&E reporting using logframe/RF indicators are failing to generate the critical learning about what we should do differently if we did the same intervention again. Exploring life histories, change and trend analysis, participatory impact assessment – these are some examples of tools that can generate learning if we use them strategically. Most important for a grantee is to structure what we really want to learn – are we clear about our impact population, theory of change, breakthroughs for change, and critical pathways to breakthroughs considering the context or “real world” in which we are operating? These should guide us in answering the question: are we really learning?

    Thanks, looking forward to hearing others’ comments.

    • #4133
      Profile photo of Lucia
      Lucia
      @lucia-nass
      Ethiopia

      Hello @anowarulhaq, good points about the real world – have a look at my reply to @arifhkhan. My argument is to take external stakeholders into grantee learning circles. In this way stakeholders can gradually start moving as one, complex web (we have started to call it the “SA movement”). Take them to the field, and take them into the grantee learning events. These are some of the best investments we have made in ESAP2.

  • #4106
    Profile photo of Arif H. Khan
    Arif H. Khan
    @arifhkhan

    Hi everyone!
    “Are we really learning?” is a very critical question indeed. To reflect on this question, we first need to know exactly what is meant by the word ‘we’, as underscored by Abhijit. In our project we wear two hats – grantee and grant-maker. We receive funds from the GPSA and make grants to local CSOs, who are the real implementers of the project on the ground and face the real implementation challenges. The learning of our CSO partners often demands making adjustments/modifications to the project implementation strategy. Sometimes they want quick decisions to cope with the stipulated timeframe. But changing project strategy has many implications, e.g. modification of the results framework, adjustment of the budget and so on, which we cannot do overnight or as frequently as demanded. Moreover, in order to act on learning we need to convince multiple stakeholders, i.e. our own organization’s management, the management of CSO partners, the GPSA team, the TTL of our project, the WB TTL of the government project we are working with, the country office and, in some cases, the government counterpart as well. Not all of these stakeholders are equally flexible, nor do they learn in the same way.

    Moreover, the capacity of CSO partners and our own team to capture ‘real’ learning and make it sufficiently evidence-based is also very critical.

    I would therefore like to underscore two things: first, investing more in capacity building of the implementers so that they can capture real learning underpinned by sufficient evidence; secondly, developing a mechanism that enables us to mediate with different stakeholders to adopt the learning that is coming from the field.

    • #4132
      Profile photo of Lucia
      Lucia
      @lucia-nass
      Ethiopia

      Thanks @arifhkhan for your observations and questions. It took me back to the time when I joined ESAP2. There were some pretty negative attitudes towards the program from federal and regional government stakeholders. Two years later, they are all fervent supporters and promoters of SA. Our approach to their “learning” was as follows:

      RESPECT that they are stakeholders and have a role to play, and engage with them in this capacity – not just because you have to. It is your job to keep everyone informed about the program and the work of grantees. Make a newsletter, find out how they would like to be kept informed, and follow up.

      Give them the REAL experience of what grantees are doing and achieving, or try to take them as close to it as possible. Invite them on field trips and welcome their critique (nothing fosters good conversations like spending hours together in a car on a rough road); let them take part in training and learning events together with grantees, and give them a clear role; show them videos with testimonies of citizens and service providers on what grantees and SA have been able to achieve (and what not).

      Get INVITED into their meetings and spaces. Show interest in their work and make connections and linkages.

      In sum – extend the learning to stakeholders around the grantees that can enable their projects. Government, other NGOs, development partners etc.

  • #4107
    Profile photo of Jenny
    Jenny
    @jennyross
    UK

    It has been really interesting reading the contributions. As part of the research process for the TAI/Hewlett report, I spoke to 23 grantees/organisations working in the transparency and accountability sector about their learning practice and what learning was for them. It was really interesting hearing about the progress that people were making and the challenges they faced.

    I would like to reinforce the point that Varja has made about the importance of curiosity and a desire to understand better being at the heart of good learning. No one can force another to learn or to be interested in learning – there has to be an element of intrinsic motivation, supported by a culture which values questioning and a diversity of views. This is covered briefly in the research report.

    In addition, I agree that there are some tensions between M&E (where the incentives are to show that you have been successful) and learning agendas (which rely on openness about mistakes, more humility, reflection etc). Within organisations these are often seen as separate, and there weren’t very good examples of how they have been integrated well within organisational systems and practice. The kinds of practices that Anowarul has outlined which bring these together (participatory impact assessment etc) are not widely used – sharing experiences of these practices could be really useful.

  • #4108
    Profile photo of Dante de los Angeles
    Dante de los Angeles
    @dantedelosangeles

    Hello everyone. This is Dante de los Angeles of Partnership for Transparency Fund based in the Philippines. We are a partner of Concerned Citizens of Abra for Good Government in the implementation of the GPSA-funded “Guarding the Integrity of Conditional Cash Transfer Program in the Philippines” which we call Project i-Pantawid.

    Are we really learning? Abhijit asks who is the “we” in the question. This is an important interjection as we now bring the Forum discussion to project level by drawing from the systems, processes and experiences of our respective projects. I use this to go back to the basics of knowledge and learning and the role of M&E in the context of GPSA-funded social accountability projects, where the empowerment of citizens is prime (citizen-centric) to enable them to engage more constructively in development in general and with government in particular. In the end, all learning that improves strategies, project effectiveness and program impacts redounds to the improved well-being of project beneficiaries. No doubt, the “we” in the quintessential “are we really learning” would include the learning of stakeholders beyond the grantees. More specifically, I suggest that in the context of social accountability projects, this “we” would include the beneficiaries, government partners (implementing national government agencies, assisting local governments, and independent supreme accountability institutions), the private sector and the media. And so that we stay rooted to the ground with specific project-level ideas, which seems to be the approach in this segment of the Forum, I would relate learning to (1) what learning is about (learning objective) and whom it is for (learning for whom), and (2) what the face of learning is (K&L products) and how it is produced. I draw both from actual experience in implementing the GPSA-funded Project i-Pantawid, entitled Guarding the Integrity of Conditional Cash Transfer in the Philippines, and from how we have designed its implementation (not enough time has elapsed to rely completely on implementation experience!).

    In a project setting, in the GPSA context, we are interested in knowledge and learning not so much for knowledge and learning’s sake (that is, for some abstract purpose) but for certain concrete objectives that would result in some expected utility or value not only to the project (Project i-Pantawid) but also to the program (the Conditional Cash Transfer Program, locally known as the Pantawid Pampamilya Program). In our case, this objective is related to improving the strategy, practice (tools, processes, documentation) and effectiveness of project interventions and program implementation. The higher objective is related to scaling: K&L products embodying improved systems and processes for safeguarding the integrity of CCT implementation, developed and piloted by the project, are scaled up and adapted by the government for nationwide CCT program implementation. The final objective of our K&L is to promote the transformational values that the CCT has started to create in project beneficiaries. Indeed, the greatest threat to CCT sustainability is the adverse public opinion, propounded by very vocal opponents of the program, that the CCT is nothing more than a massive government dole-out. We then need continuous improvement in our processes and systems to help achieve these objectives. Thus, as planned, we have embedded continuous learning and knowledge enhancement into our implementation strategy through the project’s M&E system. This establishes the link between our K&L and our M&E system.

    To some, M&E is too formalistic to be friendly to learning or to enable a learning environment conducive to establishing a learning culture. Not to us. To begin with, Project i-Pantawid faces a “we” that encompasses multi-layered, multi-stakeholder groups with varying and opposing interests, power profiles and leadership structures. In this complex situation, M&E is important as an established formal process of evidence gathering and reflection that is commonly understood by these sectors. Not to forget, the importance of the M&E process is also mirrored internally within the project organization. One is mistaken to assume that project partners with a common interest in the effectiveness of project interventions, all professing to value learning, are uniformly agreed on the specifics of when and how to proceed with the business of learning. This process can be complicated, even divisive, within the project implementation team when the rules are unclear and no formal process is installed. M&E as a formal process of data gathering and rigorous evaluation and reflection avoids this problem. Overall, the M&E’s regular quarterly, semi-annual and annual process of evidence gathering and evaluation serves as the venue for organizing critical reflection that enables adaptive learning on a continuous basis.

    How are K&L and M&E linked in Project i-Pantawid? The Implementation Progress Monitoring (IPM) portion of our M&E undertakes periodic tracking of the status and results of K&L implementation. What is different is that in addition to a quantitative report based on the K&L indicators specified in the Results Framework, our M&E system requires a narrative qualitative analysis report that “tells the story” behind progress, or the lack of it, vis-à-vis targets. We want to learn what factors are facilitating achievement of targets, and what factors are constraining it. Based on an objective and thorough analysis of each factor, we then derive lessons for improving project strategy or provide specific, action-oriented recommendations to enhance the facilitating factors and remedy the constraining factors, with the end result of increasing project performance. On top of the quantitative and narrative status reports, our learning-oriented M&E system subjects major K&L products, such as the eFDS Training Modules and the M&E Manual itself, to rigorous annual review and assessment in search of continuous improvement and greater effectiveness. These activities are reflected in the annual K&L Work Plan of Project i-Pantawid.

    Our M&E system goes beyond conventional M&E that captures only IPM. It is comprehensive in that it integrates Third Party Monitoring (TPM) and Beneficiary Monitoring (BM). Whereas IPM looks inward to track and assess performance vis-à-vis the indicators contained in the Results Framework and to update the baseline figures and targets in the RF, TPM/BM looks outward to track, assess and help enhance the transparency, accountability, participatory nature and effectiveness of implementation of the CCT program. What this integration does immediately is harmonize project-wide data gathering and evaluation activities. What it does more importantly is help synthesize M&E evaluation results and learnings from IPM and TPM/BM into a coherent whole that captures the connectedness between the two. This process will enable the scaling up of project systems and processes to nationwide program application. After all, another major K&L product of Project i-Pantawid is a Handbook on Social Accountability and CSO Participation in the Implementation of the CCT Program nationwide, which is included in the project’s Year 3 K&L Work Plan.

    Finally, the TPM and BM portions of the M&E are also linked to another K&L product, namely the Training Modules for the Enhanced Family Development Session (eFDS). In fact, we have designed our TPM and BM to be fundamentally embedded into the eFDS training modules. Learning about TPM and BM as social accountability tools (forms, contents, processes) is subject matter in eFDS, and gathering of data is an eFDS activity. We view this as contributing to the empowerment of project beneficiaries. In the longer term, TPM and BM will build citizens’ self-confidence and capacity to take on not only action-oriented monitoring but, more importantly, collective action to improve their well-being.

    The impetus for systems enhancement comes from many sources. Consider this very recent email exchange between Bing, Project i-Pantawid Training Officer, who initially handles all M&E forms and organizes all the data and information generated from eFDS, and Cesar, Project i-Pantawid M&E Adviser:

    Bing: “We gather so much information from the Parent Leaders, as part of their application of SAcc, that isn’t included in our RF nor in the implementation of the Pantawid Program. We are preparing to tabulate this PL SAcc, as we tabulate all forms, but the information we will get, and it is a very rich source of information, is neither … captured in IPM nor in TPM and BM.”

    Cesar: “Thanks for your email, which supports the envisioned process of M&E refinement. No doubt we are generating a treasure of info on SAcc. Project M&E is supposed to focus on RF indicators. But that doesn’t prevent us from generating other info as long as: (1) we fully meet RF requirements; and (2) we have the internal capability. I think such other info should feed into: (a) the “E” in M&E; and (b) K&L. We can, but it is not necessary to, revise the M&E objectives.”

    • #4131
      Profile photo of Lucia
      Lucia
      @lucia-nass
      Ethiopia

      Hello @dantedelosangeles, thanks for being so detailed – it is very helpful. I would like to ask a bit more about documenting learning for the purpose of scaling. This helped you to develop a handbook, and I would like to hear more about it. Handbooks tend to be prescriptive, while practice needs to be reflective and adaptive (working in diverse complex environments, etc.). What is your experience in providing guidance beyond “prescription”? I would love to see your handbook – can you make it available to us?

      I also like your views about learning among the SA stakeholders, down to the beneficiaries of service improvements. I want to share our intention to develop a series of booklets for each type of SA stakeholder – citizens, service providers, local governments, elected councils etc. The booklets will describe “what SA can mean for you and how to get involved”. This is our “documentation for scaling” initiative. Your tips would be most welcome.

  • #4110
    Profile photo of Charlotte
    Charlotte
    @charlotteornemark
    USA (Sweden)

    Thanks to Elena, Varja, Florencia, Anowarul, Arif, Dante and Jenny for bringing your insights and rich experience from the Philippines, Tanzania, Bangladesh, and as practitioners and researchers…
    We have tried to dig deeper into the “Are we really learning?” question by looking at our systems and practical examples of how they are being used, or what is preventing us from using them for learning.

    The question of who “we” really are, and how inclusively we use this term has in itself been interesting to reflect further on.

    Drawing on her past private sector experience, Elena has pointed to the need for anyone involved in project implementation to be able – at any point – to systematically feed information back to the implementation team for adaptive course correction. This is nothing new; rather, it is seen as an important principle in the private sector, where it is assumed that if the product is not continuously improving based on feedback, it is actually deteriorating (in line with Total Quality Management principles). This creates a culture where feedback on difficulties is encouraged rather than seen as counter-productive.

    Anowarul also stressed the need to learn from “the real world” – not “the intended world” which our logframes and results frameworks often reflect in line with donor requirements. But this means documenting both intended and unintended outcomes that may go beyond what we envisioned in the first place. Arif has pointed to some of the challenges of making this type of “real” learning from the field evidence-based so that it can be used to mediate different stakeholder interests. As a CSO grantee as well as a grant-maker to CSO partners, he has pointed to the sometimes conflicting need of sub-granted CSOs to change the results framework based on their own learning, and the need to get the full consortium and stakeholders to buy into such strategy changes, which may have multiple implications.

    Dante de los Angeles shared very practical ways in which a GPSA-funded project designed to improve conditional cash transfers in the Philippines has sought to overcome such obstacles by integrating learning in the monitoring and evaluation system, combining reporting against the pre-determined results-framework with narrative qualitative analysis that ‘tells the story’ behind progress or lack of the same. They have also set up a system to integrate third party and beneficiary feedback into the continuous monitoring process so that it includes external views on a continuous basis. Using a formal system of data gathering and analysis can help, he argues, to overcome diverse views among implementing partners on what is important for learning and how learning also from unintended outcomes should be used and fed into both the M&E framework but also inform their knowledge products and training.

    Florencia has among other things pointed us to the Ford Foundation’s new grant-making approach and the very concrete challenge to move from advocating for learning environments to make the most of windows that are now opening up, arguing that “this field is moving quicker than many have envisioned.”

    Like Florencia, Varja and Jenny also reflected on the need to take ownership of our own learning, and the importance of curiosity and the desire to understand – supported by an organizational culture which values questioning and a diversity of views. However, there can also be tensions between M&E and learning, and Jenny points to a tendency among the 23 grantees/grant-making organisations that she interviewed in her research presented during the initial Webinar to see them as separate processes. Clearly this could lead to duplication and simply missed opportunities for learning from M&E. “Something happened in the institutionalization of M&E … that took the fun away from learning,” Varja points out, with a call to get that investigative sense back into the process.

    We look forward to further contributions to this rich debate, and thanks again for taking the time to contribute!
    Charlotte

    • #4130
      Profile photo of Lucia
      Lucia
      @lucia-nass
      Ethiopia

      Hi @charlotteornemark, nice to see you again (this time virtually). Thanks for summarizing earlier contributions. This is helpful because I couldn’t find the time to participate so far…

      Your summary makes me think: “who should learn?” It is important that “we” in the GPSA community learn, but programme/project level learning is also important, and in my view the most important learning that needs to happen is among the SA stakeholders. They need to start behaving differently, thinking differently, collaborating in new ways. Without local stakeholder learning, there can be no improvements in services for the poor. In ESAP2 I try to think about learning at several levels – my own (that’s why I have a blog), that of my colleagues (that’s why we have quarterly and especially bi-annual meetings in our team to distill patterns of change we observe in grantee projects), and that of the stakeholders in Ethiopia (that’s why we have multi-stakeholder events every 6 months, where each stakeholder can learn what is relevant from their perspective and for their role in the SA process).

      Just thought I’d share my thinking with you. (I am happy to finally find some time to dig into this important area of practice! I am actually at a Communications Day for our grantees: sharing and learning about spreading the practice and results of SA in Ethiopia. It’s all in Amharic – so I can read and type away on the GPSA platform.)

  • #4122
    Profile photo of Gilbert
    Gilbert
    @gilbert
    Uganda

    Hi everyone, over the past few months the Africa Freedom of Information Centre (AFIC) has started a process of elaborating case studies about its work on citizens’ access to information in different spheres. This exercise has helped us draw a visible connection between access to information and different sectors, and also a link between issues at sub-national, national, regional and global levels; yet many times these different dimensions do not reinforce and complement each other.

    Are we learning? What, from whom and how? These are all relevant questions. Project implementation interfaces with an ecosystem of stakeholders and environments, each of which presents needs and opportunities for learning. Funders – their policies, approaches and attitudes. Government agencies – their intentions, interests and expectations. Civil society – fears, concerns and interests. Beneficiaries – their expectations, capacities and fears. How are all these affecting implementation? Which of these are unique to a sector or context, and which ones could be replicated? What factors determine success, and why?

    AFIC is implementing a GPSA-supported project, “Enhancing Value for Money in Public Contracts and Services in Uganda”. Our learning objective is to establish if and how disclosure and citizen participation increase value for money in public contracts and services. We will also seek to isolate contributing factors in either case.

    • #4129
      Profile photo of Lucia
      Lucia
      @lucia-nass
      Ethiopia

      Hi @gilbert, I am interested in the case studies you are developing. What guidance do you use? Can you maybe share a ToR or case study format?

      In ESAP2 we are making two case studies on the experience of grantees with Gender Responsive Budgeting and Public Expenditure Tracking. These two tools have been newly introduced in Ethiopia, after having been adapted to the local context. We want to learn how the tool is working out (practice of grantees, and experience of citizens and providers during the process), and what the effects of the experience are in terms of behavior change and service improvement results. The cases can help us improve the SA tools and the way we teach these to other NGOs in the future.

      Thanks in advance for sharing a bit more detail about your approach to developing the cases.

      • #4243
        Profile photo of Marine
        Marine
        @marinestudent

        Dear members,

        We are testing this function.
        Thank you for your comprehension.

        GPSA KP Team

    • #4172
      Profile photo of Judith
      Judith
      @judithnakamannya
      Uganda

      Hi Gilbert, World Vision Uganda is using a social accountability approach called Citizen Voice and Action. We are trying to incorporate PETs into the approach, and I see the information you are sharing here can be of help to us. How can we link up and learn together?

      • #4201
        Profile photo of Gilbert
        Gilbert
        @gilbert
        Uganda

        Dear Lucia, thank you for the inquiry, and glad that you are doing something on gender. We have an interest in this. Although our GPSA project was originally designed without a gender lens, we have found that without gender reflection we would leave out key realities of social accountability. Thus, our stakeholder mapping, assessment of existing public sector citizen engagement mechanisms and design of training take gender reflections into account.

        Regarding your question, our case studies were drafted without following any format. In fact, the process has been adaptive. Originally we sought to capture a selection of cases where access to information has impacted transparency and/or accountability. This was mainly based on the work we have done in Uganda. A peer outside our organisation noted that while we have done lots of work across Africa, in different sectors and at different levels, this was not reflected. We thus deleted some of the local cases and brought on board others from our regional work.

        One of our partners looked at the case studies and said a format was being developed for documenting case studies. We are now adapting our case studies to that format and will be pleased to share the updated version. In the meantime, please see the referenced cases attached herewith.
        Hope you will find them helpful, Gilbert.

        • #4202
          Profile photo of Gilbert
          Gilbert
          @gilbert
          Uganda

          Dear Judith,

          Thank you for the invitation and glad to hear about your interesting programme. Our project is part of the wider work being done under the framework of Uganda Contracts Monitoring Coalition which brings together 18 CSOs to monitor public contracts and services in health, education, agriculture, extractives, works, water and environment. We will be pleased to talk more about our respective work and explore collaboration. If you have some space next week we can link up.

  • #4126
    Profile photo of Ricardo
    Ricardo
    @ricardowilson-grau
    Brasil

    Colleagues,

    During the Webinar on 3 June on the topic of learning and grant-making, I was intrigued that so very few of the 23 grantees and grant-makers interviewed during the INTRAC research saw any connection between M&E and learning. Why?

    Based on my own experience evaluating a dozen programmes of international development funders (from Ford, IDRC and the Open Society in North America, to Novib and Hivos in Europe and the UN Trust Fund to End Violence Against Women, the World Bank Institute, Action Aid and CARE internationally), I suspect the reason may be that often social change faces so much complexity that conventional M&E is more or less irrelevant. When at the moment of planning you face considerable uncertainty and a dynamic if not volatile work environment, your multi-annual plans are quickly out-dated, and either overhauled to be short-term annual plans or shelved. Thus, tracking and assessing what you actually did and achieved against your original plan does not enable you to learn much more than that your long-term plan did not work, which you already know.

    One alternative to this customary formative and summative M&E is developmental evaluation, which is applied throughout the whole period when a project, programme or organisation is innovating. In the words of its creator, Michael Quinn Patton:

    “Developmental evaluation (DE) provides evaluative information and feedback to social innovators, and their funders and supporters, to inform adaptive development of change initiatives in complex dynamic environments. DE brings to innovation and adaptation the processes of asking evaluative questions, applying evaluation logic, and gathering and reporting evaluative data to inform and support the development of innovative projects, programs, initiatives, products, organisations, and/or systems change efforts with timely feedback.”

    That is, DE supports continual, real time learning.

    Here is a comparison between traditional and developmental evaluation: http://dmeforpeace.org/discuss/exploring-developmental-evaluation-unicefs-pbea-program

    • #4128
      Profile photo of Lucia
      Lucia
      @lucia-nass
      Ethiopia

      Thanks @ricardowilson-grau for making the link with M&E – which we also do in ESAP2. I like your observation about complex change – which is why we make use of large scale learning events that bring all stakeholders into the room. Your reference to Developmental Evaluation makes me think about the 7 questions we use for learning as the program evolves. I’ll read more about DE – there might be something there for us.

    • #4174
      Profile photo of Brendan
      Brendan
      @brendan-halloran
      United States

      Ricardo,

      The report we worked on about learning and funding practice definitely highlighted some potential tensions and inconsistencies in current M&E approaches. Sometimes they are indeed divorced from the kinds of learning that organizations need to inform their strategies and practices.

      Thanks for pointing us towards more real-time approaches to learning, which I think are promising tools for organizations seeking to be more intentional about gathering data about their work, reflecting, and adapting on a more continuous/iterative basis.

      We make some mention of this in our overview of the report and its broader implications here: http://www.transparency-initiative.org/news/funding-learning-and-impact-how-do-grant-making-practices-help-and-hinder-real-grantee-learning

      Great food for thought!

  • #4127
    Profile photo of Lucia
    Lucia
    @lucia-nass
    Ethiopia

    Are we really learning? That question triggers a follow-on question: how do we know that we are learning? For me it is all about our ability to observe change – what is shifting, why? where are real changes happening, why?

    In my experience with http://www.esap2.org.et we have learned a lot about how to learn together with over 100 NGOs and government officials at various levels. Over time we have developed 7 questions that we want to explore in terms of what shifts and changes, and why. The questions relate to: observed behavior changes among citizens and citizen groups / social inclusion, and the factors at play; responsiveness of local government / service providers to the needs of citizens (in particular women and vulnerable people); and service improvements and sustainability – what is in place that may encourage the various stakeholders to sustain the SA processes?

    One of the difficulties we ran into is that our logframe isn’t formulated in terms of what we aim to shift or change – rather, it counts things (# of citizens and providers trained, # of interface meetings held, # of joint action plans developed…). I wrote a blog about this a couple of months ago: http://beads-passionforfacilitation.ning.com/profiles/blogs/m-e-and-learning-in-logframed-programs – In the blog I make 4 points about learning in logframed programs:

    1 It doesn’t have to be in the logframe (no donor will argue when you produce results outside of the logframe)
    2 Write, shoot and share it widely (on the importance of investing in helping NGOs to document using text, audio, video and pictures, and to share their work)
    3 Bring multiple stakeholders together in large scale learning events (because learning and new capacity emerge in relationships)
    4 Be very selective with tactical technical inputs (the importance of the right advice in small quantities at the right time)

    The blog explains more – here I would like to reflect on how we develop the learning agenda. We invest heavily in monitoring visits to our grantees. We prepare a schedule ahead of time so that grantees can prepare to showcase their work and share the issues they face. We take a supportive supervision approach, and share relevant experience of other grantees. We document the work of grantees in our monitoring reports. We do this because the quality of reporting by our grantees is poor to very poor, and while some investments can be made to improve that capacity, the quality of grantee reports is never going to be good enough for distilling a deep learning agenda. So we engage directly, and jointly shape the learning agenda as the program evolves.

    Most NGO practitioners in Ethiopia don’t learn from paper – they learn from sharing, discussing, and visiting each other. They are bad at writing, but great at telling stories about their work and at making videos about it (e.g. we have 125+ short video documents). We have accommodated this type of documentation, for instance by providing training in Participatory Video to interested grantees. We have designed 3 communications awards to encourage our grantees to document. Each award has a theme, so that we jointly develop a rich spectrum of experience from which learning can happen. This year the communication awards and themes are:

    – Participatory Video “Oscar” – theme is behaviour change, shown in a 10 minute PV, by comparing people who are involved in SA to those who are not.
    – SA hero – theme is community service, captured in a two page document and illustrated with 5 pictures, showing how one person has gone out of his/her way to achieve service improvement results for vulnerable groups in the community
    – Most Significant Change Story – theme is service improvements, documented with the MSC format and illustrated with 5 pictures.

    You may notice the overlap with the learning questions referred to above. By using the set of learning questions strategically throughout (e.g. for monitoring, documentation, communication strategy), grantees start to converge in their learning. Twice a year, we prepare large scale learning events. The agenda comes from the patterns we see emerging during our monitoring, in the videos, the questions asked on our Facebook page, etc. The learning events bring together over 500 people and use a “learning benchmark” methodology. I wrote a short blog about it here: http://beads-passionforfacilitation.ning.com/profiles/blogs/learning-by-comparing. In this blog – learning by comparing – I describe how we use quarterly reports of grantees to make graphs that compare different projects and trigger sharing and discussion. We also have strategic inputs or “food for thought” with practical exercises. At the end of these events, participants go home with the 3 most important points they learned, why these are important for them, and what they will do to put them into practice. In subsequent monitoring visits we have observed real changes in practice after the learning events. And that’s what it is all about – real learning shifts practice!

    To give you an idea of the type of learning that happens at the bi-annual learning events, I have attached a sample report. (tried to upload more, but when I click “add another file” I am taken to the top of this page…)

    • #4177
      Profile photo of Brendan
      Brendan
      @brendan-halloran
      United States

      Lucia,

      Thanks for sharing your experiences in Ethiopia. It sounds like the experiences you have had trying to facilitate learning with local partners are similar to what Abhijit described with respect to the learning work of the COPSASAH community. There is probably lots you could learn from each other!

      I’m really glad to hear that your learning and reflection is oriented towards supporting local partners to improve their practices. This is different from either learning and reflection for its own sake, or learning that only goes ‘upward’ to the grant-making institution.

      I proposed some ideas about learning to practice in this blog here (https://politicsgovernancedevelopment.wordpress.com/2015/05/28/enhancing-accountability-through-open-government-learning-about-and-leveraging-ogp/). Would be great to get your thoughts. My point (echoing yours above) is that it can’t just end with learning, but really needs to go from learning to practice – and better if that also facilitates coordination and co-strategizing for collective change by multiple actors. How much do your learning efforts with local partners facilitate more collective action across organizations?

      Brendan

      • #4337
        Profile photo of Lucia
        Lucia
        @lucia-nass
        Ethiopia

        Hi @brendan-halloran, thanks for the link to the OGP blog. A fascinating initiative. We are not formally stimulating collective action across partners, because it seems early days for this. I also believe more in local initiative for collective action than in external drive, so I will hesitate to push unless I hear things coming up locally. So far there are two initiatives that we are supporting – one is a group of partners that came together to learn more from each other; we have suggested they might look into the issues their various projects are involved with, and see if this offers scope for regional dialogue with sector officials. Another is a group of 6 executive directors of partner organisations – we facilitate their reflection about the future involvement of CSOs in SA: what role, who pays, etc.

  • #4147
    Profile photo of Riff
    Riff
    @rfullan
    Switzerland

    Dear all,
    I am Riff Fullan from Helvetas Swiss Intercooperation, an INGO working in around 30 countries in Africa, Asia, Latin America and Eastern Europe. I’m a bit late to this discussion, but I see that much of what I would have contributed has already been mentioned by various other participants.

    I would like to share my reactions to a couple of the threads that resonate with our way of approaching learning, the first being the connection with M&E, and the second being the role and importance of reflection to enable learning (which I will include in a separate post). On the positive side, here at Helvetas we believe that M&E and K&L are – or at least should be – intimately linked. One way we try to cultivate this linkage at the organisational level is to have the responsibility for supporting our overall approach to M&E within our K&L team. We want to promote the learning element of M&E as much as the accountability and steering elements.

    Obviously this is easier to talk about than to do, but we think we are making progress in a couple of ways. One is by trying to strengthen learning within what I would call the standard aspects of our M&E – for example logframes, which, like them or hate them, are likely to be with us for some time to come. Some of the disadvantages have already been mentioned, in terms of their potential to absorb a disproportionate amount of project staff energy, the orientation towards indicators that may have limited meaning, etc. For me, one of the biggest disadvantages, aside from the tendency to focus too much on outputs and not enough on outcomes and impact, is the equally strong tendency to drive us into linear ways of thinking.

    We are working to minimize these constraints by trying to ensure that creation of logframes is closely connected to the broader process of developing Theories of Change and related Impact Hypotheses. In a nutshell, the ToC should point directly to a credible IH for a project, from which the logframe should be directly developed. We believe this also makes it easier to create baselines that are meaningfully connected to our goals (especially related to outcomes and eventually impact), as well as being easier for ongoing monitoring to be linked to project IHs. At the end of the day, if our monitoring and evaluation is not helping us to reflect on our IH (to validate it or to suggest that it be changed), then we are not using M&E to help us learn more effectively.

    Another way we are trying to strengthen the learning element of our M&E is to think of multiple ways of doing it. As Lucia recently pointed out, it doesn't all have to be in the logframe. I would add that it also doesn't all have to be part of a single methodological approach to M&E. We can and should find other ways for our M&E to contribute to learning, for example during particular moments of the project cycle. We have done a handful of participatory impact assessments (PIAs) over the past couple of years that we think provide valuable windows into community perspectives on the projects being assessed, often providing local partners (governments, NGOs, companies) with insights they do not get through their ongoing interactions with those same communities. These insights can be fed directly into the next phase of the project, or a follow-up project, depending on which stage the project is at.

    The added value of the reports coming out of PIAs to share the knowledge and learning that is revealed is, in my opinion, quite limited, but that is an observation that can probably be made about most of the reporting one does. However, for those who have been directly involved in the PIA, including the community members whose opinions are sought, it can be transformative, and can certainly contribute to a sense of shared ownership in a given project.

    This is already a bit long, but just to bring things back to the point of this forum: I think grant makers and grant recipients should continue to explore how M&E can be more strongly linked to learning, not by getting rid of the mechanical aspects of M&E, but by streamlining them so that core accountability requirements are met (i.e. accountability to donors as well as accountability of donors to their respective constituencies) while space is created for more energy to be devoted to learning on the part of all project stakeholders.

  • #4152
    Profile photo of Madina
    Madina
    @madina-aliberdieva

    Greetings from TWISA project in Tajikistan!
    In our context, organizational learning has meant engaging in a social process in which staff and partners directly engaged in the project self-organized around a common objective and sought to explore and refine key concepts on project-related issues. For example, setting up a multi-sectoral experts' working group (WG) on improving service performance indicators for drinking water and sanitation services was a social process in which learning took place in a deductive manner. The WG sessions usually started with a presentation of the existing standards and mechanisms that defined service performance indicators. Through WG discussion it became apparent that while these existing norms and standards look great on paper, the reality may not always have reflected a similar state of affairs. WG members had to move from a 'roundtable discussion' approach to revising indicators to direct field participation and discussion with consumers and service providers. This reality check contributed to a revision of the indicators that was more realistic and grounded.

    It is also noteworthy that learning took place among the WG members themselves; they learnt from each other. We concluded that learning is a people-intensive process and has less to do with information and communication technology than we initially predicted. The project team had to enable and facilitate the learning environment, be flexible, and learn to adapt to group dynamics and be attuned to the context.

  • #4153
    Profile photo of Charlotte
    Charlotte
    @charlotteornemark
    USA (Sweden)

    Hi all,
    Thanks for all interesting contributions and thoughts, and for taking the time to take our collective understanding of this field further! For this week’s discussion, particular thanks to @madina-aliberdieva, @rfullan, @lucia-nass, @ricardowilson-grau, @gilbert, @dantedelosangeles, @jennyross, @arifhkhan, @anowarulhaq, @florencia, @vlipovsek, @bingvt, and @adaschsj for contributing.

    This week, there has been a rich discussion on how we use our management systems, or how we can change them, to make sure that we integrate learning into what we do. Why? Because we constantly have to challenge our hypotheses and pre-conceived mental images of what success might look like in the complex social change processes that reflect the way in which a sector or service is governed.

    This has led the debate to the use (and possibly under-use) of monitoring and evaluation — understood in its widest possible sense of ongoing interaction around actionable evidence — particularly in making sure that the power of learning is inclusive of those at the frontline and communities and citizens themselves. Central themes of contributions have been:

    • What and whose learning ‘counts’ in making sure our management systems are responsive and adaptive?

    • And what is our role as practitioners in bringing to scale the ways in which we tap into locally actionable evidence to align stakeholders’ incentives and shift the accountability paradigm closer to the ground?

    Gone are the days when experts could be parachuted into the field with technical solutions without taking into account the governance challenges of existing decision-making incentives, and how those incentives align (or not) with those of different groups of citizens.

    Knowledge cannot be divorced from contextually driven evidence if it is to lead to practical learning.

    Both research and practice point to the fact that iterative processes of learning, based on different types of monitoring and 'sense-making mechanisms' and combined with an investigative and open mind-set, are needed.

    Many have shared examples of how learning-oriented M&E can be combined with more ‘formal’ project management systems through mixed methods and complementary ways of sharing information – using images, blogs, and other means of interpretation and visualization techniques. But technology cannot in itself drive learning, just like static knowledge products may not inspire ‘investigative monitoring’ if they are too prescriptive and do not allow for experimentation and internalization of new ideas.

    So under what conditions does a focus on, and investment in, learning lead to better results? Are learning organisations and 'learning systems' (where both grantee and grant-maker prioritize learning) more effective than those for whom learning is not a priority? What difference does it make on the ground?

    These and other issues related to linking learning to impact will be the focus during this last week of the e-forum debate. We will seek to wrap up contributions by the 25th of June. A synthesis will be provided and fed into a Dissemination Note for wide circulation.

    We look forward to your contributions and your continued engagement!
    Charlotte

  • #4165
    Profile photo of Riff
    Riff
    @rfullan
    Switzerland

    Dear all,

    Hopefully I will also be able to contribute to the upcoming week’s dialogue, but to follow up on what has already been discussed, another idea that has been raised a number of times in this forum is that of reflection. For Helvetas, this is at the core of our idea of how learning happens. We are putting the finishing touches on our latest Knowledge and Learning strategy and reflection comes up in a variety of ways, including:

    • Learning Expeditions. These are aimed at taking up topics of increasing importance to our work in the field that may not otherwise get the level of attention they need for us to understand them more deeply, particularly how we can incorporate them into our programming. Current topics include migration, resilience, behaviour change (initially around hygiene and sanitation), and market systems development. We are at an early stage, but the idea is to combine our thematic experience with what we are doing on the ground, and where feasible to partner with one or more academic institutions to strengthen the research angle.
    • Learning through Storytelling. Here we are focusing on encouraging more use of storytelling, particularly using video, but also photos, audio and face-to-face storytelling, to inject more energy into reflection, to allow for the complexity of most of the contexts in which we work to be retained in ways that can be readily shared. We hope this kind of thing will eventually be seen as a standard complement to the documents that we create to fulfil our ongoing reporting requirements.
    • Working out loud. This is a new direction for us, inspired by the work of John Stepper and others (see workingoutloud.com). Our take on it is that if we can share more of what we are doing at early stages (the emphasis at the moment is more internal to the organisation than external, but both should be pursued), others will benefit from it in different ways. They will not only be better informed about what we are doing, but will gain insight into how we are doing it, what our thought processes are, and what kinds of methods, tools and approaches we use. Finally, this opens the door to greater networking as well as feedback from others that can improve our work before it solidifies too much. We can do this by using things like blogs to share early thinking, by sharing early drafts of documents we are working on, by engaging in discussion forums like this one, etc. Most of us are not used to working in this way, so it will be a challenge, but the potential to support greater learning is high.
    • Building reflection into PCM. This goes back to the link between M&E and learning, where we try to encourage project staff and partners to take opportunities during the project cycle that are already scheduled (e.g. mid-term reviews, end of phase evaluations, impact assessments) to really reflect on their experiences, to see how different stakeholders have perceived what has happened up to that point.

    To sum up, the idea is to pursue multiple avenues for enhancing learning in the organisation, as well as with partners, all of which are aimed at helping us to do more reflection, and especially to reflect in collaborative ways. Most of them require one thing that has also been mentioned already in this discussion: time. Creating space and time for reflection amongst all development stakeholders (and in various combinations) is at the same time one of the greatest challenges we face in promoting learning (which incorporates multiple perspectives) and one of the greatest ways in which we can do it.

    • #4178
      Profile photo of Brendan
      Brendan
      @brendan-halloran
      United States

      Thanks @rfullan for the great overview of the different ways in which Helvetas facilitates learning and reflection. You might be interested in this short video from some learning theorists/practitioners we have worked with about learning loops and understanding how learning turns into more realized value and impact: https://www.youtube.com/watch?v=qvighN3BDmI

      Regards,
      Brendan

      • #4240
        Profile photo of Riff
        Riff
        @rfullan
        Switzerland

        Thanks Brendan, I think the Wenger-Trayners' recent work is quite interesting in terms of describing a coherent and iterative process. Although our approach is less explicitly following a trajectory, I would say what we are trying to strengthen is the learning loops they refer to. These are crucial to ensure that learning is less transitory and less confined to individuals or small teams.

  • #4182
    Profile photo of Brendan
    Brendan
    @brendan-halloran
    United States

    Dear all,

    As we go into the final week of this forum, I’m amazed at the deep and rich experiences and ideas that participants have shared with us. Thank you all for your contributions!

    In the final week of the forum, we would like to ask for final input on the critical question of "So what?". In other words, how is learning being translated into improved practices and contributing to more meaningful impacts?

    The risk is, of course, that a new emphasis on learning could result in many isolated learning activities and practices that don't play a meaningful role in shaping organizational strategies and ways of working. For example, we can document learning and make time for reflection, but those are only meaningful if they help us improve the way we work. See a short reflection on learning to practice (in the context of the Open Government Partnership) here: https://politicsgovernancedevelopment.wordpress.com/2015/05/28/enhancing-accountability-through-open-government-learning-about-and-leveraging-ogp/

    A framework for thinking about how learning gets incorporated into organizational practice and creates value can be found here: https://www.youtube.com/watch?v=qvighN3BDmI

    Clearly, most of the contributions thus far in the forum have been grappling with this very central question of how to connect learning to improved practice and greater impact. We especially welcome contributions on the following questions:

    • How do organizations link their learning efforts to their activities and strategies, and how have these been improved by incorporating ongoing learning?

    • What examples of tangible improvements and impacts can participants share about translating learning into practice?

    • How do organizations understand and track the value and contribution of learning for their broader organizational success and impact?

    To inspire you, a poem on learning from a brilliant colleague:

    Can we learn how to learn?

    We gathered together for campaigns to change the power
    The poor and powerless must be at the top of this tower

    But when we started talking about just how to do it
    Damn, bother, alas we just could not get through it

    It's complex, it's context, and it's just too linear and technical
    Where’s the people and the solidarity in all this horizontal and vertical?

    Let's give up, it's hard, it's something you can't learn
    Let's rather just keep fighting and this jabber adjourn

    But the test of learning is not is it true, is it accurate or rightly defined
    It's rather does it help me understand how the state is undermined

    So I say let's make some frameworks, wonky or not
    They’ll give us hooks to hang our stories on lest they be forgot

    I want to know how you did it in your situation
    Not as a recipe, but as fodder for my rumination

    I won't do the same, it's not what's required
    But knowing your story helps me think about how my government is retired

    So ask not is it definitional, multi level or vertical
    But rather is it useful, transformational, or somewhat medicinal

    Many thanks for helping us learn more about learning!
    Brendan

  • #4200
    Profile photo of Victoria Vasilescu
    Victoria Vasilescu
    @victoria-vasilescu

    Dear all,

    Thank you for such an inspiring exchange of thoughts and experiences on the discussed issue of learning and its impact.

    In our case, the answer to "Are we really learning?" is definitely 'YES'. As a leading national think-tank with more than 10 years of experience and a team of experts from different professional backgrounds yet with a common educational grounding, namely economic studies in the country and abroad, learning and knowledge are at the heart of our daily activity. It is in our organizational mission to learn, understand and analyze, then share what we have learned with the wider public (especially with policy makers, Government representatives, other CSOs and international organizations) and to comment and make recommendations on a wide range of policy issues across different aspects of our country's economic development, in an attempt to influence the course of the policy agenda at the national level.

    The initiative that we have embarked on with the GPSA is also linked to what we learned while analyzing the national education system's performance and its weakening results at that time. Our educational system is currently going through a deep structural reform, and we believe that our involvement in the process, via the current initiative, will make an important contribution to the national goals set out in the 'Moldova Education 2020' Strategy, in particular to increasing transparency and citizen participation. In December 2013, our think-tank 'Expert-Grup' started implementing the "Empowered Citizens Enhancing Accountability of the Education Reform and Quality of Education in Moldova" initiative. We aim to increase social accountability through the inclusion of citizens in monitoring the impact of reforms and budget allocations to education.

    The main challenge for our organization in implementing the current initiative was that we did not have much hands-on experience in our country's education sector, apart from the knowledge gained through our analytical work; the concept of social accountability (SA) was also rather new to our country. Nonetheless, being aware of this constraint from the start, we made important efforts to close this gap. Apart from building a tight collaboration and partnership with our Ministry of Education from the beginning of the project, and establishing partnerships with regional CSOs with significant experience in community mobilization and in collaboration with local governance authorities, we set up a Consultation Board with representatives from the Education Committee of our Parliament, the Children's Rights Ombudsman, the Ministry of Education and certain national media outlets. Moreover, we decided in the programming phase that this particular initiative would be evaluated both internally and externally. While the internal evaluation is based on outcome and output indicators for each project component, we believe an independent external evaluation is necessary to identify findings regarding efficacy and impact on beneficiary groups. The evaluation exercise thus aims to determine how the project contributes to the implementation of reforms in the secondary education system in the Republic of Moldova.

    The usefulness of the external evaluation lies in the fact that the information gathered during the evaluation process serves as guidance for us in the next stages of the initiative and shows us whether we need to take any corrective measures to our originally planned actions. We use the same tactic with the SAcc tools that we have developed and apply in the beneficiary communities of our country. The results of the social accountability tools applied will serve to map out the situation of individual schools and will feed into the reform promoted by the Ministry of Education. After developing the tools, namely public hearings, community cards and independent budget analyses in the educational sector, and consulting on them with the key stakeholders in our educational system, we organized training-of-trainers (ToT) sessions for our regional CSO partners, and together with our five regional CSO partners we applied the tools during the first year of the initiative, then evaluated and disseminated the results. We have also developed guides and 'how-to' notes for our regional partners and any other interested party, and made them available online on the initiative's recently launched website: http://www.scoalamea.md. Considering the importance of evaluating the impact of our actions and the tools applied, we have planned to update the tools each year and make improvements based on the results of the bi-annual assessments that we conduct.

    Taking this opportunity we are particularly grateful to Mr. Scott Abrams and Ms. Maria Poli for their important contribution to our effort of updating the applied SA tools.

    From the project management perspective of our organization's initiatives, we follow the common stages of project implementation: development, implementation, and M&E at each stage, with K&L at the heart of all stages. By M&E we also mean evaluating the impact of our actions on our target groups and beneficiary communities.

  • #4205

    My dear friends,

    It is a pleasure for me to participate in this forum. I am also rather saddened that I am only doing so now. Brendan has come up with questions which are not easy to answer. That is, indeed, the big challenge of the moment. The moment one realizes that one has stopped learning, one also realizes that learning is key, learning is a pillar, and learning needs to be addressed.

    How are we embedding learning? At Concern Universal we have the practice of using Outcome Journals. An Outcome Journal can physically be any means used to track and log your everyday project-related activities: meetings, training sessions, phone calls, pretty much anything you find important to mention. Each member of the team should keep one Outcome Journal and send that information to the team's researcher on a monthly basis. The researcher, in turn, pulls that information together and structures it in a specific manner. The information produced is shared back with the team, but is also used to track activities and developments against the baseline. Elements of the collection structure include:
    (i) Project activities and stakeholder responses (demand- and supply-side sensitive);
    (ii) Description of changes in stakeholders;
    (iii) Description of changes in any of your social accountability processes (e.g. is there evidence that planning is becoming more effective, more responsive, more socially accountable? Is engagement increasing?);
    (iv) Other contributing factors, where we describe the role played by other factors (e.g. changes in government, political changes, economic changes) or other interventions;
    (v) Sources of evidence;
    (vi) Planned activities that did not happen and why;
    (vii) Expected changes that did not happen and why; and
    (viii) Lessons and proposed amendments to the Theory of Change.
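
    Purely as an illustration (the field names below are my own shorthand, not Concern Universal's actual template), the collection structure above could be modeled as a simple data record, with the researcher's monthly consolidation sketched as a merge step:

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical sketch of one team member's monthly Outcome Journal entry,
# mirroring the elements (i)-(viii) listed above. Names are illustrative.
@dataclass
class OutcomeJournalEntry:
    author: str
    month: str  # e.g. "2015-06"
    activities_and_responses: List[str] = field(default_factory=list)
    stakeholder_changes: List[str] = field(default_factory=list)
    process_changes: List[str] = field(default_factory=list)
    other_factors: List[str] = field(default_factory=list)
    evidence_sources: List[str] = field(default_factory=list)
    missed_activities: List[str] = field(default_factory=list)  # planned but not done, and why
    missed_changes: List[str] = field(default_factory=list)     # expected but not observed, and why
    lessons_and_toc_amendments: List[str] = field(default_factory=list)

def consolidate(entries: List[OutcomeJournalEntry]) -> Dict[str, List[str]]:
    """Pull team members' entries together by month, as the researcher might,
    collecting the proposed Theory of Change amendments for discussion."""
    merged: Dict[str, List[str]] = {}
    for entry in entries:
        merged.setdefault(entry.month, []).extend(entry.lessons_and_toc_amendments)
    return merged

# Usage sketch: one member logs a lesson; the researcher consolidates.
entry = OutcomeJournalEntry(author="Ana", month="2015-06")
entry.lessons_and_toc_amendments.append("Revisit ToC assumption on citizen uptake")
print(consolidate([entry]))
```

    The structuring step matters more than the tooling: whatever form the journal takes, a shared schema is what lets individual logs be rolled up against the baseline.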

    Another important role for the Outcome Journal is to enable the organization to test whether the assumptions made in the Theory of Change really hold. This works better when implementation is gradual and team members have the opportunity to look at what they have done, see whether it is really conducive to achieving the expected changes, and decide whether course corrections are required (including redefinition of the ToC).

    The answers along these dimensions are inherently subjective, but require supporting evidence. That is the ideal; it does not mean we have always been able to do as planned. The truth is, in our everyday working lives we have to balance learning against results, activities, getting things done, reporting and so on, which makes learning less of a priority because time is a scarce resource these days. It is hard to learn, my friends, even when a Lessons Document is standing right before your eyes. You will always read it later. It is no priority.

    Organizations and projects need organizational learning strategies and commitments. Why not? Leadership and management have the utmost responsibility to care for learning. Organizations should probably appoint a specific person among staff, or outsource someone, to act as a sort of "Learning Officer", a learning "fosterer" who can ensure (1) that learning processes are established within the organizational structure and operations, and (2) that these processes are functional and functioning.

    Technology can also play a role. Think of using a platform like Google Docs or Box to share a learning form (e.g. an Outcome Journal) with team members so that everyone can log their activities and thoughts. This is also why a platform like the K&L is key. It is also important to mention that creating a learning system and a learning organization requires some creativity; there is a lot of potential in using ITs (digital storytelling is another example).

    And how can grant-makers help here? The people with the cash are king, because cash is king; it is amazing how people listen to donors. Donors therefore have the responsibility to tease out and foster learning environments. Grant-makers are not giving away cheap money. They also have the responsibility to ensure that money is used strategically, produces results, is accounted for, and makes a change in the lives of less advantaged people. So grant-makers can and should ensure that proposals include team members whose function is wholly (or mostly) dedicated to learning. These people are also responsible, either together with a communications person or individually, for sharing that collective learning.

    Victoria Vasilescu is right: YES, we are learning. The questions ahead are: are we structuring the lessons we learn every day? And what do we do with them? It is just a thought.

  • #4241
    Profile photo of Riff
    Riff
    @rfullan
    Switzerland

    Dear all,

    I am also sorry not to have been more consistently present in this discussion, but I would like to at least partially respond to this week's topic. Sorry, this is a long one…

    From Brendan’s message:

    • How do organizations link their learning efforts to their activities and strategies, and how have these been improved by incorporating ongoing learning?

    I would again mention our efforts to ensure there is an explicit (or at least readily recognizable) Theory of Change and Impact Hypothesis in strategies and project documents, which do not come out of nowhere, but are built on past experience. They should also help to keep us focused on what we and our partners are trying to achieve at different monitoring and evaluation points.

    Another mechanism we use that is more directly aimed at incorporating learning is to require a Management Response to the various recommendations that come up in different analyses (evaluations, impact assessments, etc.). This adds a bit of a burden on the management side, but it at least forces a dialogue at that level (actually at different levels, because it could be in a given country or for the organisation as a whole). The management response is also required to be action oriented, so saying 'this is a good idea' is not enough. There has to be a commitment or other pointer to how the recommendation will be taken up (or not).

    As I said, there is an overhead to this, but it does direct needed energy to making the link between what comes out of various reflection exercises and what goes into subsequent phases or strategies or whatever.

    • What examples of tangible improvements and impacts can participants share about translating learning into practice?

    This is a difficult question for me to answer, but I might share one example of something I think has worked well in terms of learning across country programs. There is a methodology and related tool called Water Use Master Plan, which was developed by our Nepal program, based on long experience in the water sector, and incorporating a multistakeholder approach to water governance. Not only experience, but a lot of investment to develop, refine and PROMOTE the tool, which led to it being taken up and adapted to local contexts in our Pakistan and Ethiopia programs, as well as in a larger regional program in South Asia.

    It's difficult to say exactly how all of this happened, but key factors were the recognition that a powerful participatory approach had been developed in a specific context, the investment made to make the approach easier to use through the creation of a 17-step manual, and various informal contacts among staff working in different countries within the organisation. Another element is luck, or maybe it would sound better to say serendipity ;-)

    With the right circumstances and persistence, such things can and do happen.

    • How do organizations understand and track the value and contribution of learning for their broader organizational success and impact?

    I really cannot adequately answer this question, but I believe storytelling is one way to partially do this. I can think of a story that was shared (not by video, but verbally in this case) about a rainwater harvesting (RWH) technology, again in our Nepal program (our biggest and one of our most long-established programs, running for over 50 years).

    In this case, a Helvetas Ethiopia project leader was at a workshop where a Nepali colleague was also participating. They hadn't known each other before, but they got talking, and the Ethiopian colleague described the difficulties his project was having in coming up with an affordable and practical solution for rural water supply. The conversation turned to the RWH technology used in Nepal, and he was very excited to hear about something that had not been tried in Ethiopia but that sounded quite feasible.

    When they tried it in Ethiopia, there were some initial difficulties with availability and affordability of materials, but they ended up adapting the technology to those conditions and found a viable solution. Again, there is an element of chance to this sharing of knowledge and learning, but without well-developed competence and networking, it would never have happened.

    I suppose part of what I am trying to say is that those of us with some formal responsibility for strengthening knowledge sharing and learning tend to think of enabling processes that are well defined and follow a kind of cumulative path, but we should also try to create conditions where such things are driven by the people who have and need the knowledge (a large part of which involves networking, in my understanding). And of course we should encourage and support them to share their stories about such experiences :-)

  • #4242
    Profile photo of Florencia
    Florencia
    @florencia
    Argentina/Brazil

    Today's Devex article by Craig Valters seems relevant to the conversation here (https://www.devex.com/news/3-big-problems-with-how-we-think-about-results-and-development-86419).
    It adds another player/approach to the picture of a funding ecosystem in which there are many windows to move towards grant-making practices that are more conducive to grantee learning: DFID’s Smart Rules. https://www.gov.uk/government/publications/dfid-smart-rules-better-programme-delivery (also see this ICAI evaluation http://icai.independent.gov.uk/reports/dfids-approach-to-delivering-impact/)

    It provoked me to challenge some of the points made above regarding the different, at times unintegrated, functions of monitoring, evaluation and learning. Perhaps the challenge (and the solution) is to tackle our disjointed approach of telling ourselves different stories for course-correction, external dissemination, and compliance. Maria and I found this a paradoxical, problematic feature of GPSA applications https://www.thegpsa.org/sa/Data/gpsa/files/field/documents/gpsa_note_5-adaptive_learning.pdf. So special alert for people working on GPSA round 3 applications ;-)

    If so, is it a necessary condition, however, that, as Craig argues for UK Aid, we come to terms with the notion that we need to take the risk of justifying budgets in more realistic ways? “The public don’t want patronizing accounts of aid, but an honest appraisal of what’s being achieved and for what purpose. If the demand for results is genuine, we need more genuine attempts to answer it.” I think this applies to many other funders and patrons in the field.

    Scott’s post reminds me that for GPSA grantees and applicants, when we talk about learning we are also referring to what the GPSA calls capacity building. Again, check out Craig’s paragraph on supporting the reformers: not a silo, but an integral part of innovative reform.

    Best
    Florencia

  • #4245
    Profile photo of Florencia
    Florencia
    @florencia
    Argentina/Brazil

    The World Bank’s Independent Evaluation Group just published an evaluation of learning and results in the WB’s operations – another resource for learning to learn and, probably, for creating greater space for the kind of learning that improves our own practices http://ieg.worldbank.org/evaluations/learning-results-wb-operations2

    IEG’s Director General published a blog post with some of the key takeaways. A sneak peek:

    “The evaluation came up with some important insights: the Bank is a great producer of knowledge, but less so a consumer – staff don’t always draw on research or knowledge from outside the Bank… To meet the needs of its clients the Bank needs to be adaptive, but it is not always as flexible as it needs to be. For example, we identified resistance to early restructuring of poorly performing projects, and we note that piloting problem-driven solutions during the implementation phase can help to fully fit the project to the local context.
    In discussing the Bank’s focus on results, the report provides insights into what works, including the benefits of a close relationship with the client, Bank staff experience, and the relationship between tacit knowledge and project design quality. However, it finds the evidence the Bank uses to evaluate projects at completion is often insufficient to demonstrate that the results observed are attributable to the project. It notes that the Bank is beginning to address several shortcomings by taking steps, for example, to make mandatory reference to lessons learned in decision meetings, to collect baseline evidence early, and to rationalize sector indicators.
    The report’s key recommendation is for the Bank to develop an updated strategy for learning and knowledge. And, in response to the key findings, the Bank should make optimal use of informal learning and tacit knowledge; adjust institutional incentives to promote learning and development outcomes; balance the focus on global and local knowledge; and, promote adaptiveness.”

    You can check out the blog post here http://ieg.worldbank.org/blog/learning-to-learn
    Best,

The topic ‘Are we Really Learning? Making Grant-Making Practices more Conducive to Grantee Learning’ is closed to new replies.


2016 GPSA Knowledge Portal. All rights reserved | Terms and Conditions