Inside the TA Room: The Human Side of Implementation Research


Over the past year, teams in the uBoraBora portfolio have launched a range of studies to uncover the ‘why’ and ‘how’ behind their foundational learning programs. Their findings have sparked conversations internally and have been shared externally at forums like the Comparative and International Education Society (CIES) conference, influencing how practitioners think about evidence.

But in implementation research, the process is just as important as the outcome. In this blog, we take you behind the scenes to share the secret ingredients that make uBoraBora work.

Today’s focus: our approach to Technical Assistance.

Technical Assistance (TA) is a hands-on partnership built on trust. It goes beyond reviewing surveys from a distance or analyzing data in isolation: it is a relational approach shaped by each team’s context, because every organization brings its own interests and experiences.

Our support is tailored to offer help when and where it’s needed most. The role of TA is not to take over the research but to be a thought partner and a helping hand, building capacity so that teams can confidently lead the work themselves, now and in future research.

So, what does this look like in practice? Let’s step inside the TA room.

Starting with the 'Why': From Ideas to Action

Our support often starts at the ideation phase, which is usually the most challenging part.

Teams need to pinpoint the exact problem they want to solve and the best solution to test, all while balancing logistical, budgetary, and time constraints.

Our first step is to help teams systematically organize their existing data and assumptions. The questions we seek to explore here are: what do we already know, and what’s missing?

What this looks like:

  • Sometimes, this looks like a collaborative session filling a Miro board with everything that isn’t working in a program. Together, we investigate and validate these challenges to ensure the final research questions are targeted and grounded in reality. 

  • For other teams, we’ve conducted a Design Sprint, rapidly testing potential solutions in short, iterative cycles. This helps teams test their underlying assumptions and choose a promising intervention to explore at scale.

Assumption building: FHi360’s Miro board (https://miro.com/app/board/uXjVK4-wgNo=/)

Design sprint: Session 1, Research Design Sprint

Designing the 'How': Tailored Research for Real-World Problems

Once a team narrows down the problem and a potential solution, it's time to design the study. Implementation research doesn’t have a one-size-fits-all blueprint. It draws on a diverse range of methods—from surveys and focus groups to cost-effectiveness analysis—to tackle complex problems.

Here, our TA focuses on helping teams choose the right research design and tools to collect the data they need, building on the gaps identified during the ideation phase.

Making Sense of the Mess: Turning Data into Insights

When it comes to analysis, implementation research often involves messy data. We know that when organizations collect routine monitoring data, the primary goal is to improve service delivery, not necessarily to conduct research studies. Limited resources, shifts in program delivery, and varied staff capacity can all affect data quality. The result is often implementation data stored in different formats, incomplete indicators, and stacks of interview transcripts.

Our role is to help teams dig into this messy data and make sense of it. We work together to build an analysis plan step-by-step, ensuring the evidence that emerges is relevant, informative, and useful for implementers.
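To make this concrete, here is a minimal, hypothetical sketch in Python (using pandas) of the kind of first step an analysis plan might take: pulling monitoring data that arrives in different shapes into one shared format, and flagging, rather than hiding, its gaps. The source names, columns, and values below are illustrative assumptions, not uBoraBora data.

# A minimal, hypothetical sketch (not uBoraBora's actual pipeline) of
# harmonizing routine monitoring data that arrives in different shapes.
# All source names, columns, and values are illustrative assumptions.

import pandas as pd

# Two exports of the same indicator, with inconsistent column names and
# a gap, standing in for data collected under different program conditions.
site_a = pd.DataFrame({"School": ["A1", "A2"], "Attend%": [0.91, None]})
site_b = pd.DataFrame({"school": ["B1"], "attendance": [0.78]})

# Map each source onto one shared schema and keep provenance, so any
# gap can be traced back to where (and how) the data were collected.
shared = pd.concat(
    [
        site_a.rename(columns={"School": "school_id", "Attend%": "attendance_rate"})
              .assign(source="site_a"),
        site_b.rename(columns={"school": "school_id", "attendance": "attendance_rate"})
              .assign(source="site_b"),
    ],
    ignore_index=True,
)

# Flag, rather than silently drop, incomplete indicators: the pattern of
# missingness is itself evidence worth discussing with implementers.
missing_share = shared["attendance_rate"].isna().mean()
print(shared)
print(f"Share of records missing attendance_rate: {missing_share:.1%}")

Keeping a source column and reporting missingness up front reflects the same principle as the rest of our TA: make the state of the evidence visible, so that implementers, not the tooling, decide what to do about it.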

Beyond the Report: Making Insights Useful

Implementation research holds powerful answers for improving education, but too often, those answers are buried in dense documents, inaccessible to the implementers, funders, and school leaders who need them most. This creates a gap between evidence and practice.

In collaboration with the Brink team, we offer the Evidence Studio to change this. We synthesize and package research in ways that are practical and engaging. This includes (but is not limited to):

  • Short video stories that highlight a grantee's journey and outcomes.

  • "How-To" Guides that break down the steps for applying evidence in practice.

  • A podcast series featuring conversations about how evidence informs action.

As our grantees finalize their studies, we look forward to sharing these products widely.

More Than Just Methods: The Power of Partnership

Beyond the technical work, our most important role is to be a continuous thought partner. Research is hard, and roadblocks are inevitable. We’ve set up different ways to help teams get through challenges and celebrate the small wins along the way.

  • We hold regular Milestone Reflections to talk through bottlenecks in the research process, offering a gentle nudge and tangible action points to keep the work moving forward.

  • With Lightning Talks, we connect grantees with experts in the field or with each other to exchange ideas. This cross-pollination is especially useful for brainstorming creative solutions to identified problems.

  • Sometimes, challenges are internal. A research team and an implementation team might have conflicting priorities. We can act as a neutral facilitator, ensuring all perspectives are heard and reflected in the research design.


Meet the team behind Technical Assistance (TA) at uBoraBora

Now that you know the ins and outs of the TA room, we want to introduce the people enabling this work. The uBoraBora technical team from Laterite, a research firm rooted in Africa, includes three core members who work hand-in-hand with the Brink team to ensure our grantees get the right support at the right time.

  • Oliver Budd, MSc, is our project lead based in Kenya. He has led a portfolio of impact evaluations and quantitative education research projects, working with organizations like Rising Academies, CcHub, and the Mastercard Foundation. For uBoraBora, Oliver led a landscape analysis of implementation research and foundational learning, conducting over 30 key informant interviews. His technical understanding of full-cycle research projects has been crucial for allocating time and resources effectively. Oliver supports Meerkat Learning and Justice Rising.

  • Ilse Peeters Salazar, MSc, is our research associate based in the Netherlands. An economist by training, she has a special interest in quantitative research and education. She has led research design and analysis across a range of education research projects in East Africa. For uBoraBora, her analytical expertise has played a key role in supporting grantees from the design and pre-analysis plan all the way to delivering and presenting their findings. She also co-led the project’s landscape analysis. Ilse supports Building Tomorrow, Impact Network, and Rising Academies.

  • Ayumi Uchiyama, MSPH, is our analyst based in Sierra Leone. Trained in public health, where implementation science has deep roots, she brings experience conducting implementation research in LMICs and analyzing large-scale survey data. For uBoraBora, she hosted an under-the-hood workshop on research ethics and developed an internal guideline that helped grantees navigate ethics questions, from informed consent to institutional review board (IRB) applications. Ayumi supports VVOB and FHi360.

Each grantee is typically supported by one member of the Laterite team and a member of the uBoraBora team, who manages the project and ensures everything stays on track.


Ultimately, our approach is built on a simple belief:

The most effective implementation research happens when it is led by those closest to the work. By acting as thought partners, facilitators, and supportive critics, our goal is to empower organizations not just to find answers, but to build a lasting capacity for learning and evidence-based adaptation. The insights belong to the teams; we’re just there to help uncover them.

Ayumi Uchiyama

https://www.linkedin.com/in/ayumi-uchi/