Senior Analytics Engineer
NALA
👋 About Us
NALA is building Payments for the Next Billion. Faster, smarter, and fairer transfers for everyone. Since 2022, we've grown our business 120x, expanded the team from 9 to 150+, raised $50M+ from top-tier investors, and been named to the Forbes Fintech 50 in 2025.
We operate two core products:
- NALA, our consumer app, makes cross-border payments cheaper, faster and more reliable for the global diaspora, allowing users to send money from the UK, US and EU to Africa and Asia.
- Rafiki, our B2B payments infrastructure, powers global payments for businesses.
Our team includes alumni from Wise, Stripe, Monzo, Revolut, and Cash App — operators who've scaled world-class products. We act with urgency, think deeply, and always put our customers first.
At NALA, this isn’t just a job. It’s ownership, impact, and the chance to change global payments forever.
Join us in building Payments for the Next Billion.
🙌 Your Mission
As Senior Analytics Engineer, you'll own, uplift and maintain NALA's data transformation layer — the foundation that all reporting, governed metrics, self-serve analytics and AI-powered capabilities depend on. Your work will add semantic richness, structure and governance to NALA's data, ensuring every model is documented, tested and described in a way that both humans and AI agents can interpret and trust. Agentic analytics is arriving fast, and this role exists to help ensure NALA's data foundation is ready for it.
🎯 Your Responsibilities in this Role
- Own the transformation layer (dbt + Snowflake) — refactoring, enforcing best practices, and evolving our data stack to a best-in-class standard
- Take ownership of streaming data pipelines alongside batch transformation — ensuring real-time and near-real-time data flows are reliable, cost-efficient and well-integrated into the broader data architecture
- Establish and enforce coding & agentic coding standards, systematic testing and documentation as CI-enforced defaults across all data models
- Optimise warehouse performance and cost efficiency, identifying and resolving the query patterns and materialisation choices driving unnecessary spend
- Build the foundation for AI-powered self-serve by ensuring models carry the semantic richness and documentation that agents need to return reliable answers
- Scope and resolve orchestration decisions (dbt Cloud vs Dagster) and own the infrastructure roadmap for the transformation layer
- Support and mentor analysts on analytics engineering best practices, raising the engineering standard across the team
🔥 Must-have requirements
- 4+ years of hands-on experience with dbt (ideally dbt Fusion) — building, refactoring and maintaining production-grade transformation layers
- Strong SQL, Python and data modelling skills with a clear understanding of warehousing and modern data architecture
- Snowflake or Databricks experience including query performance tuning and cost optimisation
- Deep proficiency with AI-assisted development workflows (Cursor, Windsurf, Claude Code) to force-multiply engineering output and accelerate delivery
- Track record of implementing testing, CI/CD, documentation standards and PR review workflows in dbt projects
- Comfortable owning an infrastructure roadmap — can assess the current state, propose a plan and execute without being directed step-by-step
💪 Nice to have requirements
- Semantic layer experience (Cube, dbt Semantic Layer) and understanding of how governed metric definitions sit on top of a transformation layer
- Familiarity with orchestration tools (Dagster, Airflow, dbt Cloud, etc.)
- Experience with Hex or similar modern BI/notebook platforms
- Experience in fintech, payments or regulated environments where data accuracy and governance carry real business consequences
- Familiarity with experimentation frameworks and product analytics
✅ Success in the role looks like
3-Month Metrics
- Full ownership of the transformation layer and warehouse, with a clear understanding of the current architecture, cost drivers and priorities
6-Month Metrics
- Full ownership of warehouse coding standards, data architecture and infrastructure
- Measurable improvement in the cost and efficiency of supplying data to the business
- A clear data infrastructure roadmap delivered for the following 6 months
➡️ Interview Process
You will first need to submit your application through our ATS, Workable. There is no need to submit a cover letter.
If successful, you will progress to our interview process, which has 4 stages:
- [30 mins] Interview with the Talent Team
  - We want to understand your experience and motivations.
- [1 hr] Live Assessment with the Data Team
  - A practical exercise based on a realistic task, completed and discussed with a member of the data team.
- [1 hr] Interview with the Hiring Manager
  - A deeper dive into your experience, technical depth and approach to the role.
- [45 mins] Senior Leadership Interview
  - A final conversation with a member of our senior leadership team to discuss motivations and ask your own questions.
⭐️ Benefits
- 27 Days Off Plus UK Bank Holidays: Take the time to decompress. Working at a startup is hard!
- Birthday Leave: Celebrate your special day with a bonus day off to use at any point during your birthday month.
- Enhanced Parental Leave: We offer 16 weeks of full pay for the primary caregiver and 4 weeks of full pay for the secondary caregiver (after a 6-month probationary period).
- Enhanced Pension: Salary sacrifice pension scheme via Penfold giving you flexibility and control on how you save for your future!
- Global Workspace: Get access to WeWork locations worldwide.
- Learning Budget: Fuel your growth with $1000 annually for learning and development.
- Sarabi: Themed snacks and Friday lunch focused on building great working relationships with the team.
- Monthly Socials: Join fun social events every month for great times.
- Free Coffee: Enjoy barista-style coffee at your fingertips.