About Inbank
Hello from Inbank! If you’ve ever bought something in three instalments or financed your gadget purchase online or in a shop, there’s a good chance you’ve used something we built.
We’re not here to be another bank. At Inbank, we’re a modular finance platform that helps businesses grow and customers pay with ease. We believe banking should feel invisible, an effortless layer that powers every great digital experience. Today, our financing rails are woven into the flow of 6,000+ leading retailers, giving around half a million customers a one-tap way to pay while turbo-charging merchants’ growth.
There are already 440+ of us across Estonia, Latvia, Lithuania, Poland, and Czechia, and we’re continuing to grow as we help thousands of retailers expand their business and reach millions of shoppers.
Due to our continued growth, we are looking for an experienced Senior Data Platform Engineer to join our Data Warehouse team.
As a Senior Data Platform Engineer, you will be a key individual contributor responsible for designing, building, and operating Inbank’s core data warehouse platform. You will work hands-on with our modern cloud data stack, contribute to architectural decisions, and ensure the platform remains scalable, reliable, and cost-efficient. You will collaborate closely with other engineers, analytics stakeholders, and platform teams, operating with a high level of ownership, technical maturity, and a strong sense of accountability for the solutions you deliver.
What will you be doing?
- Own the end-to-end technical delivery of data pipelines and transformations within the data warehouse domain, building and operating a cloud-native analytics platform on AWS, with Snowflake as the core data warehouse.
- Contribute hands-on across the stack, including:
  - Data ingestion from operational systems into Snowflake
  - ELT patterns using dbt for transformations, testing, and documentation
  - Airflow for orchestration, dependency management, and error handling
  - Python for ingestion logic, transformations, and operational tooling
  - Exploring and applying GenAI capabilities (e.g. LLM-powered transformations, data quality automation, metadata enrichment, or analytics enablement) where they meaningfully improve product development and data workflows
- Implement and follow architecture standards and best practices for performance, reliability, and cost-efficient Snowflake usage.
- Work closely with the Tech Lead and team to:
  - Review and improve data models and pipelines
  - Raise the bar for data quality, testing, and maintainability
  - Address technical debt and support platform evolution
- Partner with business and analytics stakeholders to translate requirements into well-modelled, tested, and governed datasets, balancing speed, cost, and accuracy.
- Champion best practices in data modelling, pipeline reliability, testing, and governance, ensuring the data warehouse delivers trusted, audit-ready data.
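To give a flavour of the kind of work involved: ingestion into a warehouse like Snowflake often lands records in a staging table and then merges them idempotently into the target, so reruns never duplicate data. A minimal sketch of such a MERGE builder in plain Python (all table and column names are hypothetical, not Inbank's actual schema or code):

```python
def build_merge_sql(table: str, key_cols: list[str], cols: list[str]) -> str:
    """Render an idempotent Snowflake MERGE from a staging table into a target.

    Illustrative only: assumes the staging table is named "<table>_staging"
    and that key_cols is a subset of cols.
    """
    # Join condition on the business keys
    on = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    # Update every non-key column when the row already exists
    updates = ", ".join(f"t.{c} = s.{c}" for c in cols if c not in key_cols)
    insert_cols = ", ".join(cols)
    insert_vals = ", ".join(f"s.{c}" for c in cols)
    return (
        f"MERGE INTO {table} AS t "
        f"USING {table}_staging AS s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {updates} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals})"
    )
```

In practice a statement like this would be one task in an Airflow DAG, with dbt handling the downstream modelling and tests; the point is that each load is safe to retry, which keeps the operational load low and predictable.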
What success looks like in this role:
Within your first 6–12 months, success means that:
- The data pipelines and models you own are stable, reliable, and well-documented.
- Key datasets are accurate, well-tested, and trusted by stakeholders.
- Data quality issues are identified early and resolved systematically, rather than through manual workarounds.
- Operational load is kept low and predictable, with pipelines and models designed to be stable, observable, and easy to support.
- The team views you as a senior, dependable engineer who takes ownership and helps raise the overall quality of the data platform.