dLocal enables the biggest companies in the world to collect payments in 40 emerging-market countries. Global brands rely on us to increase conversion rates and simplify payment expansion. As both a payments processor and a merchant of record where we operate, we make it possible for our merchants to make inroads into the world’s fastest-growing emerging markets.
By joining us you will be part of an amazing global team that makes it all happen, in a flexible, remote-first, dynamic culture with travel, health, and learning benefits, among others. Being a part of dLocal means working with 1000+ teammates from 30+ different nationalities and developing an international career that impacts millions of people’s daily lives. We are builders, we never run from a challenge, we are customer-centric, and if this sounds like you, we know you will thrive on our team.
As a Data Engineer, Technical Referent, you'll be a strategic professional shaping the foundation of our data platform. You’ll design and evolve scalable infrastructure, enable data governance at scale, and ensure our data assets are clean, reliable, and accessible. You will be a go-to expert, mentoring other engineers and influencing architectural decisions across the company.
What will I be doing?
* Architect and evolve scalable infrastructure to ingest, process, and serve large volumes of data efficiently
* Lead improvements to existing frameworks and pipelines to ensure performance, reliability, and cost-efficiency
* Establish and maintain robust data governance practices that empower cross-functional teams to access and trust data
* Transform raw datasets into clean, usable formats for analytics, modeling, and reporting
* Investigate and resolve complex data issues, ensuring data accuracy and system resilience
* Maintain high standards for code quality, testing, and documentation, with a strong focus on reproducibility and observability
* Stay current with industry trends and emerging technologies to continuously raise the bar on our engineering practices
What skills do I need?
* Bachelor’s degree in Computer Engineering, Data Engineering, or a related technical field
* Proven experience in data engineering or backend software development, ideally in cloud-native environments
* Deep expertise in Python and SQL
* Strong experience with distributed data processing frameworks such as Apache Spark
* Solid understanding of cloud platforms (AWS and GCP)
* Strong analytical thinking and problem-solving skills
* Able to work autonomously and collaboratively, balancing hands-on work with technical leadership
Nice to have
* Experience designing and maintaining DAGs with Apache Airflow or similar orchestration tools
* Familiarity with modern data formats and table formats (e.g., Parquet, Delta Lake, Iceberg)
* Master’s degree in a relevant field
* Prior experience mentoring engineers and influencing architectural decisions at scale