We are establishing a brand-new Data Analytics team for a global retail customer.
In essence, the client is leveraging the Azure PaaS platform and all kinds of business data to build an advanced analytics platform aimed at delivering better insights and applications to the business.
The platform is continuously being enhanced to support CI/CD and a validated-learning environment for data science, machine learning and AI capabilities, covering customer-facing areas (digital omni-channel interaction and commerce, commerce relevance, personalisation, loyalty and marketing) as well as non-customer-facing areas (assortment optimization, supply chain optimization, external parties and IoT).
We will be working on end-to-end functionality, including architecture, data preparation, processing and consumption by downstream systems.
As a BI/ETL Developer you'll work alongside data architects to take data through its full lifecycle: acquisition, exploration, cleaning, integration, analysis, interpretation and visualization. You will build the pipelines for data processing, data visualization and analytics products, including automated services and APIs.
You seek to grow your expertise in the infrastructures, tools, applications and stages of advanced analytics workflows. You are inventive and passionate about streamlining and automating data acquisition, and you possess a highly structured approach to problem solving.
You will be the go-to person for end-to-end data handling, management and analytics processes.
- Connect to Oracle and other sources through an ETL tool such as Qlik Replicate
- Create and schedule data flows in tools such as Qlik Replicate
- Create reusable code for migrating similar objects through tools such as Qlik Replicate
- Structure data into a scalable and easily understood architecture
- Work in a multi-disciplined team where you'll turn data discoveries and ideas into models and insights. You'll find ways to leverage the data and the models to create and improve products for our customers, in lean development cycles.
- Implement and build methodologies, and understand how to scale them together with the business
- Maintain good, current and demonstrable knowledge of adjacent application and market developments, both for inspiration and for benchmarking concepts.
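To make the "reusable code for migrating similar objects" responsibility concrete, here is a minimal sketch of a parametrized table-migration helper. Qlik Replicate itself is configured through its own UI/API rather than Python, so this example uses the standard-library sqlite3 module as a stand-in for the Oracle source and Azure target; all table and column names are illustrative.

```python
import sqlite3

def migrate_table(source_conn, target_conn, table, batch_size=500):
    """Copy all rows of `table` from source to target in batches.

    Reusable across similarly shaped objects: the column list is
    discovered at runtime, so the same function serves any table
    that exists in both databases.
    """
    # Discover the column names of the source table (sqlite3 PRAGMA;
    # an Oracle source would query ALL_TAB_COLUMNS instead).
    cols = [row[1] for row in source_conn.execute(f"PRAGMA table_info({table})")]
    placeholders = ", ".join("?" for _ in cols)
    insert_sql = f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})"

    cursor = source_conn.execute(f"SELECT {', '.join(cols)} FROM {table}")
    while True:
        batch = cursor.fetchmany(batch_size)  # stream in batches, not all at once
        if not batch:
            break
        target_conn.executemany(insert_sql, batch)
    target_conn.commit()

# Demo: in-memory source and target standing in for the real endpoints.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE products (id INTEGER, name TEXT)")
src.executemany("INSERT INTO products VALUES (?, ?)", [(1, "tea"), (2, "rice")])
dst.execute("CREATE TABLE products (id INTEGER, name TEXT)")

migrate_table(src, dst, "products")
print(dst.execute("SELECT COUNT(*) FROM products").fetchone()[0])  # prints 2
```

Batched reads via `fetchmany` keep memory bounded on large tables, which is the same throughput consideration the role calls out for source-to-target tuning.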
- MSc in a computational field or another relevant area
- 10+ years industrial experience in the domain of large-scale data management, visualization and analytics
- Hands-on experience with Qlik Replicate (Attunity)
- Expertise in advanced data modelling
- Experience with Microsoft data management tools and the Azure platform environment
- Curious, proactive fast learner, able to quickly pick up new areas
- Experience with agile methodologies
- Excellent communication skills
- Can-do approach
Who are we looking for?
- Knowledge of ETL/ELT for data migration.
- Knowledge of connecting to Oracle and other sources through an ETL tool such as Qlik Replicate (Attunity).
- Knows how to find the maximum throughput for a given source-to-target combination.
- Knows how to create and schedule data flows in tools such as Qlik Replicate.
- Knows how to create reusable code for migrating similar objects through tools such as Qlik Replicate.
- Knows the restartability options of ETL tools.
- Can write simple-to-medium-complexity SQL queries.
- Knows how to create reusable migration code using Python.
- Has worked on real-world data migration scenarios across multiple sources and multiple targets.
- Proven hands-on business intelligence development or data engineering experience;
- Extensive ETL experience;
- Solid experience with data modelling, data warehouse design and data lake concepts and practices;
- Exposure working in a Microsoft Azure Data Platform environment;
- Working on cloud-based big data solutions using Hadoop/Spark;
- SSAS cube development;
- Enterprise BI reporting - Power BI;
- Azure DevOps - CI/CD.
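As a gauge for the "simple to medium SQL queries" requirement, the sketch below runs a typical warehouse-reporting query (fact-to-dimension join with aggregation and a HAVING filter). It uses the standard-library sqlite3 module so it is self-contained; the star-schema table and column names are purely illustrative.

```python
import sqlite3

# Build a tiny star schema in memory: one dimension, one fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_store (store_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE fact_sales (store_id INTEGER, amount REAL);
    INSERT INTO dim_store VALUES (1, 'EU'), (2, 'US');
    INSERT INTO fact_sales VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# Medium-complexity query: join, group, filter on the aggregate, sort.
query = """
    SELECT d.region, SUM(f.amount) AS total_sales
    FROM fact_sales AS f
    JOIN dim_store AS d ON d.store_id = f.store_id
    GROUP BY d.region
    HAVING SUM(f.amount) > 5
    ORDER BY total_sales DESC
"""
for region, total in conn.execute(query):
    print(region, total)  # prints: EU 15.0, then US 7.5
```

The same join-aggregate pattern underpins the SSAS cube and Power BI reporting work listed above, just expressed directly in SQL.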