As a Data DevOps Engineer at Accenture, you will have the opportunity to work on international projects focused on the growth and development of a leading European asset management company. 

You will join the Data Platform team, responsible for designing, implementing and running cloud-native data solutions. You will either join the core team building the data platform, maintain and enhance the data science studio, or join a product squad (such as ESG, Data Referentials, Trading, Equities, Front Office or Operations) as part of the Data Transformation. 

Apply for this position if you value diversity, want to influence the success of ambitious projects, and want to work in an international environment of highly qualified specialists. 

What we can offer you: 

  • Flexible work conditions (B2B or employment contract)  
  • Broad range of available training courses covering both soft and technical skills, sponsored access to e-learning platforms and certification paths
  • Access to expert experience from around the globe managed through advanced knowledge management tools
  • Individually assigned mentor and a defined yet flexible career path
  • Opportunity to develop professionally in an international business and technical environment
  • Private medical care, life insurance and employee share purchase program
  • Access to Multibenefit platform (a selection of benefits that include products, services, Multisport card and others)
  • Paid employee referral program
  • Hybrid work, with the option to work both from a modern, centrally located office and remotely
  • Being a part of a socially responsible company engaged in ESG transformations 



Key technologies: Azure, PowerShell, Shell/Bash

Administer data pipelines, i.e.: 

  • Administer Databricks clusters (including sizing, policies and security) 
  • Understand our data pipelines based on Data Factory or Databricks Workflows 
  • Enhance our monitoring capabilities 
  • Understand how changes impact our FinOps costs 
  • Build data distribution capabilities (including APIs and Delta Sharing) 
  • Monitor our API performance, which relies on a traditional SQL database backend 
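Databricks cluster policies of the kind mentioned above are plain JSON documents that constrain what clusters users can create. A minimal sketch in Python (the policy attributes are illustrative, and the `violations` helper is hypothetical, not part of any Databricks SDK) shows how such a policy can keep FinOps costs in check:

```python
# Illustrative Databricks cluster policy: pins autotermination and caps
# worker count so ad-hoc clusters cannot run up FinOps costs.
CLUSTER_POLICY = {
    "autotermination_minutes": {"type": "fixed", "value": 30},
    "num_workers": {"type": "range", "maxValue": 8},
    "spark_version": {"type": "allowlist", "values": ["13.3.x-scala2.12"]},
}

def violations(cluster_spec: dict, policy: dict) -> list[str]:
    """Hypothetical helper: report which policy rules a cluster spec breaks."""
    problems = []
    for key, rule in policy.items():
        value = cluster_spec.get(key)
        if rule["type"] == "fixed" and value != rule["value"]:
            problems.append(f"{key} must be {rule['value']}, got {value}")
        elif rule["type"] == "range" and value is not None and value > rule["maxValue"]:
            problems.append(f"{key} exceeds max {rule['maxValue']}: {value}")
        elif rule["type"] == "allowlist" and value not in rule["values"]:
            problems.append(f"{key} not in allowed values: {value}")
    return problems

spec = {"autotermination_minutes": 30, "num_workers": 16,
        "spark_version": "13.3.x-scala2.12"}
print(violations(spec, CLUSTER_POLICY))
# → ['num_workers exceeds max 8: 16']
```

In practice such a policy would be applied through the Databricks workspace rather than checked client-side; the sketch only illustrates the shape of the rules.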
     

Operate a data science environment, i.e.: 

  • Manage and maintain the Kubernetes cluster (AKS) using Bicep, ARM templates and Helm charts 
  • Maintain the Jupyter stack: JupyterHub, JupyterLab 
  • Update Docker images for data scientists, including Python libraries 
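Keeping the data scientists' images reproducible usually means pinning library versions. A small sketch using only the standard library (the pin-file format and both helper names are assumptions) compares installed packages against a pin list so image drift is caught before release:

```python
from importlib import metadata

def parse_pins(text: str) -> dict[str, str]:
    """Parse simple 'name==version' requirement lines (comments ignored)."""
    pins = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()
        if "==" in line:
            name, version = line.split("==", 1)
            pins[name.strip()] = version.strip()
    return pins

def drift(pins: dict[str, str]) -> list[str]:
    """Report packages whose installed version differs from the pin."""
    issues = []
    for name, wanted in pins.items():
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            issues.append(f"{name} not installed (want {wanted})")
            continue
        if installed != wanted:
            issues.append(f"{name}: installed {installed}, pinned {wanted}")
    return issues

pins = parse_pins("pandas==2.1.4  # data wrangling\nnumpy==1.26.2\n")
print(drift(pins))  # lists any mismatches in the current environment
```

A check like this can run as the last step of a Docker image build, failing the build when the installed libraries no longer match the pins.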
     

Conduct more traditional DevOps activities, i.e.: 

  • Enhance and mature existing CI/CD pipelines for each application and infrastructure component (including infra, Databricks, AKS, Docker and Data Factory) as code 
  • Automate release pipelines (release notes, auto-deploy, simplified approvals, post-deployment checks) and improve our release cadence 
  • Build self-service tools so squads can focus on business-value tasks 
  • Automate our incident management procedures and problem management practices 
  • Set up telemetry to enhance coding standards, following Software Factory recommendations 
  • Build resilience into our infrastructure 
  • Operate orchestration through BMC Control-M 
  • Provide data DevOps support to squads building our data pipelines 
  • Track and follow up on incident resolution 
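Release-note automation of the sort listed above can start very small. The sketch below groups work items into a Markdown note; the item schema loosely mirrors what an Azure DevOps or Jira query might return, but the fields, IDs and titles here are invented for illustration:

```python
def build_release_notes(version: str, items: list[dict]) -> str:
    """Group work items by type into a simple Markdown release note."""
    sections: dict[str, list[str]] = {}
    for item in items:
        sections.setdefault(item["type"], []).append(
            f"- {item['id']}: {item['title']}"
        )
    lines = [f"# Release {version}", ""]
    for kind in sorted(sections):  # stable section order: Bug, Feature, ...
        lines.append(f"## {kind}")
        lines.extend(sections[kind])
        lines.append("")
    return "\n".join(lines)

items = [
    {"id": "AB-101", "type": "Feature", "title": "Delta Sharing endpoint"},
    {"id": "AB-102", "type": "Bug", "title": "Fix pipeline retry loop"},
]
print(build_release_notes("2024.06", items))
```

In a real pipeline the `items` list would come from a work-item query API, and the resulting note could be posted to a Teams channel as part of the release approval flow.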


And: 

  • Share your technical solutions within the internal developer community to raise the company's maturity on new technologies 
  • Promote software quality and deploy adequate methodologies 
  • Participate in technology meetups and hackathon events 

What we expect: 

  • 2+ years of experience in DevOps and software engineering
  • Deep understanding of Azure infrastructure components
  • Experience with infrastructure as code
  • Scripting: PowerShell, Shell/Bash
  • Experience operating CI/CD pipelines to ensure regular deployments across different environments
  • Good knowledge of monitoring tools (e.g. Azure Monitor, Azure Log Analytics (Kusto), Grafana, ELK)
  • Basic development skills in Python/SQL 

Nice to have: 

  • Building release notes from Azure DevOps or Jira/Confluence
  • Release communication and approvals: Microsoft Teams API
  • Azure/Bicep knowledge
  • Azure DevOps
  • Understanding of Spark and Databricks
  • Scheduling management: BMC Control-M API 
  • InfoSec concepts: Docker image scanning, securing Python libraries 
  • Data modelling, data exchange and DataOps concepts 
  • Change management: ServiceNow APIs
  • Knowledge of the financial industry, in particular the asset management industry, will be a plus


Packages and extras

  • Leisure package for families
  • Trainings
  • Books
  • Financial bonus
  • Healthcare package
  • Leisure package
  • Healthcare package for families
  • Conferences
  • Equity
  • Language courses

Amenities

  • Cold beverages
  • Hot beverages
  • Integration events
  • Fruits
  • Chill room
  • Lunches
  • Car parking
  • Shower

Accenture Polska

Warszawa

Accenture is a global professional services company with expertise in digital, cloud and security. Drawing on the broad experience and specialised knowledge of our experts across more than 40 industries, we offer services in Strategy & Consulting, Song, Technology and Operations, powered by the world's largest network of Advanced Technology and Intelligent Operations centres. Our 710,000 people harness the potential of new technologies and human creativity every day, serving clients in more than 120 countries. Accenture embraces innovation to create value and shared success for clients, partners and communities. In Poland, Accenture has offices in Warsaw, Kraków, Łódź, Wrocław and Katowice, employing more than 8,600 people. Visit our website to learn more: accenture.com/pl-pl