Data Developer (Snowflake), Stock Exchange

DataArt
Senior
Up to 19 000
PLN
gross / month (Employment contract)
Up to 20 000
PLN
net / month (B2B)

Online interview
B2B Employment contract
Wrocław, Lublin, Kraków, Remote
Remote possible
100%
Paid vacation
26


Project description

This Europe-based client is one of the world's most established stock exchanges. The project is new, started just last month, and will run for several years. Several teams, about 100 people in total, operate from the EU, with no overtime expected. The work is split into small parts by priority; once a part is released, the team moves on to the next one.

Our engagement covers a modern approach to SOA, including an AWS and microservices stack, data warehousing, modern APIs, frontend development, and other areas.

We use Scrum and a scaled Agile framework on the project. Program management, Solution Architects, and Business Analysts are involved on the DataArt side.

DataArt’s specialists focus on data streaming and sourcing solutions to ensure data processing necessary for trading operations.


DataArt is also engaged in transforming business processes and migrating from on-premises hosting to the cloud by re-platforming the entire system and using cloud-native technologies. Part of the work includes integrating systems and data related to a new asset class into the customer's existing IT landscape.


Technology stack & infrastructure:
  • Basics: Java 1.7–1.8, Maven, Spring (Boot/Security/Data/etc.), Git (Bitbucket)
  • Database: Snowflake, AWS, Matillion, Python
  • UI: Angular 2+, JSP
  • AWS: Anthos/Kubernetes, Snowflake, Matillion, S3, Lambda

Your tasks

  • Build and maintain ETL processes for the corporate DWH
  • Improve and tune the performance of existing ETL processes
  • Help deploy and support newly introduced features
  • Work with the QA team on bug fixing
  • Estimate tasks

Who we're looking for

Must have

  • 3+ years of experience with DWH
  • Strong knowledge of SQL and PL/SQL
  • Experience optimizing SQL queries
  • Experience working with dataflows across heterogeneous data sources
  • Solid experience with ETL/ELT tools or custom pipelines
  • Knowledge of data warehousing and data modelling principles
  • Good spoken English

Would be a plus

  • Knowledge of AWS
  • Experience with Snowflake, or willingness to learn and work with it
  • Knowledge of Bash scripting
  • Experience with Matillion, or willingness to learn and work with it
  • Ability to reverse engineer complex systems
  • Ability to work with documentation
  • Ability to work with version control tools (Git)
Skills
MS SQL
ETL
How we manage our projects
Methodology
Agile
Who makes architectural decisions?
Team
Who makes technology stack decisions?
Architect
Project management software
JIRA
Opportunity to change between projects
How we code
Git
Version control
Style guide
Code review
Pair programming
Static code analysis
TDD
BDD
Code metrics
Knowledge database
How we test
Unit tests
Integration tests
System tests
Pentests
Performance tests
Manual testing
Test automation
CI
Toolset
Laptop
Additional monitor
Headphones
Freedom to pick your tools
Operating system
Work environment
Open space
Flexible working hours
0 - 24
Office hours
Healthcare
  • Healthcare package
  • Healthcare package for families
Leisure package
  • Leisure package
Kitchen
  • Cold beverages
  • Hot beverages
  • Fruits
  • Snacks
Training
  • Conferences
  • Trainings
  • Books
Parking
  • Car parking
  • Bicycle parking
Relocation package
  • Language courses
  • Temporary housing
  • Help finding an apartment
  • Visa Services
  • Adaptation tips
Other
  • Shower
  • Chill room
  • Integration events

Our company

DataArt

Wrocław, Lublin, Kraków · 5000+
Tech skills
  • Android
  • iOS
  • Cloud
  • Java
  • AWS
  • Azure
  • .NET
  • DevOps
  • JavaScript
  • Python
