We are GBM Surveillance IT. We work within the investment banking IT organization and with other global businesses to protect our stakeholders from illegal or prohibited activity across channels, including but not limited to trading activity, audio communications and electronic communications.
We work in small teams, bringing development, testing, business analysis, QA and system reliability together. We have started on our journey towards working in a truly agile, no-silos environment. Our teams write their code, write automated test harnesses for their code, use automated mechanisms to deploy their code, and are accountable for their code in Production, including monitoring to ensure their services run uninterrupted while they deliver new code. All the while, they work closely with business analysts and business stakeholders to deliver value in a steady, incremental stream.
For us, DevOps is not just about technology automation (automated build, test and release). It is about a cultural focus on working cross-functionally and delivering in small increments. We need people who are 100% focused on automated delivery and capable of working in dynamic, cross-functional teams to make this happen.
As we're at the start of our Agile/DevOps journey, joining us will enable you to make a lasting impact in a key regulatory/compliance function at one of the world's largest banks.
The purpose of this role is to work in a global agile development team, helping deliver the first step towards Surveillance’s technology target state. The successful candidate will help develop and manage data processing components and storage technologies to support and improve Surveillance’s way of working in order to achieve its strategic objectives.
- Work with the Product Owner to understand stakeholder requirements
- Responsible for building and managing data processing components and storage technologies
- Responsible for onboarding new data sources and constructing data processing pipelines to present the required views of data to application developers
- Work with Data Scientists to productionise the analytics they have developed
- Ensure that development is undertaken in line with continuous integration and test-driven development philosophies, and that code quality standards and processes are adhered to
- Work with Architects to both validate designs and ensure adherence to the agreed approach
- Carry out unit testing to ensure the quality of delivered components
Who we're looking for
We are looking for a Data Engineer with experience in Big Data processing technologies and techniques who is comfortable working alongside a strong, international team of engineers to build applications for a key initiative.
Whilst core skills are listed below, we are mainly looking for passionate people who continually seek to improve and challenge themselves, and who can work in a highly disciplined, verifiable manner.
Core Skills / Characteristics
- Broad understanding of Big Data technologies
- Experience with Hortonworks HDP/HDF
- Experience with ELK stack
- Experience with programming languages (Java, Scala)
- Experience with messaging technologies (Kafka)
- Experience with data processing frameworks (Apache Spark)
- Experience of deploying services on Google Cloud Platform
- Experience with programming languages (Python)
- Knowledge of DataFlow, BigTable, BigQuery, PubSub, Kubernetes, Airflow
- Knowledge of machine learning libraries such as TensorFlow