
DATA/ML ENGINEER WITH PYTHON

Deutsche Bank

Software Engineering, Data Science
Bucharest, Romania
Posted on Thursday, May 23, 2024

Job Description:

About Regulation, Compliance & Anti-Financial Crime Tech (RCA)

Our RCA team is responsible for protecting the bank from financial and reputational losses incurred by financial crimes by assessing, controlling, and mitigating risks. Risk types related to Anti-Financial Crime are consolidated in a comprehensive and effective risk management framework that covers Anti-Money-Laundering, Sanctions & Embargoes, Anti-Bribery & Corruption as well as Anti-Fraud & Investigations.

Your profile

The position demands technical maturity in design and implementation decisions, with constant vigilance over quality. You must have proven experience developing with the tool stack described below in a collaborative scrum-of-scrums environment. The key result area of this position is the team's ability to deliver high-quality code that is testable, maintainable and meets all business requirements.

Experience with the Hadoop ecosystem, performance optimization, and developing, deploying and maintaining code in production is required, preferably with previous exposure to financial services/banking and anti-financial crime concepts. This job is well suited to people who have strong problem-solving skills, work well as part of a team, and can manage their time effectively to meet deadlines and keep to projected schedules.

We'll trust you to:

  • As part of a development team, collaborate with other team members to understand requirements, analyze and refine stories, design solutions, implement them, test them and support them in production
  • Develop and deploy, within a team, big data anti-fraud detection solutions and models using Python and PySpark, and be able and willing to learn how to improve them with machine learning algorithms and models (a sketch follows this list)
  • Write code and write it well. Use test driven development, write clean code and refactor constantly
  • Design and develop excellent and understandable server-side code
  • Develop, or be willing to learn, big data machine learning solutions
  • Collaborate closely with product owners, analysts, developers and testers. Make sure we are building the right thing
  • Define and evolve the architecture of the components you are working on
  • Ensure that the software you build is reliable and easy to maintain in production
  • Help your team build, test and release software with short lead times and a minimum of waste
  • Work to develop and maintain a highly automated Continuous Delivery pipeline
  • Help create a culture of learning and continuous improvement within your team and beyond
  • Actively support the business strategy, plans and values, contributing to the achievement of a high-performance culture
  • Take ownership of your own career management, seeking opportunities for continuous development of personal capability and improved performance contribution
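
To make the PySpark responsibilities above a little more concrete, here is a minimal illustrative sketch of the kind of anti-fraud feature engineering involved. The input path, schema (account_id, amount) and the z-score threshold are assumptions made for this example, not part of the actual codebase:

  # Illustrative sketch only: the input path, the schema (account_id, amount)
  # and the z-score threshold of 3 are assumptions, not actual project code.
  from pyspark.sql import SparkSession, functions as F
  from pyspark.sql.window import Window

  spark = SparkSession.builder.appName("fraud-feature-sketch").getOrCreate()

  # Hypothetical transactions dataset.
  tx = spark.read.parquet("/data/transactions")

  # Per-account statistics used as simple anomaly features.
  w = Window.partitionBy("account_id")
  features = (
      tx.withColumn("avg_amount", F.avg("amount").over(w))
        .withColumn("std_amount", F.stddev("amount").over(w))
        .withColumn(
            "z_score",
            (F.col("amount") - F.col("avg_amount")) / F.col("std_amount"),
        )
  )

  # Flag transactions that deviate strongly from the account's own history.
  flagged = features.filter(F.abs(F.col("z_score")) > 3)
  flagged.write.mode("overwrite").parquet("/data/flagged_transactions")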

We'd love you to bring:

  • Experience with Big Data and Hadoop Ecosystem: Spark (Spark SQL, Dataframes, PySpark), HDFS, Hive, YARN, Kerberos
  • Good developer skills utilizing Python and PySpark for processing large data volumes
  • Familiarity with Pandas, Numpy, Scikit-learn and other relevant ML frameworks
  • Good knowledge of working with databases (e.g. SQL, Oracle, Hive, Impala) and experience implementing relevant optimization techniques (a sketch follows this list)
  • Experience using version control systems such as Git
  • Proficient understanding of distributed computing principles
  • Good knowledge of Unix/Linux environments
  • Awareness of DevOps practices such as CI/CD
  • Hands-on knowledge of shell scripting
  • Bachelor’s degree from an accredited college or university with a concentration in Computer Science or IT-related discipline (or equivalent work experience/diploma/certification)
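
As a rough illustration of the database and optimization skills listed above, the sketch below shows two common Spark-on-Hive techniques, partition pruning and broadcast joins. The table and column names (afc.alerts, afc.accounts_reference, business_date, account_id, alert_type) are hypothetical:

  # Illustrative sketch only: table and column names are assumed.
  from pyspark.sql import SparkSession, functions as F

  spark = (
      SparkSession.builder
      .appName("hive-query-sketch")
      .enableHiveSupport()
      .getOrCreate()
  )

  # Partition pruning: filtering on the (assumed) partition column
  # business_date lets Spark read only the relevant Hive partitions.
  alerts = spark.table("afc.alerts").filter(F.col("business_date") == "2024-05-23")

  # Broadcast join: hinting that the reference table is small avoids
  # shuffling the larger alerts dataset across the cluster.
  accounts = spark.table("afc.accounts_reference")
  enriched = alerts.join(F.broadcast(accounts), on="account_id", how="left")

  enriched.groupBy("alert_type").count().show()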

Nice-to-haves:

  • Experience using Control-M Workload Automation tool
  • Experience working in an Agile setup, practicing Scrum or Kanban
  • Familiarity with Atlassian stack: Jira, Confluence, Bitbucket
  • An aptitude for data mining, an analytical approach to tasks and a business focus
  • An understanding of statistical modelling and the ability to apply modelling techniques to analyze data (see the sketch after this list)
  • Experience with Google Cloud Platform
  • Versed in working with different data file types in Spark
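
For the statistical modelling point above, a simple baseline, sketched under the assumption of a hypothetical feature table with amount, z_score, tx_count_30d columns and an is_fraud label, might use Pandas and scikit-learn as follows:

  # Illustrative sketch only: the feature file, column names and the
  # is_fraud label are assumptions made for this example.
  import pandas as pd
  from sklearn.linear_model import LogisticRegression
  from sklearn.model_selection import train_test_split
  from sklearn.metrics import classification_report

  df = pd.read_parquet("/data/fraud_features.parquet")
  X = df[["amount", "z_score", "tx_count_30d"]]
  y = df["is_fraud"]

  X_train, X_test, y_train, y_test = train_test_split(
      X, y, test_size=0.2, random_state=42, stratify=y
  )

  # A simple, interpretable baseline before moving to heavier ML models.
  model = LogisticRegression(max_iter=1000)
  model.fit(X_train, y_train)
  print(classification_report(y_test, model.predict(X_test)))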

Our values define the working environment we strive to create – diverse, supportive and welcoming of different views. We embrace a culture reflecting a variety of perspectives, insights and backgrounds to drive innovation. We build talented and diverse teams to drive business results and encourage our people to develop to their full potential. Talk to us about flexible work arrangements and other initiatives we offer.


We promote good working relationships and encourage high standards of conduct and work performance. We welcome applications from talented people from all cultures, countries, races, genders, sexual orientations, disabilities, beliefs and generations and are committed to providing a working environment free from harassment, discrimination and retaliation.

Visit Inside Deutsche Bank to discover more about the culture of Deutsche Bank including Diversity, Equity & Inclusion, Leadership, Learning, Future of Work and more besides.