Deutsche Bank is hiring for the position of Associate – Data Engineer in Pune, India. Candidates with a Bachelor's or Master's degree are eligible to apply. The complete information, eligibility criteria, and requirements are provided below.
Job Description:
- Company Name: Deutsche Bank
- Position: Associate – Data Engineer
- Qualifications: Bachelor's/Master's degree
- Batch: 2020/2021/2022/2023/2024/2025
- Experience: Freshers/Experienced
- Location: Pune, India
Key Responsibilities:
- Work as a Data Engineer within software development teams to deliver fast, reliable data solutions for data warehousing, reporting, customer intelligence, and business intelligence.
- Collaborate with service and backend engineers to integrate data from legacy IT systems into the designed databases and make it accessible to consuming services.
- Design and set up databases, data models, ETL processes, and data transformations, with a focus on critical online banking business processes, customer intelligence, financial reporting, and performance controlling.
- Contribute to data harmonization and data cleansing initiatives.
- Continuously learn and apply new technologies and programming languages in a rapidly evolving environment.
- Build highly scalable solutions that perform reliably under high-load scenarios.
- Develop and manage applications independently in collaboration with your team.
- Work closely with product owners and team members to design and implement data analytics solutions, providing support during the conception phase of products and solutions.
- Identify processes requiring high manual effort and automate them to optimize operations and free up time for further development.
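The ETL responsibilities above (extract from legacy sources, cleanse and harmonize, load for reporting) can be sketched in a few lines. The example below is purely illustrative, not Deutsche Bank's actual stack: it parses a small hypothetical CSV source, applies simple cleansing and harmonization rules, and loads the result into a SQLite table using only the Python standard library.

```python
import csv
import io
import sqlite3

# Hypothetical source data; in practice this would come from a legacy system.
RAW_CSV = """customer_id,country,balance
1, de ,100.50
2,IN,
3,de,42.00
"""

def extract(raw: str):
    """Extract: parse CSV rows from the raw source."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: harmonize country codes and drop rows with missing balances."""
    cleaned = []
    for row in rows:
        if not row["balance"].strip():
            continue  # data cleansing: skip incomplete records
        cleaned.append((
            int(row["customer_id"]),
            row["country"].strip().upper(),  # harmonization: " de " -> "DE"
            float(row["balance"]),
        ))
    return cleaned

def load(rows, conn):
    """Load: write the cleaned rows into a reporting table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS balances "
        "(customer_id INTEGER, country TEXT, balance REAL)"
    )
    conn.executemany("INSERT INTO balances VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT country, COUNT(*) FROM balances GROUP BY country").fetchall())
```

In production, the same extract/transform/load split typically maps onto managed services (e.g., Dataflow for the transform step and BigQuery as the load target), but the structure of the pipeline is the same.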
Eligibility Criteria:
- Strong hands-on experience in developing scalable data engineering pipelines and performing data engineering/modeling using Java and Python.
- Excellent knowledge of SQL and NoSQL databases.
- Proven experience working in fast-paced and Agile environments.
- Working knowledge of public cloud environments.
Preferred Skills:
- Experience with Dataflow (Apache Beam), Cloud Functions, and Cloud Run.
- Proficiency in workflow management tools such as Apache Airflow and Composer.
- Ability to write clear, well-documented code and maintain it under version control (e.g., Git, hosted on GitHub).
- Knowledge of GCS Buckets, Google Pub/Sub, and BigQuery.
- Strong understanding of ETL processes in Data Warehouse/Data Lake environments, including automation.
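Workflow tools such as Airflow and Composer model a pipeline as a DAG of dependent tasks. As a toy illustration of that idea (plain Python via `graphlib`, not the actual Airflow API), the sketch below runs a hypothetical pipeline's tasks in an order that respects their dependencies:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical pipeline: task name -> set of upstream tasks it depends on.
# This mirrors the DAG concept behind Airflow/Composer, not their API.
DAG = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
    "notify": {"load"},
}

def run_pipeline(dag):
    """Execute tasks in dependency order via a topological sort."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        # A real orchestrator adds scheduling, retries, and logging here.
        print(f"running {task}")
    return order

order = run_pipeline(DAG)
```

A real orchestrator adds scheduling, retries, backfills, and monitoring on top of this core idea, which is why tools like Airflow exist rather than hand-rolled runners.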
Nice to Have:
- Knowledge of provisioning cloud resources using Terraform.
- Proficiency in Shell Scripting.
- Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
- Familiarity with Google Cloud Monitoring & Alerting.
- Knowledge of Cloud Run, Dataform, and Cloud Spanner.
- Understanding of Data Warehouse solutions such as Data Vault 2.0.
- Familiarity with New Relic monitoring tools.
- Strong analytical and conceptual thinking skills.
- Excellent communication, independence, and initiative, with the ability to work effectively in Agile delivery teams.
- Experience collaborating with distributed teams, particularly across Germany and India.
About Company:
At Deutsche Bank, we give original thinkers the space and support they need to shine. If you're an innovator by nature, merging local knowledge with global vision and in-depth insight with industry-leading digital expertise, we can help you unleash your potential. We see things differently at Deutsche Bank – and we're proud of our fresh perspective. Today, we're driving growth through our strong client franchise, investing heavily in digital technologies, prioritising long-term success over short-term gains, and serving society with ambition and integrity.
How To Apply?
- First, read through all of the job details on this page.
- Scroll down and click the Click Here button.
- Click the apply link to be redirected to the official website.
- Fill in the required details.
- Cross-check the information you've provided before submitting the application.