IBM is hiring for the position of Data Engineer in Bangalore, Hyderabad, Pune, and Mumbai, India. Candidates with a Bachelor's or Master's degree are eligible to apply for this position. The complete information, eligibility criteria, and requirements are provided below.
Job Description:
| Company Name | IBM |
| --- | --- |
| Position | Data Engineer |
| Qualification | Bachelor's/Master's Degree |
| Batch | Recent Batches |
| Experience | Freshers/Experienced |
| Location | Bangalore, Hyderabad, Pune, and Mumbai (India) |
Key Responsibilities:
- Implement and validate predictive models, and create and maintain statistical models focused on big data using a range of statistical and machine learning techniques.
- Design and deploy enterprise search applications on platforms such as Elasticsearch and Splunk, based on client requirements.
- Collaborate in an Agile environment with scientists, engineers, consultants, and database administrators, applying analytical rigor and statistical methods to address predictive challenges.
- Build teams or develop programs that cleanse and integrate data in an efficient, reusable way, develop predictive or prescriptive models, and evaluate model performance.
- Develop and maintain data pipelines for batch and stream processing using Informatica PowerCenter or cloud-based ETL/ELT tools.
- Collaborate with business teams and technical leads to gather requirements, identify data sources and quality issues, design target data structures, develop pipelines, create data processing routines, perform unit testing, and support user acceptance testing (UAT).
- Work with the data science and business analytics teams to assist in data ingestion and resolve data-related technical issues.
Eligibility Criteria:
- Design, develop, and maintain Ab Initio graphs for ETL processes, extracting, transforming, and loading data from various sources to target systems.
- Implement data quality and validation processes within Ab Initio to ensure accuracy and consistency.
- Collaborate with data architects and business analysts to understand data requirements and translate them into efficient ETL processes through data modeling and analysis.
- Analyze and model data to optimize ETL design and performance.
- Utilize Ab Initio components such as Transform Functions, Rollup, Join, and Normalize to create scalable and efficient data integration solutions.
- Apply best practices for creating reusable Ab Initio components.
Preferred Skills:
- Knowledge of MS Azure Cloud.
- Experience with Informatica PowerCenter.
- Proficiency in Unix shell scripting and Python.
About Company:
At IBM, we do more than work. We create. We create as technologists, developers, and engineers. We create with our partners. We create with our competitors. If you’re searching for ways to make the world work better through technology and infrastructure, software and consulting, then we want to work with you. We’re here to help every creator turn their “what if” into what is. Let’s create something that will change everything.
How To Apply?
- First, read through all of the job details on this page.
- Scroll down and click the "Click Here" button.
- Click the apply link to be redirected to the official website.
- Fill in the application form with the required details.
- Before submitting the application, cross-check the information you’ve provided.