Deloitte is hiring for Data Engineer | B.E/B.Tech/M.Tech/MS/MBA | Apply Now!

Deloitte is looking for a Data Engineer in Hyderabad/ Mumbai/ Pune/ Chennai/ Gurugram/ Kolkata, India. Candidates with a B.E/B.Tech/M.Tech/MS/MBA are eligible to apply for this position. The complete information, eligibility criteria, and requirements are provided below.

Job Description:

Company Name: Deloitte
Position: Data Engineer
Qualifications: B.E/B.Tech/M.Tech/MS/MBA
Batch: Any Batch
Experience: 0 – 4 Years
Location: Hyderabad/ Mumbai/ Pune/ Chennai/ Gurugram/ Kolkata, India

Key Responsibilities:

  • Develop solutions with an Agile Development team.
  • Define, produce, test, review, and debug solutions.
  • Create component-based features and micro-frontends.
  • Develop databases with Postgres.
  • Create comprehensive unit test coverage in all layers.
  • Deploy solutions to Docker containers and Kubernetes.
  • Help build a team culture of autonomy and ownership.
  • Work with a Product Owner to refine stories into functional use cases and identify the work effort as tasks.
  • Participate in test case creation responsibilities and peer reviews prior to coding.
  • Review implementation plans of peers prior to their coding.
  • Demonstrate feature work at the end of each iteration.
  • Work from home when desired with infrequent visits to the office and limited travel for planning sessions.
  • Develop our ETL process into a robust, automated, production-quality solution and lead its implementation and delivery (a minimal sketch follows this list).
  • Peer with the application engineering team to ensure our data model fits the needs of the solution while promoting best practices in its design from both a maintenance and a performance perspective.
  • Peer with the data science team to understand their needs for preparing large datasets for machine learning.
  • Help the team understand the execution plans of poorly written queries, and remediate performance problems by tuning queries and/or refining the data model to meet the needs of the business.
  • Build data systems and pipelines.
  • Evaluate business needs and objectives.
  • Explore ways to enhance the product and pipeline.
  • Collaborate with the team.
  • Showcase skills and innovative ideas to the team biweekly.
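
As a concrete illustration of the ETL responsibility above, here is a minimal extract-transform-load sketch in Python against Postgres. It is a sketch under assumptions, not Deloitte's actual pipeline: the psycopg2 driver, the "analytics" database, and the raw_orders.csv file and orders table are all hypothetical names invented for illustration.

# Minimal ETL sketch: load a CSV into Postgres, cleaning rows along the way.
# Assumptions (not from the posting): psycopg2 is installed, a local database
# named "analytics" exists, and the hypothetical target table is:
#   CREATE TABLE orders (order_id INT PRIMARY KEY, amount NUMERIC, region TEXT);
import csv
import psycopg2

def extract(path):
    """Read raw rows from a CSV file that has a header row."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Drop rows with a missing amount and normalize the region field."""
    for row in rows:
        if not row["amount"]:
            continue  # skip incomplete records
        yield (int(row["order_id"]), float(row["amount"]), row["region"].strip().upper())

def load(records, conn):
    """Upsert cleaned records into the orders table."""
    with conn.cursor() as cur:
        cur.executemany(
            """INSERT INTO orders (order_id, amount, region)
               VALUES (%s, %s, %s)
               ON CONFLICT (order_id) DO UPDATE
               SET amount = EXCLUDED.amount, region = EXCLUDED.region""",
            records,
        )
    conn.commit()

if __name__ == "__main__":
    conn = psycopg2.connect(dbname="analytics")  # hypothetical connection
    load(transform(extract("raw_orders.csv")), conn)
    conn.close()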

Eligibility Criteria:

  • Strong knowledge of Python & SQL.
  • Hands-on experience with SQL database design.
  • Hands-on experience with or knowledge of Airflow (see the DAG sketch after this list).
  • Knowledge of Docker and Kubernetes.
  • Experience with running containerized microservices.
  • Experience with Apache Spark or AWS EMR.
  • Experience with cloud platforms (AWS, Azure) with strong preference towards AWS.
  • Experience in Database design practices.
  • Technical expertise with a data warehouse or data lake.
  • Expertise in configuring and maintaining PostgreSQL.
  • Experience performance tuning queries and data models to produce the best execution plan.
  • Strong experience building data pipelines & ETL.
  • Experience working on an Agile Development team and delivering features incrementally.
  • Experience with Git repositories.
  • Working knowledge of setting up builds and deployments.
  • Experience with both Windows and Linux.
  • Experience demonstrating work to peers and stakeholders for acceptance.
  • Ability to multi-task, be adaptable, and nimble within a team environment.
  • Strong communication, interpersonal, analytical and problem-solving skills.
  • Ability to communicate effectively with nontechnical stakeholders to define requirements.
  • Ability to quickly understand new client data environments and document the business logic that composes them.
  • Ability to integrate oneself into geographically dispersed teams and clients.
  • A passion for high-quality software and previous experience as a data engineer or in a similar role.
  • Eagerness to learn and seek new frameworks, technologies, and languages.
  • Commitment to working with others and sharing knowledge on a regular basis.
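
For the Airflow item above, a minimal DAG sketch in Python is shown below. It is a sketch under assumptions, not a Deloitte pipeline: it presumes Airflow 2.x, and the dag_id, task names, and callables are hypothetical placeholders.

# Minimal Airflow 2.x DAG sketch (hypothetical names throughout).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling source data")      # placeholder for a real extract step

def load():
    print("writing to the warehouse") # placeholder for a real load step

with DAG(
    dag_id="example_etl",             # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",       # run once per day
    catchup=False,                    # do not backfill past runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task         # extract must finish before load runs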

Preferred skills:

  • Experience working with Azure DevOps, JIRA or similar project tracking software.
  • Experience working in a startup environment.
  • Experience with data streaming tools such as Apache Kafka, AWS Kinesis, Spark Streaming, or similar (a consumer sketch follows this list).
  • Experience with other big data technologies at scale, such as BigQuery or similar MPP databases (Redshift, Snowflake).
  • Knowledge of when to use NoSQL versus a traditional RDBMS.
  • Experience with RDS in AWS is a big plus.
  • Kafka, RabbitMQ, or similar queueing technologies are a plus.
  • Experience with BI tools such as Tableau and Jaspersoft.
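
To make the streaming item concrete, here is a minimal Kafka consumer sketch using the kafka-python package. The topic name, broker address, and JSON payload shape are assumptions for illustration, not details from the posting.

# Minimal Kafka consumer sketch using kafka-python (hypothetical topic/broker).
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                            # hypothetical topic name
    bootstrap_servers="localhost:9092",  # assumed local broker
    auto_offset_reset="earliest",        # start from the oldest message
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

for message in consumer:
    # Each message value is already a dict thanks to the deserializer above.
    print(message.offset, message.value)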

About Company:

Deloitte drives progress. Our firms around the world help clients become leaders wherever they choose to compete. Deloitte invests in outstanding people of diverse talents and backgrounds and empowers them to achieve more than they could elsewhere. Our work combines advice with action and integrity. We believe that when our clients and society are stronger, so are we. Deloitte refers to one or more of Deloitte Touche Tohmatsu Limited (“DTTL”), its global network of member firms, and their related entities. DTTL (also referred to as “Deloitte Global”) and each of its member firms are legally separate and independent entities. DTTL does not provide services to clients. Please see www.deloitte.com/about to learn more.

How To Apply?

  • First, read through all of the job details on this page.
  • Scroll down and press the Click Here button.
  • Click on the apply link to be redirected to the official website.
  • Fill in the required details.
  • Before submitting the application, cross-check the information you’ve provided.

To Apply: Click Here
