Deloitte is hiring for Data Engineer | B.Tech/ M.Tech/ MS/ MBA | Apply Now!

Deloitte is hiring Data Engineers across India. Candidates with a B.Tech/ M.Tech/ MS/ MBA degree are eligible to apply for this position. The complete information, eligibility criteria, and requirements are provided below.

Job Description:

Company Name: Deloitte
Position: Data Engineer
Qualifications: B.Tech/ M.Tech/ MS/ MBA
Batch: 2018/ 2019/ 2020/ 2021/ 2022/ 2023
Experience: 0 – 4 years
Location: Multiple locations

Key Responsibilities:

  • Create solutions in collaboration with an Agile Development team.
  • Create, test, review, and debug solutions.
  • Create micro-frontends and component-based features.
  • Create thorough unit test coverage across all layers.
  • Deploy Docker containers and Kubernetes solutions.
  • Assist in developing a team culture of autonomy and ownership.
  • Work with a Product Owner to refine stories into functional use cases and assign work as tasks.
  • Prior to coding, participate in test case creation and peer reviews.
  • Examine peers’ implementation plans before coding.
  • At the end of each iteration, demonstrate feature work.
  • Work from home when possible, with only infrequent, short visits to the office for planning sessions.
  • Develop our ETL process into a robust, automated, production-quality solution, then lead its implementation and delivery.
  • Collaborate with the application engineering team to ensure that our data model meets the needs of the solution while promoting design best practices from the standpoints of maintenance and performance.
  • Collaborate with the data science team to better understand their requirements for preparing large datasets for machine learning.
  • Help the team understand the execution plans of poorly written queries.
  • Aid in the resolution of performance issues by assisting with query performance tuning and/or refining the data model to meet the needs of the business.
  • Create data pipelines and systems.
  • Examine the company’s needs and goals.
  • Investigate ways to improve the product or pipeline.
  • Biweekly, present skills and innovative ideas to the team.

Eligibility Criteria:

  • Strong understanding of Python and SQL.
  • Hands-on SQL database design experience.
  • Hands-on experience with or knowledge of Airflow is required.
  • Docker and Kubernetes knowledge.
  • Working knowledge of containerized microservices.
  • Knowledge of Apache Spark or AWS EMR.
  • Extensive experience with cloud platforms (AWS, Azure), with a strong preference for AWS.
  • Knowledge of database design practices.
  • Technical knowledge of data warehouses or data lakes.
  • PostgreSQL configuration and maintenance expertise.
  • Experience tuning queries and data models for performance to produce the best execution plan.
  • Strong knowledge of data pipelines and ETL.
  • Working on an Agile Development team and delivering features incrementally is a plus.
  • Knowledge of Git repositories.
  • Working knowledge of building and deploying software.
  • Working knowledge of both Windows and Linux.
  • Extensive experience presenting work to peers and stakeholders for approval.
  • Ability to multitask, adapt, and be nimble in a team environment.
  • Strong communication, interpersonal, analytical, and problem-solving abilities are required.
  • Ability to effectively communicate with nontechnical stakeholders to define requirements.
  • Ability to quickly grasp new client data environments and document the business logic that underpins them.
  • Ability to fit into geographically dispersed teams and clients.
  • A strong desire for high-quality software.
  • Previous experience as a data engineer or in a related role is required.
  • A desire to learn and explore new frameworks, technologies, and languages.
  • Regular commitment to working with others and sharing knowledge.

Preferred Skills:

  • Working knowledge of Azure DevOps, JIRA, or similar project management software.
  • Working experience in a startup environment.
  • Knowledge of data streaming tools such as Apache Kafka, AWS Kinesis, Spark Streaming, or similar.
  • Extensive experience with a variety of other big data technologies at scale, such as BigQuery or similar MPP databases (Redshift, Snowflake).
  • Understanding of when to use NoSQL versus a traditional RDBMS.
  • AWS RDS experience is a big plus.
  • Knowledge of Kafka, RabbitMQ, or similar queueing technologies is advantageous.
  • Knowledge of BI tools such as Tableau and Jaspersoft.

About Company:

Deloitte is the brand under which tens of thousands of dedicated professionals from independent firms around the world work together to provide audit and assurance, consulting, risk and financial advisory, risk management, tax, and related services to select clients. These firms are members of Deloitte Touche Tohmatsu Limited, a UK private company limited by guarantee (“DTTL”). Each DTTL member firm provides services in particular geographic areas and is subject to the laws and professional regulations of the country or countries in which it operates.

Each DTTL member firm is structured in accordance with national laws, regulations, customary practice, and other factors, and may secure the provision of professional services in its territory through subsidiaries, affiliates, and other related entities. Not every DTTL member firm offers all services, and certain services may be unavailable to attest clients under public accounting rules and regulations.

How To Apply?

  • First, read through all of the job details on this page.
  • Scroll down and click the “Click Here” link.
  • To be redirected to the official website, click on the apply link.
  • Fill in the details with the requested information.
  • Before submitting the application, cross-check the information you’ve provided.

To Apply: Click Here
