The Senior Data Engineer should have experience in end-to-end implementation of data-warehousing projects. They will manage, move, and transform data from our source systems and application data to the cloud to create reports for senior management and internal users, working both independently on assigned projects and collaboratively with other team members. They will build ETL pipelines across the various tools in play to surface data for consumption by our reporting tool, and will prioritize competing requests from internal and external stakeholders while keeping the reporting infrastructure on par with new product functionality and release cycles. They will also become a subject matter expert in data classification within the platform and use that expertise to identify the most efficient path to deliver data from source to target as needed.

Core Responsibilities:

  • Design and write excellent, fully tested code for batch ETL/ELT and streaming data pipelines on a cloud platform (see the sketch after this list).
  • Communicate effectively and work across internal and external organizations and virtual teams.
  • Implement product features and refine specifications with our product manager and product owners.
  • Stay familiar with industry changes, especially in the areas of cloud data and analytics technologies.
  • Work across multiple areas, including ETL data pipelines, data model design, and complex SQL queries, with a good understanding of BI and data-warehousing principles.
  • Capable of planning and executing both short-term and long-term goals individually and with the team.
  • Apply an understanding of the software development life cycle (SDLC) and knowledge of Scrum and Agile practices.
  • Participate in the on-call rotation and monitor production jobs.
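
To make the pipeline work above concrete, here is a minimal ETL sketch in plain Python: extract rows from a source, transform them, and load them into a reporting table, with in-memory SQLite databases standing in for a real source system and warehouse. The table names and schema are illustrative assumptions, not details from this posting.

    import sqlite3

    # Minimal ETL sketch: extract raw rows, transform them, and load
    # them into a reporting table. Schemas here are hypothetical.

    def extract(src):
        # Pull raw order rows from the source system.
        return src.execute(
            "SELECT order_id, amount_cents, region FROM orders"
        ).fetchall()

    def transform(rows):
        # Convert cents to dollars and normalize region codes.
        return [
            (oid, cents / 100.0, region.strip().upper())
            for oid, cents, region in rows
        ]

    def load(dst, rows):
        # Idempotent load: re-running replaces rows instead of duplicating them.
        dst.execute(
            "CREATE TABLE IF NOT EXISTS orders_report "
            "(order_id INTEGER PRIMARY KEY, amount_usd REAL, region TEXT)"
        )
        dst.executemany("INSERT OR REPLACE INTO orders_report VALUES (?, ?, ?)", rows)
        dst.commit()

    if __name__ == "__main__":
        # In-memory databases stand in for the real source and warehouse.
        src, dst = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
        src.execute("CREATE TABLE orders (order_id INTEGER, amount_cents INTEGER, region TEXT)")
        src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                        [(1, 1999, " us-east "), (2, 525, "eu-west")])
        load(dst, transform(extract(src)))
        print(dst.execute("SELECT * FROM orders_report").fetchall())

In a production pipeline the same extract/transform/load shape holds, with the SQLite stand-ins replaced by the actual source systems and the cloud warehouse.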

Required skills and qualifications:

  • 10+ years of development experience building data pipelines.
  • Bachelor's degree or equivalent experience required; Computer Science or a related field preferred.
  • Minimum of 5 years of experience architecting modern data-warehousing platforms using big data and cloud technologies, including Kafka.
  • Cloud experience on any platform, preferably GCP services such as BigQuery, Dataflow, Pub/Sub, and Data Fusion.
  • Migration experience using GCP to move data from on-prem servers to the cloud.
  • Strong Python development skills for data transfers and extractions (ELT or ETL).
  • Experience developing and deploying ETL solutions with Informatica or similar tools.
  • Experience working within an agile development process (Scrum, Kanban, etc.).
  • Familiarity with CI/CD concepts
  • Demonstrated proficiency in creating technical documentation
  • Understanding of modern database concepts, e.g., how new-generation warehouses such as BigQuery and Redshift are implemented.
  • Airflow DAG development experience (see the sketch after this list).
  • Experience in BI and data analysis, including end-to-end development in data platform environments.
  • Proactive mindset: identify and fix issues before they cause production failures.
  • Ability to write excellent, fully tested code for ETL/ELT data pipelines in the cloud.
  • Prior experience coordinating offshore teams and working in an onsite-offshore model.
  • Good to have: healthcare domain experience.
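
Since the role calls for Airflow DAG development (see the bullet above), here is a minimal DAG sketch in the Airflow 2.4+ style: three tasks chained into a daily extract -> transform -> load pipeline. The dag_id, schedule, and task bodies are hypothetical placeholders, not details from this posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Minimal Airflow DAG sketch: a daily extract -> transform -> load
    # chain. The task bodies are stubs for illustration only.

    def extract():
        print("extracting from source")   # placeholder for real extraction

    def transform():
        print("transforming rows")        # placeholder for real transforms

    def load():
        print("loading into warehouse")   # placeholder for real loading

    with DAG(
        dag_id="orders_daily_etl",        # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                # run once per day
        catchup=False,                    # skip historical backfill
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load  # linear dependency chain

In practice each task would call into tested pipeline code (like the ETL sketch earlier), and the DAG would be monitored as part of the production jobs and on-call rotation described above.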