Lead Data Engineer
Job Description
If you are passionate about building scalable data platforms, have expertise in modern data engineering technologies, and enjoy solving complex data challenges, we want to hear from you! We are looking for a Lead Data Engineer with 5–8 years of experience to design, develop, and optimize our data ecosystem across cloud and on‑prem environments. Experience in the healthcare domain will be a strong advantage.
Responsibilities
- Design, develop, and maintain end‑to‑end data pipelines and ETL/ELT workflows using Python, SQL, Azure Data Factory, Snowflake, and related services.
- Build and optimize high‑performance, scalable, and secure data architectures supporting both analytical and operational workloads.
- Work with cross‑functional teams—Data Architects, Application Engineers, Product Owners, and QA teams—to translate business requirements into technical data solutions.
- Ensure data quality and reliability by implementing strong data validation, monitoring, and automated testing frameworks.
- Develop and maintain data models, data marts, and analytical layers for reporting, self‑service analytics, and downstream applications.
- Implement robust CI/CD pipelines, automation strategies, and best practices for version control and deployment.
- Drive improvements in data architecture, performance tuning, and optimization, and reduce technical debt across data systems.
- Monitor, troubleshoot, and enhance existing data pipelines and systems; proactively resolve data issues and production defects.
- Ensure data solutions meet standards of security, compliance, scalability, and reliability, especially when handling sensitive or healthcare‑related data (HIPAA familiarity is a plus).
- Mentor junior team members, conduct code reviews, and contribute to building a culture of engineering excellence.
- Maintain strong documentation of data processes, systems, and architecture.
- Stay current with emerging data technologies, tools, and industry best practices, and advocate for improvements within the team.
Required Skills
- B.Tech/BE in Computer Science.
- 5–8 years of professional experience in Data Engineering.
- Strong programming skills in Python for ETL, automation, and data transformation.
- Hands‑on expertise in Azure Cloud (ADF, Azure Databricks, Azure Storage, Azure SQL, Synapse, etc.).
- Deep knowledge of SQL, including complex queries, performance tuning, and stored procedures.
- Experience with Snowflake—warehouse design, SnowSQL, Snowpipe, performance optimization.
- Solid understanding of data modeling (Star/Snowflake schemas), data warehousing concepts, workflows, and metadata management.
- Experience working with structured, semi‑structured (JSON, Parquet), and unstructured data.
- Strong knowledge of SDLC, Agile methodologies, version control (Git), and CI/CD practices.
- Excellent analytical and problem‑solving skills, with the ability to work in highly collaborative and fast‑paced environments.