Job Title: Data Engineering Lead
Location: Holmdel, NJ (onsite twice a month)
FTE Role
Description:
As the Data Engineering Lead, you will play a critical role
in leading and contributing to our data engineering competency to ensure the
highest quality, integrity, efficiency, and scalability of our data services
and infrastructure. This position requires a hands-on leader with a passion for
data excellence, a deep technical understanding, and the ability to guide and
mentor a team to provide data as a platform service to the broader organization
and to customers.
Responsibilities:
- Data Architecture: Lead the design, implementation, and operation of robust data architecture, ETL/ELT pipelines, and data warehousing strategies using Airflow and AWS cloud services.
- Team Management and Leadership: Manage a small team of Data and Business Intelligence engineers, fostering a collaborative and results-driven environment. Provide mentorship and encourage growth. Manage data expenses. Develop and report KPIs.
- Technical Expertise and Contribution: Serve as an individual contributor on our data projects. Provide hands-on guidance and expertise to the team in Python, SQL, and AWS cloud services, ensuring high-quality code and software engineering best practices. Help manage and pay down tech debt.
- Data and BI Services and Integration: Collaborate with product engineers and internal data stakeholders to make application data available for product features, and work closely with business analysts to orchestrate Business Intelligence reporting using Looker.
- Quality Standards: Uphold and refine data quality standards, engineering best practices, and KPIs while optimizing data pipelines for performance, scalability, and quality.
- Innovation: Challenge existing structures and thought processes, staying updated with the latest developments in technology and recommending innovative solutions.
- Communication: Clearly and concisely communicate ideas to both technical and non-technical stakeholders, sharing knowledge and expertise both upstream and downstream.
- Cloud Infrastructure Optimization: Work closely with DevOps to optimize cloud infrastructure performance and scalability and to reduce costs.
- Code Reviews: Participate in code reviews to ensure code quality and knowledge sharing among team members.
- Documentation: Maintain comprehensive documentation for data processes and systems. Establish and enforce documentation best practices.
- Testing: Implement robust testing frameworks for data pipelines and quality assurance. Increase test coverage across all data engineering projects. Develop automated testing scripts for data integrity and performance.
- Monitoring: Design and implement logging and monitoring systems for data workflows. Set up real-time alerting mechanisms for data pipeline issues.
- Data Governance, Compliance, and Security: Manage and maintain internal and external security best practices. Ensure compliance with data usage license agreements.
Qualifications:
- Experience: 6+ years of experience as a Data Engineer with increasing responsibility and a strong track record of technical leadership.
- Technical Skills: Advanced skills in Python 3 (5+ recent years), SQL (5+ recent years), and cloud data services (AWS preferred, 3+ years). Extensive experience with batch data pipeline development and management, data warehousing (Redshift, Postgres), and orchestration (Airflow) is a must. Substantial experience working with analytics and BI reporting.
- Leadership: 2+ years of experience in a team leadership role.
- Passion: Passionate about the latest developments in technology, with a commitment to data quality and high personal standards for code and development.
- Team Player: A friendly, collaborative, and passionate team player who can adapt to a fast-paced environment and is experienced with Scrum/Agile methodologies. Can both give and receive respectful, constructive feedback. Committed to prioritizing team success over personal gain.
- Leveraging Partnerships: Experienced in evaluating and executing strategic outsourcing opportunities as a complement to in-house capabilities, ensuring optimal resource utilization without compromising team integrity or project quality.
- Continuous Learning: A commitment to staying updated with industry trends and emerging technologies.
- Communication: Excellent communication skills, with the ability to articulate ideas and solutions effectively.
Nice to Haves:
- Working knowledge of Kubernetes and Docker
- Experience with AWS Athena and AWS Glue
- Experience with AWS RDS Aurora
- Experience with Snowflake or GCP BigQuery