Responsibilities:
- Collaborate with stakeholders to define and understand data needs.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer engagement and experience, operational efficiency, and other key business performance metrics.
- Develop new APIs, or build against existing ones, to access data or land output data in the appropriate target data store for downstream consumption.
- Design and develop efficient data architectures that can support large-scale data processing and storage requirements.
- Develop and maintain data pipelines, data models, and ETL processes that align with business requirements, data quality standards, and industry best practices.
- Work closely with other data engineering teams to build and maintain reusable data pipelines and tools, enabling faster time-to-market for data-driven solutions.
- Monitor, troubleshoot, and optimize data pipelines and processes for performance, reliability, and scalability.
- Ensure the quality and integrity of various datasets across different platforms and data sources.
- Continuously evaluate and recommend emerging technologies and methodologies to improve data engineering processes, workflows, and performance.
- Mentor and guide junior data engineers on technical best practices, code reviews, and design patterns to ensure high-quality, scalable, and maintainable data engineering solutions.
- Perform special technology-related projects as assigned.
Qualifications for the position:
- 12+ years of experience in data engineering with a focus on designing and building scalable data platforms using cloud technologies.
- Experience with data warehousing platforms (e.g., Snowflake, Redshift).
- Expert proficiency in SQL and one or more programming languages such as Python, Java, or Scala.
- Knowledge of cloud-based data platforms such as AWS, Azure, or GCP, including experience with cloud-based storage, compute, and data processing services.
- Experience designing and implementing ETL and data integration pipelines, and familiarity with data modeling concepts and database design principles.
- Experience developing production-grade, large-scale data solutions using cloud technologies.
- Familiarity with version control systems (e.g., Git) and CI/CD principles.
- Experience developing dashboards and reports in tools such as Oracle Analytics Server (OAS), Microsoft Power BI, or Google Looker.
- In-depth experience with data integration tools such as Informatica Intelligent Cloud Services and MuleSoft.
- Excellent analytical and problem-solving skills, with the ability to understand complex business processes and translate them into technical solutions.
- Exceptional communication and presentation skills, with the ability to effectively interact with clients at all levels of the organization.
- Ability to work under tight deadlines across multiple time zones and proactively manage deliverables with minimal supervision.
- Strong collaboration skills, with demonstrated ability to work across multiple teams in large or complex environments spanning time zones and geographies on deadline-driven schedules.