Job description: This role is to maintain and support the operations of the data pipelines built for the SPE TV Central Research and Analytics team. It involves gathering business requirements, analyzing them, and designing technical solutions. The job includes performing impact analysis for any change the business requests to the existing data pipeline.

Responsibilities include:
• Partnering with business users and other IT teams to design and develop data pipelines.
• Supporting the production pipelines, including operations and enhancements to existing pipelines.
• Performing impact analysis when changes are requested upstream of the existing pipeline and data model.
• Interpreting business requirements and converting them into technical requirements.
• Providing design considerations for the scalability and reliability of ingested data streams.
• Implementing ETL pipelines within and outside of a data warehouse using Python and Snowflake SnowSQL.
• Keeping SOPs up to date whenever changes are made.
• Profiling various data sources to understand the relationships across them.
• Developing optimized pipeline designs to achieve acceptable performance.
• Recommending data quality management methodologies for various projects.
• Assisting in the UAT phases of new projects.
• Following an agile development framework for all new projects.
• Following change control management for any code deployment.

Knowledge of:
• 3 to 4 years of extensive hands-on experience building data pipelines using Alteryx and Snowflake's SQL procedural language.
• 2 to 3 years of experience with Airflow/MWAA for data pipeline orchestration and schedule management.
• Experience in Python, including knowledge of the various libraries used for data extraction and transformation.
• Deep knowledge and understanding of the Snowflake database platform, including features such as external tables, clustering, various data ingestion techniques, and warehouse management.
• Excellent proficiency with Snowflake internals and with integrating Snowflake with other technologies for data processing and reporting.
• Knowledge of Tableau is a plus.
• Exceptional skill in writing complex SQL and procedural code.
• Amazon Web Services, including AWS Glue, Lambda, Secrets Manager, S3, VPC, PrivateLink, and Step Functions.
• Experience with an Infrastructure as Code (IaC) tool such as Terraform (preferred) or AWS CloudFormation.
• Experience using GitHub repositories for code management.
• Experience with deployment tools such as Jenkins.
• Knowledge of data modeling techniques.
• JIRA for project execution.

Skill In:
• Strong interpersonal skills; the ability to work effectively with team members, clients, and other areas of the IT environment.
• Strong written and oral communication.
• Excellent problem-solving abilities, strong decision-making skills, and sound judgment.

Ability To:
• Work effectively with team members, clients, and other areas of the IT environment.
Competencies: Digital: Business Intelligence and Analytics Tools; Digital: NoSQL Document DB
Branch | City | Location: TCS – WOODLAND HILLS, CA
Mohd Adil | Talent Acquisition
2838 E. LONG LAKE ROAD SUITE 210, TROY, MI 48085
M: 469-552-7783 | Mail: [email protected]
Connect with me on LinkedIn