Urgent Hiring for the Position || Lead Snowflake Data Engineer || Boca Raton, FL (Initially remote OK)

Hello,


My name is Mansi Sen, and I work as a Technical Recruiter for K-Tek Resourcing.
 
We are searching for professionals for the below business requirement for one of our clients. Please read through the requirements and connect with us if it suits your profile.


Please see the job description below. If you are interested, please send me your updated resume at
[email protected] or give me a call at 832.660.0736.

Job Title:  Lead Snowflake Data Engineer
Location:  Remote
Duration: Long Term


Job Description:
  • Total of 10+ years in a data engineering role, with 4+ years of recent experience with Snowflake.
  • Extensive experience in the design, development, and support of complex ETL solutions
  • Ability to design and implement highly performant data ingestion pipelines from multiple sources using DataStage, Snowpipe, and SnowSQL.
  • In-depth knowledge of Snowpipe, SnowSQL, and stored procedures
  • Good knowledge of Agile processes and the ability to work with Scrum teams.
  • Experience in DataStage and Snowflake performance optimization
  • Hands-on development experience with Snowflake data platform features including Snowpipe, SnowSQL, tasks, stored procedures, streams, resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, cloning, time travel, and data sharing, along with their respective use cases (a minimal sketch follows this list).
  • Advanced proficiency in writing complex SQL statements and manipulating large structured and semi-structured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • Demonstrable experience designing and implementing modern data warehouse/data lake solutions with an understanding of best practices.
  • Ready to cover on-call support on a rotation basis
  • Proficiency in the Snowflake Cloud Data Platform and familiarity with the AWS/Azure cloud platforms
  • Strong leadership qualities and the ability to coordinate with offshore teams
  • Ability to provide technical guidance to the data engineering team on data pipeline design and enhancements
  • Strong experience in ETL with data migration and data consolidation
  • Hands-on experience in ETL data loading around event and messaging patterns, streaming data, Kafka, and APIs
  • Understanding of the fundamentals of DevOps CI/CD, Git and Git workflows, and SaaS-based Git tools such as GitHub, GitLab, and Bitbucket
  • Experience working in an agile application development environment
  • Ability to proactively prioritize tasks in consultation with business stakeholders, Product Owners, and Product Managers
  • Design, build, deploy, and support DataStage ETL jobs to extract data from disparate source systems, transform it, and load it into the EDW for data mart consumption, self-service analytics, and data visualization tools.
  • Ensure data quality, efficient processing, and timely delivery of accurate and trusted data.
  • The ability to design, implement and optimize large-scale data and analytics solutions on Snowflake Cloud Data Warehouse is essential.
  • Establish ongoing end-to-end monitoring for the data pipelines.
  • Strong understanding of full CI/CD lifecycle.
  • Convert business requirements into technical solutions
  • Ensure adherence to architectural guidelines and strategic business needs
  • Technical feasibility analysis, recommendations, and effort estimation
  • Provide operational instructions for dev, QA, and production code deployments while adhering to internal Change Management processes.
  • Performance optimization
  • QA support
  • Automation
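
For illustration only (not part of the client's requirements): a minimal Snowflake SQL sketch of the ingestion pattern named above, wiring a Snowpipe, a stream, and a scheduled task together. All object names (raw_stage, raw_orders, orders, transform_wh) and the S3 location are assumptions for this sketch, not details from this posting.

    -- Illustrative sketch only; object names and the S3 URL are assumed.
    CREATE OR REPLACE TABLE raw_orders (payload VARIANT);          -- raw landing table
    CREATE OR REPLACE TABLE orders (order_id STRING, status STRING); -- curated table

    -- External stage pointing at an assumed landing location.
    CREATE OR REPLACE STAGE raw_stage
      URL = 's3://example-bucket/orders/'
      FILE_FORMAT = (TYPE = 'JSON');

    -- Snowpipe: continuously copy files that land in the stage into the raw table.
    CREATE OR REPLACE PIPE orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_orders FROM @raw_stage;

    -- Stream: capture new rows in the raw table for incremental processing.
    CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

    -- Task: merge the captured changes into the curated table on a schedule.
    CREATE OR REPLACE TASK merge_orders_task
      WAREHOUSE = transform_wh
      SCHEDULE  = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      MERGE INTO orders AS tgt
      USING (SELECT payload:order_id::STRING AS order_id,
                    payload:status::STRING   AS status
             FROM raw_orders_stream) AS src
        ON tgt.order_id = src.order_id
      WHEN MATCHED THEN UPDATE SET tgt.status = src.status
      WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (src.order_id, src.status);

    ALTER TASK merge_orders_task RESUME;

In practice the auto-ingest pipe would also need cloud notification setup and a storage integration; the sketch simply shows how the pieces named in the requirements fit together.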

Good to Have:

  • Valid professional certification
  • Experience with Python and big data cloud platforms
  • Expertise in Unix and shell scripting

USA | Canada | India

Thanks & Regards,

Mansi Sen

+1 832-660-0736

[email protected]

