Sr. Snowflake Developer // Houston, TX // Onsite // C2C

Hi Ram,

 

Role: Sr. Snowflake Developer

Location: Houston, TX / Onsite

Duration: 12+ months

Contract: C2C

Required exp: 10+ years.

 

Job Description:

Requisite Skills: Snowflake, SQL scripting & modelling, Azure Data Factory (ADF), understanding of Azure resources in general, Python, basic JavaScript, DevOps and Git, Enterprise Data Warehousing (conceptual understanding).

Primary functions: Database development and design, data integration and cloud support, data warehousing.

Primary Deliverables: SQL-based reports, data integration and pipelines using ETL services, SQL/JavaScript stored procedures and functions, support of the cloud resources related to Snowflake (ADF, Storage Accounts, Azure Functions, etc.), scripts and Azure Functions built in Python, support documentation.

Activities:

·        End-to-end creation of reports in Snowflake, including data ingestion; database and cloud object design and creation; testing (for quality and performance); and deployment

·        Support of existing models in Snowflake

·        Design and support of data pipelines using Azure Data Factory (ADF); other ETL tools such as Qlik, Fivetran, NiFi, BODS, and SLT can also be used

·        Creation of scripts – mainly in Python – to manipulate and transform data when required by the data ingestion pipelines

·        Creation of CI/CD pipelines for driving the Change Management process of the Snowflake objects and their related data ingestion requirements

·        Provide support to the business team in data testing and validation activities

Technical Competencies:

·        Advanced knowledge of SQL, including performance tuning.

·        Solid knowledge of Cloud environments, especially Azure.

·        Skills to build robust and complex data pipelines using Azure Data Factory (ADF)

·        Knowledge of the Python and JavaScript programming languages and the ability to write and support code in them

·        Knowledge of ETL tools such as Qlik, Fivetran, NiFi, BODS, and SLT

·        Ability to build and run CI/CD pipelines for different resources in our technology stack

·        Basic data security and access management

Preferred candidate should:

·        Have a Bachelor's Degree in Engineering Technology, Computer Science, or a related field with equivalent experience

·        Demonstrate critical thinking and analytical skills, and employ judgment to offer thoughtful, concise input toward resolving problems

·        Have strong communication and interpersonal skills and solid English proficiency

·        Be able to translate data requirements into business processes and reverse engineer business processes into data requirements

·        Show the leadership skills needed to successfully promote ideas, coordinate work activities, and plan deliverables within a project team

·        Have comprehension of DevOps and Agile development and their application to data-centric architecture and solutions

·        Have experience working in an Agile environment and familiarity with Scrum team processes and ceremonies

·        Have hands-on experience with Snowflake, Azure and ETL tools

·        Understand programming logic and architecture and have the ability to write code in Python and/or JavaScript

·        Know other DBMSs such as SAP HANA and SQL Server

Nice to have: ETL tools (Qlik, Fivetran, etc.), SAP HANA and other DBs, Clean Code principles, basic security and control knowledge.
