Remote (Rate: $48/hr) || Data Engineer with Snowflake & Azure || No GC/USC/OPT/CPT

Hi

I am Dheeraj from Quantum World Technologies Inc. Please have a look at the job details below and let me know if you are interested in this role, or if you know a friend or colleague who might be. If you would like to be considered, please reply with a current copy of your resume.

 

Job Title: Data Engineer

Location: Remote

Duration: Long-term contract

 

Job Description:

Primary Responsibilities:
• Create & maintain data pipelines using Azure & Snowflake as the primary tools
• Create SQL stored procedures and functions to perform complex transformations (a minimal Python sketch follows this list)
• Understand data requirements and design optimal pipelines to fulfill the use cases
• Create logical & physical data models to ensure data integrity is maintained
• Manage code and create & automate CI/CD pipelines using GitHub & GitHub Actions
• Tune and optimize data processes
• Design and build best-in-class processes to clean and standardize data
• Deploy code to the production environment and troubleshoot production data issues
• Model high-volume datasets to maximize performance for our Business Intelligence & Data Science teams
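For context, a minimal sketch of the kind of transformation step this role involves, driven from Python with the snowflake-connector-python package; the table names (STG_CUSTOMER, DIM_CUSTOMER), warehouse, database, and credential environment variables below are hypothetical placeholders, not details of the actual environment.

import os

import snowflake.connector

# Simplified SCD Type 2-style merge: expire changed rows and insert brand-new
# customers. A full implementation would also re-insert the new version of
# each changed customer.
MERGE_SQL = """
MERGE INTO DIM_CUSTOMER AS tgt
USING STG_CUSTOMER AS src
  ON tgt.customer_id = src.customer_id AND tgt.is_current = TRUE
WHEN MATCHED AND tgt.address <> src.address THEN
  UPDATE SET is_current = FALSE, valid_to = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN
  INSERT (customer_id, address, valid_from, valid_to, is_current)
  VALUES (src.customer_id, src.address, CURRENT_TIMESTAMP(), NULL, TRUE)
"""

def run_transformation() -> None:
    # Connection details come from the environment; in practice the password
    # would be pulled from Azure Key Vault rather than stored in plain text.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
        schema="DW",
    )
    cur = conn.cursor()
    try:
        cur.execute(MERGE_SQL)
        print(f"Rows affected: {cur.rowcount}")
    finally:
        cur.close()
        conn.close()

if __name__ == "__main__":
    run_transformation()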

Qualifications
Required Qualifications:

• Bachelor's degree in Computer Science or a similar field
• 1-4 years of industry experience as a hands-on data engineer
• Excellent communication skills, verbal and written
• Excellent knowledge of SQL
• Excellent knowledge of Azure services such as Blob Storage, Functions, Azure Data Factory, Service Principals, Containers, and Key Vault (a short Python sketch follows this list)
• Excellent knowledge of Snowflake: architecture, features, and best practices
• Excellent knowledge of data warehousing & BI solutions
• Excellent knowledge of change data capture (CDC), ETL, ELT, slowly changing dimensions (SCD), etc.
• Hands-on experience with the following:
o Developing data pipelines in Azure & Snowflake
o Writing complex SQL queries
o Building ETL/ELT/data pipelines using SCD logic
o Query analysis and optimization
• Analytical and problem-solving experience applied to big data datasets
• Data warehousing principles, architecture, and their implementation in large environments
• Experience working on projects with agile/scrum methodologies and high-performing teams
• Knowledge of different data modeling techniques such as star schema, dimensional models, and data vault is an advantage
• Experience with code lifecycle management and repositories such as Git & GitHub
• Exposure to DevOps methodology
• Good understanding of access control and data masking
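As a short sketch of the Azure services listed above, the snippet below reads a secret from Key Vault and a landing file from Blob Storage using a service principal, assuming the azure-identity, azure-keyvault-secrets, and azure-storage-blob packages; every URL, container, blob, and secret name is a hypothetical placeholder.

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from azure.storage.blob import BlobServiceClient

# DefaultAzureCredential picks up service-principal settings from the
# AZURE_CLIENT_ID / AZURE_TENANT_ID / AZURE_CLIENT_SECRET environment
# variables, or falls back to a managed identity.
credential = DefaultAzureCredential()

# Pull the Snowflake password from Key Vault instead of hard-coding it.
vault = SecretClient(vault_url="https://my-vault.vault.azure.net", credential=credential)
snowflake_password = vault.get_secret("snowflake-password").value

# Download a raw landing file from Blob Storage for downstream loading.
blob_service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.core.windows.net",
    credential=credential,
)
blob = blob_service.get_blob_client(container="raw", blob="customers/2024-01-01.csv")
data = blob.download_blob().readall()
print(f"Downloaded {len(data)} bytes")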

Preferred Qualifications:
• Knowledge of and experience with Terraform, CI/CD pipelines, and automation is an advantage
• Automation and orchestration using Azure Data Factory (ADF) (a minimal sketch follows this list)
• Experience creating real-time analytics pipelines using Snowpipe Streaming
• Experience developing optimized data models for visualization tools (e.g., Tableau, Power BI) is an advantage
• Exposure to and experience with other programming languages and frameworks such as Python and Spark is an advantage
• Hands-on experience with CI/CD pipelines using Git & GitHub Actions
• Understanding of United States healthcare data and applicable regulations
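And a minimal sketch of ADF orchestration from Python, assuming the azure-identity and azure-mgmt-datafactory packages; the subscription ID, resource group, factory name, pipeline name, and parameters are hypothetical placeholders.

import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
RESOURCE_GROUP = "rg-data-platform"                       # placeholder
FACTORY_NAME = "adf-ingestion"                            # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off the pipeline, optionally overriding parameters defined in ADF.
run = client.pipelines.create_run(
    RESOURCE_GROUP,
    FACTORY_NAME,
    "pl_load_customers",
    parameters={"load_date": "2024-01-01"},
)

# Poll until the run leaves its in-flight states (Queued, InProgress, Canceling).
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    if status not in ("Queued", "InProgress", "Canceling"):
        break
    time.sleep(30)

print(f"Pipeline finished with status: {status}")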

 

 

Thanks & Regards

Dheeraj Singh

Sr. Recruiter

Quantum World Technologies Inc.

4281 Katella Ave, Suite # 102, Los Alamitos, CA, 90720

www.quantumworldit.com

E: [email protected]

 

 
 
 

