Data Engineer :: Seattle, WA (hybrid) :: H4 EAD and US Citizens Only

Hi,

 

I am Rakesh from Mighty Warner. I am reaching out to you about a Data Engineer opportunity in Seattle, WA (hybrid).

 

Please share the details below as soon as possible for quick submission:

 

Work Authorization:

If GC (year of birth):

Current Location:

Driver's License Location:

LinkedIn URL:

 

H4 EAD and US Citizens only.

 

Data Engineer

 

Location: Seattle, WA (hybrid)
**Must have industry experience in e-commerce/retail**

**Snowflake required: the role involves migrating data into Snowflake**
**Experience with massive amounts of data: this company has over 13 million users**
*GCP, Google Analytics, Azure, Pipelines, Delta*

 

Job description:
As an Engineer II, you will bring a high level of technical knowledge, but also an ability to spread knowledge to your co-workers. You will help form the core of our engineering practice at the company by contributing to all areas of development and operations (pre-production to production). You will be an example of what good engineering looks like and help others around you refine their skills. You will be part of a day-to-day production release team and may perform on-call support functions as needed. Having a DevOps mindset is the key to success in this role, as Engineers are commonly part of full DevOps teams that “own” all parts of software development, release pipelines, production monitoring, security and support.
Data Engineering Projects:
Data pipeline creation and maintenance. Stack: Google Cloud Platform (GCP), Azure Cloud, Azure Databricks, Snowflake. Includes engineering documentation, knowledge transfer to other engineers, and future enhancements and maintenance. Create secure data views and publish them to the Enterprise Data Exchange via Snowflake for other teams to consume.
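For illustration only, a minimal sketch of publishing a secure view with the Snowflake Python connector; all object and role names (ANALYTICS_DB, PUBLISHED.ORDER_SUMMARY, EDX_CONSUMER, etc.) are hypothetical placeholders, not the client's actual objects:

```python
# Sketch: create a secure view and grant it to a consumer role.
# Connection details and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS_DB",
)

try:
    cur = conn.cursor()
    # Secure views hide the view definition from consumers and block
    # optimizer-based data leaks, which suits cross-team data sharing.
    cur.execute("""
        CREATE OR REPLACE SECURE VIEW PUBLISHED.ORDER_SUMMARY AS
        SELECT order_id, order_date, total_amount
        FROM RAW.ORDERS
        WHERE is_test_order = FALSE
    """)
    # Grant read access to the role other teams use to consume shared data.
    cur.execute("GRANT SELECT ON VIEW PUBLISHED.ORDER_SUMMARY TO ROLE EDX_CONSUMER")
finally:
    conn.close()
```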
Data pipeline modernization and migration via Databricks Delta Live Tables (DLT) and Unity Catalog. Leverage the existing CI/CD process for pipeline deployment. Adhere to PII encryption and masking standards.
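A hedged sketch of what a DLT table definition with PII masking might look like; it assumes the code runs inside a Databricks DLT pipeline, and the table and column names (raw_customers, email, phone) are invented for illustration:

```python
# Sketch of a Delta Live Tables step that masks PII before publishing.
# Only runs inside a Databricks DLT pipeline; names are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(
    name="customers_masked",
    comment="Customer records with direct identifiers hashed per masking standards.",
)
def customers_masked():
    # Read an upstream dataset defined in the same pipeline (hypothetical name).
    df = dlt.read("raw_customers")
    # Hash direct identifiers; non-sensitive columns pass through unchanged.
    return (
        df.withColumn("email", F.sha2(F.col("email"), 256))
          .withColumn("phone", F.sha2(F.col("phone"), 256))
    )
```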
Data Engineering Tools/Techniques:
Orchestration tools: ADF, Airflow, Fivetran. Languages: SQL, Python. Data modeling: star and snowflake schemas. Streaming: Kafka, Event Hubs, Spark, Snowflake streaming.
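On the orchestration side, a minimal Airflow DAG sketch that sequences an extract step and a load step; the DAG id, task bodies, and schedule are placeholders, not the client's actual pipeline:

```python
# Minimal Airflow 2.x DAG sketch: extract -> load, run daily.
# Task logic is a placeholder; real tasks would call source/Snowflake APIs.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    print("extracting batch from source system...")  # placeholder

def load(**context):
    print("loading batch into Snowflake...")  # placeholder

with DAG(
    dag_id="clickstream_to_snowflake",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```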
DevOps:
Support improvements to the current CI/CD process. Production monitoring and failure support. Provide an escalation point and participate in on-call support rotations.
Conduct research to aid in product troubleshooting and optimization efforts
Participate in and contribute to our Engineering Community of Practice
Qualifications:
Bachelor’s degree or diploma (or equivalent experience) in Computer Science, Software Engineering or Software Architecture preferred; candidates with substantial and relevant industry experience are also eligible
10+ years of relevant engineering experience
Google Professional Data Engineer Certification is preferred
Experience with Bigtable, clickstream data migration, and semi-structured and unstructured data management
Experience with Google Cloud Platform (GCP) and BigQuery
Experience with developing complex SQL queries
Experience with CI/CD principles and best practices
Experience with Azure Data Factory, Azure Databricks, Snowflake, and Azure Storage Accounts


Rakesh Jaiswal

Senior Technical Recruiter

Email: rakesh@mightywarnerconsulting.com

Address: 30 N Gould St, Ste R, Sheridan, WY 82801

Website: www.mightywarners.com
