CST Based Candidates Only// Urgently Looking For Sr. Data Engineer (Azure)// Des Moines, IA (Hybrid)// QMC// USC, GC Only

Role: Sr. Data Engineer (Azure)

Location: Des Moines, IA (Hybrid) // CST Based Candidates Only

Duration: 6-12+ months

Client: QMC

Visa: USC, GC Only

Process: Phone/Skype

 

Key Responsibilities

  • Lead the design, development, and optimization of data pipelines and architectures utilizing Azure technologies (Synapse, Azure Data Factory, Delta Lake, Databricks) or equivalent platforms such as AWS, Snowflake, or GCP.
  • Define and implement comprehensive data architecture strategies, focusing on Data Quality, Metadata Management, and Master Data Management (MDM) to support enterprise-wide analytics and decision-making.
  • Develop and oversee advanced dimensional data models (e.g., Star Schema) and implement efficient data population techniques for scalable and performant analytics systems.
  • Write and review complex PySpark and SQL code, ensuring best practices and scalability for data transformation and processing pipelines.
  • Drive the integration of hybrid data environments, seamlessly combining on-premises and cloud technologies to support diverse business needs.
  • Design and enforce CI/CD pipelines to enhance automation, efficiency, and reliability of data deployments.
  • Act as a technical leader within Agile Scrum product teams, mentoring team members, ensuring alignment with Agile principles, and delivering high-impact solutions.
  • Monitor, troubleshoot, and continuously improve data systems, ensuring optimal performance and resilience.
  • Provide thought leadership on emerging trends in data engineering, recommending tools and technologies to enhance the organization’s data capabilities.

 

Must-Have Skills

  • Extensive experience with Azure Cloud Technologies (Synapse, Azure Data Factory, Delta Lake, Databricks) or similar platforms such as AWS, Snowflake, or GCP.
  • Deep expertise in data architecture principles, including Data Quality, Metadata Management, and MDM.
  • Proven track record of designing and optimizing dimensional data models (e.g., Star Schema) for high-performance analytics.
  • Advanced programming and optimization skills in PySpark and SQL, capable of handling complex data transformations.
  • Comprehensive knowledge of hybrid data ecosystems, combining on-premises and cloud infrastructures.
  • Strong experience with CI/CD pipelines and their application in data engineering environments.
  • Demonstrated leadership in Agile/Scrum teams, including mentoring junior engineers and driving best practices.
  • Strong analytical, problem-solving, and decision-making skills, with a focus on delivering scalable and reliable solutions.

 

Preferred Skills

  • Knowledge of the data challenges and opportunities specific to the manufacturing industry.
  • Experience with real-time data processing and streaming data solutions.
  • Proficiency in implementing data governance frameworks to ensure compliance and security.
  • Familiarity with advanced analytics tools, such as machine learning platforms or AI integration for predictive insights.

 

Regards,

 

Anand

Headwit Global Inc.

Phone: +1 (512) 866-4578

[email protected]

5900 Balcones Drive

Suite #100, Austin, TX 78731
