Lead Data Engineer

Role: Lead Data Engineer

Location: Scottsdale, AZ (100% onsite)
Hire type: Contract

Experience: 10+ years

Must-have skill set: Spark, S3, Glue, AWS Redshift, Python, and StreamSets experience

6-8 years of IT experience focusing on enterprise data architecture and management.

  • Experience in Conceptual/Logical/Physical Data Modelling & expertise in Relational and Dimensional Data Modelling
  • Experience with Databricks (cloud and on-prem), Structured Streaming, Delta Lake concepts, and Delta Live Tables required
  • Experience with Spark (Scala)
  • Delta Lake concepts such as time travel, schema evolution, and optimization
  • Structured Streaming and Delta Live Tables with Databricks a bonus
  • Experience leading and architecting enterprise-wide initiatives, specifically system integration, data migration, transformation, data warehouse builds, data mart builds, and data lake implementation/support
  • Advanced-level understanding of streaming data pipelines and how they differ from batch systems
  • Ability to formalize concepts such as handling late data, defining windows, and data freshness (see the streaming sketch after this list)
  • Advanced understanding of ETL and ELT, and of ETL/ELT tools such as AWS Database Migration Service (DMS), etc.
  • Understanding of concepts and implementation strategies for different incremental data loads such as tumbling window, sliding window, high watermark, etc. (see the incremental-load sketch after this list)
  • Familiarity and/or expertise with Great Expectations or other data quality/data validation frameworks a bonus
  • Familiarity with concepts such as late data, defining windows, and how window definitions impact data freshness
  • Advanced-level SQL experience (joins, aggregation, windowing functions, Common Table Expressions, RDBMS schema design, performance optimization); see the SQL sketch after this list
  • Indexing and partitioning strategy experience
  • Ability to debug, troubleshoot, design, and implement solutions to complex technical issues
  • Experience with large-scale, high-performance enterprise big data application deployment and solutions
  • Architecture experience in AWS environment a bonus
  • Familiarity with AWS Lambda, specifically how to push and pull data and how to use AWS tools to view data when processing massive data at scale, a bonus
  • Experience with GitLab and CloudWatch, and the ability to write and maintain GitLab CI configurations supporting CI/CD pipelines
  • Experience configuring and optimizing AWS Lambda functions, and experience with S3
  • Familiarity with Schema Registry, message formats such as Avro, ORC, etc.
  • Ability to thrive in a team-based environment
  • Experience briefing the benefits and constraints of technology solutions to technology partners, stakeholders, team members, and senior levels of management
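
For illustration only, a minimal PySpark Structured Streaming sketch of the late-data, watermark, and windowing concepts referenced above; the rate source, window sizes, and column names are assumptions for the sketch, not part of the role:

    # Tumbling-window count with a watermark that bounds how late data may arrive.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import window, col, count

    spark = SparkSession.builder.appName("late-data-sketch").getOrCreate()

    # "rate" is a built-in test source standing in for Kafka/Kinesis in this sketch.
    events = (
        spark.readStream.format("rate")
        .option("rowsPerSecond", 10)
        .load()
        .withColumnRenamed("timestamp", "event_time")
    )

    # Accept events up to 10 minutes late; older events are dropped. The watermark
    # choice is the trade-off between completeness and data freshness.
    windowed_counts = (
        events
        .withWatermark("event_time", "10 minutes")
        .groupBy(window(col("event_time"), "5 minutes"))   # tumbling window
        .agg(count("*").alias("event_count"))
    )

    # Append mode emits a window only after the watermark has passed it.
    query = (
        windowed_counts.writeStream
        .outputMode("append")
        .format("console")
        .start()
    )
    query.awaitTermination()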
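
Similarly, a hedged sketch of the high-watermark incremental load pattern; the control table, source and target table names, and the updated_at column are hypothetical, and the final UPDATE assumes the control table is stored as a Delta table:

    # Pull only rows changed since the last successful run, then advance the watermark.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, max as spark_max

    spark = SparkSession.builder.appName("incremental-load-sketch").getOrCreate()

    # 1. Read the high watermark persisted by the previous run.
    row = (
        spark.table("etl_control.watermarks")
        .filter(col("table_name") == "orders")
        .select("last_value")
        .first()
    )
    last_watermark = row["last_value"] if row else "1970-01-01 00:00:00"

    # 2. Pull only the delta from the source.
    delta = spark.table("source_db.orders").filter(col("updated_at") > last_watermark)

    # 3. Land the delta in the target, then advance the watermark for the next run.
    delta.write.mode("append").saveAsTable("warehouse.orders_stage")

    new_watermark = delta.agg(spark_max("updated_at").alias("wm")).first()["wm"]
    if new_watermark is not None:
        spark.sql(
            f"UPDATE etl_control.watermarks SET last_value = '{new_watermark}' "
            "WHERE table_name = 'orders'"
        )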
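
And a small spark.sql example combining a Common Table Expression with a windowing function (ROW_NUMBER) to keep the latest record per key; table and column names are made up:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("sql-window-sketch").getOrCreate()

    # Keep only the most recent version of each order.
    latest_orders = spark.sql("""
        WITH ranked AS (
            SELECT order_id, customer_id, order_total, updated_at,
                   ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY updated_at DESC) AS rn
            FROM warehouse.orders_stage
        )
        SELECT order_id, customer_id, order_total, updated_at
        FROM ranked
        WHERE rn = 1
    """)
    latest_orders.show()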

Thanks & Regards

Dhanasekaran
E-IT Professionals Corp.

Email | [email protected]
