Role: Sr. Data Engineer
Location: Scottsdale, AZ (Onsite)
Duration: Long-term
Note: Must have 2+ years of hands-on experience on GCP cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.).
Job Description:
• Solid experience with and understanding of the considerations for large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.
• Monitor the data lake continuously and ensure that the appropriate support teams are engaged at the right times.
• Design, build, and test scalable data ingestion pipelines; perform end-to-end automation of the ETL process for the various datasets being ingested.
• Determine the best way to extract application telemetry data, structure it, and send it to the appropriate reporting tool (Kafka, Splunk).
• Create reports to monitor usage data for billing and SLA tracking.
• Work with business and cross-functional teams to gather and document requirements to meet business needs.
• Provide support as required to ensure the availability and performance of ETL/ELT jobs.
• Provide technical assistance and cross-training to business and internal team members.
• Collaborate with business partners for continuous improvement opportunities.
Requirements
JOB SPECIFICATIONS:
Education: Bachelor’s Degree in Computer Science, Information Technology, Engineering, or related field
Experience, Skills & Qualifications:
• 3+ years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.
• 3+ years of experience with one of the leading public clouds.
• 2+ years of experience designing and building scalable data pipelines for extraction, transformation, and loading.
• Mandatory: 3+ years of experience with Python, with working knowledge of notebooks.
• Mandatory: 3+ years working on cloud data projects.
• Nice to have: Scala experience.
• Must have for onshore candidates: 2+ years of hands-on experience on GCP cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.).
• At least 2 years of experience in data governance and metadata management.
• Ability to work independently, solve problems, and keep stakeholders updated.
• Ability to analyze, design, develop, and deploy solutions per business requirements.
• Strong understanding of relational and dimensional data modeling.
• Experience with DevOps and CI/CD-related technologies.
• Excellent written and verbal communication skills, including experience with technical documentation and the ability to communicate with senior business managers and executives.
Thanks & Regards
Ganesh Pasham
Technical Recruiter
E-mail: [email protected]