Remote Senior GCP Data Engineer

Title:                   Senior GCP Data Engineer

Duration:         6+ months

Interview:        Phone and Video

Visa:                     USC, OPT/EAD, H4/EAD

City:                     Remote


No GC and GC-EAD

POSITION SUMMARY:

The GCP Data Engineer is responsible for the analysis, design, development, testing, and deployment support of new frameworks, and for enhancing the existing frameworks that power the data platform's data pipelines. This position requires working independently in a highly dynamic, fast-paced environment. The person will work alongside architects, engineers, analysts, and PMs to deliver scalable, robust, and innovative technical solutions. This position plays a key role in building real-time and batch data ingestion and egress frameworks and a streaming analytics framework, and in supporting the AI platform. Prior experience in a similar role is required.

RESPONSIBILITIES

  • Build frameworks for large-scale data processing, evaluating appropriate emerging technologies and approaches that will power data-driven capabilities across the enterprise.
  • Develop data solutions on Google Cloud Platform leveraging Dataflow, Dataproc, Composer, Pub/Sub, BigQuery, GCS, and Cloud Functions; define workflows and scheduled and event-driven workloads to ingest data from internal/external partners, data distribution channels, etc.
  • Build features using Python and Spark, leveraging GCP's Spark engine (Dataproc), and SQL/GSQL on Google BigQuery.
  • Build and maintain scalable data pipelines to handle high-volume data (e.g., 150 million rows).
  • Collaborate with cross-functional technologists across the organization to gather requirements, solve new problems and deliver quality results.
  • Coordinate with offshore engineers to get projects/tasks completed.
  • Develop and execute test plans to validate the implementation and performance of frameworks, and recommend performance improvements.
  • Support the operations of the deployed solutions, investigate complex issues, and assist with the resolution and implementation of preventive measures.
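To give a concrete (if simplified) picture of the event-driven ingestion work described above, the sketch below shows the general shape of a Cloud Functions-style handler that decodes a Pub/Sub push message. This is an illustrative example only, not part of the role description; the `handle_event` name and the `source`/`rows` message fields are hypothetical.

```python
import base64
import json

def handle_event(event: dict) -> dict:
    """Decode a Pub/Sub-style event payload (base64-encoded JSON).

    Pub/Sub delivers the message body base64-encoded under event["data"];
    the record fields used here (source, rows) are hypothetical examples.
    """
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    # A real pipeline would validate the record here and route it onward
    # (e.g., into BigQuery, or into a Dataproc/Spark batch job).
    return {"source": payload.get("source"), "rows": payload.get("rows", 0)}

# Simulate the event envelope a Pub/Sub trigger would deliver
message = json.dumps({"source": "partner_feed", "rows": 150_000_000})
event = {"data": base64.b64encode(message.encode("utf-8")).decode("utf-8")}
print(handle_event(event))
```

In production, such a handler would typically be one small piece of a larger scheduled or event-driven workflow orchestrated by Composer.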

REQUIREMENTS FOR CONSIDERATION:

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 8+ years of experience in data engineering, with a focus on GCP technologies.
  • Proficiency in Google Cloud Platform, leveraging Dataflow, Dataproc, Composer, Pub/Sub, BigQuery, GCS, and Cloud Functions, including defining workflows and scheduled and event-driven workload processing.
  • Hands-on experience and solid knowledge in building and maintaining end-to-end data pipelines using Python and Spark (GCP Dataproc) or other GCP services.
  • Strong experience with BigQuery including SQL and Stored Procedures.
  • Experience with high-volume data processing (GB-level).
  • Previous experience working in a similar role with GCP Services is a must.
  • Experience in data quality and continuous integration build and deployment processes using GitHub, Jenkins, and Unix/Linux shell scripts.
  • Proactive and able to catch issues before they cause failures.
  • Possess a strong work ethic, take pride in producing a quality product, and be a strong team player.
  • Work with production support and project consultants in an onshore/offshore model.
  • Support off-hours platform issues and code deployments as needed.


Thanks & Regards

Shard Phutela | Senior Technical Recruiter

D: 267-665-2313

First Ring Solutions LLC | Philadelphia, PA 19102

Note: Due to a high volume of calls, I may miss your call; email is the best way to reach me.
