Title: GCP Developer, Dallas, TX, Onsite

Hi,

 

We have an urgent requirement for a GCP Developer with our client in Dallas, TX.

 

Please send your resume to [email protected]

 

Note: If you are not looking for a new opportunity, please refer someone who might be; it would be much appreciated.

 

Title: GCP Developer

Location: Dallas, TX

Duration: Long-term contract

Job Description:

 

1. Design, develop, and maintain scalable data pipelines and data processing systems on Google Cloud Platform (GCP).

2. Collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and implement solutions accordingly.

3. Develop and optimize ETL processes to ensure efficient data ingestion, transformation, and loading.

4. Implement data governance and security measures to ensure data quality, integrity, and privacy.

5. Monitor and troubleshoot data pipelines to identify and resolve issues in a timely manner.

6. Work with cross-functional teams to integrate data from various sources and systems.

7. Conduct performance tuning and optimization of data processing jobs.

8. Stay updated with the latest trends and technologies in the field of data engineering and GCP services.

Focus Areas:

• Building scalable and efficient data pipelines on GCP.

• Data governance and security.

• Integration of data from multiple sources and systems.

• Performance tuning and optimization.

• Staying updated with emerging technologies and best practices in data engineering.

 

Key Skill Sets:

1. Data Engineering: The candidate should have experience building data pipelines using Python/PySpark on GCP.

2. Dataproc Knowledge: The candidate should have working knowledge of Dataproc Serverless and ephemeral Dataproc clusters.

3. Airflow Expertise: Proficiency in Airflow is required.

4. BigQuery: The candidate must have strong SQL writing skills.

5. ML Experience: Experience in machine learning is an added advantage.

6. Vertex AI: Knowledge of and hands-on experience with model building in Vertex AI is an added advantage.

 

Qualifications we seek in you!

Minimum Qualifications:

1. Bachelor's degree in computer science, information systems, or a related field.

2. Experience in data engineering or a similar role.

3. Demonstrated experience in designing and implementing data pipelines using GCP services.

4. Proficiency in Python, SQL, and data manipulation techniques.

5. Strong understanding of cloud computing concepts and distributed systems.

Preferred Qualifications/Skills:

1. Master's degree in computer science, information systems, or a related field.

2. Experience with other cloud platforms such as AWS or Azure.

3. Certification in GCP data engineering or a related field.

4. Familiarity with machine learning concepts and frameworks.

5. Experience with real-time data processing and streaming technologies.

Regards,

Nagendar Goud Mula

Sr. US IT Recruiter

E: [email protected]

linkedin.com/in/nagendar-goud-38680ba6