Hi Bench Sales Recruiters,
This is Sai Ganesh, Technical Recruiter at Hectadata, LLC (DBA Vilwaa). We have an opening for a GCP Data Engineer.
If you are interested, please review the job description below and let me know your availability, or call me at your earliest convenience.
Job Role: GCP Data Engineer
Location: Remote
Key Responsibilities:
- Design, develop, and maintain robust and scalable data pipelines on Google Cloud Platform using services like BigQuery, Dataproc, Dataflow, and other relevant GCP tools.
- Implement ETL/ELT processes to extract, transform, and load data from various data sources into cloud-based storage and databases.
- Utilize Airflow (or a similar orchestration tool) to schedule, monitor, and manage workflow tasks and ensure seamless data processing.
- Collaborate with data scientists, analysts, and stakeholders to understand data requirements and ensure efficient data flow for analytics and machine learning models.
- Perform data cleansing, data wrangling, and ensure data integrity and data quality throughout the lifecycle.
- Optimize queries and jobs for performance and cost-efficiency in BigQuery and other GCP services.
- Develop and maintain documentation for data pipelines, processes, and architectures.
- Stay up-to-date with the latest GCP advancements and industry best practices to improve system performance and scalability.
- Troubleshoot and resolve any data-related issues and ensure high availability of data processing platforms.
Required Skills and Qualifications:
- 4+ years of experience as a Data Engineer, preferably in cloud environments.
- Expertise in Google Cloud Platform (GCP) with hands-on experience in BigQuery, Dataproc, Dataflow, and Cloud Storage.
- Proficient in designing and implementing ETL/ELT pipelines using GCP services.
- Experience with workflow orchestration tools like Airflow, Cloud Composer, or similar technologies.
- Strong programming skills in Python, Java, or Scala.
- Knowledge of SQL and experience in optimizing queries for performance.
- Familiarity with cloud infrastructure and containerization tools like Kubernetes and Docker is a plus.
- Experience working with streaming data and batch processing in cloud environments.
- Strong problem-solving skills and the ability to troubleshoot complex data issues.
- Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams.
Preferred Qualifications:
- Google Cloud Certification (Professional Data Engineer, Associate Cloud Engineer, etc.).
- Experience in machine learning pipelines or supporting data science teams.
- Experience with CI/CD pipelines for data engineering workloads.
Working Hours:
- Ability to work during Pacific Time (PT) hours.
Thanks & Regards,