Hi, this is Akash from Empower Professionals. I am currently seeking a qualified GCP Data Engineer. Kindly share the profiles of your best consultants at your earliest convenience.
Role: GCP Data Engineer
Location: Bentonville, AR / Sunnyvale, CA
Duration: 12+ months
Must Have:
• 4+ years of recent GCP experience
• Experience building data pipelines in GCP
• GCP Dataproc, GCS, and BigQuery experience
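As a rough illustration of the kind of GCS-to-BigQuery loading this stack involves, here is a minimal Python sketch using the google-cloud-bigquery client (the project, bucket, dataset, and table names are hypothetical placeholders):

    from google.cloud import bigquery

    # Hypothetical project ID; replace with the target GCP project.
    client = bigquery.Client(project="my-project")

    # Load a daily CSV export from GCS into a BigQuery table,
    # letting BigQuery infer the schema from the file.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    )
    load_job = client.load_table_from_uri(
        "gs://my-bucket/events/2024-01-01.csv",  # hypothetical source file
        "my_dataset.events",                     # hypothetical destination table
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes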
Requirements:
• Hands-on experience developing data warehouse solutions and data products.
• 6+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, or Spark, and with Airflow or another workflow orchestration solution.
• 5+ years of hands-on experience modeling and designing schemas for data lakes or for RDBMS platforms.
• Experience with programming languages such as Python, Java, or Scala.
• Experience with scripting languages such as Perl or Shell.
• Experience working with, processing, and managing large data sets (multi-TB/PB scale).
• Exposure to test-driven development and automated testing frameworks.
• Background in Scrum/Agile development methodologies.
• Able to deliver on multiple competing priorities with little supervision.
• Excellent verbal and written communication skills.
• Bachelor's degree in Computer Science or equivalent experience.
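As a rough sketch of the workflow-orchestration experience listed above, a minimal Airflow DAG that submits a PySpark job to Dataproc might look like this (assumes the apache-airflow-providers-google package is installed; the project, region, cluster, and file names are hypothetical):

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.dataproc import (
        DataprocSubmitJobOperator,
    )

    # Hypothetical Dataproc PySpark job spec; every name is a placeholder.
    PYSPARK_JOB = {
        "reference": {"project_id": "my-project"},
        "placement": {"cluster_name": "my-cluster"},
        "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/transform.py"},
    }

    with DAG(
        dag_id="daily_transform",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",  # run once per day
        catchup=False,
    ) as dag:
        submit_transform = DataprocSubmitJobOperator(
            task_id="submit_transform",
            job=PYSPARK_JOB,
            region="us-central1",
            project_id="my-project",
        )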
The most successful candidates will also have experience with the following:
• Gitflow
• Atlassian products: Bitbucket, JIRA, Confluence, etc.
• Continuous integration tools such as Bamboo, Jenkins, or TFS