Urgent Req: Data Engineer - GCP (Ex-Walmart)


Hi,

I hope you're doing well!

Please review the requirements below, let us know if you're interested, and send your updated resume to [email protected].

Role: Data Engineer - GCP (Ex-Walmart)

Location: Sunnyvale, CA (Onsite)

Job Description:

Key Responsibilities:

Big Data Application Development:

  • Design and develop applications for big data platforms using open-source technologies such as Apache Hive, Apache Spark, Apache Kafka, and Apache Airflow.
  • Focus on building and maintaining distributed data processing platforms.

Data Modeling & Pipelines:

  • Design both logical and physical data models tailored for big data environments.
  • Build data pipelines and automate workflows, using Apache Airflow for orchestration.

Ongoing System Maintenance:

  • Provide maintenance, enhancements, and operational support for existing big data systems.
  • Participate in rotational on-call support for system troubleshooting and issue resolution.

Knowledge Sharing & Mentoring:

  • Quickly understand the business domain and technology infrastructure to contribute meaningfully to projects.
  • Actively mentor junior engineers and share your knowledge across the team.

Leadership & Team Coordination:

  • Lead daily standups and design reviews, ensuring the team stays aligned with objectives and timelines.
  • Groom and prioritize backlogs using JIRA to manage tasks and sprint activities.

Point of Contact:

  • Act as the primary point of contact for the assigned business domain, collaborating closely with stakeholders and business teams.

Essential Requirements:

Google Cloud Platform (GCP) Experience:

  • 3+ years of recent experience with GCP services such as Dataproc, GCS, and BigQuery.
  • Demonstrated experience in building and managing data pipelines in GCP.

Data Warehouse & Distributed Data Systems:

  • 5+ years of hands-on experience with data warehouse solutions and data products.
  • 5+ years of experience with distributed data processing platforms such as Hadoop, Hive, and Apache Spark, and with Airflow or similar workflow orchestration tools.

Data Modeling Expertise:

  • 4+ years of experience in data modeling for data lakes and RDBMS platforms.
  • Proficient in designing schemas for large-scale data systems.

Programming & Scripting Skills:

  • Proficiency in programming languages such as Python, Java, and Scala.
  • Experience with scripting languages such as Perl and shell.

Big Data Management:

  • Proven experience working with and processing large datasets at multi-terabyte to petabyte scale.
  • Familiarity with test-driven development and automated testing frameworks.

Agile/Scrum Methodology:

  • Experience working in Scrum/Agile development environments, managing projects with shifting priorities and deadlines.

Communication Skills:

  • Strong verbal and written communication skills to effectively collaborate with cross-functional teams and stakeholders.

Education:

  • A Bachelor’s Degree in Computer Science or equivalent practical experience.

Desired Skills:

Version Control & CI/CD:

  • Familiarity with Gitflow for managing version control and branching strategies.
  • Experience with Atlassian products like Bitbucket, JIRA, and Confluence.
  • Experience with Continuous Integration (CI) tools such as Bamboo, Jenkins, or TFS for automating build and deployment processes.

Cloud & Data Integration:

  • Knowledge of other cloud services and data integration tools beyond GCP is also beneficial.

With regards,

Kishore Reddy

