Urgent Requirement – GCP Data Engineer – AR

Hi All

Please send your updated profile to [email protected]

Role: GCP Data Engineer

Location: Bentonville, AR – Hybrid

Client: Walmart

UST Global® is looking for a strong developer to build client pipelines on Kubeflow, connecting components across BigQuery, Google Cloud Storage, and Dataproc, and to deploy API endpoints. The ideal candidate is an expert at leading projects in developing and testing data pipelines, driving data analytics efforts, and building proactive issue identification, resolution, and alerting mechanisms using traditional, new, and emerging technologies.
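
For a concrete (purely illustrative) picture of such a pipeline, here is a minimal sketch using the Kubeflow Pipelines SDK (kfp v2); the component breakdown, table name, and bucket URIs are assumptions, not the client's actual design:

```python
# Minimal Kubeflow Pipelines (kfp v2) sketch; all project, table, and bucket
# names are illustrative placeholders.
from kfp import compiler, dsl


@dsl.component(packages_to_install=["google-cloud-bigquery"])
def export_table(project: str, table: str, gcs_uri: str):
    """Export a BigQuery table to Cloud Storage."""
    from google.cloud import bigquery

    client = bigquery.Client(project=project)
    client.extract_table(table, gcs_uri).result()  # wait for the export job


@dsl.component
def transform(gcs_uri: str) -> str:
    """Placeholder for a transformation step, e.g. a Dataproc/Spark job."""
    return gcs_uri


@dsl.pipeline(name="bq-to-gcs-sketch")
def pipeline(project: str, table: str, gcs_uri: str):
    exported = export_table(project=project, table=table, gcs_uri=gcs_uri)
    transform(gcs_uri=gcs_uri).after(exported)  # run only after the export lands


if __name__ == "__main__":
    # Produces a spec that can be uploaded to a Kubeflow Pipelines instance.
    compiler.Compiler().compile(pipeline, "pipeline.yaml")
```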

Excellent written and verbal communication skills and the ability to liaise with everyone from technologists to executives are key to success in this role. This is your opportunity to:

• Assemble large, complex data sets that meet functional and non-functional requirements

• Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes

• Build the infrastructure required for optimal extraction, transformation, and loading of data from a variety of data sources

• Build analytical tools that utilize the data pipeline to provide actionable insight into key business performance metrics, including operational efficiency and customer acquisition

• Work with stakeholders, including the Executive, Product, Data, and Design teams, to support their data infrastructure needs and assist them with data-related technical issues

• Work on a multi-vendor, multi-geo team supporting a highly complex enterprise environment

Mandatory skill sets

• Hands-on experience in Big Data and GCP cloud development

• Hands-on experience writing Spark jobs in PySpark, as sketched below
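
As a rough illustration of the kind of PySpark job meant here, a minimal sketch (the bucket paths, column names, and the aggregation are assumptions):

```python
# Hypothetical PySpark job; paths, columns, and the aggregation are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-revenue").getOrCreate()

# Read raw order events from Cloud Storage (illustrative path).
orders = spark.read.parquet("gs://example-bucket/raw/orders/")

# Aggregate daily revenue per store.
daily = (
    orders.withColumn("order_date", F.to_date("order_ts"))
    .groupBy("store_id", "order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Write back to Cloud Storage, partitioned by date.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://example-bucket/curated/daily_revenue/"
)

spark.stop()
```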

Technology expectations

• 3+ years of experience in BigQuery

• 3+ years of experience in Kubeflow and Dataproc

• 3+ years of hands-on experience in API deployment (see the sketch after this list)

• Proficient in Python

• Proficient in designing MySQL and Cassandra databases

• Experience importing and exporting data between HDFS and relational database systems using Sqoop/TDCH

• Experience in manipulating/analyzing large datasets and finding patterns and insights within structured and unstructured data.

• Good DevOps background and 1+ years of experience in Kubernetes
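
Purely as an illustration of the API-deployment expectation, a minimal endpoint sketch; FastAPI is an assumed framework choice, and the route and payload are placeholders:

```python
# Hypothetical endpoint; FastAPI, the route, and the payload are assumptions.
from fastapi import FastAPI

app = FastAPI()


@app.get("/metrics/daily-revenue/{store_id}")
def daily_revenue(store_id: str, date: str):
    """Placeholder response; a real endpoint would query BigQuery or a serving store."""
    return {"store_id": store_id, "date": date, "revenue": 0.0}

# Serve locally with: uvicorn main:app --port 8080
```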

Other required skills

• Strong verbal and written communication skills to effectively share findings with stakeholders

• Good understanding of web-based application development

• Should be an independent contributor from day one, able to operate with minimal to no supervision

• A bachelor’s degree in computer science or a relevant field is mandatory

• Should be hands-on and able to work in an agile, fast-moving environment

Callouts:

• Should work in the CST time zone

• Should be available on video call (VC) without notice during working hours

• Should only work from the agreed location

• Be available on video for stand-ups and other meetings

• Be available to work from the local office on an as-needed basis

Regards

[email protected]

linkedin.com/in/shravan-kumar-91b72087
