Role: GCP Data Engineer with Data Pipeline Experience
Location: Remote
Duration: 6–12 Months
Visa: Any visa
Mode of Interview: Video
Client: Warner Bros
Required Background / Skillsets:
• Prior working experience in Databricks
• Snowflake
• Python
• Data science background
Daily Responsibilities:
- Design, develop, test, deploy, maintain, and improve software applications and services.
- Implement data pipelines and ETL processes using Databricks and Snowflake.
- Collaborate with other engineers to understand requirements and translate them into technical solutions.
- Optimize and fine-tune performance of data pipelines and database queries.
- Ensure code quality through code reviews, unit testing, and continuous integration.
- Contribute to architecture and technical design discussions.
Technology requirements:
- SQL
- GCP
- AWS
- Proficiency in Python for software development and scripting.
- Hands-on experience with Databricks and Snowflake for data engineering and analytics.
- Strong understanding of database design, SQL, and data modeling principles.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Familiarity with machine learning concepts and frameworks is a plus.
- Excellent problem-solving skills and ability to work independently and as part of a team.
- Strong communication skills and ability to collaborate effectively across teams.
Years of Experience:
• 10+ years of experience as a Software Engineer or in a related role.
Thanks & Regards Hardik Khanna||VOTO CONSULTING LLC [email protected]|Phn : 201-331-7226,Ext: 132
Linkedin|linkedin.com/in/hardik-khanna-a551ba22b/ http://www.votoconsulting.com |1549 Finnegan Lane,
2nd Floor, North Brunswick, NJ,08902