We are searching for professionals matching the business requirements below for one of our clients. Please read through the requirements and connect with us if the role suits your profile.
Skills Required:
- 8+ years of experience in data engineering or data architecture, with at least 2 years in a GCP environment.
- Data Integration: Develop and implement ETL/ELT processes to ingest data from various sources into BigQuery, ensuring data quality and integrity.
- GCP Expertise: Strong knowledge of Google Cloud services, especially BigQuery, Dataflow, Cloud Storage, and Pub/Sub.
- SQL Proficiency: Proficient in SQL, with experience optimizing complex queries in BigQuery.
- Data Modeling: Experience in data modeling techniques and best practices for data warehousing.
- IRIS: 2+ years of hands-on experience solutioning IRIS integrations.
- Programming Skills: Proficiency in programming languages such as Python, Java, or Scala for data manipulation and automation.
Soft Skills:
(1.) Oversee quality assurance processes, ensuring adherence to coding standards and implementation of best practices, and perform value creation and knowledge management (KM) activities.
(2.) Ensure process improvement and compliance, participate in technical design discussions, and review technical documents.
(3.) Shape the overall project strategy, working closely with stakeholders to define project scope, objectives, and deliverables, and track the schedule to ensure on-time delivery per the defined quality standards.
(4.) Work closely with the development team and on-site engineers to understand technical requirements and to address and resolve technical issues.
(5.) Identify and flag potential risks and issues that may impact project timelines or quality, develop mitigation strategies and contingency plans to address them, and provide regular project updates to key stakeholders.