Hiring for GCP Data Engineer with DBA – 10+ Yrs – Onsite – Rate 53/Hr

Hello,

My name is Rajat, and I am a Technical Recruiter at K-Tek Resourcing. We are searching for professionals for the business requirement below for one of our clients.

Please send me your updated resume at [email protected]. My number is 832-743-6754.



GCP Data Engineer with DBA

 Location: Parsippany, NJ


Roles and Responsibilities:

Data Engineering:

  • Design, develop, and maintain a Data Lake architecture using GCP services, particularly BigQuery and AlloyDB.
  • Build robust data pipelines for ETL/ELT processes using tools like Dataflow, Cloud Composer, or other GCP services.
  • Integrate data from various sources (structured and unstructured) into the Data Lake, ensuring consistency and reliability.
  • Optimize BigQuery data models and queries for high performance and scalability.

Database Administration:

  • Administer and maintain AlloyDB instances, ensuring high availability, security, and performance.
  • Perform backup, recovery, and disaster recovery planning for AlloyDB and BigQuery.
  • Monitor and optimize database performance, query execution, and resource utilization.
  • Manage schema design, indexing, partitioning, and clustering to enhance database efficiency.
  • Apply database governance and compliance best practices to ensure data security and regulatory adherence.

Data Governance and Management:

  • Implement data governance policies, including role-based access controls, data masking, and encryption.
  • Manage metadata, data cataloging, and data lineage tracking to support audit and compliance requirements.
  • Conduct regular health checks on the Data Lake to ensure data quality and integrity.

Collaboration and Reporting:

  • Collaborate with data scientists, analysts, and application developers to define data requirements and deliver solutions.
  • Design and deploy real-time and batch data analytics solutions using GCP’s BigQuery and Looker.
  • Provide documentation and training to teams on using Data Lake features effectively.

Automation and Monitoring:

  • Automate database and data pipeline tasks.
  • Set up monitoring and alerting for BigQuery and AlloyDB using Cloud Monitoring and Cloud Logging.
  • Resolve incidents and troubleshoot performance issues in real time.

 
