Lead Confluent Kafka Engineer || REMOTE || GC AND USC ||

Hello,

Here are the requirements for the position below:
 
Job Title : Lead Confluent Kafka Engineer
 
Experience : 10+ years
 
Location :  REMOTE 
 
Work Authorization : GC AND USC ONLY 
 
Tax Term : C2H
 
Tier : Working with PV
 
End Client : HIGHMARK
 
 

Job Summary:

We are seeking a highly skilled Confluent Kafka Engineer to design, implement, and manage real-time data streaming platforms using Apache Kafka and the Confluent Platform. You will be responsible for building and maintaining scalable, reliable streaming pipelines that enable real-time data processing across the organization.

 

Key Responsibilities:

Design, deploy, and manage Apache Kafka clusters and Confluent Platform components.

Implement high-throughput, low-latency, and fault-tolerant streaming data pipelines.

Develop Kafka producers and consumers for real-time applications using Java, Python, or Scala (a minimal producer sketch follows this list).

Integrate Kafka with various data sources and sinks (e.g., databases, cloud storage, Elasticsearch).

Monitor and optimize Kafka clusters using tools like Confluent Control Center, Prometheus, Grafana, and Datadog.

Implement security (RBAC, TLS, and ACLs), disaster recovery, and data governance policies within the Confluent ecosystem.

Troubleshoot production issues and ensure high availability and data consistency.

Automate Kafka operations using scripting and tools like Ansible, Terraform, or Kubernetes.

Collaborate with data engineers, software developers, and DevOps teams on data architecture and CI/CD pipelines.
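
For illustration of the producer/consumer work described above, here is a minimal sketch of a Java producer using the standard org.apache.kafka.clients API. This is an illustrative example only, not the client's implementation; the broker address (localhost:9092), topic name (orders), key, and JSON payload are all placeholder assumptions.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; in practice this points at the cluster.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // acks=all waits for all in-sync replicas, trading latency for durability.
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical topic, key, and payload for illustration only.
            ProducerRecord<String, String> record =
                new ProducerRecord<>("orders", "order-123", "{\"status\":\"created\"}");
            // send() is asynchronous; the callback reports the write's outcome.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Wrote to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        } // try-with-resources closes the producer, flushing any pending sends
    }
}
```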

 

Required Skills & Experience:

4+ years of experience in data engineering or platform engineering roles.

Strong hands-on experience with Apache Kafka and Confluent Platform.

Proficient in one or more programming languages such as Java, Python, or Scala.

Experience with Kafka Connect, Kafka Streams, and Schema Registry (a brief Streams sketch follows this list).

Deep understanding of distributed systems, message queues, and stream processing concepts.

Familiarity with containerization and orchestration tools (Docker, Kubernetes).

Experience working with cloud platforms (AWS, GCP, or Azure), and with Confluent Cloud in particular.

Knowledge of observability and monitoring tools for Kafka.

Excellent problem-solving and communication skills.
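
As a similarly hedged sketch of the Kafka Streams experience listed above: a minimal topology that reads one topic, transforms each value, and writes the result to another. The application id and topic names (input-events, output-events) are assumptions for the example, not details from this posting.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class UppercaseStream {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-demo"); // hypothetical app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read each record from the input topic, uppercase its value,
        // and write the transformed record to the output topic.
        KStream<String, String> source = builder.stream("input-events");
        source.mapValues(value -> value.toUpperCase())
              .to("output-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        // Close cleanly on shutdown so state stores and offsets are flushed.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```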

 

Preferred Qualifications:

Confluent Certified Developer or Administrator.

Experience with event-driven microservices architectures.

Exposure to infrastructure-as-code tools like Terraform or Helm.

Background in high-volume transactional systems or real-time analytics.

Knowledge of security and compliance in a Kafka environment.

 
 

Regards,

Ritika Negi

Technical Recruiter

Ritika@vsgbusinesssolutions.com

VSG Business Solutions Inc

3240 East State St Ext, Suite 203

Hamilton, New Jersey 08619
