Remote Opportunity ::: Lead Kafka Engineer || C2H

 
 
 

Hi,

 

Good morning! I hope you are doing well today.

 

This is Daya Shankar from VSG Business Solutions. Today I am sharing one of my better opportunities. Please go through the job description below and let me know your interest.

 

 

Role :: Lead Kafka Engineer

Location :: 100% Remote
Visa :: USC & GC

 

 

 

Multiple Confluent Kafka Engineer Positions || Remote || USC, GC only

End-client: Highmark || Working with a Prime Vendor

6-month contract to hire

 

Position names:

Associate Confluent Kafka Engineer

Lead Confluent Kafka Engineer

 

Job Summary:

We are seeking a highly skilled Confluent Kafka Engineer to design, implement, and manage real-time data streaming platforms using Apache Kafka and Confluent Platform. You will be responsible for building and maintaining scalable, reliable streaming pipelines to enable real-time data processing across the organization.

 

Key Responsibilities:

Design, deploy, and manage Apache Kafka clusters and Confluent Platform components.

Implement high-throughput, low-latency, and fault-tolerant streaming data pipelines.

Develop Kafka producers and consumers for real-time applications using Java, Python, or Scala.

Integrate Kafka with various data sources and sinks (e.g., databases, cloud storage, Elasticsearch).

Monitor and optimize Kafka clusters using tools like Confluent Control Center, Prometheus, Grafana, and Datadog.

Implement security (RBAC, TLS, and ACLs), disaster recovery, and data governance policies within the Confluent ecosystem.

Troubleshoot production issues and ensure high availability and data consistency.

Automate Kafka operations using scripting and tools like Ansible, Terraform, or Kubernetes.

Collaborate with data engineers, software developers, and DevOps teams on data architecture and CI/CD pipelines.

 

Required Skills & Experience:

4+ years of experience in data engineering or platform engineering roles.

Strong hands-on experience with Apache Kafka and Confluent Platform.

Proficient in one or more programming languages such as Java, Python, or Scala.

Experience with Kafka Connect, Kafka Streams, and Schema Registry.

Deep understanding of distributed systems, message queues, and stream processing concepts.

Familiarity with containerization and orchestration tools (Docker, Kubernetes).

Experience working with cloud platforms (AWS, GCP, or Azure), especially Confluent Cloud.

Knowledge of observability and monitoring tools for Kafka.

Excellent problem-solving and communication skills.

 

Preferred Qualifications:

Confluent Certified Developer or Administrator.

Experience with event-driven microservices architectures.

Exposure to infrastructure-as-code tools like Terraform or Helm.

Background in high-volume transactional systems or real-time analytics.

Knowledge of security and compliance in a Kafka environment.

 

 

 

Thanks and Regards,

Daya Shankar Jha

Sr. Technical Recruiter

9737862844

daya@vsgbusinesssolutions.com

https://www.linkedin.com/in/daya-shankar-jha-00bb90216

VSG Business Solutions Inc.

3240 East State Street Ext, Suite 203, Hamilton, NJ 08619

 

 

