Hiring Kafka Architect at Richmond, VA

Hi,

Good morning!

This is Himanshu, a recruiter at KTek Resourcing.

I am currently looking for someone with your expertise; your experience makes you a strong fit for this role. Please find the job description below for your reference.


Job Role: Kafka Architect
Job Location: Richmond, VA

Responsibilities:

  • Kafka Cluster Management: Design, deploy, and manage Apache Kafka clusters, ensuring high availability, scalability, and fault tolerance.
  • Data Streaming Architecture: Develop and maintain real-time data streaming solutions using Kafka, Kafka Streams, and related technologies.
  • Performance Optimization: Monitor and optimize Kafka clusters for performance, including tuning brokers, topics, partitions, and configurations.
  • Security and Compliance: Implement and manage Kafka security measures, including encryption, authentication, and authorization, to ensure data integrity and compliance with industry standards.
  • Integration: Work closely with application developers, data engineers, and DevOps teams to integrate Kafka with other systems and services.
  • Monitoring and Alerts: Use tools such as Prometheus, Grafana, and Kafka Manager to set up monitoring, logging, and alerting for Kafka clusters.
  • Troubleshooting and Support: Diagnose and promptly resolve issues related to Kafka performance, connectivity, and data processing.
  • Documentation: Create and maintain detailed documentation for Kafka configurations, processes, and best practices.
  • Innovation and Improvement: Stay up-to-date with the latest developments in Kafka and related technologies, proposing improvements and new solutions as appropriate.

Qualifications:

  • Proven experience with distributed systems, data streaming, and event-driven architectures.
  • Experience with Kafka Streams, KSQL, and other Kafka ecosystem tools.
  • Hands-on experience with cloud platforms (AWS, Azure, Google Cloud) is a plus.
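As context for the partitioning strategies mentioned above, the core idea can be sketched in Python. This is a simplified illustration only: Kafka's default partitioner actually applies a murmur2 hash to the key bytes, and the key name here is made up for the example. What matters is the property the sketch demonstrates, namely that records sharing a key always land on the same partition, which is what preserves per-key ordering.

```python
import hashlib

def assign_partition(key: str, num_partitions: int) -> int:
    """Map a record key to a partition deterministically.

    Simplified stand-in for Kafka's default partitioner, which
    uses murmur2 rather than md5; the principle is the same.
    """
    digest = hashlib.md5(key.encode("utf-8")).digest()
    # Interpret the first 4 bytes as an unsigned int, then reduce
    # into the partition range.
    value = int.from_bytes(digest[:4], "big")
    return value % num_partitions

# Records with the same key always map to the same partition,
# so ordering is preserved per key.
p1 = assign_partition("order-1234", 6)
p2 = assign_partition("order-1234", 6)
assert p1 == p2
```

This is also why repartitioning a live topic is disruptive: changing `num_partitions` changes the key-to-partition mapping, so existing keys may start routing to different partitions.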


Technical Skills:

  • Strong proficiency in Apache Kafka, including broker setup, topic management, and partitioning strategies.
  • Knowledge of data serialization formats such as Avro, Protobuf, and JSON.
  • Experience with Linux/Unix systems and scripting (Bash, Python, etc.).
  • Familiarity with DevOps practices and tools like Docker, Kubernetes, CI/CD pipelines, and Terraform.
  • Experience with monitoring tools (Prometheus, Grafana) and logging tools (Elasticsearch, Logstash, Kibana).
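Of the serialization formats listed above, JSON is the one with standard-library support in Python, so a minimal sketch of encoding an event record before producing it to Kafka might look like the following. The field names are illustrative, not from the posting; Avro and Protobuf would replace the `json` calls with schema-based serializers.

```python
import json

def encode_event(event: dict) -> bytes:
    """Serialize an event dict to UTF-8 JSON bytes for a Kafka record value."""
    return json.dumps(event, separators=(",", ":"), sort_keys=True).encode("utf-8")

def decode_event(payload: bytes) -> dict:
    """Deserialize a Kafka record value back into a dict."""
    return json.loads(payload.decode("utf-8"))

event = {"event_type": "order_created", "order_id": "1234", "amount": 42.5}
payload = encode_event(event)
assert decode_event(payload) == event
```

Sorting keys and stripping whitespace keeps the byte representation stable, which helps when payloads are compared, hashed, or compacted downstream.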
