
Confluent Kafka Engineer

Data Analytics · Baltimore · Contract · $80,000 - $120,000

Description

Position Description:

· Design Confluent Kafka cluster environments, configure and manage Kafka instances, and monitor system performance.

· Ensure data integrity and availability in a big data environment.

· Apply expertise in a programming language such as Java or Python.

· Collaborate with product design teams and SMEs to understand data pipeline needs.

· Participate in all Agile ceremonies.

Requirements

Basic Qualifications

· 10+ years of experience in a technical field.

· Software development experience with a solid understanding of building, deploying, and maintaining applications that leverage the Confluent Kafka platform, focusing on data streaming and messaging solutions.

· Bachelor's degree in Computer Science, Information Technology, or a related field.

· A Master's or Doctoral degree may substitute for the required experience.

· 5+ years of experience on an Agile development team.

· Must be able to obtain and maintain a Public Trust clearance (contract requirement).

· Must be able to work on-site in Woodlawn, MD, five days a week.

Required Skills

· Extensive experience with Apache Kafka and Confluent Kafka, including proficiency with Confluent Control Center, ksqlDB, Kafka Streams, and Kafka Connect.

· Proven experience in Kafka development, including the producer and consumer APIs, stream processing, and connector development.

· Experience with Kafka cluster management, including setup, configuration, monitoring, and troubleshooting.

· Familiarity with distributed systems, microservices architecture, and event-driven design patterns.

· Experience with AWS and containerization (Kubernetes) is a plus.

· Proficiency in programming languages, such as Java.

· Strong knowledge of Kafka internals, including brokers, ZooKeeper, topics, partitions, and offsets.

· Experience with monitoring tools (e.g., Prometheus, Grafana) and logging frameworks (e.g., Log4j).

· Proficiency in using Confluent Control Center for monitoring, managing, and optimizing Kafka clusters.

· Expertise in Kafka Streams for building scalable, fault-tolerant stream processing applications.

· Experience with ksqlDB for real-time processing and analytics on Kafka topics.

· Strong understanding of Kafka Connect for integrating Kafka with external data sources and sinks.

· Understanding of networking, security, and compliance aspects related to Kafka.

· Familiarity with CI/CD pipelines and automation tools (e.g., Jenkins, Git).

· Write and maintain high-quality code for Kafka producers, consumers, and stream processing applications.

· Develop and manage Kafka connectors for seamless integration with external systems, ensuring data consistency and reliability.

· Utilize Kafka Streams for real-time processing of streaming data, transforming and enriching data as it flows through the pipeline.

· Employ ksqlDB for stream processing tasks, including real-time analytics and transformations.

· Collaborate with data engineers, software developers, and DevOps teams to integrate Kafka solutions with existing systems.

· Ensure all Kafka-based solutions are scalable, secure, and optimized for performance.

· Troubleshoot and resolve issues related to Kafka performance, latency, and data integrity, including issues specific to Kafka Streams, ksqlDB, and Kafka Connect.
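To give candidates a concrete sense of the ksqlDB work described above, here is a minimal, hypothetical sketch of real-time stream processing over a Kafka topic. All stream, topic, and column names are invented for illustration only.

```sql
-- Hypothetical: declare a stream over an existing Kafka topic
CREATE STREAM orders (
  order_id VARCHAR KEY,
  amount DOUBLE,
  region VARCHAR
) WITH (KAFKA_TOPIC = 'orders', VALUE_FORMAT = 'JSON');

-- Real-time aggregation: total order value per region in 1-hour tumbling windows
CREATE TABLE region_totals AS
  SELECT region, SUM(amount) AS total
  FROM orders
  WINDOW TUMBLING (SIZE 1 HOUR)
  GROUP BY region
  EMIT CHANGES;
```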
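Likewise, the Kafka Connect work described above typically centers on JSON connector configurations. The sketch below uses Confluent's JDBC source connector class; the connection URL, names, and column are placeholders.

```json
{
  "name": "jdbc-source-example",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://db-host:5432/appdb",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "appdb-"
  }
}
```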

Desired Skills

· Experience in an AWS environment.

· Experience with Hadoop or another big data platform.

· Excellent troubleshooting and analytical skills to quickly identify and resolve issues.

· Proficiency in software development, preferably in Java.

· Experience working on Agile projects and understanding Agile terminology.

· Participate in daily scrums and provide updates.