Verified Visa Sponsor · Conshohocken, PA, US · Remote · Full-time · $60 - $65 · Posted 1 week ago

Job Description

**About Us:** We provide innovative, transformative IT services and solutions. We are passionate about helping our clients achieve their goals and exceed their expectations, and we strive to provide the best possible experience for our clients and employees. Committed to continuous improvement and innovation, we are always looking for ways to better our services and solutions, and we believe in working **collaboratively** with our clients and employees to achieve success.

DS Technologies Inc is looking for a **KAFKA Admin** for one of our premier clients.

**Job Title : KAFKA Admin**

**Location : Conshohocken, PA (Onsite)**

**Position Type: Contract**

**Only W2**

We urgently need to fill the following role for our client **IBM/ Cencora**. Please submit only pre-vetted candidates who have been video-screened and interviewed.

The rate cap for this role is $60 - $65, and the work location is Conshohocken, PA. This is NOT a remote role.

**KAFKA Admin - Requirements:**

  • 5+ years of hands-on experience with Kafka/Confluent in production.

  • Strong expertise with:

o SSL/TLS end-to-end configuration in Kafka ecosystems

o RBAC authorization configuration and operational administration

o Designing for HA/redundancy and scaling for growth

o Monitoring/alerting with Prometheus & Grafana, plus operational tooling such as New Relic

o Performance testing and tuning (producers/consumers, brokers, Connect, infrastructure)

  • Demonstrated experience implementing:

o Confluent Oracle Premium CDC Connector

o Confluent sink to ADLS Gen2 (ADLS2)

  • Proficiency with Azure DevOps, Git, and building CI/CD pipelines.
  • Working knowledge of Apache Flink and hands-on experience writing Kafka Streams.

**Key Responsibilities:**

  • Security & Access Control

o Configure end-to-end SSL/TLS across Kafka/Confluent components and client integrations.

o Implement and manage RBAC for authorizations, service accounts, and least-privilege access.
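For context, end-to-end TLS on a broker is typically enabled with properties along these lines; a minimal sketch, assuming JKS keystores, with hostnames, ports, paths, and passwords as placeholders rather than values from this posting:

```properties
# Hypothetical broker configuration enabling TLS for inter-broker and
# client traffic (all paths, hosts, and passwords are placeholders).
listeners=SSL://0.0.0.0:9093
advertised.listeners=SSL://broker1.example.com:9093
security.inter.broker.protocol=SSL
ssl.keystore.location=/etc/kafka/secrets/broker1.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
ssl.truststore.location=/etc/kafka/secrets/truststore.jks
ssl.truststore.password=<truststore-password>
# Require clients to present certificates (mutual TLS).
ssl.client.auth=required
```

Client integrations then carry matching `security.protocol=SSL` and truststore settings on the producer/consumer side.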

  • High Availability, Redundancy & Failover

o Configure core components for redundancy and failover resilience (brokers/controllers, Connect, Schema Registry, etc.).

o Design and implement a Kafka disaster recovery (DR) cluster, including replication strategy, failover testing, and runbooks aligned to RPO/RTO.
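The replication side of such a DR design is often built on MirrorMaker 2 (Confluent Cluster Linking is a common alternative). A minimal sketch, assuming an active/passive two-cluster layout with placeholder aliases and hosts:

```properties
# Hypothetical MirrorMaker 2 configuration replicating all topics from
# the primary cluster to a DR cluster (aliases/hosts are placeholders).
clusters=primary, dr
primary.bootstrap.servers=primary-broker1:9093
dr.bootstrap.servers=dr-broker1:9093
primary->dr.enabled=true
primary->dr.topics=.*
replication.factor=3
# Mirror committed consumer-group offsets so consumers can fail over.
sync.group.offsets.enabled=true
```

Failover testing against RPO/RTO targets would then exercise offset translation and client re-pointing, captured in the runbooks the posting mentions.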

  • Scale & Future Growth

o Plan and implement platform scalability for future growth (topic/partition strategy, retention, throughput, capacity planning).

o Establish sustainable operational practices for multi-team usage and governance.
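One recurring capacity-planning step behind the scalability work above is sizing partition counts from throughput targets. A rough sketch of that arithmetic; the throughput figures are illustrative assumptions, not numbers from this posting:

```python
import math

def required_partitions(target_mb_per_s: float,
                        per_partition_producer_mb_per_s: float,
                        per_partition_consumer_mb_per_s: float,
                        headroom: float = 1.5) -> int:
    """Estimate a partition count: take the larger of the producer- and
    consumer-side requirements, then apply growth headroom."""
    producer_side = target_mb_per_s / per_partition_producer_mb_per_s
    consumer_side = target_mb_per_s / per_partition_consumer_mb_per_s
    return math.ceil(max(producer_side, consumer_side) * headroom)

# Illustrative: 100 MB/s target, 10 MB/s per partition on the producer
# side, 20 MB/s on the consumer side, 1.5x headroom.
print(required_partitions(100, 10, 20))  # -> 15
```

Per-partition throughput is measured empirically (e.g. with Kafka's perf tools), since it depends on brokers, storage, and message profile.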

  • Monitoring, Alerting & Operations

o Set up monitoring and alerts for streaming messages and platform health using Prometheus & Grafana.

o Integrate New Relic dashboards/alerts to support operational visibility, incident response, and service health metrics.
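A typical Prometheus alert in this setup covers consumer-group lag. A minimal sketch; the metric name assumes a kafka_exporter-style exporter is scraping the cluster, and the threshold is illustrative:

```yaml
# Hypothetical Prometheus alerting rule for consumer-group lag.
groups:
  - name: kafka-alerts
    rules:
      - alert: KafkaConsumerGroupLagHigh
        expr: sum by (consumergroup, topic) (kafka_consumergroup_lag) > 10000
        for: 10m
        labels:
          severity: warning
        annotations:
          summary: "Consumer group {{ $labels.consumergroup }} is lagging on {{ $labels.topic }}"
```

Equivalent dashboards/alerts in Grafana and New Relic would key off the same lag, throughput, and broker-health metrics.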

  • Performance Engineering

o Perform performance testing and tune Kafka/Confluent components for optimal throughput, latency, and stability.

o Troubleshoot complex production issues across brokers, networking, storage, Connect, and client workloads.
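Baseline throughput/latency numbers for this tuning work usually come from Kafka's bundled perf tools; an illustrative invocation (topic name, record counts, and broker address are placeholders, and a live cluster is required):

```
# Hypothetical producer benchmark with Kafka's bundled perf tool.
kafka-producer-perf-test \
  --topic perf-test \
  --num-records 1000000 \
  --record-size 1024 \
  --throughput -1 \
  --producer-props bootstrap.servers=broker1:9093 acks=all \
  --print-metrics
```

A matching `kafka-consumer-perf-test` run gives the consumer side; re-running after each broker/client tuning change makes the impact measurable.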

  • Connectors & Data Integration

o Implement and support Confluent Oracle Premium CDC Connector (configuration, offsets, schema evolution, error handling, operations).

o Implement and support Confluent Sink Connector to ADLS2 (Azure Data Lake Storage Gen2) with reliable delivery and partitioning strategies.
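An ADLS Gen2 sink is deployed as a Connect connector config. A minimal sketch; the connector class, property names, and values below follow the pattern documented for Confluent's ADLS Gen2 sink and are assumptions to verify against the connector's documentation:

```json
{
  "name": "adls2-sink",
  "config": {
    "connector.class": "io.confluent.connect.azure.datalake.gen2.AzureDataLakeGen2SinkConnector",
    "topics": "orders",
    "tasks.max": "2",
    "flush.size": "1000",
    "azure.datalake.gen2.account.name": "<storage-account>",
    "azure.datalake.gen2.access.key": "<access-key>",
    "topics.dir": "kafka-topics",
    "format.class": "io.confluent.connect.azure.datalake.gen2.format.avro.AvroFormat",
    "partitioner.class": "io.confluent.connect.storage.partitioner.TimeBasedPartitioner",
    "partition.duration.ms": "3600000",
    "path.format": "'year'=YYYY/'month'=MM/'day'=dd/'hour'=HH",
    "locale": "en-US",
    "timezone": "UTC"
  }
}
```

The time-based partitioner shown is one common choice for the "partitioning strategies" the posting mentions; reliable delivery hinges on flush sizing, retries, and dead-letter handling.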

  • Streaming Development

o Build and support stream processing using Apache Flink (job configuration, deployment patterns, operationalization).

o Develop Kafka Streams applications (topology design, state stores, exactly-once/processing guarantees as needed).

  • DevOps & Automation

o Use Azure DevOps with Git integration for version control, reviews, and change management.

o Deploy and manage cloud resources using Terraform and Ansible.

o Build and maintain CI/CD pipelines for platform configuration, connectors, and streaming jobs across environments.
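Managing platform configuration as code often means expressing topics and quotas in Terraform. A minimal sketch using the Confluent Terraform provider; the IDs, endpoint, and credentials are placeholders to check against the provider's documentation:

```hcl
# Hypothetical Terraform resource managing a Kafka topic as code.
resource "confluent_kafka_topic" "orders" {
  kafka_cluster {
    id = "<cluster-id>"
  }
  topic_name       = "orders"
  partitions_count = 12
  rest_endpoint    = "<cluster-rest-endpoint>"
  config = {
    "retention.ms"   = "604800000"
    "cleanup.policy" = "delete"
  }
  credentials {
    key    = "<api-key>"
    secret = "<api-secret>"
  }
}
```

A CI/CD pipeline in Azure DevOps would then plan/apply this per environment, with Git reviews as the change-management gate.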

  • Cost Allocation

o Support chargeback/showback calculations for Kafka usage (e.g., throughput, storage, partitions, connector/resource utilization) and related reporting.
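The chargeback/showback calculation above can be sketched as a simple weighted allocation over metered usage; the metrics, rates, and usage figures here are illustrative assumptions, not values from this posting:

```python
def showback(usage: dict[str, dict[str, float]],
             rates: dict[str, float]) -> dict[str, float]:
    """Compute a per-team cost from metered Kafka usage.
    usage maps team -> {metric: quantity}; rates maps metric -> unit cost."""
    return {
        team: round(sum(qty * rates[metric] for metric, qty in metrics.items()), 2)
        for team, metrics in usage.items()
    }

# Illustrative monthly usage: GB produced, GB-days stored, partition count.
usage = {
    "payments":  {"ingress_gb": 500, "storage_gb_days": 1200, "partitions": 48},
    "inventory": {"ingress_gb": 120, "storage_gb_days": 300,  "partitions": 12},
}
rates = {"ingress_gb": 0.05, "storage_gb_days": 0.01, "partitions": 0.25}
print(showback(usage, rates))  # -> {'payments': 49.0, 'inventory': 12.0}
```

In practice the usage figures would come from the same Prometheus/New Relic metrics used for monitoring, which is why the posting pairs metering with reporting.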

**Preferred Qualifications:**

  • Experience implementing and testing Kafka DR cluster architectures and operational runbooks.
  • Familiarity with enterprise governance patterns (multi-tenancy, naming standards, quotas, schema governance).
  • Experience defining usage metering to enable reliable chargeback/showback.

**Additional Expectations:**

  • Proficiency with the Linux CLI (preferably RHEL/SLES).
  • Ability to participate in an on-call rotation and provide timely incident support (if required).
  • Strong documentation and stakeholder communication skills across engineering, operations, and product teams.
  • Ability to create design patterns/templates, provide development support and conduct knowledge transfer sessions for onboarding new use cases and modernizing existing ones.