In this role you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers) where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.
- Design, develop, and maintain scalable and efficient backend services using Python.
- Implement and manage Kafka-based messaging systems for real-time data ingestion, processing, and distribution.
- Build and maintain data pipelines and stream processing applications using Kafka Streams, Kafka Connect, or similar technologies.
- Collaborate with DevOps and Data Engineering teams to ensure seamless integration and deployment of services.
- Optimize application performance and scalability through profiling, tuning, and refactoring.
- Ensure high code quality through unit testing, code reviews, and adherence to software development best practices.
- Monitor and troubleshoot production issues related to Kafka and backend services.
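To illustrate the stream-processing work described above, here is a minimal, broker-free sketch of a tumbling-window aggregation — the kind of computation a Kafka Streams application or a Python consumer loop typically performs. The function names and the 60-second window size are illustrative assumptions, not part of the role description.

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # assumed window size, for illustration only

def window_start(timestamp: float, window: int = WINDOW_SECONDS) -> int:
    """Align an event timestamp to the start of its tumbling window."""
    return int(timestamp) - int(timestamp) % window

def count_by_window(events):
    """Aggregate (timestamp, key) events into per-window, per-key counts.

    In production this logic would consume from a Kafka topic; here it
    takes an in-memory iterable so it runs without a broker.
    """
    counts = defaultdict(int)
    for timestamp, key in events:
        counts[(window_start(timestamp), key)] += 1
    return dict(counts)

if __name__ == "__main__":
    events = [(0.5, "clicks"), (10.0, "clicks"), (61.0, "clicks"), (5.0, "views")]
    print(count_by_window(events))
    # {(0, 'clicks'): 2, (60, 'clicks'): 1, (0, 'views'): 1}
```

In a real deployment, the event iterable would be replaced by a Kafka consumer loop and the aggregated counts would be published to an output topic or a state store.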
- Minimum 8 years of experience in Python development.
- Strong experience in building and maintaining Kafka-based systems.
- Proficiency in developing stream processing applications using Kafka Streams or Kafka Connect.
- Solid understanding of backend architecture, RESTful APIs, and microservices.
- Experience with CI/CD tools and deployment pipelines.
- Strong debugging and performance optimization skills.
- Familiarity with unit testing frameworks and code quality tools.
- Experience with cloud platforms (e.g., AWS, Azure, GCP).
- Exposure to containerization and orchestration tools like Docker and Kubernetes.
- Knowledge of data engineering concepts and tools.
- Experience with monitoring tools such as Prometheus, Grafana, or the ELK stack.
- Familiarity with NoSQL databases, caching systems, and distributed computing.
- Contributions to open-source Python projects or active participation in developer communities.