We're seeking a Senior Backend Engineer to join our core engineering team and play a pivotal role in building and scaling our AI-driven Knowledge Processing Unit (KPU). You'll work on cutting-edge distributed systems that power enterprise AI applications, contributing to a platform that enables reliable AI agents with deterministic outcomes.
Job Responsibilities:
Develop and maintain scalable APIs and backend services using Go (primary) or Python
Build and optimize event-driven architectures handling real-time data processing
Work with Apache Kafka as our communication backbone between microservices
Design and optimize database queries in MySQL or Postgres
Ensure scalability, performance, and security of backend services in containerized environments
Contribute to our Docker-containerized, Kubernetes-orchestrated microservices ecosystem
Implement gRPC, HTTP, and WebSocket communication protocols
Build resilient systems with proper error handling and monitoring
Collaborate with AI teams to integrate LLM capabilities and AI reasoning engines
Work with CI/CD pipelines and Git version control
Implement monitoring and logging using tools like Prometheus, Grafana, and Sentry
Optimize performance for high-concurrency environments
Ensure proper resource management and auto-scaling
Requirements:
5+ years of backend development experience
Strong experience with Go (preferred) or Python (and willingness to learn Go)
Experience with event-driven architectures and message brokers (Kafka, RabbitMQ, EventBridge)
Solid understanding of RESTful APIs and GraphQL
Experience with MySQL or Postgres databases and data consistency patterns
Knowledge of Docker and Kubernetes for container orchestration
Experience with microservices architecture and distributed systems
Proficiency in CI/CD pipelines and Git
Nice to have:
Experience with AWS services (EventBridge, S3, DocumentDB)
Knowledge of Python for AI/ML integrations
Experience with gRPC and WebSocket protocols
Familiarity with monitoring tools (Prometheus, Grafana, Sentry)
Understanding of AI/ML concepts and enterprise software patterns
Experience with Redis for caching and temporary storage
What we offer:
Opportunity to shape the future of accountable enterprise AI agents
Competitive compensation package
Flat organization focused on impact rather than hierarchy
Work with cutting-edge computational AI technology
Dynamic, experienced team of technical experts
Continuous learning in the rapidly evolving field of Agentic AI