In the Connectors team, we have six engineers. Our work enables Neo4j to connect with the broader data ecosystem: we move data from other systems into a graph for analysis and return the results to various business systems. We work across a wide variety of technology stacks, from building products on top of Apache Kafka, Apache Spark, and Apache Beam to integrating Neo4j into Google Dataflow Templates and Microsoft Fabric. We also build backend components for our Data Importer product and deploy them into our Kubernetes-based infrastructure, and at times we build frontends for our customers to interact with what we deliver. We are constantly creating new connectors and improving existing ones, making our database accessible from a wider range of environments, including widely used big data and data warehouse products in the cloud and on-premises, while also helping improve the core database's data ingestion capabilities with new features.
Job Responsibilities:
Be part of designing and architecting connectors for the Neo4j database
Build and maintain new and existing connectors for various technologies
Design new streaming/data ingestion APIs to make Neo4j faster and easier to use for our customers
When necessary, work across teams to enhance our core database features
Occasionally build frontends for connectors or integrations that require user interaction
Requirements:
3+ years of experience developing production-level software
Experience designing multi-threaded systems and algorithms
Hands-on experience with backend programming languages, such as Kotlin, Java, or Go
Professional experience developing software systems, in the cloud or on-premises, involving any of the following:
Distributed event streaming platforms, such as Apache Kafka or similar products
Data processing and analytics platforms, such as Apache Spark, Apache Beam, or their derivatives
Cloud data warehouses, such as AWS Redshift, GCP BigQuery, or Azure Synapse
Nice to have:
Experience with graph databases such as Neo4j, or with other SQL/NoSQL databases
Experience in frontend development, ideally with React and TypeScript