Your mission
Industrial resources such as energy and water are becoming increasingly scarce on our planet. Sensorfact’s mission is to eliminate all industrial energy waste and help our customers operate more efficiently. As an integration engineer, you will be instrumental in this adventure by designing and developing the services needed to integrate Sensorfact’s solutions with the systems already in place in our customers' infrastructure. You will be an enabler of change in a fast-growing, hardware-enabled SaaS company.
How will you do that?
You will be working on our product: a combination of an IoT platform, a SaaS product, a Data Science engine, and tooling for our Consultants. You’ll interact with developers, data scientists, and product managers to ensure your solutions solve problems that matter. This mix of expertise across our engineering teams makes it essential to share your knowledge with your colleagues, discuss technical solutions, and help each other improve. This is what we mean when we say “Teamwork” is one of our values.
Our company and the products we develop are growing rapidly. We can’t afford to slow down because of faulty, buggy code, so while building excellent features you’ll document the systems you work on, cover them with pragmatic tests, and avoid technical debt. Code and process reviews will also be an essential part of your work, because “Constant Improvement” is one of our values.
We believe that everything we do starts with the “why”. You cannot own what you code without understanding and believing in it. Our sprints are not collections of tickets; they are steps toward a bigger goal, and every coworker is responsible for the final result. This is why “Ownership” is our fundamental value.
Technologies you will be working with
Our core platform is based on a microservices architecture with Node.js as the runtime environment. These microservices are deployed on Kubernetes (AWS is our cloud infrastructure provider). Our platform offers customers real-time data collection and analysis through a streaming architecture, and we use Kotlin for the services that interact directly with Kafka.
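To give a flavour of that streaming side, here is a minimal sketch of a Kafka consumer written in Kotlin. It only illustrates the kind of service described above; the broker address, consumer group, and topic name are hypothetical placeholders, not our actual configuration.

```kotlin
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.StringDeserializer
import java.time.Duration
import java.util.Properties

fun main() {
    val props = Properties().apply {
        put("bootstrap.servers", "localhost:9092")           // placeholder broker address
        put("group.id", "example-measurements-consumer")      // hypothetical consumer group
        put("key.deserializer", StringDeserializer::class.java.name)
        put("value.deserializer", StringDeserializer::class.java.name)
    }

    KafkaConsumer<String, String>(props).use { consumer ->
        consumer.subscribe(listOf("example-measurements"))    // hypothetical topic name
        while (true) {
            // Poll the broker and handle whatever records arrived in this window.
            val records = consumer.poll(Duration.ofMillis(500))
            for (record in records) {
                // In a real service this is where measurements would be parsed and processed.
                println("key=${record.key()} value=${record.value()}")
            }
        }
    }
}
```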
A significant portion of our business logic runs in AWS Lambda functions. Data is accessible through GraphQL APIs managed by Hasura. Time series data is stored in ClickHouse, and for other data we use a combination of Postgres (metadata) and S3 (raw data).
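As an illustration of how that data layer is typically consumed, here is a minimal sketch of querying a Hasura GraphQL endpoint from Kotlin using only the JDK’s built-in HttpClient. The endpoint URL, table, and fields are hypothetical placeholders rather than our actual schema.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // GraphQL-over-HTTP request body; the "machines" table and its fields are hypothetical.
    val body = """{ "query": "query { machines(limit: 5) { id name } }" }"""

    val request = HttpRequest.newBuilder()
        .uri(URI.create("https://example.hasura.app/v1/graphql"))   // placeholder endpoint
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    println(response.body())   // raw JSON response from the GraphQL API
}
```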
You do not need to be an expert in every technology mentioned here, but transparency is another of our values, so we want you to know what you are applying for.