
About Kpler
Transforming trade with data-driven insights
Key Highlights
- Raised $220 million in funding for expansion
- Acquired five companies between 2022 and 2023
- Leader in commodity market data with ten times the revenue of competitors
- Headquartered in Etterbeek, Brussels with 201-500 employees
Kpler, headquartered in Etterbeek, Brussels, is a leader in providing data and analytics for the commodity trading industry. Founded in 2014, Kpler has raised $220 million in funding and has acquired five companies between 2022 and 2023 to enhance its offerings. The company serves a diverse range of...
🎁 Benefits
Kpler offers a flexible office policy allowing employees to choose between co-working spaces, office locations, or full remote work. Employees receive...
🌟 Culture
Kpler fosters a culture of innovation by leveraging advanced data technology to transform the traditionally archaic commodity trading industry. The co...
Skills & Technologies
Overview
Kpler is seeking a Data Engineer to build and maintain core datasets and develop REST APIs and streaming pipelines. You'll work with technologies like Kafka and Apache Spark to ensure optimal functionality and reliability.
Job Description
Who you are
You have a strong background in data engineering, with experience in building and maintaining core datasets that include vessel characteristics, companies, and geospatial data. You are skilled in creating and maintaining REST APIs, and you have a solid understanding of streaming pipelines, particularly with Kafka. Your expertise in Apache Spark allows you to handle batch processing tasks efficiently, ensuring data is processed accurately and on time.
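As an illustration of the kind of per-message work a Kafka streaming pipeline like this involves, here is a minimal, hypothetical validator for a JSON vessel-position message. The field names (`imo`, `lat`, `lon`) and ranges are illustrative assumptions, not Kpler's actual schema:

```python
import json

def parse_position(raw: bytes):
    """Parse and validate one vessel-position message from a Kafka topic.

    Returns a clean dict, or None if the message is malformed.
    Field names here are illustrative, not Kpler's real schema.
    """
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError:
        return None
    # Reject messages missing required fields or with out-of-range coordinates.
    if not isinstance(msg.get("imo"), int):
        return None
    lat, lon = msg.get("lat"), msg.get("lon")
    if not (isinstance(lat, (int, float)) and -90 <= lat <= 90):
        return None
    if not (isinstance(lon, (int, float)) and -180 <= lon <= 180):
        return None
    return {"imo": msg["imo"], "lat": float(lat), "lon": float(lon)}
```

In a real pipeline a function like this would sit inside the consumer loop, and invalid messages would typically be routed to a dead-letter topic rather than silently dropped.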
You take pride in your end-to-end ownership of development tasks, starting with a thorough understanding of assigned tickets and requirements. You are adept at designing and building functionality, including APIs and data processing components, and you ensure that your code is deployed to development environments and undergoes rigorous peer and product testing. Your attention to detail ensures that all code is compliant with defined standards and best practices.
You are committed to writing and executing unit, integration, and functional tests that align with defined test scenarios. You understand the importance of validation and compliance in your work, and you strive to maintain high standards in all aspects of your role. After release, you take responsibility for monitoring system performance, alerts, and service level objectives (SLOs) to ensure optimal functionality and reliability.
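To make the SLO-monitoring point concrete, here is a hedged, minimal sketch of an availability check against an error budget. The 99.9% default target is an assumption for illustration, not a stated Kpler SLO:

```python
def slo_breached(total_requests: int, failed_requests: int,
                 slo_target: float = 0.999) -> bool:
    """Return True when observed availability falls below the SLO target.

    slo_target=0.999 (99.9%) is an illustrative default, not a real SLO.
    """
    if total_requests == 0:
        return False  # no traffic in the window: nothing to alert on
    availability = 1 - failed_requests / total_requests
    return availability < slo_target
```

A check like this would run over a rolling window and feed the alerts mentioned above; the test scenarios for it are easy to enumerate (zero traffic, just above target, just below).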
Desirable
Experience with cloud platforms and data warehousing solutions would be a plus, as would familiarity with data visualization tools. You are a proactive communicator, able to collaborate effectively with cross-functional teams to deliver impactful results.
What you'll do
In this role, you will be responsible for building and maintaining Kpler's core datasets, which are crucial for providing valuable insights to clients in the commodities, energy, and maritime sectors. You will create and maintain REST APIs that facilitate data access and integration, ensuring that clients can easily navigate complex markets.
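The pagination contract of such a data-access API might look like the following framework-agnostic sketch; the `/vessels` endpoint, its parameters, and the response shape are hypothetical, not Kpler's actual API:

```python
def get_vessels(dataset: list, offset: int = 0, limit: int = 50) -> dict:
    """Handler logic for a hypothetical paginated GET /vessels endpoint.

    Returns a page of results plus the metadata a client needs to
    request the next page. The response shape is illustrative only.
    """
    page = dataset[offset:offset + limit]
    return {
        "total": len(dataset),   # size of the full result set
        "offset": offset,
        "limit": limit,
        "results": page,
    }
```

In practice this logic would be wired into a web framework route, with validation of `offset`/`limit` and the dataset backed by a database query rather than an in-memory list.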
You will develop streaming pipelines using Kafka to handle real-time data processing, as well as batch pipelines with Apache Spark for more extensive data operations. Your work will involve end-to-end ownership of development tasks, from understanding requirements to deploying code and conducting thorough testing.
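As an illustration only: a Spark batch job of this kind often reduces to a grouped aggregation. The sketch below shows equivalent logic in plain Python (in PySpark it would be a `groupBy(...).agg(...)` over a DataFrame); the column names are assumptions, not a real Kpler dataset:

```python
from collections import defaultdict

def daily_cargo_totals(records: list) -> dict:
    """Sum cargo volume per (vessel, day) — the plain-Python equivalent
    of a Spark groupBy/sum. Field names are illustrative only."""
    totals = defaultdict(float)
    for rec in records:
        totals[(rec["imo"], rec["date"])] += rec["tons"]
    return dict(totals)
```

The point of running this in Spark rather than a single process is scale: the same grouped aggregation is partitioned across a cluster when the input is billions of rows.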
You will collaborate closely with other engineers and stakeholders to ensure that the data processing components you build meet the needs of the organization and its clients. Your role will also include monitoring system performance and responding to alerts to maintain optimal functionality.
What we offer
At Kpler, you will join a team of over 700 experts from more than 35 countries, all dedicated to transforming intricate data into actionable strategies. We provide a supportive environment where you can leverage cutting-edge innovation for impactful results. You will have the opportunity to work on user-friendly platforms that simplify global trade information and empower organizations to make informed decisions.
We encourage you to apply even if your experience doesn't match every requirement. Join us in our mission to deliver top-tier intelligence and help our clients stay ahead in a dynamic market landscape.