
About Qonto
Empowering SMEs with seamless financial solutions
Key Highlights
- Raised $732.7 million in funding to date
- More than 1,000 employees across multiple countries
- Expanded from France to Spain, Germany, and Italy
- Acquired Penta to enhance digital banking for SMEs
Qonto, headquartered in Saint-Georges, Paris, is a leading fintech company that provides payment services and bookkeeping solutions for SMEs and freelancers. Founded in 2016, Qonto has raised $732.7 million in funding and has expanded its operations from France into Spain, Germany, and Italy. The co...
Benefits
Qonto offers a tailor-made remote policy that accommodates hybrid and full-remote work options, a Lunch Card for meals, dynamic career tracks for prof...
Culture
Qonto fosters a culture of innovation and agility, focusing on the digitization of financial services. The company emphasizes collaboration with other...
Skills & Technologies
Overview
Qonto is hiring a Staff Data Platform Engineer to architect and implement scalable data platform tools that support AI product development. You'll work with technologies like Python, AWS, and Docker to enable effective deployment and operation of AI models. This position requires experience in regulated environments, preferably in financial services.
Job Description
Who you are
You have a strong background in data engineering with a focus on building scalable data platforms: your experience includes architecting solutions that support AI product development and deploying machine learning models effectively. You are proficient in Python and have hands-on experience with cloud platforms like AWS, and you build solutions that are robust and efficient. Your familiarity with containerization technologies such as Docker and orchestration tools like Kubernetes lets you streamline deployment processes and improve system reliability.
You understand the importance of data governance and compliance, especially in regulated environments like financial services: your previous roles have given you the discipline to maintain data integrity and security. You thrive in collaborative settings, acting as a technical bridge between AI product teams and platform engineering and ensuring seamless integration of AI capabilities across business areas.
You are passionate about continuous improvement and mastery in your field: you actively seek opportunities to raise the bar for yourself and your team, contributing to a culture of excellence. Your ability to explain complex technical concepts to non-technical stakeholders makes you an invaluable asset in cross-functional projects.
Desirable
Experience with machine learning frameworks and tools is a plus, as is familiarity with data visualization tools like Tableau or Looker. Experience with agile methodologies or project management tools is also welcome, as these skills strengthen your contributions to the team.
What you'll do
In this role, you will architect and implement scalable data platform tools that support AI product development; your primary focus will be enabling teams to deploy and operate AI models effectively. You will collaborate closely with AI product teams to understand their data needs and ensure that the platform meets those requirements. Your responsibilities will include designing data pipelines, optimizing data storage solutions, and ensuring data quality across sources, along the lines of the sketch below.
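As an illustration of the data-quality side of that pipeline work, here is a minimal sketch of a batch validation gate in Python. The column names and rules are hypothetical and not taken from Qonto's stack; the sketch only shows the general shape of a quality check that sits between ingestion and loading.

```python
# Minimal, hypothetical data-quality gate for a batch of records.
import pandas as pd

# Hypothetical schema for an incoming batch; not Qonto's actual data model.
REQUIRED_COLUMNS = {"transaction_id", "amount", "currency", "booked_at"}


def validate_batch(df: pd.DataFrame) -> pd.DataFrame:
    """Reject a batch missing required columns, then drop rows that fail basic checks."""
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        # Fail fast on schema drift rather than loading incomplete data.
        raise ValueError(f"Batch is missing required columns: {sorted(missing)}")

    # Drop rows with null keys or amounts, then rows with non-positive amounts.
    clean = df.dropna(subset=["transaction_id", "amount"])
    clean = clean[clean["amount"] > 0]

    rejected = len(df) - len(clean)
    if rejected:
        print(f"Dropped {rejected} rows failing quality checks")
    return clean
```

In a real pipeline, a gate like this would typically emit metrics and route rejected rows to a quarantine table instead of printing, but the control flow is the same.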
You will also play a key role in integrating AI capabilities into existing business processes; this involves working with data scientists and product managers to identify opportunities to leverage AI to improve operational efficiency and customer experience. Your expertise in tools like Apache Spark and Airflow will be crucial in building and maintaining these data workflows.
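For context on the kind of orchestration the paragraph above refers to, the following is a minimal sketch of an Airflow DAG that chains an extract step, a Spark transformation, and a load step. The DAG id, task names, and spark-submit script are hypothetical, and the sketch assumes Airflow 2.x with the standard BashOperator rather than any specific setup at Qonto.

```python
# Minimal, hypothetical Airflow DAG: extract -> Spark transform -> load.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_feature_pipeline",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",          # run once per day
    catchup=False,                       # do not backfill past runs
) as dag:
    extract = BashOperator(
        task_id="extract_raw_events",
        bash_command="echo 'pull raw events from the source system'",
    )
    transform = BashOperator(
        task_id="spark_transform",
        # Hypothetical Spark job script; a SparkSubmitOperator could be used instead.
        bash_command="spark-submit transform_events.py",
    )
    load = BashOperator(
        task_id="load_to_warehouse",
        bash_command="echo 'load curated tables into the warehouse'",
    )

    # Dependencies: extract must finish before transform, which must finish before load.
    extract >> transform >> load
```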
Additionally, you will mentor junior engineers and contribute to the overall technical strategy of the data engineering team; your leadership will help foster a culture of learning and innovation. You will also evaluate new technologies and tools that can enhance the data platform's capabilities.
What we offer
At Qonto, we believe in creating a welcoming environment where everyone can thrive: we prioritize diversity and inclusion in our hiring process and encourage applicants from all backgrounds to apply. You will work in a dynamic, supportive team, contributing to a mission that empowers SMEs across Europe. We offer competitive compensation and benefits, along with opportunities for professional growth and development within the company. Join us in building a finance workspace that truly makes a difference for our customers.
Similar Jobs You Might Like

Data Engineer
Skello is hiring a Senior Data Engineer to implement technical solutions for data utilization across the company. You'll work with Python, Docker, and Airflow, focusing on data analysis and reporting. This role requires experience in data engineering and cloud platforms.

Data Engineer
Workato is hiring an Intern, Data Engineering to assist in transforming technology complexity into business opportunities. You'll be involved in streamlining operations and connecting data, processes, and applications. This role is ideal for those looking to start their career in data engineering.

Data Engineer
Ogury is seeking a Data Engineer to optimize data architecture and enhance reporting systems. You'll work closely with various teams to make data accessible and actionable for business users. This role requires a passion for creating and optimizing data pipelines.

Data Engineer
Odaseva is hiring a Data Engineer to design, build, and optimize scalable data pipelines for their Enterprise Data Platform. You'll work with Salesforce data and help kick-start AI use cases. This position requires expertise in data management and Salesforce environments.

Data Engineer
Zoox is hiring a Data Engineer Intern to build data pipelines and infrastructure for autonomy development. You'll work with technologies like Databricks, Spark, and Looker in Foster City, CA.