
About Wheely
Luxury ride-hailing with a focus on privacy
Key Highlights
- Founded in 2010, headquartered in London, UK
- Available in London, Moscow, Paris, and Dubai
- $43.1M raised in Series B funding
- Thousands of certified chauffeurs driving under the Wheely brand
Wheely is a privacy-first luxury ride-hailing platform founded in 2010, headquartered in Brackenbury Village, London. The company operates in major cities like Moscow, St. Petersburg, Paris, and Dubai, providing high-quality chauffeuring services with a rigorous driver certification process. With $4...
Benefits
Wheely offers a stock options plan, monthly credits for rides, generous training allowances, comprehensive healthcare benefits, and a daily lunch allowance.
Culture
Wheely prioritizes quality and customer satisfaction by employing a strict driver certification process, ensuring a premium experience for clients. Th...
Overview
Wheely is seeking a Data Engineer to enhance their Data Team by optimizing data integration pipelines and supporting business users. You'll work with technologies like SQL, Python, Kafka, and Snowflake to ensure a seamless data experience.
Job Description
Who you are
You have 3+ years of experience as a Data Infrastructure Engineer, Data Engineer, or MLOps Engineer, with a strong foundation in analytical databases such as Snowflake, Redshift, and BigQuery. Your expertise includes deploying and monitoring data pipelines using tools like Kafka and Airflow, ensuring that data flows smoothly and efficiently across systems. You are fluent in SQL and Python, allowing you to manipulate and analyze data effectively. Your experience in data modeling emphasizes a DRY and structured approach, and you apply performance tuning techniques to improve data accessibility and usability. You are also skilled in containerizing applications and code with Docker and Kubernetes, which streamlines deployment. Your ability to identify performance bottlenecks keeps the data infrastructure robust and responsive to business needs.
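To give a flavour of the pipeline work this paragraph describes, here is a minimal, illustrative sketch of one such step: consuming events from Kafka and landing them in Snowflake. The topic name, table, and credentials are hypothetical, and a production pipeline would more likely rely on Debezium/Kafka Connect or Snowpipe rather than row-by-row inserts.

```python
# Illustrative only: consume events from a hypothetical "ride_events" topic
# and load them into a hypothetical RAW_RIDE_EVENTS table in Snowflake.
import json

from kafka import KafkaConsumer            # kafka-python
import snowflake.connector                  # snowflake-connector-python

consumer = KafkaConsumer(
    "ride_events",                          # hypothetical topic name
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=False,
)

conn = snowflake.connector.connect(
    account="my_account",                   # placeholder credentials
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="RAW",
    schema="EVENTS",
)

# INSERT ... SELECT lets Snowflake apply PARSE_JSON to the raw payload.
INSERT_SQL = """
    INSERT INTO RAW_RIDE_EVENTS (event_id, ride_id, event_type, payload)
    SELECT %s, %s, %s, PARSE_JSON(%s)
"""

cur = conn.cursor()
try:
    for message in consumer:
        event = message.value
        cur.execute(
            INSERT_SQL,
            (event["id"], event["ride_id"], event["type"], json.dumps(event)),
        )
        consumer.commit()                   # commit the offset only after a successful write
finally:
    cur.close()
    conn.close()
    consumer.close()
```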
Desirable
Experience with researching and integrating open-source technologies related to data ingestion, data modeling, and BI reporting is a plus. You have a proactive mindset, always looking for ways to improve data quality and support business units with feature requests and bug fixes. Your intermediate level of English allows you to communicate effectively with team members and stakeholders.
What you'll do
As a Data Engineer at Wheely, you will strengthen the Data Team by implementing architectural best practices and optimizing low-level processes. You will support the evolution of data integration pipelines, using tools like Debezium and Kafka to ensure that data is accurately captured and processed. Your role will involve data modeling with dbt and working with database engines such as Snowflake to create efficient data structures. You will also engage in ML Ops using Airflow and MLflow, ensuring that machine learning models are effectively integrated into the data pipeline, and deliver BI reporting with Metabase and Observable so that business users can easily derive insights from data. You will support business units with feature requests, bug fixes, and data quality issues, ensuring that the data infrastructure meets the needs of the organization. Enforcing code quality, automated testing, and a consistent code style will also be part of your responsibilities, contributing to a high standard of work within the team.
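As a rough illustration of how these pieces might fit together, the sketch below shows a minimal Airflow DAG that rebuilds a dbt model, tests it, and then runs a training script that logs to MLflow. All DAG, model, and script names are hypothetical; the actual pipelines, schedules, and operators in use may differ.

```python
# Illustrative only: a minimal Airflow DAG sketch of the orchestration described above.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="rides_daily",                   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Rebuild the dbt models that power reporting (model names are placeholders).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --select rides_daily --profiles-dir /opt/dbt",
    )

    # Run dbt tests against the freshly built models before anything downstream reads them.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --select rides_daily --profiles-dir /opt/dbt",
    )

    # A training script that logs parameters and metrics to MLflow (path is a placeholder).
    train_model = BashOperator(
        task_id="train_model",
        bash_command="python /opt/ml/train_eta_model.py",
    )

    dbt_run >> dbt_test >> train_model
```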
What we offer
Wheely provides a supportive in-person culture while allowing flexible working hours and the option to work from home when needed. You will receive a monthly credit for Wheely journeys, a lunch allowance, and professional development subsidies to help you grow in your career. We also offer a cycle-to-work scheme and top-notch equipment to ensure you have everything you need to succeed. Depending on your role level, a relocation allowance may be available to assist with your transition to our team. We value your personal information and ensure it is collected, stored, and processed in accordance with Wheely's Candidate Privacy Notice.
Similar Jobs You Might Like

Data Engineer
Veeva Systems is hiring a Data Engineer to focus on data pipelines and enhance the Link data processing platform. You'll work in a flexible environment, contributing to the life sciences industry. This position requires expertise in data processing and pipeline management.

Data Engineer
DRW Holdings is hiring a Data Engineer to join their Data Experience team. You'll design and build data pipelines, govern centralized data processes, and collaborate with Traders and Quantitative Researchers. This role requires expertise in SQL and experience with Java or Python.

Data Engineer
Zopa is hiring a Data Engineer to design and build complex data pipelines that support decision-making across the bank. You'll work collaboratively with various teams to solve challenging problems in the finance sector.

Data Engineer
Intropic is hiring a Data Engineer to transform complex data into actionable insights. You'll collaborate with a small team to understand user needs and develop features. This role is based in London.

Senior Data Engineer
Fresha is hiring a Senior Data Engineer to enhance their data infrastructure and pipelines. You'll work with technologies like Kafka, Spark, and Flink in London. This role requires strong experience in data engineering.