
About Arcesium
Empowering asset managers with advanced fintech solutions
Key Highlights
- Spin-out from the D. E. Shaw group, building on its financial technology expertise
- Headquartered in New York City with a strong fintech focus
- Supports over 100 clients in the asset management sector
- Employs between 1,000 and 5,000 professionals
Arcesium, a spin-out of the D. E. Shaw group, specializes in software and services for asset managers, focusing on post-trade activities. Headquartered in New York City, Arcesium supports over 100 clients, including hedge funds and investment firms, with its comprehensive suite of financial technology solutions.
🎁 Benefits
Arcesium offers competitive salaries, equity options, generous PTO policies, and a flexible remote work environment to support work-life balance.
🌟 Culture
Arcesium fosters a culture centered on innovation and collaboration, emphasizing a strong engineering focus and a commitment to delivering high-quality solutions.
Skills & Technologies
Apache Spark, Airflow, AWS, distributed data processing
Overview
Arcesium is hiring a Lead Data Engineer to design and optimize scalable data pipelines for their Data Platform. You'll collaborate with Product Managers, Data Scientists, and Software Engineers, utilizing skills in Apache Spark, Airflow, and AWS. This role requires strong hands-on experience in distributed data processing.
Job Description
Who you are
You have 5+ years of experience in data engineering, with a strong focus on building and optimizing data pipelines. Your expertise in distributed data processing allows you to architect robust solutions that meet complex business requirements. You thrive in collaborative environments, working closely with cross-functional teams to translate business needs into technical solutions.
You possess hands-on experience with tools like Apache Spark and Airflow, which you have used to streamline data workflows and enhance operational efficiency. Your proficiency in AWS enables you to leverage cloud technologies effectively, ensuring scalability and reliability in data processing tasks. You are detail-oriented and have a knack for problem-solving, which helps you identify and address potential issues before they escalate.
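The posting does not include code, but as a rough illustration of the kind of orchestration work it describes, here is a minimal Airflow DAG sketch that submits a Spark job on a daily schedule. The DAG id, script path, connection id, and arguments are hypothetical, and the sketch assumes a recent Airflow 2.x install with the Apache Spark provider; it is not Arcesium's actual pipeline.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

# Hypothetical daily pipeline: submit a Spark job that normalizes post-trade
# records. All names and paths below are illustrative assumptions.
default_args = {
    "owner": "data-platform",
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="post_trade_normalization",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    normalize_trades = SparkSubmitOperator(
        task_id="normalize_trades",
        application="/opt/jobs/normalize_trades.py",  # hypothetical job script
        conn_id="spark_default",
        application_args=["--date", "{{ ds }}"],  # templated execution date
    )
```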
What you'll do
In this role, you will lead the design and implementation of scalable data pipelines that support Arcesium's Data Platform. You will work closely with Product Managers and Subject Matter Experts to understand business requirements and translate them into engineering solutions. Your responsibilities will include architecting data pipelines, optimizing existing workflows, and ensuring data quality and integrity throughout the process.
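To make the "data quality and integrity" responsibility concrete, the following PySpark sketch shows one common pattern: gating a batch on simple completeness and uniqueness checks before promoting it downstream. The column names, paths, and failure policy are assumptions for illustration, not Arcesium specifics.

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative quality gate for a pipeline stage: reject a batch of
# (hypothetical) trade records if key fields are missing or duplicated.
spark = SparkSession.builder.appName("trade-quality-check").getOrCreate()

trades = spark.read.parquet("/data/staging/trades")  # hypothetical input path

total = trades.count()
null_keys = trades.filter(F.col("trade_id").isNull()).count()
duplicates = total - trades.dropDuplicates(["trade_id"]).count()

if null_keys or duplicates:
    raise ValueError(
        f"Data quality check failed: {null_keys} null trade_ids, "
        f"{duplicates} duplicate trade_ids out of {total} rows"
    )

# Only clean batches are promoted downstream.
trades.write.mode("overwrite").parquet("/data/curated/trades")
```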
You will collaborate with Data Scientists and Software Engineers to integrate data solutions into the broader technology stack, enabling advanced analytics and reporting capabilities. Your role will also involve mentoring junior engineers, fostering a culture of continuous learning and improvement within the team. You will be instrumental in driving efficiency and speed in data processing, contributing to Arcesium's mission of delivering innovative financial technology solutions.
What we offer
Arcesium provides a dynamic work environment where you can grow your skills and advance your career. We value intellectual curiosity and proactive ownership, empowering you to make meaningful contributions from day one. Join us as we tackle complex data challenges and help our clients achieve transformational business outcomes.