
About Wizeline
Accelerating digital transformation for global businesses
Key Highlights
- Headquartered in San Francisco, California
- Over 1,000 employees worldwide
- Partners with clients like Google and Mastercard
- Raised over $100 million in funding
Wizeline is a global technology services company headquartered in San Francisco, California, specializing in software development, product design, and digital transformation. With a team of over 1,000 employees, Wizeline partners with clients like Google, Mastercard, and the BBC to deliver tailored ...
🎁 Benefits
Wizeline offers competitive salaries, equity options, a generous PTO policy, and flexible remote work arrangements. Employees also benefit from a lear...
🌟 Culture
Wizeline fosters a culture of innovation and collaboration, emphasizing an engineering-first approach. The company values diversity and inclusion, enc...
Overview
Wizeline is seeking a Lead Data Engineer to guide a team in migrating data ecosystems to a modern stack. You'll work with technologies like Snowflake, Airflow, and dbt. This role requires extensive experience in data engineering and leadership.
Job Description
Who you are
You are a highly experienced Lead Data Engineer with a strong background in data platform migrations. With over 5 years of experience in data engineering, you've successfully led teams through complex data migrations and understand the intricacies of data ecosystems. Your technical expertise includes working with tools such as PySpark, BigQuery, and Snowflake, allowing you to architect scalable solutions that ensure data quality and governance.
You possess a deep understanding of ELT processes and have hands-on experience designing and implementing data pipelines using Airflow and dbt. Your leadership style is collaborative, and you thrive in environments where you can mentor and guide your team to achieve their best work. You are passionate about leveraging data to drive business transformation and are committed to fostering a culture of growth and innovation.
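The ELT pattern described above (land raw data first, then transform inside the warehouse engine, as a dbt model would) can be sketched in a few lines. This is an illustrative sketch only, with SQLite standing in for a warehouse like Snowflake; the table and column names are hypothetical, not from this posting.

```python
import sqlite3

# Stand-in for a cloud warehouse; the ELT pattern is the same:
# load raw records first, then transform with SQL in the engine.
conn = sqlite3.connect(":memory:")

# Extract + Load: raw records land untransformed in a staging table.
conn.execute(
    "CREATE TABLE raw_orders (order_id INTEGER, amount_cents INTEGER, status TEXT)"
)
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 1250, "complete"), (2, 480, "cancelled"), (3, 9900, "complete")],
)

# Transform: the warehouse does the work (this is the step a dbt model
# would express as a SELECT), producing an analytics-ready table.
conn.execute("""
    CREATE TABLE fct_completed_orders AS
    SELECT order_id, amount_cents / 100.0 AS amount_usd
    FROM raw_orders
    WHERE status = 'complete'
""")

rows = conn.execute(
    "SELECT order_id, amount_usd FROM fct_completed_orders ORDER BY order_id"
).fetchall()
print(rows)  # [(1, 12.5), (3, 99.0)]
```

In production the orchestration of these steps would be handled by an Airflow DAG and the transformation layer by dbt, rather than inline SQL.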
Desirable
Experience with cloud cost management and FinOps practices is a plus. You have a background working with cross-functional global teams, which enhances your ability to communicate effectively and drive projects forward.
What you'll do
In this role, you will define and lead the end-to-end migration strategy from BigQuery, Athena, and PySpark to Snowflake. You will design, develop, and implement scalable ELT pipelines using Airflow for orchestration and dbt for transformations. Your responsibilities will include building robust data validation and reconciliation processes to ensure accuracy and integrity throughout the migration.
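The reconciliation step mentioned above can be sketched as a comparison of per-table metrics collected from both platforms. A minimal sketch, assuming row counts and column sums have already been pulled from the legacy and target systems; the table names, metrics, and `reconcile` helper are hypothetical:

```python
def reconcile(source_stats, target_stats):
    """Compare per-table metrics from the legacy and new platforms
    and return the tables whose metrics disagree."""
    mismatches = []
    for table, src in source_stats.items():
        tgt = target_stats.get(table)
        if tgt != src:
            mismatches.append((table, src, tgt))
    return mismatches

# Hypothetical stats: (row_count, sum_of_amount_column) per table, as
# they might be collected from BigQuery (source) and Snowflake (target).
source = {"orders": (1_000_000, 52_314_980), "customers": (84_210, 0)}
target = {"orders": (1_000_000, 52_314_980), "customers": (84_209, 0)}

print(reconcile(source, target))
# [('customers', (84210, 0), (84209, 0))]
```

In practice the same idea scales up to checksums or column-level hashes run on both platforms after each migration batch, so a drifted table is caught before cutover.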
You will also take on architectural leadership, optimizing scalable data solutions that meet the needs of the business. As a Lead Data Engineer, you will mentor your team, helping them grow their skills and navigate challenges. You will collaborate closely with other teams to ensure that the data platform aligns with business objectives and supports decision-making processes.
What we offer
Wizeline provides a high-impact environment where your contributions will directly influence our data strategy. We are committed to your professional development and offer a flexible, collaborative culture that encourages innovation. You will join a vibrant global community, with opportunities to broaden your horizons and make a significant impact in the field of data engineering.