
About Wizeline
Accelerating digital transformation for global businesses
Key Highlights
- Headquartered in San Francisco, California
- Over 1,000 employees worldwide
- Partners with clients like Google and Mastercard
- Raised over $100 million in funding
Wizeline is a global technology services company headquartered in San Francisco, California, specializing in software development, product design, and digital transformation. With a team of over 1,000 employees, Wizeline partners with clients like Google, Mastercard, and the BBC.
🎁 Benefits
Wizeline offers competitive salaries, equity options, a generous PTO policy, and flexible remote work arrangements. Employees also benefit from learning and development opportunities.
🌟 Culture
Wizeline fosters a culture of innovation and collaboration, emphasizing an engineering-first approach. The company values diversity and inclusion.
Overview
Wizeline is seeking a Mid Data Engineer to design and implement efficient data pipelines and models. You'll work with technologies like PySpark, SQL, and Snowflake to ensure data quality and performance. This role requires experience in data migration and pipeline development.
Job Description
Who you are
You have a solid background in data engineering, with experience in designing and implementing data pipelines that ensure efficient data migration and transformation. You are proficient in using PySpark and SQL, and you understand how to leverage dbt for building robust data models in Snowflake. Your expertise includes optimizing data pipelines for performance and cost efficiency, ensuring that data quality and accuracy are maintained throughout the process.
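To give a concrete sense of the kind of transformation work this involves, here is a minimal sketch of a dbt-style staging model: deduplicating raw rows and enforcing a simple quality rule. It runs against an in-memory SQLite database purely for illustration; in this role the same logic would live in a dbt model targeting Snowflake, and all table and column names below are invented.

```python
import sqlite3

# Illustrative only: stand-in for a raw source table landed in the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL);
    INSERT INTO raw_orders VALUES
        (1, 'acme',   120.0),
        (1, 'acme',   120.0),  -- duplicate row from a flaky source extract
        (2, 'globex', NULL);   -- missing amount to be defaulted
""")

# A staging-model-style transformation: deduplicate and default NULL amounts
# so downstream models can rely on clean, non-null values.
rows = conn.execute("""
    SELECT DISTINCT order_id, customer, COALESCE(amount, 0.0) AS amount
    FROM raw_orders
    ORDER BY order_id
""").fetchall()

print(rows)  # [(1, 'acme', 120.0), (2, 'globex', 0.0)]
```

The same `SELECT DISTINCT ... COALESCE(...)` pattern translates directly to Snowflake SQL inside a dbt model.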
You are skilled in troubleshooting and resolving data-related issues, and you proactively monitor data systems to ensure smooth operations. You have a collaborative mindset and enjoy working in a team environment where you can share your knowledge and learn from others. You are committed to professional development and are eager to grow your skills in a high-impact environment.
Desirable
Experience with Airflow for orchestrating data workflows is a plus, as is familiarity with data warehousing best practices. You may also have insights into effective AI use and opportunities for process automation, which can enhance your contributions to the team.
What you'll do
In this role, you will be responsible for designing, developing, and implementing data pipelines that migrate data from various sources to Snowflake. You will translate complex data requirements into actionable dbt models and transformations, ensuring that data is accurately represented and easily accessible for analysis. You will build and maintain Airflow DAGs to orchestrate data ingestion and transformation processes, optimizing existing pipelines for better performance and scalability.
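The orchestration side of this work comes down to declaring task dependencies. The sketch below expresses, in plain stdlib Python rather than real Airflow, the ordering an Airflow DAG would declare with `ingest >> transform >> [validate, load]`; the task names are invented, and in Airflow each would be an operator scheduled by Airflow itself rather than resolved by this code.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline dependencies: each key runs only after the tasks
# in its value set have completed. Mirrors what an Airflow DAG encodes.
dependencies = {
    "transform": {"ingest"},          # transform runs after ingest
    "validate": {"transform"},        # validate runs after transform
    "load_snowflake": {"transform"},  # load can run in parallel with validate
}

# A valid execution order respecting every dependency edge.
order = list(TopologicalSorter(dependencies).static_order())
print(order)
```

Airflow's scheduler performs this same topological resolution, additionally handling retries, scheduling intervals, and parallelism across workers.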
You will write complex SQL queries for data extraction, transformation, and loading, ensuring that data quality is upheld throughout the migration process. Your role will also involve identifying and resolving data-related issues, performance bottlenecks, and discrepancies, ensuring that data availability is maintained at all times.
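One common way to catch the discrepancies mentioned above is a post-migration reconciliation check comparing per-table row counts between source and target. The counts below are hard-coded for illustration; in practice they would come from `SELECT COUNT(*)` queries against each system, and all names here are invented.

```python
# Hypothetical row counts gathered from the source system and from Snowflake
# after a migration batch.
source_counts = {"orders": 10_000, "customers": 2_500, "events": 87_412}
target_counts = {"orders": 10_000, "customers": 2_499, "events": 87_412}

def find_discrepancies(source, target):
    """Return {table: (source_count, target_count)} for tables whose counts differ."""
    return {
        table: (source[table], target.get(table))
        for table in source
        if source[table] != target.get(table)
    }

discrepancies = find_discrepancies(source_counts, target_counts)
print(discrepancies)  # {'customers': (2500, 2499)}
```

Row counts are only a first-pass check; column-level checksums or sampled row comparisons catch discrepancies that counts alone would miss.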
You will collaborate with cross-functional teams to understand data needs and provide insights that drive business transformation. Your contributions will help accelerate market entry for clients by leveraging data and AI effectively.
What we offer
Wizeline offers a flexible and collaborative culture that fosters growth and innovation. You will be part of a vibrant community of professionals dedicated to making an impact through technology. We are committed to your professional development and provide global opportunities for career advancement. Our total rewards package is designed to recognize your contributions and support your well-being.