
About BitGo
Secure digital asset solutions for institutions
Key Highlights
- Headquartered in Palo Alto, CA, with 201-500 employees
- $170.5 million raised in Series B funding
- Flagship multi-signature wallet used by major institutional clients
- In acquisition talks with PayPal and Galaxy Digital
BitGo is a regulated cryptocurrency custodian headquartered in Palo Alto, CA, specializing in secure custody, liquidity, and portfolio management solutions for institutional clients. Having raised over $170.5 million in Series B funding, BitGo offers a flagship multi-signature wallet that is trusted by major play...
🎁 Benefits
BitGo offers competitive compensation packages, including stock options and a 401k plan. Employees enjoy catered lunches, flexible vacation time, and ...
🌟 Culture
BitGo fosters a culture focused on security and innovation within the digital asset industry. By exclusively serving institutional clients, the compan...
Overview
BitGo is seeking a Data Engineer to build robust and scalable data pipelines. You'll work with SQL, Python, and modern data platforms like Snowflake and dbt. This role requires strong experience in data engineering and a collaborative mindset.
Job Description
Who you are
You are a skilled Data Engineer with a strong background in building robust and scalable data pipelines. With expertise in SQL and Python, you have hands-on experience with modern data platforms such as Snowflake and dbt. You understand the importance of data quality monitoring and have familiarity with reconciliation processes and anomaly detection. Your collaborative mindset allows you to work effectively with control-focused teams, and you pay close attention to detail in your work. A background in supporting audit, compliance, or risk functions is a plus, as you thrive in environments that require precision and accountability.
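To give a concrete sense of the reconciliation work mentioned above, here is a minimal Python sketch of a row-count and checksum comparison between a source table and its Snowflake target, using the snowflake-connector-python library. The table names, the amount column, and the environment-variable credentials are illustrative placeholders, not details of BitGo's actual stack.

    # Minimal reconciliation sketch: compare row counts and a simple aggregate
    # between a source table and its Snowflake target. Table names, the "amount"
    # column, and credentials are placeholders, not BitGo's actual setup.
    import os
    import snowflake.connector

    SOURCE_TABLE = "raw.wallet_transactions"        # hypothetical source table
    TARGET_TABLE = "analytics.wallet_transactions"  # hypothetical modeled output

    def fetch_metrics(cursor, table: str):
        # A row count plus a column sum acts as a cheap content checksum.
        cursor.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
        return cursor.fetchone()

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    cur = conn.cursor()
    source_count, source_sum = fetch_metrics(cur, SOURCE_TABLE)
    target_count, target_sum = fetch_metrics(cur, TARGET_TABLE)

    if source_count != target_count or abs(float(source_sum) - float(target_sum)) > 0.01:
        raise ValueError(
            f"Reconciliation failed: {SOURCE_TABLE} vs {TARGET_TABLE} "
            f"({source_count} vs {target_count} rows)"
        )
    print("Reconciliation passed")

A check like this would typically run after each pipeline load and fail loudly, giving auditors a repeatable record of whether source and target stayed in agreement.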
What you'll do
In this role at BitGo, you will be responsible for designing and implementing data pipelines that support various business functions. You will collaborate closely with engineers, analysts, data scientists, product managers, and security experts across multiple domains, gaining exposure to all facets of the business. Your work will ensure that the data infrastructure is robust and scalable, enabling the organization to securely navigate the digital asset space. You will also be involved in monitoring systems to ensure data integrity and quality, providing repeatable and auditable results to stakeholders. As part of a talented workforce, you will have opportunities to learn and grow while contributing to the operational backbone of the digital economy.
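As an illustration of the data-quality monitoring described above, the sketch below flags a daily load whose row count drifts more than three standard deviations from its recent history. It uses only the Python standard library; the feed name, window, threshold, and counts are hypothetical examples rather than anything from BitGo's pipelines.

    # Minimal anomaly-detection sketch: flag a daily load whose row count falls
    # far outside the trailing mean. The counts would normally come from pipeline
    # metadata; the values below are illustrative only.
    from statistics import mean, stdev

    def is_anomalous(history: list[int], today: int, z_threshold: float = 3.0) -> bool:
        """Return True if today's count is an outlier relative to recent history."""
        if len(history) < 2:
            return False  # not enough history to judge
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return today != mu
        return abs(today - mu) / sigma > z_threshold

    # Example: trailing daily row counts for a hypothetical transactions feed.
    trailing_counts = [10_120, 9_980, 10_340, 10_050, 10_210]
    print(is_anomalous(trailing_counts, today=4_500))  # True: sudden drop flagged for review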
What we offer
BitGo offers a dynamic work environment where you can be part of a team that is transforming finance through digital asset solutions. You will work onsite in the Bangalore office, collaborating with colleagues who value learning and growth. The company provides snacks on the house and fosters a culture of collaboration and innovation. Join us and be part of a leading infrastructure provider in the digital asset space, where your contributions will have a significant impact on the future of finance.
Similar Jobs You Might Like

Data Engineer
Real is hiring a Data Engineer to lead the creation and maintenance of data infrastructure and reporting capabilities. You'll work with SQL and AWS, and implement BI systems like Tableau. This role requires advanced knowledge of data design and performance.

Data Engineer
Drivetrain is hiring a Data Engineer to lay the foundation of an exceptional data engineering practice. You'll work with various programming languages and big data workflow frameworks. This position requires strong software engineering fundamentals and the ability to learn new technologies quickly.

Data Engineer
Coupang is hiring a Staff Data Engineer to architect and develop data ingestion systems, data lakes, and data warehouses. You'll work with technologies like AWS, Apache Spark, and SQL to enable data-driven decision-making across the organization.