G2i Inc.

About G2i Inc.

Connecting companies with top-tier remote developers

🏢 Tech · 👥 11-50 · 📅 Founded 2012 · 📍 Delray Beach, Florida, United States

Key Highlights

  • Headquartered in Delray Beach, Florida
  • Specializes in React, React Native, and Node.js
  • Serves a diverse range of clients from startups to enterprises
  • Vetted remote engineers ensure high-quality development

G2i is a specialized talent marketplace headquartered in Delray Beach, Florida, connecting companies with vetted software developers skilled in web, mobile, and cross-platform technologies, particularly React, React Native, and Node.js. With a focus on remote work, G2i serves clients ranging from startups to enterprises.

🎁 Benefits

G2i offers competitive compensation, flexible remote work options, and a supportive environment for engineers to thrive. Employees enjoy a generous PTO policy.

🌟 Culture

G2i fosters a remote-first culture that emphasizes trust and autonomy, allowing engineers to work from anywhere while focusing on delivering high-quality work.


Data Engineer

G2i Inc. · Europe - Remote

Posted 2 months ago · 🏠 Remote · Data Engineer · 📍 Europe · 💰 $80 / hour

Overview

G2i Inc. is hiring a Data Engineer to design and maintain ETL pipelines and data infrastructure for high-growth startups. You'll work with TypeScript and contribute to data orchestration and analytics. This role is fully remote and focused on operational excellence.

Job Description

Who you are

You have experience in data engineering, particularly in designing and maintaining ETL/ELT pipelines that collect, transform, and load data from various sources into a central data layer. You are skilled in TypeScript and understand the importance of writing robust, well-tested code for data and pipeline orchestration. You thrive in collaborative environments, partnering with cross-functional stakeholders to define data needs and ensure data quality, reliability, and availability.

You are detail-oriented and proactive in monitoring data pipelines, setting up alerting systems, and ensuring data accuracy and performance as systems grow. You have a strong understanding of data workflows and best practices, contributing to the design and architecture of data infrastructure that serves as a source of truth for operational systems.
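
To give a concrete feel for the pipeline work described above, here is a minimal, self-contained TypeScript sketch of an extract-transform-load step. The interfaces, field names, and in-memory "table" are illustrative assumptions only, not G2i's actual schema or tooling.

```typescript
// Illustrative only: a toy extract-transform-load step over in-memory data.
// Types and field names are hypothetical, not an actual production schema.

interface RawEvent {
  id: string;
  occurredAt: string;   // ISO timestamp from the source system
  amountCents: number;
}

interface FactRow {
  eventId: string;
  occurredAt: Date;
  amountUsd: number;
}

// Extract: a real pipeline would read from an API, queue, or database.
function extract(): RawEvent[] {
  return [
    { id: "evt_1", occurredAt: "2024-05-01T12:00:00Z", amountCents: 1250 },
    { id: "evt_2", occurredAt: "2024-05-01T12:05:00Z", amountCents: 980 },
  ];
}

// Transform: normalize types and units into the shape of the central data layer.
function transform(events: RawEvent[]): FactRow[] {
  return events.map((e) => ({
    eventId: e.id,
    occurredAt: new Date(e.occurredAt),
    amountUsd: e.amountCents / 100,
  }));
}

// Load: here just an in-memory array; a real pipeline would write to a warehouse.
const factTable: FactRow[] = [];
function load(rows: FactRow[]): void {
  factTable.push(...rows);
}

load(transform(extract()));
console.log(`Loaded ${factTable.length} rows into the fact table`);
```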

What you'll do

As a Data Engineer, you will be responsible for the end-to-end data workflow: ingestion, schema design, transformation, and enabling downstream analytics and dashboards. You will collaborate closely with operations, analytics, and engineering teams to understand their data needs and ensure that the data infrastructure supports their workflows effectively. Your role will involve writing and maintaining code in TypeScript, as well as other languages and tools as needed.

You will monitor the performance of data pipelines, set up alerting mechanisms, and ensure that the systems you build are scalable and reliable. Your contributions will help in building the infrastructure that supports internal tooling and operational systems, enabling the company to deliver high value to its clients. You will also play a key role in defining best practices around data workflows, ensuring that the data processes are efficient and effective.
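
As a rough illustration of the monitoring side of the role, the sketch below checks a pipeline run for freshness and volume and fires an alert callback. The thresholds and the alert channel are hypothetical; in practice the callback might post to a chat channel or page an on-call rotation.

```typescript
// Illustrative only: a toy pipeline freshness/volume check with a pluggable alert hook.
// Thresholds and the alert destination are assumptions, not an actual setup.

interface PipelineRun {
  pipeline: string;
  finishedAt: Date;
  rowsWritten: number;
}

interface FreshnessRule {
  maxAgeMinutes: number;   // how stale the latest run may be
  minRows: number;         // smallest acceptable batch size
}

function checkPipeline(
  lastRun: PipelineRun,
  rule: FreshnessRule,
  alert: (message: string) => void,
  now: Date = new Date()
): void {
  const ageMinutes = (now.getTime() - lastRun.finishedAt.getTime()) / 60_000;
  if (ageMinutes > rule.maxAgeMinutes) {
    alert(`${lastRun.pipeline}: last run is ${ageMinutes.toFixed(0)} min old`);
  }
  if (lastRun.rowsWritten < rule.minRows) {
    alert(`${lastRun.pipeline}: only ${lastRun.rowsWritten} rows written`);
  }
}

// Example invocation: the alert callback here just logs to stderr.
checkPipeline(
  { pipeline: "daily_revenue", finishedAt: new Date(Date.now() - 90 * 60_000), rowsWritten: 12 },
  { maxAgeMinutes: 60, minRows: 100 },
  (msg) => console.error(`[ALERT] ${msg}`)
);
```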

What we offer

This position offers the flexibility of fully remote work, allowing you to collaborate with a diverse team across Europe. You will have the opportunity to work with high-growth startups, contributing to projects that transform how teams operate. The role is contractor-based, with a competitive rate of up to US$ 80/hr. We value craftsmanship, clarity, and autonomy, and we encourage you to apply even if your experience doesn't match every requirement. Join us in driving operational excellence through data engineering.

Interested in this role?

Apply now or save it for later. Get alerts for similar jobs at G2i Inc.

Similar Jobs You Might Like

Based on your interests and this role

Nebius AI

Data Engineer

Nebius AI · 📍 Amsterdam

Nebius AI is hiring a Data Engineer to design and maintain robust data infrastructure and pipelines. You'll work with SQL, Python, and Apache Airflow to optimize data workflows. This position requires experience in data engineering and analytics.

Mid-Level
18h ago
Wheely

Data Engineer

Wheely · 📍 London - Hybrid

Wheely is seeking a Data Engineer to enhance their Data Team by optimizing data integration pipelines and supporting business users. You'll work with technologies like SQL, Python, Kafka, and Snowflake to ensure a seamless data experience.

🏢 Hybrid · Mid-Level
1 month ago
Optiver

Data Engineer

Optiver · 📍 Singapore - On-Site

Optiver is hiring a Data Engineer (Commodities) to design and build high-performance data workflows for trading and research teams. You'll work with technologies like Python, SQL, and Airflow in Singapore.

🏛️ On-Site · Mid-Level
2 months ago
Confluent

Data Engineer

Confluent · 📍 United Kingdom - Remote

Confluent is hiring a Data Engineer to build efficient data pipelines that enable data accessibility across the organization. You'll work with technologies like Apache Airflow and Kafka. This position requires strong technical capabilities and experience in data engineering.

🏠 Remote · Mid-Level
2w ago
Pipedrive

Data Engineer

Pipedrive · 📍 Tartu

Pipedrive is hiring a Data Engineer to design and maintain data pipelines and models. You'll work with modern tools to ensure data reliability and scalability. This position requires a strong understanding of data engineering principles.

4 months ago