
About Rackspace
Your partner in managed cloud solutions
Key Highlights
- Headquartered in San Antonio, Texas
- Over 200,000 customers including BMW and NASA
- $1.5B+ raised in funding
- Approximately 7,000 employees worldwide
Rackspace Technology, Inc., headquartered in San Antonio, Texas, is a leading managed cloud computing company that provides services such as cloud migration, managed hosting, and multi-cloud solutions. With over 200,000 customers, including major brands like BMW and NASA, Rackspace has raised over $1.5 billion in funding.
🎁 Benefits
Employees enjoy competitive salaries, stock options, generous PTO policies, remote work flexibility, and comprehensive health benefits.
🌟 Culture
Rackspace fosters a customer-centric culture with a strong emphasis on service excellence and innovation in cloud technology.
Overview
Rackspace is hiring a Senior Big Data Infra Engineer to develop and scale stream and batch processing systems. You'll work with cloud technologies and automation tools like Terraform and Docker. This position requires 4+ years of experience in data processing systems.
Job Description
Who you are
You are a highly skilled Senior Big Data Infra Engineer with a strong background in developing and scaling both stream and batch processing systems. You have a solid understanding of public cloud technologies and are comfortable working in a remote environment. You communicate clearly and solve complex problems independently and creatively.
You have experience implementing automation and DevOps best practices for CI/CD, Infrastructure as Code (IaC), and containerization. You are adept at building reusable infrastructure for stream and batch processing systems at scale, and you have a knack for creating automation through scripting and building DevOps pipelines.
What you'll do
In this role, you will be responsible for developing and scaling data processing systems, using technologies such as Pub/Sub, Kafka, Kinesis, Dataflow, Flink, Hadoop, Pig, Hive, and Spark. You will work with public cloud services on Azure, AWS, or GCP, drawing on a deep understanding of cloud-managed services and cloud-based messaging and stream processing systems.
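For illustration only, here is a minimal Python sketch of the kind of stream consumer this work involves, using the confluent-kafka client; the broker address, consumer group, and topic name are hypothetical placeholders, not details from the posting.

```python
from confluent_kafka import Consumer

# Hypothetical broker, group, and topic names, for illustration only.
consumer = Consumer({
    "bootstrap.servers": "broker:9092",
    "group.id": "events-processor",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)  # block up to 1s for the next record
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # A real pipeline would parse, enrich, and forward the record here.
        print(f"{msg.topic()}[{msg.partition()}] offset {msg.offset()}: {msg.value()}")
finally:
    consumer.close()  # commit final offsets and leave the group cleanly
```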
You will participate in working sessions with clients and produce technical documentation so that the systems you develop are clearly understood. Your grounding in infrastructure and applied DevOps principles will be central to your daily work as you use continuous integration and continuous deployment (CI/CD) tooling and Infrastructure as Code (IaC) tools such as Terraform to automate and improve development and release processes.
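As a hedged sketch of how Terraform might be driven from a CI/CD step, the snippet below shells out to the standard terraform init/plan/apply commands from Python; this particular workflow is an assumption for illustration, not a Rackspace-specified process.

```python
import subprocess

def run(cmd: list[str]) -> None:
    """Run a command and fail the pipeline step if it exits non-zero."""
    subprocess.run(cmd, check=True)

# Typical non-interactive CI sequence: initialize providers and modules,
# write out a plan, then apply exactly that plan.
run(["terraform", "init", "-input=false"])
run(["terraform", "plan", "-input=false", "-out=tfplan"])
run(["terraform", "apply", "-input=false", "tfplan"])
```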
Additionally, you will leverage your knowledge of containerization technologies such as Docker and Kubernetes to enhance the scalability and efficiency of the systems you build. Your role will involve collaborating with cross-functional teams to ensure that the infrastructure meets the needs of the business and clients.
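As one small, hedged example of container orchestration in practice, the following sketch uses the official Kubernetes Python client to scale out a hypothetical stream-processor Deployment; the deployment and namespace names are placeholders rather than details from this role.

```python
from kubernetes import client, config

# Load credentials from the local kubeconfig (use load_incluster_config()
# when running inside a cluster).
config.load_kube_config()

apps = client.AppsV1Api()

# Hypothetical Deployment and namespace names: scale the stream processor
# out to five replicas.
apps.patch_namespaced_deployment_scale(
    name="stream-processor",
    namespace="data-pipelines",
    body={"spec": {"replicas": 5}},
)
```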
What we offer
At Rackspace, we offer a collaborative work environment where you can thrive and grow your skills. You will have the opportunity to work with cutting-edge technologies and be part of a team that values innovation and creativity. We encourage you to apply even if your experience doesn't match every requirement, as we believe in the potential of our team members to grow and succeed together.