Onebridge, a Marlabs Company, is an AI and data analytics consulting firm that strives to improve outcomes for the people we serve through data and technology. We have served some of the largest healthcare, life sciences, manufacturing, financial services, and government entities in the U.S. since 2005. We have an exciting opportunity for a highly skilled GCP Data Engineer to join an innovative and dynamic group of professionals at a company rated among the top “Best Places to Work” in Indianapolis since 2015.
GCP Data Engineer | About You
As a GCP Data Engineer, you are responsible for designing, developing, and maintaining scalable data solutions that empower data-driven decision-making across the organization. With extensive experience in GCP, you specialize in optimizing data pipelines, working with big data processing frameworks, and ensuring data integrity and availability. You thrive in fast-paced environments and excel at solving complex data challenges using innovative cloud architectures. Your ability to collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders ensures that your solutions meet business needs. You are committed to building reliable, efficient, and secure data systems that support analytics and business intelligence.
GCP Data Engineer | Day-to-Day
Design, develop, and maintain robust data pipelines leveraging GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
Optimize and scale ETL workflows to ensure high performance, reliability, and efficiency in large-scale data processing.
Implement data modeling, warehousing solutions, and best practices for data organization and accessibility using BigQuery.
Ensure data quality and integrity by developing rigorous testing, validation, and monitoring mechanisms across all data workflows.
Collaborate effectively with data scientists, analysts, and application teams to translate business requirements into scalable data solutions.
Automate infrastructure deployment using Infrastructure-as-Code (IaC) tools like Terraform, and manage security, access control, and compliance standards.
GCP Data Engineer | Skills & Experience
5+ years of experience as a Data Engineer with deep expertise in Google Cloud Platform (GCP) and a proven track record of designing scalable data solutions.
Proficient in SQL and Python, using them to develop efficient data pipelines, process large datasets, and automate workflows.
Hands-on experience with GCP services such as BigQuery, Dataflow, Cloud Storage, Pub/Sub, and Cloud Composer for building robust cloud-based data infrastructure.
Strong understanding of ETL/ELT processes, with expertise in handling and transforming large-scale datasets in cloud environments.
Familiarity with distributed data processing frameworks such as Apache Beam and Spark for handling complex data transformations at scale.
In-depth knowledge of data governance, security best practices, and compliance standards, ensuring the integrity and safety of cloud-based data systems.
Experience in the Telco domain or a Google Cloud Professional Data Engineer certification is highly desirable.