Sr. Cloud Data Engineer

Richardson, Texas


Type: Contract

Experience: 0

Category: Information Technology

Contractor Work Model: Onsite

Brand: System One

Compensation Range: 90.32 - 90.32 Per Hour

Job ID: 348208

Date Posted: 03/24/2026

Shortcut: http://jobs.systemone.com/Ne9S8A


Job Title: Sr. Cloud Data Engineer

Location: Richardson, Texas

Type: Contract

Compensation: $80 - $90 per hour

Contractor Work Model: Onsite

Hours: 40

Overview
A leading organization is seeking a Senior Cloud Data Engineer to design, build, and support scalable data solutions on AWS, with a strong emphasis on Apache Iceberg lakehouse architecture and Snowflake for cloud data warehousing and analytics. You'll help transform raw data into secure, high-quality datasets and data products that enable analytics, reporting, and operational use cases.

Responsibilities

  • Design and develop data architecture (AWS Lakehouse + Snowflake): Create scalable, reliable, and efficient data lakehouse solutions on AWS using Amazon S3 and Apache Iceberg, and design curated/consumption architectures in Snowflake to enable performant analytics and governed data sharing.
  • Build and maintain data pipelines (native AWS tooling): Design, construct, and automate ETL/ELT processes to ingest data from diverse sources into AWS, leveraging native services such as AWS Glue, Lambda, Step Functions, EventBridge, and orchestration patterns as appropriate.
  • Develop and manage Iceberg tables: Build and manage Apache Iceberg datasets, including table design, schema evolution, partition strategies, and compaction/maintenance patterns to support ACID transactions and scalable analytics.
  • Snowflake engineering: Design and implement Snowflake objects and pipelines to support analytics and data products (schemas, tables, views), and contribute to patterns for secure and governed consumption.
  • Create and manage data APIs / interfaces: Design, develop, and maintain secure and scalable RESTful (and other) APIs to facilitate data access for internal teams and applications, typically leveraging AWS services (e.g., API Gateway, Lambda, IAM).
  • Optimize performance and cost: Implement partitioning strategies, data layout optimization, and tuning techniques across Iceberg and Snowflake; monitor workloads and continuously improve efficiency and runtime performance.
  • Ensure data quality and integrity: Implement data validation, reconciliation, and error-handling processes; build observability into pipelines so issues are detected early and addressed quickly.
  • Collaborate with stakeholders: Work closely with analysts, data scientists, software engineers, and business teams to understand data needs and deliver effective, reusable solutions.
  • Provide technical support: Offer troubleshooting and technical expertise for data-related issues across pipelines, datasets, and endpoints.
  • Maintain documentation: Create and maintain technical documentation for data workflows, pipelines, dataset definitions, and API specifications.
Requirements
  • Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Experience: Proven experience in data engineering with significant hands-on experience building data solutions on AWS.
  • Programming: Proficiency in Python, Java, or Scala.
  • SQL: Strong SQL skills for querying, transformations, and data modeling / database design (including Snowflake SQL).
  • AWS Data Engineering Services: Practical experience with AWS services such as S3, Glue, Lambda, API Gateway, and IAM (and related native services used to build secure, automated pipelines).
  • Big Data: Experience with Apache Spark and Hadoop ecosystems.
  • API Development: Experience creating and deploying RESTful APIs, including best practices for performance and security.
  • ETL/Workflow / Orchestration: Experience with workflow orchestration tools (e.g., Airflow or AWS-native orchestration patterns).
  • DevOps / IaC: Familiarity with DevOps practices, CI/CD pipelines, and infrastructure as code (Terraform).
  • Apache Iceberg: Hands-on experience building and managing Apache Iceberg tables (schema evolution, partitioning strategies, table maintenance).
  • Snowflake: Experience implementing and operating data solutions in Snowflake, including data modeling for analytics and patterns for secure consumption.
  • Soft Skills: Strong problem-solving and analytical skills; excellent communication and collaboration skills; ability to work independently and as part of an agile team.
  • Certifications (Preferred): AWS data-related certifications and other relevant cloud/data engineering certifications.



System One, and its subsidiaries including Joulé and Mountain Ltd., are leaders in delivering outsourced services and workforce solutions across North America. We help clients get work done more efficiently and economically, without compromising quality. System One not only serves as a valued partner for our clients, but we offer eligible employees health and welfare benefits coverage options including medical, dental, vision, spending accounts, life insurance, voluntary plans, as well as participation in a 401(k) plan.


System One is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, age, national origin, disability, family care or medical leave status, genetic information, veteran status, marital status, or any other characteristic protected by applicable federal, state, or local law.
