PRI Talent is hiring a Data Engineering SDET on behalf of our client. This role is a full-time, 1099 contract staff augmentation position with a company that is a leader in reducing electronic waste and finding value in gently used electronics. Our client has seen staggering growth and made an extraordinary impact in protecting the planet, all while providing a work culture unlike any other.

Our client is seeking a highly skilled and motivated Data Engineering SDET (Software Development Engineer in Test) to join their data engineering team. This role focuses on ensuring the quality and reliability of the client's data pipelines, data warehousing, and analytics solutions through automated testing. The ideal candidate will have experience in data engineering, ETL and ELT (Extract, Transform, Load) orchestration tools, SQL, and Tableau.

Key Responsibilities

  • Design, develop, and execute comprehensive test plans and test cases for ETL processes, machine learning models, and APIs to validate data quality, transformation logic, and performance. Identify and report defects, and work closely with data engineers to resolve issues.
  • Collaborate with data engineers to validate and optimize data warehousing solutions. Ensure data consistency, accuracy, and efficient storage.
  • Utilize ETL orchestration tools like Fivetran or similar platforms to automate and schedule data workflows. Create tests to validate the functionality and reliability of these workflows.
  • Develop and maintain test suites for Tableau dashboards and reports. Verify data accuracy and dashboard functionality to ensure data visualizations provide meaningful insights.
  • Write and execute SQL queries to validate data transformations, data loading, and data retrieval processes. Ensure data consistency and correctness at each stage of the pipeline.
  • Test real-time data streaming processes, particularly those built on Amazon Kinesis, for correctness, data integrity, and performance.
  • Implement and maintain regression test suites to ensure that changes or updates to data pipelines do not introduce new issues or regressions.
  • Develop and maintain automated test scripts and frameworks for data engineering processes to improve testing efficiency and coverage.
  • Collaborate closely with data engineers, analysts, and other stakeholders to understand requirements and ensure data quality and reliability.
  • Document test cases, test plans, and test results. Create and maintain documentation on data pipelines, ETL processes, and data structures.
  • Stay current with industry best practices, emerging technologies, and data engineering and testing trends—identify opportunities for process improvement and automation.

Education and Experience

  • Bachelor's degree in Computer Science, Information Technology, or related field.
  • Proven experience in software testing, specifically in data engineering, ETL processes, and data warehousing.
  • Strong SQL skills and experience working with databases such as SQL Server (or similar) and data warehouses such as Snowflake or Amazon Redshift.
  • Proficiency in ETL orchestration tools such as SnapLogic, Fivetran, or similar.
  • Experience with data visualization tools, particularly Tableau.
  • Familiarity with real-time data streaming technologies like Amazon Kinesis or Apache Kafka is a plus.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills.
  • Ability to work independently and as part of a team.
  • Knowledge of scripting languages (e.g., Python) for test automation is a plus.
  • Certifications in data engineering or software testing are advantageous.


Please note: we will not accept applications that do not include a cover letter and work examples.