Data Test Engineer
Experience Required: 4+ Years
Salary: Best in Industry
Position Summary:
We are looking for a skilled Data Test Engineer who can design, build, and validate end-to-end data pipelines. In this role, you will work closely with data engineers and business teams to ensure that data is accurate, complete, and reliable. You will be responsible for testing data workflows, writing complex SQL queries, automating data quality checks, and integrating validations into CI/CD pipelines. If you have a strong background in data engineering and a keen eye for quality, we’d love to have you on our team.
What We Look for in You
- A data-focused mindset with a passion for building reliable, scalable, and validated data pipelines
- A strong understanding of both data engineering and data quality assurance practices
- A meticulous eye for detail and a proactive attitude toward ensuring end-to-end data integrity
- The ability to collaborate with diverse stakeholders across engineering, analytics, and business
- A commitment to delivering clean, secure, and accurate data for decision-making
What You’ll Be Doing
- Design, develop, and maintain robust ETL/ELT pipelines to process large volumes of structured and unstructured data using Azure Data Factory, PySpark, and SQL-based tools
- Collaborate with data architects and analysts to understand transformation requirements and implement business rules correctly
- Develop and execute complex SQL queries to validate, transform, and performance-tune data workflows
- Perform rigorous data validation including source-to-target mapping (S2T), data profiling, reconciliation, and transformation rule testing
- Conduct unit, integration, regression, and performance testing for data pipelines and storage layers
- Automate data quality checks using Python and frameworks like Great Expectations, dbt, or custom-built tools
- Monitor data pipeline health and implement observability through logging, alerting, and dashboards
- Integrate testing into CI/CD workflows using tools like Azure DevOps, Jenkins, or GitHub Actions
- Troubleshoot and resolve data quality issues, schema changes, and pipeline failures
- Ensure compliance with data privacy, security, and governance policies
- Maintain thorough documentation for data flows, test logic, and validation processes
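The data validation and reconciliation duties above can be sketched in a few lines of Python. This is a minimal, illustrative example only: the table names, the key column, and the use of the standard-library sqlite3 module as a stand-in for a real warehouse connection are all assumptions, not a prescribed implementation.

```python
import sqlite3

def reconcile(conn, source_table, target_table, key_column):
    """Compare row counts and null rates for key_column between two tables.

    Returns a dict of check name -> bool (True means the check passed).
    """
    cur = conn.cursor()
    counts = {}
    nulls = {}
    for table in (source_table, target_table):
        # COUNT(*) counts all rows; COUNT(col) skips NULLs, so the
        # difference is the number of NULL key values in the table.
        cur.execute(f"SELECT COUNT(*), COUNT(*) - COUNT({key_column}) FROM {table}")
        counts[table], nulls[table] = cur.fetchone()
    return {
        "row_count_match": counts[source_table] == counts[target_table],
        "no_new_nulls": nulls[target_table] <= nulls[source_table],
    }

# Hypothetical demo data: a target table that dropped one row during load.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER);
    CREATE TABLE tgt (id INTEGER);
    INSERT INTO src VALUES (1), (2), (3);
    INSERT INTO tgt VALUES (1), (2);
""")
print(reconcile(conn, "src", "tgt", "id"))
# -> {'row_count_match': False, 'no_new_nulls': True}
```

In practice, checks like these would be wired into a framework such as Great Expectations or dbt tests and surfaced through the pipeline's alerting, rather than run as a standalone script.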
What You’ll Bring to the Table
- 4+ years of experience in Data Engineering and Data/ETL Testing
- Strong expertise in writing and optimizing SQL queries (joins, subqueries, window functions, performance tuning)
- Proficiency in Python or PySpark for data transformation and automation
- Hands-on experience with ETL tools such as Azure Data Factory, Talend, SSIS, or Informatica
- Familiarity with cloud platforms, preferably Azure; AWS or GCP is a plus
- Experience working with data lakes, data warehouses (Snowflake, BigQuery, Redshift), and modern data platforms
- Knowledge of version control systems (Git), issue tracking tools (JIRA), and Agile methodologies
- Exposure to data testing frameworks like Great Expectations, dbt tests, or custom validation tools
- Experience integrating data testing into CI/CD pipelines
- Nice to have: Familiarity with Airflow, Databricks, BI tools (Power BI, Tableau), and metadata management practices
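As a flavor of the SQL expertise listed above, here is the kind of window-function check a Data Test Engineer might write to catch duplicate loads. The orders table and its columns are hypothetical, and sqlite3 (whose bundled SQLite supports window functions since version 3.25) stands in for a real warehouse.

```python
import sqlite3

# Hypothetical orders table with a duplicated business key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, loaded_at TEXT);
    INSERT INTO orders VALUES
        (100, '2024-01-01'),
        (100, '2024-01-02'),
        (200, '2024-01-01');
""")

# ROW_NUMBER() numbers each copy of a business key, newest first;
# any row with rn > 1 is a duplicate, so a clean load returns no rows.
dupes = conn.execute("""
    SELECT order_id FROM (
        SELECT order_id,
               ROW_NUMBER() OVER (
                   PARTITION BY order_id ORDER BY loaded_at DESC
               ) AS rn
        FROM orders
    ) WHERE rn > 1
""").fetchall()
print(dupes)
# -> [(100,)]
```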
Core Benefits You’ll Gain
- Competitive salary aligned with industry standards
- Hands-on experience with enterprise-scale data platforms and cloud-native tools
- Opportunities to work on data-centric initiatives across AI, analytics, and enterprise transformation
- Access to internal learning accelerators, mentorship, and career growth programs
- Flexible work culture, wellness initiatives, and comprehensive health benefits
Apply and take your seat with us as an OptiSolite!
Explore and find out why we are made for each other: Life at OptiSol
Read and discover: Life at OptiSol – Medium
Learn more about our culture: OptiSol’s Instagram page
Is this your playground? Then fill out the form to start the application process and we’ll reach out to you soon.