ETL Engineer

📍 Location
President Park AH, Gauteng
⏰ Job Type
Full-time
📅 Posted
January 15, 2026

Job Description

Overview of the Role:
The ETL Engineer is responsible for designing, developing, maintaining, and optimizing Extract, Transform, Load (ETL) pipelines that support enterprise data integration, analytics, reporting, and operational requirements. The ideal candidate has strong technical skills in data engineering, SQL, and ETL tools, with an understanding of data modelling, governance, and best practices in data processing. The role ensures high-quality, accurate, and reliable data delivery for business and analytical use.

RESPONSIBLE FOR:

  1. ETL Development & Pipeline Engineering
     • Design, build, and optimize ETL/ELT workflows for structured and unstructured datasets.
     • Develop data pipelines for ingestion, transformation, and loading into data warehouses or data lakes.
     • Write efficient SQL queries, transformations, stored procedures, and data scripts.
     • Implement incremental loads, change data capture (CDC), and automated data processes.

  2. Data Integration
     • Integrate data from multiple internal and external systems (APIs, databases, files, applications).
     • Connect ETL pipelines seamlessly to cloud storage, relational databases, streaming sources, and third-party platforms.
     • Handle data profiling, cleansing, mapping, and validation.

  3. Data Quality & Validation
     • Develop data quality rules, validation checks, reconciliation processes, and exception handling.
     • Monitor data pipelines and resolve issues proactively.
     • Ensure accuracy, completeness, and consistency of all processed data.

  4. Data Architecture & Modelling Support
     • Collaborate with data architects to design scalable data models and ETL frameworks.
     • Implement Slowly Changing Dimensions (SCD), fact and dimension design, and OLAP/OLTP optimizations.
     • Contribute to data governance, metadata management, and documentation standards.

  5. Automation, Scheduling & Performance
     • Automate workflows using schedulers (Airflow, ADF, Control-M, etc.).
     • Tune ETL performance (query optimization, parallel processing, partitioning).
     • Assist in CI/CD pipeline setup for ETL deployments (Git, DevOps pipelines).

  6. Collaboration & Stakeholder Engagement
     • Work with BI teams, analysts, developers, and business owners to understand requirements.
     • Translate functional requirements into technical ETL designs.
     • Document data flows, mapping specifications, and transformation logic clearly.

QUALIFICATION REQUIREMENTS AND EXPERIENCE:

  • 2–5+ years' experience in data engineering or ETL development.
  • Strong SQL background is essential.
  • Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent experience.

KNOWLEDGE:

  • ETL/ELT tools: SSIS, Informatica, Talend, Matillion, Pentaho, Azure Data Factory, AWS Glue, GCP Dataflow, Airflow, dbt (for ELT), SAP Data Services (if applicable).
  • Programming and querying: strong SQL (required); Python or Scala for transformations, scripting, and automation; Bash/PowerShell for system scripting (nice-to-have).
  • Databases & storage: SQL Server, Oracle, MySQL, PostgreSQL; cloud storage (ADLS, S3, GCS); data warehouses (Snowflake, Redshift, BigQuery, Synapse).
  • Data engineering & integration: API integrations (REST, SOAP); file formats (CSV, Parquet, Avro, JSON, XML).
  • Data modelling: dimensional modelling, star schema, SCD types.
  • Data quality & governance: Great Expectations or similar DQ tools; metadata management tools (Collibra, Purview, Alation).
  • DevOps & deployment: Git / GitHub / Azure DevOps / GitLab; CI/CD pipelines for ETL; scheduler tools (Airflow, Control-M, cron jobs).

REQUIRED COMPETENCIES:

  • Solid analytical and problem-solving skills.
  • Strong communication and documentation capabilities.
  • Ability to work in agile or hybrid project environments.
  • Detail-oriented and committed to data accuracy.
  • Ability to handle large, complex datasets.

NICE-TO-HAVE SKILLS:

  • Experience with modern ELT (dbt, Databricks workflows).
  • Streaming experience (Kafka, Spark Streaming, EventHub).
  • Data lakehouse concepts (Delta Lake, Iceberg, Hudi).
  • Working knowledge of BI tools (Power BI, Tableau).
  • Banking or financial sector experience.

  • Certifications such as:
    o Azure/AWS/GCP Data Engineer
    o Informatica/Talend/ADF professional certifications

Start Your Week Right!

Apply now and make every Monday exciting.

Apply for this Position