Data Engineer at Mediro ICT

A technology-focused role: the data engineer will design, configure, develop, and deploy data transformations.

  • Develop and deploy ETL pipelines. Data transformations will be built in Azure Databricks using Python and in Azure SQL using T-SQL (a minimal illustrative sketch follows this list).
  • Combine and organize data in a central data lake
  • Serve data for applications and analytics through a variety of technologies such as SQL Server, Synapse, Cosmos DB, and TSI.
  • Build dimension and fact transformation pipelines; a solid knowledge of standard BI concepts is therefore mandatory.
  • Build streaming pipelines leveraging IoT Hub, Event Hub, Databricks streaming, and other Azure streaming technologies.
  • Work in a fluid environment with changing demands while maintaining absolute attention to detail.
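As an illustration of the kind of batch transformation described in the first bullet above, the sketch below shows a minimal PySpark job of the sort typically run in Azure Databricks. The lake paths, table, and column names are hypothetical and are not taken from the posting.

```python
# Minimal illustrative sketch of a Databricks-style batch transformation in PySpark.
# All paths and column names here are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Read raw landing data from the data lake (hypothetical path).
raw = spark.read.format("delta").load("/mnt/datalake/raw/sensor_readings")

# Basic cleansing and a daily aggregation for downstream BI use.
curated = (
    raw.dropDuplicates(["reading_id"])
       .withColumn("reading_date", F.to_date("reading_timestamp"))
       .groupBy("device_id", "reading_date")
       .agg(F.avg("value").alias("avg_value"),
            F.count("*").alias("reading_count"))
)

# Write the curated result back to the lake for serving to applications and analytics.
curated.write.format("delta").mode("overwrite").save("/mnt/datalake/curated/daily_device_readings")
```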

Minimum requirements

  • Python – Competent
  • PySpark – Competent
  • SQL – Proficient
  • Solution Architecture – Competent
  • API Design – Proficient
  • Containers – Competent
  • CI/CD – Competent
  • Azure Cloud – Competent
  • Dataflow Models and Technology – Proficient
  • Data Engineering Design Patterns – Proficient
  • Data Mining – Beneficial

Find out more/Apply to this position
