Senior Data Engineer 2022-65 – IT-Online


An international mining company has a vacancy for a Data Engineer in the IM Regional Support Office.

The objective of Global Information Management (IM) is to enable the delivery of business processes, communication, collaboration and knowledge management through the deployment, support and maintenance of the company's IT/OT technology, infrastructure, applications and technical systems.

Requirements: an undergraduate degree (Bachelor's or equivalent) in a relevant IM discipline and/or technical skills and certification, with several years of relevant experience in a similar role.

Role specific knowledge:

  • Data Lake
  • Data Modeling
  • Data Architecture
  • Azure Data Environment
  • Python – Competent
  • PySpark – Competent
  • SQL – Proficient
  • Solution Architecture – Competent
  • API Design – Proficient
  • Containers – Competent
  • CI/CD – Competent
  • Azure Cloud – Competent
  • Dataflow Models and Technology – Proficient
  • Data Engineering Design Patterns – Proficient

Data Mining – Specialized Areas:
Unstructured Data – Applies to /wiki/spaces/DAGDG/pages/[Phone Number Removed] and all products that process large-scale data such as images.
Solid experience building pipelines and shipping files at scale, ideally using Azure services such as AzCopy and Azure Data Lake. Experience with unstructured file metadata management, conversions, normalization and related workflows. Experience building analytics jobs that scale on technologies such as Databricks or Azure Batch.
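As an illustration of the metadata normalization work described above, a minimal sketch in Python (the field names and normalization rules are assumptions for illustration, not the company's actual schema):

```python
from pathlib import PurePosixPath

def normalize_file_metadata(record: dict) -> dict:
    """Normalize a raw file-metadata record into a consistent shape.

    Assumed raw fields: 'path' (a POSIX-style path string) and 'size'
    (bytes, as an int or numeric string); both names are illustrative.
    """
    path = PurePosixPath(record["path"])
    return {
        "name": path.name,
        "extension": path.suffix.lower().lstrip("."),  # '.TIFF' -> 'tiff'
        "size_bytes": int(record["size"]),
        "directory": str(path.parent),
    }

raw = {"path": "/mnt/datalake/scans/IMG_0042.TIFF", "size": "1048576"}
print(normalize_file_metadata(raw))
```

In a real pipeline, a function like this would typically run per-record inside a distributed job (e.g. a PySpark UDF on Databricks) rather than on single dictionaries.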

Security knowledge:
Acts as an exceptional and consistent role model for security practices, with a deep understanding of the importance of security.

  • Develop ETL pipelines. Data transformations are developed in Azure Databricks using Python and on Azure SQL using T-SQL, and deployed using ARM templates
  • Combine and organize data in a central data lake
  • Serve data for applications and analytics through a variety of technologies such as SQL Server, Synapse, Cosmos DB and Time Series Insights (TSI)
  • Build dimension and fact transformation pipelines; solid knowledge of standard BI concepts is therefore mandatory
  • Build streaming pipelines leveraging IoT Hub, Event Hub, Databricks streaming and other Azure streaming technologies
  • Work in a fluid environment with changing demands while maintaining absolute attention to detail.
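The dimension/fact split mentioned above can be sketched in plain Python; this is a hypothetical illustration of the concept (record fields and keys are assumptions), not the company's actual pipeline, which per the posting would run in Azure Databricks:

```python
def build_star_schema(records):
    """Split raw sales-like records into a product dimension table and a
    fact table, assigning surrogate keys to each distinct product.

    Pure-Python sketch of a standard BI dimension/fact transformation;
    all field names ('product', 'quantity', 'unit_price') are assumed.
    """
    product_keys = {}          # natural key -> surrogate key
    dim_rows, fact_rows = [], []
    for rec in records:
        name = rec["product"]
        if name not in product_keys:
            product_keys[name] = len(product_keys) + 1  # new surrogate key
            dim_rows.append({"product_key": product_keys[name], "product": name})
        fact_rows.append({
            "product_key": product_keys[name],
            "quantity": rec["quantity"],
            "amount": rec["quantity"] * rec["unit_price"],
        })
    return dim_rows, fact_rows

raw = [
    {"product": "ore-sample", "quantity": 2, "unit_price": 10.0},
    {"product": "drill-bit", "quantity": 1, "unit_price": 55.0},
    {"product": "ore-sample", "quantity": 3, "unit_price": 10.0},
]
dims, facts = build_star_schema(raw)
print(dims)   # two distinct products, each with a surrogate key
print(facts)  # three fact rows referencing the dimension keys
```

The same shape of transformation scales up in PySpark by replacing the loop with joins against a dimension DataFrame.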

Desired skills:

  • Data Lake
  • Data Modeling
  • Azure Data Environment
  • Data Architecture
