Within our Competence Management service, selected professionals don't just join a project or position – they become valued members of the SWICON team. Our colleagues are our top priority: we offer personalized support, dedicated attention, and a true professional community – because no one is just a number here. This people-first approach has earned us multiple HR awards. Join us and be part of an inspiring, future-driven, and recognized team!

Introduction

Our partner is the world's largest humanitarian network, which has been carrying out its activities successfully around the world for more than 100 years. They continuously work toward further success with the help of a dedicated workforce, and their Digitalization and IT Department is now expanding to further support global projects and core activities.

Tasks

The scope is to support the implementation of reporting solutions for the D365 ERP system within the "Data analysis and visualization" project, as part of the ERP implementation program.

Technical context

As a part of the ERP program implementation team, the Full Stack Data & Analytics Developer will be responsible for the implementation of data visualizations based on the business requirements collected for each business function (Human Resources, Finance, Logistics, Resource mobilization, Project management). The Full Stack Data & Analytics Developer will implement the solution based on agreed architecture guidelines.

The Data Platform is based on Microsoft Fabric and Synapse using Data Vault and Dimensional modeling techniques for consumption in Power BI. The Full Stack Data & Analytics Developer will be responsible for the end-to-end development of reports including data modelling, data engineering, and Power BI report development.
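As a purely illustrative sketch of the Data Vault modeling technique mentioned above (not part of the partner's actual codebase – the `hub_hash_key` function and the Supplier example are invented for this illustration): hub records are typically keyed by a hash of the normalized business key, so the same real-world entity arriving with different formatting resolves to one hub row.

```python
import hashlib

def hub_hash_key(business_key: str) -> str:
    """Data Vault style hash key: MD5 of the normalized business key."""
    return hashlib.md5(business_key.strip().upper().encode("utf-8")).hexdigest()

# Hypothetical source rows for a Supplier hub
rows = [
    {"supplier_id": "S-001", "name": "Acme"},
    {"supplier_id": "s-001 ", "name": "Acme Ltd"},  # same business key, different formatting
    {"supplier_id": "S-002", "name": "Globex"},
]

# Hub table: one row per distinct business key, addressed by its hash key
hub = {hub_hash_key(r["supplier_id"]): r["supplier_id"].strip().upper() for r in rows}
print(len(hub))  # two distinct business keys -> two hub records
```

In a real Fabric/Synapse pipeline the same normalization-then-hash step would run in SQL or PySpark, but the principle is identical: deterministic keys make hubs, links, and satellites loadable independently and in parallel.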

Scope of the current request

Deliver the Power BI reporting solution end to end: start from the business requirement document, enable the required data in the Data Platform, create Power BI reports (including paginated reports, with Git version control), respond to user feedback, and deploy to production.

The role will initially carry out the following tasks:

Requirements and Power BI


  • Understand the business process, data model, business logic, modelling logic, and ways to identify missing data.
  • Liaise with Business Contact or Business Analyst to understand the requirements.
  • Identify data requirements and data elements missing from the Data Platform.
  • Design and implement dashboards and reports in Power BI that provide actionable information for the business users.
  • Follow through UAT identified issues and their resolution.
  • Publish data and reports to production.

Microsoft Fabric/Azure Synapse


  • Develop database schemas, define relationships, and optimize performance based on the specific requirements of the data solution.
  • Develop data products using SQL and PySpark.
  • Implement data quality checks and processes to ensure data accuracy, consistency, and completeness.
  • Implement security measures to protect sensitive data and ensure compliance with relevant regulations and standards.
  • Optimize solutions for performance and scalability.
  • Identify and resolve performance bottlenecks, optimize SQL queries, and fine-tune data processing workflows.
  • Work collaboratively with cross-functional teams, including architects, data scientists, data analysts, and business stakeholders.
  • Document data engineering processes, system architecture, and data flow diagrams for knowledge sharing and future reference.

Expectations

For this mission, we are looking for a full-time Full Stack Data & Analytics Developer with the following skillsets:

  • Independent and pro-active problem solving, self-starter
  • Can easily engage and interact with business and other developers.
  • 5+ years of progressively responsible postgraduate experience in data engineering with a focus on big data modelling, and 3+ years in data engineering (Microsoft Fabric or Azure Synapse) and report development (Power BI).
  • Proven track record as a Data Engineer or similar role, and in Power BI development, including Paginated or SSRS reports.
  • Experience in Data Lake and Data Lakehouse implementation (e.g., Microsoft Fabric, Azure Synapse, Databricks, Snowflake, Microsoft SQL Server, Apache Spark/Hadoop, or other similar big data or SQL databases).
  • Experience in Data Vault and dimensional data modeling techniques.
  • Proficiency in programming languages such as Python, SQL, or Java.
  • Proficiency in the Apache Spark framework.
  • Experience in data governance, architecture, and handling large datasets and data pipelines.
  • Experience in data product development, business value case development, and data product deployment is preferred.

Employer's offer

Cafeteria

Health insurance

Tags

#Azure Synapse #Microsoft Fabric

Apply for this position

Are you suitable for this position? Click on the apply button and upload your CV!

Share this position

Share this position on your social media platform to help a friend find their dream job!
Share on Linkedin
Share on Facebook