Throughout our recruitment services, we act as trusted advisors – supporting and guiding candidates beyond the hiring process to help them make the best possible decision for their career and future. Our goal is not just to place talent in new positions, but to help them take meaningful steps forward. Every candidate receives personalized attention – a value reflected in our award-winning, people-focused approach. With SWICON, you’re in good hands – especially when it matters most.
Introduction
Swicon Group has been one of the leading players in the IT arena for almost a decade, and has been present in Romania since 2017. Our wide range of services and highly trained professionals allow us to shape our solutions to fully meet the ideas and wishes of our clients. We are proud to count leading banks, insurance and telecommunications companies, as well as large FMCG corporations and SSCs, among our highly prestigious partners.
Tasks
Job Description
You will work closely with cross-functional teams to deliver high-quality data solutions in domains such as Supply Chain, Finance, Operations, Customer Experience, HR, Risk Management, and Global IT.
Key Responsibilities:
- Contribute to the technical plan for the migration, covering data ingestion, transformation, storage, and access control in Azure Data Factory and Azure Data Lake.
- Design and implement scalable, efficient data pipelines in Azure Databricks to ensure smooth data movement from multiple sources.
- Develop scalable, reusable frameworks for ingesting data sets.
- Ensure data quality and integrity throughout the entire data pipeline, implementing robust data validation and cleansing mechanisms.
- Work with event-based/streaming technologies to ingest and process data.
- Provide support to the team, resolving any technical challenges or issues that may arise during the migration and post-migration phases.
- Stay up to date with the latest advancements in cloud computing, data engineering, and analytics technologies, and recommend best practices and industry standards for implementing the data lake solution.
Expectations
Qualifications
- 5+ years of experience in the data engineering field.
- Hands-on working experience with Azure Databricks.
- Experience in data modelling and source-system analysis.
- Familiarity with PySpark.
- Mastery of SQL.
- Knowledge of components: Azure Data Factory, Azure Data Lake, Azure SQL DW, Azure SQL.
- Experience with the Python programming language for data engineering purposes.
- Ability to conduct data profiling, cataloging, and mapping for technical design and construction of technical data flows.
- Experience with data visualization/exploration tools.
- Excellent communication skills, with the ability to effectively convey complex ideas to technical and non-technical stakeholders.
- Strong team player with excellent interpersonal and collaboration skills.
Advantages
- Excellent learning opportunities! Variety in your work and a fantastic, informal work atmosphere.
- A challenging environment that will stimulate you to grow as a professional!
- A great foundation for your career!
- Grow with us! Your role will develop over time, expanding your experience and responsibilities so you can advance faster and further in your career.
- A friendly and welcoming work environment with an international working atmosphere where you can practice and learn new language skills with a diverse mix of colleagues and clients.
- A dynamic work environment with a culture that is open, innovative, and performance-oriented.
Employer's offer