Data Engineer (MS Fabric)

GT

Location:
United Kingdom

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Dreams is implementing Microsoft Dynamics 365 Finance & Operations and building a modern data and reporting layer using Microsoft Fabric and Medallion architecture. The project focuses on accelerating the extraction and transformation of ERP data into clean, business-ready datasets that support operational and financial reporting, with Excel as the primary consumption tool. The team works in a pragmatic, delivery-focused environment, closely collaborating with business analysts and end users to ensure data is reliable, accessible, and valuable from day one.

Job Responsibility:

  • Extract data from Microsoft Dynamics 365 Finance & Operations into Microsoft Fabric
  • Work directly with D365 F&O tables, data entities, and the underlying data model
  • Understand how transactions flow through the system (sales orders, inventory, finance, procurement, etc.)
  • Interpret business logic embedded in F&O rather than relying solely on exported data
  • Troubleshoot issues where the root cause sits inside F&O rather than in Fabric
  • Collaborate with our F&O functional and technical teams using the correct terminology and system understanding
  • Build and maintain data pipelines using Medallion architecture (Bronze / Silver / Gold)
  • Transform transactional ERP data into business-ready datasets
  • Expose structured datasets natively for Excel consumption
  • Support creation of simple, column-based Excel reports
  • Work closely with business analysts and end users to understand reporting needs
  • Collaborate with Dreams’ internal team within their Azure environment
  • Ensure secure and reliable access to data
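
In outline, the Bronze → Silver → Gold flow described above can be sketched as below. This is a plain-Python illustration only, not the actual implementation: in the role itself this logic would run in Fabric pipelines or notebooks, and the field names (SalesId, CustAccount, LineAmount) are assumed stand-ins for D365 F&O sales data.

```python
# Illustrative Medallion flow: raw ERP rows (Bronze) -> cleaned rows
# (Silver) -> business-ready aggregates (Gold). Field names such as
# SalesId / CustAccount / LineAmount are hypothetical stand-ins for
# D365 F&O data; in practice this would live in Fabric notebooks.

def to_silver(bronze_rows):
    """Clean Bronze rows: drop incomplete records, normalise types."""
    silver = []
    for row in bronze_rows:
        if not row.get("SalesId") or row.get("LineAmount") is None:
            continue  # reject records that fail basic quality checks
        silver.append({
            "sales_id": row["SalesId"].strip(),
            "customer": row.get("CustAccount", "UNKNOWN").strip(),
            "amount": float(row["LineAmount"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate Silver rows into a Gold dataset: total sales per
    customer, in a flat column-based shape that Excel can consume."""
    totals = {}
    for row in silver_rows:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + row["amount"]
    return [{"customer": c, "total_sales": round(t, 2)}
            for c, t in sorted(totals.items())]

bronze = [
    {"SalesId": "SO-001", "CustAccount": "ACME",   "LineAmount": "120.50"},
    {"SalesId": "SO-002", "CustAccount": "ACME",   "LineAmount": "79.50"},
    {"SalesId": "",       "CustAccount": "BAD",    "LineAmount": "10"},  # dropped
    {"SalesId": "SO-003", "CustAccount": "GLOBEX", "LineAmount": "300"},
]
gold = to_gold(to_silver(bronze))
```

The Gold output is one dict per row of a flat, column-based table, the shape that the "simple, column-based Excel reports" above imply.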

Requirements:

  • 5+ years of experience in data engineering, with a strong focus on Microsoft Azure–based data platforms
  • Hands-on experience with Microsoft Fabric and D365 Finance & Operations
  • Strong data engineering background (SQL, transformations, data modeling)
  • Experience working with transactional systems (ERP, finance, operations data)
  • Solid understanding of Medallion / layered data architecture
  • Experience exposing datasets for Excel-based consumption
  • Comfortable working in a business-facing environment
  • Strong English communication skills

Nice to have:

  • Broader Azure data platform experience (Synapse, ADF, etc.)
  • Power BI experience
  • Previous involvement in ERP implementations
  • Ability to travel to the UK for initial onboarding or workshops

Additional Information:

Job Posted:
February 18, 2026

Work Type:
Remote work

Similar Jobs for Data Engineer (MS Fabric)

Senior Azure Data Engineer

Seeking a Lead AI DevOps Engineer to oversee design and delivery of advanced AI/...
Location:
Poland
Salary:
Not provided
Lingaro
Expiration Date:
Until further notice
Requirements:
  • At least 6 years of professional experience in the Data & Analytics area
  • 1+ years of experience in (or acting in) a Senior Consultant or above role, with a strong focus on data solutions built in Azure and Databricks/Synapse (MS Fabric is nice to have)
  • Proven experience with Azure cloud-based infrastructure, Databricks, and at least one SQL implementation (e.g., Oracle, T-SQL, MySQL)
  • Proficiency in programming languages such as SQL, Python, and PySpark is essential (R or Scala nice to have)
  • Very good communication skills, including the ability to convey information clearly and specifically to co-workers and business stakeholders
  • Working experience with agile methodologies and supporting tools (JIRA, Azure DevOps)
  • Experience in leading and managing a team of data engineers, providing guidance, mentorship, and technical support
  • Knowledge of data management principles and best practices, including data governance, data quality, and data integration
  • Good project management skills, with the ability to prioritize tasks, manage timelines, and deliver high-quality results within designated deadlines
  • Excellent problem-solving and analytical skills, with the ability to identify and resolve complex data engineering issues
Job Responsibility:
  • Act as a senior member of the Data Science & AI Competency Center, AI Engineering team, guiding delivery and coordinating workstreams
  • Develop and execute a cloud data strategy aligned with organizational goals
  • Lead data integration efforts, including ETL processes, to ensure seamless data flow
  • Implement security measures and compliance standards in cloud environments
  • Continuously monitor and optimize data solutions for cost-efficiency
  • Establish and enforce data governance and quality standards
  • Leverage Azure services, as well as tools like dbt and Databricks, for efficient data pipelines and analytics solutions
  • Work with cross-functional teams to understand requirements and provide data solutions
  • Maintain comprehensive documentation for data architecture and solutions
  • Mentor junior team members in cloud data architecture best practices
What we offer:
  • Stable employment
  • “Office as an option” model
  • Workation
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support

Principal Competitive Technical Marketing Engineer

The Competitive Technical Marketing Engineer (TME) position plays a vital role w...
Location:
United States, Roseville
Salary:
115500.00 - 266000.00 USD / Year
Hewlett Packard Enterprise
Expiration Date:
Until further notice
Requirements:
  • BS or MS in Computer Science, Information Systems, or a related field; a non-technical degree combined with hands-on experience building customer networks can substitute
  • 10+ years of experience
  • In-depth understanding of protocols such as MP-BGP, BGP, IS-IS, EVPN, VxLAN, OSPF, Multicast, MPLS, STP, VLANs, IPv4, IPv6, etc.
  • Strong understanding of Data Center technologies such as Data Center Fabric/Spine architecture, EVPN-VXLAN Fabric, Data Center Interconnect (DCI) Border, Secure DCI, Multi-tier network design, IP Fabric, IPv6, storage networking
  • Excellent communication skills and comfort with public speaking
  • Ability to translate complex technical concepts to understandable language to match the level of the audience
Job Responsibility:
  • Collaborate with technical experts across a range of HPE Aruba Networking products and functional areas, including but not limited to data center and L2/L3 switching protocols (STP, QoS, BGP, OSPF, TCP/IP, IPv4, IPv6, etc.) and other networking services relevant to data center networking solutions and deployments
  • Bring up network topologies and solutions of varying complexity and compare other vendors' solutions to HPE Aruba Networking in a lab environment
  • Present competitive sessions at HPE Aruba Networking events and webinars for field, partner, and R&D engineers
  • Generate technical collateral, including testing and comparing HPE Aruba Networking against industry vendor solutions, competitive analysis reports, third-party testing, and sales collateral, and assist in the development and delivery of competitive updates when required
  • Help manage and maintain lab equipment and inventory
What we offer:
  • Health & Wellbeing
  • Personal & Professional Development
  • Unconditional Inclusion
  • Fulltime

Data Architect

We are seeking a Principal‑level Data Architect with deep expertise in enterpris...
Location:
Canada
Salary:
180000.00 - 250000.00 CAD / Year
Valtech
Expiration Date:
Until further notice
Requirements:
  • 10+ years of experience in data engineering, data architecture, or platform engineering
  • Experience designing or building enterprise data platforms on at least one of Azure, GCP, or AWS and Databricks
  • Deep expertise in SQL, Python, distributed data processing, and cloud-native data design
  • Significant experience with medallion/lakehouse architecture patterns
  • Strong knowledge of modern data platforms: Databricks, Azure Synapse, Microsoft Fabric, Delta Lake, BigQuery, etc.
  • Proven experience leading architecture across large programs and multiple concurrent projects
  • Experience with enterprise automation and integration using REST APIs
  • Strong communication skills and ability to engage confidently with senior leadership and clients
  • Experience in pre-sales, technical solutioning, or client-facing architecture leadership
Job Responsibility:
  • Design and own complex, enterprise-scale data architectures across MS Fabric, Azure, GCP, AWS, or Databricks serverless or hosted environments
  • Define and enforce architectural standards, patterns, and governance frameworks across ingestion, modeling, lineage, security, and orchestration
  • Shape AI‑enabled architecture approaches, including data foundations for ML, feature engineering, and low-latency operationalization pipelines
  • Act as a principal advisor to client technical leadership, helping shape long-term strategy, roadmaps, and modernization initiatives
  • Lead architectural direction during pre-sales cycles, including solutioning, scoping, estimation, and executive-level presentations
  • Anticipate downstream impacts of architectural decisions and maintain ownership when delivery teams or constraints require deviation from the original design
  • Architect highly available, distributed, fault‑tolerant data pipelines supporting batch and streaming workloads
  • Oversee migration and integration of complex, diverse data sources into Fabric, Azure, GCP, or Databricks platforms
  • Define medallion/lakehouse modeling patterns across Bronze/Silver/Gold zones or cloud equivalents
What we offer:
  • A comprehensive insurance plan, where you can choose the module that best suits your needs—Gold, Silver, or Bronze. The employer may contribute up to 80% of your coverage depending on the selected module. This plan includes short- and long-term disability coverage
  • Dialogue via Sun Life provides virtual healthcare services, allowing you to consult with a healthcare professional for emergencies, prescription renewals, and more. You also have access to the Employee and Family Assistance Program, as well as a complete mental health support program
  • A $500 Personal Spending Account, which can be used for healthcare reimbursements, gym memberships, public transit passes, office supplies, or contributions to your RRSP through Valtech
  • A retirement plan where Valtech will match 100% of your RRSP contributions through a Deferred Profit Sharing Plan (DPSP), up to a maximum of 4%. You can start contributing to your RRSP immediately, and to the DPSP after 3 months. The DPSP vests after 24 months of service
  • Access to a flexible vacation under Valtech's policy to support your work-life balance, with 5 days available during your probation period and a prorated amount calculated for the remainder of the year
  • Personal Technology Reimbursement: $30/month for every employee, offered on day 1
  • We close during the winter holidays and offer flexible scheduling throughout the year, so you can enjoy those sunny Friday afternoons—provided your weekly hours are completed
  • Fulltime

Head of Data, Automation & AI

Knovia Group, the UK’s leading apprenticeship provider, is on a bold mission to ...
Location:
United Kingdom
Salary:
90000.00 GBP / Year
Paragon Skills
Expiration Date:
Until further notice
Requirements:
  • Degree in computer science, data science, engineering, a related field or equivalent professional training/qualifications
  • Strong CPD record, keeping abreast of the latest in data architecture, governance, cloud data platforms, advanced analytics, connected systems, and the development of AI agents
  • Proven experience in data platform strategy, AI/ML enablement, or data transformation at scale
  • 5+ years' senior experience in a data scientist, data engineer, or developer role
  • 3+ years leading a function
  • Experience in a senior data, AI, or digital transformation leadership role
  • Track record of delivering enterprise-scale data infrastructure and AI/automation initiatives
  • Strong understanding of data architecture, governance, and cloud data platforms (e.g., Snowflake, Databricks, AWS/GCP, MS Fabric)
  • Deep expertise in cloud-based data architectures (e.g., AWS, Azure, or GCP), data engineering, and MLOps
  • Familiarity with tools like Databricks, Snowflake, MLflow, Airflow, dbt, and LLM technologies would be advantageous
Job Responsibility:
  • Lead the development of a modern data platform (data lake, warehouse, pipelines, BI suite)
  • Automate and integrate end-to-end business processes using tools like Workato
  • Develop and deploy AI agents to enhance operational efficiency and learner/employer experiences
  • Enhance our analytics capabilities to better understand and serve our customers
  • Shape our internal AI capability, from staff skills to leadership development
What we offer:
  • Generous Annual Leave: 21 days, increasing with length of service, plus a holiday purchase scheme
  • Holiday Benefits: 3 Knovia Days for our operational December closure and 8 Public Bank Holidays
  • Extra Day Off: Enjoy an additional day off to celebrate your birthday
  • Paid Volunteering Leave: Up to 3 days of paid leave for volunteering opportunities and corporate conscience initiatives
  • Perkbox: Access to a wide range of lifestyle benefits and wellness tools
  • Recognition and Long Service Awards: Celebrating the milestones and contributions of our colleagues
  • Fulltime

Senior Azure ETL Engineer

SoftClouds LLC is looking for an Azure ETL Developer with at least 8+ years o...
Location:
India, Hyderabad
Salary:
Not provided
SoftClouds
Expiration Date:
Until further notice
Requirements:
  • Proficiency in all of the Azure services below
  • MS Fabric
  • Azure
  • Synapse
  • Data pipelines
  • Lakehouse
  • Candidate must have solid research/troubleshooting and analytical skills
  • Ability to dig into code or documentation to solve issues and to leverage all available resources
  • Must be able to apply SDLC concepts and Agile Scrum methodologies
  • Has a proven track record of delivering solid, robust applications
Job Responsibility:
  • Design, build, and maintain scalable data pipelines and modern data Lakehouse architectures using Microsoft Fabric, Synapse Analytics, and Azure Data Factory
  • Implement end-to-end data pipelines across bronze, silver, and gold layers within Microsoft Fabric
  • Develop Dataflows Gen2, Spark notebooks, and Synapse pipelines to ingest data from varied sources (databases, APIs, Excel/CSV files)
  • Manage and optimize data storage within OneLake, including partitioning, schema evolution, and Delta tables
  • Integrate curated data from pipelines with Power BI semantic models for advanced analytics and reporting
  • Implement data governance, lineage tracking, and security using Microsoft Purview, Defender for Cloud, and Azure Key Vault
  • Monitor, schedule, and optimize pipeline performance using tools such as Monitoring Hub and Azure Monitor
  • Automate deployments and workflows using CI/CD pipelines (for Fabric and Azure Data Factory) and Power Automate where appropriate
  • Create interactive Power BI dashboards and reports, leveraging Direct Lake mode and connecting to Lakehouse/Warehouse datasets
  • Apply best practices for data quality, cleansing, and transformation (using KQL, PySpark, or SQL)
  • Fulltime
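
As a toy illustration of the schema-evolution point in the responsibilities above (in Microsoft Fabric, Delta tables handle this natively), merging CSV extracts whose schema grows over time might look like the sketch below; the file contents and column names are invented for the example.

```python
# Toy sketch of schema evolution during Bronze-layer ingestion: a later
# extract adds a column, and earlier rows are back-filled with None.
# Delta tables in Fabric provide this behaviour natively; the data and
# column names below are invented for illustration.
import csv
import io

def ingest(files):
    """Merge CSV extracts with evolving schemas into one row list."""
    columns, rows = [], []
    for text in files:
        for record in csv.DictReader(io.StringIO(text)):
            for col in record:
                if col not in columns:
                    columns.append(col)  # the schema only ever grows
            rows.append(dict(record))
    # Back-fill missing columns so every row shares the unified schema.
    return [{c: r.get(c) for c in columns} for r in rows]

day1 = "item,qty\nA,3\nB,5\n"
day2 = "item,qty,warehouse\nC,2,EAST\n"  # a new column appears later
table = ingest([day1, day2])
```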

Data Architect

We are seeking a highly skilled Data Architect to join our Enterprise Transforma...
Location:
Bulgaria, Sofia
Salary:
Not provided
HotSchedules Corporate
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field
  • 5+ years in data architecture, database administration, or data engineering, with a focus on enterprise and business systems
  • Strong knowledge of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra)
  • Experience with cloud platforms (e.g., AWS, Azure, GCP – BigQuery, Redshift, Snowflake)
  • Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, dbt, Informatica, Talend, Azure Data Factory, Synapse)
  • Strong knowledge of APIs and automation tools (e.g., Make, Zapier, MS Power Automate)
  • Familiarity with ERP, CRM, and HRIS integrations
  • Programming skills in Python, Java, or Scala
  • Deep understanding of data governance, master data management, and security/compliance (especially GDPR)
  • Excellent analytical, problem-solving, and communication skills
Job Responsibility:
  • Design, develop, and maintain the organization’s overall data architecture to support enterprise‑wide business applications, internal reporting, and analytics
  • Create and manage conceptual, logical, and physical data models for organizational data domains (HR, Finance, Sales, Operations, etc.)
  • Define and implement data governance policies, standards, and best practices across the enterprise
  • Oversee ETL/ELT processes and pipelines for integrating data from diverse business systems (ERP, CRM, HRIS, etc.)
  • Collaborate with internal stakeholders (business teams, IT, data engineers) to align data initiatives with organizational objectives
  • Optimize performance, cost, and scalability of data warehouses and internal reporting systems
  • Evaluate and recommend tools and platforms to enhance internal data and business application efficiency
  • Ensure compliance with GDPR and other relevant data security/privacy regulations
  • Responsible for the successful design and execution of data-related programs and projects
What we offer:
  • 25+ days off, as well as birthday day off and 4 charity days off per year
  • Flexible start and end of the working day and a hybrid working mode combining remote and in-office work
  • Team-centric atmosphere
  • Encouraging healthy lifestyle and work-life balance including supplemental health insurance
  • New parents bonus scheme
  • Fulltime

IT Data Engineer

We are growing Scanfil’s Automation & Integrations team and welcome you to join ...
Location:
Poland
Salary:
Not provided
Scanfil
Expiration Date:
Until further notice
Requirements:
  • Minimum 2 years of experience in a similar position
  • Higher education in IT/Technical or equivalent skills
  • Experience and knowledge of SQL (MS SQL and/or Oracle)
  • Experience with data integration & transformation (ADF, Fabric & Spark)
  • Knowledge of data architecture principles/patterns (e.g., Medallion architecture) and the ability to apply them in real-world solutions
  • Solid understanding of the software development process, including code quality, testing, and documentation
  • Professional working proficiency in English
  • Ability to communicate effectively with business users and local IT teams in a global environment
  • Excellent problem-solving & analytical skills
Job Responsibility:
  • Design, develop, monitor, and continuously improve data pipelines in our Azure Cloud Platform (ADF & Fabric) and SQL Server environment
  • Own and support integrations between key elements of data infrastructure
  • Monitor performance and optimize data warehouse provisioning processes
  • Proactively identify and implement improvements of existing data warehouse and reporting solutions
  • Design and implement the management dashboards and reports
  • Collaborate with and actively contribute to cross-functional project teams
What we offer:
  • Exciting and challenging working environment
  • Positive culture expressed in our core values
  • Cross-functional and cross-organizational career opportunities
  • Learning and development supported by Global Programs
  • Process-driven organization with strong team spirit
  • Flexible work hours
  • Medical Insurance, bonus system, and extensive social package
  • Fulltime

Data Engineer

We're looking for a hands-on Data Engineer with strong Microsoft Fabric experien...
Location:
Poland
Salary:
Not provided
RED Commerce - The Global SAP Solutions Provider
Expiration Date:
Until further notice
Requirements:
  • Strong experience with MS Fabric
  • Hands-on with Notebooks & PySpark
  • Solid understanding of Medallion Architecture
  • Advanced Python & SQL
  • Experience building Power BI dashboards
  • Excellent communication and stakeholder management skills
  • Proactive, hands-on, "can-do" mindset
Job Responsibility:
  • Design and develop data solutions using Microsoft Fabric
  • Work extensively with Notebooks, PySpark, and Medallion Architecture
  • Build and optimize data transformations using Python and SQL
  • Create clear, high-quality Power BI visualisations for business users
  • Act as a key interface between technical teams and marketing stakeholders
  • Contribute to team collaboration, knowledge sharing, and light people leadership
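
A minimal sketch of the Python-plus-SQL transformation work this role describes: sqlite3 stands in for a Fabric warehouse here, and the table and column names are invented assumptions, not part of the posting.

```python
# Minimal sketch of driving a SQL transformation step from Python.
# sqlite3 stands in for a Fabric warehouse; the table and column names
# (clicks, campaign, cost) are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clicks (campaign TEXT, cost REAL)")
conn.executemany(
    "INSERT INTO clicks VALUES (?, ?)",
    [("spring", 1.2), ("spring", 0.8), ("summer", 2.5)],
)

# Aggregate raw click costs into a per-campaign summary for reporting.
summary = conn.execute(
    "SELECT campaign, ROUND(SUM(cost), 2) AS total "
    "FROM clicks GROUP BY campaign ORDER BY campaign"
).fetchall()
conn.close()
```

The same pattern (stage raw rows, then aggregate with SQL into a reporting shape) carries over to Power BI-facing datasets.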