
DBT Junior Engineer


NTT DATA


Location: India, Remote


Contract Type:
Not provided


Salary: Not provided

Job Description:

The DBT Junior Engineer role involves translating Informatica mappings into dbt models, ensuring data quality, and collaborating with teams to optimize data structures. Candidates should have strong SQL skills and experience with dbt and Snowflake. This position offers an opportunity to work with cutting-edge technologies in a dynamic environment.

Job Responsibility:

  • Translate Informatica mappings, transformations, and business rules into dbt models (SQL) on Snowflake
  • Design and implement staging, core, and mart layers using standard dbt patterns and folder structures
  • Develop and maintain dbt tests (schema tests, data tests, custom tests) to ensure data quality and integrity
  • Implement snapshots, seeds, macros, and reusable components where appropriate
  • Collaborate with Snowflake developers to ensure physical data structures support dbt models efficiently
  • Work with functional teams to ensure functional equivalence between legacy Informatica outputs and new dbt outputs
  • Participate in performance tuning of dbt models and Snowflake queries
  • Integrate dbt with CI/CD pipelines (e.g., Azure DevOps, GitHub Actions) for automated runs and validations
  • Contribute to documentation of dbt models, data lineage, and business rules
  • Participate in defect analysis, bug fixes, and enhancements during migration and stabilization phases
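The modeling and testing duties above can be sketched as dbt project files. This is a minimal illustration, not from the posting: the source name, table, and columns (`raw.orders`, `order_id`, `order_amount`) are hypothetical.

```sql
-- models/staging/stg_orders.sql
-- A staging-layer model: rename and type-cast columns from a raw source.
select
    order_id::number        as order_id,
    customer_id::number     as customer_id,
    order_date::date        as order_date,
    amount::number(18, 2)   as order_amount
from {{ source('raw', 'orders') }}

-- tests/assert_no_negative_amounts.sql
-- A singular data test: dbt fails the test if this query returns any rows.
select *
from {{ ref('stg_orders') }}
where order_amount < 0
```

Schema tests (e.g. `unique` and `not_null` on `order_id`) would be declared in the model's YAML properties file; `dbt build` then runs models and tests together, which is what a CI/CD pipeline would invoke.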

Requirements:

  • 3–6 years of experience in Data Engineering / ETL / DW
  • 1–3+ years working with dbt (Core or Cloud)
  • Strong SQL skills, especially on Snowflake or another modern cloud DW
  • Experience with dbt concepts: models, tests, sources, seeds, snapshots, macros, exposures
  • Prior experience with Informatica (developer-level understanding of mappings/workflows) is highly desirable
  • Understanding of CI/CD practices and integrating dbt into automated pipelines
  • Knowledge of data modeling (dimensional models, SCDs, fact/dimension design)
  • Experience working in offshore delivery with onshore coordination
  • Good communication skills and ability to read/understand existing ETL logic and requirements documentation

Additional Information:

Job Posted:
February 16, 2026

Employment Type:
Fulltime
Work Type:
Remote work

Similar Jobs for DBT Junior Engineer

Sr. Data Engineer - Snowflake

Data Ideology is seeking a Sr. Snowflake Data Engineer to join our growing team ...
Location:
Salary: Not provided
Data Ideology
Expiration Date: Until further notice
Requirements:
  • 7+ years of experience in data engineering, data warehousing, or data architecture
  • 3+ years of hands-on Snowflake experience (performance tuning, data sharing, Snowpark, Snowpipe, etc.)
  • Strong SQL and Python skills, with production experience using dbt
  • Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data tooling (Airflow, Fivetran, Power BI, Looker, Informatica, etc.)
  • Prior experience in a consulting or client-facing delivery role
  • Excellent communication skills, with the ability to collaborate across technical and business stakeholders
  • SnowPro Core Certification required (or willingness to obtain upon hire)
  • Advanced Snowflake certifications preferred
Job Responsibility:
  • Design and build scalable, secure, and cost-effective data solutions in Snowflake
  • Develop and optimize data pipelines using tools such as dbt, Python, CloverDX, and cloud-native services
  • Participate in discovery sessions with clients to gather requirements and translate them into solution designs and project plans
  • Collaborate with engagement managers and account teams to help scope work and provide technical input for Statements of Work (SOWs)
  • Serve as a Snowflake subject matter expert, guiding best practices in performance tuning, cost optimization, access control, and workload management
  • Lead modernization and migration initiatives to move clients from legacy systems into Snowflake
  • Integrate Snowflake with BI tools, governance platforms, and AI/ML frameworks
  • Contribute to internal accelerators, frameworks, and proofs of concept
  • Mentor junior engineers and support knowledge sharing across the team
What we offer:
  • Flexible Time Off Policy
  • Eligibility for Health Benefits
  • Retirement Plan with Company Match
  • Training and Certification Reimbursement
  • Utilization Based Incentive Program
  • Commission Incentive Program
  • Referral Bonuses
  • Work from Home
  • Fulltime

Lead Analytics Engineer

As an Analytics Engineer in the Data Infrastructure Team at Prolific, you'll be a...
Location: United Kingdom
Salary: Not provided
Prolific
Expiration Date: Until further notice
Requirements:
  • Expertise in dbt & SQL: Deep experience with dbt and SQL to design, build, and maintain scalable data models
  • Cloud Technology Knowledge: Strong familiarity with cloud platforms like AWS, GCP etc
  • Data Accuracy Focus: Passion for ensuring high data quality through tests/assertions and robust documentation
  • Commercial Acumen: Ability to understand business needs and communicate effectively with non-technical stakeholders
  • Mentorship Ability: Advocate for best practices in logging and data modeling that support robust and effective analysis, reporting, and experimentation
  • Collaboration: Skilled at working cross-functionally and translating complex technical concepts into actionable insights for the business
  • Process-Driven: Proficiency in designing repeatable and scalable workflows for data transformation
Job Responsibility:
  • Building Data Models: Create complex dbt models, custom macros, and reusable packages. Optimise transformations and implement robust testing strategies to ensure data integrity and model performance
  • Ownership: Monitoring and maintaining dbt workflow jobs, ensuring smooth data refreshes and up-to-date pipelines. You will also be responsible for data models for BI analytics & company-level reporting
  • Ensuring Data Accuracy: Writing tests and assertions to validate data integrity and consistency across models
  • Documenting and Standardizing: Creating and maintaining thorough documentation of dbt processes to ensure best practices within the BI team
  • Translating Complex Data Concepts: Acting as a key communicator, translating technical data issues into understandable business terms for stakeholders
  • Mentoring Team Members: Supporting junior analysts and data engineers, especially in setting up experimentation platforms and data best practices
  • Collaborating Across Teams: Working closely with the product, engineering, and BI teams to ensure data infrastructure supports evolving business needs
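The "custom macros and reusable packages" duty above can be illustrated with a small dbt macro that centralises a recurring transformation. The macro name, logic, and the model/column names in the usage comment are hypothetical, not taken from the posting:

```sql
-- macros/cents_to_currency.sql
-- Reusable macro: convert an integer cents column to a decimal amount,
-- so the rounding rule lives in one place instead of in every model.
{% macro cents_to_currency(column_name) %}
    ({{ column_name }} / 100.0)::numeric(18, 2)
{% endmacro %}

-- Example usage inside a model:
-- select {{ cents_to_currency('amount_cents') }} as amount_usd
-- from {{ ref('stg_payments') }}
```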

Senior Analytics Engineer II

Articulate is looking for a Sr. Analytics Engineer to join our amazing Data team...
Location: United States
Salary: 137700.00 - 206500.00 USD / Year
Articulate
Expiration Date: Until further notice
Requirements:
  • 8+ years of experience in data or analytics roles
  • At least 5 years in analytics engineering, preferably within a fast-paced tech company or data-driven organization
  • Expert in end-to-end data workflows, from data collection to analysis and presentation, with most expertise in data modeling
  • Confident in translating raw data to reusable, business-friendly data models
  • Proven experience owning and leading the creation of reliable, flexible data models in an enterprise data warehouse
  • Expertise in SQL (writing and analyzing complex queries)
  • Expertise in Data Build Tool (dbt)
  • Expertise in Looker/Tableau (defining data models and measures)
  • Experience building semantic layers using dbt, Looker, or Snowflake’s semantic tables
  • Ability to write complex SQL, run ad-hoc data discovery, and build data models
Job Responsibility:
  • Provide flexible, trustworthy data models by transforming data from multiple sources with dbt, testing for quality, deploying to visualization tools such as Looker or Tableau, and publishing documentation
  • Collaborate directly with stakeholders to define problems and determine requirements for a solution
  • Set and maintain best practices for data models and processes, including project architecture and QA
  • Proactively identify gaps or design flaws in our data models, and bring recommendations for how to fix them
  • Own documentation of our tools and data, both for our team and for external users
  • Lead discovery on data tools and infrastructure, including setting goals for tooling, continuous analysis of existing tooling, and exploring new tools and features we may adopt
  • Participate as a technical leader in project planning, helping to create well-defined tasks to address data initiatives
  • Mentor more junior members of the data team
  • Represent the data team in some technical architecture and planning conversations
  • Identify and share data risks and dependencies in these contexts
What we offer:
  • Bonus eligible
  • Robust suite of benefits
  • Fulltime

Snowflake Junior Engineer

The Snowflake Junior Engineer role involves designing and implementing Snowflake...
Location:
Salary: Not provided
NTT DATA
Expiration Date: Until further notice
Requirements:
  • 2-4 years of experience in Data Engineering
  • Experience with Snowflake
  • Experience with SQL
Job Responsibility:
  • Designing and implementing Snowflake schemas
  • Optimizing SQL scripts
  • Collaborating with dbt developers

Senior Data Engineer

At Ingka Investments (Part of Ingka Group – the largest owner and operator of IK...
Location: Netherlands, Leiden
Salary: Not provided
IKEA
Expiration Date: Until further notice
Requirements:
  • Formal qualifications (BSc, MSc, PhD) in computer science, software engineering, informatics or equivalent
  • Minimum 3 years of professional experience as a (Junior) Data Engineer
  • Strong knowledge in designing efficient, robust and automated data pipelines, ETL workflows, data warehousing and Big Data processing
  • Hands-on experience with Azure data services like Azure Databricks, Unity Catalog, Azure Data Lake Storage, Azure Data Factory, DBT and Power BI
  • Hands-on experience with data modeling for BI & ML for performance and efficiency
  • The ability to apply such methods to solve business problems using one or more Azure Data and Analytics services in combination with building data pipelines, data streams, and system integration
  • Experience in driving new data engineering developments (e.g. applying cutting-edge data engineering methods to improve the performance of data integration, using new tools to improve data quality, etc.)
  • Knowledge of DevOps practices and tools including CI/CD pipelines and version control systems (e.g., Git)
  • Proficiency in programming languages such as Python, SQL, PySpark and others relevant to data engineering
  • Hands-on experience to deploy code artifacts into production
Job Responsibility:
  • Contribute to the development of D&A platform and analytical tools, ensuring easy and standardized access and sharing of data
  • Subject matter expert for Azure Databricks, Azure Data Factory and ADLS
  • Help design, build and maintain data pipelines (accelerators)
  • Document and make the relevant know-how & standard available
  • Ensure pipelines are consistent with relevant digital frameworks, principles, guidelines and standards
  • Support in understanding the needs of Data Product Teams and other stakeholders
  • Explore ways to create better visibility of data quality and data assets on the D&A platform
  • Identify opportunities for data assets and D&A platform toolchain
  • Work closely together with partners, peers and other relevant roles like data engineers, analysts or architects across IKEA as well as in your team
What we offer:
  • Opportunity to develop on a cutting-edge Data & Analytics platform
  • Opportunities to have a global impact on your work
  • A team of great colleagues to learn together with
  • An environment focused on driving business and personal growth together, with focus on continuous learning
  • Fulltime

Snowflake Junior Engineer

The Snowflake Junior Engineer role involves designing and implementing Snowflake...
Location: India, Remote
Salary: Not provided
NTT DATA
Expiration Date: Until further notice
Requirements:
  • 2-4 years of experience in Data Engineering / DW development
  • 2+ years of experience on Snowflake
  • Strong hands-on SQL skills and experience with large-scale DW solutions
  • Solid understanding of Snowflake architecture (warehouses, databases, schemas, stages, virtual warehouses, security roles)
  • Experience with cloud platforms, ideally Azure and its integration with Snowflake (ADLS/Blob, ADF/Synapse, Key Vault)
  • Prior exposure to migration from MPP platforms (Yellowbrick, Teradata, Netezza, etc.) to Snowflake is a plus
  • Familiarity with dbt, Databricks, or ETL tools (Informatica) is an advantage
  • Experience working in offshore delivery models, collaborating with onshore teams
Job Responsibility:
  • Design and implement Snowflake schemas, tables, views, materialized views, and stages to support migrated workloads
  • Recreate/translate Yellowbrick tables, views, and logic into Snowflake with functional equivalence
  • Collaborate with dbt developers to ensure dbt models are aligned with Snowflake best practices (clustering, micro-partitioning, warehouses)
  • Develop and optimize SQL scripts, stored procedures (Snowflake Scripting), and views used by dbt, Databricks, and BI tools
  • Implement and manage Snowflake roles, grants, and security models in line with enterprise standards
  • Support performance tuning for complex queries, including warehouse sizing, result caching, clustering, and statistics
  • Assist with data migration and validation between Yellowbrick and Snowflake (row counts, aggregates, and spot checks)
  • Contribute to CI/CD implementation for Snowflake objects (using Azure DevOps or similar)
  • Work closely with onshore architects and leads, attending overlap meetings in US time zones as required
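The data-validation duty above (row counts, aggregates, spot checks) is typically done with paired queries against the legacy copy and the migrated Snowflake table. A sketch of one such check; the database, schema, table, and column names are hypothetical:

```sql
-- Compare row counts and a checksum-style aggregate between a staged
-- copy of the legacy table and the migrated Snowflake table.
-- LEGACY_STG.PUBLIC.ORDERS and ANALYTICS.CORE.ORDERS are assumed names.
select
    'legacy'                     as side,
    count(*)                     as row_count,
    sum(hash(order_id, amount))  as row_checksum
from LEGACY_STG.PUBLIC.ORDERS
union all
select
    'snowflake',
    count(*),
    sum(hash(order_id, amount))
from ANALYTICS.CORE.ORDERS;
-- Matching row_count and row_checksum on both sides is strong (though
-- not conclusive) evidence of functional equivalence; spot checks on
-- sampled keys cover what an aggregate comparison can miss.
```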