Data Engineer Intern

IKEA (https://www.ikea.com)

Location:
China, Shanghai

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We deliver sustainable, extraordinary growth by creating new, unique, inspiring and convenient ways of meeting our customers. We deliver a multi-channel experience that adds value to the many people and inspires a home furnishing movement.

Job Responsibility:

  • Drive the transformation of IKEA into a more data-driven company by building and operating modern platforms and systems that are aligned with our constantly evolving data and AI landscape
  • Build the AI ecosystem at a top retail company
  • Work with PB-level data from IKEA's ecosystems (online channels, retail, customer fulfillment, etc.)
  • Work with top talent and get a jump start on your career

Requirements:

  • Good interpersonal skills with the ability to collaborate, network, and build strong relations with team members and stakeholders
  • Good knowledge of advanced data structures and distributed computing
  • Good knowledge of AI and machine learning concepts and algorithms
  • Broad knowledge of programming languages (e.g., Python, Java, Go, or Scala), including concepts from functional and object-oriented programming paradigms
  • Experience with AI/ML frameworks such as TensorFlow and PyTorch
  • Project experience with prompt tuning or fine-tuning mainstream large language models such as ChatGPT (3.5, 4.0) and Meta Llama 2 (see the illustrative sketch after this list)
  • Fluent in English
  • Able to work in the office at least 3 days per week, with an internship duration of at least 6 months
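
To make the prompt-tuning and fine-tuning requirement concrete, here is a minimal, illustrative sketch of parameter-efficient fine-tuning (LoRA) using the Hugging Face transformers and peft libraries. The libraries, model name, and hyperparameters are assumptions chosen for illustration; the posting does not prescribe any particular stack.

```python
# Illustrative sketch only: attach LoRA adapters to a causal language model
# for parameter-efficient fine-tuning. The model name and hyperparameters
# are hypothetical choices, not requirements from this role.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "meta-llama/Llama-2-7b-hf"  # assumed example; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Train only small low-rank adapter matrices instead of all model weights.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],  # attention projections in Llama-style models
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all parameters
```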

Additional Information:

Job Posted:
March 26, 2025

Employment Type:
Part-time
Work Type:
Hybrid work

Similar Jobs for Data Engineer Intern

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location:
India, Chennai, Madurai, Coimbatore
Salary:
Not provided
OptiSol Business Solutions (optisolbusiness.com)
Expiration Date
Until further notice
Requirements:
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark (see the illustrative sketch after this list)
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
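
As a rough illustration of the window-function and PySpark points above, the sketch below computes a per-customer running total with a PySpark window. The table and column names are made up for the example and are not taken from the role.

```python
# Illustrative sketch only: a PySpark window function computing a running
# order total per customer. Table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window-demo").getOrCreate()

orders = spark.createDataFrame(
    [("c1", "2024-01-01", 10.0),
     ("c1", "2024-01-05", 25.0),
     ("c2", "2024-01-02", 40.0)],
    ["customer_id", "order_date", "amount"],
)

# Equivalent SQL: SUM(amount) OVER (PARTITION BY customer_id ORDER BY order_date)
w = Window.partitionBy("customer_id").orderBy("order_date")
orders.withColumn("running_total", F.sum("amount").over(w)).show()
```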
Job Responsibility:
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
What we offer:
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility
Employment Type: Full-time

Product Data Engineering Intern

Product Data Engineering Intern role at Hewlett Packard Enterprise. This is an o...
Location:
Puerto Rico, Aguadilla
Salary:
Not provided
Hewlett Packard Enterprise (https://www.hpe.com/)
Expiration Date
Until further notice
Requirements:
  • Currently pursuing a Bachelor's degree in Systems Engineering, Industrial Engineering or Computer Engineering
  • Familiarity with SAP
  • Basic programming or scripting knowledge (e.g., Python, Java, C++)
  • Strong interest in high-tech and passion for learning
  • Excellent communication and interpersonal skills
  • Strong problem-solving and analytical skills
  • Time management skills and working with strict deadlines
  • A collaborative, solution-focused mindset and overall sense of urgency
Job Responsibility:
  • Support senior team members on assigned technical projects
  • Help identify and troubleshoot technical issues, providing support and suggesting solutions
  • Assist with maintaining and updating hardware, software, and other technical systems
  • Participate in team activities by attending team meetings, learning about project methodologies, and collaborating effectively with colleagues
  • Actively engage in learning about new technologies and methodologies relevant to work
  • Fulfill tasks and responsibilities assigned by a supervisor in a timely and efficient manner
  • Participate in periodic reviews to share updates and incorporate feedback on assigned projects/initiatives
What we offer:
  • Health & Wellbeing benefits
  • Personal & Professional Development programs
  • Unconditional Inclusion environment
  • Comprehensive suite of benefits supporting physical, financial and emotional wellbeing
Employment Type: Full-time

Data Engineer, 2025/2026 Intern

Join Atlassian as an intern and spend your summer with us having an impact on ho...
Location:
Australia, Sydney
Salary:
Not provided
Atlassian (https://www.atlassian.com)
Expiration Date
Until further notice
Requirements:
  • Be currently enrolled in a Bachelor's or Master's program in Software Engineering, Computer Science, or another related technical field, completing your studies before January 2027
  • Experience programming with Python, or other related object-oriented programming languages
  • Knowledge of data structures, in particular how they are implemented and how to apply them to meet data challenges
  • Proficiency in SQL and experience with relational databases
  • Demonstrated interest in the Data Engineering field through academic coursework, previous work or internship experience, or personal projects
Job Responsibility:
  • Influence product teams
  • Inform Data Science and Analytics Platform teams
  • Partner with data consumers and products to ensure quality and usefulness of data assets
  • Help strategize measurement, collect data, and generate insights
What we offer:
  • Health coverage
  • Paid volunteer days
  • Wellness resources
Employment Type: Full-time

Data and Analytics Engineer Intern

The Data and Analytics Engineer Intern will assist in designing, building, and t...
Location:
United States, Irvine
Salary:
17.00 - 22.00 USD / Hour
Trace3 (trace3.com)
Expiration Date
Until further notice
Requirements:
  • Enrollment in the Junior or Senior year of an undergraduate program or master’s program at an accredited college or university
  • Candidates should be pursuing a field of study applicable to the Data Intelligence internship
  • Cumulative grade point average (GPA) of 3.0 or better
  • Ability to work independently on assigned tasks and accept direction on given assignments
  • Self-motivated individuals with a customer mindset and desire to help people
  • Enthusiasm for technical problem solving with attention to detail and strong communication skills
  • Ability to learn and research in a dynamic and engaging environment
  • Availability to work 40 hours per week throughout the internship
Job Responsibility:
  • Assist in designing, building, and testing data platforms and analytics solutions to generate actionable insights for our customers
  • Partner with our Data Intelligence Team to determine the best approach around data ingestion, structure, and storage, then work with the team to ensure these are implemented accurately
  • Contribute ideas on how to make our customers’ data more valuable and work with members of Trace3’s Engineering Team to implement solutions
What we offer:
  • Comprehensive medical, dental and vision plans for you and your dependents
  • 401(k) Retirement Plan with Employer Match, 529 College Savings Plan, Health Savings Account, Life Insurance, and Long-Term Disability
  • Competitive Compensation
  • Training and development programs
  • Major offices stocked with snacks and beverages
  • Collaborative and cool culture
  • Work-life balance and generous paid time off
Employment Type: Full-time

Crypto Data Scientist / Machine Learning - LLM Engineer Intern

Token Metrics is searching for a highly capable machine learning engineer to opt...
Location:
United States, Houston
Salary:
Not provided
Token Metrics (tokenmetrics.com)
Expiration Date
Until further notice
Requirements:
  • Bachelor's degree in computer science, data science, mathematics, or a related field
  • Master’s degree in computational linguistics, data science, data analytics, or similar will be advantageous
  • At least two years' experience as a machine learning engineer
  • Advanced proficiency with Python, Java, and R
  • Extensive knowledge of ML frameworks, libraries, data structures, data modeling, and software architecture
  • LLM fine-tuning experience and working with LLM Observability
  • In-depth knowledge of mathematics, statistics, and algorithms
  • Superb analytical and problem-solving abilities
  • Great communication and collaboration skills
  • Excellent time management and organizational abilities
Job Responsibility:
  • Consulting with the manager to determine and refine machine learning objectives
  • Designing machine learning systems and self-running artificial intelligence (AI) to automate predictive models
  • Transforming data science prototypes and applying appropriate ML algorithms and tools
  • Ensuring that algorithms generate accurate user recommendations
  • Solving complex problems with multi-layered data sets, as well as optimizing existing machine learning libraries and frameworks
  • Developing ML algorithms to analyze huge volumes of historical data to make predictions
  • Stress testing, performing statistical analysis, and interpreting test results for all market conditions
  • Documenting machine learning processes
  • Keeping abreast of developments in machine learning

Data Engineer

Become a player in our data engineering team, grow on a personal level and help ...
Location:
Serbia, Novi Beograd
Salary:
Not provided
MDPI (mdpi.com)
Expiration Date
Until further notice
Requirements:
  • A university degree, ideally in Computer Science or related science, technology or engineering field
  • 2+ years of relevant work experience in data engineering roles
  • Experience in data acquisition, data lakes, warehousing, modeling, and orchestration
  • Proficiency in SQL (including window functions and CTEs)
  • Proficiency in RDBMS (e.g., MySQL, PostgreSQL)
  • Strong programming skills in Python (with libraries like Polars, optionally Arrow / PyArrow API)
  • First exposure to OLAP query engines (e.g., Clickhouse, DuckDB, Apache Spark)
  • Familiarity with Apache Airflow (or similar tools like Dagster or Prefect; see the illustrative sketch after this list)
  • Strong teamwork and communication skills
  • Ability to work independently and manage your time effectively
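
As a rough illustration of the orchestration tooling named above, here is a minimal Apache Airflow DAG written with the TaskFlow API, assuming a recent Airflow 2.x release. The DAG name, schedule, and task bodies are placeholders for illustration, not part of the role.

```python
# Illustrative sketch only: a tiny daily Airflow DAG with an extract task
# feeding a load task. Task logic is a placeholder, not a real pipeline.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list:
        return [{"id": 1, "value": 42}]  # stand-in for a real source query

    @task
    def load(rows: list) -> None:
        print(f"loading {len(rows)} rows")

    load(extract())

example_etl()
```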
Job Responsibility:
  • Assist in designing, building, and maintaining efficient data pipelines
  • Work on data modeling tasks to support the creation and maintenance of data warehouses
  • Integrate data from multiple sources, ensuring data consistency and reliability
  • Collaborate in implementing and managing data orchestration processes and tools
  • Help establish monitoring systems to maintain high standards of data quality and availability
  • Work closely with the Data Architect, Senior Data Engineers, and other members across the organization on various data infrastructure projects
  • Participate in the optimization of data processes, seeking opportunities to enhance system performance
What we offer:
  • Competitive salary and benefits package

Senior Data Engineer

As a Senior Software Engineer, you will play a key role in designing and buildin...
Location:
United States
Salary:
156000.00 - 195000.00 USD / Year
Apollo.io (apollo.io)
Expiration Date
Until further notice
Requirements:
  • 5+ years experience in platform engineering, data engineering or in a data facing role
  • Experience in building data applications
  • Deep knowledge of the data ecosystem with an ability to collaborate cross-functionally
  • Bachelor's degree in a quantitative field (Physical / Computer Science, Engineering or Mathematics / Statistics)
  • Excellent communication skills
  • Self-motivated and self-directed
  • Inquisitive, able to ask questions and dig deeper
  • Organized, diligent, and great attention to detail
  • Acts with the utmost integrity
  • Genuinely curious and open
Job Responsibility:
  • Architect and build robust, scalable data pipelines (batch and streaming) to support a variety of internal and external use cases
  • Develop and maintain high-performance APIs using FastAPI to expose data services and automate data workflows (see the illustrative sketch after this list)
  • Design and manage cloud-based data infrastructure, optimizing for cost, performance, and reliability
  • Collaborate closely with software engineers, data scientists, analysts, and product teams to translate requirements into engineering solutions
  • Monitor and ensure the health, quality, and reliability of data flows and platform services
  • Implement observability and alerting for data services and APIs (think logs, metrics, dashboards)
  • Continuously evaluate and integrate new tools and technologies to improve platform capabilities
  • Contribute to architectural discussions, code reviews, and cross-functional projects
  • Document your work, champion best practices, and help level up the team through knowledge sharing
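
To make the FastAPI responsibility above concrete, here is a minimal sketch of a data-service endpoint. The route, response model, and stubbed return value are hypothetical, chosen only for illustration.

```python
# Illustrative sketch only: a minimal FastAPI endpoint exposing pipeline
# health status. The route and response model are hypothetical examples.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class PipelineStatus(BaseModel):
    name: str
    healthy: bool

@app.get("/pipelines/{name}/status", response_model=PipelineStatus)
def pipeline_status(name: str) -> PipelineStatus:
    # A real service would look up pipeline metadata; this is a stub.
    return PipelineStatus(name=name, healthy=True)
```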
What we offer:
  • Equity
  • Company bonus or sales commissions/bonuses
  • 401(k) plan
  • At least 10 paid holidays per year
  • Flex PTO
  • Parental leave
  • Employee assistance program and wellbeing benefits
  • Global travel coverage
  • Life/AD&D/STD/LTD insurance
  • FSA/HSA and medical, dental, and vision benefits
Employment Type: Full-time

Lead Data Engineer

As a Lead Data Engineer at Rearc, you'll play a pivotal role in establishing and...
Location:
United States
Salary:
Not provided
Rearc (rearc.io)
Expiration Date
Until further notice
Requirements:
  • 10+ years of experience in data engineering, data architecture, or related technical fields
  • Proven ability to design, build, and optimize large-scale data ecosystems
  • Strong track record of leading complex data engineering initiatives
  • Deep hands-on expertise in ETL/ELT design, data warehousing, and data modeling
  • Extensive experience with data integration frameworks and best practices
  • Advanced knowledge of cloud-based data services and architectures (AWS Redshift, Azure Synapse Analytics, Google BigQuery, or equivalent)
  • Strong strategic and analytical thinking
  • Proficiency with modern data engineering frameworks (Databricks, Spark, lakehouse technologies like Delta Lake)
  • Exceptional communication and interpersonal skills
Job Responsibility:
  • Engage deeply with stakeholders to understand data needs, business challenges, and technical constraints
  • Translate stakeholder needs into scalable, high-quality data solutions
  • Implement with a DataOps mindset using tools like Apache Airflow, Databricks/Spark, Kafka
  • Build reliable, automated, and efficient data pipelines and architectures
  • Lead and execute complex projects
  • Provide technical direction and set engineering standards
  • Ensure alignment with customer goals and company principles
  • Mentor and develop data engineers
  • Promote knowledge sharing and thought leadership
  • Contribute to internal and external content
What we offer:
  • Comprehensive health benefits
  • Generous time away and flexible PTO
  • Maternity and paternity leave
  • Access to educational resources with reimbursement for continued learning
  • 401(k) plan with company contribution