
Sr. Big Data Cloud Engineer


Vodafone

Location:
Romania, Bucuresti

Contract Type:
Not provided

Salary:
Not provided

Job Description:

At Vodafone, we’re not just shaping the future of connectivity for our customers – we’re shaping the future for everyone who joins our team. When you work with us, you’re part of a global mission to connect people, solve complex challenges, and create a sustainable and more inclusive world. If you want to grow your career whilst finding the perfect balance between work and life, Vodafone offers the opportunities to help you belong and make a real impact.

Job Responsibility:

  • Drive the definition and delivery of IT solutions by gathering requirements, assisting clients in defining functional specifications, and translating them into technical solutions
  • Lead the evaluation and implementation of complex projects, ensuring technical solutions align with business needs and architectural standards
  • Validate objectives, detailed technical solutions, and effort estimates provided by development teams
  • Provide expert recommendations and solutions to project managers and management teams for system implementation and project execution
  • Ensure overall solutions are aligned with long-term architectural plans and resolve any discrepancies
  • Actively participate in solution design, software development/configuration, and provide support to other developers and testing teams
  • Maintain comprehensive project documentation and deliver technical presentations to internal teams and clients
  • Support and transfer knowledge throughout the system lifecycle, troubleshoot technical issues, and ensure smooth handover to support and maintenance teams
  • Monitor system performance, propose improvements, and contribute to the development strategy for managed systems
  • Proactively identify outdated technologies or capacity needs and recommend improvements in line with Vodafone Global standards
  • Adhere to internal policies, compliance, and participate in mandatory training and professional development programs

Requirements:

  • University degree (or ongoing studies) in IT or a technical field
  • 3–5 years of experience working in complex organizations and delivering large-scale or transformational projects
  • Advanced knowledge of SQL and experience with at least one database management system (Oracle, MySQL, SQL Server, PostgreSQL, etc.)
  • Strong skills in developing database structures and models using SAS Base, PL/SQL, Python, and BigQuery (on-prem and cloud); a brief query sketch follows this list
  • Solid foundation in software development processes, best practices, and design patterns
  • Analytical, planning, and technical project coordination skills
  • Fluent in English (written and spoken)
  • Enthusiastic, creative, entrepreneurial, and eager to innovate and improve
  • Able to work under pressure, adapt to change, and make quick decisions with a problem-solving mindset
  • Strong relationship-building skills at all organizational levels, with a collaborative and trustworthy approach
  • Committed to continuous knowledge sharing and professional development
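
For illustration only, a minimal sketch of the SQL-plus-Python work this posting describes, assuming the google-cloud-bigquery client library; the project, dataset, table, and column names are invented and not part of the listing:

    # Hypothetical example: top consumers from an assumed usage table in BigQuery.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-analytics-project")  # assumed project id

    sql = """
        SELECT customer_id, SUM(data_mb) AS total_data_mb
        FROM `my-analytics-project.usage.daily_traffic`
        GROUP BY customer_id
        ORDER BY total_data_mb DESC
        LIMIT 10
    """

    # query() submits the job; result() waits for completion and returns the rows
    for row in client.query(sql).result():
        print(row.customer_id, row.total_data_mb)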

Nice to have:

  • Technical certifications (e.g., TMForum, TOGAF, SAS Base, SQL, Informatica, Dynamo, Python, BigQuery, Data Fusion, CDAP) are a plus
  • Familiarity with web standards and J2EE architectures is an advantage

What we offer:
  • Hybrid working regime: 2 days from the office, 3 days remote
  • Special discounts for Vodafone employees, Friends & Family offers
  • Demo telephone subscription - unlimited (voice and data)
  • Voucher for the purchase of a mobile phone
  • Medical subscription to a top private clinic & other medical benefits
  • Insurance for hospitalization and surgical interventions
  • Life insurance
  • Meal tickets
  • Bookster subscription
  • Participation in development programs and challenging projects in the leadership area
  • Access to internal Wellbeing & Recognition events
  • Extra vacation days (for seniority, special events, volunteering)
  • Specialization training in your field of activity, delivered through programs based on modern training methods and systems

Additional Information:

Job Posted:
January 22, 2026

Work Type:
Hybrid work

Similar Jobs for Sr. Big Data Cloud Engineer

Sr Data Engineer

(Locals or Nearby resources only). You will work with technologies that include ...
Location: United States, Glendale
Salary: Not provided
Enormous Enterprise
Expiration Date: Until further notice
Requirements:
  • 7+ years of data engineering experience developing large data pipelines
  • Proficiency in at least one major programming language (e.g. Python, Java, Scala)
  • Hands-on production environment experience with distributed processing systems such as Spark
  • Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines (a minimal DAG sketch follows this list)
  • Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, Big Query)
  • Experience in developing APIs with GraphQL
  • Advanced understanding of OLTP vs. OLAP environments
  • Candidates must work on W2; no Corp-to-Corp
  • US Citizen, Green Card Holder, H4-EAD, TN-Visa
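
For illustration only, a minimal Airflow 2.x DAG sketch of the orchestration work mentioned above; the dag_id, schedule, and task callables are invented placeholders, not the employer's actual pipeline:

    # Hypothetical two-task pipeline: extract, then load, run once per day.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        print("pulling source data")  # placeholder for real extraction logic


    def load():
        print("writing to the warehouse")  # placeholder for real load logic


    with DAG(
        dag_id="core_data_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+ keyword; older 2.x releases use schedule_interval
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # load runs only after extract succeeds
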
Job Responsibility:
  • Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
  • Build and maintain APIs to expose data to downstream applications
  • Develop real-time streaming data pipelines
  • Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
  • Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
  • Ensure high operational efficiency and quality of the Core Data platform datasets so that our solutions meet SLAs and project reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
What we offer:
  • 3 levels of medical insurance for you and your family
  • Dental insurance for you and your family
  • 401k
  • Overtime
  • Sick leave policy: accrue 1 hour for every 30 hours worked up to 48 hours

Sr. Data Engineer

We are looking for a skilled Sr. Data Engineer to join our team in Oklahoma City...
Location: United States, Oklahoma City
Salary: Not provided
Robert Half
Expiration Date: Until further notice
Requirements:
  • Proven experience with Snowflake data warehousing and schema design
  • Proficiency in ETL tools such as Matillion or similar platforms
  • Strong knowledge of Python and PowerShell for data automation
  • Experience working with Microsoft SQL Server and related technologies
  • Familiarity with cloud technologies, particularly AWS
  • Understanding of data visualization and analytics tools
  • Background in working with big data technologies such as Apache Kafka, Hadoop, Spark, or Pig
  • Ability to design and implement APIs for data integration and management
Job Responsibility:
  • Design, implement, and maintain Snowflake data warehousing solutions to support business needs
  • Assist in the migration of in-house data to Snowflake, ensuring a seamless transition
  • Develop data pipelines and workflows using tools such as Matillion or equivalent ETL solutions
  • Collaborate with teams to optimize and manage the existing data warehouse built on Microsoft SQL Server
  • Utilize Python and PowerShell to automate data processes and enhance system efficiency (a minimal connection sketch follows this list)
  • Partner with the implementation team to shadow and learn best practices for Snowflake deployment
  • Ensure data integrity, scalability, and security across all data engineering processes
  • Provide insights into data visualization and analytics to support decision-making
  • Work with cloud technologies, including AWS, to enhance data storage and accessibility
  • Implement and manage APIs to enable seamless data integration and sharing
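
For illustration only, a minimal sketch of Python-driven Snowflake automation of the kind listed above, assuming the snowflake-connector-python package; the account, credentials, and table are placeholders:

    # Hypothetical connection and validation query against an assumed table.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",    # assumed account identifier
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT COUNT(*) FROM orders")  # placeholder check on row volume
        print("row count:", cur.fetchone()[0])
    finally:
        conn.close()
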
What we offer:
  • Medical, vision, dental, and life and disability insurance
  • Eligibility to enroll in 401(k) plan
  • Access to competitive compensation and free online training
  • Full-time

Sr. Solutions Engineer

At Databricks, our core principles are at the heart of everything we do; creatin...
Location: South Korea, Seoul
Salary: Not provided
Databricks
Expiration Date: Until further notice
Requirements:
  • Engage customers in technical sales, challenge their questions, guide clear outcomes, and communicate technical and value propositions
  • Develop customer relationships and build internal partnerships with account executives and teams
  • Prior experience with coding in a core programming language (e.g., Python, Java, Scala) and willingness to learn a base level of Spark
  • Proficient with Big Data Analytics technologies, including hands-on expertise with complex proofs-of-concept and public cloud platform(s)
  • Experienced in use case discovery, scoping, and delivering complex solution architecture designs to multiple audiences requiring an ability to context switch in levels of technical depth
  • Native level in Korean is required, and proficiency in English is a plus
Job Responsibility:
  • Form successful relationships with clients throughout your assigned territory, providing technical and business value to Databricks customers in collaboration with Account Executives
  • Operate as an expert in big data analytics to excite customers about Databricks. You will develop into a ‘champion’ and trusted advisor on multiple issues of architecture, design, and implementation to lead to the successful adoption of the Databricks Data Intelligence Platform
  • Scale best practices in your field and support customers by authoring reference architectures, how-tos, and demo applications, and help build the Databricks community in your region by leading workshops, seminars, and meet-ups
  • Grow your knowledge and expertise to the level of a technical and/or industry specialist

Sr. Software Engineer, Big Data

Design, build, and maintain scalable data platforms, pipelines, and data lakes t...
Location: United States, Irvine
Salary: 103170.00 - 158873.00 USD / Year
Hyundai AutoEver America
Expiration Date: Until further notice
Requirements:
  • Bachelor’s degree in computer science or related field (or equivalent)
  • 10+ years in IT application support
  • 7+ years as a Big Data Engineer
  • Big Data Expertise: Hadoop, Spark, Spark Streaming, Trino, Flink, Hive, Pig, Kafka, NoSQL (MongoDB, Cassandra, HBase)
  • Programming & Modeling: Proficiency in Java, Python, Scala, SQL; strong data modeling and ETL experience
  • Cloud & Security: Familiarity with AWS/Azure/GCP; knowledge of data security principles and implementation
  • Performance & Analytics: Experience with performance tuning, data warehousing, and ML/analytics integration in Big Data systems
Job Responsibility:
  • Design, build, and maintain scalable data platforms, pipelines, and data lakes to enable large-scale data processing and analysis
  • Collaborate with data scientists and stakeholders to ensure proper data collection, storage, and security, troubleshoot infrastructure issues, and optimize systems for scalability and efficiency
  • Build and maintain scalable data pipelines and robust data models for AI/ML from structured and unstructured sources
  • Develop Big Data pipelines using orchestration tools (Airflow/Oozie) and implement access management, monitoring, and self-service ETL/analytics solutions
  • Write data pipelines using Spark, Python, and Scala; develop frameworks/utilities in Python; and follow DevOps best practices
  • Advanced SQL skills with ability to query and transform large structured/unstructured datasets using Spark/PySpark, Spark SQL/Hive, Hive/NoSQL (a brief PySpark sketch follows this list)
  • Hands-on experience with On-Prem Big Data platforms, distributed frameworks like YARN, and proficiency in building data pipelines using Spark, Python, and Scala
  • Diagnose software issues, optimize performance, and support BI tools (Tableau, Power BI, MicroStrategy) for Big Data
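
For illustration only, a minimal PySpark sketch of the pipeline work described above; the paths and column names are invented for the example:

    # Hypothetical rollup: read raw events, aggregate per day, write partitioned Parquet.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

    events = spark.read.parquet("/data/raw/events")  # assumed input path
    daily = (
        events
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("event_date", "event_type")
        .agg(F.count("*").alias("event_count"))
    )
    daily.write.mode("overwrite").partitionBy("event_date").parquet("/data/curated/daily_events")
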
What we offer:
  • Comprehensive medical/dental coverage
  • Generous PTO
  • Education assistance
  • Annual merit increase eligibility
  • Full-time

Sr Data Engineer

The Sr Data Engineer is essential for designing and developing data architecture...
Location: United States, Overland Park; Atlanta; Frisco
Salary: 105100.00 - 189600.00 USD / Year
T-Mobile
Expiration Date: Until further notice
Requirements:
  • Bachelor's Degree plus 5 years of related work experience OR Advanced degree with 3 years of related experience
  • Acceptable areas of study include Computer Engineering, Computer Science, or a related subject area
  • 4-7+ years developing cloud solutions using data series
  • Experience with cloud platforms (Azure Data Factory, Azure Databricks, and Snowflake)
  • 4-7+ years hands-on development using and migrating data to cloud platforms
  • 4-7+ years experience in SQL, NoSQL, and/or relational database design and development
  • 4-7+ years advanced knowledge and experience in building complex data pipelines, with experience in languages such as Python, SQL, Scala, and Spark
  • Analytical approach to problem-solving; ability to use technology to tackle business problems
  • Knowledge of message queuing, stream processing, and highly scalable 'Big Data' data stores
Job Responsibility:
  • Develop data engineering solutions that enable data pipelines, data transformation, data privacy and analytical tools within the T-Mobile Customer Data Platform (CDP)
  • Collaborate on analysis, architecture, design, and development of data products within the T-Mobile Customer Data Platform (CDP)
  • Design and develop data architectures across on-premise, cloud, and hybrid platforms to ensure scalable data infrastructure
  • Perform data wrangling, exploration, and discovery of heterogeneous data to generate new business insights
  • Tackle the most complex and innovative tasks of the organization, usually under strict time constraints
  • Support white-boarding sessions, workshops, design sessions, and project meetings as needed
  • Contribute to team knowledge sharing and drive the advancement of new data engineering capabilities
  • Mentor team members to build and enhance their data engineering skillsets and professional growth
  • Assist management in project definition, including estimating, planning, and scoping work to meet objectives
  • Also responsible for other duties/projects as assigned by business management as needed
What we offer:
  • Competitive base salary and compensation package
  • Annual stock grant
  • Employee stock purchase plan
  • 401(k)
  • Access to free, year-round money coaches
  • Medical, dental and vision insurance
  • Flexible spending account
  • Employee stock grants
  • Paid time off
  • Full-time

Sr Data Engineer

As a Data Engineer, you will be responsible for designing, building, maintaining...
Location: India, Hyderabad
Salary: Not provided
Amgen
Expiration Date: Until further notice
Requirements:
  • Doctorate degree / Master's degree / Bachelor's degree and 8 to 13 years of Computer Science, IT or related field experience
  • Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, SparkSQL), Snowflake, workflow orchestration, and performance tuning of big data processing
  • Experienced with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps
  • Proficient in SQL and Python for extracting, transforming, and analyzing complex datasets from relational data stores
  • Proficient in Python with strong experience in ETL tools such as Apache Spark and various data processing packages, supporting scalable data workflows and machine learning pipeline development
  • Strong understanding of data modeling, data warehousing, and data integration concepts
  • Proven ability to optimize query performance on big data platforms
  • Knowledge of data visualization and analytics tools like Spotfire and Power BI
Job Responsibility:
  • Design, develop, and maintain data solutions for data generation, collection, and processing
  • Be a key team member that assists in design and development of the data pipeline
  • Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
  • Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
  • Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
  • Implement data security and privacy measures to protect sensitive data
  • Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
  • Collaborate and communicate effectively with product teams
  • Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
  • Identify and resolve complex data-related challenges
What we offer:
  • Competitive and comprehensive Total Rewards Plans that are aligned with local industry standards

Director of Engineering

We are seeking a highly experienced and strategic Director of Engineering to tak...
Location: India, Chennai
Salary: Not provided
Arrcus
Expiration Date: Until further notice
Requirements:
  • BS/MS/PhD in Computer Engineering/Computer Science or equivalent degree
  • Excellent communication, presentation, and interpersonal skills
  • 5+ years of experience leading and managing distributed engineering teams involved in creating complex software products
  • 10+ years of relevant experience in managing very senior technical talent in some of the following areas: Networking protocols such as OSPF, BGP, ISIS, MPLS, BFD, MLAG, EVPN, VxLAN, SR-MPLS, SRv6, L3VPN
  • Test harnesses like Robot Framework, Jinja2
  • Familiarity with network merchant silicon chipsets and whitebox platforms
  • Software development of Network Data Path (Linux, virtual and ASIC)
  • Virtualization technologies like SR-IOV, Intel DPDK, FD.io, NSX, OVS
  • High Availability, ISSU, Linux networking
  • Debian Build/Packaging, Linux Kernel, Kernel Networking Stack
Job Responsibility:
  • Work with customer and product teams to understand and prioritise new requirements
  • Develop a holistic understanding of individual employee skill sets and drive resource allocation for customer requirements
  • Help drive the recruiting process both in terms of attracting new talent as well as defining and streamlining the recruitment process
  • Continuous Process Improvement: Solicit feedback, drive discussion and implement process and workflow improvements
  • Provide technical guidance and mitigation for engineering projects
  • Build 1:1 rapport with engineers, help identify and fulfil personal aspirations by aligning with larger team goals
  • Strong ability to plan, execute and deliver multiple projects across worldwide sites
  • Experience with rapidly growing engineering organizations in all aspects of people, resources, tools, and more
What we offer:
  • Generous compensation packages including equity
  • Medical Insurance
  • Parental Leave
  • Full-time

Sr Big Data Engineer - Oozie and Pig (GCP)

We are seeking a Senior Big Data Engineer with deep expertise in distributed sys...
Location: United States
Salary: 116100.00 - 198440.00 USD / Year
Rackspace
Expiration Date: Until further notice
Requirements:
  • Bachelor's degree in Computer Science, software engineering or related field of study
  • Experience with managed cloud services and understanding of cloud-based batch processing systems
  • Proficiency in Oozie, Airflow, MapReduce, Java
  • Strong programming skills with Java (specifically Spark), Python, Pig, and SQL
  • Expertise in public cloud services, particularly in GCP
  • Proficiency in the Apache Hadoop ecosystem with Oozie, Pig, Hive, MapReduce
  • Familiarity with BigTable and Redis
  • Experience applying infrastructure and DevOps principles in daily work
  • Utilize tools for continuous integration and continuous deployment (CI/CD), and Infrastructure as Code (IaC) like Terraform
  • Proven experience in engineering batch processing systems at scale
Job Responsibility:
  • Design and develop scalable batch processing systems using technologies like Hadoop, Oozie, Pig, Hive, MapReduce, and HBase, with hands-on coding in Java or Python (Java is a must)
  • Must be able to lead Jira Epics
  • Write clean, efficient, and production-ready code with a strong focus on data structures and algorithmic problem-solving applied to real-world data engineering tasks
  • Develop, manage, and optimize complex data workflows within the Apache Hadoop ecosystem, with a strong focus on Oozie orchestration and job scheduling
  • Leverage Google Cloud Platform (GCP) tools such as Dataproc, GCS, and Composer to build scalable and cloud-native big data solutions
  • Implement DevOps and automation best practices, including CI/CD pipelines, infrastructure as code (IaC), and performance tuning across distributed systems
  • Collaborate with cross-functional teams to ensure data pipeline reliability, code quality, and operational excellence in a remote-first environment
  • Full-time