At Wells Fargo, we want to satisfy our customers’ financial needs and help them succeed financially. We’re looking for talented people who will put our customers at the center of everything we do. Help us build a better Wells Fargo. It all begins with outstanding talent. It all begins with you. Wells Fargo Technology sets IT strategy; enhances the design, development, and operations of our systems; optimizes the Wells Fargo infrastructure footprint; provides information security; and enables continuous banking access through in-store, online, ATM, and other channels to Wells Fargo’s more than 70 million global customers.
Job Responsibilities:
Lead moderately complex initiatives within Technology and contribute to large-scale data processing framework initiatives related to enterprise strategy deliverables
Build and maintain optimized and highly available data pipelines that facilitate deeper analysis and reporting
Review and analyze moderately complex business, operational, or technical challenges that require an in-depth evaluation of variable factors
Oversee the data integration work, including developing a data model, maintaining a data warehouse and analytics environment, and writing scripts for data integration and analysis
Resolve moderately complex issues and lead teams to meet data engineering deliverables while leveraging solid understanding of data information policies, procedures, and compliance requirements
Collaborate and consult with colleagues and managers to resolve data engineering issues and achieve strategic goals
Requirements:
Bachelor’s degree in Computer Information Systems, Information Technology, or related technical field
Five (5) years of experience in the job offered or in a related position involving application development and implementation
5 years of ETL (Extract, Transform, Load) programming experience
4 years of experience in data warehouse and data analytics capabilities on big data architecture
4 years of Agile experience
3 years of SQL experience
2 years of Hadoop experience
2 years of design and development experience using Scala, Java, or Python with DataFrames and Resilient Distributed Datasets (RDDs)
2 years of experience in real-time and batch data ingestion, processing, and provisioning, using tools such as Apache Kafka or Apache Sqoop