At Boeing, we innovate and collaborate to make the world a better place. We’re committed to fostering an environment for every teammate that’s welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.

The Boeing Company is currently seeking a Senior Data Engineer (Data Solutions Architect) to join the Finance Systems and Analytics (FS&A) team in Tukwila, WA. The selected candidate will design, develop, and maintain advanced analytics solutions that deliver step-function business improvement. As the Data Engineering Subject Matter Expert (SME) on the Finance Analytics full‑stack Development team, you will partner with Base Data Architects to harmonize finance data sources and build the data foundation needed by Data Scientists and Business Intelligence (BI) Engineers to deliver analytics solutions across Finance.
Job Responsibilities:
Build and operate production data pipelines, Extract Transform Load (ETL)/Extract Load Transform (ELT) processes, and data platforms that support analytics and Machine Learning (ML) workloads
Design, implement, and optimize reliable, scalable data architectures (data lakes, warehouses, lakehouses) and data modeling patterns for business consumption
Build robust ingestion, transformation, and orchestration solutions using tools such as Spark, SQL, Airflow, Bash, or equivalents
Apply domain knowledge in Finance (e.g., Earned Value Management (EVM), accounting, treasury, reconciliation, payments, Financial Planning and Analysis (FP&A)) or equivalent business-facing experience to partner effectively with Finance SMEs
Ensure data quality, governance, lineage, and cataloging to support auditability and regulatory requirements (e.g., SOX)
Implement security, privacy, and data protection best practices for sensitive financial data, including access controls and encryption
Apply observability and monitoring best practices (logging, metrics, alerts) and use Continuous Integration and Continuous Delivery (CI/CD) for data infrastructure and pipeline deployments
Work effectively in collaborative analytics environments using modern analytics practices (code reviews, feature branches, agile delivery) and provide technical leadership across cross-functional teams
Requirements:
Bachelor’s Degree or higher in Computer Science, Engineering, Information Systems, or a related quantitative/technical field, or equivalent practical experience
5+ years of experience designing, building, and operating production data pipelines, ETL/ELT workflows, and data platforms supporting analytics and ML workloads
Experience with SQL and querying relational and cloud data warehouses (e.g., SQL Server, Oracle, Teradata, BigQuery)
Experience with at least one programming language used for data engineering (e.g., Python, Scala, Java) and experience with Spark or equivalent distributed processing frameworks
Experience with orchestration and workflow tools (e.g., Airflow, SQL, Bash) and authoring repeatable, testable transformations
Experience with data modeling, schema design, and data architecture patterns for lakes, warehouses, or lakehouses
Experience working with cloud platforms and managed data services (Amazon Web Services (AWS)/Google Cloud Platform (GCP)/Azure) and basic familiarity with containerization (Docker, Podman)
Experience with version control and collaborative coding practices (Git, code reviews, branching)
Successful candidates for this job must satisfy the Company’s Conflict of Interest (COI) assessment process
This position must meet U.S. export control compliance requirements; therefore, a “U.S. Person” as defined by 22 C.F.R. §120.62 is required
Nice to have:
10+ years of related work experience or an equivalent combination of education and experience
Experience with streaming and event-driven systems and real‑time ingestion patterns
Experience with lakehouse technologies, object storage, and cloud native data services
Familiarity with infrastructure as code, automation tools, and CI/CD for data infrastructure
Experience with monitoring and observability for data systems (logging, metrics, alerting, lineage tools)
Experience with security, privacy, and governance for sensitive financial data, including access controls, encryption, and SOX/compliance requirements
Experience optimizing cost and performance for cloud data environments
Experience in Finance domain (EVM, accounting, treasury, reconciliation, payments, FP&A) or working closely with Finance SMEs
Ability to mentor and provide technical leadership across cross‑functional teams
Master’s Degree or higher in a technical discipline or equivalent advanced experience
What we offer:
Generous company match to your 401(k)
Industry-leading tuition assistance program pays your institution directly
Fertility, adoption, and surrogacy benefits
Up to $10,000 gift match when you support your favorite nonprofit organizations
Health insurance
Flexible spending accounts
Health savings accounts
Retirement savings plans
Life and disability insurance programs
A number of programs that provide for both paid and unpaid time away from work