Costco Travel is looking for a creative, team-oriented, and motivated Data Engineer to mentor and assist project teams with database modeling, design, development, and optimization. This work spans both internal and online applications, as well as the health of the production environment, ensuring all database systems run efficiently and smoothly. Data Engineers are also responsible for developing and operationalizing data pipelines and integrations that make data available for consumption (e.g., Reporting, Data Science/Machine Learning, Data APIs). This includes data ingestion, data transformation, data validation/quality, data pipeline optimization, orchestration, and deploying code to production via CI/CD. This role partners closely with the DBA team, Product Owners, Data Architects, and Platform/DevOps Engineers. Costco Travel offers a fast-paced, growing, and exciting team environment; our ideal candidate enjoys challenges, appreciates applying creative thinking to solve complex business problems, and wants to be part of a team that is goal-focused and results-oriented.
Role
- Develops complex SQL & Python against a variety of data sources.
- Implements streaming data pipelines using event/message-based architectures.
- Demonstrates the ability to communicate technical concepts to non-technical audiences in both written and verbal form.
- Works in tandem with Data Architects to align on data architecture requirements provided by the requestor.
- Defines and maintains optimal data pipeline architecture.
- Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery/orchestration.
- Demonstrates strong understanding of coding and programming concepts used to build data pipelines (e.g., data transformation, data quality, data integration).
- Analyzes data to spot anomalies and trends and correlates data to ensure data quality.
- Develops data pipelines to store data in defined data models/structures.
- Demonstrates strong understanding of data integration techniques and tools (e.g., Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT)).
- Demonstrates strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing).
- Identifies ways to improve data reliability, efficiency, and quality of data management.
- Performs peer reviews for the Data Engineering team and other development teams as needed.
- Optimizes existing SQL code for performance.
- Works within and across teams to solve complex technical challenges and priority issues.
- Coaches and mentors the development teams on all things SQL and Data.
- Develops, maintains, and operationalizes Azure-based ETL pipelines for reporting, advanced analytics, testing, and archiving.
- Develops data engineering best practices: continually evaluates our processes and reporting to identify opportunities to improve, enhance, and automate existing and new capabilities.
- Works in tandem with Architects, Product owners, and Engineers to design data requirements and recommends ongoing optimization of data storage, data ingestion, data quality, and orchestration.
- Regular and reliable workplace attendance at your assigned location.
Required:
- 5+ years’ data engineering experience and a BS or MS in Computer Science, Engineering or related technical discipline.
- 5+ years’ SQL Server experience.
- 3+ years’ experience with Azure, ADF, ADLS, and Synapse, or equivalent.
- 3+ years’ creating data pipelines and ETL solutions.
- 2+ years’ Python experience.
- Subject matter expert in MS SQL scripting and SQL Server.
- Able to work in a fast-paced agile development environment.
Recommended:
- Proficient in Google Workspace applications, including Sheets, Docs, Slides, and Gmail.
- Successful internal candidates will have spent one year or more on their current team.
- BA/BS in Computer Science, Engineering, or equivalent software/services experience.
- Azure Certifications.
- Experience delivering data solutions through agile software development methodologies.
- Excellent verbal and written communication skills.
- Able to demonstrate a strong understanding of coding and programming concepts used to build data pipelines (e.g., data transformation, data quality, data integration).
- Able to communicate technical concepts to non-technical audiences in both written and verbal form.
Required Documents
- Cover Letter
- Resume
California applicants: please review the Costco Applicant Privacy Notice.
Pay Ranges:
Level 3: $130,000 - $160,000
Level SR: $150,000 - $190,000, Bonus and Restricted Stock Unit (RSU) eligible
We offer a comprehensive package of benefits to eligible employees, including paid time off; health benefits (medical/dental/vision/hearing aid/pharmacy/behavioral health/employee assistance); health care reimbursement account; dependent care assistance plan; short-term and long-term disability insurance; AD&D insurance; life insurance; 401(k); and a stock purchase plan.
Costco is committed to a diverse and inclusive workplace. Costco is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or any other legally protected status. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to IT-Recruiting@costco.com.
If hired, you will be required to provide proof of authorization to work in the United States.