Senior Data Engineer (P3152)
Job Description
84.51° Overview:
84.51° is a retail data science, insights and media company. We help The Kroger Co., consumer packaged goods companies, agencies, publishers and affiliates create more personalized and valuable experiences for shoppers across the path to purchase.
Powered by cutting-edge science, we utilize first-party retail data from more than 62 million U.S. households sourced through the Kroger Plus loyalty card program to fuel a more customer-centric journey using 84.51° Insights, 84.51° Loyalty Marketing and our retail media advertising solution, Kroger Precision Marketing.
Join us at 84.51°!
__________________________________________________________
As a Senior Data Engineer, you will have the opportunity to design and build data pipelines for both internal and external clients. You will have a deep understanding of data engineering approaches along with hands-on experience building highly scalable solutions. We are a team of innovators, continuously exploring new technologies to ensure 84.51° remains at the forefront of data development. In this position, you will be utilizing Python, PySpark, SQL, Databricks, Azure Cloud Services, and Snowflake.
As part of the Data Asset team, you will work in a space that uses various internal and external data to facilitate reporting, analysis, and warehousing.
Responsibilities
Take ownership of stories and drive them to completion through all phases of the entire 84.51° SDLC. This includes external and internal data pipelines as well as process improvement activities such as:
- Design and develop Python- and SQL-based data pipeline solutions
- Perform unit and integration testing
- Create quality checks for ingested and post-processed data
- Ensure alerting and monitoring of automated pipeline solutions
- Provide mentoring to junior team members
- Participate in retrospective reviews
- Participate in the estimation process for new work and releases
- Maintain and enhance existing applications
- Bring new perspectives to problems
- Be driven to improve yourself and the way things are done
Requirements
- Understanding of Agile Principles (Scrum)
- 5+ years of proven professional Python and SQL development experience
- Proficient with distributed data processing (PySpark, Snowpark)
- Proficient with automated testing (PyTest, etc.)
- Proficient with GitHub
- Experience using Python frameworks
- Experience with Cloud Technologies & Services (Azure preferred; GCP, AWS)
- Experience with performance tuning enterprise applications
- Experience with debugging data pipelines
- Understanding of CI/CD
- Understanding of object-oriented principles
- Understanding of SOLID principles
Preferred Skills
- Python & SQL Development
- Distributed Data Processing (PySpark, Snowpark)
- CI/CD practices
- Automated data pipeline orchestration
- Data observability – Logging, Monitoring, and Alerting (Datadog)
- Azure Cloud infrastructure development
- Snowflake
- Databricks
- Data quality checks
- Cloud Technologies
IMPORTANT INFO
This is a Hybrid position. Candidates must be able to come into the office on Monday, Tuesday, and Wednesday of each week. We have locations in Cincinnati, OH; Chicago, IL; Deerfield, IL; New York, NY; and Portland, OR. There are no remote options for this position.
We are NOT working with staffing firms, consulting companies, or any other third parties on this position.
Full-time only. No contracts. No C2C.
#LI-DOLF
Date Posted
09/22/2024