Job Description
About the Job
You will be a member of the Data Application and Engineering Team. We are a force multiplier, owning the data analysis and knowledge infrastructure that enables us and our teammates to move faster and smarter.
The Data Application and Engineering team’s mission is to empower decision making with data, maintain data integrity and security, and enable scalability and agility. The team’s work includes ingesting data, building ETL pipelines, and creating services and tools for others to use data more efficiently. You will work on developing and enhancing our data warehouse, defining processes for data monitoring and alerting, and maintaining data integrity in our data ecosystem. You will work with cross-functional teams and internal stakeholders to define requirements and build solutions that meet them. You will work with other engineers to ensure that our data platform and infrastructure are scalable and reliable.
Responsibilities
- Work on high-impact projects that improve data availability and quality and provide reliable access to data for the rest of the business.
- Build and manage a state-of-the-art data pipeline architecture, leveraging our tech stack to fulfill business requirements.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Oversee the ingestion of data into Snowflake, employing tools like Fivetran as the data integration platform, and facilitate the operation of this data through dbt and Looker.
- Conduct thorough analyses and debugging of data pipeline issues, ensuring data integrity and reliability.
- Communicate strategies and processes around data modeling and architecture to the data engineering team as well as other teams.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Snowflake, and AWS technologies.
Requirements
- Minimum of 3 years' experience in data engineering with substantial work on ETL pipeline construction, preferably in environments utilizing Fivetran, dbt, Snowflake, and Looker.
- Proficient in advanced SQL, with a strong background in designing and implementing ETL processes.
- Demonstrable coding skills in Python.
- Proven track record of managing large datasets, including their processing, transformation, and transportation.
- Experienced with cloud services, particularly AWS, and familiar with services like EC2, SQS, SNS, RDS, and Cache.
- Bachelor’s degree in Computer Science, Software Engineering, or a related field.
- Deep understanding of the complete data stack, including Apache Hadoop, Apache Spark, Spark Streaming, and Kafka, and the ability to adapt to and learn new technologies.
- Direct experience in deploying machine learning models into production environments, particularly using Java/Python.
- Familiarity with data visualization and business intelligence tools, specifically Looker/Sigma, to translate data into actionable insights.
Compensation & Benefits:
At Taskrabbit, our approach to compensation is designed to be competitive, transparent, and equitable. Our total compensation consists of base pay + bonus + benefits + perks.
The base pay range for this position is $115,000 - $160,000. This range is representative of base pay only and does not include any other total cash compensation amounts, such as company bonus or benefits. Final offer amounts may vary from the amounts listed above and will be determined by factors including, but not limited to, relevant experience, qualifications, geography, and level.
Date Posted
04/15/2024