Do complex data ingestion challenges interest you? What about building data warehousing solutions that help our company scale as we ingest data from hundreds of sources and continue to accelerate our brand acquisition? The Data Engineering Team at Thrasio is responsible for all aspects of data sourcing/crawling, data integration, and warehouse engineering, including web scraping, monitoring, development, maintenance, and administration of our rapidly scaling data warehouse. We’re looking for a hands-on, staff-level software engineer to help lead the effort of designing and building the systems that will carry Thrasio into our next phase of growth.

Responsibilities:
- Architect and build robust systems for managing large-scale data processing
- Work closely with Data Scientists, Data Analytics Engineers, and Analysts
- Architect and build processes for monitoring data sanity, availability, and reliability
- Build connectors that integrate APIs from external systems
- Build data ingestion pipelines
- Build web scrapers
- Support batch and real-time data processing built on Airflow, Snowflake, dbt, and Spark
- Collaborate cross-functionally with other engineering teams, business stakeholders, and project management to ensure high-quality deliverables are met on time.
- Provide technical leadership to the team, analyze design tradeoffs, and partner with team members to arrive at the optimal solution.
- Mentor junior engineers and establish best engineering practices across the team by doing code reviews and ensuring accurate code documentation.
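To make the data-sanity responsibility above concrete, here is a minimal sketch of the kind of check such a process might run on a freshly ingested batch: all function, field, and threshold names are hypothetical illustrations, not Thrasio's actual code or standards.

```python
# Minimal sketch of a batch data-sanity check: verify an ingested batch
# is non-empty, within a null-rate budget per required field, and fresh.
# Names and thresholds are illustrative only.
from datetime import datetime, timedelta, timezone

def check_batch(rows, required_fields, max_null_rate=0.05, max_age_hours=24):
    """Return a list of human-readable issues found in `rows`."""
    issues = []
    if not rows:
        return ["batch is empty"]
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        if nulls / len(rows) > max_null_rate:
            issues.append(f"{field}: null rate {nulls / len(rows):.0%} exceeds budget")
    # Freshness: the newest record should be within the allowed age window.
    newest = max(r["ingested_at"] for r in rows)
    if datetime.now(timezone.utc) - newest > timedelta(hours=max_age_hours):
        issues.append("batch is stale")
    return issues
```

In practice a check like this would typically run as a task in an orchestrator such as Airflow, failing the pipeline (or alerting) when the returned issue list is non-empty.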
Requirements:
- A Bachelor’s degree in Computer Science, Computer Engineering, Software Engineering, Software Development, or a related field, or equivalent education, skills, and/or practical experience
- At least 8 years of experience in software engineering
- Solid understanding of system design, data structures, and algorithms
- Exceptional coding skills in a structured language (e.g., Python, Java, C#, or C++)
- Solid SQL skills
Nice-to-have skills:
- Strong background in data warehouse principles, architecture, design, and implementation at large scale
- Experience with AWS cloud services
- Experience working with data orchestration tools such as Airflow and dbt