YOU ARE EXCITED ABOUT THIS ROLE BECAUSE YOU WILL
- Define key metrics for the product area together with your cross-functional partners: how do we measure success?
- Build easy-to-understand, consistently modelled datasets to serve metrics, dashboards, and exploratory analysis
- Develop semantic models in Looker, build dashboards, and coach users on the self-serve capabilities of our platform
- Work with software engineers to ensure that the data necessary for analysis is captured well in our systems
- Build data transformation pipelines, primarily using SQL and Python on dbt + Airflow infrastructure
- Apply best practices throughout to build reliable, well-tested, efficient, and documented data assets
WE ARE EXCITED ABOUT YOU BECAUSE YOU HAVE EXPERIENCE IN
- Dimensional data modelling for analytics, data warehouses, and big data infrastructures
- High proficiency in SQL (any dialect)
- Experience programming in Python, Java or another language
- 5+ years of experience working with SQL
- 3+ years of experience working with data warehouse technologies
- 3+ years of experience working with BI tools such as Looker or Tableau
- 2+ years of experience working with a transformation tool such as dbt
- Experience working with orchestration systems such as Airflow
- A technical degree is a plus, but not a must
APPLY ON THE OFFICIAL WEBSITE USING THE LINK BELOW: