Careers

Intern

Job Description

Detailed Description:
The key elements of the role are:

  • Build scalable, reliable, test-driven data pipelines to ingest data from various sources (e.g. API endpoints) into the data warehouse (BigQuery); an illustrative sketch follows this list
  • Design the data warehouse schema for extensibility and clarity, maintaining data quality and reliability
  • Maintain existing pipelines and continuously identify improvements to the data architecture and data flows
  • Work collaboratively in a team passionate about engineering quality and a growth culture where open communication is encouraged
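
As a rough, minimal sketch of the kind of pipeline work involved (the endpoint, dataset, and table names below are hypothetical placeholders; it assumes Airflow 2.x with the TaskFlow API and the google-cloud-bigquery client, not any specific stack used here):

```python
from datetime import datetime

import requests
from airflow.decorators import dag, task
from google.cloud import bigquery


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_api_to_bigquery():
    @task
    def extract() -> list[dict]:
        # Pull a batch of records from a (hypothetical) source API endpoint.
        resp = requests.get("https://api.example.com/v1/orders", timeout=30)
        resp.raise_for_status()
        return resp.json()

    @task
    def load(rows: list[dict]) -> None:
        # Append the records to a (hypothetical) table in the BigQuery warehouse.
        client = bigquery.Client()
        errors = client.insert_rows_json("analytics.raw_orders", rows)
        if errors:
            raise RuntimeError(f"BigQuery insert failed: {errors}")

    load(extract())


example_api_to_bigquery()
```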

Skills & Experience Required:

Essential qualities

  • Outstanding communication skills: an intuition, as well as methodology, for helping a diverse group understand what needs to be done and by when
  • Great interpersonal skills: ability to build and maintain strong working relationships, as well as empathy for resistance to change
  • Proficient in programming languages such as SQL and Python for data operations
  • Knowledge of building and scaling batch ETL/ELT pipelines using Airflow, Docker, Kubernetes, and DBT, with an understanding of size and performance constraints
  • Experience with Google Cloud Platform is a bonus, but not a requirement
  • Knowledge of infrastructure automation tools like Terraform is advantageous
  • Articulate in explaining technical concepts in simple terms
  • Calm when faced with multiple tasks; able to prioritise and deliver on time
  • An eye for innovation in analytical approaches, business processes, and commercial structure
  • Fluency in English (written and oral)