ETL Engineer, Data Warehousing

Posted: 03/09/2020


Delivering the best customer experience is the top priority at Tiki, and our data team plays a crucial part in this mission. We work to understand our customers deeply through our massive amount of data, so that we can make shopping better and easier for them while growing our business quickly.

As an ETL data engineer at Tiki, you will be responsible for building a data mart system and designing a complete, robust data schema based on business requirements.

The data mart system helps many departments make the right decisions quickly: for example, the Marketing team can allocate a suitable budget to each channel, and the Finance team can monitor the company's financial health.


What you will do:

  • Data modelling, process design, and overall data schema architecture.
  • Ensure data quality and consistency through monitoring and support, and play an active role in establishing data governance around company KPIs.
  • Work closely with the Finance and Marketing teams to design the data mart system.
  • Assist with defining project scope, developing work plans, and estimating project requirements (time, resources, and pricing).
  • Architect, design, and develop data warehouse solutions, including evaluation and selection of tools and technologies.
  • Transform and clean raw data into useful datasets.
  • Design new data mart model architectures for other departments.
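The transform-and-clean step above can be sketched in a few lines of Python. This is a minimal illustration only; the field names and cleaning rules are hypothetical, not Tiki's actual pipeline:

```python
# Minimal sketch of a transform/clean step: raw order records go in,
# validated and normalized records come out. All field names are
# hypothetical examples, not a real schema.

def clean_orders(raw_rows):
    cleaned = []
    for row in raw_rows:
        # Drop rows missing the fields a data mart fact table would need.
        if not row.get("order_id") or row.get("amount") is None:
            continue
        cleaned.append({
            "order_id": str(row["order_id"]).strip(),
            "channel": str(row.get("channel", "unknown")).strip().lower(),
            "amount": round(float(row["amount"]), 2),
        })
    return cleaned

raw = [
    {"order_id": " 1001 ", "channel": "Facebook ", "amount": "120.503"},
    {"order_id": None, "channel": "google", "amount": 50},  # dropped: no id
    {"order_id": "1002", "amount": 75.0},  # channel defaults to "unknown"
]
print(clean_orders(raw))
```

In a production pipeline this kind of function would run as one task inside a workflow manager such as Airflow, with the cleaned output loaded into the data mart.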

Why you will want to work here:

  • Tiki's culture is open to new ideas and sharing.
  • Opportunities to deliver big impacts are everywhere.
  • Great opportunity to dive deeply into a big data system.


What we need from you:

  • Must have specific experience in at least one of the following DBMS environments: Oracle/UNIX & NT, MS SQL Server/NT, Sybase/UNIX and/or Sybase IQ/UNIX, Informix/UNIX, DB2/UNIX (UDB).
  • Minimum of 1 year of experience in developing data warehouse, decision support, or executive information systems. Related experience in relevant application software development may be considered.
  • Must have a good understanding of Dimensional Modeling and ER Modeling.
  • Knowledge of cubes and ROLAP, MOLAP, and HOLAP is a definite plus.
  • Must be strong in the fundamentals of DBMS concepts.
  • Must be strong in querying data with SQL.
  • Experience with data pipelines, workflow management, and big data tools such as Azkaban, Luigi, or Airflow.
  • Experience with cloud-based systems (Google Cloud Platform, Amazon Web Services) such as BigQuery, Dataflow, EC2, EMR, or Redshift is a plus.
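The dimensional-modeling and SQL skills listed above come together in a star schema: a fact table of measures joined to dimension tables. A tiny self-contained illustration using Python's built-in sqlite3 (all table, column, and data values are made up for the example):

```python
import sqlite3

# A miniature star schema: one fact table (fact_sales) keyed to one
# dimension table (dim_channel). Names and data are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_channel (channel_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        channel_id INTEGER REFERENCES dim_channel(channel_id),
        amount REAL
    );
    INSERT INTO dim_channel VALUES (1, 'facebook'), (2, 'google');
    INSERT INTO fact_sales VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 80.0);
""")

# The classic data mart query: aggregate facts grouped by a dimension,
# e.g. revenue per marketing channel.
rows = con.execute("""
    SELECT d.name, SUM(f.amount) AS revenue
    FROM fact_sales f JOIN dim_channel d USING (channel_id)
    GROUP BY d.name ORDER BY revenue DESC
""").fetchall()
print(rows)
```

The same join-and-aggregate pattern scales from this toy example to the per-channel budget and financial-health reports described earlier in the posting.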
Apply for this job or send your CV to