Delivering the best customer experience is the top priority at Tiki, and our data team is a crucial part of this mission. We work to understand our customers deeply through our massive amount of data, so that we can make shopping better and easier for them while growing our business quickly.
The ETL Data Engineer at Tiki is responsible for building the data mart system and designing a complete, robust data schema based on business requirements.
The data mart system helps many departments make the right decisions quickly: the Marketing team can allocate a suitable budget to each channel, and the Finance team can monitor the company's financial health.
What you will do:
- Data modeling, process design and overall data schema architecture.
- Ensure data quality and consistency through monitoring and support, and play an active role in establishing data governance around company KPIs.
- Work closely with the Finance and Marketing teams to design the data mart system.
- Assist with defining project scope, developing work plans, and estimating project requirements (time, resources and pricing).
- Architect, design, and develop data warehouse solutions, including the evaluation and selection of tools and technologies.
- Transform and clean raw data into useful, analysis-ready datasets.
- Design new data mart models for other departments.
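The transform-and-clean responsibility above can be sketched as a tiny ETL step. Everything here is hypothetical for illustration (the sample rows, the `fact_orders` table, and its columns), using only the Python standard library:

```python
import sqlite3

# Hypothetical raw order records; column names are illustrative only.
raw_orders = [
    ("ord-1", "electronics", " 120000 "),
    ("ord-1", "electronics", " 120000 "),  # duplicate to be removed
    ("ord-2", "books", "45000"),
    ("ord-3", None, "99000"),              # missing category: dropped
]

def clean(rows):
    """Deduplicate, drop incomplete rows, and cast amounts to integers."""
    seen, out = set(), []
    for order_id, category, amount in rows:
        if order_id in seen or category is None:
            continue
        seen.add(order_id)
        out.append((order_id, category, int(amount.strip())))
    return out

# Load the cleaned rows into a (hypothetical) mart fact table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE fact_orders (order_id TEXT PRIMARY KEY, category TEXT, amount INTEGER)"
)
conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", clean(raw_orders))
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM fact_orders").fetchone())  # → (2, 165000)
```

In a real pipeline the same dedupe/validate/cast steps would run inside a scheduled workflow rather than a script, but the shape of the transform is the same.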
Why you will want to work here:
- Tiki’s culture that's open to new ideas and sharing.
- Opportunities to deliver big impacts are everywhere.
- Great opportunity to dive deeply into a big data system.
Your skills and experience:
- Must have specific experience in at least one of the following DBMS environments: Oracle/UNIX & NT, MS SQL Server/NT, Sybase/UNIX and/or Sybase IQ/UNIX, Informix/UNIX, DB2/UNIX (UDB).
- Minimum of one year's experience developing data warehouse, decision support, or executive information systems. Related experience in relevant application software development may be considered.
- Must have a good understanding of Dimensional Modeling and ER Modeling.
- Knowledge of cubes, ROLAP, MOLAP, and HOLAP is a definite plus.
- Must be strong in the fundamentals of DBMS concepts.
- Must be strong in querying data with SQL.
- Experience with data pipeline, workflow management, and big data tools such as Azkaban, Luigi, or Airflow.
- Experience with cloud platforms such as Google Cloud Platform or Amazon Web Services (BigQuery, Dataflow, EC2, EMR, Redshift) is a plus.