JOB DESCRIPTION
- Understand and communicate requirements to stakeholders
- Define detailed tasks for project members
- Lead Data Lakehouse / Big Data projects
- Build ETL pipelines to process structured and unstructured data following best practices
- Develop automated reports and dashboards using Power BI
- Define and document approach and design (code / testing / migration)
- Explain scope / approach / design to customer / internal team as needed
- Assist team members on tasks as needed
- Perform development and bug fixing independently
REQUIREMENTS
- Advanced SQL knowledge and experience with relational databases, including query authoring and familiarity with a variety of database systems
- Mastery of data modeling (star schema, snowflake schema)
- Experience with ETL design and tools such as Azure Data Factory, Synapse, and Databricks
- Experience with object-oriented scripting languages: Python, Java, Scala, etc.
- Experience with big data tools: Hadoop, Spark, Kafka, etc. is a plus
- Experience with visualization tools such as Power BI, Tableau, or other BI tools is beneficial
- Ability to work in a team / collaborative environment as well as independently with the customer
- Willingness to learn and a strong work ethic
- Smart, with a good attitude; logical, reasonable, honest, and straightforward
- Good communication skills
- Good academic track record preferred
QUALIFICATIONS / SALARY
- Degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
- 5+ years of experience