JOB DESCRIPTION
- Understand requirements and communicate them to stakeholders
- Participate in Data Lakehouse / Big Data projects
- Build ETL pipelines to process structured and unstructured data, following best practices
- Develop automated reports and dashboards using Power BI
- Define and document the approach and design (coding, testing, migration)
- Explain scope, approach, and design to customers and internal teams as needed
- Assist team members with tasks as needed
- Carry out development and bug-fixing independently
REQUIREMENTS
- Advanced SQL skills, including query authoring, and experience working with a variety of relational databases
- Knowledge of data modeling (star and snowflake schemas)
- Experience with object-oriented and/or scripting languages: Python, Java, Scala, etc.
- Knowledge of big data tools (Hadoop, Spark, Kafka, etc.) is a plus
- Knowledge of ETL design and platforms such as Data Factory, Synapse, Databricks, Microsoft Fabric, and Snowflake is a plus
- Experience with visualization tools such as Power BI, Tableau, or other BI tools is beneficial
- Ability to work in a collaborative team environment as well as independently with customers
- Willingness to learn and a strong work ethic
- Sharp, logical, and reasonable, with a positive attitude, honesty, and a straightforward manner
- Good communication skills
- Good academic track record preferred
QUALIFICATIONS / SALARY
- Degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
- Proficiency in spoken English