Job Description:
- Experience in Enterprise Data Warehouse technologies (ideally 3-5 years)
- Experience with multi-dimensional data modeling, data architecture, or other work related to the construction of enterprise data assets
- Strong experience implementing ETL processes and building data pipelines
- Experience with Big Data frameworks such as Hadoop and Apache Spark/Databricks, and with Azure big data services (Data Factory, ADLS, Databricks, Synapse)
- Experience tuning Hadoop/Spark parameters for performance, working with Big Data querying tools, and with data modeling and schema design (see the sketch after this list)
- Experience with Master Data Management (MDM) systems
- Knowledge of Salesforce and Oracle NetSuite would be a plus
- Strong SQL programming background with the ability to troubleshoot and tune code
- Proven understanding of, and demonstrable implementation experience with, cloud data platform technologies
- Excellent interpersonal and teamwork skills
- Strong problem-solving, troubleshooting, and analysis skills
- Experience working in a geographically distributed team
- Experience leading and mentoring other engineers
- Good knowledge of Agile Scrum
- Good communication skills
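To give a concrete sense of the ETL and Spark-tuning work referenced above, here is a minimal PySpark sketch of a simple extract-transform-load job with a couple of common performance settings. The table paths, column names, and tuning values are illustrative assumptions, not part of this posting.

```python
# A minimal ETL sketch in PySpark; all paths, columns, and config values
# below are hypothetical placeholders chosen for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Build a session with two frequently tuned settings: shuffle partition
# count (sized to the cluster) and adaptive query execution.
spark = (
    SparkSession.builder
    .appName("example-etl")
    .config("spark.sql.shuffle.partitions", "64")
    .config("spark.sql.adaptive.enabled", "true")
    .getOrCreate()
)

# Extract: read raw order records (hypothetical Parquet location).
orders = spark.read.parquet("/data/raw/orders")

# Transform: keep completed orders and aggregate revenue per day.
daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write a partitioned, curated dataset for downstream consumers.
(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("/data/curated/daily_revenue")
)
```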