Friday, April 16, 2021

Salesforce - Software Engineer - ETL Tools (9-14 yrs) (Salesforce)

ROLE BRIEF :


- Enterprise Data Services (EDS) is part of Business Technology and functions with the goal of delivering technology that is centered around our business and collective success.

- We (the BT Enterprise Data Services team) are an agile team that owns the Enterprise Data Platform and is responsible for the delivery of reporting, performance metrics, and self-service analytics to various business domains. Activities also include management of on-premises & cloud platforms.


- We oversee technology strategy, Salesforce on Salesforce, customer and partner enablement, collaboration, applications engineering, architecture, governance, and program enablement for Data and Analytics. We are responsible for the delivery of operational reporting and performance metrics to various business domains including Sales and Sales Operations, Marketing, Finance, and Employee Success.

- The team also manages all aspects of rolling out Salesforce's analytics tools and solutions to internal business partners.

- We value transparency and trust.

- The Salesforce BT EDS team is looking for a Senior Data-Centric Solution Engineer to integrate, optimise, and maintain data across heterogeneous data warehouses involving varied logical database models, ETL tools, and reporting technologies.


- For this role, you should be adept at analysing system requirements, performing fit-gap analysis, and implementing migration methods for existing data, along with strong hands-on experience working with multiple databases, ETL tools, and reporting tools.


- You should have good knowledge of cloud platform data storage models and petabyte-scale enterprise data lake implementations, and be familiar with normalised & de-normalised data models.


- Ultimately, you will build world-class data solutions and applications that power crucial business decisions throughout the organisation.

Responsibilities :

- Lead the design and development of on-premises/cloud-based analytical solutions, data models, data pipelines, and transformations using any ETL tool or next-gen technologies (Python, Spark, Snowflake, etc.); a minimal pipeline sketch follows this list

- Review solution designs and ensure they meet the defined guidelines & standards

- Understand and incorporate the required security standards across the development cycle

- Define standards and procedures; refine methods and techniques for the database, transformation & visualisation layers

- Perform prompt root-cause analysis of technical issues to ensure quick turnaround

- Ensure seamless delivery of assigned tasks

- Ensure the quality assurance plan and test cases are comprehensive enough to validate the solution thoroughly

- Support QA, UAT and performance testing phases of the development cycle

- Provide technical leadership and oversee the delivery of solutions developed by vendors

- Create/review technical documentation for all the solutions designed/developed

- Work closely with all cross-functional teams to deliver coordinated software solutions
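
To illustrate the kind of pipeline work described in the first responsibility above, here is a minimal PySpark extract-transform-load sketch. The bucket paths, column names, and dedup key are hypothetical placeholders, and an ETL tool such as Informatica/Matillion or a Snowflake-native pipeline could serve the same purpose; this is an illustration, not a prescribed implementation.

# Minimal PySpark ETL sketch: extract a raw CSV drop, apply simple
# transformations, and load the result as partitioned Parquet.
# All paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("eds_etl_sketch").getOrCreate()

# Extract: read a raw daily extract (placeholder path).
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

# Transform: normalise a column name, cast its type, and drop duplicate rows.
clean = (
    raw.withColumnRenamed("Order Amount", "order_amount")
       .withColumn("order_amount", F.col("order_amount").cast("double"))
       .dropDuplicates(["order_id"])
)

# Load: write partitioned Parquet for downstream reporting and BI tools.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)

spark.stop()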

Required Skills :

- Bachelor's degree in Computer Science, MIS, or a related discipline, with 10+ years of related information systems experience in data warehousing and delivery of BI solutions

- Deep understanding of data warehousing concepts, relational & dimensional data model designs

- Hands-on experience in data migrations and data lake design in the cloud using Snowflake, Python, Spark, Informatica/Matillion, and Tableau/Tableau CRM is highly preferred

- Solid expertise in SQL and Unix shell scripting

- Good experience with AWS data-related services

- Create descriptive, predictive, and prescriptive analytics solutions using Tableau, Tableau CRM & Python

- Good knowledge of automated/manual testing mechanisms (regression, performance, and integration testing)

- Develop key metrics for data validation and ensure the integrity of that data

- Expertise in job scheduling engines (e.g., Tidal, Airflow) is preferred; a minimal scheduling sketch follows this list

- Experience working in an agile environment is required

- Excellent team player, able to work with virtual, global, and cross-functional teams at all levels

- Self-starter, highly motivated, able to shift direction quickly when priorities change, think through problems to come up with innovative solutions, and deliver against tight deadlines

- Collaborate with internal/external stakeholders as required to meet project objectives

- Excellent spoken and written communication as well as receptive listening skills, with the ability to present complex ideas in a clear, concise fashion to all kinds of audiences

- Excellent interpersonal skills to build strong relationships, which will be critical to success in this role
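
As a small illustration of the job-scheduling skill noted above, below is a minimal Apache Airflow DAG sketch with two ordered placeholder tasks. The DAG id, schedule, and callables are hypothetical and only indicate how a daily pipeline might be orchestrated; Tidal or another scheduler could be used instead.

# Minimal Airflow sketch: a daily DAG with two ordered tasks standing in
# for an extract step and a load step. DAG id, schedule, and callables
# are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step.
    print("extract step")


def load():
    # Placeholder load step.
    print("load step")


with DAG(
    dag_id="eds_daily_pipeline_sketch",  # hypothetical DAG id
    start_date=datetime(2021, 4, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the load step only after the extract step completes.
    extract_task >> load_task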

Desired Skills :

- DevOps scripting experience with Ansible, Puppet, or Chef

- DevOps methodology and CI/CD processes

- Basic knowledge of the Hadoop ecosystem: HDFS, Hive, Pig, Sqoop, Kafka, etc.

- Prior experience in GCP or cloud-based data warehousing is a plus

- Salesforce CRM

- Understanding of machine learning, dataset training, and deployment of statistical models

- Salesforce, the Customer Success Platform and world's #1 CRM, empowers companies to connect with their customers in a whole new way. We are the fastest growing of the top 10 enterprise software companies, the World's Most Innovative Company according to Forbes, and one of Fortune's 100 Best Companies to Work For six years running.


- The growth, innovation, and Aloha spirit of Salesforce are driven by our incredible employees who thrive on delivering success for our customers while also finding time to give back through our 1/1/1 model, which leverages 1% of our time, equity, and product to improve communities around the world. Salesforce is a team sport, and we play to win.

Apply Now
