Vacancy expired!
- Create optimized and scalable data models (star schemas, snowflake schemas, etc.) for enterprise analytics solutions.
- Experience with Databricks / Snowflake.
- Experience with APIs, Python, and Java.
- Work experience with data ingestion from Salesforce.
- Develop ELT/ETL pipelines and implement best practices for ELT/ETL development.
- Create highly optimized data pipelines for very large data sets.
- Work effectively in a Scrum team with multiple members to deliver analytical solutions to business functions.
- Have a high sense of urgency to deliver projects as well as troubleshoot and fix data queries/issues.
- Always be on the lookout to automate and improve existing data processes for quicker turnaround and higher productivity.
- Experience designing complex data models and data engineering solutions for large-scale data warehouses/data lakes built from various heterogeneous data sources.
- Experience with data integration, business intelligence, and analytics tools, and/or other open-source and self-service analytics tools (e.g., Pentaho Data Integration).
- Strong database management system knowledge; experience with Microsoft SQL Server, PostgreSQL, MySQL, Oracle, and NoSQL databases is required.
- Experience with the AWS technology stack, including S3, Redshift, Glue, RDS, SageMaker, or similar solutions.
- Experience designing data models with Salesforce, Workday, Marketo, Gainsight, and Adobe Analytics is preferred.
- Experience with Agile development methodologies (Scrum, pair programming).