Vacancy expired!
- Worked on all phases of the data warehouse development lifecycle, from requirements gathering through testing, implementation, and support.
- Worked extensively with Unix shell scripting, Python, and SQL Server, with a strong focus on performance management.
- Two years of experience with the Snowflake cloud data warehouse.
- Worked on the Snowflake cloud data warehouse and integrated an automated, generic Python framework to process XML, CSV, JSON, TSV, and TXT files.
- Developed a Python framework for data ingestion into Snowflake.
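A generic framework like the one described above typically dispatches each incoming file to a parser by extension before staging the rows for Snowflake. The sketch below is illustrative only; the function and table names (`parse_file`, `PARSERS`) are assumptions, not details from this posting, and the actual Snowflake load step (e.g. `COPY INTO` via the Snowflake Python connector) is omitted.

```python
# Minimal sketch of a generic multi-format ingestion framework: pick a parser
# by file extension and return uniform row dicts ready to stage for Snowflake.
# All names here are illustrative assumptions, not from the original posting.
import csv
import io
import json
import xml.etree.ElementTree as ET


def _parse_delimited(text, delimiter):
    # CSV/TSV/TXT share one code path; only the delimiter differs.
    return list(csv.DictReader(io.StringIO(text), delimiter=delimiter))


def _parse_json(text):
    data = json.loads(text)
    # Normalize a single JSON object into a one-row list.
    return data if isinstance(data, list) else [data]


def _parse_xml(text):
    # Treat each child of the root element as one record.
    root = ET.fromstring(text)
    return [{child.tag: child.text for child in record} for record in root]


PARSERS = {
    "csv": lambda t: _parse_delimited(t, ","),
    "tsv": lambda t: _parse_delimited(t, "\t"),
    "txt": lambda t: _parse_delimited(t, "|"),  # assumption: pipe-delimited TXT
    "json": _parse_json,
    "xml": _parse_xml,
}


def parse_file(name, text):
    """Return a list of row dicts for any supported file type."""
    ext = name.rsplit(".", 1)[-1].lower()
    try:
        return PARSERS[ext](text)
    except KeyError:
        raise ValueError(f"unsupported file type: {ext}")
```

In a real pipeline the returned rows would be written to a stage and loaded with `COPY INTO`, but the parser dispatch table is the part that makes the framework "generic" across the five listed formats.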
- Excellent hands-on experience troubleshooting problems and improving the performance of application processes through debugging, tuning, and tracing.
- Designed and implemented a fully operational, production-grade, large-scale data solution on the Snowflake data warehouse.
- Worked with structured and semi-structured data ingestion and processing on AWS using S3 and Python; migrated on-premises big data workloads to AWS.
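S3-based ingestion of this kind usually starts by listing the objects under a prefix and filtering to the file types the pipeline can load. The sketch below assumes a boto3-style client (`get_paginator("list_objects_v2")`); the client is injected as a parameter so the filtering logic can be exercised without AWS credentials, and the function name `ingestable_keys` is an illustrative assumption.

```python
# Hedged sketch of discovering S3 objects for ingestion, assuming a
# boto3-style client. Injecting the client keeps the logic testable offline.

def ingestable_keys(s3_client, bucket, prefix, extensions=(".csv", ".json")):
    """Return the S3 keys under `prefix` whose suffix marks them for ingestion."""
    keys = []
    # list_objects_v2 returns at most 1000 keys per call, so paginate.
    paginator = s3_client.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            if obj["Key"].endswith(tuple(extensions)):
                keys.append(obj["Key"])
    return keys
```

With a real client (`boto3.client("s3")`), the returned keys would feed the download-and-parse stage of the pipeline.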
- Experience implementing complex data integration and ingestion with structured and unstructured data sets.
- Designed and implemented secure data pipelines into a Snowflake data warehouse from on-premises and cloud data sources.
- Created best practices and standards for data pipelining and integration with Snowflake data warehouses.
- Experience building data ingestion pipelines using Informatica.
- Designed, developed, and implemented advanced data pipelines that bring together data from disparate sources, making it available to data scientists, analysts, and other users, using a variety of programming languages (Python).
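Bringing together data from disparate sources, as the bullet above describes, often reduces to joining record sets from different systems on a shared key before publishing them to analysts. This is a minimal sketch of that consolidation step; the function name `merge_sources` and the left-join semantics are assumptions for illustration, not tooling named in the posting.

```python
# Illustrative sketch: consolidate row dicts from two source systems into one
# view keyed on a shared identifier (a left join on `key`).

def merge_sources(primary, secondary, key):
    """Left-join two lists of row dicts on `key`; primary rows always survive."""
    # Index the secondary source once for O(1) lookups per primary row.
    index = {row[key]: row for row in secondary}
    merged = []
    for row in primary:
        combined = dict(row)                     # copy so sources stay untouched
        combined.update(index.get(row[key], {})) # enrich when a match exists
        merged.append(combined)
    return merged
```

At warehouse scale this join would normally happen in SQL inside Snowflake; the in-Python version just shows the shape of the consolidation.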
- Understanding of multi-process architecture.
- Analytical skills.