Vacancy expired!
Job Description: Position Summary
This person will lead the technical design and development of software and will be the single point of contact for a given project. They will be responsible for the high-level design and architecture of Hadoop technologies. This is an individual contributor role with no direct reports.

Required Skills
- Sound understanding of and experience with the Hadoop ecosystem (Cloudera). Able to explore the constantly evolving tools within the Hadoop ecosystem and apply them appropriately to the problems at hand.
- 3-5 years of experience in Unix shell scripting
- Must have knowledge of the standard Agile ceremonies: release planning, backlog grooming, pre-sprint (PI) planning, daily stand-up, post-stand-up, sprint design sessions, demos, retrospectives, and celebrations.
- 3-5 years of experience working on a Big Data implementation in a production environment
- 3-5 years of experience with HDFS, MapReduce, Hive, Impala, Spark, and Linux/Unix technologies
- Strong IT consulting experience across various data warehousing engagements, handling large data volumes and architecting big data environments.
- Ability to support multiple projects with competing deadlines
- Previous experience in the financial services industry
- Previous experience migrating workloads from legacy SQL systems to Hadoop
- Understanding of industry trends and relevant application technologies
- Experience in designing and implementing analytical environments and business intelligence solutions
- Experience working in an Agile development shop
- Able to analyze existing shell script/Python/Perl code to debug issues or enhance the code
- Sound knowledge of relational databases (SQL) and experience with large SQL-based systems.
- Deep understanding of algorithms, data structures, performance optimization techniques, and software development in a team environment.
- Ability to benchmark and debug critical issues with algorithms and software as they arise.
- Lead and assist with the technical design/architecture and implementation of the big data cluster in various environments.
- Able to guide and mentor the development team, for example by creating custom common utilities and libraries that can be reused across multiple big data development efforts.
- Exposure to ETL tools (e.g., DataStage) and NoSQL databases (HBase, Cassandra, MongoDB) is desirable
- Work with line-of-business (LOB) personnel, external vendors, and the internal Data Services team to develop system specifications in compliance with corporate standards for architecture adherence and performance guidelines.
- Provide technical resources to assist in the design, testing and implementation of software code and infrastructure to support data infrastructure and governance activities.