Vacancy expired!
Your Opportunity

Would you like to be part of a new team chartered to build the next-generation data & analytics platform supporting Schwab Technology Services? The Customer Data Team in Data and Rep Technology (DaRT) is looking for a Software Engineer to help develop and enable the strategic use of DaRT data assets. Our ideal candidate is enthusiastic about learning new and existing technologies in order to deliver exceptional software solutions. You need proven critical thinking skills and a laser focus on pragmatic problem solving. We require strong ethics and the ability to partner with and influence business stakeholders and technologists across the organization. You should have a strong background in both data architecture and data engineering, along with a passion for learning new data integration techniques.

What you are good at
- Collaborating directly with business and technology stakeholders to define future-state business capabilities & requirements, and translating those into transitional and target state data architectures
- Analyzing the current technology environment to detect critical deficiencies and recommending solutions for improvement
- Designing, implementing, and maintaining data warehouses and near real-time data pipelines via the practical application of existing and new data engineering techniques
- Developing continuous integration and continuous deployment pipelines for data solutions that include automated unit & integration testing
- Mentoring, motivating, and supporting the team to achieve organizational objectives and goals
- Advocating for agile practices to increase delivery throughput
- Ensuring consistency with published development, coding and testing standards

What you have
- 2+ years of experience designing, building, and supporting near real-time data pipelines and analytical solutions using Hadoop, Teradata, MS SQL Server, Talend, Informatica, and/or SSIS
- 1+ years of experience working on agile teams delivering data solutions
- 2+ years of experience building data pipelines and interfaces with object-oriented languages (.Net, Java, Python)
- 1+ years of experience modeling star schema data warehouses using Kimball dimensional modeling techniques
- Knowledge of delivering solutions on public cloud platforms (Google Cloud preferred)
- Experience writing automated unit, integration, and acceptance tests for data interfaces & data pipelines
- Ability to quickly learn & become proficient with new technologies
- Exceptional interpersonal skills, including teamwork, communication, and negotiation