Vacancy expired!
- Work with cutting-edge technologies and business models
- Contribute to brand-new, patent-worthy concepts and products.
- Be part of small, self-empowered teams.
- Participate in customizable skill-level and personal development training.
- Opportunity to identify, research, and feed the development of new and experimental products.
- Freedom to utilize different technologies, languages, and frameworks that apply to the problem being solved.
- Influence and inform solution design efforts that consider performance, risk mitigation, user experience, and testability.
- Participate in Design Thinking to identify personas, develop problem-solving ideas, and pitch them to leadership as a team.
- Competitive Benefits, Pay, and Bonus Potential.
- STEM Mentoring Opportunities: Give back to the community in your area of expertise through volunteering at STEM events for students!
- Local volunteer opportunities.
- 401k plan
- A Learning Culture: Mentoring, Tuition Reimbursement, Health Initiatives, and more!
- Apply skills, tools, security processes, applications, environments, and programming languages to complete complex assignments.
- Understand, develop, and maintain data-movement scripts for storing, retrieving, and acting on data housed in the AWS Cloud
- Test requirements for the movement, replication, synchronization, and validation of data
- Identify ways to automate and improve upon existing automation
- Develop and improve monitoring solutions
- Be willing to take on special assignments that may require additional learning
- Understanding of programming (e.g., Python) and database functionality (e.g., SQL, NoSQL)
- Understanding of compute environments, including but not limited to Linux, mainframe, and public cloud
- Understanding of Application Programming Interfaces (APIs)
- Data Validation and Qualitative and Quantitative Analysis
- Strong understanding of database technologies such as IBM Db2, PostgreSQL, AWS RDS, Redshift, and Aurora
- Certifications in AWS Cloud technologies
- Advanced Python and SQL functional experience
- Linux experience with strong Bash scripting; Python with pandas DataFrames; and Spark experience with PySpark or Scala Spark
- AWS Cloud experience with Glue, Lambda, DMS, Step Functions, Redshift, and API Gateway
- Experience building and using CI/CD pipelines, leveraging tools like GitLab CI/CD
- SAS or R
- Python with pandas DataFrames
- Pandas Profiling
- Spark with PySpark or Scala Spark (DataFrame-driven development)
- SQL clients, e.g., DBeaver, WinSQL, pgAdmin, SQL Workbench
- Jupyter Notebooks
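Several items above center on DataFrame-driven data validation with pandas. As a minimal sketch of that kind of check (the column names, sample values, and the non-negative-balance rule are all hypothetical, not taken from this posting):

```python
import pandas as pd

# Hypothetical sample standing in for data housed in AWS RDS/Redshift.
records = pd.DataFrame({
    "account_id": [101, 102, 103, 104],
    "balance": [250.0, -40.0, 0.0, 1200.5],
})

def validate_balances(df: pd.DataFrame) -> pd.DataFrame:
    """Return the rows that violate a simple non-negative balance rule."""
    return df[df["balance"] < 0]

failures = validate_balances(records)
print(failures["account_id"].tolist())  # IDs of failing rows
```

A quality check of this shape slots naturally into a Glue or Lambda step, with the same logic expressed against a Spark DataFrame when the data volume calls for PySpark.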