Vacancy expired!
- Develop an understanding of the multi-channel cyber optimization problem space
- Perform exploratory data analysis to understand relationships, identify opportunities to influence outcomes, and determine how to attribute cross-channel outcomes
- Specify a research plan
- Run experiments and extract the necessary data
- Interpret the models being generated
- Develop a proof of concept to verify your ideas
- Assist the Data Engineering team in setting your proof of concept live in our production environment
- Close the loop to make sure that the proposed solution performs as it should once released
- Bachelor's or Master's degree in a quantitative field (engineering, mathematics, physics, machine learning, statistics or computer science)
- Good problem decomposition skills and autonomy when faced with solving data problems. A PhD in a quantitative field (engineering, mathematics, physics, machine learning, statistics, or computer science) is one way to acquire this experience. Candidates with a Master's degree will still be considered, provided they can demonstrate the required ability.
- At least three years of industry experience outside of academia
- Experience manipulating big data using open-source frameworks, meaning familiarity with tools such as Spark, Kafka, Hadoop, Impala, and Luigi. In addition, because we handle very large volumes of data, past work in a cloud-based environment is required.
- Knowledge of current modeling tools such as XGBoost or Vowpal Wabbit (or equivalent)
- Ability to write high-quality Python code, including familiarity with unit testing, source control, and code review
- Experience working in Agile teams and using Scrum to organize work
- Demonstrated communication skills. We ask that you be able to explain your insights to people beyond the data science team, which means knowing how to craft your message and select the right visualizations
- Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future
- Travel up to 20% (While 20% of travel is a requirement of the role, due to COVID-19, non-essential travel has been suspended until further notice)
- A proven ability to deploy automated data processing pipelines
- A deep interest in Data Science and AI. Good measures of this include competition participation (Kaggle or other) and enrollment in open courses
- If you have code in the open domain (for example, on GitHub) or have written about AI/DS, please share this with us
- Some experience in cyber security is a plus