This 24-month internship programme offers a transformative journey as we work to future-proof our team's skills and capabilities while meeting world-class delivery standards across digitisation, technology and operations in our drive to win. We're looking for talented individuals who are obsessed with data and inspired by solving real customer problems: analysing data, identifying trends and building bespoke solutions for our customers. The programme is tailor-made, with a strong focus on learning on the job, self-learning and technical training. If you have what it takes, then… you're Good to Go.
QUALIFICATIONS AND QUALITIES:
Essential
- Degree in a relevant technical field (Actuarial Science, Computer Science, Electronic Engineering, Mathematics, Applied Mathematics, Financial Mathematics, Statistics, Informatics, Information Systems)
- Minimum of 60% average over all years of study
- SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of databases
- Working knowledge of architectures to support advanced analytics and data science
- Basic analytical skills for working with unstructured datasets
- Knowledge and experience of using data to deliver business insights and transform delivery in an FMCG context
- Experience building processes that support data transformation, data structures, dependency management and workload management
- A successful history of manipulating, processing and extracting value from large, disconnected datasets
- Working knowledge of message queuing, stream processing, and highly scalable data stores
- Experience supporting and working with cross-functional teams in a dynamic environment
- Passion for empirical research and for answering hard questions with data
- Ability to apply an agile analytic approach that allows for results at varying levels of precision
- Strong communication skills, particularly the ability to communicate complex quantitative insights to business leaders in a precise and actionable manner
- An emerging track record of solving analytical problems using quantitative and machine learning approaches
- Knowledge of common machine learning techniques such as Random Forests, Boosting, Regularized Regression and Naïve Bayes classifiers (a minimal example is sketched after this list)
- Knowledge of advanced machine learning techniques such as deep neural networks, support vector machines, reinforcement learning and Bayesian networks
- Knowledge of classical statistics (regression, clustering, optimisation, time series, probability)
- Knowledge of testing and measurement (A/B, multivariate and inferential measurement, e.g. Causal Impact)
- Knowledge of, and coding experience in, R, R Shiny and Python
- Knowledge of data visualisation concepts in reports (Power BI) and specialist tools (D3 or equivalent)
- Knowledge of extracting and combining complex, high-volume, high-dimensional data from multiple sources (enterprise, proprietary, IoT, public domain), including unstructured data (comment threads, audio, video)
- Knowledge of working with large datasets; experience with distributed computing tools (Apache Spark, Hive, Impala) a plus
- Knowledge of working in Microsoft Azure and of scaling analytic products over GPUs in the cloud
- Self-directed
- Natural curiosity, openness to possibilities and the imagination to create novel business solutions
- Ability to work collaboratively with peers and demonstrate vertical and lateral influence
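For a concrete flavour of the techniques named above, the sketch below shows one of them, a Random Forest classifier, trained with scikit-learn on synthetic data; the dataset, features and parameters are illustrative assumptions only, not part of the requirements.

```python
# Minimal illustrative sketch: training and evaluating a Random Forest
# classifier (scikit-learn). The data is synthetic, standing in for
# hypothetical customer-behaviour features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Generate a made-up binary classification problem
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Fit the forest and score it on the hold-out split
model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(f"Hold-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```

The same split-fit-evaluate pattern carries over when the features come from real customer, transactional or IoT data.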
Professional
- Machine learning forecasting techniques
- Statistical modelling (a simple illustration is sketched after this list)
- Operational research and supply chain
- Optimisation techniques and tools
- Manipulating multi-source data
- Python coding
- Cloud architecture (preferably MS Azure)
- Simulation packages, e.g. AnyLogic
- Distributed computing (Hadoop, Spark)
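As a hypothetical illustration of the statistical modelling and forecasting skills above, this sketch fits a linear trend to a synthetic monthly demand series with NumPy and extrapolates it three months ahead; all figures are invented.

```python
# Minimal illustrative sketch of a simple statistical forecast:
# ordinary least-squares trend fit on synthetic demand data.
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(24)                                    # two years of history
demand = 100 + 2.5 * months + rng.normal(0, 5, size=24)   # trend plus noise

# Degree-1 polynomial fit of demand on time; coefficients come back
# highest degree first, i.e. (slope, intercept)
slope, intercept = np.polyfit(months, demand, deg=1)

horizon = np.arange(24, 27)                               # next three months
forecast = intercept + slope * horizon
print("3-month forecast:", np.round(forecast, 1))
```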
Key Capabilities
- Business Acumen
- Data Science
- Machine Learning
- Programming
- Statistical Analysis
- Data Analytics
- Data Expertise
- NLP / NLG
- Model re-engineering
- Model Deployment
- Python
- Cloud programming
- Data Warehouses, Data Lakes