Machine Learning is a form of artificial intelligence that uses algorithms to make predictions toward defined goals. Unlike traditional algorithms, which must be explicitly coded to execute predefined instructions, ML algorithms learn (self-adjust) through a process of trial and error known as training.
According to Deloitte, artificial intelligence is the next frontier for investment management firms. Deloitte identifies four pillars of transformation that can empower firms to develop new propositions and deliver new kinds of value: generating alpha, enhancing operational efficiency, improving product and content distribution, and managing risk. Yet, according to a 2019 CFA Institute survey, only 10% of the portfolio managers who responded had used ML techniques. The major challenges in applying AI/ML, according to the survey, are upfront and ongoing costs, a shortage of talent, the rapidly evolving AI technology landscape, lack of vision, and time.

At DataArt, our experience confirms these findings. It takes years to develop a solid Machine Learning practice, keep up with rapidly changing technologies and approaches, and grow and scale talent. But there is good news for those who would like to test their ideas quickly: with modern cloud-based technologies and a broad set of ML tools, the cost of experimentation drops considerably. Companies no longer have to invest in technology upfront or hire and grow in-house talent.
With the help of an experienced consultant, you can test your hypotheses quickly and decide whether to invest in ML technologies. DataArt helps accelerate your experimentation in the early stages, then builds the necessary infrastructure and transfers knowledge in house at later stages of your ML journey.
Introducing new AI/ML tools for superior portfolio modeling
There are four major steps in the ML journey. DataArt can help with any or all of these:
Start by developing a business vision and ideas. Identify a problem in one of four categories: increasing alpha, improving operational efficiency, strengthening risk management, or creating new investment products. DataArt can help with an initial feasibility assessment and provide implementation options to test these ideas with a shorter time to market and a lower implementation cost.
Set up ML infrastructure. The infrastructure required for Machine Learning differs significantly from traditional infrastructure set up for application development. Since machine learning is fully driven by data, it requires a data infrastructure that can support virtually any volume of data and allow that data to be queried and analyzed quickly. The good news is that modern cloud providers offer mature solutions that enable Big Data capabilities in a matter of minutes. Amazon S3, Azure Data Lake Store, and Google Cloud Storage come with data access solutions that allow you to run queries directly on raw data ingested into storage. A data lake (your Big Data storage) can be up and running in a matter of hours, with data flowing and available for analysis in a matter of days.
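As a minimal illustration of what "raw data ingested into storage" can look like, the sketch below builds a Hive-style partitioned object key, the layout that query engines such as Amazon Athena use to prune partitions when querying data lakes on S3. The bucket layout, data source name, and file format here are hypothetical examples, not a prescribed convention:

```python
from datetime import date

def market_data_key(source: str, symbol: str, as_of: date) -> str:
    """Build a Hive-style partitioned object key (year=/month=/day=),
    so query engines such as Amazon Athena can prune partitions by date."""
    return (
        f"raw/{source}/"
        f"year={as_of.year}/month={as_of.month:02d}/day={as_of.day:02d}/"
        f"{symbol}.csv"
    )

# Hypothetical example: end-of-day prices for AAPL ingested on 2020-03-02
key = market_data_key("eod-prices", "AAPL", date(2020, 3, 2))
print(key)  # raw/eod-prices/year=2020/month=03/day=02/AAPL.csv
```

With objects laid out this way, a date-filtered query only scans the matching partitions instead of the whole bucket, which is what makes "query raw data quickly" practical at scale.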
Implement an initial ML solution. To start experimenting with ML models, you do not need a full ML infrastructure. Most Machine Learning today is done using notebooks. Analytical notebooks (such as Jupyter Notebook and Apache Zeppelin) are web-based environments that enable interactive data analysis. The beauty of notebooks is that they make it possible to run data science experiments and explore the results with stakeholders interactively. An example notebook is available on the DataArt demo page. It took several days to collect data, implement the notebook, and provide visual results for stakeholders to analyze. The notebook's functionality is described below.
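To show the kind of interactive exploration a notebook enables, here is a sketch of two typical notebook cells: computing daily returns from prices and then cumulative performance. The price data is synthetic (randomly generated), a stand-in for whatever market data you ingest:

```python
import numpy as np
import pandas as pd

# Synthetic daily prices for three tickers (placeholder for real market data)
rng = np.random.default_rng(42)
dates = pd.bdate_range("2020-01-01", periods=250)
prices = pd.DataFrame(
    100 * np.exp(np.cumsum(rng.normal(0.0005, 0.01, (250, 3)), axis=0)),
    index=dates,
    columns=["AAA", "BBB", "CCC"],
)

# Typical notebook cells: daily returns, then cumulative performance
returns = prices.pct_change().dropna()
cumulative = (1 + returns).cumprod() - 1

print(cumulative.tail(1).round(4))
# In a notebook, cumulative.plot() would render the chart inline
# for stakeholders to inspect interactively.
```

In a notebook each of these steps lives in its own cell, so analysts can tweak a parameter, re-run one cell, and immediately see the updated chart.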
This notebook implements PCA (principal component analysis), a method of multivariate analysis traditionally used to reduce the dimensionality of a multivariate data table and express the information as a small set of new variables called principal components. It provides an example of portfolio modeling and makes it possible to set the risk distribution over the PCA components, visualize portfolio performance using interactive plots, and calculate weights with the help of eigenvalues and eigenvectors.
The Machine Learning model that is used behind this notebook supports several input parameters:
Up to 43 S&P 500 companies weighted by capitalization;
My portfolio: Risk makes it possible to distribute the risk over the PCA components and specify a risk level different from the S&P 500;
My portfolio: Longs makes it possible to increase or decrease long positions in the portfolio;
My portfolio: Shorts makes it possible to increase or decrease short positions in the portfolio. The value is set to zero by default as a recommended option for long-term investment.
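The core of this kind of PCA portfolio modeling can be sketched in a few lines. The example below is not the actual model behind the demo; it uses synthetic returns for ten stocks, extracts principal components via eigen-decomposition of the covariance matrix, and derives one simple (long-only, hypothetical) weighting from the first eigenvector:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic daily returns for 10 stocks (stand-in for S&P 500 constituents)
returns = rng.normal(0.0005, 0.01, size=(500, 10))

# PCA via eigen-decomposition of the covariance matrix of returns
cov = np.cov(returns, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)      # eigh returns ascending order
order = np.argsort(eigenvalues)[::-1]                # re-sort descending by variance
eigenvalues = eigenvalues[order]
eigenvectors = eigenvectors[:, order]

# Share of total variance (risk) explained by each principal component
explained = eigenvalues / eigenvalues.sum()

# One simple illustrative scheme: weight stocks by their (absolute) loading
# on the first principal component, normalized so the weights sum to 1
w = np.abs(eigenvectors[:, 0])
w = w / w.sum()

print(np.round(explained[:3], 3))  # variance share of the top 3 components
```

Redistributing risk over the components, as the "My portfolio: Risk" parameter does, amounts to choosing how much of the portfolio's variance is allocated to each eigenvector before mapping back to stock weights.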
The results are displayed on an interactive plot showing the newly constructed portfolio. You can see that higher risk leads to higher performance. If you take less risk or add shorts, you can see that it is possible to construct a portfolio with approximately the same performance as the S&P 500 but with less risk.
Please note that the PCA model demo should not be used as a standalone tool for making investment decisions. It was trained using historical data only and cannot predict the future in any way. Past performance is no guarantee of future results. Neither should the PCA portfolio demo or the ML approach be taken to constitute investment advice.
Evaluate results and make further decisions. Steps 1-3 can be executed in a matter of weeks or a few months, with controlled costs and without upfront investment in infrastructure and talent, letting you validate business ideas quickly. New business advantages will go to firms whose business ideas are executed in a matter of days or hours, not weeks or months. Unless you enable quick experimentation, more agile firms will seize the competitive advantage.
At DataArt's finance AI competency center, we provide consulting services and help clients build key AI and ML capabilities by implementing the right data platform architectures, providing interim data science and data engineering talent, designing and ramping up data science processes, and implementing data science platforms and AI-infused systems.
Contact our Principal Technical Consultant Oleg Komissarov to discuss your AI and data science needs and initiatives: email@example.com.