
Recent advancements in NLP and Deep Learning: A Quant's Perspective

Description

There is a gold rush among hedge funds for text mining algorithms that quantify textual data and generate trading signals. Harnessing the power of alternative data sources has become crucial for finding novel ways to enhance trading strategies.

With the proliferation of new data sources, natural language data has become one of the most important sources for capturing public sentiment and opinion about market events, which can then be used to predict financial markets.

The talk is split into five parts:

  • Who is a quant and how do they use NLP?
  • How has deep learning changed NLP?
  • Let’s get dirty with word embeddings
  • Performant deep learning layer for NLP: The Recurrent Layer
  • Using all that to make money

1. Who is a quant and how do they use NLP?

Quants use mathematical and statistical methods to create algorithmic trading strategies.

Due to recent advances in available deep learning frameworks and datasets (time series, text, video, etc.), together with the decreasing cost of parallelisable hardware, quants are experimenting with various NLP methods applicable to quantitative trading.

In this section, we will review the brief history of the text mining work that quants have done so far, along with recent advancements.

2. How has deep learning changed NLP?

In recent years, data representation and modeling methods have vastly improved. For textual data, for example, distributional vectors are far more efficient to work with than high-dimensional sparse matrices, which suffer from the curse of dimensionality.

In this section, I will talk about distributional vectors, a.k.a. word embeddings, and the recent neural network architectures used when building NLP models.
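
To make the contrast concrete, below is a minimal sketch in Python with NumPy. The toy vocabulary and the randomly initialised vectors are illustrative stand-ins, not trained embeddings:

```python
import numpy as np

# Toy vocabulary; real corpora have 10^5-10^6 distinct words.
vocab = ["rates", "fed", "hike", "earnings", "guidance"]
V = len(vocab)

# One-hot (sparse) representation: dimensionality grows with the vocabulary,
# and every pair of distinct words is equally dissimilar (dot product = 0).
one_hot = np.eye(V)
print(one_hot[vocab.index("fed")] @ one_hot[vocab.index("hike")])  # 0.0

# Dense distributional vectors: a fixed, low dimension (typically 100-300)
# whose geometry can encode similarity learned from context.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(V, 100))  # stand-in for trained vectors
v_fed = embeddings[vocab.index("fed")]
v_hike = embeddings[vocab.index("hike")]
cosine = v_fed @ v_hike / (np.linalg.norm(v_fed) * np.linalg.norm(v_hike))
print(round(float(cosine), 3))  # meaningful only after training
```

After training on co-occurrence statistics, the cosine similarity between dense vectors reflects how words are actually used in context, which the one-hot representation cannot express.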

3. Let’s get dirty with word embeddings

Models such as Word2vec or GloVe help us create word embeddings from large unlabelled corpora. These embeddings represent the relations between words and their contextual relationships in numerical vector spaces, and the representations work not only for words but can also be used for phrases and sentences.

In this section, I will talk about the inner workings of these models and the points to watch when creating domain-specific embeddings (e.g. for sentiment analysis in the financial domain).
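
As a rough sketch of what this looks like in practice, the snippet below trains a toy skip-gram Word2vec model with gensim. The corpus and hyperparameters are placeholders; a domain-specific model would instead be trained on a large financial corpus such as news, filings, or call transcripts:

```python
from gensim.models import Word2Vec

# Tiny pre-tokenised corpus, repeated so the toy model has data to learn from.
sentences = [
    ["fed", "raises", "rates", "amid", "inflation"],
    ["company", "beats", "earnings", "estimates"],
    ["guidance", "cut", "shares", "fall"],
    ["rates", "hike", "pressures", "bond", "prices"],
] * 50

model = Word2Vec(
    sentences,
    vector_size=100,  # embedding dimension
    window=5,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,             # skip-gram; sg=0 would use CBOW instead
    epochs=20,
)

vector = model.wv["rates"]              # 100-dimensional word vector
print(model.wv.most_similar("rates"))   # nearest neighbours in the space
```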

4. Performant deep learning layer for NLP: The Recurrent Layer

Recurrent Neural Networks (RNNs) can capture and retain information seen earlier in a sequence (context), which is important for dealing with unbounded context in NLP tasks.

Long Short-Term Memory (LSTM) networks, a special type of RNN, can retain context even across long-term dependencies, i.e. when related words are far back in the sequence.

In this talk, I will compare LSTMs with other deep learning architectures and look at the LSTM unit from a technical point of view.
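
As an illustration of where the recurrent layer sits in a full model, here is a minimal Keras sketch of an LSTM-based text classifier. The sizes and the binary bullish/bearish target are assumptions made for the example:

```python
import tensorflow as tf

# Hypothetical sizes; a real model is tuned to the corpus and task.
vocab_size, embed_dim, max_len = 20_000, 100, 200

model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_len,)),                  # padded token-id sequences
    tf.keras.layers.Embedding(vocab_size, embed_dim),  # word embeddings
    tf.keras.layers.LSTM(128),                         # recurrent layer carries context
    tf.keras.layers.Dense(1, activation="sigmoid"),    # e.g. bullish vs. bearish
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

The LSTM's input, forget, and output gates are what let it preserve information across long spans, where a plain RNN suffers from vanishing gradients.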

5. Using all that to make money

Financial news, especially major news, can change sentiment among investors and affect the related asset's price through immediate corrections.

For example, the language used in quarterly earnings calls may indicate whether a share price will rise or fall. If a company's message is indirect and full of complex-sounding language, it often suggests that something is being obscured, and when this information is extracted correctly it is a valuable trading signal. For similar reasons, scanning announcements and financial disclosures for trading signals has become a common NLP practice in the investment industry.

In this section, I will talk about the various data sources researchers can use, and explain common NLP workflows and deep learning practices for quantifying textual data and generating trading signals.
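
To sketch how such a workflow might end in a signal, here is a hypothetical helper. The `model` and `word_index` are assumed to come from a training step like the LSTM sketch above, and the thresholds are purely illustrative:

```python
import numpy as np

def headline_sentiment(headline, model, word_index, max_len=200):
    """Score a headline in [0, 1] with a trained sequence model."""
    tokens = [word_index.get(w, 0) for w in headline.lower().split()]  # 0 = OOV/pad
    padded = np.zeros((1, max_len), dtype="int32")
    padded[0, : min(len(tokens), max_len)] = tokens[:max_len]
    return float(model.predict(padded, verbose=0)[0, 0])

def to_signal(score, long_threshold=0.7, short_threshold=0.3):
    """Map a sentiment score to a position: +1 long, -1 short, 0 flat."""
    if score >= long_threshold:
        return 1
    if score <= short_threshold:
        return -1
    return 0
```

In a production system, such scores would be aggregated across many documents and sources and combined with other features before any position is taken.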

I will end with a summary of the application architecture, in case anyone would like to implement a similar system for their own use.

On Saturday 21 April at 14:45 **See schedule**
