Word2vec Concept from scratch – part 1

In this post I will be implementing the Word2vec algorithm with negative sampling from scratch in Python. Implementing it from scratch gives a better grasp of the inner workings of Word2vec's skip-gram and negative sampling approach. For our analysis, we will use the Open American National Corpus (http://www.anc.org/), which consists of roughly 15 million…
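As a preview of the two core ideas, here is a minimal sketch (names and toy data are my own, not from the post): generating skip-gram (center, context) training pairs from a token window, and drawing negative samples from the unigram distribution raised to the 3/4 power, as in the original Word2vec paper.

```python
import random
from collections import Counter

def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs for skip-gram."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

def negative_samples(tokens, k=5, power=0.75):
    """Draw k negative words from the unigram distribution^0.75."""
    counts = Counter(tokens)
    words = list(counts)
    weights = [counts[w] ** power for w in words]
    return random.choices(words, weights=weights, k=k)

tokens = "the quick brown fox jumps over the lazy dog".split()
pairs = skipgram_pairs(tokens, window=2)
```

In a full implementation, each positive pair would be trained against its negative samples with a sigmoid loss; this sketch only covers the data-preparation side.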

Read more
machine learning, Natural Language Processing, Programming, Tutorials

Text generation using Markov chain models

Markov chains are one of the oldest ways to generate fairly believable text. Rather than generating text by randomly selecting characters (possible, but completely impractical), we use the Markov property, which follows a chain of linked events where what gets generated next depends only on what is generated currently. Typically, generating text through a Markov process…
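The idea above can be sketched in a few lines (a toy illustration, not the post's implementation): build a table mapping each state, a tuple of the last `order` words, to the words observed after it, then walk that table to generate text.

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each state (tuple of `order` words) to the words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        chain[state].append(words[i + order])
    return chain

def generate(chain, length=10, seed=0):
    """Walk the chain: the next word depends only on the current state."""
    rng = random.Random(seed)
    state = rng.choice(list(chain))
    out = list(state)
    for _ in range(length):
        followers = chain.get(state)
        if not followers:  # dead end: no observed continuation
            break
        out.append(rng.choice(followers))
        state = tuple(out[-len(state):])
    return " ".join(out)
```

A higher `order` makes the output more coherent but also more likely to copy the source verbatim, since fewer states have more than one follower.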

Read more
machine learning, Natural Language Processing, Tutorials

Text generation using Context Free Grammar

What is a context-free grammar? A context-free grammar (CFG) is a set of rules that defines what is and is not a valid sequence of “words” in a language. The rules aren’t limited to the production of text and can be applied in any field where a fixed number of rules structures the production of…
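To make this concrete, here is a minimal sketch with a toy grammar of my own (not the post's example): each non-terminal symbol maps to possible expansions, and a sentence is produced by recursively rewriting symbols until only terminal words remain.

```python
import random

# Toy grammar: each non-terminal maps to a list of possible productions.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["cat"], ["dog"]],
    "V":  [["sees"], ["chases"]],
}

def expand(symbol, rng):
    """Recursively rewrite a symbol until only terminal words remain."""
    if symbol not in GRAMMAR:          # terminal word, emit as-is
        return [symbol]
    production = rng.choice(GRAMMAR[symbol])
    out = []
    for part in production:
        out.extend(expand(part, rng))
    return out

sentence = " ".join(expand("S", random.Random(0)))
```

Because "context-free" means each rule rewrites a single symbol regardless of its neighbors, this recursion never needs to inspect surrounding words, which is what makes the expansion so simple.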

Read more
Natural Language Processing, Programming

Introduction to automatic text generation

Text generation is a hot topic right now in the domain of artificial intelligence. Any advanced artificial intelligence will need the ability to interact with us, humans, and that means the ability to speak our language, to understand us, and to communicate back to us. Ultimately, the AI needs to truly interact with us in…

Read more