
DeepLearning.TV's video: Recursive Neural Tensor Nets - Ep. 11 (Deep Learning SIMPLIFIED)

Certain patterns are innately hierarchical, like the underlying parse tree of a natural language sentence. A Recursive Neural Tensor Network (RNTN) is a powerful tool for deciphering and labelling these types of patterns.

Deep Learning TV on Facebook: https://www.facebook.com/DeepLearningTV/
Twitter: https://twitter.com/deeplearningtv

The RNTN was conceived by Richard Socher to address a key shortcoming of earlier sentiment analysis techniques: double negatives being treated as negatives. Structurally, an RNTN is a binary tree with three nodes: a root and two leaves. The root and leaf nodes are not single neurons; each is a group of neurons, and the more complicated the input data, the more neurons are required. As expected, the root group connects to each leaf group, but the leaf groups do not connect to each other. Despite this simple structure, an RNTN can extract deep, complex patterns from a set of data.

An RNTN detects patterns through a recursive process. In a sentence-parsing application, where the objective is to identify the grammatical elements of a sentence (such as a noun phrase or a verb phrase), the first and second words are initially converted into ordered sets of numbers known as vectors. The conversion method is technical, but the numerical values in each vector indicate how closely related the words are to one another compared with other words in the vocabulary. Once the vectors for the first and second words are formed, they are fed into the left and right leaf groups respectively. The root group outputs, among other things, a vector representation of the current parse. The net then feeds this vector back into one of the leaf groups and, recursively, feeds different combinations of the remaining words into the other leaf group. Through this process the net is able to analyze every possible syntactic parse (a minimal code sketch of this composition-and-scoring step appears after the credits below).

If, during the recursion, the net runs out of input, the current parse is scored and compared against the previously discovered parses. The parse with the highest score is considered the optimal grammatical structure and is delivered as the final output. After determining the optimal parse, the net backtracks to figure out the appropriate labels to apply to each substructure; in this case, substructures could be noun phrases, verb phrases, prepositional phrases, and so on.

RNTNs are used in Natural Language Processing for both sentiment analysis and syntactic parsing. They can also be used in scene parsing to identify the different parts of an image.

Have you ever worked with data where the underlying patterns were hierarchical? Please comment and let us know what you learned.

Credits:
Nickey Pickorita (YouTube art) - https://www.upwork.com/freelancers/~0147b8991909b20fca
Isabel Descutner (Voice) - https://www.youtube.com/user/IsabelDescutner
Dan Partynski (Copy Editing) - https://www.linkedin.com/in/danielpartynski
Jagannath Rajagopal (Creator, Producer and Director) - https://ca.linkedin.com/in/jagannathrajagopal
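For readers who want to see concretely what "combining two leaf vectors into a root vector" and "scoring the current parse" might look like, here is a minimal numpy sketch of one RNTN composition-and-scoring step. The dimensionality d, the randomly initialized parameters V, W, and u, and the helper names compose and score are illustrative assumptions, not the trained model or any code from the video.

```python
import numpy as np

# A minimal sketch of one RNTN composition step, assuming d-dimensional
# word vectors. V, W, and u are toy, randomly initialized parameters.
d = 4                                   # word-vector dimensionality (assumption)
rng = np.random.default_rng(0)

V = rng.standard_normal((d, 2 * d, 2 * d)) * 0.01   # tensor: one slice per output unit
W = rng.standard_normal((d, 2 * d)) * 0.01          # standard recursive-net weights
u = rng.standard_normal(d) * 0.01                   # scores a parent vector

def compose(left, right):
    """Combine two child (leaf-group) vectors into a parent (root-group) vector."""
    c = np.concatenate([left, right])                # stack the two children
    tensor_term = np.array([c @ V[k] @ c for k in range(d)])
    return np.tanh(tensor_term + W @ c)              # parent representation

def score(parent):
    """Plausibility score of the parse rooted at this parent vector."""
    return float(u @ parent)

# Usage: merge two (hypothetical) word vectors and score the resulting parse.
word1, word2 = rng.standard_normal(d), rng.standard_normal(d)
parent = compose(word1, word2)
print(score(parent))
```

The tensor slices V[k] let each output unit model multiplicative interactions between the two children, which is what distinguishes an RNTN from a plain recursive neural net; in a full parser, this score would be computed for each candidate combination and the highest-scoring parse kept, as described above.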

This video was published on 2015-12-24 00:36:06 GMT by @DeepLearning.TV on YouTube. DeepLearning.TV has a total of 81.1K subscribers and 31 videos on YouTube, and receives an average of 128K views per video. This video received 540 likes and 23 comments, both lower than the channel's averages, and its view count was also lower than the average for the profile.
