Suppose your text corpus contains 80,000 different words. What is typically done to reduce the dimensionality of the input vector to a neural classifier?
Randomly select 10% of the words and ignore the rest
Use a convolutional layer before the fully connected classifier layer
Use an embedding layer before the fully connected classifier layer
Select the 10% most frequently used words and ignore the rest
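An embedding layer (one of the options above) is essentially a trainable lookup table: each of the 80,000 word indices maps to a small dense vector, so the classifier sees a low-dimensional input instead of an 80,000-dimensional one-hot vector. A minimal pure-Python sketch, with an assumed embedding dimension of 128:

```python
import random

VOCAB_SIZE = 80_000  # words in the corpus (a one-hot input would be 80,000-dim)
EMBED_DIM = 128      # dense embedding size (an assumed, typical value)

# The embedding layer is a VOCAB_SIZE x EMBED_DIM table of trainable weights,
# here initialized with small random values (training would adjust them).
random.seed(0)
embedding = [[random.gauss(0.0, 0.01) for _ in range(EMBED_DIM)]
             for _ in range(VOCAB_SIZE)]

def embed(word_index: int) -> list[float]:
    # Looking up row `word_index` is equivalent to multiplying the table
    # by that word's one-hot vector, but costs O(1) instead of O(VOCAB_SIZE).
    return embedding[word_index]

vec = embed(42)
print(len(vec))  # 128 — the classifier layer now receives 128 inputs, not 80,000
```

In a real model the same idea appears as, e.g., `torch.nn.Embedding(80000, 128)` or Keras's `Embedding(80000, 128)`, where the table rows are learned jointly with the classifier.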