Question


1) Give a visual example showing that PCA is not a good way to reduce the dimension.

2) Based on Bayes' theorem, express P(c|x) in terms of the likelihood and the prior.

Answer #1

Since there was a brief hint of SVD, I'd suggest you consider that independently of PCA. SVD can be used directly on a table of records versus attributes, rather than doing the covariance computation that PCA does; this is the basic approach of latent semantic analysis (LSA).

Why is this useful? Imagine that the dimensionality of the feature set is larger than just two or three, so the data can't be plotted directly. The advantage of using SVD on the raw record-versus-attribute data is that the singular values are very easy to interpret, as they are directly related to both the records and the attributes. Typical LSA will normalize the attributes just slightly, maybe using log counts rather than integer counts, and the result is still fairly easy to interpret.
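A minimal sketch of the two routes described above, assuming a small random counts table (the 100x20 shape, the Poisson data, and the log1p step are illustrative choices, not part of the original answer): PCA via the covariance matrix of the attributes, versus SVD applied directly to the (mildly normalized) records-by-attributes table, where the singular values relate to both sides of the table.

```python
import numpy as np

rng = np.random.default_rng(0)
# 100 records x 20 attributes of non-negative counts (hypothetical data)
X = rng.poisson(lam=3.0, size=(100, 20)).astype(float)

# --- PCA route: center the data, eigendecompose the attribute covariance ---
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)            # 20 x 20 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]
pcs = eigvecs[:, order[:2]]               # top-2 principal directions
scores_pca = Xc @ pcs                     # records projected onto 2 components

# --- SVD/LSA route: factor the raw table directly, after a mild log scaling ---
X_lsa = np.log1p(X)                       # log counts instead of integer counts
U, s, Vt = np.linalg.svd(X_lsa, full_matrices=False)
# The singular values s weight both sides of the table:
#   U[:, :k]  -> coordinates for the records
#   Vt[:k, :] -> coordinates for the attributes
records_2d = U[:, :2] * s[:2]
attributes_2d = Vt[:2, :].T * s[:2]

print(scores_pca.shape, records_2d.shape, attributes_2d.shape)
```

Both routes end with a low-dimensional embedding of the records, but only the second also yields attribute coordinates on the same scale, which is what makes the SVD factors directly interpretable.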
