Extracting knowledge from the Internet with NLP

Natural Language Processing (NLP) is a term used a little too vaguely when IT people talk to non-technical people about the magic capabilities of IT, much like when they/we use the term AI.

But, that said, we do use NLP. :)

Our use of NLP is about creating or strengthening relations through topic proximity and quantity. We can apply Named Entity Recognition (NER) because we have twice as many topics as Wikipedia in our database, each available in some 30 aliases and languages. This means we can crawl the Internet for these topics and create more value as we determine and optimize how they relate. Then we can move relations, previously created or new, to a higher level of probability/trust, either by quantity (at some point, I think it is safe to say, the Internet knows the truth … when the number of occurrences becomes overwhelming) or later by a community (we are not there yet, as we have yet to create the user interface/application).
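
As a rough illustration, here is a minimal sketch in Python with spaCy of what that could look like: a PhraseMatcher loaded with topic aliases stands in for our real topic database, and sentence-level co-occurrence counts stand in for the quantity signal. The topics, aliases, and example text are hypothetical placeholders.

```python
import spacy
from spacy.matcher import PhraseMatcher
from itertools import combinations
from collections import Counter

# Hypothetical stand-in for our topic database: each topic with a few
# of its aliases (in reality, millions of topics, ~30 aliases/languages each).
TOPIC_ALIASES = {
    "Stockholm": ["Stockholm", "Estocolmo"],
    "Sweden": ["Sweden", "Sverige", "Suecia"],
}

nlp = spacy.load("en_core_web_sm")  # assumes the model has been downloaded
matcher = PhraseMatcher(nlp.vocab, attr="LOWER")
for topic, aliases in TOPIC_ALIASES.items():
    matcher.add(topic, [nlp.make_doc(alias) for alias in aliases])

relation_counts = Counter()  # (topic_a, topic_b) -> co-occurrence count

def process(text):
    """Find known topics in a crawled text and count, per sentence,
    which pairs of topics occur together (the 'proximity' signal)."""
    doc = nlp(text)
    topics_per_sentence = {}
    for match_id, start, end in matcher(doc):
        sent = doc[start].sent
        topics_per_sentence.setdefault(sent.start, set()).add(
            nlp.vocab.strings[match_id])
    for topics in topics_per_sentence.values():
        for a, b in combinations(sorted(topics), 2):
            relation_counts[(a, b)] += 1

process("Stockholm is the capital of Sweden.")
print(relation_counts.most_common())  # [(('Stockholm', 'Sweden'), 1)]
```

Once a pair's count crosses some threshold, the relation could be promoted to a higher trust level; the threshold itself would be part of the optimization.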

We are also able to use Part-of-Speech (POS) tagging to find nouns and verbs … in effect, to create new topics and new relations between new or existing topics.
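
Again a hedged sketch in Python with spaCy (the function name and example sentence are hypothetical): POS tags pick out the verbs and nouns, and the dependency parse, one step beyond plain POS, connects them into candidate (topic, relation, topic) triples.

```python
import spacy

nlp = spacy.load("en_core_web_sm")

def candidate_triples(text):
    """Propose new topics (nouns) and new relations (verbs linking them).
    Uses POS tags to find verbs, then the dependency parse to pair each
    verb with its subject and object nouns."""
    doc = nlp(text)
    triples = []
    for token in doc:
        if token.pos_ != "VERB":
            continue
        subjects = [c for c in token.children if c.dep_ in ("nsubj", "nsubjpass")]
        objects = [c for c in token.children if c.dep_ in ("dobj", "attr")]
        for subj in subjects:
            for obj in objects:
                triples.append((subj.lemma_, token.lemma_, obj.lemma_))
    return triples

# Hypothetical example: yields [('Volvo', 'build', 'truck')]
print(candidate_triples("Volvo builds trucks in Gothenburg."))
```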

With the data we have, with NLP creating an optimization loop for data and relations together with more structured, already verified data, and with the later creation of a user application including community features, it might just be possible for us to make data out of the collective intelligence of mankind. :)