I know this post is likely to generate some strong discussion 🙂
- Wikipedia is now a fact of life and a focus of discussion. For any given topic, Wikipedia is a good place to start. It may be unreliable, incomplete and untrustworthy in places – but it is vast, live and public
- Assume all the information in the world is contained in a Wikipedia-like interlinked structure – or, simply speaking, in Wikipedia
- If one looks at this from a knowledge-generation point of view and applies Pareto logic, 80% of research topics should come out of 20% of strategies
- Such strategies can be thought of as some kind of graph traversal over Wikipedia – if the traversal succeeds, no research is required; if it fails, there is a gap to be filled. For example, is the boiling point of Kevlar known? If not, it is a point of research
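To make the traversal idea concrete, here is a minimal sketch of a "gap finder": breadth-first search over a toy link graph, checking at each article whether a sought fact is recorded. The article names, links and facts below are all hypothetical stand-ins, not real Wikipedia data; a real version would walk the live link graph and parse infoboxes.

```python
from collections import deque

# Toy stand-in for the Wikipedia link graph: article -> linked articles,
# plus the set of facts recorded on each article (hypothetical data).
LINKS = {
    "Kevlar": ["Polymer", "Aramid"],
    "Aramid": ["Polymer"],
    "Polymer": ["Chemistry"],
    "Chemistry": [],
}
FACTS = {
    "Kevlar": {"tensile strength"},       # note: no "boiling point" recorded
    "Aramid": set(),
    "Polymer": {"glass transition temperature"},
    "Chemistry": set(),
}

def find_fact(start, fact, max_depth=3):
    """Breadth-first traversal from `start`; return the article where
    `fact` is recorded, or None if the neighbourhood is exhausted."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        article, depth = queue.popleft()
        if fact in FACTS.get(article, set()):
            return article
        if depth < max_depth:
            for nxt in LINKS.get(article, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, depth + 1))
    return None  # traversal failed: candidate research gap

print(find_fact("Kevlar", "boiling point"))                  # None -> a gap
print(find_fact("Kevlar", "glass transition temperature"))   # found nearby
```

A `None` result is exactly the "work to fill in the gap" signal: the fact is not reachable from the starting concept, so it goes on the list of candidate research topics.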
It is already known that articles in Wikipedia are separated by 3.8 degrees on average, that the diameter of the Wikipedia graph is less than a few hundred pages, etc.
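Those two statistics – average degrees of separation and diameter – are just properties of shortest paths in the link graph. A toy sketch of how they are computed (on a tiny hypothetical graph, not real Wikipedia data):

```python
from collections import deque
from itertools import combinations

# Tiny undirected toy graph standing in for the Wikipedia link graph
# (hypothetical articles; the real graph has millions of nodes).
GRAPH = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

def bfs_distance(src, dst):
    """Length of the shortest path from src to dst, via breadth-first search."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, d = queue.popleft()
        if node == dst:
            return d
        for nxt in GRAPH[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return None  # unreachable

dists = [bfs_distance(a, b) for a, b in combinations(GRAPH, 2)]
avg_degrees = sum(dists) / len(dists)   # "degrees of separation"
diameter = max(dists)                   # longest shortest path
print(avg_degrees, diameter)            # 1.6 3
```

On the real graph one would sample pairs rather than enumerate them, but the definitions are the same.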
Is anyone working on codifying the kind of strategies I mention above? If so, such a code would be worth every character in gold. With such a code, at a glance, the human race would have a list of low-hanging fruits. [I agree that the lowest-hanging fruits are not the sweetest fruits. However, they ARE fruits nevertheless.]
Can such a mechanical, bulldozer approach succeed in accelerating knowledge generation?