Journal Name: International Journal of Innovative Research in Computer and Communication Engineering
Print ISSN: 2320-9798
Electronic ISSN: 2320-9801
Publication Year: 2015
Volume: 3
Issue: 8
DOI: 10.15680/IJIRCCE.2015.0308022
Publisher: S&S Publications
Abstract: Classification rules extracted from sample data constitute knowledge. If this knowledge is extracted in a distributed way, it becomes necessary to combine or fuse the resulting rules. The task of data fusion is to identify the true values of data items among multiple observed values drawn from different sources of varying reliability. In data mining applications, knowledge extraction is often split into subtasks due to memory or run-time limitations, and the locally extracted knowledge must later be consolidated while keeping communication overhead low. Information is extracted from multiple data sources and the values are reconciled so that the true values can be stored in a central data repository; this is a problem of vital importance to the database and knowledge management communities. In a conventional approach, fusion is typically done either by combining the classifiers' outputs or by combining the sets of classification rules, but in this paper I introduce a new way of fusing classifiers at the level of the parameters of the classification rules. The focus here is on probabilistic generative classifiers that use multinomial distributions for categorical input dimensions and multivariate normal distributions for the continuous ones. These distributions are modeled with hyper-distributions, or second-order distributions, over their parameters. Fusion of such classifiers can then be performed by multiplying the hyper-distributions of the parameters.
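To make the multiplication step concrete, the following is a minimal sketch (not taken from the paper) of how two second-order distributions over the parameters of a multinomial rule could be fused: for Dirichlet hyper-distributions, multiplying the densities yields another Dirichlet whose parameter vector is the sum of the originals minus one. The function name and the example values are hypothetical and only illustrate the general idea.

import numpy as np

def fuse_dirichlet(alpha_a, alpha_b):
    # Fuse two Dirichlet hyper-distributions over the same multinomial
    # parameter vector by multiplying their densities:
    # Dir(theta; a) * Dir(theta; b) is proportional to Dir(theta; a + b - 1),
    # so the fused classifier is again described by a Dirichlet.
    alpha_a = np.asarray(alpha_a, dtype=float)
    alpha_b = np.asarray(alpha_b, dtype=float)
    fused = alpha_a + alpha_b - 1.0
    if np.any(fused <= 0.0):
        raise ValueError("fused Dirichlet parameters must stay positive")
    return fused

# Two locally trained classifiers, each represented by a Dirichlet
# hyper-distribution over class-conditional multinomial parameters
# (hypothetical example values for three categories).
alpha_site_1 = [4.0, 2.0, 1.5]
alpha_site_2 = [3.0, 5.0, 2.5]

alpha_fused = fuse_dirichlet(alpha_site_1, alpha_site_2)
# Point estimate of the fused multinomial parameters (posterior mean).
theta_fused = alpha_fused / alpha_fused.sum()
print("fused Dirichlet parameters:", alpha_fused)
print("fused multinomial estimate:", theta_fused)

The same multiply-the-densities idea carries over to the continuous case, where the hyper-distributions over the parameters of the multivariate normal components are combined analogously; the Dirichlet case is shown only because it keeps the algebra short.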