Ontology Alignment Evaluation Initiative - OAEI-2015 Campaign

Results OAEI 2015::Large BioMed Track

Results of the FMA-NCI matching problem

The following tables summarize the results for the tasks in the FMA-NCI matching problem.

XMAP-BK and AML provided the best results in terms of F-measure in Task 1 and Task 2. Note that the use of background knowledge based on the UMLS Metathesaurus has an important impact on the performance of XMAP-BK. LogMapBio improves LogMap's recall in both tasks; however, precision suffers, especially in Task 2.
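
As a reminder, the F-measure reported below is the harmonic mean of precision and recall. A minimal check (Python; the function name is ours) against the XMAP-BK row of Table 1:

    def f_measure(precision, recall):
        """F1 score: harmonic mean of precision and recall."""
        return 2 * precision * recall / (precision + recall)

    # XMAP-BK, Task 1: P = 0.971, R = 0.902
    print(round(f_measure(0.971, 0.902), 3))  # 0.935, as reported in Table 1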

Note that efficiency in Task 2 decreased with respect to Task 1. This is mostly because larger ontologies involve more candidate alignments, which makes it harder to keep precision high without damaging recall, and vice versa. Furthermore, ServOMBI, CroMatcher, Lily, DKP-AOM-Lite and DKP-AOM could not complete Task 2.

* Uses background knowledge based on the UMLS Metathesaurus, which is also the source of the LargeBio reference alignments.


Task 1: FMA-NCI small fragments

System        Time (s)  # Mappings  Precision  Recall  F-measure  Unsat.  Degree
XMAP-BK *           31       2,714      0.971   0.902      0.935   2,319   22.6%
AML                 36       2,690      0.960   0.899      0.928       2  0.019%
LogMap              25       2,747      0.949   0.901      0.924       2  0.019%
LogMapBio        1,053       2,866      0.926   0.917      0.921       2  0.019%
LogMapLite          16       2,483      0.967   0.819      0.887   2,045   19.9%
ServOMBI           234       2,420      0.970   0.806      0.881   3,216   31.3%
XMAP                26       2,376      0.970   0.784      0.867   2,219   21.6%
LogMapC            106       2,110      0.963   0.710      0.817       2  0.019%
Average            584       2,516      0.854   0.733      0.777   2,497   24.3%
Lily               740       3,374      0.602   0.720      0.656   9,279   90.2%
DKP-AOM-Lite     1,579       2,665      0.640   0.603      0.621   2,139   20.8%
DKP-AOM          1,491       2,501      0.653   0.575      0.611   1,921   18.7%
CroMatcher       2,248       2,806      0.570   0.570      0.570   9,301   90.3%
RSDLWB              17         961      0.964   0.321      0.482      25    0.2%
Table 1: Results for LargeBio Task 1 (FMA-NCI small fragments). Unsat. = number of unsatisfiable classes in the integrated ontology; Degree = percentage of unsatisfiable classes over all classes. The Average row covers the 13 listed systems.
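
The Average rows are simple column means over the systems that completed the task; a minimal sketch (Python; the row layout is ours) of how such a row can be recomputed:

    # (time_s, mappings, precision, recall, f_measure) per system, from Table 1
    rows = [
        (31, 2714, 0.971, 0.902, 0.935),  # XMAP-BK
        (36, 2690, 0.960, 0.899, 0.928),  # AML
        # ... the remaining 11 systems of Table 1 ...
    ]

    def average_row(rows):
        """Column-wise arithmetic mean across all systems."""
        return tuple(sum(r[i] for r in rows) / len(rows)
                     for i in range(len(rows[0])))

    # With all 13 rows filled in, this reproduces the Average line of Table 1:
    # time ~584 s, ~2,516 mappings, P = 0.854, R = 0.733, F = 0.777.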


Task 2: FMA-NCI whole ontologies

System        Time (s)  # Mappings  Precision  Recall  F-measure  Unsat.  Degree
XMAP-BK *          337       2,802      0.872   0.849      0.860   1,222    0.8%
AML                262       2,931      0.832   0.856      0.844      10  0.007%
LogMap             265       2,693      0.854   0.802      0.827       9  0.006%
LogMapBio        1,581       3,127      0.773   0.848      0.809       9  0.006%
XMAP               302       2,478      0.866   0.743      0.800   1,124    0.8%
Average            467       2,588      0.818   0.735      0.759   3,742    2.6%
LogMapC            569       2,108      0.879   0.653      0.750       9  0.006%
LogMapLite         213       3,477      0.673   0.820      0.739  26,478   18.1%
RSDLWB             211       1,094      0.798   0.307      0.443   1,082    0.7%
Table 2: Results for LargeBio Task 2 (FMA-NCI whole ontologies). Columns as in Table 1; the Average row covers the 8 systems that completed the task.
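
In both tables the incoherence Degree is the share of unsatisfiable classes among all classes of the integrated ontology. A minimal sketch (Python; the total class count is left as a parameter, since it depends on the task):

    def incoherence_degree(unsat_classes, total_classes):
        """Degree (%) of incoherence: unsatisfiable classes divided by the
        total number of classes in the integrated ontology."""
        return 100.0 * unsat_classes / total_classes

    # e.g., for LogMapLite in Task 2, incoherence_degree(26478, n_total)
    # gives ~18.1 when n_total is the class count of the whole FMA+NCI union.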