Ontology Alignment Evaluation Initiative - OAEI-2015 Campaign

OAEI 2015 Results: Large BioMed Track

Results of the FMA-SNOMED matching problem

The following tables summarize the results for the two tasks in the FMA-SNOMED matching problem: Task 3 (small fragments) and Task 4 (whole FMA against the SNOMED large fragment).

XMAP-BK provided the best results in terms of both Recall and F-measure in Tasks 3 and 4. In Task 4 its Precision was lower than that of the other top systems, but its Recall was much higher.

As in the FMA-NCI tasks, the use of the UMLS Metathesaurus as background knowledge in XMAP-BK had a notable impact on its results.

Overall, the results were less positive than in the FMA-NCI matching problem. As in FMA-NCI, efficiency also decreased as the ontology size increased. The largest variations were in the precision of LogMapBio and XMap. Furthermore, LiLy, DKP-AOM-Lite and DKP-AOM could not complete either Task 3 or Task 4, while ServOMBI and CroMatcher could not complete Task 4 within the permitted time.

* Uses background knowledge based on the UMLS Metathesaurus, the same source from which the LargeBio reference alignments were derived.


Task 3: FMA-SNOMED small fragments

System        Time (s)   # Mappings   Precision   Recall   F-measure    Unsat.    Degree
XMAP-BK *           49        7,920       0.968    0.847       0.903    12,848     54.4%
AML                 79        6,791       0.926    0.742       0.824         0    0.000%
LogMapBio        1,204        6,485       0.935    0.700       0.801         1    0.004%
LogMap              78        6,282       0.948    0.690       0.799         1    0.004%
ServOMBI           532        6,329       0.960    0.664       0.785    12,155     51.5%
XMAP                46        6,133       0.958    0.647       0.772    12,368     52.4%
Average          1,527        5,328       0.919    0.561       0.664     5,902     25.0%
LogMapC            156        4,535       0.956    0.505       0.661         0    0.000%
CroMatcher      13,057        6,232       0.586    0.479       0.527    20,609     87.1%
LogMapLite          36        1,644       0.968    0.209       0.343       771      3.3%
RSDLWB              36          933       0.980    0.128       0.226       271      1.1%
Table 1: Results for LargeBio Task 3 (FMA-SNOMED small fragments).


Task 4: FMA whole ontology with SNOMED large fragment

System        Time (s)   # Mappings   Precision   Recall   F-measure    Unsat.    Degree
XMAP-BK *          782        9,243       0.769    0.844       0.805    44,019     21.8%
AML                509        6,228       0.889    0.650       0.751         0    0.000%
LogMap             768        6,281       0.839    0.634       0.722         0    0.000%
LogMapBio        3,248        6,869       0.776    0.650       0.707         0    0.000%
XMAP               698        7,061       0.720    0.609       0.660    40,056     19.9%
LogMapC          1,195        4,693       0.852    0.479       0.613        98    0.049%
Average          1,004        5,395       0.829    0.525       0.602    11,157      5.5%
LogMapLite         419        1,822       0.852    0.209       0.335     4,389      2.2%
RSDLWB             413          968       0.933    0.127       0.224       698      0.3%
Table 2: Results for LargeBio Task 4 (whole FMA against the SNOMED large fragment).