Ontology Alignment Evaluation Initiative - OAEI-2012 Campaign

Results OAEI 2012::Large BioMed Track

Summary Results OAEI 2012::Large BioMed Track (Top 8 systems)

The following table summarises the results for the 8 systems that completed all 9 tasks in the Large BioMed Track. For each system, the table shows the average precision, recall, F-measure and incoherence degree, together with the total time to complete the tasks.

The systems are ordered by average F-measure. YAM++ obtained the best average F-measure, GOMMA_Bk the best recall, and ServOMap computed the most precise mappings. The top 6 systems obtained very close results in terms of F-measure: there was a gap of only 0.024 between the first (YAM++) and the sixth (ServOMap).
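For reference, the per-task F-measure is the harmonic mean of precision and recall. A minimal sketch (note that the averages in the table below are means over the nine tasks, so the average F-measure need not equal the harmonic mean of the average precision and recall):

```python
def f_measure(precision: float, recall: float) -> float:
    """Balanced F-measure: harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Applied to YAM++'s average precision and recall from the table below,
# this gives ~0.784, slightly different from the reported 0.782 because
# that figure averages per-task F-measures rather than combining averages.
print(round(f_measure(0.876, 0.710), 3))
```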

Regarding mapping incoherence, LogMap and LogMap-noe were the only systems whose mapping sets led to a small number of unsatisfiable classes.

Finally, LogMapLt was the fastest system, since it implements basic but efficient string similarity techniques. Apart from YAM++, the remaining tools were also very fast, needing only between 38 and 97 minutes to complete the tasks. YAM++ was the exception and required almost 19 hours to complete the nine tasks.
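The runtime figures quoted above follow from the per-system totals (in seconds) in the table below; a quick conversion sketch:

```python
def to_minutes(seconds: int) -> float:
    """Convert a total runtime in seconds to minutes."""
    return seconds / 60

# Using the "Total Time (s)" column from the table below.
print(round(to_minutes(711), 1))    # LogMapLt: under 12 minutes
print(round(to_minutes(2310), 1))   # ServOMap: about 38 minutes
print(round(to_minutes(5821), 1))   # GOMMA_Bk: about 97 minutes
print(round(67817 / 3600, 1))       # YAM++: about 18.8 hours, "almost 19"
```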

System       Total Time (s)   Avg. Precision   Avg. Recall   Avg. F-measure   Avg. Incoherence
YAM++             67,817          0.876           0.710           0.782            45.30%
ServOMapL          2,405          0.890           0.699           0.780            51.46%
LogMap-noe         3,964          0.869           0.695           0.770             0.004%
GOMMA_Bk           5,821          0.767           0.791           0.768            45.32%
LogMap             3,077          0.869           0.684           0.762             0.006%
ServOMap           2,310          0.903           0.657           0.758            55.36%
GOMMA              5,341          0.746           0.553           0.625            24.01%
LogMapLt             711          0.831           0.515           0.586            33.17%