jonquet.lirmm at gmail.com
Fri Sep 29 20:55:23 PDT 2017
Quick response, I am on the go.
You can also check previous messages I have sent on that mailing list.
We also recently evaluated the NCBO Annotator in the CLEF eHealth 2017 campaign; check the proceedings.
The Annotator generally has good precision (especially when good ontologies cover a domain well, e.g., diseases) because of its huge dictionary, but not such good recall because of the limited NLP it implements.
Also check out the historical 2009-2010 papers on this.
The CLEF eHealth lessons are: the NCBO Annotator and the SIFR Annotator are among the best dictionary-based annotators, but we all lose the game against supervised machine learning methods.
What justifies their existence (among other things) is that in the health domain it is often very hard to get pre-annotated training data.
Something the community has always loved about the NCBO Annotator is that it is available off the shelf as a web service. This was one of its killer ideas.
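For context, that web service is reachable with a simple REST call. A minimal sketch in Python, assuming the public BioPortal Annotator endpoint and its documented `text`, `apikey`, `ontologies`, and `longest_only` parameters (`MY_API_KEY` is a placeholder for a real BioPortal API key):

```python
# Minimal sketch of calling the NCBO Annotator as a web service.
# Endpoint and parameter names follow the public BioPortal REST API;
# MY_API_KEY is a placeholder, not a real key.
from urllib.parse import urlencode
from urllib.request import urlopen

BASE_URL = "https://data.bioontology.org/annotator"

def build_annotator_url(text, apikey, ontologies=None, longest_only=False):
    """Build the request URL for an NCBO Annotator call."""
    params = {"text": text, "apikey": apikey}
    if ontologies:
        # Restrict matching to specific ontologies, e.g. ["MESH", "NCIT"]
        params["ontologies"] = ",".join(ontologies)
    if longest_only:
        # Keep only the longest matching strings
        params["longest_only"] = "true"
    return BASE_URL + "?" + urlencode(params)

url = build_annotator_url("melanoma is a malignant tumor", "MY_API_KEY",
                          ontologies=["NCIT"], longest_only=True)
# With a real API key, the JSON annotations come back from:
# response = urlopen(url)
```

The same call works from curl or any HTTP client, which is exactly why it is usable out of the box in other people's pipelines.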
If you are interested in annotating clinical data or scoring your annotations, check out the NCBO Annotator+ on the SIFR BioPortal; we have added NegEx/ConText and scoring there. That work is under submission for publication.
Hope it helps.
Assistant Professor, University of Montpellier; Visiting Scholar, Stanford University.
> On Sep 29, 2017, at 3:32 PM, John Graybeal <jgraybeal at stanford.edu> wrote:
> Hi Alexander,
> A quick answer, sorry I don't have all the research data at hand but perhaps you can use Google Scholar to find more precise information if you need it.
> Since we consider BioPortal to be the biggest source of Biomedical Ontologies, we definitely consider our annotator the best service for annotations against a large and diverse number of ontologies. (And therefore, also the most reliable service in that category. :->)
> The article "Semantic annotation in biomedicine: the current landscape" in Journal of Biomedical Semantics (https://jbiomedsem.biomedcentral.com/articles/10.1186/s13326-017-0153-x) has a survey of many, if not all, of the existing semantic annotation technologies. NCBO BioPortal appears in Tables 1 and 3 (scroll the table left on screen). While quite dated in some respects (BioPortal has over 500 biomedical ontologies, and it is open source, available on GitHub), there is a lot of useful information here. I know there are some pipeline annotators that leverage BioPortal services to provide their own annotation capabilities, but cannot offer the details off-hand, sorry.
> In general, the only technical evaluation I've seen has shown BioPortal first or second (if I recall correctly) in terms of speed and completeness of annotation. Of course, without specific evaluation criteria and comparisons, it is all but impossible to have a meaningful rebuttal to the observation "something else is better". Certainly there are features available with some other annotators that BioPortal does not provide, so there can be tradeoffs depending on specific needs. Conversely, among many other features, BioPortal supports both free text (paragraphs of content) and the ability to annotate the longest matching strings. These often are not available in other annotators.
> With regard to reliability, the most common concern raised about BioPortal is that it is not consistently responsive, especially for people with long-running and large-scale annotation processes and similar complex queries. Our monitoring services show the annotation service is up over 99.5% of the time, but it is definitely challenged by repeatedly annotating pages or more of text against all 500+ ontologies 15 times per second (per requestor), whether your text, or someone else's. We regularly get requests for the Virtual Appliance so that people can set up their own annotation pipeline, with only the ontologies they are interested in, and handling queries as fast as they can send them. That is an example of a more reliable service, but not a public one.
> Perhaps this information will be sufficient to help you, or will bring forward other comments from the user list.
>> On Sep 27, 2017, at 6:59 AM, Alexander Garcia Castro <alexgarciac at gmail.com> wrote:
>> Hi all. I am using the NCBO Annotator and I like it. In a paper I just submitted, a reviewer says there are other NER tools that are better. This may be true, but when processing the whole of PMC over a web service against all biomedical ontologies, or a big portion of them, is there anything as reliable as the BioPortal Annotator? Could someone help me out with a reference that justifies the choice of the NCBO Annotator? I am using it because I don't know of another web service that is as reliable when processing lots of text and that also works against all the biomedical ontologies. In my case it is pretty much a choice driven by the need to reuse other people's infrastructure, because I would not be able to process it on my own.
>> Also, there are lots of other NER tools, but why was this one specifically selected for the NCBO NER service, as opposed to using another NER tool for the NCBO Annotator?
>> Alexander Garcia
>> bioontology-support mailing list
>> bioontology-support at lists.stanford.edu
> John Graybeal
> Technical Program Manager
> Center for Expanded Data Annotation and Retrieval /+/ NCBO BioPortal
> Stanford Center for Biomedical Informatics Research