[protege-owl] Modeling Massive Ontologies (SNOMED) at Kaiser
phendler at hotmail.com
Fri Oct 17 09:45:18 PDT 2008
We (at Kaiser Permanente) have recently created an OWL file for all of SNOMED
(Systematized Nomenclature of Medicine -- Clinical Terms). It has nearly
half a million terms, and the 154 Meg file will not load into Protege on my
machine, which has 4 gigs of RAM. Is there a limit to how large an ontology
can be and still load into Protege 4, or is this only a memory issue? If I were to use
a 64 bit OS on a machine with 16 gigs of RAM would that allow SNOMED to
load, and would I be able to model new terms? Any suggestions on how large
a Java heap this would take? Are there any other OWL editors known to
be able to handle modeling ontologies with half a million terms?
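For context, the kind of JVM heap settings I am asking about look like this (the jar path below is a placeholder for illustration; `-Xms`/`-Xmx` are the standard JVM initial/maximum heap flags, and a 32-bit JVM typically cannot go much above a ~1.5-2 gig heap, which is why I ask about 64-bit):

```shell
# Launch Protege with a large heap on a 64-bit JVM.
# The jar path is a placeholder; only -Xms/-Xmx are the point here.
java -Xms2G -Xmx12G -jar protege.jar
```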