
[protege-owl] Modeling Massive Ontologies (SNOMED) at Kaiser

phend phendler at
Fri Oct 17 09:45:18 PDT 2008

We (at Kaiser Permanente) have recently created an OWL file for all of SNOMED
(Systematized Nomenclature of Medicine -- Clinical Terms).  It has nearly
half a million terms, and the 154 MB file will not load into Protege on my
machine, which has 4 gigs of RAM.  Is there a limit to how large an ontology
can be to load it into Protege 4, or is this related only to memory?  If I
were to use a 64-bit OS on a machine with 16 gigs of RAM, would that allow
SNOMED to load, and would I be able to model new terms?  Any suggestions on
how large a Java heap this would take?  Are there any other OWL editors known
to handle modeling ontologies with half a million terms?
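[A note on the heap question above: on a 32-bit JVM the maximum heap typically tops out well below 2 GB regardless of installed RAM, so a 64-bit OS and JVM with a larger -Xmx setting (e.g. passed in Protege's launch script) is the usual prerequisite for loading an ontology of this size. The sketch below is a minimal, hypothetical way to check what heap the current JVM will actually allow; the class name and the suggested -Xmx value are illustrative, not part of any Protege setup.]

```java
// Sketch: report the maximum heap the running JVM can grow to, so you
// can verify a flag such as -Xmx12g actually took effect before trying
// to load a large (e.g. ~154 MB on disk, multi-GB in memory) ontology.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap (MB): " + maxBytes / (1024 * 1024));
    }
}
```

Run it with, say, `java -Xmx12g HeapCheck`; if the reported figure is far below what you passed, you are likely on a 32-bit JVM.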
