MetaMap Installation


== Presentation ==

MetaMap [http://mmtx.nlm.nih.gov/] is a highly powerful tool for mapping biomedical text to the UMLS [http://www.nlm.nih.gov/research/umls/] Metathesaurus or to a custom Metathesaurus, or, equivalently, for discovering Metathesaurus concepts referred to in text. This article describes how to use the Data File Builder utility inside MetaMap to build a concept recognizer for text containing NCBO terms.

NCBO [http://www.bioontology.org/] is developing a system [http://www.bioontology.org/annotator-service] to annotate large numbers of data resources automatically, and has developed a prototype system for ontology-based annotation and indexing of biomedical data [http://www.biomedcentral.com/1471-2105/10/S9/S14]. As part of the workflow, a concept recognizer is used to recognize a given ontology concept in text. The goal of this project is to embed MetaMap into the annotator workflow. MetaMap is very well suited to this task because of the complex NLP techniques used in its concept recognition algorithm.

Listed below are some of the useful processing steps MetaMap performs:

1. Removal of problematic terms from the data (numbers, single letters, etc.)

2. Lexical and syntactic filtering to recognize long phrases accurately

3. Identification of linguistic variants of base words and terms, computed using a variety of generation methods. This handles forms such as inflections (heart → hearts), derivations (treat → treatment), synonyms (pyrosis ↔ heartburn), acronyms (vt ↔ ventricular tachycardia) and spelling variants (apnea ↔ apnoea)

This kind of rich semantic processing can drastically increase the number of concepts that can be recognized in a text.

Although MetaMap was originally built to work on UMLS concepts, its Data File Builder (DFB) utility lets us create our own custom dictionary for NCBO concepts.

== Contacts ==

* For questions or feature requests, contact Clement Jonquet [mailto:jonquet@stanford.edu?subject=MetaMap%20Installation] and Tejaswi Tenneti [mailto:tejaswit@stanford.edu?subject=MetaMap%20Installation]

== Versions (prototypes & releases) ==

* '''December 2009 - Second Annotator prototype including MetaMap:''' [http://obs.bioontology.org/oba/OBA_v1.2_rest.html]

The release consists of an integrated MetaMap 2009 installation. The Data File Builder utility was used to build a custom dictionary containing the ontologies '''CST, OMS, 39857, 39046, 40187 and 39885'''. The data was obtained from the '''obs_v1_sept09''' schema. However, some ontologies, such as 39857, are not present in '''obs_v1_sept09'''.

* '''November 2009 - Second Annotator prototype including MMTx:''' [http://obs.bioontology.org/oba/OBA_v1.2_rest.html]

The release consists of an integrated MMTx installation. The Data File Builder utility was used to build a custom dictionary containing all the ontologies in the '''obs''' schema. A defect of this installation is that the Data File Builder scripts processed only CUIs of 8 digits in length; this is a bug in the MMTx documentation.

== References ==

* gforge code location: https://bmir-gforge.stanford.edu/gf/project/obs/

== Collaboration & Acknowledgment ==

We thank the researchers at the National Library of Medicine [http://www.nlm.nih.gov/] for their constant support and help at various stages of this project.

== MetaMap Installation documentation ==

The installation of MetaMap primarily involves three stages: preparation of data, setting up the workspace, and installation of MetaMap. Each stage is explained below, followed by instructions for running the newly installed custom MetaMap.

=== Preparation of data ===

There are 4 files to be generated for the installation:

1. '''MRCON''': This file is generated primarily from the table OBS_TT. Below is a mapping between the columns in MRCON and OBS_TT; the listed OBS_TT columns are used to fill their corresponding columns in the MRCON file.

{| class="wikitable"
! MRCON !! OBS_TT
|-
| CUI (Concept Unique ID) || conceptID
|-
| Language || -
|-
| Status || isPreferred
|-
| LUI (Lexical Unique ID) || termID
|-
| String type || -
|-
| SUI (String Unique ID) || termID
|-
| String || termName
|-
| LRL || -
|}

An SQL command fills the correct values into the MRCON table in the database.

Example entry row:

 C0027051|ENG|P|L0027051|PF|S0064638|Myocardial Infarction|0|
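
As a concrete illustration of the mapping above, the following sketch dumps OBS_TT into MRCON-style rows. It assumes the OBS schema lives in a MySQL database; the host, credentials, the hard-coded 'ENG'/'PF' values and the isPreferred-to-Status coding are placeholders to adapt to your own setup, not part of the official DFB procedure.

 # Minimal sketch, not the official DFB tooling: export OBS_TT as
 # pipe-delimited MRCON rows. Connection details and the field coding
 # below are assumptions.
 mysql -N -B -u obs_user -p -h localhost \
       -e "SELECT CONCAT_WS('|',
                            conceptID,                       -- CUI
                            'ENG',                           -- Language
                            IF(isPreferred = 1, 'P', 'S'),   -- Status (assumed coding)
                            termID,                          -- LUI
                            'PF',                            -- String type
                            termID,                          -- SUI
                            termName,                        -- String
                            '0', '')                         -- LRL + trailing bar
           FROM OBS_TT;" \
       obs_v1_sept09 > MRCON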


2. '''MRSO''': This file is generated primarily from the tables OBS_TT, OBS_OT and OBS_CT. Below is a mapping between the columns in MRSO and the above OBS tables; the listed columns are used to fill their corresponding columns in the MRSO file.

{| class="wikitable"
! MRSO !! OBS tables
|-
| CUI (Concept Unique ID) || OBS_TT.conceptID
|-
| LUI (Lexical Unique ID) || OBS_TT.termID
|-
| SUI (String Unique ID) || OBS_TT.termID
|-
| SAB (Source Abbreviation) || OBS_OT.localOntologyID
|-
| TermType || OBS_TT.isPreferred
|-
| SourceID || OBS_OT.ontologyID
|-
| Restriction level || -
|}

An SQL command fills the correct values into the MRSO table in the database.

Example entry row:

 C0027051|L0027051|S0064638|MEDLINEPLUS|ET|T5|0|
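
The extraction follows the same pattern as for MRCON; in the sketch below the join between OBS_TT and OBS_OT is a guess at the schema's keys, and the TermType coding is likewise an assumption.

 # Sketch only: the tt/ot join condition and the 'PT'/'ET' TermType
 # coding are hypothetical and must be checked against the OBS schema.
 mysql -N -B -u obs_user -p -h localhost \
       -e "SELECT CONCAT_WS('|',
                            tt.conceptID, tt.termID, tt.termID,
                            ot.localOntologyID,
                            IF(tt.isPreferred = 1, 'PT', 'ET'),
                            ot.ontologyID,
                            '0', '')
           FROM OBS_TT tt
           JOIN OBS_OT ot ON ot.localOntologyID = tt.localOntologyID;" \
       obs_v1_sept09 > MRSO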


3. '''MRSTY''': This file is generated primarily from the table OBS_STT. Below is a mapping between the columns in MRSTY and OBS_STT; the listed OBS_STT columns are used to fill their corresponding columns in the MRSTY file.

{| class="wikitable"
! MRSTY !! OBS_STT
|-
| CUI (Concept Unique ID) || OBS_STT.conceptID
|-
| TUI (Type Unique ID) || OBS_STT.localSemanticTypeID
|-
| STY (Semantic Type) || OBS_STT.semanticTypeName
|}

An SQL command fills the correct values into the MRSTY table in the database.

Example entry row:

 C0027051|T047|Disease or Syndrome|
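
This is the simplest of the three extractions; the same MySQL assumptions as above apply.

 # Sketch: OBS_STT already holds the three MRSTY columns directly.
 mysql -N -B -u obs_user -p -h localhost \
       -e "SELECT CONCAT_WS('|', conceptID, localSemanticTypeID, semanticTypeName, '')
           FROM OBS_STT;" \
       obs_v1_sept09 > MRSTY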


4. '''st.raw''': This file is generated from the column semanticTypeName in the table OBS_STT. Each row contains a semantic type name and a short name for it.

Example row:

 semantic Type Name|semantictypename
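
A possible way to derive the short names, assuming (as above) a MySQL back end and that the short name is simply the lowercased, space-free form shown in the example row:

 # Sketch: one distinct row per semantic type name.
 mysql -N -B -u obs_user -p -h localhost \
       -e "SELECT DISTINCT CONCAT_WS('|', semanticTypeName,
                                     REPLACE(LOWER(semanticTypeName), ' ', ''))
           FROM OBS_STT;" \
       obs_v1_sept09 > st.raw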


=== Setting up the workspace ===

1. '''Install the Lexical Variant Generator (LVG)''' before running MetaMap's install program. LVG is part of the Lexical Tools distribution and is available from the Lexical Systems Group (http://lexsrv3.nlm.nih.gov/SPECIALIST/Projects/lvg/current/index.html).

2. Before using the MetaMap install program to install the Data File Builder, you also need to '''add LVG's bin directory''' {LVG_DIR}/bin to your program path (in bash):

export PATH=$PATH:<LVG_DIR>/bin

3. '''Install MetaMap''' using the instructions at its site.

4. Unzip the DFB installation files into the same directory as MetaMap.

5. Connect to the new directory created by extracting the distribution and invoke the install program:

cd <distribution directory>

./bin/install.sh

A sample run of the installation script follows:

 Enter basedir of installation [/nfsvol/nlsaux15/public_mm]    <user hits return to get the default>
 Basedir is set to /nfsvol/nlsaux15/public_mm.
 
 The WSD Server requires Sun's Java Runtime Environment (JRE);
 Sun's Java Developer Kit (JDK) will work as well. If the command
 "which java" returns /usr/local/jre1.4.2/bin/java, then the JRE
 resides in /usr/local/jre1.4.2/.
 Where does your distribution of Sun's JRE reside?
 Enter home path of JRE (JDK) [/usr]: /nfsvol/nls/tools/Linux-i686/java1.4.2
 Using /nfsvol/nls/tools/Linux-i686/java1.4.2 for JAVA_HOME.
 
 /nfsvol/nlsaux15/public_mm/WSD_Server/config/disambServer.cfg generated.
 /nfsvol/nlsaux15/public_mm/WSD_Server/config/log4j.properties generated.
 /nfsvol/nlsaux15/public_mm/bin/SKRrun generated.
 /nfsvol/nlsaux15/public_mm/bin/metamap07 generated.
 /nfsvol/nlsaux15/public_mm/bin/wsdserverctl generated.
 /nfsvol/nlsaux15/public_mm/bin/skrmedpostctl generated.
 Install complete.
 Would like to use a custom data set with MetaMap (use data file builder)? [yN]:    <user types y and return>
 
 running Data File Builder Install...
 Is LVG installed? [yN]    <user types y and return>
 
 running Data File Builder Install...
 Enter home path of LVG [/nfsvol/nls/tools/Linux-i686/lvg2009]:    <user hits return to get the default>
 Using /nfsvol/nls/tools/Linux-i686/lvg2009 for LVG_DIR.
 
 /nfsvol/nlsaux15/public_mm/scripts/dfbuilder/mm_variants/0doit.lvglab generated.
 /nfsvol/nlsaux15/public_mm/scripts/dfbuilder/mm_variants/0doit.xwords generated.
 Datafile Builder Setup is complete.


6. '''Make sure that the SKR/MedPOST tagger is running.''' To start the tagger, move to the public_mm directory inside the working directory and invoke:

./bin/skrmedpostctl start

In case there is an error due to a port number, change the port number in the following file and try to run the tagger again:

/bin/skrmedpostctl

If you change the port number skrmedpostctl uses, you also need to change public_mm/bin/SKRrun (or SKRrun.in, and re-run install.sh) to match the port you set in skrmedpostctl. The environment variables that should be changed in SKRrun are listed below (a sketch of the edit follows the list):

TAGGER_SERVER_DEFAULT_TCP_PORT

TAGGER_SERVER_TCP_PORT_0

TAGGER_SERVER_TCP_PORT_1
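
One possible way to apply a port change consistently, assuming the variables above appear as plain NAME=value shell assignments in SKRrun (the port 2229 is purely an example, and GNU sed is assumed for -i):

 # Sketch: locate the tagger port in skrmedpostctl, rewrite the three
 # variables in SKRrun, then restart the tagger from public_mm.
 NEW_PORT=2229
 grep -n PORT public_mm/bin/skrmedpostctl    # find the port setting to edit by hand
 sed -i -e "s/^\(TAGGER_SERVER_DEFAULT_TCP_PORT=\).*/\1${NEW_PORT}/" \
        -e "s/^\(TAGGER_SERVER_TCP_PORT_0=\).*/\1${NEW_PORT}/" \
        -e "s/^\(TAGGER_SERVER_TCP_PORT_1=\).*/\1${NEW_PORT}/" \
        public_mm/bin/SKRrun
 cd public_mm && ./bin/skrmedpostctl start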


7. Inside the directory public_mm/sourceData, create a directory for the workspace:

mkdir sourceData/09_custom

Finally, create a directory to store the knowledge sources:

mkdir sourceData/09_custom/umls

=== Installation of MetaMap ===

1. First, run the BuildDataFiles program as follows (a consolidated script covering all of steps 1-8 is sketched after this list):

$fullpath/public_mm/bin/BuildDataFiles

2. The file st.raw, located in $fullpath/public_mm/data/dfbuilder/2009, needs to be modified: append the file generated in point 4 of the Preparation of data section to it.

3. Move to $fullpath/public_mm/sourceData/09_custom/01metawordindex and execute the following scripts in order:

./01CreateWorkFiles

./02Suppress

./03FilterPrep

./04FilterStrict

./05GenerateMWIFiles

4. Move to the 02treecodes directory and run 01GenerateTreecodes:

cd ../02treecodes

./01GenerateTreecodes

5. Move to the 03Variants directory and run 01GenerateVariants:

cd ../03Variants

./01GenerateVariants

6. Move to the 04synonyms directory and run 01GenerateSynonyms:

cd ../04synonyms

./01GenerateSynonyms

7. Move to the 05abbrAcronyms directory and run 01GenerateAbbrAcronyms:

cd ../05abbrAcronyms

./01GenerateAbbrAcronyms

8. Move to $fullpath/public_mm and run:

./bin/LoadDataFiles
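
For reference, steps 1-8 can be chained into a single driver script. This is only a convenience sketch: FULLPATH and the path of the st.raw fragment built in the Preparation of data section are placeholders to fill in.

 #!/bin/bash
 # Convenience sketch of steps 1-8 above; placeholders must be adjusted.
 set -e                                   # stop if any step fails
 FULLPATH=/path/to/installation           # placeholder
 cd "$FULLPATH/public_mm"
 ./bin/BuildDataFiles                                            # step 1
 cat /path/to/custom/st.raw >> data/dfbuilder/2009/st.raw        # step 2 (placeholder path)
 cd sourceData/09_custom/01metawordindex                         # step 3
 ./01CreateWorkFiles
 ./02Suppress
 ./03FilterPrep
 ./04FilterStrict
 ./05GenerateMWIFiles
 cd ../02treecodes && ./01GenerateTreecodes                      # step 4
 cd ../03Variants && ./01GenerateVariants                        # step 5
 cd ../04synonyms && ./01GenerateSynonyms                        # step 6
 cd ../05abbrAcronyms && ./01GenerateAbbrAcronyms                # step 7
 cd "$FULLPATH/public_mm" && ./bin/LoadDataFiles                 # step 8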

=== Running MetaMap ===

To run the newly installed MetaMap DFB data set, move to the main workspace folder (public_mm) and run the command below:

bin/SKRrun -L 2009 -M /DATA/XDR -B /BDB4 -w ./lexicon ./bin/metamap09.BINARY -Z 09_custom ./resources/input -I
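
As a quick smoke test, you can point the command at a small input file; the sample sentence below is purely illustrative, and ./resources/input is simply the input path used in the command above.

 # Illustrative smoke test; the sentence is made up, the flags are those
 # of the command above.
 echo "The patient was treated for myocardial infarction." > ./resources/input
 bin/SKRrun -L 2009 -M /DATA/XDR -B /BDB4 -w ./lexicon \
     ./bin/metamap09.BINARY -Z 09_custom ./resources/input -I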