Named Entity Recognition (NER) is one of the most popular and in-demand NLP tasks. As NER has expanded, it has also become more domain specific. Building a custom NER model for a specific domain such as healthcare can be difficult and require extensive amounts of data and computing power. Amazon Comprehend is a high-level AWS service that automates many NLP tasks such as sentiment analysis, topic modeling, and NER. AWS also offers a sub-service called Comprehend Medical that is specifically geared toward medical NER. In this article we will cover how to build a web application with Streamlit that calls Comprehend Medical and returns the medical entities it detects. The article assumes foundational knowledge of AWS, the ability to navigate the AWS console, and Python basics. I'll provide a list of the services we are going to use, along with more in-depth definitions, following this blurb, but feel free to skip ahead to the code demonstration if you are already familiar with these services.
Table of Contents
- AWS Services Overview
- Architecture
- Frontend Creation with Streamlit
- Creating Lambda Function and REST API
- Integrate Lambda Function with AWS Comprehend Medical
- Hook up Front-End and Back-End
- Code & Conclusion
1. AWS Services
AWS Comprehend Medical: A HIPAA-eligible NLP service that provides a high-level API for extracting health data from unstructured text.
AWS Lambda: A serverless compute service that lets developers run code without provisioning or managing servers. We will use it to call AWS Comprehend Medical for NER and to return the results to the front-end through the REST API.
Boto3: The AWS Software Development Kit (SDK) for Python. We use it inside our Lambda function to access other AWS services such as Comprehend Medical.
AWS Identity and Access Management (IAM): Lets you manage access to AWS services through permissions, policies, and roles. We will create a role that allows our Lambda function to access Comprehend Medical and API Gateway.
AWS API Gateway (API GW): A service that lets developers create, publish, and monitor secure RESTful and WebSocket APIs.
2. Architecture

The architecture is relatively simple. We will build a Streamlit web application that calls a REST API created with Amazon API Gateway. The REST API serves as the interface to a backend Lambda function, which uses the Boto3 SDK to call Comprehend Medical for medical NER.
3. Frontend Creation with Streamlit
For our application we’ll be creating a simple front-end using a nifty Python library called Streamlit that lets Python developers and data scientists get web applications/dashboards up and running quickly. All we need for our application is some headers and a text box for the input text that we will be analyzing to detect any potential medical entities.
The requests library will later be used to access the API we create, but for now all we need is a basic front-end template, sketched below. If you run "streamlit run filename.py" you should see the setup.
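Here is a minimal sketch of such a front-end, assuming a file named app.py; the title and header text are illustrative rather than taken from the original code.

```python
# app.py - minimal Streamlit front-end sketch (file name and wording are illustrative)
import streamlit as st
import requests  # used later to call the REST API we create

st.title("Medical Named Entity Recognition")
st.header("Powered by AWS Comprehend Medical")

# Text box for the input text we want to analyze for medical entities
user_input = st.text_area("Enter medical text to analyze", height=200)
```

Running "streamlit run app.py" should bring up a page with the title, header, and text box.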

4. Create Lambda Function and REST API
NOTE: This step is a little arduous and involves many small details for the API creation that I skip for simplicity's sake. If you would prefer a video demonstration of this portion, go ahead and follow this link.
Now it's time to head over to the AWS console. Once you've logged in, open the IAM service. Before we can work with Lambda, we need to create a role for our Lambda function that gives it permission to work with Comprehend Medical and API Gateway. From the IAM service, click Roles on the left side of the page and then Create Role. Choose the service the role is for, which is Lambda in our case. Click Next: Permissions, and search for the policies to attach to the role. Make sure to check AmazonComprehendFullAccess, AmazonAPIGatewayFullAccess, and AmazonAPIGatewayAdministrator.

Go ahead and click Create Role, then head over to the Lambda service. Click Create Function, name your Lambda function, choose Python 3.8 as the runtime, and expand Change default execution role. Choose Use an existing role and select the role you just created for Lambda.

Go ahead and click Create Function, and our Lambda function is ready to go. The next step is creating a REST API with API Gateway and integrating it with our Lambda function. Go to the API Gateway service in the console and click Create API. Choose Build under REST API, name your API, and click Create.

Once you've created your REST API, add a POST method so that we can send data from the front-end to the back-end Lambda function. After creating the appropriate resources and methods, deploy your API and enable CORS. We now have a REST API that we can use to connect the front-end and back-end.
5. Integrate Lambda Function with AWS Comprehend Medical
Now that the general flow of the architecture has been established, we can focus on the back-end work of integrating Comprehend Medical for NER. Using the Boto3 library, we call the detect_entities API on whatever input text is entered. Comprehend Medical's detect_entities can classify terms into five categories: ANATOMY, MEDICAL_CONDITION, MEDICATION, PROTECTED_HEALTH_INFORMATION, and TEST_TREATMENT_PROCEDURE. For further documentation of the Boto3 API calls for Comprehend Medical, follow this link.
Notice that you must include a headers section in the JSON response we return to avoid CORS errors. A sketch of the Lambda handler is shown below.
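Here is a minimal sketch of what the handler could look like, assuming the front-end sends a JSON body with a "text" field and that we trim the response down to the fields we want to display; the exact event shape depends on how your API Gateway integration is configured.

```python
import json
import boto3

# Client for Comprehend Medical; the Lambda role must allow access to the service
comprehend_medical = boto3.client("comprehendmedical")

def lambda_handler(event, context):
    # The "text" key is an assumption about how the front-end packages the input;
    # with a Lambda proxy integration, event["body"] arrives as a JSON string
    body = json.loads(event["body"]) if isinstance(event.get("body"), str) else event
    text = body.get("text", "")

    # Call Comprehend Medical for medical NER on the input text
    response = comprehend_medical.detect_entities(Text=text)

    # Keep only the fields we need on the front-end
    entities = [
        {"Text": e["Text"], "Category": e["Category"], "Type": e["Type"], "Score": e["Score"]}
        for e in response["Entities"]
    ]

    return {
        "statusCode": 200,
        # Headers are required so the browser does not raise CORS errors
        "headers": {
            "Access-Control-Allow-Origin": "*",
            "Access-Control-Allow-Headers": "Content-Type",
        },
        "body": json.dumps({"Entities": entities}),
    }
```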
6. Hook up Front-End and Back-End
Now that our API is deployed, we need to call it from the front-end so that we can feed the input text to the Lambda function. We use the Python requests library to access the REST API we have created.
The first part of the function takes the user input and passes it to our REST API. After Comprehend Medical returns the entities it has detected, we parse the data into a presentable format that we can write back to the front-end, as in the sketch below.
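A sketch of that logic, continuing the Streamlit file from earlier, might look like the following. The API URL is a placeholder for your deployed endpoint, and the request and response keys match the hypothetical Lambda sketch above (assuming a Lambda proxy integration, so the response body is the JSON the handler returns).

```python
# Continues the Streamlit app from the front-end sketch above;
# user_input comes from the text_area defined earlier in the file.
import requests
import streamlit as st

API_URL = "https://<api-id>.execute-api.<region>.amazonaws.com/<stage>"  # placeholder endpoint

if st.button("Detect Medical Entities") and user_input:
    # Send the input text to the REST API, which invokes the Lambda function
    response = requests.post(API_URL, json={"text": user_input})
    entities = response.json().get("Entities", [])

    # Parse the detected entities into a presentable format for the page
    for entity in entities:
        st.write(
            f"{entity['Text']}: {entity['Category']} ({entity['Type']}), "
            f"score {entity['Score']:.2f}"
        )
```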

7. Code & Conclusion
To access all the code for this demonstration, go to the link posted above. AWS has a constantly expanding set of high-level AI services that can automate and power applications for people who are short on time or less experienced with building custom ML models.
I hope this article has been useful for anyone trying to work with AWS and NLP. Feel free to leave any feedback in the comments or connect with me on LinkedIn if you're interested in chatting about ML, Data Science, and AWS. Thank you for reading!