# HackRx Insurance Policy Assistant
A FastAPI application that processes PDF documents and answers questions about them using AI, deployed on Hugging Face Spaces.
## Features
- PDF document parsing and text extraction
- Vector-based document search using FAISS
- AI-powered question answering using Google Gemini
- RESTful API endpoints for document processing
## API Endpoints
### Health Check
- `GET /` - Root endpoint
- `GET /health` - API status check
### Process PDF from URL
- `POST /api/v1/hackrx/run`
- **Headers**: `Authorization: Bearer <your_token>`
- **Body**:
```json
{
  "documents": "https://example.com/document.pdf",
  "questions": ["What is the coverage amount?", "What are the exclusions?"]
}
```
### Process Local PDF File
- `POST /api/v1/hackrx/local`
- **Body**:
```json
{
  "document_path": "/app/files/document.pdf",
  "questions": ["What is the coverage amount?", "What are the exclusions?"]
}
```
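
For example, with Python's `requests` (a minimal sketch mirroring the body above; no `Authorization` header is listed for this endpoint, so none is sent here, and the Space URL is a placeholder):

```python
import requests

# Hypothetical call to the local-file endpoint; the document must already
# exist inside the container at the given path.
url = "https://your-space-name.hf.space/api/v1/hackrx/local"
data = {
    "document_path": "/app/files/document.pdf",
    "questions": ["What is the coverage amount?", "What are the exclusions?"],
}
response = requests.post(url, json=data)
print(response.json())
```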
## Environment Variables
Set these in your Hugging Face Space settings:
- `GOOGLE_API_KEY` - Your Google Gemini API key
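
For local runs, a quick sanity check before starting the server can save a failed request later (a small sketch; it assumes the app reads `GOOGLE_API_KEY` from the environment, as listed above):

```python
import os

# Fail fast if the Gemini key is missing from the environment.
assert os.environ.get("GOOGLE_API_KEY"), "GOOGLE_API_KEY is not set"
```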
## Usage Examples
### Using curl
```bash
# Health check
curl https://your-space-name.hf.space/

# Process PDF from URL
curl -X POST https://your-space-name.hf.space/api/v1/hackrx/run \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your_token_here" \
  -d '{
    "documents": "https://example.com/insurance-policy.pdf",
    "questions": ["What is the coverage amount?", "What are the exclusions?"]
  }'
```
### Using Python
```python
import requests

# Health check
response = requests.get("https://your-space-name.hf.space/")
print(response.json())

# Process PDF
url = "https://your-space-name.hf.space/api/v1/hackrx/run"
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer your_token_here",
}
data = {
    "documents": "https://example.com/insurance-policy.pdf",
    "questions": ["What is the coverage amount?", "What are the exclusions?"],
}
response = requests.post(url, headers=headers, json=data)
print(response.json())
```
## Local Development
To run the application locally:
```bash
pip install -r requirements.txt
python app.py
```
The API will be available at `http://localhost:7860`.
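
Once the server is up, a quick request confirms it is responding (a simple sketch; the exact response payload of `/health` is not documented here, so this just prints whatever comes back):

```python
import requests

# Hit the local health endpoint started by `python app.py`.
response = requests.get("http://localhost:7860/health")
print(response.status_code, response.text)
```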
## Deployment
This application is configured for deployment on Hugging Face Spaces using Docker. The following files are included:
- `app.py` - Main application entry point
- `Dockerfile` - Docker configuration
- `.dockerignore` - Docker build optimization
- `requirements.txt` - Python dependencies
## Model Information
- **Framework**: FastAPI
- **AI Model**: Google Gemini
- **Vector Database**: FAISS
- **Document Processing**: PyMuPDF
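
For orientation, the sketch below shows how these pieces typically fit together in a retrieval-augmented flow: extract text with PyMuPDF, embed chunks, index them in FAISS, retrieve the closest chunks for a question, and let Gemini answer from that context. It is illustrative only, not the repo's actual implementation; the chunking strategy, the embedding model (`models/text-embedding-004`), and the generation model (`gemini-1.5-flash`) are assumptions.

```python
import os

import faiss
import fitz  # PyMuPDF
import numpy as np
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])


def pdf_to_chunks(path: str, chunk_size: int = 1000) -> list[str]:
    """Extract text with PyMuPDF and split it into fixed-size chunks."""
    doc = fitz.open(path)
    text = "".join(page.get_text() for page in doc)
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]


def embed(texts: list[str]) -> np.ndarray:
    """Embed texts with a Gemini embedding model (model name is an assumption)."""
    vectors = [
        genai.embed_content(model="models/text-embedding-004", content=t)["embedding"]
        for t in texts
    ]
    return np.array(vectors, dtype="float32")


def answer(question: str, chunks: list[str], index: faiss.IndexFlatL2, k: int = 3) -> str:
    """Retrieve the k most similar chunks and ask Gemini to answer from them."""
    _, ids = index.search(embed([question]), k)
    context = "\n\n".join(chunks[i] for i in ids[0])
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    model = genai.GenerativeModel("gemini-1.5-flash")
    return model.generate_content(prompt).text


chunks = pdf_to_chunks("insurance-policy.pdf")
vectors = embed(chunks)
index = faiss.IndexFlatL2(vectors.shape[1])
index.add(vectors)
print(answer("What is the coverage amount?", chunks, index))
```

`IndexFlatL2` performs an exact nearest-neighbour search, which is plenty for a single policy document; a larger corpus would usually call for an approximate FAISS index.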