A Flask-based REST API that integrates with Azure OpenAI to provide AI-powered responses to user questions.
This project exposes a single REST endpoint where users submit questions and receive answers generated by Azure OpenAI's language models via LangChain.
- REST API endpoint for submitting questions
- Integration with Azure OpenAI services
- Environment-based configuration
- Error handling and validation
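The validation the `/ask` endpoint performs can be sketched as a small pure function (the helper name `validate_ask_request` and the exact error messages are illustrative, not the project's actual code):

```python
def validate_ask_request(body):
    """Validate the JSON body of a POST /ask request.

    Returns (error_message, status_code); (None, 200) means the request
    is valid. Names and messages are illustrative, not the project's own.
    """
    if not isinstance(body, dict):
        return "Request body must be a JSON object", 400
    question = body.get("question")
    if not isinstance(question, str) or not question.strip():
        return "Missing or empty 'question' field", 400
    return None, 200
```

Running this check before calling Azure OpenAI lets the route reject malformed requests with a 400 without spending an API call.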
- Python 3.11
- Azure OpenAI API access
- Required environment variables (see below)
```bash
# Create a virtual environment
python3.11 -m venv venv

# Activate the virtual environment
# On macOS/Linux:
source venv/bin/activate
# On Windows:
# venv\Scripts\activate

# Install required packages
pip install -r requirements.txt
```

Create a `.env` file in the project root with the following variables:
```env
AZURE_OPENAI_API_KEY=your_api_key
AZURE_OPENAI_API_ENDPOINT=your_endpoint
AZURE_OPENAI_API_VERSION=your_api_version
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME=your_deployment_name
```
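One way the app might load these variables at startup is a fail-fast check (a standard-library sketch; the actual project may rely on `python-dotenv` and LangChain's own configuration handling instead):

```python
import os

# The four variables the README requires; missing any of them should
# stop the app at startup rather than fail on the first request.
REQUIRED_VARS = [
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_API_ENDPOINT",
    "AZURE_OPENAI_API_VERSION",
    "AZURE_OPENAI_CHAT_DEPLOYMENT_NAME",
]

def load_azure_config():
    """Read the required Azure OpenAI settings from the environment,
    raising a clear error listing every variable that is missing."""
    missing = [name for name in REQUIRED_VARS if not os.getenv(name)]
    if missing:
        raise RuntimeError(
            f"Missing environment variables: {', '.join(missing)}"
        )
    return {name: os.environ[name] for name in REQUIRED_VARS}
```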
```bash
python app.py
```

The server will start on port 3000 and can be accessed at http://localhost:3000.
Endpoint: `POST /ask`

Request body:

```json
{
  "question": "Your question here"
}
```

Response:

```json
{
  "answer": "AI-generated response",
  "status": "success"
}
```

The API returns appropriate HTTP status codes and error messages:
- 400: Missing or invalid request parameters
- 500: Server-side errors
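For reference, the endpoint can be called from Python with just the standard library (a sketch; the server must be running at http://localhost:3000 for `ask()` to succeed, and the function names here are illustrative):

```python
import json
import urllib.request

def build_ask_request(question, base_url="http://localhost:3000"):
    """Build the POST /ask request object without sending it."""
    payload = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/ask",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask(question, base_url="http://localhost:3000"):
    """Send a question to the API and return the decoded JSON response."""
    with urllib.request.urlopen(build_ask_request(question, base_url)) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    # Requires the Flask server to be running locally.
    print(ask("What is LangChain?"))
```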