This sample contains a .NET RAG front-end application and a FastAPI back end that hosts a CPU-optimized Phi-3 mini model. They can run together in a Linux App Service app with a sidecar.


---
page_type: sample
languages:
- dotnet
- python
- javascript
- java
- bicep
- html
products:
- azure
- azure-app-service
urlFragment: ai-slm-in-app-service-sidecar
name: Tutorial sample - Run a local SLM in a sidecar container in Azure App Service
description: Contains sample projects that demonstrate how to run a local small language model in a sidecar container within Azure App Service.
---

Tutorial Sample - Run a Local SLM in a Sidecar Container in Azure App Service

This repository contains sample projects that demonstrate how to run a local small language model (SLM) in a sidecar container within Azure App Service.
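Because App Service runs the sidecar next to the main container, the front end can reach the model over localhost. The sketch below shows that request flow using only the Python standard library; the port, path, and payload shape are assumptions for illustration, not the sample's actual API.

```python
# Sketch of a front-end process calling the SLM sidecar over localhost.
# The URL, route, and JSON fields below are illustrative assumptions.
import json
import urllib.request

SIDECAR_URL = "http://localhost:8000/generate"  # assumed sidecar address

def build_request(prompt: str, max_tokens: int = 128) -> urllib.request.Request:
    """Build a POST request carrying the prompt as JSON."""
    body = json.dumps({"text": prompt, "max_tokens": max_tokens}).encode()
    return urllib.request.Request(
        SIDECAR_URL, data=body, headers={"Content-Type": "application/json"}
    )

def ask_slm(prompt: str) -> str:
    """Send the prompt to the sidecar and return its completion text."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.loads(resp.read())["completion"]
```

In App Service, the main container and its sidecars share the same network namespace, so no external hostname or authentication hop is needed between them.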

Project Overview

This repository demonstrates two key use cases for AI sidecar containers in Azure App Service:

  1. Custom SLM Sidecar Container (located in bring_your_own_slm/):
    • The src/phi-3-sidecar/ directory contains the FastAPI back end and its Dockerfile for hosting a custom SLM.
    • The src/webapp/ directory contains a .NET chatbot front end that interacts with the custom SLM.

Contributing

We welcome contributions! Please see the CONTRIBUTING.md file for guidelines.

License

This project is licensed under the MIT License. See the LICENSE.md file for details.
