| page_type | languages | products | urlFragment | name | description |
|---|---|---|---|---|---|
| sample | | | ai-slm-in-app-service-sidecar | Tutorial sample - Run a local SLM in a sidecar container in Azure App Service | Contains sample projects that demonstrate how to run a local small language model in a sidecar container within Azure App Service. |
This repository contains sample projects that demonstrate how to run a local small language model (SLM) in a sidecar container within Azure App Service. It covers two key use cases for AI sidecar containers:
- **Prebuilt AI sidecar extensions** (located in `use_sidecar_extension/`): This directory contains multiple applications that demonstrate how to integrate with existing AI sidecar extensions. Examples include:
  - `dotnetapp/`: A .NET chatbot application. For more information, see Tutorial: Run chatbot in App Service with a Phi-3 sidecar extension (ASP.NET Core).
  - `expressapp/`: A Node.js Express chatbot application. For more information, see Tutorial: Run chatbot in App Service with a Phi-3 sidecar extension (Express.js).
  - `fastapiapp/`: A Python FastAPI chatbot application. For more information, see Tutorial: Run chatbot in App Service with a Phi-3 sidecar extension (FastAPI).
  - `springapp/`: A Java Spring Boot chatbot application. For more information, see Tutorial: Run chatbot in App Service with a Phi-3 sidecar extension (Spring Boot).
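In each of these samples, the app talks to the SLM over `localhost`, because the sidecar container shares the network namespace of the main app container. A minimal sketch of that pattern in Python (the port and the OpenAI-style `/v1/chat/completions` path follow the Phi-3 sidecar extension tutorials linked above, but treat them as assumptions and adjust the URL for your extension):

```python
import json
import urllib.request

# Assumed endpoint exposed by the Phi-3 sidecar extension on localhost;
# verify the host, port, and path against the tutorial for your setup.
SIDECAR_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(user_message: str) -> dict:
    """Build an OpenAI-style chat completion payload for the sidecar SLM."""
    return {
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,
    }

def ask_slm(user_message: str) -> str:
    """POST the prompt to the sidecar and return the model's reply text."""
    payload = json.dumps(build_chat_request(user_message)).encode("utf-8")
    req = urllib.request.Request(
        SIDECAR_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-style response shape: first choice holds the assistant message.
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_slm("What is a sidecar container?"))
```

The same request shape applies whether the client is .NET, Express, FastAPI, or Spring Boot; only the HTTP client library changes.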
- **Custom SLM sidecar container** (located in `bring_your_own_slm/`):
  - The `src/phi-3-sidecar/` directory contains the FastAPI back end and its Dockerfile for hosting a custom SLM.
  - The `src/webapp/` directory contains a .NET chatbot front end that interacts with the custom SLM.
We welcome contributions! Please see the CONTRIBUTING.md file for guidelines.
This project is licensed under the MIT License. See the LICENSE.md file for details.