# cloud-service-onboarding-agent

## Architecture

### Cloud Service Onboarding Agent process

```mermaid
stateDiagram-v2
    [*] --> request_for_new_cloud_service_onboarding
    request_for_new_cloud_service_onboarding --> API
    API --> cloud_service_onboarding_process
    state cloud_service_onboarding_process {
        [*] --> retrieve_public_documentation

        retrieve_public_documentation --> retrieve_security_documentation
        retrieve_security_documentation --> make_security_recommendations
        make_security_recommendations --> build_azure_policy
        build_azure_policy --> write_terraform
    }
    write_terraform --> update_ui
    update_ui --> [*]
```
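The state diagram above can be read as a sequential pipeline. The sketch below mirrors the step names from the diagram; the `run_onboarding` helper and its placeholder handlers are illustrative assumptions, not the repository's actual implementation (which drives these steps through the Azure AI Agent service):

```python
# Hypothetical sketch of the onboarding pipeline from the state diagram.
# Step names come from the diagram; handler bodies are placeholders.

PIPELINE_STEPS = [
    "retrieve_public_documentation",
    "retrieve_security_documentation",
    "make_security_recommendations",
    "build_azure_policy",
    "write_terraform",
]

def run_onboarding(service_name: str) -> dict:
    """Run each step in order, accumulating results in a shared context."""
    context = {"service": service_name, "completed": []}
    for step in PIPELINE_STEPS:
        # A real agent would invoke an LLM or tool call here;
        # this sketch just records that the step ran.
        context["completed"].append(step)
    return context

result = run_onboarding("Azure Cache for Redis")
print(result["completed"][-1])  # write_terraform
```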

## Disclaimer

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

## Prerequisites

### Azure Resources needed

## Deployment

### Upload local files

The cloud-security-agent will try to use internal security documentation to help decide which security policies to recommend. If you have internal security documentation, upload it to the `src/api/app/agents/cloud_security_agent/files` directory. These files are uploaded to the Azure AI Agent service, and the agent uses this data to inform its recommendations.

### Local

Run the commands in each of the following sections in a separate shell window.

  1. Authenticate your local environment with Azure (`DefaultAzureCredential` in the code uses this login to authenticate)

    az login
  2. Create a /src/api/.env file for the backend service

    AZURE_OPENAI_MODEL_DEPLOYMENT_NAME=<your-openai-model-deployment-name>
    AZURE_AI_AGENT_ENDPOINT=<your-ai-agent-endpoint>
    AZURE_AI_AGENT_API_VERSION=2025-01-01-preview
    APPLICATION_INSIGHTS_CONNECTION_STRING=<your-app-insights-connection-string>
    BING_CONNECTION_NAME=<your-bing-connection-name>
    BING_INSTANCE_NAME=<your-bing-instance-name>
  3. Create a /src/web/.env file for the frontend service

    services__api__api__0=http://127.0.0.1:8000
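Both `.env` files use the plain `KEY=VALUE` format. The app itself most likely reads these through `os.environ` or a dotenv library; the minimal stdlib-only parser below is just to illustrate the format:

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse KEY=VALUE lines, ignoring blank lines and # comments."""
    env: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")  # split on the first '=' only
        env[key.strip()] = value.strip()
    return env

sample = """\
AZURE_AI_AGENT_API_VERSION=2025-01-01-preview
services__api__api__0=http://127.0.0.1:8000
"""
env = parse_env(sample)
print(env["services__api__api__0"])  # http://127.0.0.1:8000
```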

#### Individual terminals

##### Api

  1. Navigate into the src/api directory

    cd src/api
  2. Create a virtual environment

    python -m venv .venv
  3. Activate the virtual environment (Windows)

    ./.venv/Scripts/activate
  4. Install the prerequisites

    pip install -r ./requirements.txt
  5. Run the API

    python -m uvicorn app.main:app --log-level debug

##### Web

  1. Open a new shell

  2. Navigate to the src/web directory

    cd src/web
  3. Create a virtual environment

    python -m venv .venv
  4. Activate the virtual environment (Windows)

    ./.venv/Scripts/activate
  5. Install the prerequisites

    pip install -r ./requirements.txt
  6. Run the web app

    streamlit run ./app.py
  7. Navigate to the URL that is printed

### Docker Compose

  1. Navigate to the root of the repository

    cd src
  2. Create a `.env` file in the `src` directory with the following content. `UID` and `GID` are the user and group IDs of the user that will run the `azclicredsproxy` container; they are needed because `azclicredsproxy` must be able to access the Azure CLI credentials stored in the `~/.azure` directory. If you are running Docker on Linux, you don't need the `DISTRONAME` parameter.

You can find the numerical UID and GID needed to access the `~/.azure` directory by running the following command:

```shell
stat -c "UID: %u, GID: %g" ~/.azure
```

```txt
UID=
GID=
USERNAME=
DISTRONAME=
```
  3. Update the `docker-compose.yml` file to select the appropriate volume mount for the `azclicredsproxy` service. Comment out whichever volume mount does not apply.

  4. Run the following command to build and run the Docker images locally

    docker compose up --build

### Additional logging

To surface additional logging in AI Foundry and Application Insights, set the following environment variables in whichever environment you run the services:

```shell
export SEMANTICKERNEL_EXPERIMENTAL_GENAI_ENABLE_OTEL_DIAGNOSTICS=true
export SEMANTICKERNEL_EXPERIMENTAL_GENAI_ENABLE_OTEL_DIAGNOSTICS_SENSITIVE=true
export AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED=true
export AZURE_SDK_TRACING_IMPLEMENTATION=opentelemetry
```
  - For a local process, you can set these in the `.venv/bin/activate` file so they are applied automatically whenever you activate the virtual environment.
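You can verify from Python that the flags are visible to the running process via `os.environ`. The snippet below sets and reads them in-process (children inherit them, but nothing is persisted to disk); the `TRACING_FLAGS` dict name is just for this example:

```python
import os

TRACING_FLAGS = {
    "SEMANTICKERNEL_EXPERIMENTAL_GENAI_ENABLE_OTEL_DIAGNOSTICS": "true",
    "SEMANTICKERNEL_EXPERIMENTAL_GENAI_ENABLE_OTEL_DIAGNOSTICS_SENSITIVE": "true",
    "AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED": "true",
    "AZURE_SDK_TRACING_IMPLEMENTATION": "opentelemetry",
}

# Set the flags for this process; child processes inherit them.
os.environ.update(TRACING_FLAGS)

missing = [k for k, v in TRACING_FLAGS.items() if os.environ.get(k) != v]
print("all tracing flags set" if not missing else f"missing: {missing}")
```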
