Add Support for BYOM

Get Started with BYOM in AI+ Studio

Sprinklr follows an LLM-agnostic approach to Generative AI, giving you the flexibility to choose the language model that best fits your business needs. Along with Sprinklr’s in-house and partner models, you can use the Bring Your Own Model (BYOM) option to integrate your internally developed LLMs directly into AI+ Studio.
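
Below is a minimal sketch, assuming your internally developed model is exposed as an HTTPS chat-completion style endpoint that BYOM can point to. The route (/v1/chat/completions), the payload fields, the Authorization header, and the run_internal_model helper are illustrative assumptions for this example only, not Sprinklr’s actual BYOM contract; the exact fields required when registering the model are covered in the BYOM setup steps.

# Illustrative sketch of an internally hosted LLM endpoint (assumed shape, not Sprinklr's BYOM contract).
from flask import Flask, jsonify, request

app = Flask(__name__)
API_KEY = "replace-with-your-internal-key"  # hypothetical shared secret for this example

def run_internal_model(prompt: str) -> str:
    # Placeholder for your in-house inference call (e.g. a local model server).
    return f"Model response to: {prompt}"

@app.route("/v1/chat/completions", methods=["POST"])
def chat_completions():
    # Reject calls that do not carry the expected key.
    if request.headers.get("Authorization") != f"Bearer {API_KEY}":
        return jsonify({"error": "unauthorized"}), 401
    body = request.get_json(force=True) or {}
    messages = body.get("messages") or [{}]
    prompt = messages[-1].get("content", "")
    # Return the completion in a chat-style response envelope.
    return jsonify({
        "choices": [
            {"message": {"role": "assistant", "content": run_internal_model(prompt)}}
        ]
    })

if __name__ == "__main__":
    app.run(port=8080)

In this sketch, once an endpoint like this is reachable over HTTPS with a valid key, it is the kind of internally hosted model that could then be registered as a BYOM provider under AI+ Studio’s Provider and Models Settings.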

