Interface IContentSafetyService

Namespace
FoundationaLLM.Gatekeeper.Core.Interfaces
Assembly
FoundationaLLM.Gatekeeper.Core.dll

Interface for calling a content safety service.

public interface IContentSafetyService
Methods

AnalyzeText(string)

Checks whether text content is safe based on pre-configured content filters.

Task<AnalyzeTextFilterResult> AnalyzeText(string content)

Parameters

content string

The text content that needs to be analyzed.

Returns

Task<AnalyzeTextFilterResult>

The text analysis result, which includes a boolean flag indicating whether the content is considered safe. If the content is unsafe, the result also includes the reason.
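
The call pattern above can be sketched as follows. This is a hypothetical usage example, not part of the library: the `IsSafe` and `Reason` property names on AnalyzeTextFilterResult are assumptions made for illustration, so check the actual type before relying on them.

```csharp
// Hypothetical caller sketch; IsSafe/Reason are assumed property names.
public class ContentGatekeeper
{
    private readonly IContentSafetyService _contentSafetyService;

    public ContentGatekeeper(IContentSafetyService contentSafetyService) =>
        _contentSafetyService = contentSafetyService;

    public async Task<string> ProcessAsync(string userText)
    {
        var result = await _contentSafetyService.AnalyzeText(userText);

        // If the pre-configured content filters flagged the text,
        // surface the reason returned with the analysis result.
        if (!result.IsSafe)
            return $"Content rejected: {result.Reason}";

        return "Content accepted.";
    }
}
```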

DetectPromptInjection(string)

Detects attempted prompt injections and jailbreaks in user prompts.

Task<string?> DetectPromptInjection(string content)

Parameters

content string

The text content that needs to be analyzed.

Returns

Task<string?>

The text analysis result, which indicates whether the content contains prompt injections or jailbreaks.
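
A minimal sketch of calling this method follows. Note the hedged assumption: given the Task<string?> return type, the example treats a null result as "no injection or jailbreak detected" and a non-null string as a description of the detected attempt; the documentation does not confirm this convention.

```csharp
// Hypothetical usage sketch; the null-means-safe convention is an
// assumption inferred from the Task<string?> return type.
public static async Task<bool> IsPromptSafeAsync(
    IContentSafetyService contentSafetyService, string prompt)
{
    string? detection = await contentSafetyService.DetectPromptInjection(prompt);

    // Assumed: null => nothing detected; non-null => details of the
    // detected prompt injection or jailbreak attempt.
    return detection is null;
}
```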