In modern enterprises, access to the right information drives productivity — but controlling who can see what is just as important. From HR policies and payroll guidelines to financial reports and legal contracts, sensitive documents must only be available to authorized users. This is where security trimming in Azure AI Search plays a crucial role. It ensures that users can only access the data they are permitted to see, even when using Retrieval-Augmented Generation (RAG) AI pipelines, which pull together enterprise data to answer natural language questions.
What is Security Trimming?
Security trimming in Azure AI Search is the process of filtering search results
at query time based on user identity, group membership, or other security principals. Instead of
enforcing direct authentication or full access-control lists (ACLs) on the search service,
Azure AI Search utilizes a filterable field in the search index, such as group_ids,
to simulate document-level authorization by dynamically filtering search results.
For example, when a user queries the search index, a filter expression matches their group IDs
to the document’s group_ids field, so only authorized documents appear in the
results.
Example Filter Query:
{
"filter": "group_ids/any(g:search.in(g, 'group_id1, group_id2'))"
}
Here, group_ids is a field in your Azure AI Search index that stores which groups a document belongs to.
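In application code, this filter expression is typically assembled at query time from the caller's group memberships. A minimal Python sketch (the helper name build_group_filter is illustrative, not part of any Azure SDK):

```python
def build_group_filter(group_ids):
    """Build an OData filter that matches documents whose group_ids
    field contains at least one of the user's groups."""
    # search.in compares each element g of the collection against
    # the comma-separated value list.
    value_list = ", ".join(group_ids)
    return f"group_ids/any(g:search.in(g, '{value_list}'))"

# A user who belongs to two groups:
print(build_group_filter(["group_id1", "group_id2"]))
# → group_ids/any(g:search.in(g, 'group_id1, group_id2'))
```

The resulting string is exactly what goes into the "filter" property of the search request.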
Why Security Trimming Matters for RAG
Retrieval-Augmented Generation (RAG) pipelines are architectures that combine document retrieval with generative AI models, synthesizing answers grounded in enterprise data sources. Without security trimming, RAG pipelines risk exposing confidential or restricted content to unauthorized users, leading to compliance violations and privacy risks.
Without Security Trimming:
- Sensitive documents might be exposed to unauthorized users.
- Compliance violations can occur.
- AI-generated answers may leak confidential information.
Key Benefits of Security Trimming:
- Confidentiality: Ensures sensitive documents are only accessible to authorized users.
- Compliance: Adheres to internal policies and regulatory requirements like GDPR.
- Context-Aware Generation: Answers are produced only from documents the user can access, preventing accidental leaks.
Use Case
Consider an enterprise scenario with two distinct user groups: HR and Finance.
- HR users should access documents like leave policies, working guidelines, and salary rules, but should never see finance records.
- Finance users require access to budgets, audits, and financial statements, but are barred from HR files.
Step 1: Defining the Search Index With a Security Field
Create an index schema including a filterable security field (group_ids) that
stores group or user IDs as a collection of strings. The field should be filterable but not
retrievable.
PUT https://[search-service].search.windows.net/indexes/securedfiles?api-version=2025-09-01
Content-Type: application/json
api-key: [ADMIN_API_KEY]
{
"name": "securedfiles",
"fields": [
{ "name": "file_id", "type": "Edm.String", "key": true, "searchable": false },
{ "name": "file_name", "type": "Edm.String", "searchable": true },
{ "name": "file_description", "type": "Edm.String", "searchable": true },
{ "name": "group_ids", "type": "Collection(Edm.String)", "filterable": true, "retrievable": false }
]
}
Key Points:
- filterable: true → allows filtering by group IDs.
- retrievable: false → prevents exposing group IDs in search responses.
With the index schema in place, the foundation for secure, scalable search is ready: each document will respect access policies from the start.
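If you build the schema programmatically before sending it to the service, it is worth sanity-checking the security field so a misconfigured index (for example, a retrievable group_ids field) never ships. A sketch of that guard in Python, mirroring the JSON payload above:

```python
import json

# Same schema as the REST payload, expressed as a Python dict; it can be
# sent with any HTTP client or the azure-search-documents SDK.
index_schema = {
    "name": "securedfiles",
    "fields": [
        {"name": "file_id", "type": "Edm.String", "key": True, "searchable": False},
        {"name": "file_name", "type": "Edm.String", "searchable": True},
        {"name": "file_description", "type": "Edm.String", "searchable": True},
        {"name": "group_ids", "type": "Collection(Edm.String)",
         "filterable": True, "retrievable": False},
    ],
}

# Guard: the security field must be filterable and must not be retrievable.
group_field = next(f for f in index_schema["fields"] if f["name"] == "group_ids")
assert group_field["filterable"] and not group_field["retrievable"]

print(json.dumps(index_schema, indent=2))
```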
Step 2: Upload Documents With Group IDs
Push documents to the index, including the groups authorized to access each document.
POST https://[search-service].search.windows.net/indexes/securedfiles/docs/index?api-version=2025-09-01
Content-Type: application/json
api-key: [ADMIN_API_KEY]
{
"value": [
{
"@search.action": "upload",
"file_id": "1",
"file_name": "secured_file_a",
"file_description": "File access restricted to Human Resources",
"group_ids": ["group_id1"]
},
{
"@search.action": "upload",
"file_id": "2",
"file_name": "secured_file_b",
"file_description": "File access restricted to HR and Recruiting",
"group_ids": ["group_id1", "group_id2"]
},
{
"@search.action": "upload",
"file_id": "3",
"file_name": "secured_file_c",
"file_description": "File access restricted to Operations and Logistics",
"group_ids": ["group_id5", "group_id6"]
}
]
}
If document groups need updating, use the merge or mergeOrUpload action:
POST https://[search-service].search.windows.net/indexes/securedfiles/docs/index?api-version=2025-09-01
Content-Type: application/json
api-key: [ADMIN_API_KEY]
{
"value": [
{
"@search.action": "mergeOrUpload",
"file_id": "3",
"group_ids": ["group_id7", "group_id8", "group_id9"]
}
]
}
By assigning group IDs at upload, security is built into the search pipeline: every document is automatically filtered for the right audience.
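The upload payloads above can be generated from your document catalog rather than written by hand. A hypothetical helper that wraps documents in the shape expected by the /docs/index endpoint:

```python
def make_upload_batch(docs):
    """Wrap documents in the payload shape expected by the /docs/index
    endpoint, tagging each with the groups authorized to access it."""
    return {
        "value": [
            {
                "@search.action": "upload",
                "file_id": d["id"],
                "file_name": d["name"],
                "file_description": d["description"],
                "group_ids": d["groups"],  # groups allowed to see this doc
            }
            for d in docs
        ]
    }

batch = make_upload_batch([
    {"id": "1", "name": "secured_file_a",
     "description": "File access restricted to Human Resources",
     "groups": ["group_id1"]},
])
```

The returned dict serializes directly to the JSON body shown in the earlier request.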
Step 3: Perform Filterable Search Query
When a user searches, issue a search query with a filter that restricts results to documents containing the user’s authorized groups.
POST https://[search-service].search.windows.net/indexes/securedfiles/docs/search?api-version=2025-09-01
Content-Type: application/json
api-key: [QUERY_API_KEY]
{
"search": "*",
"filter": "group_ids/any(g:search.in(g, 'group_id1, group_id2'))"
}
This query returns only documents where group_ids contains either "group_id1" or "group_id2", matching the user's groups.
Sample response:
[
{
"@search.score": 1.0,
"file_id": "1",
"file_name": "secured_file_a"
},
{
"@search.score": 1.0,
"file_id": "2",
"file_name": "secured_file_b"
}
]
Executing a filtered search ensures that users see only what they're authorized to access, enabling secure, context-aware AI responses.
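Putting the steps together, the query-time flow can be sketched as below. Here search_client stands in for whatever actually posts the body to the /docs/search endpoint, and user_group_ids would normally be resolved from your identity provider; both are assumptions for illustration.

```python
def trimmed_search(search_client, query_text, user_group_ids):
    """Issue a search restricted to documents the user's groups may see."""
    filter_expr = "group_ids/any(g:search.in(g, '{}'))".format(
        ", ".join(user_group_ids))
    body = {"search": query_text, "filter": filter_expr}
    return search_client(body)  # e.g. POST .../docs/search

# A stand-in client that just echoes the request body, to show its shape:
def fake_client(body):
    return body

body = trimmed_search(fake_client, "*", ["group_id1", "group_id2"])
print(body["filter"])
```

In a real RAG pipeline, the documents returned here are the only ones handed to the generative model, so the answer can never draw on content outside the user's authorization.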
How Security Trimming Works Under the Hood
Azure AI Search uses OData filter expressions to simulate document-level authorization. It filters results purely on the string values stored in the security field (group_ids); the service performs no authentication or ACL enforcement of its own, so the calling application is responsible for resolving the user's group memberships and attaching the filter to every query. This approach provides simple, performant security filtering that scales to large enterprises and integrates seamlessly into RAG AI pipelines.
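One practical detail of the OData search.in function: it accepts an optional third argument naming the delimiter used to split the value list. Choosing an explicit delimiter such as '|' avoids ambiguity if group identifiers could ever contain commas or spaces. An illustrative helper:

```python
def build_delimited_group_filter(group_ids, delimiter="|"):
    """Build a group filter using search.in's optional delimiter
    argument, so IDs containing commas or spaces parse correctly."""
    value_list = delimiter.join(group_ids)
    return f"group_ids/any(g:search.in(g, '{value_list}', '{delimiter}'))"

print(build_delimited_group_filter(["group_id1", "group_id2"]))
# → group_ids/any(g:search.in(g, 'group_id1|group_id2', '|'))
```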
Conclusion
Security trimming in Azure AI Search is essential for building enterprise-grade, compliant knowledge retrieval systems. Implementing group-based access filtering at the search layer empowers organizations to deliver personalized, secure AI experiences while safeguarding sensitive content and meeting regulatory requirements.
For AI-powered knowledge assistants leveraging RAG, security trimming should be a first-order design requirement, ensuring users receive answers strictly from content they are authorized to access.
By implementing security trimming in Azure AI Search, your enterprise ensures that AI-driven insights are both powerful and secure, delivering the right information to the right people, every time.
