Adds Agentic Audience#210

Merged
hillslatt merged 7 commits into main from feature/adds-agentic-audiences
Mar 30, 2026

Conversation


@therevoltingx therevoltingx commented Mar 22, 2026

Add Agentic Audiences Community Extension

This PR adds a community extension document for Agentic Audiences in OpenRTB. Agentic Audiences (formerly the User Context Protocol/UCP) is an IAB Tech Lab standard that defines how intelligent agents in advertising exchange vector embeddings representing consumer intent and ad response.

Summary

The extension specifies how embeddings are sent in bid requests. They are placed in BidRequest.user.data using the standard Data and Segment objects.

Structure

  • Each provider uses one Data object with a name (provider identifier) and a segment array.
  • Each segment is a standard OpenRTB Segment with id, name, and ext.
  • Agentic Audiences fields go in Segment.ext: ver, vector, model, dimension, and type.
  • The type array indicates signal type: 1 = identity, 2 = contextual, 3 = reinforcement.
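The structure above can be sketched as a bid request fragment. This is illustrative only: the provider name, segment id, model name, and vector values are made up, but the field placement (Agentic Audiences fields under `Segment.ext`) follows this extension.

```javascript
// Illustrative BidRequest.user fragment (provider name and values are
// hypothetical). Agentic Audiences fields hang off Segment.ext.
const bidRequestUser = {
  user: {
    data: [
      {
        name: "exampleprovider",           // provider identifier (made up)
        segment: [
          {
            id: "aa-1",
            ext: {
              ver: "1.0",                   // version of the generating model
              vector: [0.12, -0.34, 0.56],  // embedding (base64 in practice)
              model: "example-model",
              dimension: 3,
              type: [2]                     // 2 = contextual
            }
          }
        ]
      }
    ]
  }
};

console.log(bidRequestUser.user.data[0].segment[0].ext.type[0]); // 2
```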

Files Changed

  • Added: extensions/community_extensions/agentic-audiences.md – full specification with attribute definitions, embedding type values, single and multiple provider examples, client-side storage notes, and implementation guidance
  • Modified: extensions/community_extensions/README.md – added Agentic Audiences entry

## Request Change

This community extension defines how Agentic Audiences embeddings are conveyed in OpenRTB bid requests. The extension uses the existing `Data` and `Segment` objects in `BidRequest.user.data`. Each data provider supplies one or more segment entries, where each entry is a vector embedding with metadata describing its type and model.
Contributor

Community extensions use ext fields, so all these extra fields should hang off user.data.segment.ext if you're going to use segment

Author

fixed

# Agentic Audiences in OpenRTB

**Sponsors**: LiveRamp

Contributor

You can add Raptive

Author

added

"data": [
{
"name": "live_ramp",
"segment": [
Contributor

The next five fields should hang off ext, not segment directly, or this isn't an extension.

Also, according to the AA spec this is rather incomplete

Author

i fixed it so that the aa-specific fields are under ext. regarding the incompleteness:
these were the minimum fields we thought would be needed in openrtb.
do you believe we should add all fields in the aa spec? are you concerned with request size bloat?

Contributor

I am worried about that but the appendix audience spec says a bunch of fields are required and then it lists even more optional fields, so I'm not sure which is right

Contributor

Fixed the ext, ty


The aa spec's required fields could be pruned down a bit. Also the original spec was meant to cover the targeting side as well, for example metric specification isn't as relevant on the supply side.

Author


we can either:

  1. move forward with this small set of fields and add more as needed, or
  2. make it follow the AA spec exactly, including nested fields and long-form names

Author


@adam-zimmerman @patmmccann i think we should move forward with #1 and add fields as needed by the community


Agree, let's stick with #1.

Contributor


Okay but then you should change the aa spec to move more things to optional

"user": {
"data": [
{
"name": "live_ramp",
Contributor

Liveramp is one word

Author

fixed

"name": "live_ramp",
"segment": [
{
"ver": "1.0",
Contributor

The spec calls for much longer vectors, and the arrays of floats would require much more precision than one decimal place, so I think this example is misleading. It is likely wise to use a more compact data structure than a vector spelled out like this.

Author

moved to ext, but this field is for version: the version of the model that was used to generate the embedding, not the actual vector or array of floats.

Contributor

Yeah I meant the line below, looks like we already came to agreement ty

"segment": [
{
"ver": "1.0",
"vector": [0.1, -0.2, 0.3],
Contributor

Maybe something like this instead?

const floats = [
  1.2345678, -2.5, 0.0, 3.1415927, 12345.678,
  -0.00012345, 42.42, -999.999, 0.000001, 987654.25
];

function floats32ToBase64(arr) {
  const buffer = new ArrayBuffer(arr.length * 4);
  const view = new DataView(buffer);

  arr.forEach((x, i) => view.setFloat32(i * 4, x, true)); // little-endian

  const bytes = new Uint8Array(buffer);
  let binary = "";
  for (const b of bytes) binary += String.fromCharCode(b);

  return btoa(binary);
}

console.log(floats32ToBase64(floats));
// UQaePwAAIMAAAAAA2w9JQLbmQEZbcgG5FK4pQvD/ecS9N4Y1ZCBxSQ==

Author

you're proposing that instead of an array of floats, the array is encoded to base64. is that to reduce the size of the request, or is there an issue with just transporting an array of floats?

Contributor

Just to reduce the size, the spec talks about it being 512 or 1024 entries long
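To make the size argument concrete, here is a back-of-envelope comparison for a 1024-entry vector. This is a sketch under my own assumption that float32 precision is sufficient; the values are synthetic.

```javascript
// Rough size comparison for a 1024-dimension embedding (synthetic values;
// assumption: float32 precision is sufficient for the use case).
const dims = 1024;
const vec = Array.from({ length: dims }, (_, i) => Math.sin(i + 1) * 100);

// As a JSON array of floats, each entry costs roughly 15-20 characters
// at full double precision, plus separators.
const jsonBytes = JSON.stringify(vec).length;

// Packed as float32 (4 bytes each) then base64-encoded: base64 emits
// 4 output characters per 3 input bytes, rounded up.
const base64Bytes = 4 * Math.ceil((dims * 4) / 3);

console.log(jsonBytes, base64Bytes); // base64 is roughly 3x smaller here
```

The gap grows with the precision of the serialized floats, which is why the base64 form stays attractive at 512 or 1024 dimensions.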


I was also thinking the array itself should be base64 encoded

Author

@patmmccann @adam-zimmerman i went ahead and changed vector to be a base64 encoded value. along with examples on encoding and decoding
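For reference, a minimal decode counterpart might look like this. It is a sketch assuming the little-endian float32 packing used by the encoder snippet earlier in this thread; the function name is my own.

```javascript
// Sketch: decode a base64-encoded vector back to an array of floats.
// Assumes little-endian float32 packing, matching the encoder above.
function base64ToFloat32s(b64) {
  const binary = atob(b64);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) bytes[i] = binary.charCodeAt(i);

  const view = new DataView(bytes.buffer);
  const out = [];
  for (let i = 0; i + 4 <= bytes.length; i += 4) {
    out.push(view.getFloat32(i, true)); // little-endian
  }
  return out;
}
```

Note that the float32 roundtrip is lossy relative to JavaScript doubles, so consumers comparing vectors should tolerate small differences.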

@therevoltingx
Author

@patmmccann @adam-zimmerman I have addressed all your comments. Can you please review and approve the entire PR so we can move forward? Thanks.

@therevoltingx
Author

@hillslatt we're good to merge this PR

@hillslatt hillslatt merged commit eebe6bd into main Mar 30, 2026