Cloudflare
Overview
This integration uses a Cloudflare Worker to stream CDN and edge activity logs directly to the Agent Analytics ingestion endpoint. The Worker runs as lightweight middleware on your routes, providing a reliable, high-throughput mechanism for exporting real-time log data to an external HTTPS destination, making it ideal for monitoring traffic behavior and AI agent interactions at scale.
For additional details about Workers, refer to Cloudflare’s official documentation.
Step 1
Open Terminal and run the following npm command:
npm create cloudflare@latest -- log-collector
Select the “Worker only” option.

Select the “TypeScript” language.
The CLI will now create a new project named log-collector and install the necessary dependencies. Select Git for version control.
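After scaffolding completes, the generated project typically contains a layout like the following (exact file names can vary between create-cloudflare versions; some generate wrangler.jsonc instead of wrangler.json):

```
log-collector/
├── src/
│   └── index.ts
├── package.json
├── tsconfig.json
└── wrangler.json
```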
Step 2
Configure your worker
Edit your wrangler.json file to configure the environment variables and the route binding:
Replace the placeholder pattern example.com/* with your organization’s actual domain. In most cases, this will correspond to your primary marketing site. For example, if your marketing site is https://www.example.com, the appropriate pattern would be www.example.com/*. The zone_name should reflect your canonical root domain, excluding the “www.” prefix.
{
  "$schema": "node_modules/wrangler/config-schema.json",
  "name": "log-collector",
  "main": "src/index.ts",
  "compatibility_date": "2025-01-29",
  "observability": {
    "enabled": true
  },
  "route": {
    "pattern": "example.com/*",
    "zone_name": "example.com"
  },
  "vars": { "LIMY_ENDPOINT": "https://stream.getlimy.ai" }
}

Then copy the TypeScript code into src/index.ts:
If you’re unsure, contact us at [email protected]
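Before pasting the full Worker, it may help to see how the response size is estimated. The standalone sketch below (an illustration, not part of the Worker file) shows the header-byte arithmetic the code uses: each header serializes on the wire as key, then “: ”, then value, then “\r\n”, so each one contributes key length + value length + 4 bytes.

```typescript
// Standalone sketch of the header-size estimate used by the Worker.
// Each header serializes as "key: value\r\n": +2 for ": " and +2 for "\r\n".
function headerBytes(headers: Record<string, string>): number {
  return Object.entries(headers).reduce(
    (total, [key, value]) => total + key.length + value.length + 4,
    0
  );
}

// "content-type: text/html\r\n" is 12 + 9 + 4 = 25 bytes.
console.log(headerBytes({ "content-type": "text/html" })); // 25
```

The Worker adds this header total to the response body size (read via blob()) to report total bytes sent per request.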
/**
 * Cloudflare Worker for Log Collection
 *
 * This Worker captures HTTP request/response data and forwards it to Limy's log collection API.
 * It runs as middleware, meaning it doesn't interfere with the actual request handling.
 */
export interface Env {
  LIMY_ENDPOINT: string; // ingestion URL, set in wrangler.json "vars"
  LIMY_KEY: string;      // API key, set with `wrangler secret put LIMY_KEY`
}
export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    // Get the original response
    const response = await fetch(request);
    // Clone the response so we can read it multiple times
    const responseClone = response.clone();
    ctx.waitUntil(handleRequest(request, responseClone, env));
    return response;
  }
} satisfies ExportedHandler<Env>;
async function handleRequest(request: Request, response: Response, env: Env) {
  const requestUrl = new URL(request.url);
  // Calculate header size
  const headerSize = Array.from(response.headers.entries())
    .reduce((total, [key, value]) => {
      // +2 for ': ' and +2 for '\r\n'
      return total + key.length + value.length + 4;
    }, 0);
  // Get response body size
  const responseBody = await response.blob();
  const bodySize = responseBody.size;
  // Total bytes sent includes headers and body
  const totalBytesSent = headerSize + bodySize;
  const logData = {
    timestamp: Date.now(),
    host: requestUrl.hostname,
    method: request.method,
    pathname: requestUrl.pathname,
    query_params: Object.fromEntries(requestUrl.searchParams),
    ip: request.headers.get('cf-connecting-ip'),
    userAgent: request.headers.get('user-agent'),
    referer: request.headers.get('referer'),
    bytes: totalBytesSent,
    status: response.status
  };
  await fetch(env.LIMY_ENDPOINT, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-API-Key': env.LIMY_KEY
    },
    body: JSON.stringify([logData])
  }).catch(error => console.error('Failed to send logs:', error));
}

Step 3
Login to Cloudflare & Deploy Your Worker
Use the Wrangler CLI to authenticate:

# Log in to Cloudflare
npx wrangler login

Configure Limy API key
Secrets are a feature of Cloudflare Workers that lets you store sensitive information, such as API keys, in a secure environment.
# Configure the API key secret
npx wrangler secret put LIMY_KEY

# Deploy the Worker
npx wrangler deploy

Step 4
Test your Worker
Verify your Worker is functioning correctly:
Navigate to the Limy product and check that logs are being collected in the Log panel. Note that the AI log filter is on by default. You can also stream live request logs from the command line with npx wrangler tail.
That’s it! You have now successfully configured the Limy router for your website. Data should begin to populate on your dashboard within an hour.
Need More Help?
Contact [email protected] for API-related questions