Martech Integration

This cookbook shows how a martech company can integrate MDB into their existing pixel infrastructure with multi-tenancy support and scalable, production-ready serverless architecture.
This guide shows examples using both the legacy pixel format and the new v1 pixel format. For new integrations, use the v1 pixel format: https://p.mdb.tools/v1/PIXEL_ID


Scenario

You’re a martech company with:
  • Existing pixels deployed across multiple customer sites
  • A multi-tenant architecture serving many clients
  • A need to attribute website visitors back to your own system
  • A requirement for scalable, serverless processing

Architecture Overview

At a high level, events flow: customer-site pixel → MDB visitor identification → webhook to your serverless handler → queue (SQS or Pub/Sub) → event processor → multi-tenant database.
Step 1: Modify Your Existing Pixel

Update your existing pixel to load the MDB pixel with tenant identification:

Self-Executing Function Example

// Your existing pixel code
(function() {
    // Your existing pixel logic here
    var tenantId = 'client_abc123';
    var pageId = 'homepage_v2';
    var orgId = 'your_org_id_from_mdb_dashboard';     // legacy pixel
    var pixelId = 'your_pixel_id_from_mdb_dashboard'; // v1 pixel
    
    // Load MDB pixel with tenant attribution
    var mdbOptions = {
        tenant_id: tenantId,
        page_id: pageId,
        source: 'your_pixel_v1',
        timestamp: Date.now(),
        // Add any other attribution data you need
        campaign: getCampaignData(), // Your existing function
        user_segment: getUserSegment() // Your existing function
    };
    
    // Create MDB pixel script
    var script = document.createElement('script');
    
    // For v1 pixel (recommended):
    // script.src = 'https://p.mdb.tools/v1/' + pixelId + '?options=' + 
    //              encodeURIComponent(JSON.stringify(mdbOptions));
    
    // For legacy pixel:
    script.src = 'https://p.mdb.tools/' + orgId + '?options=' + 
                 encodeURIComponent(JSON.stringify(mdbOptions));
    script.async = true;
    
    // Load MDB pixel
    document.head.appendChild(script);
    
    // Continue with your existing pixel logic
    // ... rest of your pixel code
})();

Multi-Tenant Configuration

// Advanced multi-tenant setup
function loadMDBPixel(config) {
    var mdbOptions = {
        // Core tenant identification
        tenant_id: config.tenantId,
        client_name: config.clientName,
        
        // Page and context attribution  
        page_id: config.pageId,
        page_type: config.pageType,
        
        // Campaign attribution from your system
        campaign_id: config.campaignId,
        campaign_name: config.campaignName,
        
        // Your pixel metadata
        pixel_version: '2.1.0',
        integration_type: 'martech_wrapper',
        
        // Custom attribution
        custom_attributes: config.customAttributes || {}
    };
    
    var script = document.createElement('script');
    
    // For v1 pixel (recommended):
    // script.src = 'https://p.mdb.tools/v1/' + config.mdbPixelId + 
    //              '?options=' + encodeURIComponent(JSON.stringify(mdbOptions));
    
    // For legacy pixel:
    script.src = 'https://p.mdb.tools/' + config.mdbOrgId + 
                 '?options=' + encodeURIComponent(JSON.stringify(mdbOptions));
    script.async = true;
    document.head.appendChild(script);
}

// Usage in your pixel
loadMDBPixel({
    tenantId: 'client_abc123',
    clientName: 'AcmeCorp',
    pageId: 'product_page_electronics',
    pageType: 'product',
    campaignId: 'summer2024_electronics',
    campaignName: 'Summer Electronics Sale',
    mdbOrgId: 'your_mdb_org_id',
    customAttributes: {
        product_category: 'electronics',
        price_range: 'premium'
    }
});
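The two pixel URL formats above differ only in their path, so the construction can be kept in one place. A minimal sketch, assuming the `?options=` query-parameter format shown in the examples:

```javascript
// Build the MDB pixel URL for either format. The options object is
// JSON-encoded into the `?options=` query parameter, as in the
// examples above.
function buildPixelUrl({ version, id, options }) {
    const base = version === 'v1'
        ? 'https://p.mdb.tools/v1/' + id  // v1 pixel (recommended), keyed by pixel ID
        : 'https://p.mdb.tools/' + id;    // legacy pixel, keyed by org ID
    return base + '?options=' + encodeURIComponent(JSON.stringify(options));
}
```

With this helper, switching a tenant from the legacy pixel to v1 becomes a per-tenant configuration change rather than a code change.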

Step 2: Scalable Serverless Webhook Handler

Create a serverless function that can handle high-volume webhook traffic:

AWS Lambda Example

// AWS Lambda handler
const AWS = require('aws-sdk');
const sqs = new AWS.SQS();

exports.handler = async (event) => {
    try {
        // Parse webhook payload
        const webhook = JSON.parse(event.body);
        const { version, options, data } = webhook;
        
        // Validate webhook
        if (version !== 'v1') {
            return {
                statusCode: 400,
                body: JSON.stringify({ error: 'Unsupported version' })
            };
        }
        
        // Extract tenant information
        const tenantId = options.tenant_id;
        const pageId = options.page_id;
        
        if (!tenantId) {
            return {
                statusCode: 400,
                body: JSON.stringify({ error: 'Missing tenant_id' })
            };
        }
        
        // Enrich with metadata
        const enrichedEvent = {
            id: generateEventId(),
            timestamp: Date.now(),
            tenant_id: tenantId,
            page_id: pageId,
            mdb_data: data,
            pixel_options: options,
            processing_metadata: {
                webhook_received_at: new Date().toISOString(),
                lambda_request_id: event.requestContext.requestId
            }
        };
        
        // Send to SQS for processing
        await sqs.sendMessage({
            QueueUrl: process.env.PROCESSING_QUEUE_URL,
            MessageBody: JSON.stringify(enrichedEvent),
            MessageAttributes: {
                'tenant_id': {
                    DataType: 'String',
                    StringValue: tenantId
                },
                'event_type': {
                    DataType: 'String', 
                    StringValue: 'visitor_identified'
                }
            }
        }).promise();
        
        return {
            statusCode: 200,
            body: JSON.stringify({ success: true, event_id: enrichedEvent.id })
        };
        
    } catch (error) {
        console.error('Webhook processing error:', error);
        
        return {
            statusCode: 500,
            body: JSON.stringify({ error: 'Processing failed' })
        };
    }
};

function generateEventId() {
    // Prefer crypto.randomUUID() where available; this fallback is not
    // collision-proof under very high volume.
    return 'evt_' + Date.now() + '_' + Math.random().toString(36).slice(2, 11);
}
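For reference, the handler above expects a body shaped roughly like the following. The field contents are illustrative, and only `version` and `options.tenant_id` are actually asserted by the handler; consult the MDB webhook documentation for the authoritative `data` schema:

```javascript
// Illustrative webhook body. `options` echoes what the pixel sent;
// `data` carries the identification payload from MDB.
const exampleWebhookBody = {
    version: 'v1',
    options: {
        tenant_id: 'client_abc123',
        page_id: 'homepage_v2',
        source: 'your_pixel_v1'
    },
    data: {
        // visitor identification fields from MDB (see MDB docs)
    }
};
```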

Firebase Functions Example

// Firebase Cloud Functions
const functions = require('firebase-functions');
const { PubSub } = require('@google-cloud/pubsub');

const pubsub = new PubSub();

exports.mdbWebhook = functions.https.onRequest(async (req, res) => {
    try {
        if (req.method !== 'POST') {
            return res.status(405).json({ error: 'Method not allowed' });
        }
        
        const { version, options, data } = req.body;
        
        // Validate webhook
        if (version !== 'v1') {
            return res.status(400).json({ error: 'Unsupported version' });
        }
        
        const tenantId = options.tenant_id;
        if (!tenantId) {
            return res.status(400).json({ error: 'Missing tenant_id' });
        }
        
        // Create processing event
        const processingEvent = {
            id: generateEventId(),
            timestamp: Date.now(),
            tenant_id: tenantId,
            page_id: options.page_id,
            mdb_data: data,
            pixel_options: options,
            processing_metadata: {
                webhook_received_at: new Date().toISOString(),
                function_execution_id: req.get('Function-Execution-Id')
            }
        };
        
        // Publish to Pub/Sub
        await pubsub
            .topic('visitor-identification-events')
            .publishMessage({
                data: Buffer.from(JSON.stringify(processingEvent)),
                attributes: {
                    tenant_id: tenantId,
                    event_type: 'visitor_identified'
                }
            });
        
        res.status(200).json({ success: true, event_id: processingEvent.id });
        
    } catch (error) {
        console.error('Webhook processing error:', error);
        res.status(500).json({ error: 'Processing failed' });
    }
});

function generateEventId() {
    // Same helper as in the Lambda example.
    return 'evt_' + Date.now() + '_' + Math.random().toString(36).slice(2, 11);
}
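Both handlers above repeat the same validation checks; factoring them into a shared helper keeps the two entry points consistent and makes the checks unit-testable. A minimal sketch:

```javascript
// Validate an incoming MDB webhook body before queueing.
// Mirrors the version and tenant_id checks in the handlers above.
function validateWebhook(body) {
    if (!body || body.version !== 'v1') {
        return { ok: false, error: 'Unsupported version' };
    }
    if (!body.options || !body.options.tenant_id) {
        return { ok: false, error: 'Missing tenant_id' };
    }
    return { ok: true };
}
```

Each handler can then map a `{ ok: false }` result to its platform's 400 response.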

Step 3: Queue-Based Event Processing

Process events from your queue system for scalability:

AWS SQS + Lambda Processor

// SQS event processor Lambda
exports.processVisitorEvents = async (event) => {
    const batchItemFailures = [];

    await Promise.all(event.Records.map(async (record) => {
        try {
            const eventData = JSON.parse(record.body);
            await processVisitorIdentification(eventData);
        } catch (error) {
            console.error('Event processing failed:', error);
            // Report only this message as failed so the rest of the batch
            // is not retried. Requires ReportBatchItemFailures on the event
            // source mapping; messages that exhaust retries go to the DLQ.
            batchItemFailures.push({ itemIdentifier: record.messageId });
        }
    }));

    return { batchItemFailures };
};

async function processVisitorIdentification(eventData) {
    const { tenant_id, mdb_data, pixel_options } = eventData;
    
    // Look up tenant configuration
    const tenantConfig = await getTenantConfig(tenant_id);
    
    // Process the identified visitor data
    const visitorProfile = {
        tenant_id: tenant_id,
        visitor_data: mdb_data,
        attribution: {
            page_id: pixel_options.page_id,
            campaign_id: pixel_options.campaign_id,
            source: pixel_options.source
        },
        processed_at: new Date().toISOString()
    };
    
    // Store in your database
    await storeVisitorProfile(visitorProfile, tenantConfig);
    
    // Trigger any downstream processes
    await triggerTenantWebhooks(tenant_id, visitorProfile);
    await updateAnalytics(tenant_id, visitorProfile);
}
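Queue deliveries can arrive more than once (SQS is at-least-once), so processing should be idempotent. The sketch below uses the event id as a dedupe key with an in-memory Set; production code would use a durable store instead, such as a DynamoDB conditional write or Redis SETNX:

```javascript
// Wrap a handler so each event id is processed at most once.
// The Set is an in-memory stand-in for a durable dedupe store.
function makeIdempotent(handler, seen = new Set()) {
    return function (eventData) {
        if (seen.has(eventData.id)) {
            return false; // duplicate delivery, skip
        }
        seen.add(eventData.id);
        handler(eventData);
        return true;
    };
}
```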

Pub/Sub Processing Example

// Pub/Sub consumer for event processing
const { PubSub } = require('@google-cloud/pubsub');

const pubsub = new PubSub({
    projectId: process.env.GCP_PROJECT_ID
});

const subscription = pubsub.subscription('visitor-identification-subscription');

async function startProcessing() {
    subscription.on('message', async (message) => {
        try {
            const eventData = JSON.parse(message.data.toString());
            await processVisitorEvent(eventData);
            message.ack(); // Acknowledge the message
        } catch (error) {
            console.error('Processing failed:', error);
            message.nack(); // Negative acknowledgment - retry
        }
    });
}

startProcessing();

async function processVisitorEvent(eventData) {
    const { tenant_id, mdb_data, pixel_options } = eventData;
    
    // Multi-tenant processing logic
    const processor = await getTenantProcessor(tenant_id);
    await processor.handleVisitorIdentification(mdb_data, pixel_options);
}
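For high-volume topics, the Node.js Pub/Sub client's flow-control settings bound how many messages a subscriber pulls concurrently, protecting your downstream processors. The values below are illustrative; tune them to your capacity:

```javascript
const { PubSub } = require('@google-cloud/pubsub');
const pubsub = new PubSub();

// Flow control caps concurrent unacked messages per subscriber client.
const subscription = pubsub.subscription('visitor-identification-subscription', {
    flowControl: {
        maxMessages: 100,          // tune to downstream processing capacity
        allowExcessMessages: false // don't buffer beyond the cap
    }
});
```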

Step 4: Database Schema for Multi-Tenancy

Design your database schema to handle multi-tenant visitor data:

PostgreSQL Schema Example

-- Tenants table
CREATE TABLE tenants (
    id VARCHAR(255) PRIMARY KEY,
    name VARCHAR(255) NOT NULL,
    mdb_org_id VARCHAR(255) NOT NULL,
    webhook_config JSONB,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Visitor identifications table
CREATE TABLE visitor_identifications (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    tenant_id VARCHAR(255) REFERENCES tenants(id),
    visitor_data JSONB NOT NULL,
    pixel_options JSONB NOT NULL,
    page_id VARCHAR(255),
    campaign_id VARCHAR(255),
    identified_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    processed_at TIMESTAMP
);

-- Indexes for performance
CREATE INDEX idx_visitor_identifications_tenant_id ON visitor_identifications(tenant_id);
CREATE INDEX idx_visitor_identifications_identified_at ON visitor_identifications(identified_at);
CREATE INDEX idx_visitor_identifications_page_id ON visitor_identifications(page_id);

Step 5: Monitoring and Observability

Implement comprehensive monitoring for your integration:

CloudWatch/Application Insights Metrics

// Custom metrics tracking
const AWS = require('aws-sdk');
const cloudwatch = new AWS.CloudWatch();

async function trackMetrics(tenantId, eventType, success) {
    await cloudwatch.putMetricData({
        Namespace: 'MDB/Integration',
        MetricData: [
            {
                MetricName: 'WebhookProcessed',
                Dimensions: [
                    { Name: 'TenantId', Value: tenantId },
                    { Name: 'EventType', Value: eventType },
                    { Name: 'Success', Value: success.toString() }
                ],
                Value: 1,
                Unit: 'Count'
            }
        ]
    }).promise();
}
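To make sure a metric is emitted on every path, processing steps can be run through a small wrapper. The metric recorder is injected as a parameter here so the wrapper stays testable; in practice you would pass the trackMetrics function above:

```javascript
// Run a processing step and always record a success/failure metric.
// `track(tenantId, eventType, success)` is any async metric recorder.
async function withMetrics(track, tenantId, eventType, fn) {
    try {
        const result = await fn();
        await track(tenantId, eventType, true);
        return result;
    } catch (error) {
        await track(tenantId, eventType, false);
        throw error; // re-throw so upstream retry/DLQ behavior is preserved
    }
}
```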

Best Practices Summary

Scalability
  • Use serverless functions for webhook handling
  • Implement queue-based processing for high volume
  • Design for horizontal scaling with proper partitioning
  • Use appropriate database indexes for query performance

Multi-Tenancy
  • Include tenant_id in all pixel options
  • Isolate tenant data at the database level
  • Implement tenant-specific configuration
  • Monitor per-tenant usage and performance

Reliability
  • Implement proper error handling and retries
  • Use dead letter queues for failed processing
  • Design idempotent processing logic
  • Monitor webhook delivery success rates

Security
  • Validate all webhook payloads
  • Use HTTPS for all endpoints
  • Implement proper authentication
  • Sanitize and validate tenant data

Next Steps

  1. Start with a single tenant - Implement and test with one client first
  2. Monitor performance - Track latency, throughput, and error rates
  3. Scale gradually - Add tenants and monitor system performance
  4. Optimize based on usage - Adjust queue sizes, function timeouts, and database queries based on real usage patterns