Export to JSON

Export your analytics data in JSON format for programmatic processing, API integration, and custom applications.

What Is JSON Export?

JSON (JavaScript Object Notation) exports provide structured, machine-readable analytics data ideal for:

  • API integration and automation
  • Custom dashboards and visualizations
  • Data pipelines and ETL processes
  • Application integration
  • Programmatic analysis

How to Export JSON

From Dashboard

  1. Log in to your Ovyxa dashboard
  2. Select your site
  3. Choose your date range
  4. Click Export → JSON in the top-right corner
  5. Choose sections to export
  6. Click Download JSON

Small exports download immediately; large exports are emailed when ready.

Via API

For automated exports, use the Stats API:

const response = await fetch('https://api.ovyxa.com/v1/stats/export', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_API_KEY',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    site_id: 'your-site-id',
    date_from: '2024-11-01',
    date_to: '2024-11-30',
    format: 'json'
  })
});

const data = await response.json();

See Stats API documentation for details.

JSON File Structure

Complete Export Example

{
  "export_metadata": {
    "site_id": "abc123",
    "site_domain": "example.com",
    "exported_at": "2024-11-13T10:30:00Z",
    "date_from": "2024-11-01",
    "date_to": "2024-11-30",
    "timezone": "Europe/Paris",
    "export_format": "json",
    "version": "1.0"
  },
  "summary": {
    "total_pageviews": 125450,
    "unique_visitors": 89234,
    "bounce_rate": 42.5,
    "avg_session_duration": 185,
    "total_events": 8945,
    "conversion_rate": 5.6
  },
  "top_pages": [
    {
      "pathname": "/",
      "pageviews": 32450,
      "unique_visitors": 24123,
      "bounce_rate": 38.2,
      "avg_time_on_page": 125,
      "entries": 18900,
      "exits": 8234
    }
  ],
  "top_sources": [
    {
      "source": "google",
      "medium": "organic",
      "pageviews": 45670,
      "unique_visitors": 32145,
      "conversion_rate": 6.8,
      "revenue": 12450.50
    }
  ],
  "top_locations": [
    {
      "country": "United States",
      "country_code": "US",
      "pageviews": 56789,
      "unique_visitors": 41234,
      "percentage": 46.2
    }
  ],
  "devices": [
    {
      "device_type": "Desktop",
      "browser": "Chrome",
      "os": "Windows",
      "pageviews": 67890,
      "percentage": 54.1
    }
  ],
  "custom_events": [
    {
      "event_name": "signup",
      "count": 1234,
      "unique_triggers": 982,
      "conversion_rate": 45.3,
      "total_revenue": 0,
      "properties": {
        "plan": {
          "pro": 567,
          "starter": 415
        }
      }
    }
  ]
}

JSON Schema

Metadata Object

{
  "export_metadata": {
    "site_id": string,
    "site_domain": string,
    "exported_at": string (ISO 8601 timestamp),
    "date_from": string (YYYY-MM-DD),
    "date_to": string (YYYY-MM-DD),
    "timezone": string,
    "export_format": "json",
    "version": string
  }
}
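Before trusting the rest of the file, the metadata dates can be checked against the range you requested. A small sketch, using the field names from the schema above:

```javascript
// Confirm how many days an export covers, using its metadata dates.
const meta = {
  date_from: '2024-11-01',
  date_to: '2024-11-30',
  exported_at: '2024-11-13T10:30:00Z'
};

const msPerDay = 24 * 60 * 60 * 1000;
const days =
  (Date.parse(meta.date_to) - Date.parse(meta.date_from)) / msPerDay + 1;

console.log(`Export covers ${days} day(s)`); // 30 for this range
```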

Summary Object

{
  "summary": {
    "total_pageviews": number,
    "unique_visitors": number,
    "bounce_rate": number,
    "avg_session_duration": number (seconds),
    "total_events": number,
    "conversion_rate": number
  }
}
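These summary fields combine into useful derived metrics. A quick sketch, using the values from the example export above:

```javascript
// Derive per-visitor metrics from a summary object.
const summary = {
  total_pageviews: 125450,
  unique_visitors: 89234,
  total_events: 8945
};

const pagesPerVisitor = summary.total_pageviews / summary.unique_visitors;
const eventsPerVisitor = summary.total_events / summary.unique_visitors;

console.log(pagesPerVisitor.toFixed(2)); // "1.41"
console.log(eventsPerVisitor.toFixed(2)); // "0.10"
```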

Page Object

{
  "pathname": string,
  "pageviews": number,
  "unique_visitors": number,
  "bounce_rate": number,
  "avg_time_on_page": number (seconds),
  "entries": number,
  "exits": number
}
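The entries and exits counts support a per-page exit-rate calculation. A sketch with the example page from above:

```javascript
// Exit rate: share of a page's views that ended the session there.
const page = {
  pathname: '/',
  pageviews: 32450,
  exits: 8234
};

const exitRate = (page.exits / page.pageviews) * 100;
console.log(`${page.pathname}: ${exitRate.toFixed(1)}% exit rate`); // 25.4%
```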

Source Object

{
  "source": string,
  "medium": string,
  "campaign": string | null,
  "pageviews": number,
  "unique_visitors": number,
  "conversion_rate": number,
  "revenue": number
}
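Since each source carries both revenue and unique_visitors, revenue per visitor falls out directly. A sketch with the example source from above:

```javascript
// Revenue per unique visitor for a traffic source.
const source = {
  source: 'google',
  medium: 'organic',
  unique_visitors: 32145,
  revenue: 12450.50
};

const revenuePerVisitor = source.revenue / source.unique_visitors;
console.log(revenuePerVisitor.toFixed(2)); // "0.39"
```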

Using JSON Exports

JavaScript/Node.js

const fs = require('fs');

// Load export
const data = JSON.parse(fs.readFileSync('export.json', 'utf8'));

// Analyze top pages
const topPages = data.top_pages
  .filter(page => page.pageviews > 1000)
  .sort((a, b) => b.pageviews - a.pageviews)
  .slice(0, 10);

console.log('Top 10 Pages:', topPages);

// Calculate total revenue
const totalRevenue = data.top_sources.reduce(
  (sum, source) => sum + (source.revenue || 0),
  0
);

console.log('Total Revenue:', totalRevenue);

Python

import json
import pandas as pd

# Load export
with open('export.json', 'r') as f:
    data = json.load(f)

# Convert to DataFrames
pages_df = pd.DataFrame(data['top_pages'])
sources_df = pd.DataFrame(data['top_sources'])

# Analyze
high_bounce = pages_df[pages_df['bounce_rate'] > 60]
print(f"High bounce pages: {len(high_bounce)}")

# Best converting sources
top_converting = sources_df.nlargest(5, 'conversion_rate')
print(top_converting[['source', 'conversion_rate']])

Ruby

require 'json'

# Load export
data = JSON.parse(File.read('export.json'))

# Find pages with low engagement
low_engagement = data['top_pages'].select do |page|
  page['avg_time_on_page'] < 30
end

puts "Pages with < 30s engagement: #{low_engagement.count}"

# Group events by name
events_by_name = data['custom_events'].group_by { |e| e['event_name'] }

Advanced Use Cases

Build Custom Dashboard

// React component example
import React, { useState, useEffect } from 'react';

function Analytics() {
  const [data, setData] = useState(null);

  useEffect(() => {
    fetch('/exports/latest.json')
      .then(res => res.json())
      .then(setData);
  }, []);

  if (!data) return <div>Loading...</div>;

  return (
    <div>
      <h1>{data.export_metadata.site_domain}</h1>
      <div className="stats">
        <Stat label="Visitors" value={data.summary.unique_visitors} />
        <Stat label="Pageviews" value={data.summary.total_pageviews} />
        <Stat label="Bounce Rate" value={`${data.summary.bounce_rate}%`} />
      </div>
      <TopPages pages={data.top_pages} />
      <TopSources sources={data.top_sources} />
    </div>
  );
}

Data Pipeline Integration

# Apache Airflow DAG example
from datetime import datetime

import pandas as pd
import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

API_KEY = 'YOUR_API_KEY'    # in practice, load from an Airflow Variable or Connection
SITE_ID = 'your-site-id'

def fetch_ovyxa_data():
    response = requests.post(
        'https://api.ovyxa.com/v1/stats/export',
        headers={'Authorization': f'Bearer {API_KEY}'},
        json={'site_id': SITE_ID, 'format': 'json'},
    )
    response.raise_for_status()
    data = response.json()

    # Transform and load to data warehouse (engine: your SQLAlchemy engine)
    df = pd.DataFrame(data['top_pages'])
    df.to_sql('ovyxa_pages', engine, if_exists='replace')

dag = DAG(
    'ovyxa_export',
    schedule_interval='@daily',
    start_date=datetime(2024, 1, 1),
    catchup=False,
)

export_task = PythonOperator(
    task_id='export_analytics',
    python_callable=fetch_ovyxa_data,
    dag=dag,
)

Webhook Integration

Automatically send JSON exports to external systems:

// Webhook receiver (Express.js)
const express = require('express');

const app = express();
app.use(express.json()); // parse incoming JSON payloads

app.post('/webhooks/ovyxa', async (req, res) => {
  const analyticsData = req.body;

  // Process the export
  await processAnalytics(analyticsData);

  // Trigger alerts if needed
  if (analyticsData.summary.bounce_rate > 70) {
    await sendAlert('High bounce rate detected!');
  }

  res.sendStatus(200);
});

Best Practices

Version Control

Track JSON schemas in version control:

// package.json
{
  "devDependencies": {
    "@ovyxa/export-types": "^1.0.0"
  }
}

Type Safety (TypeScript)

interface OvyxaExport {
  export_metadata: ExportMetadata;
  summary: Summary;
  top_pages: Page[];
  top_sources: Source[];
  top_locations: Location[];
  devices: Device[];
  custom_events: CustomEvent[];
}

async function loadExport(): Promise<OvyxaExport> {
  const response = await fetch('/export.json');
  return response.json();
}

Error Handling

const fs = require('fs/promises');

async function safeLoadExport(filepath) {
  try {
    const data = JSON.parse(await fs.readFile(filepath, 'utf8'));

    // Validate schema
    if (!data.export_metadata || !data.summary) {
      throw new Error('Invalid export format');
    }

    return data;
  } catch (error) {
    console.error('Failed to load export:', error);
    return null;
  }
}

Streaming Large Exports

For large datasets, stream JSON instead of loading all at once:

const { createReadStream } = require('fs');
const JSONStream = require('JSONStream');

createReadStream('large-export.json')
  .pipe(JSONStream.parse('top_pages.*'))
  .on('data', (page) => {
    // Process each page individually
    console.log(page.pathname, page.pageviews);
  });

Limitations

Same as CSV exports:

  • Free tier: 1 export/day, 6 months retention
  • Paid plans: Unlimited exports, full retention
  • File size: Large exports (> 100MB) emailed

Troubleshooting

Invalid JSON Error

SyntaxError: Unexpected token < in JSON at position 0

Causes:

  • Server returned HTML error page instead of JSON
  • File corrupted during download
  • API authentication failed

Solutions:

  • Verify API key is valid
  • Check file isn't an HTML error page
  • Re-download the export
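When fetching exports programmatically, this class of error can be caught before parsing by checking the HTTP status and Content-Type. A defensive sketch; the URL and header are assumptions modeled on the Stats API example earlier:

```javascript
// Fail fast if the server returned an HTML error page instead of JSON.
async function fetchExportJson(url, apiKey) {
  const response = await fetch(url, {
    headers: { 'Authorization': `Bearer ${apiKey}` }
  });

  if (!response.ok) {
    throw new Error(`Export request failed: HTTP ${response.status}`);
  }

  const contentType = response.headers.get('content-type') || '';
  if (!contentType.includes('application/json')) {
    const preview = (await response.text()).slice(0, 80);
    throw new Error(`Expected JSON, got ${contentType}: ${preview}`);
  }

  return response.json();
}
```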

Missing Data

If certain fields are null or missing:

  • Data may not exist for that period
  • Feature wasn't enabled during that time
  • Check date range is correct
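Downstream code can guard against absent fields with optional chaining and defaults. A sketch:

```javascript
// Read export fields defensively so missing sections don't crash processing.
const data = JSON.parse('{"summary": {"total_pageviews": 100}}');

const bounceRate = data.summary?.bounce_rate ?? null;       // null: not recorded
const events = data.custom_events ?? [];                    // default: empty list
const campaign = data.top_sources?.[0]?.campaign ?? 'none';

console.log({ bounceRate, eventCount: events.length, campaign });
```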

Large File Performance

For files > 50MB:

  • Use streaming parsers
  • Process in chunks
  • Consider paginated API instead
  • Filter data server-side

Need Help?

See also: Export to CSV for spreadsheet-friendly exports.