I Built a Price Monitoring System in 30 Minutes: Here's the Stack (All Free APIs)


Last month, a friend running a small e-commerce store asked me: "How do big companies track competitor prices? I can't afford $500/month tools."

I told him I'd build one for free. It took 30 minutes.

Here's exactly how I did it — and you can copy this approach for any niche.

Full code on GitHub: price-monitoring-free

The Problem

Price monitoring tools like Prisync ($99-399/mo) or Competera ($1000+/mo) are built for enterprises. Small business owners need something simpler:

  • Track 10-50 products across 3-5 competitor sites
  • Get notified when prices change
  • See price history over time

The Architecture (Dead Simple)

┌──────────────┐     ┌───────────────┐     ┌──────────────┐
│  Scheduler   │────▶│  Scraper API  │────▶│  JSON Store  │
│  (cron/n8n)  │     │  (sitemap +   │     │  (GitHub)    │
│              │     │   product     │     │              │
└──────────────┘     │   pages)      │     └──────┬───────┘
                     └───────────────┘            │
                                           ┌──────▼──────┐
                                           │  Notifier   │
                                           │  (webhook)  │
                                           └─────────────┘

Step 1: Find Product URLs via Sitemap API

Most e-commerce sites expose their product URLs in XML sitemaps. Instead of guessing URLs, parse the sitemap:

const axios = require('axios');
const { parseStringPromise } = require('xml2js');

async function getProductUrls(domain) {
  const { data } = await axios.get(`https://${domain}/sitemap.xml`);
  const parsed = await parseStringPromise(data);

  // Assumes a flat <urlset>; a <sitemapindex> needs one more round of fetching
  const urls = parsed.urlset.url
    .map(u => u.loc[0])
    .filter(url => url.includes('/product') || url.includes('/item'));

  console.log(`Found ${urls.length} product pages on ${domain}`);
  return urls;
}

// Call from inside an async function (CommonJS has no top-level await):
const products = await getProductUrls('competitor-store.com');
// Found 847 product pages

Pro tip: Check robots.txt first — it often links to multiple sitemaps including product-specific ones like /sitemap-products.xml.
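That check is easy to automate. A minimal sketch (the `extractSitemaps` helper name is mine; fetching the file itself is a one-line `axios.get` of `https://${domain}/robots.txt`):

```javascript
// Pull every "Sitemap:" line out of a robots.txt body.
// Hypothetical helper: pair it with axios.get(`https://${domain}/robots.txt`).
function extractSitemaps(robotsTxt) {
  return robotsTxt
    .split('\n')
    .map(line => line.trim())
    .filter(line => line.toLowerCase().startsWith('sitemap:'))
    .map(line => line.slice('sitemap:'.length).trim());
}
```

The `Sitemap:` directive is case-insensitive and can appear multiple times, which is why this returns an array rather than a single URL.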

Step 2: Extract Prices with Structured Data

Here's the trick most people miss: e-commerce sites embed prices in JSON-LD structured data for Google. You don't need to parse HTML — just extract the JSON:

const cheerio = require('cheerio');

async function getPrice(url) {
  const { data: html } = await axios.get(url, {
    headers: { 'User-Agent': 'Mozilla/5.0 (compatible; PriceBot/1.0)' }
  });

  const $ = cheerio.load(html);

  const jsonLd = $('script[type="application/ld+json"]')
    .map((_, el) => {
      try { return JSON.parse($(el).html()); }
      catch { return null; }
    })
    .get()
    .flatMap(d => (d && d['@graph']) || d)   // unwrap @graph wrappers
    .find(d => d && d['@type'] === 'Product');

  if (jsonLd?.offers) {
    // offers can be a single object or an array; take the first offer
    const offer = Array.isArray(jsonLd.offers) ? jsonLd.offers[0] : jsonLd.offers;
    return {
      name: jsonLd.name,
      price: parseFloat(offer.price ?? offer.lowPrice),
      currency: offer.priceCurrency,
      availability: offer.availability,
      url
    };
  }
  return null;
}

Step 3: Store Price History in GitHub (Free)

Instead of paying for a database, use GitHub as a free JSON store:

const { Octokit } = require('@octokit/rest');
const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

async function savePriceData(prices) {
  const date = new Date().toISOString().split('T')[0];
  const content = Buffer.from(JSON.stringify(prices, null, 2)).toString('base64');

  // One file per day. Re-running on the same day would need the existing
  // file's `sha` passed along, or the API rejects the update with a 422.
  await octokit.repos.createOrUpdateFileContents({
    owner: 'your-username',
    repo: 'price-tracker-data',
    path: `data/${date}.json`,
    message: `Price snapshot ${date}`,
    content
  });
}
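Reading a snapshot back for the diff step mirrors that encoding. A sketch, assuming you fetch the file with `octokit.repos.getContent` (which returns a base64 `content` field for files under 1 MB; the `decodeSnapshot` name is mine):

```javascript
// Decode the base64 `content` field from the GitHub contents API
// back into the prices array that savePriceData wrote.
function decodeSnapshot(contentBase64) {
  return JSON.parse(Buffer.from(contentBase64, 'base64').toString('utf8'));
}
```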

Step 4: Detect Changes & Notify

async function detectChanges(today, yesterday) {
  const changes = [];

  for (const product of today) {
    const prev = yesterday.find(p => p.url === product.url);
    if (prev && prev.price !== product.price) {
      const change = ((product.price - prev.price) / prev.price * 100).toFixed(1);
      changes.push({
        name: product.name,
        oldPrice: prev.price,
        newPrice: product.price,
        change: `${change}%`,
        url: product.url
      });
    }
  }

  if (changes.length > 0) {
    await axios.post(process.env.WEBHOOK_URL, {
      content: `Price changes detected!\n` +
        changes.map(c => `${c.name}: $${c.oldPrice} -> $${c.newPrice} (${c.change})`).join('\n')
    });
  }
  return changes;
}
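Feeding detectChanges means knowing which file in the store is "yesterday". Since snapshots are keyed as data/YYYY-MM-DD.json, a small sketch (helper names are mine):

```javascript
// File keys match savePriceData's naming: data/YYYY-MM-DD.json (UTC dates).
function dateKey(d) {
  return d.toISOString().split('T')[0];
}

// The previous UTC day's key, for fetching yesterday's snapshot.
function previousDateKey(d) {
  return dateKey(new Date(d.getTime() - 24 * 60 * 60 * 1000));
}
```

Pass `previousDateKey(new Date())` as the `path` suffix when pulling yesterday's file from the repo.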

Step 5: Automate with Cron

const competitors = ['store-a.com', 'store-b.com', 'store-c.com'];

async function run() {
  const allPrices = [];
  for (const domain of competitors) {
    const urls = await getProductUrls(domain);
    for (const url of urls.slice(0, 50)) {
      const price = await getPrice(url);
      if (price) allPrices.push(price);
      await new Promise(r => setTimeout(r, 1000)); // be polite: 1 request/sec
    }
  }
  console.log(`Tracked ${allPrices.length} products`);
  await savePriceData(allPrices);
}

run().catch(console.error);

Add to crontab: 0 8 * * * node /path/to/monitor.js

The Cost Breakdown

| Component        | Tool                  | Cost     |
|------------------|-----------------------|----------|
| Sitemap parsing  | axios + xml2js        | Free     |
| Price extraction | cheerio               | Free     |
| Data storage     | GitHub API            | Free     |
| Notifications    | Discord/Slack webhook | Free     |
| Scheduling       | cron / n8n            | Free     |
| Total            |                       | $0/month |

Compare that to Prisync at $99/month or Competera at $1000+/month.

What I Learned

  1. 80% of e-commerce sites have structured data — JSON-LD is your best friend
  2. Sitemaps are goldmines — they list every page without crawling
  3. GitHub is a legitimate database for small datasets — free, versioned, with API
  4. 150 lines of JavaScript replaces $100/month subscriptions

My friend has been running this for 3 weeks. He's already adjusted prices on 12 products and estimates it saved him $800 in the first month.

Want to Go Further?

  • Add product matching (fuzzy match names across stores)
  • Build a dashboard with GitHub Pages + Chart.js
  • Monitor stock availability via JSON-LD
  • Track prices on Amazon using their Product Advertising API
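For the first of those, product matching, a naive token-overlap (Jaccard) similarity is enough to get started. A sketch (function names are mine; real matching would also normalize sizes, colors, and model numbers):

```javascript
// Jaccard similarity over lowercase word tokens: a rough first pass
// at deciding whether two stores' product names refer to the same item.
function tokens(name) {
  return new Set(name.toLowerCase().split(/[^a-z0-9]+/).filter(Boolean));
}

function similarity(a, b) {
  const ta = tokens(a), tb = tokens(b);
  const shared = [...ta].filter(t => tb.has(t)).length;
  const total = new Set([...ta, ...tb]).size;
  return total === 0 ? 0 : shared / total;
}
```

A threshold around 0.6 to 0.8 is a reasonable starting point; tune it against a handful of known matches from your own catalog.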

Full code: github.com/spinov001-art/price-monitoring-free


What's the most creative thing you've built with free APIs? Drop a comment — I'm always looking for new project ideas.