Scrape data from URLs in database

I’m not a coder. I’m just trying to figure out how to create an automation that will take a URL from one cell, scrape some data from the webpage at that URL, and then write the result into another cell in the same record — and then do that for every record in the database. Can someone help me out?

Hi @realscapemedia, welcome to the community! :wave:

That’s a solid use case. Here’s a simple way you can set it up:

You can use either the “Run Script” action or the “Send HTTP Request” action to fetch data from a URL, then write that data back into a field in the same record.
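For the “Run Script” route, here’s a minimal sketch of the fetch-and-extract step. It assumes a runtime where the global `fetch` API is available (as in modern script actions, or Node 18+), and it just grabs the page’s `<title>` as an example of “some data” — the extraction logic and names here are illustrative, not specific to any platform:

```javascript
// Pull the <title> out of an HTML string; returns null if none is found.
function extractTitle(html) {
  const match = html.match(/<title[^>]*>([^<]*)<\/title>/i);
  return match ? match[1].trim() : null;
}

// Fetch a page and return the value you'd write back into the record.
async function scrapeTitle(url) {
  const response = await fetch(url); // global fetch API assumed
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  const html = await response.text(); // raw HTML body
  return extractTitle(html);
}
```

In the script action you would then write the returned value into the target field; the exact write call depends on your platform’s scripting API.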

Here are two helpful tutorials:

Thank you for your response. I’ve been trying to make this work for a couple of days, but I haven’t been able to. My JS script keeps returning an error that the URL is invalid. I’m not sure what I’m doing wrong, since I can’t seem to get any console output besides that error.

I don’t think what I’m trying to do should be that complex. It’s just:

  1. Get URL from database record
  2. Scrape data from that URL
  3. Write scraped data into another cell in the record
  4. Iterate for all records in the database.

Any ideas?
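The four steps above map fairly directly onto a script. Here’s a sketch, assuming an Airtable-style scripting API (`base.getTable`, `selectRecordsAsync`, `getCellValueAsString`, `updateRecordAsync`) with placeholder table and field names (“Pages”, “URL”, “Result”) — adjust for whatever platform you’re actually on. The `normalizeUrl` helper also guards against the “invalid URL” error, which very often means the cell value has stray whitespace or is missing the `https://` scheme:

```javascript
// Reject values that can't be parsed as a URL, and prepend https://
// if the scheme is missing. Throws on empty or unparseable input.
function normalizeUrl(raw) {
  if (typeof raw !== 'string' || raw.trim() === '') {
    throw new Error('Cell is empty or not text');
  }
  const trimmed = raw.trim();
  const withScheme = /^https?:\/\//i.test(trimmed)
    ? trimmed
    : `https://${trimmed}`;
  return new URL(withScheme).toString(); // throws if still invalid
}

// The full loop — Airtable-style API and names are assumptions here.
async function run() {
  const table = base.getTable('Pages');
  const query = await table.selectRecordsAsync({ fields: ['URL', 'Result'] });

  for (const record of query.records) {                             // step 4
    const url = normalizeUrl(record.getCellValueAsString('URL'));   // step 1
    const response = await fetch(url);                              // step 2
    const html = await response.text();
    await table.updateRecordAsync(record, { Result: html });        // step 3
  }
}
```

Note that a plain `fetch` only gets you the raw HTML; pages rendered by client-side JavaScript won’t return their visible content this way.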

Hi @realscapemedia, that really depends on your use case, but you might find the Community Manager Agent template helpful:

It comes with an MCP server called fetch, which can pull data from URLs and convert it into markdown. It might be exactly what you need for this scenario.

Just curious — what are you scraping?