Hello World,
In this article, I would like to show how you can scrape HTML content from a website built with a JavaScript framework.
But why is it even a problem to scrape a JS-based website?
Problem Definition:
You need a browser environment in order to execute the JavaScript code that renders the HTML.
If you open this website (https://web-scraping-playground-site.firebaseapp.com) in your browser, you will see a simple page with some content.
However, if you send an HTTP GET request to the same URL in Postman, you will see a different response.
A response to a GET request to "https://web-scraping-playground-site.firebaseapp.com" made in Postman.
What? Why does the response contain no HTML? This happens because there is no browser environment when we send requests from a server or from the Postman app.
We need a browser environment to execute the JavaScript code and render the content, i.e. the HTML.
It sounds like an easy and fun problem to solve! In the sections below I will show 2 ways to solve the above-mentioned problem, using:
- Puppeteer
- Proxybot
Let's get started!
For people who prefer watching videos, there is a quick video demonstrating how to get the HTML content of a JS-based website.
Solution using Puppeteer
The idea is simple. Run Puppeteer on our server to simulate the browser environment, render the page's HTML, and use it for scraping or anything else.
See the below code snippet.
This code simply:
- Accepts a GET request
- Reads the "url" query param
- Returns the result of the "getPageHTML" function
The "getPageHTML" function is the most interesting for us because that's where the magic happens.
The "magic" is, however, pretty simple. The function simply does the following steps:
- Launches Puppeteer
- Opens the desired URL
- Executes the page's JavaScript internally
- Extracts the HTML of the page
- Returns the HTML
Easy-peasy!
Let's run the script and send a request to http://localhost:3000?url=https://web-scraping-playground-site.firebaseapp.com in the Postman app.
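As a side note, if you build that request URL in code rather than typing it into Postman, it is safer to encode the target URL so its "://" and any query characters survive as a single param. A small sketch:

```javascript
// Build the request URL for our local server. encodeURIComponent keeps
// the target URL's special characters from confusing the query string.
const target = 'https://web-scraping-playground-site.firebaseapp.com';
const requestUrl = `http://localhost:3000?url=${encodeURIComponent(target)}`;
console.log(requestUrl);
// With Node 18+ you could then fetch it:
// const html = await fetch(requestUrl).then((r) => r.text());
```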
The below screenshot shows the response from our local server.
Yaaaaay! We did it! Great job, guys! We got HTML back!
It was easy, but it can be even easier. Let's have a look at the second approach.
Solution using Proxybot
With this approach, we only need to send an HTTP GET request. The API service runs a virtual browser internally and sends the rendered HTML back.
https://proxybot.io/api/v1/API_KEY?render_js=true&url=your-url-here
Let's try to call the API in the Postman app.
Yaaay! More HTML!
There is not much to say about the request, because it is pretty straightforward. However, I want to emphasize a small detail: when calling the API, remember to include the render_js=true URL param.
Otherwise, the service will not execute JavaScript.
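To make that concrete, here is a small sketch of building the request URL from the pattern above. `API_KEY` stays a placeholder for your own key, and the target URL is again passed through `encodeURIComponent`:

```javascript
// Build the Proxybot request URL. render_js=true tells the service to
// execute the page's JavaScript before returning the HTML.
const apiKey = 'API_KEY'; // placeholder for your own key
const target = 'https://web-scraping-playground-site.firebaseapp.com';
const apiUrl =
  `https://proxybot.io/api/v1/${apiKey}` +
  `?render_js=true&url=${encodeURIComponent(target)}`;
console.log(apiUrl);
```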
Congratulations! Now you can scrape websites built with JavaScript frameworks like Angular, React, Ember, etc.
I hope this article was interesting and useful.
Proxybot is just one of the services that let you proxy your requests. If you are looking for proxy providers, here you can find a list of the best ones.