Fola's blog

Creating a Web Scraping Application Using Laravel

Introduction

I started working on this app about 12 days ago, when I saw the post about the Hashnode hackathon.

So I decided to finish up the project (at least to a reasonable point) and use it to apply for the hackathon.

General Idea

My aim in this project was to make a job scraping application using Laravel. The application would scrape jobs from various sites and bring them together for the user.

The user has the ability to save jobs to their dashboard, where they can view them later and apply if they want to.

Personally, I am using Goutte to scrape the data, and I am only scraping Stack Overflow and LinkedIn.

Side Note (A bit of technical explanation)

Creating the main scraping application itself wasn't too difficult. What made it a bit difficult was that I wanted to build the application in such a way that it is easily extensible and has the following features:

  1. Can scrape new websites without writing additional code

  2. Supports various scraping services (not only Goutte - which is what I used).

Can scrape new websites without writing additional code: I achieved this by creating a config file that contains all the parameters needed to scrape each site.

It is a JSON file that contains these parameters, and the scraper service pulls the required information from this file whenever it scrapes.
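The actual file is in the screenshot below, but as a rough, hypothetical sketch (the field names here are illustrative guesses, not the project's real keys), an entry for one site might look something like this:

```json
{
  "sites": [
    {
      "name": "Stack Overflow",
      "url": "https://stackoverflow.com/jobs",
      "selectors": {
        "job": ".js-result",
        "title": "h2 a",
        "location": ".job-location",
        "date_posted": ".posted-date"
      }
    }
  ]
}
```

The idea is that everything site-specific (the URL and the CSS selectors) lives in data, not in code.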


[Image] Here I have information for Stack Overflow and LinkedIn, which are the sites I scraped.

In order to scrape new websites, all a developer needs to do is add the new website's information to this JSON file, and the website will be scraped without writing any additional code.
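As a minimal sketch of how a config-driven scraper can look a site up (the JSON layout and the helper name here are my assumptions, not the project's actual code):

```php
<?php
// Hypothetical config with the same shape as the JSON file described above.
$json = <<<'JSON'
{
  "sites": [
    {"name": "Stack Overflow", "url": "https://stackoverflow.com/jobs", "selectors": {"title": "h2 a"}},
    {"name": "LinkedIn", "url": "https://www.linkedin.com/jobs", "selectors": {"title": ".job-title"}}
  ]
}
JSON;

$config = json_decode($json, true);

// Find the config entry for a given site; returns null if it isn't configured.
function siteConfig(array $config, string $name): ?array
{
    foreach ($config['sites'] as $site) {
        if ($site['name'] === $name) {
            return $site;
        }
    }
    return null;
}

$site = siteConfig($config, 'LinkedIn');
echo $site['selectors']['title']; // .job-title
```

Adding a third site is then purely a data change: append another object to `sites` and the same lookup code covers it.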

Supports various scraping services (not only Goutte - which is what I used):

Another major concern for me was creating the application in such a way that it supports various scraping services.

How I achieved this was by creating a scraper interface which any scraper service can implement.

This way, all the developer needs to do is write a new scraper service that implements the Scraper Interface.

[Image: Scraper Interface]

[Image: the Goutte scraper service I created (since I'm using Goutte to scrape)]

If another developer wants to extend the application with a different scraping service, they just need to create a scraper service class that implements the scraper interface.
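The pattern described above can be sketched like this. The method name and signature are my guesses, not the project's real interface, and the `FakeScraper` is a stand-in to show how a non-Goutte service would plug in:

```php
<?php
// Hypothetical sketch of the scraper-interface pattern described above.
interface ScraperInterface
{
    /**
     * Scrape a site using the parameters from the JSON config and
     * return an array of jobs (title, location, date_posted).
     */
    public function scrape(array $siteConfig, string $keyword): array;
}

// A Goutte-backed service would implement this same contract. This fake
// in-memory scraper shows that swapping services needs no other changes.
class FakeScraper implements ScraperInterface
{
    public function scrape(array $siteConfig, string $keyword): array
    {
        // A real implementation would fetch $siteConfig['url'] and apply
        // $siteConfig['selectors']; this stub just returns a fixture.
        return [
            ['title' => "Laravel Developer ({$keyword})", 'location' => 'Remote', 'date_posted' => 'today'],
        ];
    }
}

$scraper = new FakeScraper();
$jobs = $scraper->scrape(['url' => 'https://example.com', 'selectors' => []], 'php');
echo $jobs[0]['title']; // Laravel Developer (php)
```

Because the rest of the application only depends on `ScraperInterface`, the Goutte service can be replaced without touching the search or save logic.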

How The Application Works (the fun part)

The application has these major features:

  • Ability to search for a job: when you search, you get each job's title, location, and date posted. You can save a job if you're logged in.

  • Ability to save a job: you can save jobs to the database in order to view them later and apply when you're ready.

  • Ability to delete saved jobs

The application comes with a dashboard where you can see all the jobs you have saved and apply to them (or delete them) if you want.

Forgive my poor UI. Since time was very limited, I focused on functionality over UI.

Home page

[Image: home page]

Search Results

[Image: Stack Overflow jobs]

[Image: LinkedIn jobs]

Notice the button that says "Save Job". [Image: Save Job button]

When you click that button, the job will be saved to the database and you'd be redirected to your dashboard.

However, you have to be logged in to save a job.

[Image: what you see when you try to save a job without logging in]

Login Screen

[Image: login screen]

Registration Screen

[Image: registration screen]

Dashboard Screen

[Image: dashboard screen]

Notice the button that says "Delete"

[Image: Delete button]

When you click that button, you'd receive a prompt to confirm your action. Once you confirm it, the job will be deleted from the database.

[Image: delete confirmation screen]

Live Website

You can view the live website here: https://scraper.explicitdevelopers.com/public/

Future Features

In case you want to extend the project, here are some features I think you could add:

  • A reminder feature that sends emails and/or text messages to the user periodically (depending on how they set it) to remind them to apply for the job.

  • Pagination: currently there is no pagination in the project; that would be a nice thing to add.

  • Advanced search filters: adding more advanced search filters would also make the application more usable.

Conclusion

Overall I enjoyed creating this project. This is my first scraping project and it was really interesting to build.

I will probably redo the UI and add more features to the project as time goes on.

Thank you for being with me to the end. I hope you enjoyed reading this article.

If you liked this project, do react to this article, give feedback, and share it with your friends. Thanks for your valuable time.
