I've been learning Python over lockdown because getting stuck into a programming language has always been a goal of mine, and Python has always seemed both accessible and versatile: the swiss-army language.

I've also recently moved into a new flat, and have been sourcing odds and ends to furnish it. I'm a big fan of Freecycle.org, which encourages people to give away unwanted items for free as a sort of pay-it-forward experiment in generosity. I thought that for a few key items, like a table, I would try to get them from Freecycle. I want a fairly standard folding table for a small flat, and IKEA is miles away, so the better solution is to use tech to get one for free - obvious, right?

Pain points:

No 1 - Free stuff is claimed fast

As posts are made, anyone is free to get in touch with the poster and ask for the free thing.

I therefore want a way to find out soon after a post is made, so I can go straight to the site and claim the item. Checking every day is laborious, and leads to a strange desire to accumulate useless free stuff (6 bath plugs of mixed sizes? Count me in!)

No 2 - The site looks like it was built, well, for free. In 2002.

The site, though lovely in concept, is a pain to navigate, and having to check it every day offends the UX guy in me. I want a solution that minimises time wasted on checking.

At least the page is pretty minimal

I love the colour scheme

So the site won't let me do anything clever to set up alerts (in fact, the search function feels like its most advanced feature by far), so it's time to DIY something.

Python Time

I imagine that Python can be used to scrape the Freecycle website for me on a regular basis, filter based on a keyword - say "table" or "folding table" - and then alert me somehow. This means the three key parts will be:

1. Scraping the listings page
2. Filtering posts by keyword
3. Sending myself an alert
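As a first skeleton, that plan might look like the sketch below. Every name here is a placeholder of my own (nothing Freecycle-specific yet); only the keyword filter is concrete enough to actually write:

```python
# Placeholder keyword - "table" is what I'm after, but this could be anything
KEYWORD = "table"

def matches_keyword(title, keyword=KEYWORD):
    """Step 2: keep only posts whose title mentions the keyword
    (case-insensitive, so "Folding Table" still matches "table")."""
    return keyword.lower() in title.lower()

def check_freecycle():
    """Steps 1 and 3, still to be filled in: scrape the listings,
    then email myself about any title that passes matches_keyword()."""
    raise NotImplementedError
```

The filter is deliberately dumb - a substring match - which suits a first pass; it can always get cleverer later.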

Imports

All good Python starts with grabbing a couple of cool modules, otherwise it's basically just maths.

I'm aware that Python has a great web-scraping module: BeautifulSoup. That's where I'll start.

A quick google lets me know that to do any of this kind of scraping, I'll also need to import the Requests module, which is handy.
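Together, the two work roughly like this sketch: Requests fetches the page, BeautifulSoup parses it. The URL and the idea that post titles live in `<a>` tags are pure assumptions on my part - the real Freecycle markup would need inspecting in the browser first:

```python
import requests
from bs4 import BeautifulSoup

def get_post_titles(url):
    """Fetch a listings page and pull out the link text.
    The <a>-tag selector is an assumed stand-in for whatever
    Freecycle actually uses."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    return [a.get_text(strip=True) for a in soup.find_all("a")]

# The parsing half behaves the same on any HTML, network or not:
sample_html = "<ul><li><a>Folding table</a></li><li><a>Bath plug</a></li></ul>"
soup = BeautifulSoup(sample_html, "html.parser")
titles = [a.get_text(strip=True) for a in soup.find_all("a")]
# titles is now ["Folding table", "Bath plug"]
```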

Lastly, I need a way for Python to tell me if it has found something. I briefly considered trying to connect up something with webhooks so that an alert could come through to IFTTT, but that instinctively felt complicated, and I soon came across a simpler option: email. Python's smtplib allows you to connect to a Gmail account and send an email - perfect for my use case and hopefully as simple as it can be.
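In rough terms, the email step could look like the sketch below. The addresses are placeholders, and note that Gmail over SMTP needs an app password (not your normal account password) for this to work:

```python
import smtplib
from email.message import EmailMessage

def build_alert(item_title):
    """Compose the alert email. Addresses here are placeholders."""
    msg = EmailMessage()
    msg["Subject"] = f"Freecycle alert: {item_title}"
    msg["From"] = "me@example.com"
    msg["To"] = "me@example.com"
    msg.set_content(f"New post spotted: {item_title}")
    return msg

def send_alert(msg, password):
    """Send via Gmail's SMTP-over-SSL endpoint; the password should
    be a Gmail app password, ideally read from an env var."""
    with smtplib.SMTP_SSL("smtp.gmail.com", 465) as server:
        server.login("me@example.com", password)
        server.send_message(msg)
```

Keeping "build the message" and "send the message" as separate functions makes the first half easy to test without actually emailing anyone.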

So the imports at the top of the script look like this:

# imports
from bs4 import BeautifulSoup
import requests
import smtplib

Note: BeautifulSoup is a third-party package for Python, which means two things: