If you're an information worker and serious about your work, you HAVE to be using technology as your friend.

I'm not talking about social media. I'm talking about actual software that you can use to run automations, save time, and 10x your output. If you aren't learning the skills to operate like this, you're NGMI.

It's never been easier to build things. You don't need to learn computer science, you don't need to break the bank on a degree, and you don't need to continually outsource work to somebody else who can code.

You need personal accountability and the ability to use logic.

I built this, and I can't code. You have no excuses.

Here are the steps I took to create this in a day so that you can do the same. If this is helpful, let me know, and I'll start creating more walkthroughs like this.

I never liked manually sourcing companies to evaluate: having multiple browser windows open distracts me, and I like centralizing information.

I figured that other investors have similar feelings, so I wanted to create a deal flow scraper that pulls in startup data from some of my favorite databases.

That was the genesis of this idea, so I built this to test it.
I decided not to reinvent the wheel here.

When I look for new startups, I essentially scan four different sources for information:

There are countless other places to add, but these are the ones I focused on for the first version of this project.
 
Listing websites (for the most part) will only include a limited amount of information on their landing pages (they want you to click in to find the rest).

To understand what you can pull from a website, you first have to understand which elements are actually included on it. These elements live inside the HTML code of the website, and you can find this code by right-clicking on something within a page and clicking "Inspect Element" (you can also press Command+Option+I on a Mac or F12 on a PC). From this view, you'll be able to see which HTML elements you can pull from each page. This includes things like a title, description, and URL.

If you're non-technical and HTML seems scary, it's not. Take five minutes to read this overview, and you'll be good to go.

We tested a couple of different pieces of software before landing on Simplescraper. It's a Chrome plugin, it's incredibly easy to use, and it fit our budget ($35/month for the paid plan).

Once we downloaded the plugin, it was pretty straightforward to get information pulled in.
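Under the hood, all a scraper like this does is read the page's HTML and pull out the elements you identified with "Inspect Element." Here's a rough sketch using only Python's standard library; the sample HTML fragment and the class names in it are made up for illustration, not taken from any real listing site:

```python
from html.parser import HTMLParser

# A made-up fragment standing in for a real listing page.
SAMPLE_HTML = """
<div class="startup-card">
  <a class="title" href="https://example.com/acme">Acme Robotics</a>
  <p class="description">Warehouse automation for small businesses.</p>
</div>
"""

class ListingParser(HTMLParser):
    """Pull the title, URL, and description out of a listing card."""

    def __init__(self):
        super().__init__()
        self._field = None   # field whose text we're currently inside
        self.record = {}     # scraped output: title, url, description

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("class") == "title":
            self._field = "title"
            self.record["url"] = attrs.get("href")
        elif tag == "p" and attrs.get("class") == "description":
            self._field = "description"

    def handle_data(self, data):
        if self._field and data.strip():
            self.record[self._field] = data.strip()
            self._field = None

parser = ListingParser()
parser.feed(SAMPLE_HTML)
print(parser.record)  # prints the scraped title, url, and description
```

Simplescraper does all of this for you through a point-and-click interface, but it helps to know what's happening behind the curtain when a field comes back empty.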
 
Steps:

Now you've built a basic scraper, and every day the startup data from the website you chose in step 1 will pull into the Google Sheet you created in step 12.

You've done the hard work of setting up a system to consistently pull data. Now you want an easy way to make that data presentable.

We recommend doing this by transferring your Google Sheet data over to Airtable, cleaning it up with add-ons (optional), and then pulling that cleaned data into Softr.

Here's how we recommend doing that.
 
What you need for this step:

If you're familiar with Zapier and no-code automation, this step is straightforward. If you aren't familiar with Zapier, they'll help make this easy once you log into your new account.

The logic is that whenever a new spreadsheet row is created in the Google Sheet for your scraper output, that record will be automatically synced to your Airtable base. When you log into Zapier, you'll need to create a new Zap. Here's what you input when instructed to do that:

The trigger: New spreadsheet row created in Google Sheets

The action: Create new record in Airtable

Alternatively, you can just copy the Zap we've already made (linked HERE).
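If you're curious what that Zap amounts to, it's essentially one API call per new row. Here's a hedged sketch using Python's standard library: the base ID, table name, and field names are placeholders you'd swap for your own, and Zapier handles the triggering, authentication, and retries that this sketch leaves out.

```python
import json
import urllib.request

AIRTABLE_API = "https://api.airtable.com/v0"
BASE_ID = "appXXXXXXXXXXXXXX"   # placeholder: your Airtable base ID
TABLE_NAME = "Deals"            # placeholder: your table name

def row_to_record(row):
    """Map one scraper-output sheet row to an Airtable create-record body."""
    return {
        "fields": {
            "Name": row["title"],
            "Description": row["description"],
            "URL": row["url"],
        }
    }

def create_record(row, api_key):
    """POST one record to Airtable -- the step the Zap performs for you."""
    req = urllib.request.Request(
        f"{AIRTABLE_API}/{BASE_ID}/{TABLE_NAME}",
        data=json.dumps(row_to_record(row)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Build (but don't send) a record for a sample row:
record = row_to_record({
    "title": "Acme Robotics",
    "description": "Warehouse automation",
    "url": "https://example.com/acme",
})
print(json.dumps(record, indent=2))
```

The Zap is doing exactly this mapping for you, which is why getting the field names in your Airtable base to match your sheet columns up front saves debugging later.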
 
Some sites make it hard to scrape image data, so this is a workaround if the presentability of your data matters to you. Skip to the next section if that's not you.

What you need for this step:

The trigger: New record is created in Airtable

Action #1: Generate screenshot URL from Urlbox

Action #2: Update record in Airtable

Alternatively, you can just copy the Zap we've already made (linked HERE).
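For context on what the Urlbox step produces: it turns each scraped page URL into a screenshot-image URL that you can drop into an Airtable attachment field. The sketch below builds a signed render link; the key and secret are placeholders, and the URL format is my reading of Urlbox's render-link scheme, so double-check it against their documentation before relying on it.

```python
import hashlib
import hmac
from urllib.parse import urlencode

URLBOX_KEY = "YOUR_PUBLISHABLE_KEY"   # placeholder
URLBOX_SECRET = "YOUR_SECRET"         # placeholder

def screenshot_url(page_url, width=1280, height=800):
    """Build a signed Urlbox render link for a page screenshot."""
    query = urlencode({"url": page_url, "width": width, "height": height})
    # Sign the query string so others can't run screenshots on your account.
    token = hmac.new(URLBOX_SECRET.encode(), query.encode(), hashlib.sha1).hexdigest()
    return f"https://api.urlbox.io/v1/{URLBOX_KEY}/{token}/png?{query}"

print(screenshot_url("https://example.com/acme"))
```

Zapier's Urlbox integration generates this URL for you in Action #1; Action #2 just writes it back into the record it came from.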
 
What you need for this step:

If you're still following along, you've found a website to scrape, you've developed a system to consistently scrape website data into a Google Sheet, and you've transformed that data into something presentable by adding images where they weren't otherwise available.

Now comes the fun part: making all of that hard work look like a real application.

We recommend using Softr for this step. We started using their software two months ago, and it's given us superpowers.
 
After you create an account (they offer free trials), you'll need to select an existing template to get started.

This is a matter of personal preference, but if you plan on building something similar to this, you'll want something with lists built in. Most of these are in the "Resource Directories" section of the templates.

When you click into your new app, it will be filled with dummy data.

To change this, you'll need to click into the list section within Softr, click into "Data" on the right side of the screen, and connect the Airtable base you built in the previous step.

If you're using Softr for the first time, it will ask for your Airtable API key to authenticate you. If you're struggling to get that set up, Softr has made a walkthrough explainer, and I've linked it here.
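If the key gets rejected and you want to rule out a typo, you can sanity-check it yourself with a direct call to the Airtable API before blaming Softr. A minimal sketch, assuming a base ID and table name of your own (the ones in the test below are placeholders):

```python
import urllib.error
import urllib.request

def airtable_check_request(api_key, base_id, table):
    """Build a request that lists a single record from the base."""
    return urllib.request.Request(
        f"https://api.airtable.com/v0/{base_id}/{table}?maxRecords=1",
        headers={"Authorization": f"Bearer {api_key}"},
    )

def check_airtable_key(api_key, base_id, table):
    """Return True if Airtable accepts the key for this base."""
    try:
        with urllib.request.urlopen(airtable_check_request(api_key, base_id, table)) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False
```

If this returns True with the same key you gave Softr, the key itself is fine and the problem is elsewhere in the setup.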
 
Almost done. Now you have the right data in Softr, and you just need to map it correctly.

In the list section of Softr, go into "Features" and, for each of the item's fields, map it to the correct data, rename it so it has the correct label, and remove any unnecessary items.

If you have a button or something similar and want to redirect people to a URL when they click it, you have two options:

That's it! You've now taken a project from 0 to 1, and you have a tool you can use to automate the collection of whatever information you want.
 
This post is brought to you by:

Softr: The easiest way to build professional web apps on top of Airtable
If you're using Airtable to store data, you HAVE to layer Softr on top.

Their software lets you turn your ugly databases into beautiful web apps. We've used Softr to build our investor directory, public roadmap, and the Signal Tracker that this newsletter walked through how to build.

Get started with their free plan, and try out any of their paid plans at no cost for 30 days.