5 Steps to Building a No-Code Product: How We Built a Web App Without Writing a Line of Code

Automation > everything

If you’re an information worker and serious about your work, you HAVE to make technology your friend.

I’m not talking about social media. I’m talking about actual software that you can use to run automations, save time, and 10x your output. If you aren’t learning the skills to operate like this, you’re NGMI.

It’s never been easier to build things. You don’t need to learn computer science, you don’t need to break the bank on a degree, and you don’t need to continually outsource work to somebody else who can code.

All you need is personal accountability and the ability to use logic.

I built this, and I can’t code. You have no excuses.

Here are the steps I took to create this in a day so that you can do the same. If this is helpful, let me know, and I’ll start creating more walkthroughs like this.

Tools you need

  • Simplescraper (Chrome plugin; $35/month for the paid plan)
  • Google Sheets
  • Zapier
  • Airtable
  • Urlbox (optional, for screenshot images)
  • Softr

Ideation and background

I never liked manually sourcing companies to evaluate: having multiple browser windows open distracts me, and I like keeping my information centralized.

I figured that other investors have similar feelings, so I wanted to create a deal flow scraper that pulls in startup data from some of my favorite databases.

That was the genesis of this idea, and I built this to test it.

Step 1: determine sources of information

I decided not to reinvent the wheel here.

When I look for new startups, I essentially scan four different sources for information.

There are countless other places to add, but those four are the ones I focused on for the first version of this project.

Step 2: find what you want to gather from listing websites

Listing websites (for the most part) will only include a certain amount of information on their landing page (they want you to click in to find the rest).

To understand what you can pull from a website, you first have to understand what elements are actually included on it. These elements live inside the HTML code of the website, and you can find this code by right-clicking on something within a page and clicking ‘Inspect Element’ (you can also press Command+Option+I on a Mac or F12 on a PC). From this view, you’ll be able to see which HTML elements you can pull from each page. This includes things like a title, description, and URL.
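
You’ll never have to write this yourself for this project, but if you’re curious what a scraper actually does with those elements, here’s a minimal Python sketch (using the requests and BeautifulSoup libraries, against a placeholder URL). The no-code tools below do the equivalent for you:

    # Minimal sketch of what a scraper does: fetch a page's HTML, then
    # pick out elements like titles and links. The URL is a placeholder.
    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://example.com/startups", timeout=30).text
    soup = BeautifulSoup(html, "html.parser")

    # The page title, plus the text and destination of every link.
    print(soup.title.string)
    for link in soup.find_all("a"):
        print(link.get_text(strip=True), link.get("href"))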

If you’re non-technical and HTML seems scary, it’s not. Take five minutes to read this overview, and you’ll be good to go.

Step 3: scrape a website without coding

We tested a couple of different pieces of software before landing on Simplescraper. It’s a Chrome plugin, it’s incredibly easy to use, and it matched our budget ($35/month for the paid plan).

Once we downloaded the plugin, it was pretty straightforward to start pulling information in.

Steps:

  1. Go to the site you want to scrape (using Google Chrome as your browser).
  2. Click into the Simplescraper plugin and select ‘Scrape this website’.
  3. Create a property and give it a name.
  4. Hover over the property you want to scrape into Google Sheets.
  5. Click into the property, review the previewed results, then click the check mark.
  6. Once you’re done scraping properties, click ‘View results’.
  7. After being redirected to the results page, click ‘Save recipe’.
  8. Give your recipe a name, make sure all of the information is correct, then click into ‘Show advanced options’.
  9. Schedule the scraper to run once a day.
  10. Keep the rest of the settings untouched.
  11. Create the recipe, then click into your new recipe shown on the left-hand side of your screen.
  12. Click into ‘Integrate’, then toggle on Google Sheets.

Now you’ve built a basic scraper, and every day the property data from the website you chose in step 1 will be pulled into the Google Sheet you connected in step 12.

Step 4: transform your data

You’ve done the hard work of setting up a system to consistently pull data. Now you want an easy way to make that data presentable.

We recommend doing this by transferring your Google Sheet data over to Airtable, cleaning it up with add-ons (optional), and then pulling that cleaned data into Softr.

Here’s how we recommend doing that.

Google Sheets ➡️ Airtable

What you need for this step:

  • An Airtable base with the columns mapped to match your Google Sheet
  • A Zapier subscription (if you aren’t already using Zapier, hopefully this changes that)

If you’re familiar with Zapier and no-code automation, this step is straightforward. If you aren’t familiar with Zapier, they’ll help make this easy once you log into your new account.

The logic is that whenever a new spreadsheet row is created in the Google Sheet for your scraper output, that record will be automatically synced to your Airtable base. When you log into Zapier, you’ll need to create a new Zap. Here’s what you input when instructed to do that:

The trigger: New spreadsheet row created in Google Sheets

The action: Create new record in Airtable
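
Under the hood, this Zap boils down to ‘watch for a new row, then create a matching Airtable record’. For the curious, here’s a rough Python sketch of that action using Airtable’s REST API via the requests library; the token, base ID, table name, and field names are placeholders, not values from this project:

    import requests

    # Placeholders; substitute your own Airtable credentials and IDs.
    AIRTABLE_TOKEN = "your-api-token"
    BASE_ID = "appXXXXXXXXXXXXXX"
    TABLE_NAME = "Startups"

    def create_airtable_record(row: dict) -> dict:
        """Mirror one scraped spreadsheet row into an Airtable record."""
        resp = requests.post(
            f"https://api.airtable.com/v0/{BASE_ID}/{TABLE_NAME}",
            headers={"Authorization": f"Bearer {AIRTABLE_TOKEN}"},
            json={"fields": row},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

    # This is roughly what Zapier does for each new spreadsheet row.
    create_airtable_record({"Name": "Acme", "URL": "https://acme.example"})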

Alternatively, you can just copy the Zap we’ve already made (linked HERE).

Adding images (optional)

Some sites make it hard to scrape image data, so this is a workaround if the presentability of your data matters to you. Skip to the next section if it doesn’t.

What you need for this step:

  • A subscription to Urlbox
  • A new column in your Airtable base to show images (make this column an ‘Attachment’ field)

The trigger: New record is created in Airtable

The action #1: Generate screenshot URL from Urlbox

  • Output file type: PNG
  • Viewport Width: 320
  • Viewport Height: 600
  • Hide cookie banners: true
  • Retina: false
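
To make those options concrete, here’s a minimal Python sketch of the kind of render URL the Urlbox step produces with the settings above. The API key is a placeholder, and the parameter names should be double-checked against Urlbox’s current documentation; treat this as an illustration, not a drop-in:

    from urllib.parse import urlencode

    # Placeholder key; parameter names follow Urlbox's render API but
    # should be verified against their docs before relying on them.
    URLBOX_API_KEY = "your-urlbox-key"

    def screenshot_url(target_url: str) -> str:
        params = urlencode({
            "url": target_url,
            "width": 320,    # viewport width
            "height": 600,   # viewport height
            "hide_cookie_banners": "true",
            "retina": "false",
        })
        return f"https://api.urlbox.io/v1/{URLBOX_API_KEY}/png?{params}"

    print(screenshot_url("https://acme.example"))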

The action #2: Update record in Airtable

  • Update the empty product image field to include the screenshot URL created by Urlbox
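
Here’s roughly what that update looks like against Airtable’s REST API: attachment fields accept a list of objects with a url, and Airtable downloads and stores its own copy of the file. The field name and record ID below are hypothetical:

    import requests

    # Reuses the placeholder AIRTABLE_TOKEN, BASE_ID, and TABLE_NAME
    # from the earlier sketch.
    AIRTABLE_TOKEN = "your-api-token"
    BASE_ID = "appXXXXXXXXXXXXXX"
    TABLE_NAME = "Startups"

    def attach_screenshot(record_id: str, image_url: str) -> None:
        """Write the Urlbox screenshot URL into an Attachment field."""
        resp = requests.patch(
            f"https://api.airtable.com/v0/{BASE_ID}/{TABLE_NAME}/{record_id}",
            headers={"Authorization": f"Bearer {AIRTABLE_TOKEN}"},
            json={"fields": {"Product image": [{"url": image_url}]}},
            timeout=30,
        )
        resp.raise_for_status()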

Alternatively, you can just copy the Zap we’ve already made (linked HERE).

Step 5: put a front end on your cleaned data

What you need for this step:

  • A Softr account (they offer free trials)
  • The Airtable base you built in the previous step

If you’re still following along, you’ve found a website to scrape, you’ve developed a system to consistently scrape website data into a Google Sheet, and you’ve transformed that data into something presentable by adding images if they weren’t otherwise available.

Now comes the fun part: making all of that hard work look like a real application.

We recommend using Softr for this step. We started using their software two months ago, and it’s given us superpowers.

Finding a template

After you create an account (they offer free trials), you’ll need to select an existing template to get started.

This is a matter of personal preference, but if you plan on building something similar to this, you’ll want something with lists built in. Most of these are in the ‘Resource Directories’ section of the templates.

Connecting Softr to Airtable

When you click into your new app, it will be filled with dummy data.

To change this, you’ll need to click into the list section within Softr, click into ‘Data’ on the right side of the screen, and connect the Airtable base you built in the previous step.

If you are using Softr for the first time, they will ask for your Airtable API key to authenticate you. If you’re struggling to get that set up, Softr has made a walkthrough explainer, and I’ve linked it here.

Mapping your data in Softr

Almost done. Now you have the right data in Softr, and you just need to map it correctly.

In the list section of Softr, go into ‘Features’ and, for each of the item’s fields, map it to the correct data, rename it so it has the correct label, and remove any unnecessary items.

If you have a button or something similar and want to redirect people to a URL when they click it, you have two options:

  • You can reroute to whatever URL you are capturing in your database if you are scraping it as a property. This is the easy option.
  • You can build a separate page in Softr that lets you click into any company and see more information about that specific company. This is the harder option, but it is worth doing if you want your users to stick around longer (versus exiting to the company URL).

That’s it! You’ve now taken a project from 0 to 1, and you have a tool you can use to automate the collection of whatever information you want.

This post is brought to you by:

Softr: The easiest way to build professional web apps on top of Airtable

If you’re using Airtable to store data, you HAVE to layer Softr on top.

Their software lets you turn your ugly databases into beautiful web apps. We’ve used Softr to build our investor directory, our public roadmap, and the Signal Tracker this newsletter just walked you through building.

Get started with their free plan, and try out any of their paid plans at no cost for 30 days.
