Inside Webmo — Making a simple Twitter Bot with Google Sheets and Zapier

Last month I introduced my Twitter bot to track the adoption of Web Monetization across websites on the internet.

I was really happy to finalize this little project after it had given me a solid headache for a while, but I really wanted to find a way to make it work.

As requested by others and promised by me, I wanted to share a look under the hood of Webmo (that's his name now) and how it all works.

You'll see, it actually isn't that complicated, I just had to put the right pieces together.

Finding the Data

Everything started with discovering a website called PublicWWW. It's a search engine that crawls the source code of websites and indexes it, letting users search for specific code snippets and find out which websites contain them.

Since web monetization is integrated via a simple meta tag on a website, all I had to do was to find the right query to look for exactly this code snippet.

The query returns a list of websites, sorted by page rank, along with the total number of websites that have the Web Monetization meta tag integrated.

You can run the query above yourself by clicking here.

There is a small pitfall with my query, though: in theory, websites can also integrate the meta tag with the "name" and "content" attributes flipped around, which this query would no longer match. But since most websites stick to the consistent syntax, it's sufficient for me.
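For reference, the tag usually looks like this (the payment pointer below is just an illustrative placeholder, not a real one), with the flipped variant my query would miss underneath:

```html
<!-- The usual attribute order, which my query matches -->
<meta name="monetization" content="$wallet.example.com/alice">

<!-- Attributes flipped around: still valid HTML, but not matched by the query -->
<meta content="$wallet.example.com/alice" name="monetization">
```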

Collecting the Data

The total number of websites was what I was interested in for that project.

I knew that I wanted to create a chart in the end that gets posted to Twitter on a daily basis, so I needed to find a way to pull the data into a Google sheet.

I spent quite some time figuring out a way to pull the total number of websites into the Google sheet — fiddling around with the basic API access, CSV downloads, etc. — but nothing seemed to work. I was already about to give up, considering just writing the number into my spreadsheet manually on a daily basis (oh god).

But it turned out to be just another lesson that sometimes the solution is much simpler than we think it is.

I knew that Google Sheets has a built-in function to read static website content: the =IMPORTXML function.

Combining this function with the PublicWWW URL from above and a simple XPath selector got me exactly what I needed.

=IMPORTXML("https://publicwww.com/websites/%22meta+name%3D%5C%22monetization%5C%22+content%3D%5C%22%24%22/","//h4")

You can copy this function into any Google sheet to see it work.

The *//h4* selector comes from this piece of text on the PublicWWW website being formatted as a fourth-level heading. So all that function does is pull any heading from the website that is tagged as an h4. And the line showing the total number of websites is apparently the only one of that kind. Lucky me.

I'm honestly amazed that it works this well, since the total number is not actually static content but is rendered via JavaScript on each user query, which I had read is a big obstacle for the =IMPORTXML function. But hey, it works.

From the moment I was able to pull the number into my Google sheet, I knew that I would make it work. The most important part was done.

Below you can see the data import on the left.

URL > Selector > Result > Adjusted (I only need the number).

The next step I needed to get done was to automatically fetch the data once a day and collect it into a list, from which I could make a chart that keeps evolving over time.

Therefore, I wrote a script with the Google Apps Script editor. It took me a few days and several iterations to figure out, as I hadn't really worked with this stuff before, but I got the hang of it eventually. It looks more complicated than what it actually does.

When it runs, the script appends a random number as a query string to the end of the defined URL from above and updates the =IMPORTXML function. This forces Google Sheets to visit the website again and fetch fresh data each time the script runs. Otherwise Google Sheets just gets lazy and uses the cached data, returning the same result again and again.

Then the new result gets added to the top of the list and the current date and time are added as additional information. That's basically it.
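The two steps above can be sketched roughly like this (function and variable names are my own; the real version runs in the Google Apps Script editor and writes to the sheet via SpreadsheetApp instead of the plain array used here):

```javascript
// Build the IMPORTXML formula with a random number appended as a query string.
// The changed formula text forces Google Sheets to re-fetch the page instead
// of returning its cached result.
function buildImportXmlFormula(url, selector) {
  const cacheBuster = Math.floor(Math.random() * 1000000);
  return '=IMPORTXML("' + url + "?" + cacheBuster + '","' + selector + '")';
}

// Prepend the freshly fetched total to the top of the history list,
// together with the current date and time.
function prependResult(history, total) {
  history.unshift([new Date().toISOString(), total]);
  return history;
}
```

In the actual script, the formula would be written into the import cell with something like `sheet.getRange("A1").setFormula(...)`, and the history rows would live in the sheet itself rather than in a JavaScript array.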

The next step was to make the script run each day on its own, without me triggering it manually. Luckily, there's exactly such a function available in the Google Apps Script dashboard. All I had to do was select my new script and set the timer to daily.

Making the Chart

This was the easiest part. I guess most people have created a chart in Excel before, so it's really the same with a Google sheet. Once the data is there, the chart is just a few clicks away.

The only thing I didn't know yet was how to get the image out of the spreadsheet and into an automated Tweet each day.

Fortunately, Google delivers again on that part. I figured out that you can simply publish a chart as an interactive or image version right from Google Sheets. You can find this option by clicking on a chart and then on the menu in the top right.

This functionality gave me a link that always shows the latest version of the chart as an image. So convenient for what I needed!

Pushing the Chart to Twitter

The last and most important step was of course to get the chart to Twitter, because that's the whole point of this bot — to raise awareness by tracking the adoption of Web Monetization.

Everyone loves charts, and they can be a great tool to make people curious about something, especially when it's a chart that keeps on growing.

Enter Zapier. This great tool lets you combine different applications to automate workflows very easily, often without any coding knowledge required.

In this case, the first step was to connect my Google sheet and Webmo's Twitter account to Zapier.

Then I had to define the trigger for the task. I told Zapier to run it whenever a particular cell in the Google sheet changed, which happens every time the script runs.

Lastly, I needed to format the tweet that gets published each day with the dynamic data entries: the total number, the weekday, the absolute and relative 7-day differences, and the date.
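The 7-day difference fields are just simple arithmetic on the collected list. A rough sketch of the calculation (my own helper for illustration, not the actual sheet formula or Zapier field):

```javascript
// Compare today's total against the value recorded seven days earlier:
// absolute change, and percentage change relative to the older value.
function weeklyDiff(today, sevenDaysAgo) {
  const absolute = today - sevenDaysAgo;
  const relative = ((absolute / sevenDaysAgo) * 100).toFixed(2) + "%";
  return { absolute, relative };
}
```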

To attach the image of the chart, I simply took the link provided by my Google sheet.

That's everything required by Zapier to do exactly what I want it to do day after day.

And since Zapier's free plan includes 100 tasks a month, the whole Twitter bot runs at no cost to me.

The result is a tweet like the one below, being posted on a daily basis through Webmo's Twitter account. Make sure to follow it!

That wraps up my little guide for this Twitter bot project. It was a lot of fun figuring it out, even though it took up more time than I had anticipated. It felt like trying to put a puzzle together.

But I'm very happy with the outcome and look forward to seeing the chart's values increase over the coming weeks and months! 📈

As I mentioned in my previous post — momentum is building!

💻 Check out my Blog

🔔 Follow me on Coil

📺 See my videos on Cinnamon

📺 Subscribe on YouTube

🐦 Reach out on Twitter

As a Coil subscriber, you can see the magic of my script and Zapier happen in a video recording I took of the whole process. 👇
