
Slackbot using Google Cloud serverless functions

Slack bot using Google Cloud Functions to post a roundup of recently created channels

Srijan Choudhary
9 min read
Google Cloud Functions Slack Bot

At my org, we wanted a simple Slack bot that posts a roundup of new channels created recently in the workspace to a channel. While writing this is easy enough, I wanted to do it using Google Cloud Functions with Python, trying to follow best practices as much as possible.

Here's what the overall flow will look like:

We want this roundup posted on a schedule (say, daily), so Cloud Scheduler sends an event to a Google Pub/Sub topic, which triggers our Cloud Function. The function queries the Slack API for channel details, filters out the recently created ones, and posts a summary back to a Slack channel. Secret Manager is used to securely store Slack's bot token and signing secret.
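The flow described above can be sketched roughly as (a hand-drawn diagram, not generated output):

```
Cloud Scheduler ──cron──▶ Pub/Sub topic ──event──▶ Cloud Function
                                                       │
                   Secret Manager ──secrets───────────▶│
                                                       ▼
                                                   Slack API
                                      (list channels, post message)
```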

Note that the credentials shown in any screenshots below are not valid.

Create the Slack app

The first step is to create the Slack app. Go to https://api.slack.com and click "Create an app". Choose "From scratch" in the first dialog; enter an app name and choose a workspace for your app in the second dialog. On the next screen, copy the "Signing Secret" from the "App Credentials" section and save it for later use.

Next, go to the "OAuth & Permissions" tab in the left sidebar, and scroll down to "Scopes" -> "Bot Token Scopes". Here, add the scopes:

  • channels:read: required to query public channels and find their creation times
  • chat:write: required to write to a channel (where the bot is invited)

Next, scroll up on the same screen and click "Install to Workspace", then click "Allow" on the next screen to authorize the installation. Finally, copy the "Bot User OAuth Token" from the "OAuth Tokens for Your Workspace" section on the same page and save it for later use.

💡
Keep track of the Bot User OAuth Token and Signing Secret you copied above.

Post to a Slack channel from a Google Cloud Function

Next, we will use the credentials copied above to let a Google Cloud Function send a message to a Slack channel.

Google Cloud Basic Setup

We will use the gcloud CLI for the following sections, so install and initialize the Google Cloud CLI if you haven't already. If you already have it, run gcloud components update to update it to the latest version.

Create a new project for this if required, or choose an existing project, set it as the default, and export the project ID as a shell environment variable for later use. Also export the region you want to use.

export PROJECT_ID=slackbot-project
export REGION=us-central1

gcloud config set project ${PROJECT_ID}

You will have to enable billing for this project to be able to use some of the functionality we require.

You may also have to enable the Secret Manager, Cloud Functions, Cloud Build, Artifact Registry, and Logging APIs if this is the first time you're using Cloud Functions in this project. Note that some services like Secret Manager need billing to be set up before they can be enabled.

gcloud services enable --project ${PROJECT_ID} \
        secretmanager.googleapis.com \
        cloudfunctions.googleapis.com \
        cloudbuild.googleapis.com \
        artifactregistry.googleapis.com \
        logging.googleapis.com

Create a service account

By default, Cloud Functions uses a default service account as its identity for function execution. These default service accounts have the Editor role, which allows them broad access to many Google Cloud services. Of course, this is not recommended for production, so we will create a new service account for this and grant it the minimum permissions that it requires.

SA_NAME=channelbot-sa
SA_EMAIL=${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com

gcloud iam service-accounts create ${SA_NAME} \
    --description="Service Account for ChannelBot slackbot" \
    --display-name="ChannelBot SlackBot SA"

Store secrets and give permissions to service account

First, we need to store the secrets in Secret Manager.

 printf '%s' "${SLACK_BOT_TOKEN}" | gcloud secrets create \
    channelbot-slack-bot-token --data-file=- \
    --project=${PROJECT_ID} \
    --replication-policy=user-managed \
    --locations=${REGION}

 printf '%s' "${SLACK_SIGNING_SECRET}" | gcloud secrets create \
    channelbot-slack-signing-secret --data-file=- \
    --project=${PROJECT_ID} \
    --replication-policy=user-managed \
    --locations=${REGION}

Then grant our service account the roles/secretmanager.secretAccessor role on these secrets:

gcloud secrets add-iam-policy-binding \
    projects/${PROJECT_ID}/secrets/channelbot-slack-bot-token \
    --member serviceAccount:${SA_EMAIL} \
    --role roles/secretmanager.secretAccessor

gcloud secrets add-iam-policy-binding \
    projects/${PROJECT_ID}/secrets/channelbot-slack-signing-secret \
    --member serviceAccount:${SA_EMAIL} \
    --role roles/secretmanager.secretAccessor

Create and deploy the function

Here's a simple HTTP function that sends a message to Slack on any HTTP call:

import functions_framework
from slack_bolt import App

# process_before_response must be True when running on FaaS
app = App(process_before_response=True)

print('Function has started')

@functions_framework.http
def send_to_slack(request):
    print('send_to_slack triggered')
    channel = '#general'
    text = 'Hello from Google Cloud Functions!'
    app.client.chat_postMessage(channel=channel, text=text)
    return 'Sent to slack!'
src-v1/main.py
functions-framework
slack_bolt
src-v1/requirements.txt

Assuming main.py and requirements.txt are present in the src-v1 folder, deploy using:

gcloud beta functions deploy channelbot-send-to-slack \
    --gen2 \
    --runtime python310 \
    --project=${PROJECT_ID} \
    --service-account=${SA_EMAIL} \
    --source ./src-v1 \
    --entry-point send_to_slack \
    --trigger-http \
    --allow-unauthenticated \
    --region ${REGION} \
    --memory=128MiB \
    --min-instances=0 \
    --max-instances=1 \
    --set-secrets 'SLACK_BOT_TOKEN=channelbot-slack-bot-token:latest,SLACK_SIGNING_SECRET=channelbot-slack-signing-secret:latest' \
    --timeout 60s
💡
We're using --allow-unauthenticated here just to test it out. It will be removed in later sections.

Test it out

Once the deployment is complete, we can view the function logs using:

gcloud beta functions logs read channelbot-send-to-slack \
	--project ${PROJECT_ID} --gen2

If everything above was successful, one of the recent log statements should say: Function has started.

Next, add the bot to the #general channel by running /invite @ChannelBot in that channel in your Slack workspace.

Next, find the service endpoint using:

gcloud functions describe channelbot-send-to-slack \
    --project ${PROJECT_ID} \
    --gen2 \
    --region ${REGION} \
    --format "value(serviceConfig.uri)"

This will give a URL like https://channelbot-send-to-slack-ga6Ofi9to0-uc.a.run.app.

To trigger the channel post, just do curl ${SERVICE_URL}. This should result in a test message from ChannelBot to the #general channel.

ChannelBot message from Google Cloud Functions

Trigger via Google Pub/Sub

Now, instead of an unauthenticated HTTP trigger, we would like to trigger this via Google Pub/Sub. We would also like to pass the channel name and the message to post in the event.

Google Pub/Sub basics

Pub/Sub enables you to create systems of event producers and consumers, called publishers and subscribers. Publishers communicate with subscribers asynchronously by broadcasting events. Some core concepts:

  • Topic. A named resource to which messages are sent by publishers.
  • Subscription. A named resource representing the stream of messages from a single, specific topic, to be delivered to the subscribing application.
  • Message. The combination of data and (optional) attributes that a publisher sends to a topic and is eventually delivered to subscribers.
  • Publisher. An application that creates and sends messages to a single or multiple topics.

In this section, we will create a topic, create a subscription for our Cloud Function to listen to messages on that topic, and publish messages to it manually using the gcloud CLI. Each message will contain the channel name and text to post, and the function will post that text to the specified Slack channel.
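Before wiring anything up, it helps to see the payload round-trip: Pub/Sub delivers the message body base64-encoded inside the event, so the function will have to base64-decode and then JSON-parse it. A minimal local simulation of that round-trip (no GCP calls involved):

```python
import base64
import json

# What we will publish to the topic
payload = {"channel": "#general", "text": "Hello from Cloud Pub/Sub!"}

# Pub/Sub base64-encodes the message body before delivering it to the function
encoded = base64.b64encode(json.dumps(payload).encode()).decode()

# Inside the function, reverse the process to recover the original dict
decoded = json.loads(base64.b64decode(encoded).decode())
print(decoded["channel"])  # → #general
```

This is exactly the decode path the Pub/Sub handler below performs on `cloud_event.data["message"]["data"]`.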

Create pub/sub topic

First, we need to create a topic.

export PUBSUB_TOPIC=channelbot-pubsub
gcloud pubsub topics create ${PUBSUB_TOPIC} \
	--project ${PROJECT_ID}

Grant permissions to the service account

Next, we need to give the roles/pubsub.editor role to the service account we're using for the function execution so that it can create a subscription to this pub/sub topic.

gcloud pubsub topics add-iam-policy-binding ${PUBSUB_TOPIC} \
	--project ${PROJECT_ID} \
    --member serviceAccount:${SA_EMAIL} \
    --role roles/pubsub.editor

Update the function code

Here's the main.py we'll need to listen for Pub/Sub events, extract the channel and text, and send the message to Slack:

import base64
import json
import functions_framework
from slack_bolt import App

# process_before_response must be True when running on FaaS
app = App(process_before_response=True)

print('Function has started')

# Triggered from a message on a Cloud Pub/Sub topic.
@functions_framework.cloud_event
def pubsub_handler(cloud_event):
    try:
        data = base64.b64decode(
            cloud_event.data["message"]["data"]).decode()
        print("Received from pub/sub: %s" % data)
        event_data = json.loads(data)
        channel = event_data["channel"]
        text = event_data["text"]
        app.client.chat_postMessage(channel=channel, text=text)
    except Exception as e:
        print("Error processing message: %s" % e)
src-v2/main.py

Before deploying, we also need to enable the Eventarc API in this project.

gcloud services enable --project ${PROJECT_ID} \
	eventarc.googleapis.com

Deploy and Test

Here's a slightly modified version of the deploy command for this:

gcloud beta functions deploy channelbot-send-to-slack \
    --gen2 \
    --runtime python310 \
    --project ${PROJECT_ID} \
    --service-account ${SA_EMAIL} \
    --source ./src-v2 \
    --entry-point pubsub_handler \
    --trigger-topic ${PUBSUB_TOPIC} \
    --region ${REGION} \
    --memory 128MiB \
    --min-instances 0 \
    --max-instances 1 \
    --set-secrets 'SLACK_BOT_TOKEN=channelbot-slack-bot-token:latest,SLACK_SIGNING_SECRET=channelbot-slack-signing-secret:latest' \
    --timeout 60s

The main changes are:

  • Changed entry-point to the new function pubsub_handler
  • Replaced --trigger-http with --trigger-topic
  • Removed --allow-unauthenticated

Before sending a Pub/Sub message, we also need to grant the roles/run.invoker role to our service account so that it can invoke the newly deployed function.

gcloud run services add-iam-policy-binding channelbot-send-to-slack \
    --project ${PROJECT_ID} \
    --region ${REGION} \
    --member=serviceAccount:${SA_EMAIL} \
    --role=roles/run.invoker

To test this out, we can send a pub/sub message using gcloud cli:

gcloud pubsub topics publish ${PUBSUB_TOPIC} \
	--project ${PROJECT_ID} \
	--message '{"channel": "#general", "text": "Hello from Cloud Pub/Sub!"}'
ChannelBot message via pub/sub

Post new channels roundup using Cloud Scheduler

Manually post recently created channels

Now that we can trigger a Slack message from Pub/Sub, we can add some logic to fetch the recently created channels from Slack and post them as a message on this trigger.

Here's the modified main.py to do this:

import base64
import json
import time
import functions_framework
from slack_bolt import App

# process_before_response must be True when running on FaaS
app = App(process_before_response=True)

print('Function has started')

# Triggered from a message on a Cloud Pub/Sub topic.
@functions_framework.cloud_event
def pubsub_handler(cloud_event):
    try:
        data = base64.b64decode(
            cloud_event.data["message"]["data"]).decode()
        print("Received from pub/sub: %s" % data)
        event_data = json.loads(data)
        max_days = event_data["max_days"] # Max age of channels
        channel = event_data["channel"]
        recent_channels = get_recent_channels(app, max_days)
        if len(recent_channels) > 0:
            blocks, text = format_channels(recent_channels, max_days)
            app.client.chat_postMessage(channel=channel, text=text,
                                        blocks=blocks)
        else:
            print("No recent channels")
    except Exception as e:
        print("Error processing message: %s" % e)


def get_recent_channels(app, max_days):
    max_age_s = max_days * 24 * 60 * 60
    # Note: this reads only the first page of results; for large
    # workspaces, paginate using the response's next_cursor
    result = app.client.conversations_list()
    channels = result["channels"]
    now = time.time()
    return [c for c in channels if now - c["created"] <= max_age_s]

def format_channels(channels, max_days):
    text = ("%s channels created in the last %s day(s):" %
            (len(channels), max_days))
    blocks = [{
        "type": "header",
        "text": {
            "type": "plain_text",
            "text": text
        }
    }]
    summary = ""
    for c in channels:
        summary += "\n*<#%s>*: %s" % (c["id"], c["purpose"]["value"])
    blocks.append({
        "type": "section",
        "text": {
            "type": "mrkdwn",
            "text": summary
        }
    })
    return blocks, text
src-v3/main.py
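The age filter in get_recent_channels can be sanity-checked locally without touching the Slack API. The snippet below is a standalone reimplementation of just the filter (the `created` field matches what conversations_list returns: a Unix timestamp), run against fake channel records:

```python
import time

def filter_recent(channels, max_days, now=None):
    # Standalone copy of the age filter from get_recent_channels
    now = now if now is not None else time.time()
    max_age_s = max_days * 24 * 60 * 60
    return [c for c in channels if now - c["created"] <= max_age_s]

now = 1_700_000_000
channels = [
    {"id": "C01", "created": now - 3600},        # 1 hour old  -> kept
    {"id": "C02", "created": now - 10 * 86400},  # 10 days old -> dropped
]
recent = filter_recent(channels, max_days=7, now=now)
print([c["id"] for c in recent])  # → ['C01']
```

Passing `now` explicitly keeps the check deterministic; the real function just uses time.time().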

After deploying this with the same command above (just change --source ./src-v2 to --source ./src-v3), we can send a pub/sub event to trigger it:

gcloud pubsub topics publish ${PUBSUB_TOPIC} \
	--project ${PROJECT_ID} \
	--message '{"channel": "#general", "max_days": 7}'

And it posts a message like this:

Recently created channels posted by ChannelBot

Create schedule

Next, we want to post this message periodically. For this, we will configure a cron job in Google Cloud Scheduler to send a Pub/Sub event with the required parameters on a schedule.

Before we create a schedule, we have to enable the Cloud Scheduler API:

gcloud services enable --project ${PROJECT_ID} \
	cloudscheduler.googleapis.com

To schedule the Pub/Sub trigger at 16:00 UTC every day:

gcloud scheduler jobs create pubsub channelbot-job \
	--project ${PROJECT_ID} \
	--location ${REGION} \
	--schedule "0 16 * * *" \
	--time-zone "UTC" \
	--topic ${PUBSUB_TOPIC} \
	--message-body '{"channel": "#general", "max_days": 1}'

After this, a Pub/Sub event will be fired to the channelbot-pubsub topic every day, resulting in a Slack message to #general with a list of channels created in the last day.

Closing Thoughts

Full code samples for this can be found in this GitHub repo. I've also included a Makefile with targets split into sections corresponding to the different steps in this post.

I also plan to follow this up with a part 2 where we will use Slack's slash commands to let the end user of this bot set up the channel and frequency of the recent-channels post, and even configure multiple schedules. Please comment below if this is something you'd be interested in.

