
Automating Routine Tasks with Python + Google Cloud Platform + Line Bot

Creating a daily automatic check-in script using a check-in reward app as an example


ℹ️ The following content was translated from the original Chinese by OpenAI.

Click here to view the original Chinese version.



Photo by [Paweł Czerwiński](https://unsplash.com/@pawel_czerwinski?utm_source=unsplash&utm_medium=referral&utm_content=creditCopyText){:target="_blank"}


Origin

I have always had a habit of building small tools with Python; some are serious, like automatically scraping data and generating reports for work, while others are more casual, such as scheduling automatic checks for information I want, or handing tasks I would normally do by hand over to a script.

For a long time, my approach to automation was brute force: simply running a Python script on a computer. The advantage is that it is simple and convenient; the downside is that you need a device with internet and power. Even a Raspberry Pi still incurs small electricity and internet costs, and it cannot easily be started or stopped remotely (it can be done, but it is cumbersome). This time, taking advantage of a break at work, I researched a free, fully cloud-based approach.

Goal

To run Python scripts in the cloud, execute them automatically on a schedule, and be able to start/stop them via the internet.

This article uses a clever trick I devised, writing an automatic check-in script for a check-in reward app that can check in for me daily without needing to open the app; it also sends me a notification after execution.

Completion Notification!


Chapter Order

  1. Using Proxyman for Man-in-the-Middle API sniffing
  2. Writing a Python script to spoof the app’s API request (simulating the check-in action)
  3. Deploying the Python script to Google Cloud
  4. Setting up automatic scheduling in Google Cloud
  • Due to the sensitivity of the topic, this article will not disclose which check-in reward app is used; readers can apply the same approach to their own use case.
  • If you only want to understand how to automate execution with Python, you can skip the first half about Man-in-the-Middle API sniffing and start from Chapter 3.

Tools Used

  • Proxyman: Man-in-the-Middle API sniffing
  • Python: Writing scripts
  • Linebot: Sending notifications of script execution results to myself
  • Google Cloud Functions: Hosting service for Python scripts
  • Google Cloud Scheduler: Automatic scheduling service

1. Using Proxyman for Man-in-the-Middle API Sniffing

Previously, I wrote an article titled “Apps Use HTTPS for Transmission, Yet Data Still Gets Stolen,” which is based on the same principle; this time I used Proxyman instead of mitmproxy. It is also free, but more user-friendly.

  • Download the Proxyman tool from the official website https://proxyman.io/.
  • After downloading, launch Proxyman and install the Root certificate (to perform the Man-in-the-Middle attack and unpack HTTPS traffic content).

“Certificate” -> “Install Certificate On this Mac” -> “Installed & Trusted”

After installing the Root certificate on the computer, switch to the phone:

“Certificate” -> “Install Certificate On iOS” -> “Physical Devices…”

Follow the instructions to set up the proxy on your phone and complete the certificate installation and activation.

  • Open the app you want to sniff API transmission content on your phone.

At this point, Proxyman on the Mac will display the sniffed traffic. Click on the device IP under the app API domain you want to view; the first time you check, you need to click “Enable only this domain” so that subsequent traffic can be unpacked.

After clicking “Enable only this domain,” you will see the intercepted traffic with the original Request and Response information:

We use this method to sniff which API endpoint is called and what data is sent when checking in on the app, recording this information to simulate requests directly with Python later.

⚠️ Note that some app tokens may change, causing the Python simulation requests to fail later; you need to understand how the app token exchange works.

⚠️ If you confirm that Proxyman is functioning normally but the app cannot make requests while Proxyman is running, it indicates that the app may have implemented SSL Pinning; currently, there is no solution, and you will have to abandon it.

⚠️ App developers wanting to know how to prevent sniffing can refer to the previous article.

Assuming we obtained the following information:

```
POST /usercenter HTTP/1.1
Host: zhgchg.li
Content-Type: application/x-www-form-urlencoded
Cookie: PHPSESSID=dafd27784f94904dd586d4ca19d8ae62
Connection: keep-alive
Accept: */*
User-Agent: (iPhone12,3;iOS 14.5)
Content-Length: 1076
Accept-Language: zh-tw
Accept-Encoding: gzip, deflate, br
AuthToken: 12345
```

2. Writing a Python Script to Spoof the App’s API Request (Simulating the Check-in Action)

Before writing the Python script, we can use Postman to debug the parameters and observe which parameters are necessary or may change over time; however, you can also directly copy them.

checkIn.py:

```python
import requests
import json

def main(args):
    try:
        data = {"action": "checkIn"}
        headers = {
            "Cookie": "PHPSESSID=dafd27784f94904dd586d4ca19d8ae62",
            "AuthToken": "12345",
            "User-Agent": "(iPhone12,3;iOS 14.5)"
        }

        request = requests.post('https://zhgchg.li/usercenter', data=data, headers=headers)
        result = json.loads(request.content)
        if result['status_code'] == 200:
            return "CheckIn Success!"
        else:
            return result['message']
    except Exception as e:
        return str(e)
```

⚠️ The main(args) signature here will be explained later; if you want to test locally, just call main(True).

The script uses the Requests library to send the HTTP request. If you encounter:

```
ImportError: No module named requests
```

please install the package first with pip install requests.

Adding Linebot Notification for Execution Results:

This part is simple; I only use it to send notifications to myself.

  • In the Line Developers console, choose “Create a Messaging API channel.”

Fill in the basic information on the next step and click “Create” to submit.

  • After creation, find the “Your user ID” section under the first “Basic settings” tab; this is your User ID.

  • After creation, select the “Messaging API” tab, scan the QR code to add the bot as a friend.

  • Scroll down to find the “Channel access token” section and click “Issue” to generate the token.

  • Copy the generated token; with this token, we can send messages to users.

With the User ID and Token, we can send messages to ourselves.

Since there are no other functions to implement, I didn’t even install the Python Line SDK; I just send HTTP requests directly.
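The push request needs two pieces: the JSON payload and the authorization header. Here is a minimal sketch as plain helper functions (my own factoring, not the official SDK; the user ID below is a placeholder):

```python
# Sketch of the payload and headers for Line's push-message endpoint,
# https://api.line.me/v2/bot/message/push. "U1234567890" is a
# placeholder user ID, not a real one.
def build_line_push(user_id, message):
    return {
        "to": user_id,
        "messages": [{"type": "text", "text": message}],
    }

def build_line_headers(channel_access_token):
    # The Messaging API expects "Authorization: Bearer <token>"
    return {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + channel_access_token,
    }

payload = build_line_push("U1234567890", "CheckIn Success!")
```

Passing these to requests.post(..., json=payload, headers=build_line_headers(token)) is all the "SDK" this use case needs.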

After integrating with the previous Python script…

checkIn.py:

```python
import requests
import json

def main(args):
    try:
        data = {"action": "checkIn"}
        headers = {
            "Cookie": "PHPSESSID=dafd27784f94904dd586d4ca19d8ae62",
            "AuthToken": "12345",
            "User-Agent": "(iPhone12,3;iOS 14.5)"
        }

        request = requests.post('https://zhgchg.li/usercenter', data=data, headers=headers)
        result = json.loads(request.content)
        if result['status_code'] == 200:
            sendLineNotification("CheckIn Success!")
            return "CheckIn Success!"
        else:
            sendLineNotification(result['message'])
            return result['message']
    except Exception as e:
        sendLineNotification(str(e))
        return str(e)

def sendLineNotification(message):
    data = {
        "to": "insert your User ID here",
        "messages": [
            {
                "type": "text",
                "text": message
            }
        ]
    }
    headers = {
        "Content-Type": "application/json",
        # The Messaging API expects "Bearer <channel access token>"
        "Authorization": "Bearer insert your channel access token here"
    }
    requests.post('https://api.line.me/v2/bot/message/push', json=data, headers=headers)
```

Let’s test to see if the notification was sent successfully:

Success!

As a side note, I initially intended to use Gmail SMTP to send notifications via email, but after deploying to Google Cloud, I found it couldn’t be used…

3. Deploying the Python Script to Google Cloud

Now that the basics are covered, we will officially move on to the main event of this article: deploying the Python script to the cloud.

Initially, I considered Google Cloud Run but found it overkill for my simple needs; therefore, I opted for Google Cloud Functions, a serverless solution commonly used to build lightweight web services.

  • If you haven’t used Google Cloud before, please go to the Console to create a new project and set up billing information.
  • On the project console homepage, click “Cloud Functions” under resources.

  • At the top, select “Create Function.”

  • Enter the basic information.

⚠️ Make sure to note the “Trigger URL.”

Available regions:

  • us-west1, us-central1, and us-east1 enjoy free Cloud Storage service quotas.
  • asia-east2 (Hong Kong) is closer to Taiwan but incurs a small Cloud Storage fee.

⚠️ When creating Cloud Functions, Cloud Storage will be needed to host the code.

⚠️ For detailed pricing, please refer to the end of the article.

Trigger condition: HTTP

Authentication: Based on my needs, I want to be able to execute the script from an external link, so I choose “Allow unauthenticated invocations”; if authentication is required, the subsequent Scheduler service will also need corresponding settings.

Under “Variables, networking, and advanced settings,” you can define runtime environment variables for Python to read (this way, if a parameter changes, you won’t need to modify the Python code):

How to read them in Python:

```python
import os

def main(request):
    return os.environ.get('test', 'DEFAULT VALUE')
```
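Applying this to the check-in script, the session values could come from such variables, so rotating them only requires editing the function configuration, not the code. A sketch with hypothetical variable names (PHPSESSID, AUTH_TOKEN):

```python
import os

# Sketch: build the request headers from runtime environment variables.
# PHPSESSID and AUTH_TOKEN are hypothetical names you would define in
# the function's "Variables" section.
def build_headers():
    return {
        "Cookie": "PHPSESSID=" + os.environ.get("PHPSESSID", ""),
        "AuthToken": os.environ.get("AUTH_TOKEN", ""),
        "User-Agent": "(iPhone12,3;iOS 14.5)",
    }
```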

No other settings need to be changed; just click “Save” -> “Next.”

  • Select the runtime as “Python 3.x” and paste the written Python script, changing the entry point to “main.”

A note on main(args): as mentioned earlier, this service is geared toward serverless web applications, so args is actually the Request object (Flask), from which you can retrieve the HTTP GET query string and the HTTP POST body, as follows:

To get GET query parameters:

```python
request_args = args.args
```

Example: ?name=zhgchgli => request_args = {"name": "zhgchgli"}

To get the POST body as JSON:

```python
request_json = args.get_json(silent=True)
```

Example: a JSON body {"name": "zhgchgli"} => request_json = {"name": "zhgchgli"}

If you test the POST with Postman, remember to send the body as “Raw + JSON”; otherwise, the function will receive nothing.
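Whichever way a parameter arrives, the function can look it up in one place. A sketch using plain dicts to mimic args.args and get_json(silent=True), which returns None when the body is not valid JSON (a hypothetical helper, not part of the deployed script):

```python
# Sketch: resolve a parameter from the GET query first, then the POST
# JSON body, then a default. `args` mimics request.args; `body` mimics
# request.get_json(silent=True), which is None for a non-JSON body.
def get_param(args, body, key, default=None):
    if args and key in args:
        return args[key]
    if body and key in body:
        return body[key]
    return default
```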

  • After the code section is okay, switch to “requirements.txt” and input the necessary package dependencies:

We use the “requests” package to call the API; it is not part of the Python standard library, so we need to declare it here:

```
requests>=2.25.1
```

You can pin the version as >=2.25.1, or simply enter requests to install the latest version.

  • Once everything is okay, click “Deploy” to start the deployment.

It will take about 1-3 minutes for the deployment to complete.

  • After deployment, you can execute the script using the “Trigger URL” noted earlier to check if it runs correctly, or use “Actions” -> “Test Function” to perform a test.

If you see 500 Internal Server Error, it means there is an error in the code. You can click on the name to view the “Logs” and find the reason:

```
UnboundLocalError: local variable 'db' referenced before assignment
```

  • After clicking on the name, you can also click “Edit” to modify the script content.

If testing is successful, we have successfully deployed the Python script to the cloud!

Supplement on Variables

For our use case, we need somewhere to store and read the check-in app’s token; since the token may expire, we have to request a new one and write it back for the next execution.

To dynamically pass variables into the script from external sources, there are the following methods:

  • [Read Only] The previously mentioned runtime environment variables.
  • [Temp] Cloud Functions provides a /tmp directory for writing and reading files during execution, but it will be deleted after the execution ends; for details, please refer to the official documentation.
  • [Read Only] Sending data via GET/POST.
  • [Read Only] Including additional files.

In the program, you can read such files via the relative path ./, but they are read-only; they cannot be modified dynamically. To change them, you have to edit them in the console and redeploy.

If you want to both read and dynamically modify data, you need to connect to another GCP service, such as Cloud SQL, Google Storage, or Firebase Cloud Firestore.

  • [Read & Write] Here, I chose Firebase Cloud Firestore, because it currently offers a free quota.

Following the Getting Started steps (https://firebase.google.com/docs/firestore/quickstart#read_data), set up the Firebase project and enter the Firebase console:

In the left menu, find “Cloud Firestore” -> “Add Collection.”

Enter the collection ID.

Enter the data content.
A collection can have multiple documents, and each document can have its own field content; it is very flexible to use.

Using it in Python:

First, go to GCP Console -> IAM & Admin -> Service Accounts (https://console.cloud.google.com/iam-admin/serviceaccounts) and follow the steps below to download the authentication private key file.

First, select the account.

At the bottom, choose “Add Key” -> “Create New Key.”

Select “JSON” to download the file.

Place this JSON file in the same project directory as your Python code.

In the local development environment, install the firebase-admin package:

```bash
pip install --upgrade firebase-admin
```

In Cloud Functions, add firebase-admin to requirements.txt instead.

Once the environment is set up, you can read the data we just added:

firebase_admin.py:

```python
import firebase_admin
from firebase_admin import credentials
from firebase_admin import firestore

# Calling initialize_app() twice throws an error ("…In most cases, you
# only need to call initialize_app() once…"), so for safety, check
# whether it has already been initialized first.
if not firebase_admin._apps:
    cred = credentials.Certificate('./authentication.json')
    firebase_admin.initialize_app(cred)

db = firestore.client()
ref = db.collection(u'example')  # Collection name
stream = ref.stream()
for data in stream:
    print("id:" + data.id + "," + str(data.to_dict()))
```

If you are on Cloud Functions, besides uploading the authentication JSON file, you can instead initialize the connection with application default credentials:

```python
cred = credentials.ApplicationDefault()
firebase_admin.initialize_app(cred, {
    'projectId': project_id,
})

db = firestore.client()
```

If you see Failed to initialize a certificate credential., check that the authentication JSON file is correct.

For adding, deleting, and more operations, please refer to the official documentation.
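Tying this back to our need to persist a refreshed token between runs, here is a minimal sketch (the "tokens" collection and "checkin" document are my own hypothetical names; db is the firestore client from above):

```python
# Sketch: read/write the app token in Firestore so a refreshed token
# survives between executions. "tokens"/"checkin" are hypothetical
# names; `db` is a firestore.client() instance.
def load_token(db):
    snapshot = db.collection("tokens").document("checkin").get()
    data = snapshot.to_dict() or {}  # to_dict() is None if the doc is missing
    return data.get("value")

def save_token(db, value):
    db.collection("tokens").document("checkin").set({"value": value})
```

On each run, the script would call load_token(db), use or refresh the token, and call save_token(db, new_token) when it changes.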

4. Setting Up Automatic Scheduling in Google Cloud

Now that we have the script, we need to make it run automatically to achieve our ultimate goal.

  • Open “Cloud Scheduler” in the console, click “Create Job,” and enter the basic job information.

Execution Frequency: the input format is the same as crontab. If you are unfamiliar with crontab syntax, you can use the excellent site crontab.guru:

It can clearly translate the actual meaning of the syntax you set. (Click next to see the next execution time.)

Here, I set 15 1 * * *, because the check-in only needs to run once a day, scheduled for 1:15 AM daily.
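For this simple daily shape (minute and hour fixed, everything else *), the next execution time can even be computed by hand. A sketch assuming naive local datetimes (crontab.guru handles arbitrary expressions):

```python
from datetime import datetime, timedelta

# Sketch: next run of a daily "M H * * *" schedule, the only crontab
# shape used here (15 1 * * * -> 01:15 every day).
def next_daily_run(minute, hour, now):
    candidate = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if candidate <= now:  # today's slot already passed -> tomorrow
        candidate += timedelta(days=1)
    return candidate
```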

URL: Enter the previously noted “Trigger URL.”

Timezone: Enter “Taiwan”, select Taipei Standard Time.

HTTP Method: According to the previous Python code, we can just use GET.

If you set “Authentication” earlier, remember to expand “SHOW MORE” to configure the authentication settings.

Once everything is filled out, click “Create.”

  • After successful creation, you can choose “Run Now” to test if it works properly.

  • You can check the execution results and the last execution date.

⚠️ Please note that execution results marked as “failed” only reflect HTTP status codes in the 4xx/5xx range or errors in the Python code.
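If you want a failed check-in to show up as a failed run in Scheduler, the function itself must return an error status. HTTP Cloud Functions follow Flask conventions, where returning a (body, status) tuple sets the response code; a sketch mapping checkIn.py's return string (my own convention, not part of the article's deployed code):

```python
# Sketch: turn the string checkIn.py returns into an HTTP response
# tuple. Flask (and thus HTTP Cloud Functions) treats (body, status)
# as body + status code; Scheduler marks 4xx/5xx runs as failed.
def to_response(result):
    if result == "CheckIn Success!":
        return (result, 200)
    return (result, 500)  # surfaces app errors/exceptions as failed runs
```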

Mission Accomplished!

We have achieved the goal of uploading the routine task Python script to the cloud and setting up automatic scheduling for it to run automatically.

Pricing Method

Another important aspect is pricing; Google Cloud and Linebot are not entirely free, so understanding how billing works is crucial. Otherwise, for a small script, paying too much makes it not worth it compared to just running it on a computer.

Linebot

Refer to the official pricing information, which allows for 500 messages per month for free.

Google Cloud Functions

Refer to the official pricing information, which includes 2 million invocations per month, 400,000 GB-seconds, and 200,000 GHz-seconds of compute time, along with 5 GB of outbound internet traffic.

Google Firebase Cloud Firestore

Refer to the official pricing information, which includes 1 GB of storage, 10 GB of traffic per month, 50,000 reads, and 20,000 writes/deletes per day; this is sufficient for light usage!

Google Cloud Scheduler

Refer to the official pricing information, which allows for 3 free jobs to be set up per account.

For scripts, the above free usage is more than enough!

Google Cloud Storage Conditional Free Tier

Despite trying to avoid it, there are still services that may incur charges.

Once Cloud Functions are created, two Cloud Storage instances will be automatically created:

If you selected US-WEST1, US-CENTRAL1, or US-EAST1 for Cloud Functions, you can enjoy free usage quotas:

I chose US-CENTRAL1, and indeed, the first Cloud Storage instance is in US-CENTRAL1, but the second is labeled multiple regions in the US; I estimate that this will incur charges.

Refer to the official pricing information, which varies based on the region of the host.

The code is not large, but I estimate it should incur a minimum charge of 0.0X0 per month (?)

⚠️ The above information was recorded as of 2021/02/21; actual prices may vary, and this is for reference only.

Budget Control Notifications

Just in case the free quota is exceeded and charges begin, I want to be notified immediately, to avoid a bug in the program silently running up an unexpected bill…

  • Go to the Console
  • Find the “Billing” card:

Click “View Detailed Billing History” to enter.

  • Expand the left menu and go to the “Budgets & Alerts” feature.

  • Click “Create Budget” at the top.

  • Enter a custom name.

Next step.

  • For the amount, enter the “Target Amount,” which can be $1 or $10; we don’t want to spend too much on small things.

Next step.

In the actions section, you can set notifications to trigger when the budget reaches a certain percentage.

Check “Send alerts via email to billing administrators and users,” so that when the conditions are met, you receive a notification immediately.

Click “Finish” to submit and save.

When the budget is exceeded, we will know immediately and can avoid incurring further costs.

Summary

Human energy is limited, and in today’s flood of technological information, every platform and service competes for our limited attention; if we can hand some daily chores to automation scripts, bit by bit we can save more energy to focus on what matters!

Further Reading

If you have any questions or suggestions, feel free to contact me.

If you have any automation-related optimization needs, feel free to hire me, thank you.



This article was first published on Medium ➡️ Click Here

Automatically converted and synchronized using ZMediumToMarkdown and Medium-to-jekyll-starter.

This post is licensed under CC BY 4.0 by the author.