How To Execute Python Script As A Service On Linux & Windows

Scraping Robot
April 14, 2023

Are you wondering how to run a Python script in the background while you do other things? You can do so easily on both Linux and Windows by running the Python program as a service. That way, the program/service will start once the device powers on and stop as it shuts down, or you manually stop it.


Python remains the most popular programming language globally because of its easy syntax, but if you don’t have coding experience, you’ll need a tutorial to run a Python script as a service on Linux or Windows.

In this article, you’ll learn how to run Python code in the background on Linux and Windows. Once we’ve covered how to run a Python script for scraping data, we’ll show you how to do it the easy way!

Why Learn How To Run a Python Script in the Background?

One of the most frequent uses of Python is web scraping. Web scraping involves programs crawling websites and parsing their HTML to extract data. Businesses may want to collect data for research or competitor analysis, and they need to do it continuously, because many websites are dynamic and constantly update their data.

You could run your scraping code by hand, but keeping up with such websites would mean rerunning it over and over, which quickly becomes a hassle.

A better solution is to write a script for scraping data and run it in the background. You can specify the websites the program should visit, the data it should collect, and where it should store it.

You can continue to run other programs on your device and do other tasks while the scraping script does its job silently in the backdrop.

How To Run a Python Script in Linux

If you want to learn how to run a Python script in Linux distributions, you’ve come to the right place.

On Linux, you can run a Python script in the background from the terminal or through your distribution’s graphical interface. The former is more straightforward, so let’s talk about how to run a Python file in a terminal.

Here’s how to run a Python script for scraping as a Linux daemon (service).

Step 1: Write a script for collecting the HTML content of web pages

The first step is to write a script that loops through a list of URLs and collects the HTML content of each page. Running the script in a loop means the data is refreshed on every pass, so it always reflects the latest version of the websites.

import requests

# List of pages to scrape; replace the placeholders with real URLs
urls = [
    'url1',
    'url2',
]

index = 0
while True:
    # Cycle through the URL list indefinitely
    url = urls[index % len(urls)]
    index += 1

    print('Scraping url', url)
    response = requests.get(url)

Enter the appropriate URLs in place of 'url1', 'url2', and so on.

Step 2: Parse data with the BeautifulSoup library

Once you have collected the HTML from each URL, it’s time to parse it for useful information. The BeautifulSoup Python library extracts data from HTML and XML documents. This step is straightforward: pass the response to BeautifulSoup, pull out the fields you need, and save them in JSON format:

from bs4 import BeautifulSoup

soup = BeautifulSoup(response.content, 'html.parser')
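To round out the “save as JSON” part of this step, here is a minimal sketch that pulls the page title out of the parsed soup and appends it to a results file, reusing the url and soup variables from the snippets above; the title field and the results.json path are assumptions for illustration, not part of the original script:

import json

# Pick out whatever fields you care about; the page title is used here as a stand-in
record = {
    'url': url,
    'title': soup.title.string if soup.title else None,
}

# Append one JSON record per line (results.json is an assumed output path)
with open('results.json', 'a', encoding='utf-8') as f:
    f.write(json.dumps(record) + '\n')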

Step 3: Handle shutdown requests

This is how to run a Python script without interruptions: add a signal-handling class so the operating system lets the current iteration finish before the service shuts down.

For example:

import signal

class SignalHandler:
    shutdown_requested = False

    def __init__(self):
        # Catch Ctrl+C (SIGINT) and service stop requests (SIGTERM)
        signal.signal(signal.SIGINT, self.request_shutdown)
        signal.signal(signal.SIGTERM, self.request_shutdown)

    def request_shutdown(self, *args):
        print('Service stopping, shut down request received')
        self.shutdown_requested = True

    def can_run(self):
        return not self.shutdown_requested
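The class only records that a shutdown was requested; the main loop still has to check it. Below is a minimal sketch of how the pieces from the previous steps fit together, reusing the imports and the urls list from the earlier snippets. The 60-second pause between passes is an assumption, not something specified in the article:

import time

signal_handler = SignalHandler()
index = 0

while signal_handler.can_run():
    url = urls[index % len(urls)]
    index += 1

    print('Scraping url', url)
    response = requests.get(url)
    soup = BeautifulSoup(response.content, 'html.parser')
    # ...parse and save the data here, as in Step 2...

    # Pause between passes so the target sites aren't hammered
    time.sleep(60)

print('Service stopped cleanly')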

Running a Linux Daemon With systemd

Now that you know how to run a Python script in the background from a terminal, let’s look at a more robust option. You could also launch the script through your distribution’s graphical interface, but neither approach guarantees the script comes back after a reboot. The better way is to register it with systemd, the service manager used by most modern Linux distributions.

Create a file with a .service extension in the systemd directory, /etc/systemd/system (for example, cd /etc/systemd/system and open a new file there).

Using any editor, describe how the service should run: point ExecStart at the Python interpreter and your script, and set parameters such as After and Restart.

Then run systemctl daemon-reload so systemd picks up the new unit, and start the service with systemctl start. A minimal example follows.
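Here is a sketch of what such a unit file and the accompanying commands might look like; the scraper.service name, the /opt/scraper paths, and the scraper user are assumptions you should replace with your own values:

# /etc/systemd/system/scraper.service (assumed name and paths)
[Unit]
Description=Python scraping service
After=network.target

[Service]
ExecStart=/usr/bin/python3 /opt/scraper/scraper.py
WorkingDirectory=/opt/scraper
Restart=always
User=scraper

[Install]
WantedBy=multi-user.target

sudo systemctl daemon-reload
sudo systemctl start scraper.service
sudo systemctl enable scraper.service

The enable command makes the service start automatically at boot.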

How To Run a Python Script in Windows

Running a Python script as a Windows service isn’t as simple as it is on Linux. With the native approach, you have to change the code to define how the script responds to commands it receives from the service manager, and you also have to set a number of service properties.

It can get quite complicated. A better way to run Python code as a service on Windows is the Non-Sucking Service Manager (NSSM). NSSM is an application that lets Windows users turn any executable or batch script into a service. Windows treats the result like any other service, so you can manage it from the Services console (services.msc) or the Services tab in Task Manager.

The best part is that you don’t need to change your script. Just download NSSM from the official website and extract it; nssm.exe is a standalone executable, so it’s best to add its folder to your PATH environment variable so you can call it from any directory.

To create and run the services, launch the Windows terminal. Change the directory to where your Python script exists.

Enter the commands to install the service and then start it. Here’s an example using a script that has been bundled into an executable named Python_BackgroundScript.exe:

nssm.exe install Python_BackgroundScript C:\Users\User\Username\dist\Python_BackgroundScript.exe

nssm.exe start Python_BackgroundScript
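If your script hasn’t been bundled into an .exe, you can point NSSM at the Python interpreter and pass the script as an argument instead. A sketch, assuming Python is installed at C:\Python311 and the script lives at C:\scripts\scraper.py (both paths are assumptions):

nssm.exe install Python_BackgroundScript "C:\Python311\python.exe" "C:\scripts\scraper.py"

nssm.exe start Python_BackgroundScript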

Now, check if the service is running successfully. You can look for it in Task Manager.

Should there be issues, you can redirect the service’s error output to a file with the following command in the terminal:

nssm set Python_BackgroundScript AppStderr C:\Users\User\Username\service-error.log
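Standard output can be captured the same way through the AppStdout parameter, and the service needs a restart before new settings take effect; the log path below is an assumption that mirrors the example above:

nssm set Python_BackgroundScript AppStdout C:\Users\User\Username\service-output.log

nssm restart Python_BackgroundScript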

That’s all it takes: a couple of NSSM commands and the Python code is running in the background, with no need to babysit it in the terminal. From now on, the script will start automatically every time Windows starts.

Thanks to NSSM, you don’t even have to learn how to run a Python script as a service manually.

An Easier Alternative To Running Python Scripts for Scraping

You don’t need to be a coding expert to know how to run a Python script as a service. Instead of coding the script and using a terminal to run the service, you can use the Scraping Robot API.

A scraping API automatically collects website data and integrates it with other software through API calls. It automates the scraping process, so you don’t have to run code by hand every time you need data from specific websites, and the results come back already organized.

Scraping Robot already has all the code for collecting data, so you don’t need to create your own. With a single API endpoint, you can send the HTTP request to collect data using the appropriate credentials. It’s simple, fast, and convenient, and makes scraping data easy for everyone.
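As an illustration only, a request to a scraping API generally looks something like the sketch below; the endpoint URL, the token parameter, and the response handling are placeholders rather than Scraping Robot’s actual interface, so refer to the official documentation for the real details:

import requests

# Hypothetical endpoint and credentials -- replace with the values from your account
API_URL = 'https://api.example.com/scrape'
API_TOKEN = 'your-api-token'

response = requests.post(API_URL, json={'token': API_TOKEN, 'url': 'https://example.com'})
response.raise_for_status()
print(response.json())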

Why use Scraping Robot?

Even if you’re confident you know how to run a Python script as a service or daemon for scraping, using a bot is a much better alternative. Here’s why.

Prebuilt and custom modules

Scraping Robot provides several modules for parsing and scraping data that cater to different business needs. The prebuilt modules eliminate the need to define the kind of data you’re looking for.

Getting a custom module is also possible should you need more complex data. With these modules, you get the exact data you need from different websites. No more spending hours refining and filtering data.

Speedy automation

Scraping Robot scrapes data automatically based on the parameters defined. As a result, it can parse hundreds of web pages in minutes. Plus, little to no intervention is required as the program runs independently, collecting data and storing it in the appropriate location.

Easy to use

Not every business has a developer on its payroll. The Scraping Robot API lets even those who aren’t tech-savvy gather the data they need. Because the API already contains the code to run the scraping bot, you don’t need to hire or outsource a developer.

Try Scraping Robot Instead!

Now that you know how to run a Python script in Linux and Windows, you can see why Scraping Robot is the ideal solution. On both Linux and Windows, you have to set the script up as a service yourself and add safeguards to make sure it runs without disruption.

But even if you’re sure you know how to run a Python script in the background, errors are always a possibility. The Scraping Robot API provides a viable alternative. It’s cost-effective and convenient, ideal for eCommerce websites and companies that scrape websites to collect data on competitors and the market.

The information contained within this article, including information posted by official staff, guest-submitted material, message board postings, or other third-party material is presented solely for the purposes of education and furtherance of the knowledge of the reader. All trademarks used in this publication are hereby acknowledged as the property of their respective owners.