Subjectio’s Digital Tool is an MIT-licensed open source project, written in Python 3.6 using the Flask framework.
The application has been programmed with Twitter’s best practices in mind, ensuring full compliance and fair use of the service.

MongoDB, accessed through Mongoengine, drives the database and allows Subjectio’s digital tool to run in persistent mode, posting replies to tweets from a Job Queue.
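
For reference, a minimal sketch of how such a Mongoengine connection can be set up (the database name and connection parameters below are assumptions, not taken from the project):

# Hypothetical sketch: connect the application to MongoDB through Mongoengine.
# The database name "subjectio" and the host/port are illustrative assumptions.
from mongoengine import connect

connect(db="subjectio", host="localhost", port=27017)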

The application is composed of two coupled parts:

1. A control panel, written with the Flask framework, that manages Payloads.

2. A daemon that processes operations posted in the Job Queue and makes calls to the Twitter Application Programming Interface (API) using the Tweepy Python library. A Unix-like operating system is required to run the application; Debian Linux is recommended.

Subjectio’s digital tool uses a database to record the acquired and processed information. Tweets matching the search queries embedded in the Payloads are saved in a dedicated collection called Found Tweets. When a new tweet is found to trigger a reply, it is added to the job collection named Job Queue. The Job Queue associates data from the Payloads and Found Tweets and adds further information, for example whether the job has already been processed and, if so, when it was completed.
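
The exact schemas live in the project’s source code; a simplified Mongoengine sketch of the collections described above could look like this (all field names are illustrative assumptions):

# Hypothetical, simplified Mongoengine models for the Payloads, Found Tweets
# and Job Queue collections; the real definitions are in the GitHub repository.
from mongoengine import (Document, StringField, BooleanField,
                         DateTimeField, ReferenceField)

class Payload(Document):
    query = StringField(required=True)        # Twitter search query
    reply_text = StringField(required=True)   # text of the reply
    photo_path = StringField()                # path to the uploaded photograph
    active = BooleanField(default=False)      # the "on/off" switch

class FoundTweet(Document):
    tweet_id = StringField(required=True, unique=True)
    payload = ReferenceField(Payload)         # FK to the matching Payload

class JobQueue(Document):
    payload = ReferenceField(Payload)         # FK: which Payload to reply with
    found_tweet = ReferenceField(FoundTweet)  # FK: which tweet to reply to
    done = BooleanField(default=False)        # has the job been processed?
    completed_at = DateTimeField()            # when the reply was sent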

FK = Foreign Key. A Foreign Key marks unique values from other tables / collections in the database schemas.
The user logs in to the control panel in a web browser and adds a new Payload on the “Payloads” page. A Payload contains a Twitter search query, which may use advanced Twitter search features. In addition to the query, a reply consisting of a text and a photograph is uploaded. The photograph’s resolution is lowered so that it can be sent faster through the Twitter Application Programming Interface (API).
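
The documentation does not specify how the resolution is lowered; a minimal sketch using Pillow, with an assumed maximum size, might look like this:

# Hypothetical sketch: downscale an uploaded photograph before it is attached
# to a reply. The 1280 px bound and the JPEG quality are assumptions.
from PIL import Image

def shrink_photo(src_path: str, dst_path: str, max_side: int = 1280) -> None:
    img = Image.open(src_path)
    img.thumbnail((max_side, max_side))            # keeps the aspect ratio
    img.save(dst_path, quality=85, optimize=True)
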
The user activates a Payload by switching the “on/off” button to “on” on the Payloads page. An active Payload is then processed by the system: active Payloads are constantly taken into account when searching for new tweets. A Payload can be deactivated by switching the “on/off” button to “off” on the Payloads page, after which Subjectio’s Digital Tool stops sending replies and searching for tweets for that Payload. The application works with multiple Payloads activated at the same time.
A Payload can be added with the “Add Payload” button at the top right of the application’s Payloads page. All Payloads are displayed on the Payloads page and can be activated or deactivated with the “On/Off” button. The number of replies sent to tweets with @subjectio_relay is displayed in the “Sent” column. A URL address written in a Payload counts as 18 characters towards the maximum of 280 characters; the system takes this character count into account in order to post the reply correctly. Deactivated Payloads can be removed by clicking on the bin icon.
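
A minimal sketch of how this length check can be implemented, assuming (as stated above) that every URL counts as 18 characters regardless of its real length:

# Hypothetical helper: compute the effective length of a reply, with every
# URL counted as a fixed 18 characters, so the 280-character limit can be
# enforced before the reply is posted.
import re

URL_RE = re.compile(r"https?://\S+")
URL_COST = 18
TWEET_LIMIT = 280

def effective_length(text: str) -> int:
    without_urls = URL_RE.sub("", text)
    n_urls = len(URL_RE.findall(text))
    return len(without_urls) + n_urls * URL_COST

def fits_in_tweet(text: str) -> bool:
    return effective_length(text) <= TWEET_LIMIT
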
The daemon part of the application periodically iterates over the list of activated Payloads and asks Twitter for results matching their queries. A Job Queue is created from the tweets matching the search queries. Subjectio’s digital tool processes the Job Queue and sends replies from @subjectio_reply, a Twitter account operated through the API. The process is kept within Twitter’s hourly API post limits: replies are sent every 36 seconds, a timing based on the use of a Twitter developer account. The user can monitor the current API limits on the “API” page, which also offers the option to cancel all interactions with Twitter in case of emergency, such as a mistake in the contents.
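
A hedged sketch of the reply step in that loop, assuming Tweepy 3.x and the hypothetical JobQueue model sketched earlier:

# Hypothetical sketch of the daemon's reply step: take pending jobs from the
# Job Queue, reply with text and photo, then wait 36 seconds before the next one.
import time
from datetime import datetime
import tweepy

def process_queue(api: tweepy.API) -> None:
    for job in JobQueue.objects(done=False):       # pending jobs only
        media = api.media_upload(job.payload.photo_path)
        api.update_status(
            status=job.payload.reply_text,
            in_reply_to_status_id=job.found_tweet.tweet_id,
            auto_populate_reply_metadata=True,
            media_ids=[media.media_id],
        )
        job.done = True
        job.completed_at = datetime.utcnow()
        job.save()
        time.sleep(36)                             # stay within the hourly post limit
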
The Tweepy Python library is responsible for interactions with the Twitter API. If the application attempts to exceed an API limit, Tweepy pauses it until the limit is reset. The loop that processes Twitter search queries is constrained by separate Twitter search limits and runs every 80 seconds. The default configuration of the application allows a maximum of 800 tweets from the last 7 days for the first advanced search of a new Payload; 7 days is the maximum look-back for free Twitter developer accounts.
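
A sketch of how the search side can be set up so that Tweepy itself pauses the application when a rate limit is hit; the 800-tweet cap mirrors the default configuration mentioned above, and the credentials are placeholders:

# Hypothetical sketch (Tweepy 3.x): a client that waits on rate limits, plus a
# capped search returning at most 800 recent tweets for a new Payload.
import tweepy

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth, wait_on_rate_limit=True, wait_on_rate_limit_notify=True)

def first_search(query: str, max_tweets: int = 800):
    # The standard (free) search endpoint only covers roughly the last 7 days.
    return list(tweepy.Cursor(api.search, q=query, result_type="recent",
                              tweet_mode="extended").items(max_tweets))
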
Logging events
Every response is archived so that it can be located easily in case of unexpected events; a full audit log is required, covering both the system’s daemon elements and the control panel. Logs are collected on the server in the “./logs” directory.
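
A minimal sketch of a logger writing into the “./logs” directory; the file name and rotation policy are assumptions:

# Hypothetical logging setup collecting the audit trail in ./logs;
# the file name and rotation size are illustrative assumptions.
import logging
import os
from logging.handlers import RotatingFileHandler

os.makedirs("./logs", exist_ok=True)
handler = RotatingFileHandler("./logs/bot.log", maxBytes=5_000_000, backupCount=10)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

log = logging.getLogger("subjectio")
log.setLevel(logging.INFO)
log.addHandler(handler)
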
Error handling
Error handling is implemented throughout the application, which constantly has to deal with various failure conditions. When the system receives a Twitter error response in JavaScript Object Notation (JSON), such as the examples below, the task is marked as completed to prevent a second processing attempt in the Job Queue.

[{'code': 433, 'message': 'The original Tweet author restricted who can reply to this Tweet.'}]
[{'code': 385, 'message': 'You attempted to reply to a Tweet that is deleted or not visible to you.'}]
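
A hedged sketch of how such error responses can be caught with Tweepy 3.x so that the job is still marked as done; the error codes come from the examples above, the rest is an assumption:

# Hypothetical sketch: catch Twitter error responses while replying and mark
# the job as done so it is not retried (codes 433 and 385 as shown above).
import tweepy

SKIPPABLE_CODES = {433, 385}   # replies restricted / tweet deleted or hidden

def send_reply_safely(api: tweepy.API, job) -> None:
    try:
        api.update_status(
            status=job.payload.reply_text,
            in_reply_to_status_id=job.found_tweet.tweet_id,
            auto_populate_reply_metadata=True,
        )
    except tweepy.TweepError as err:
        if err.api_code in SKIPPABLE_CODES:
            pass        # give up on this tweet, but do not try it again
        else:
            raise       # unexpected error: surface it
    job.done = True
    job.save()
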
Example of the log file on the Monitor page:
---------- Bot iteration nr 29, started: 2020-09-17 13:03:12.468679 -----------
[PROCESS_QUEUE] processing JobQueue id: 5f6de15cb806fda66462406a
[WAIT] Now I need a rest for 36 seconds before I can send next tweet
[PROCESS_QUEUE] Preparing to send comment Nº 188 / 386
[API] Comment will be sent for tweet ID: 1309416668637138948
WITH TEXT: The OSCE Office for Democratic Institutions and Human Rights has launched an observation mission in Moldova to monitor the presidential elections. Moldova, Chisinau, 2016/10 : An election worker during the 1st direct presidential elections © @subjectio2020 subjectio.org
WITH PHOTO: ./static/uploaded/9afb16d5-d52a-460c-962f-6b5c7092542d.JPG
[API] Comment SENT!
[QUEUE] New ID for sent tweet: 1309497943574093824
[QUEUE] Payload sent at: 2020-09-25 16:20:13.972189
[QUEUE] From now this task is marked as done.

The full documentation and code can be found on GitHub.

GitHub: Subjectio's digital storytelling tool