Feed a Twitter account with your daily new blog posts, based on the RSS feed a website generates, using Bash and Python.
This method is especially useful for websites built with static site generators like Jekyll, which automatically publish an RSS feed with the new posts on each build.
Based on this feed, you can set up a script on another server that checks it periodically, and whenever it detects new posts, publishes a link to each one directly on Twitter.
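The core idea, checking a feed and keeping track of which entries were already seen, can be sketched in a few lines of Python. This is only an illustration using the standard library; publishfeed, the tool used below, does this with feedparser and a database instead:

```python
# Sketch: parse an RSS document and return entries not seen before.
# Standard library only; real feeds would be fetched over HTTP first.
import xml.etree.ElementTree as ET

def new_links(rss_xml: str, seen: set) -> list:
    """Return links of <item> entries whose link is not in `seen`."""
    root = ET.fromstring(rss_xml)
    fresh = []
    for item in root.iter("item"):
        link = item.findtext("link")
        if link and link not in seen:
            fresh.append(link)
            seen.add(link)
    return fresh

sample = """<rss version="2.0"><channel>
  <item><title>Old post</title><link>https://example.com/old</link></item>
  <item><title>New post</title><link>https://example.com/new</link></item>
</channel></rss>"""

seen = {"https://example.com/old"}
print(new_links(sample, seen))  # ['https://example.com/new']
```

Each run only returns entries added since the last check, which is exactly what lets a cron job tweet every post once and only once.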
Set up a virtual environment
If you already have virtualenv installed, you can skip this
step. On Ubuntu you can install it with:
$ sudo apt install python-virtualenv
Create and activate the virtual environment where the required packages will be installed:
$ virtualenv -p python3.6 ~/.virtualenvs/twitter_bot
Running virtualenv with interpreter /usr/bin/python3.6
New python executable in /home/user/.virtualenvs/twitter_bot/bin/python3.6
Also creating executable in /home/user/.virtualenvs/twitter_bot/bin/python
$ source ~/.virtualenvs/twitter_bot/bin/activate
(twitter_bot)$
Log in to Twitter Apps and select “Create New App”.
Create the Application with the following content:
- Name: Website RSS Twitter Feeder
- Description: Automatic new blog posts publisher
- Callback URL
Then press the “Create your Twitter Application” button.
Now go to the “Keys and Access Tokens” tab and make sure your app has rights to publish tweets: the Access Level should be set to Read and write.
Then go to “Token Actions” and press “Create my access token”.
You will need the following items from this page:
- Consumer Key (API Key)
- Consumer Secret (API Secret)
- Access Token
- Access Token Secret
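These four credentials will go into publishfeed's configuration in the next section. If you ever script against the Twitter API yourself, a common pattern is to keep them out of the source code, for example in environment variables. A minimal sketch (the variable names here are my own convention, not something publishfeed or Twitter defines):

```python
# Sketch: load Twitter credentials from environment variables
# instead of hard-coding them. Variable names are an assumption.
import os

CRED_KEYS = (
    "TWITTER_CONSUMER_KEY",
    "TWITTER_CONSUMER_SECRET",
    "TWITTER_ACCESS_TOKEN",
    "TWITTER_ACCESS_TOKEN_SECRET",
)

def load_twitter_credentials() -> dict:
    """Read the four credentials, failing loudly if any is missing."""
    creds = {k: os.environ.get(k) for k in CRED_KEYS}
    missing = [k for k, v in creds.items() if not v]
    if missing:
        raise RuntimeError("Missing credentials: " + ", ".join(missing))
    return creds

# Dummy values, just so the example runs:
for key in CRED_KEYS:
    os.environ.setdefault(key, "xxx")
print(sorted(load_twitter_credentials()))
```

Failing at startup when a key is absent beats discovering it at tweet time in a cron job, where the error only lands in a mail spool.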
Set up application
We will use the publishfeed script to publish tweets, so following its instructions, we clone the repo:
(twitter_bot)$ git clone https://github.com/marcanuy/publishfeed
Cloning into 'publishfeed'...
remote: Counting objects: 41, done.
remote: Compressing objects: 100% (27/27), done.
remote: Total 41 (delta 13), reused 34 (delta 9), pack-reused 0
Unpacking objects: 100% (41/41), done.
(twitter_bot)$ cd publishfeed
(twitter_bot) publishfeed$
Set up feeds
Copy feeds.yml.skel to feeds.yml and edit it with your feeds information:
(twitter_bot) publishfeed$ cd publishfeed
(twitter_bot) publishfeed$ cp feeds.yml.skel feeds.yml
Dependencies are handled by pip:
$ make install
pip install -r requirements.txt
Collecting beautifulsoup4==4.6.0 (from -r requirements.txt (line 1))
  Using cached beautifulsoup4-4.6.0-py3-none-any.whl
Collecting certifi==2017.4.17 (from -r requirements.txt (line 2))
  Using cached certifi-2017.4.17-py2.py3-none-any.whl
Collecting chardet==3.0.3 (from -r requirements.txt (line 3))
  Using cached chardet-3.0.3-py2.py3-none-any.whl
Collecting feedparser==5.2.1 (from -r requirements.txt (line 4))
Collecting idna==2.5 (from -r requirements.txt (line 5))
  Using cached idna-2.5-py2.py3-none-any.whl
Collecting munch==2.1.1 (from -r requirements.txt (line 6))
Collecting oauthlib==2.0.2 (from -r requirements.txt (line 7))
Collecting PyYAML==3.12 (from -r requirements.txt (line 8))
Collecting requests==2.17.3 (from -r requirements.txt (line 9))
  Using cached requests-2.17.3-py2.py3-none-any.whl
Collecting requests-oauthlib==0.8.0 (from -r requirements.txt (line 10))
  Using cached requests_oauthlib-0.8.0-py2.py3-none-any.whl
Collecting six==1.10.0 (from -r requirements.txt (line 11))
  Using cached six-1.10.0-py2.py3-none-any.whl
Collecting SQLAlchemy==1.1.10 (from -r requirements.txt (line 12))
Collecting tweepy==3.5.0 (from -r requirements.txt (line 13))
  Using cached tweepy-3.5.0-py2.py3-none-any.whl
Collecting urllib3==1.21.1 (from -r requirements.txt (line 14))
  Using cached urllib3-1.21.1-py2.py3-none-any.whl
Installing collected packages: beautifulsoup4, certifi, chardet, feedparser, idna, six, munch, oauthlib, PyYAML, urllib3, requests, requests-oauthlib, SQLAlchemy, tweepy
Successfully installed PyYAML-3.12 SQLAlchemy-1.1.10 beautifulsoup4-4.6.0 certifi-2017.4.17 chardet-3.0.3 feedparser-5.2.1 idna-2.5 munch-2.1.1 oauthlib-2.0.2 requests-2.17.3 requests-oauthlib-0.8.0 six-1.10.0 tweepy-3.5.0 urllib3-1.21.1
After setting up the credentials you can fetch new feed entries and tweet them with python main.py:
(twitter_bot) publishfeed$ python main.py TWITTERHANDLER --getfeeds
(twitter_bot) publishfeed$ python main.py TWITTERHANDLER --tweet
We set up two cron jobs: one to download the new posts of each feed, in this case every hour, and another that tweets one pending post each time it runs, so an interval of, for example, 15 minutes between runs works well. We enter the crontab editor:
$ crontab -e
And then we add the following lines (adjust the paths to your
installation; in this case I have used /opt/publishfeed and a
virtualenv named publishfeed):
# download feeds hourly
0 * * * * cd /opt/publishfeed/publishfeed/; flock -n /tmp/twsimpleitp.lock ~/.virtualenvs/publishfeed/bin/python main.py simpleitrocks -g
# publish tweets every 15 minutes
*/15 * * * * cd /opt/publishfeed/publishfeed/; flock -n /tmp/twsimpleitp.lock ~/.virtualenvs/publishfeed/bin/python main.py simpleitrocks -t
We use the flock command to prevent duplicate cron job executions.
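What flock -n does can be demonstrated from Python with the standard library: while one file descriptor holds the lock, a second non-blocking attempt on the same file fails immediately instead of queueing up behind a slow run. A minimal sketch (Linux/Unix only):

```python
# Demonstrate the non-blocking advisory lock behind `flock -n`:
# the first lock succeeds, a second attempt fails at once.
import fcntl
import tempfile

lockpath = tempfile.NamedTemporaryFile(delete=False).name

fd1 = open(lockpath, "w")
fcntl.flock(fd1, fcntl.LOCK_EX | fcntl.LOCK_NB)  # acquired

fd2 = open(lockpath, "w")
try:
    fcntl.flock(fd2, fcntl.LOCK_EX | fcntl.LOCK_NB)
    second_acquired = True
except BlockingIOError:
    second_acquired = False  # this is why overlapping cron runs are safe

print(second_acquired)  # False while fd1 holds the lock
```

Without the -n flag the second invocation would block and wait; with it, the duplicate run exits immediately, which is the behavior you want from cron.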
Every time the cron job detects new content in a website's feed, it will tweet it from the account you selected. This is a great way to have your blog posts automatically tweeted and your content always present on social media.
*[RSS]: Really Simple Syndication