Crossposting with a single script: Crossposter.sh

Introduction

If you have been writing articles, you know the pain of getting some attention, and if you already cross-post your articles, you know it takes time. This task can be automated with a shell script. If you have been cross-posting articles on medium.com, dev.to and hashnode.com, then I have a treat for you.

Introducing crossposter.sh!!

What is Crossposter.sh?

Crosspost to dev.to/hashnode/medium from the command line.

Crossposter.sh is a shell script (BASH) that automates cross-posting to platforms like dev.to, medium.com and hashnode.com. The script takes a markdown version of your post, asks for a few inputs, and posts it to those platforms. You need a token/key for each of those platforms to post from the command line. You can check out the official repository of Crossposter.

The script is still not perfect and has a few bugs. It posts to dev.to and medium.com without trouble, but the hashnode.com part is buggy: it pastes the raw markdown into the post, which doesn't render as desired. So, it's an under-development script; feel free to raise issues or PRs on the official GitHub repo.

Run the script with the bash interpreter using the command:

bash crosspost.sh

For posting the article you need to provide the following details:

Front-Matter

Metadata about the post

  • Title of the post
  • Subtitle of the post
  • Publish status of the post (true or false)
  • Tags for the post (comma-separated values)
  • Canonical URL (original URL of the post)
  • Cover image (URL of the post's image/thumbnail)

This information is a must for dev.to, especially the title. It should be provided in the same order as shown below:


---
title: The title of the post
subtitle: The description of your article
published: true
tags: programming, anything else
canonical_url: url of your original post
cover_image: coverimage_url
---

There is no need to enclose any of the values in quotation marks. The published field should be true if you want to publish the post and false if you want to keep it in your drafts.

In the demonstrations below, we only need to enter the tokens once: they are stored locally in the keys.txt file and retrieved later by the script.

Posting on dev.to:

Posting on dev.to requires their API key, which can be generated from the Dev Community API Keys page. There you can generate a new key with any name you like. You only need to enter the key in the CLI once, or manually add it to the keys.txt file in the format dev.to:key on the first line. It will be reused for future cross-posting whenever you execute the shell script (bash crosspost.sh).

You can provide the front matter manually in your markdown file, or you will be prompted for it. That is all you need for posting on dev.to from the command line.
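To give an idea of what happens under the hood, here is a rough sketch of the kind of request that gets sent to the DEV (Forem) API. The endpoint and api-key header come from the public DEV API docs; the exact payload built by crosspost.sh may differ:

# Minimal sketch (not the actual script): publish a short markdown post via the DEV API.
# The front matter (title, published, tags, ...) travels inside body_markdown.
key=$(sed -n 's/^dev.to://p' keys.txt)   # the dev.to API key lives on line 1 of keys.txt
curl -X POST "https://dev.to/api/articles" \
  -H "Content-Type: application/json" \
  -H "api-key: $key" \
  -d '{"article":{"body_markdown":"---\ntitle: Hello CLI\npublished: false\n---\nPosted from a shell script."}}'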

Let's see the script in action:

[Demo: posting to dev.to]

If you want to add more to the post, check out the DEV.to API docs (powered by Forem); there are a ton of options you can hook into the front matter in the shell script.

NOTE: There is a limit of 10 requests per 30 seconds, so keep that in mind while testing the script and don't spam the API.

Posting on hashnode.com:

This part is still under development, as it only displays the raw markdown in the post. Tags are also too heavy to implement through the API, since the id of every tag is required along with its slug and name. Still, it serves some purpose at least. For posting on hashnode.com, we need a Personal Access Token, which can be generated under Developer Settings. You will also require the user-id of your hashnode account; you can get your user-id/username from the settings tab under profile information. The username is needed for posting to a publication blog, if any, and the Personal Access Token is, as usual, for interacting with Hashnode's GraphQL API. The API is quite user-friendly and provides everything in one place, with docs for running every query and mutation it exposes.
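For context, a Hashnode post is created by sending a GraphQL mutation with curl. The sketch below is only an assumption of what that call looks like; the mutation and field names follow the legacy api.hashnode.com schema as I recall it and may not match what the script actually sends:

# Hedged sketch (not the actual script): create a Hashnode story via the GraphQL API.
# createStory, contentMarkdown and tags are assumptions based on the legacy schema.
token=$(sed -n 's/^hashnode://p' keys.txt)
curl -X POST "https://api.hashnode.com" \
  -H "Content-Type: application/json" \
  -H "Authorization: $token" \
  -d '{"query":"mutation { createStory(input: { title: \"Hello CLI\", contentMarkdown: \"Posted from a shell script.\", tags: [] }) { code success message } }"}'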

You can paste the token when the script prompts for it, or manually add it to the keys.txt file as hashnode:token on the 4th line. Yes, it has to be the 4th line; that fixed position makes retrieval much easier and safer. Next, enter the username when the script asks for it, or again type it on the 5th line of keys.txt as hashnode_id:username. Preferably enter the credentials through the script prompts to avoid errors and misconfigurations from editing the file by hand.
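For reference, once everything is configured, keys.txt ends up looking something like this (the values are placeholders; the dev.to line was covered above and the two Medium lines are explained in the Medium section below):

dev.to:your_devto_api_key
medium:your_medium_integration_token
<your Medium author id, written here automatically by the script>
hashnode:your_hashnode_personal_access_token
hashnode_id:your_hashnode_username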

This will create the post on hashnode with the title, subtitle and cover image set correctly, but it will mess up the content. I tried hard, but it's just not happening. There needs to be some character for newlines, and the API rejects the \r\n characters passed in, so I have substituted them with <br>; the result is raw markdown. As the Hashnode API is still under development and they keep bringing in changes and new features, it should improve its core functionality and make common queries much easier. So, I'll create an issue on GitHub for posting the actual content via the script.
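In other words, before the body goes to Hashnode, the newline escapes get swapped for HTML breaks; roughly like this (a simplified sketch, not the exact line from the script):

# Sketch: replace the literal \r\n sequences with <br> before sending the body to Hashnode.
hashnode_body=$(printf '%s' "$body" | sed 's/\\r\\n/<br>/g')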

So, this is the demonstration of posting on hashnode.

[Demo: posting to hashnode]

Posting on medium.com:

The Medium API is much more versatile and markdown-friendly, though it has some limitations on the number of posts you can make in a day. For posting on medium.com, we require an Integration Token, which can be generated from the settings tab. Similar to hashnode, you can name the token whatever you like and then copy it. Paste the token when the script prompts for it, or manually type it into the keys.txt file as medium:token on the 2nd line. We also require the Medium id, but we can get that from the token itself: inside the script, once the token is obtained, a curl command fetches the id and stores it on the next (3rd) line of keys.txt for actually posting on medium.com. That is all the configuration you need for posting on medium.com.
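That id lookup is a single authenticated GET against the Medium API. A rough sketch of the step (the real script may extract the id from the response differently):

# Sketch: fetch the Medium author id from the integration token (line 2 of keys.txt).
token=$(sed -n 's/^medium://p' keys.txt)
curl -s -H "Authorization: Bearer $token" "https://api.medium.com/v1/me"
# The response contains {"data":{"id":"...", ...}}; that id is what gets stored on line 3.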

There is some documentation on the Medium API; we can even post to a publication, which may be added in the future. Cover images can also be posted on Medium; that is not done currently, but it is another #TODO. Tags are not yet rendered on Medium by the script, since string parsing in BASH is limited, though it should still be doable later. Most of the checkboxes are ticked: title, subtitle, cover image, canonical URL, and, importantly, the content.
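For reference, creating the post itself is a single POST to the users/{id}/posts endpoint; something along these lines (a sketch based on the public Medium API docs, not the exact call in the script):

# Sketch: create a markdown post on Medium (field names from the public Medium API docs).
token=$(sed -n 's/^medium://p' keys.txt)
author_id="your-medium-author-id"   # taken from line 3 of keys.txt in the real script
curl -X POST "https://api.medium.com/v1/users/$author_id/posts" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $token" \
  -d '{"title":"Hello CLI","contentFormat":"markdown","content":"Posted from a shell script.","canonicalUrl":"https://example.com/original-post","publishStatus":"draft"}'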

Let's look at the post on Medium made by the script.

[Demo: posting to medium]

All platforms:

Now, once you have configured everything, you can pick choice 4, which posts to all platforms (dev.to, hashnode and medium). But since hashnode is not looking like a good option right now, there is also choice 5 for dev.to and medium only. A rough sketch of that menu is shown after the demo below.

[Demo: posting to all platforms]
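Conceptually, the platform selection is just a numbered menu inside the script, something along these lines (an illustrative sketch with hypothetical helper function names, not the script's actual code):

# Illustrative sketch of the platform menu; post_devto/post_hashnode/post_medium are
# hypothetical helper names, not functions from the actual script.
echo "1) dev.to  2) hashnode  3) medium  4) all platforms  5) dev.to + medium"
read -r choice
case $choice in
    4) post_devto; post_hashnode; post_medium ;;
    5) post_devto; post_medium ;;
esac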

Why use Crossposter.sh?

Cross-posting may not be a big issue for most people, but this was a good side project to work on, to learn more about how the APIs work and to get some ideas about each platform's design. It is also quite time-saving to cross-post to 3 different platforms within a minute or two. You can tailor the script to your own specifications and needs.

So, if you are an author on all of the mentioned platforms, please give it a try. Contributions for other platforms are welcome. If you find anything unexpected, please report it in the issues tab.

Script

The script mostly leverages curl, sed, cat and some other basic utilities in BASH.

Using curl for posting the article via the APIs

Curl is a lifesaver for this project; without it, the script would not be as capable or efficient. Let's look at some of the commands used in the script.

curl -H "Content-Type: application/json" -H "api-key": \"'"$key"'\" -d '{"content":\"'"$body"'\"}' "$url"

The above command is quite basic; more options are added as per each platform's specification. But let us understand the structure of the command we are sending to the APIs. The first part is the header (-H), where we specify the content type being sent and the API key used to access the API. Next, we have the body or data (-d), where we pass the actual contents, which might be the front matter along with the markdown content. Finally, we have the URL where we send the POST request, i.e. the API endpoint. The backslash (\) is the escape character used to preserve the literal value of the next character; in short, it lets us continue the command on the next line.

The weird-looking '"$body"' is used to expand the value of the variable body inside an otherwise single-quoted string: in BASH, variables are only expanded inside double quotes, but we have to use single quotes around the JSON object because it already contains double quotes.
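A tiny standalone example of that quoting trick (just to illustrate the pattern):

# Illustration: single quotes around the JSON, a double-quoted "$body" spliced in the
# middle so the variable still expands.
body="Hello from BASH"
echo '{"content":"'"$body"'"}'
# prints: {"content":"Hello from BASH"}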

Using sed for editing text

Sed is a super-powerful stream editor; it's somewhat like Vim without an interface, only commands. We use it to manipulate the front matter for posting to the platforms by extracting the values into BASH variables. We also use it to write the API keys entered by the user into the file at a specific position so they can be retrieved later.

sed -i "1a title: $title" file.txt

Here, we are appending (a) the text title: $title after the 1st line; $title is a variable, so we are actually inserting the value of the variable title. The -i flag edits the file file.txt in place, i.e. without creating any temp or backup files.

sed -n 's/dev.to://p' keys.txt

Here we are essentially getting the text after a particular pattern. In this case, we search the keys.txt file for the string dev.to:, and everything after it up to the end of the line is printed; we can then store that in a variable and do all sorts of operations with it.
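For example, the result can be captured in a variable (a minimal illustration, not a line from the script):

# Store the matched value in a variable for later use in the curl requests.
key=$(sed -n 's/dev.to://p' keys.txt)
echo "$key"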

Using awk for programmatic editing

awk '{print $0"\\r\\n"}' temp.txt > file.txt

AWK is a command-line utility for processing text with programmatic patterns and actions. We use it to add \r\n to the end of each line; the APIs can't parse the file contents directly, so we have to add these characters at the end of every line before doing further operations.

cat temp.md | tr -d '\r\n' > temp.txt

After we have added the \r\n characters to the end of each line, we can simply use cat and tr to merge all the lines into a single line. This is how we pass the contents to the API more safely and concisely; of course, we still need to read the file into a variable.
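Reading that flattened content back into a variable for the curl payload is then a one-liner (the file name here is just a stand-in for whichever file holds the final single line):

# Read the single-line content into a variable to be passed as the curl data payload.
body=$(cat temp.txt)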

OK, I won't bore anyone with more BASH, but those were some of the most important commands in the script; they form the backbone of cross-posting and handling text with the APIs.

Conclusion

So, we can see that crosspost.sh is a BASH script that cross-posts markdown articles, with a bit of input, to 3 different platforms within a couple of minutes. This article was basically to demonstrate the project and its capabilities, while also highlighting the issues present. I hope you liked the project; please do try it and leave your feedback in the comments. Thank you for reading. Until next time, Happy Coding :)