Social Media Bot with deep learning

After my last debacle attempting to find a social media person who could make a few posts daily on my Twitter, Facebook & Instagram, I decided to write my own Social Media Manager Bot.

We are already working on our deep linguistics search using TensorFlow SyntaxNet, so I thought of taking some time off over the weekend (yes, time off on a weekend) to work with Inception instead of SyntaxNet.

In order to quickly whip up this bot, I came up with a quick program flow:

  1. Get the imgur album and list its images via the imgur API
  2. Select a random image from the album
  3. If the image has a title & description set, use those for the posting text
  4. If there is no title & description:
            4.a) download the image
            4.b) pass the image to TensorFlow Inception to guess its contents
            4.c) create hashtags out of the top three Inception guesses
  5. Subscribe to Buffer (shout out to Buffer.com for such an awesome app)
  6. Create social profiles on Buffer where I wanted the posts to appear (I ended up creating Twitter, Facebook, LinkedIn, Google+ & Instagram)
  7. For each social profile, get the image link and the post text (the image title on imgur, or the labels guessed by Inception) and post it to Buffer using the Buffer API
  8. Set this program up on crontab
  9. Make sure the crontab runs no more often than the number of posting times configured on Buffer (else the queue keeps filling up on Buffer)

I decided to use my old favorite, Groovy, to whip up the program. Java? Ain't got time for classes.
Simple and straightforward. Let's get to work now.

1) Get the imgur album and list the images
The imgur album was created by fellow colleagues, who took the pains to upload images from Unsplash and their personal Instagram accounts.

Groovy call to get list of images:

@Grab('org.codehaus.groovy.modules.http-builder:http-builder:0.7.1')
import groovyx.net.http.HTTPBuilder
import static groovyx.net.http.Method.GET
import static groovyx.net.http.ContentType.JSON

def imgurhttp = new HTTPBuilder('https://api.imgur.com')
imgurhttp.request(GET, JSON) { req ->
    uri.path = '/3/album/bPt5C'
    headers.'Authorization' = 'Client-ID XXXXXXXXXX'   // your imgur application Client-ID
    headers.Accept = 'application/json'

2) Selecting a random image from the response

response.success = { resp, jsonresp ->
    def images = jsonresp.data.images
    def rand = new Random()
    int getImage = rand.nextInt(images.size())   // nextInt(n) returns 0..n-1, so no off-by-one
    def imagetouse = images[getImage]   // this is the image we will use; its link, title & description are here

4) If the title or description is null, "deep process" the image (always give preference to a human-edited title/description for the posting text)
(Google TensorFlow Inception details are here: https://www.tensorflow.org/versions/r0.9/tutorials/image_recognition/index.html)
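Step 3's preference for the human-written title/description over Inception's guesses can be sketched as a small helper. This is plain Java (which is also valid Groovy); `buildPostText` is a hypothetical name, not a function from the original code:

```java
// Sketch of the step-3 fallback: prefer the human-edited imgur title/description,
// and only fall back to Inception hashtags when both are missing.
class PostText {
    static String buildPostText(String title, String description, String hashtags) {
        if (title != null && !title.isEmpty()) {
            return (description != null && !description.isEmpty())
                    ? title + " - " + description
                    : title;
        }
        return hashtags; // e.g. "#dog #pet #labrador " built from the top Inception guesses
    }
}
```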

String downloadedfile = downloadimige("tmpfilename", imglink)
// this function's code is shown below; it's just a simple HTTP GET

Then run TensorFlow Inception on the downloaded file

def command = new String[3]
command[0] = "sh"
command[1] = "-c"
command[2] = "./bazel-bin/tensorflow/examples/label_image/label_image --image=" + downloadedfile
def processBuilder = new ProcessBuilder(command)
processBuilder.directory(new File("/media/spyder/BitEast/project/tensorflow/tensorflow/"))   // working directory where TensorFlow has been installed
processBuilder.redirectErrorStream(true)   // label_image logs its guesses on stderr, so merge it into stdout
def process = processBuilder.start()

Then extract the results (this is a really, really hacky way):

process.inputStream.eachLine {
    if (it.startsWith("I tensorflow/examples/label_image/main.cc:210]")) {
        int braceindex = it.indexOf("(")
        def val = it.substring("I tensorflow/examples/label_image/main.cc:210] ".length(), braceindex).trim()   // the label TensorFlow guesses for the image
        posttext += "#${val} "   // create hashtags out of the labels from the deep learning Inception network
    }
}
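A slightly less hacky alternative is a regex over the same log line. The sample line below follows the `main.cc:210] label (index): score` shape the substring code parses; `extractLabel` is a hypothetical helper, again in plain Java:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Regex-based alternative to the substring parsing: capture everything between
// the "main.cc:NNN]" log prefix and the opening parenthesis of the label index.
class LabelParser {
    private static final Pattern LABEL = Pattern.compile("main\\.cc:\\d+\\]\\s+(.+?)\\s*\\(");

    static String extractLabel(String line) {
        Matcher m = LABEL.matcher(line);
        return m.find() ? m.group(1) : null; // null when the line isn't a label line
    }
}
```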

Finally we have the image link and its title/description:

println "Image ${imglink} has decoded to ${imgtitle}/${imgdescription}"


Let's start the Buffer postings.

def http = new HTTPBuilder('https://api.bufferapp.com')
http.request(GET, JSON) { req ->
    uri.path = '/1/profiles.json'
    uri.query = [access_token: accesstoken]   // the Buffer access token; uri.query keeps it properly encoded
    headers.'User-Agent' = 'Chrome/5.0'   // we don't need this, but whatever

Now, for each social profile in the response:

response.success = { resp, jsonresp ->
    jsonresp.each { socialprofile ->

        def socialhttp = new HTTPBuilder('https://api.bufferapp.com')

        socialhttp.request(POST) {
            uri.path = '/1/updates/create.json'
            requestContentType = URLENC
            body = ['profile_ids[]'      : socialprofile.id,
                    text                 : posttext,   // the imgur title, or the labels Inception guessed (human labeling gets preference)
                    'media[photo]'       : imglink,
                    access_token         : accesstoken,
                    'media[description]' : imgdescription.toString(),
                    'media[title]'       : imgtitle.toString(),
                    shorten              : 'false']

Voilà! The post has been made to our social media.

Groovy function to download file
public static String downloadimige(String tmpfilename, def address) {
    new File("/tmp/${tmpfilename}.jpg").withOutputStream { out ->
        new URL(address).withInputStream { from -> out << from }
    }
    return "/tmp/${tmpfilename}.jpg"
}

Setting up the crontab on my computer

First create the bash shell script

vi runbuffer.sh

#!/bin/bash
cd /media/spyder/BitEast/project/dataimport
groovy src/main/groovy/BufferSocialMedia.groovy

The script just cds into the working directory and runs our Groovy file.
Then set it up on crontab:

crontab -e
0 */6 * * * /media/spyder/BitEast/project/dataimport/runbuffer.sh
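As a sanity check on step 9: a `*/6`-hour schedule fires at 00:00, 06:00, 12:00 and 18:00, i.e. four runs per day, so the Buffer queue needs at least four posting slots per day to keep from backing up. A quick check, assuming the six-hour interval above:

```shell
# 24 hours divided by the 6-hour crontab interval = runs per day
runs_per_day=$((24 / 6))
echo "$runs_per_day"   # prints 4 — Buffer needs at least this many daily posting slots
```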

I deliberately put typos in my function names to avoid clashes with any built-in functions. In Groovy such errors are hard to debug, so I instinctively misspell
my functions, e.g. downloadimige instead of downloadImage.
The code samples shown are working, but they may not be complete and may be missing closing braces.
I'm pretty sure there are better and more elegant ways to do all of this, but this is the best I could manage in two days.

Our automated postings are on