Question:

I have a few custom social buttons on my website for which I fetch the share/follower counts as JSON from the respective APIs. I have tried to implement a cache system to reduce the load time and eliminate the risk of being 'red-flagged' for over-using the APIs. However, I had no success, mainly because I don't quite understand the integration steps. I hope someone can help me integrate a cache system.

Here is the PHP code for Twitter, Google Plus and Instagram:





    // Twitter: share count
    ob_start();
    $twittershare = 'http://cdn.api.twitter.com/1/urls/count.json?url=' . $product["href"];

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $twittershare);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    $jsonstring = curl_exec($ch);
    curl_close($ch);

    $bufferstr = ob_get_contents();
    ob_end_clean();

    $json = json_decode($bufferstr);
    echo $json->count;







    // Google Plus: +1 count
    $url = $product["href"];

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, "https://clients6.google.com/rpc?key=xxxxxxxxxx");
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, '[{"method":"pos.plusones.get","id":"p","params":{"nolog":true,"id":"' . $url . '","source":"widget","userId":"@viewer","groupId":"@self"},"jsonrpc":"2.0","key":"p","apiVersion":"v1"}]');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-type: application/json'));
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    $curl_results = curl_exec($ch);
    curl_close($ch);

    $json = json_decode($curl_results, true);
    $count = intval($json[0]['result']['metadata']['globalCounts']['count']);

    $data = array();
    $data['plus_count'] = (string) $count;
    $data['url'] = $url;
    echo $data['plus_count'];







    // Instagram: follower count
    ob_start();
    $insta = 'https://api.instagram.com/v1/users/00000000?access_token={token}';

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $insta);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    $jsonstring = curl_exec($ch);
    curl_close($ch);

    $bufferstr = ob_get_contents();
    ob_end_clean();

    $json = json_decode($bufferstr);
    echo $json->data->counts->followed_by;



I hope you can guide me step by step on how to implement a cache system for the code snippets above.


Best Answer:


Well, as mentioned in my comment, I'd use Memcached and a database, but I'll draft a database-only solution (with PDO, for Twitter) and leave the Memcached part as a bonus exercise for you. ;) I would load the follower information via AJAX, so the page doesn't have to wait when e.g. the follower count needs to be updated.

I'll be using the following database schema:

CREATE TABLE IF NOT EXISTS `Followers` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `url` varchar(100) NOT NULL,
  `data` longtext NOT NULL,
  `followers` int(5) NOT NULL,
  `last_update` TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
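The class below expects an existing PDO handle. As a minimal sketch (the DSN, database name, and credentials are placeholders, not part of this answer), creating one could look like:

```php
<?php
// Placeholder connection details - replace with your own.
$db = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8', // hypothetical DSN
    'dbuser',                                        // hypothetical user
    'dbpassword'                                     // hypothetical password
);

// Throw exceptions on errors instead of failing silently,
// so the try/catch blocks mentioned later actually have something to catch.
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
```
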


First I'd define an interface so you don't rely on a specific implementation:

interface SocialFollowers
{
    public function getFollowers();
}


Then, for the Twitter share API I'd have an implementing class that is initialized with a database handle and the target URL. The class attributes are populated with the retrieved data (if available). If the cached timestamp is recent enough you get the number of followers instantly; otherwise the API is queried, the result is stored, and the number of followers is returned.

class TwitterFollowers implements SocialFollowers
{
    private $data = null;
    private $url = "";
    private $db = null;
    private $followers = null;

    protected $shareURL = "https://cdn.api.twitter.com/1/urls/count.json?url=";

    public function __construct($db, $url) {
        // initialize the database connection here
        // or use an existing handle
        $this->db = $db;

        // store the url
        $this->url = $url;

        // fetch the most recent record from the database
        $stmt = $this->db->prepare('SELECT * FROM `Followers` WHERE url = :url ORDER BY last_update DESC LIMIT 1');
        $stmt->bindParam(":url", $url);
        $stmt->execute();

        $this->data = $stmt->fetch(PDO::FETCH_ASSOC);
        if (!empty($this->data))
            $this->followers = $this->data["followers"];
    }

    public function getFollowers()
    {
        // create a timestamp that's 30 minutes ago;
        // if the cached record is older than that -> call the API
        $old = new DateTime();
        $old->sub(new DateInterval("PT30M"));

        if (is_null($this->followers) || (new DateTime($this->data["last_update"]) < $old)) {
            return $this->retrieveFromAPI();
        }

        return $this->followers;
    }

    private function retrieveFromAPI()
    {
        // mostly untouched
        ob_start();
        $twittershare = $this->shareURL . $this->url;

        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $twittershare);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        $jsonstring = curl_exec($ch);
        curl_close($ch);

        $bufferstr = ob_get_contents();
        ob_end_clean();

        $json = json_decode($bufferstr);
        $this->followers = $json->count;

        // store the retrieved values in the database
        $stmt = $this->db->prepare('INSERT INTO Followers (url, data, followers) '
            . 'VALUES (:url, :data, :followers)');
        $stmt->execute(array(
            ":url"       => $this->url,
            ":data"      => $bufferstr,
            ":followers" => $this->followers
        ));

        return $this->followers;
    }
}


For Facebook, Google+, or the-next-social-network you just need to add another implementation of the interface.
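For example, an Instagram implementation might look roughly like the following. This is an untested sketch reusing the asker's Instagram endpoint; the user id and `{token}` are placeholders, and the caching/storing logic from `TwitterFollowers` would still need to be filled in where the comment indicates:

```php
<?php
class InstagramFollowers implements SocialFollowers
{
    private $db = null;

    // placeholder endpoint taken from the question; user id and token must be replaced
    protected $apiURL = 'https://api.instagram.com/v1/users/00000000?access_token={token}';

    public function __construct($db) {
        $this->db = $db;
        // as in TwitterFollowers: look up the cached row for this endpoint here
        // and populate $this->followers / $this->data from it
    }

    public function getFollowers()
    {
        // as in TwitterFollowers: return the cached value when it is fresh,
        // otherwise fall through to the API call below and store the result
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $this->apiURL);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $result = curl_exec($ch);
        curl_close($ch);

        $json = json_decode($result);
        return $json->data->counts->followed_by;
    }
}
```

The only parts that differ per network are the endpoint and the path into the JSON response; the caching logic stays identical, which is exactly what the shared interface buys you.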

Please keep in mind that this code isn't tested. It's missing try/catch blocks around the PDO queries, and there's room for improvement (e.g. a locking mechanism to prevent concurrent retrieval of the same URL is missing; is it necessary to store the returned blob; etc.).

Hope this helps you.

[edit] I updated the code slightly (fixed some typos and conversion issues) and tested it. You can find a working version at github. All that's missing is the AJAX snippet (assuming jQuery), like:

$.ajax({
    url: "http://example.com/twitter.php",
    type: "get",
    data: {url: "http://stackoverflow.com"},
    success: function(data, textStatus, jqXHR) {
        // Update the corresponding counter like
        // $("#twitterfollowers").text(data);
        console.log(data);
    }
});
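The AJAX call above needs a small server-side endpoint (`twitter.php`) that wires everything together. A possible sketch, assuming the placeholder connection details and a hypothetical `TwitterFollowers.php` file holding the class:

```php
<?php
// twitter.php - answers the AJAX request with the (cached) follower count
require 'TwitterFollowers.php'; // hypothetical file containing the class above

// placeholder DSN and credentials - replace with your own
$db = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'dbuser', 'dbpassword');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// the URL whose share count is requested, passed by the AJAX call
$url = isset($_GET['url']) ? $_GET['url'] : '';

$twitter = new TwitterFollowers($db, $url);
echo $twitter->getFollowers();
```

Depending on your setup you may want to validate `$_GET['url']` against a whitelist of your own pages before passing it on, so the endpoint can't be abused to query arbitrary URLs.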