

Adding a Twitter Feed to Your Website

You post to Twitter, but you’d like to reach more than the people who actually bother to log into Twitter. Maybe you’ve set up the Facebook app that posts all your tweets to Facebook. Here I’ll talk about how to add your Twitter feed to your own website.

First off, if you’re just looking for a generic Twitter feed, there are many prebuilt options, including several from Twitter itself. This article is for those who want the experience of building their own, or who need the fine-grained control you get only by understanding all the underlying pieces.

The Basics
We will be using PHP and MySQL on a Unix-based server. You should be at least vaguely familiar with the first two. There are two parts to the script: one connects to Twitter and adds tweets to the database (caching them), and one reads from the database and displays the tweets. We are going to cache the Twitter feed instead of connecting to Twitter every time someone loads your website.

Twitter API
Here are some basic facts about the API we will need before we move forward:

  1. Entirely HTTP based
  2. Conforms to REST
  3. 150 connections per hour limit
  4. Limits are per IP address or per account

What does this mean? Well, #1 means that we can use HTTP ‘GET’ requests to retrieve all the information we need. Which is good, because it means we can use cURL, a command-line tool for transferring data with URL syntax – just what we need. It also has implementations in many, many programming languages.

#2 tells us how the information is formatted. REST stands for Representational State Transfer, and in practice Twitter’s REST API returns its data as XML (RSS, in our case), which means we can read it with any XML parser – also good.

#3 is very important and is why we will be caching. It means that if you connect to your Twitter feed more than 150 times in one hour, Twitter stops responding to you. If you weren’t caching, your Twitter feed would then have no data to show. For high-volume websites (or even medium-volume ones) that number could be reached in minutes or even seconds, making your feed useless.

#4 tells us a little more about this limitation and gives us hope. Because limits are per IP address or per account, if Neil Patrick Harris decides he’s going to look at my Twitter feed 10 times a minute, every minute, only he gets locked out, not me (assuming we’re not sharing a computer). Be careful with this one, though: if you’re on a shared hosting server (if you can’t answer that question, assume you are), another website on the same server connecting to Twitter heavily could affect your website. So we’re going to authenticate with an account, which means we can connect 150 times per hour on that account before getting locked out. That sounds sufficient to me, and it is.

You will need to find the address of your RSS feed. From your profile (http://www.twitter.com/username), find the ‘RSS feed of Username’s tweets’ link on the right-hand side. It should look like ‘http://twitter.com/statuses/user_timeline/14287293.rss’

Caching
At this point we’ve covered how we will connect to Twitter (cURL), which HTTP method to use (‘GET’), that we will authenticate as our account, and that our tweets will come to us formatted as XML – which we can read with an XML parser like SimpleXML in PHP.
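Before wiring this into the full script, here is a minimal, self-contained sketch of the parsing step: feeding a hand-written RSS sample (made up for illustration – not real Twitter output) to SimpleXML and reading out the fields we’ll cache.

```php
<?php
// A tiny, made-up RSS fragment shaped like a Twitter user timeline.
$sample = <<<XML
<rss version="2.0"><channel>
  <item>
    <title>username: Hello world</title>
    <description>username: Hello world</description>
    <pubDate>Tue, 14 Apr 2009 18:30:00 +0000</pubDate>
    <guid>http://twitter.com/username/statuses/123</guid>
    <link>http://twitter.com/username/statuses/123</link>
  </item>
</channel></rss>
XML;

// Same call the full script uses; LIBXML_NOCDATA folds CDATA into plain text.
$xml  = new SimpleXMLElement($sample, LIBXML_NOCDATA);
$item = $xml->channel->item[0];

echo (string)$item->title;   // prints "username: Hello world"
echo (string)$item->guid;    // prints "http://twitter.com/username/statuses/123"
```

Each element comes back as a SimpleXMLElement, so cast to `(string)` before escaping or storing it.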

Once we retrieve and decode a tweet, we will want to cache it. In this case, we’ve chosen MySQL. I’ve created a database called ‘generic’ where I store all my random tables. For this script, I’ve created a table called ‘twitter’ with the appropriate fields to store the data from a tweet. In MySQL, make sure you have selected the appropriate database and type:

CREATE TABLE IF NOT EXISTS `twitter` (
  `title` tinytext NOT NULL,
  `description` tinytext NOT NULL,
  `pubDate` datetime NOT NULL,
  `guid` char(60) NOT NULL,
  `link` char(60) NOT NULL,
  PRIMARY KEY  (`guid`)
);

This is the table we will store everything in. Now, every time someone visits our website we read from this database instead of going to Twitter. Beyond the limits Twitter imposes, this is a good idea anyway because it speeds up page loads by not contacting Twitter on every request.

Beginning
We begin to have an idea of what our script should look like. At this point, we have an outline that should look something like this:

Script 1:
Connect to Twitter using cURL
Retrieve our Twitter feed with the 'GET' command
Read REST formatted data
Store data in MySQL database (caching)

Script 2:
Connect to MySQL database
Retrieve our Twitter feed
Display on website

That’s great! Let’s look at the first part of the script:

Script 1 – twitterParse.php

<?php
/*
This page will connect to the RSS feed for a single twitter user and pull the most recent updates (up to the # in $cnt)
It will then update a database as specified in db_conn.php and upload any tweets not already in the database. It is 
not recommended to use this script on a browser facing site, but rather to call it via cron and have the visual 
Twitter feed pull from the database. The reason for this is that Twitter only allows 150 calls an hour, which would
be quickly maxed out on a website with any traffic.
Written by Tyler Johnson (c) 2009
*/
$dbserver =''; // address of your dbserver
$dbuser = ''; // MySQL user 
$dbpass = ''; // MySQL user's password
$conn = mysql_connect($dbserver, $dbuser, $dbpass) or die(mysql_error());
mysql_select_db('generic', $conn);
$username = ''; // Twitter username
$password = ''; // Twitter password
$userrss = ''; // Twitter RSS location


// Setup cURL connect
$ch = curl_init($userrss);
// Define parameters
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response as a string instead of printing it
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_USERPWD, "$username:$password");
// Execute cURL
$data = curl_exec($ch);
// Close connection
curl_close($ch);


// Parse XML
$xml = new SimpleXMLElement($data, LIBXML_NOCDATA);

$cnt = min(20, count($xml->channel->item)); // up to 20 items, but don't run past the end of the feed

for($i=0; $i<$cnt; $i++) // start looking at results
{
	$title	=	mysql_real_escape_string($xml->channel->item[$i]->title);
	$desc	=	mysql_real_escape_string($xml->channel->item[$i]->description);
	$pubDate=	mysql_real_escape_string($xml->channel->item[$i]->pubDate);
	$guid	=	mysql_real_escape_string($xml->channel->item[$i]->guid);
	$link	=	mysql_real_escape_string($xml->channel->item[$i]->link);
	
	// input results into MySQL
	$query = "INSERT INTO `generic`.`twitter` (`title`, `description`, `pubDate`, `guid`, `link`) VALUES ('$title','$desc',STR_TO_DATE('$pubDate','%a, %d %b %Y %H:%i:%s'),'$guid','$link')"; // the trailing '+0000' offset is ignored by STR_TO_DATE
	$result = mysql_query($query);
	$error = mysql_errno(); // record any error; a duplicate key (errno 1062) means we've reached tweets already cached

	if($error) // if MySQL reported an error...
	{
		echo $error; // print the error code and stop the loop
		break;
	}
}
?>

There’s a lot to understand in this script. I would recommend Googling any PHP functions you don’t understand.

Now, this page will connect to Twitter, grab your 20 most recent tweets, and add them to your MySQL database. Because `guid` is the table’s primary key, inserting a tweet that is already stored fails with a duplicate-key error rather than creating a duplicate row – and since the feed arrives newest-first, the script simply stops once it reaches tweets it has already cached.
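If you would rather have the loop skip duplicates silently and keep going (instead of breaking on the first one), one alternative – a change to the script above, not what it currently does – is MySQL’s `INSERT IGNORE`, which turns the duplicate-key error into a warning:

```sql
-- Alternative: rows whose guid already exists are skipped, not errors.
-- The VALUES placeholders here stand in for the escaped PHP variables.
INSERT IGNORE INTO `generic`.`twitter` (`title`, `description`, `pubDate`, `guid`, `link`)
VALUES ('...', '...', STR_TO_DATE('...', '%a, %d %b %Y %H:%i:%s'), '...', '...');
```

With `INSERT IGNORE` you would also drop the `break` in the loop, since a duplicate no longer signals anything.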

While this code does what we want, we do not want to run it directly from our website. If we did, there would be no point in caching – instead, we want to run this script as a cron job (cron is the time-based job scheduler on Unix platforms). I have mine running every minute, which puts me well below the 150 connections per hour and seems more than sufficient to me. To add it, type ‘crontab -e’ and add the following line, edited with your script’s location:

* * * * * /usr/local/php5/bin/php -q /location/of/script/twitterParse.php > /dev/null 2>&1
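As a quick sanity check on the rate-limit budget, a cron job firing once a minute makes 60 calls per hour – well under the 150-per-hour cap discussed earlier:

```php
<?php
// Budget check for the polling interval chosen above.
$limitPerHour = 150;  // Twitter's per-account cap
$pollsPerHour = 60;   // the cron entry fires once a minute
$remaining = $limitPerHour - $pollsPerHour;
echo $remaining;      // prints 90 (calls/hour left over)
```

That leftover headroom is what keeps an occasional manual test run of twitterParse.php from tipping you over the limit.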

Retrieving & Displaying Your Tweets
The hard part is over! The next script is the one we will put on the website for viewers to see. It’s also much simpler to write, so I won’t spend too much time on it. Here it is:

Script 2 – twitread.php

<?php
$tweeter = ""; // Twitter username
$dbserver =''; // address of your dbserver
$dbuser = ''; // MySQL user 
$dbpass = ''; // MySQL user's password
$conn = mysql_connect($dbserver, $dbuser, $dbpass) or die(mysql_error());
mysql_select_db('generic', $conn);

// Connect to DB and pull $tweetLimit latest tweets
$tweetLimit = 5;
$query = "SELECT * FROM `generic`.`twitter` ORDER BY `pubDate` DESC LIMIT $tweetLimit";
$result = mysql_query($query);

echo "<div class='twitter'>\n"; // begin Twitter display
echo "<div class='tweeter'><a href='http://twitter.com/$tweeter'>$tweeter</a></div>\n";

while($row = mysql_fetch_array($result,MYSQL_ASSOC))
{

	$desc = $row['description'];
	$link = $row['link'];
	$date = $row['pubDate'];
	
	$pattern = '/^([A-Za-z0-9_]*):(.*)/'; // separate the leading "username:" from the tweet text
	$replaceTweet = '\2';
	$noname = preg_replace($pattern, $replaceTweet, $desc);

	$pattern = '/@([A-Za-z0-9_]+)/'; // link @mentions (usernames may include underscores and capitals)
	$replaceName = '@<a href="http://twitter.com/\1" class="twit">\1</a>';
	$tweet = preg_replace($pattern, $replaceName, $noname);


	echo "<div class='tweet'>$tweet</div>\n";
	echo "<div class='tweetDate'><a href='$link'>$date</a></div>\n";

}

echo "</div>\n";

?>

That’s it! You’re done! You can tweak the HTML in the second script to your heart’s desire and add CSS to format it any way you like.
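As a starting point, here is a minimal stylesheet targeting the class names Script 2 emits (`.twitter`, `.tweeter`, `.tweet`, `.tweetDate`); the widths and colors are only placeholder choices to adjust to taste:

```css
.twitter     { width: 300px; font-size: 0.9em; }
.tweeter a   { font-weight: bold; text-decoration: none; }
.tweet       { margin: 0.5em 0; }
.tweetDate a { color: #999; font-size: 0.8em; }
```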

This tutorial was much heavier on the ‘theory’ behind why the script is written this way and lighter on the techniques used. I will be happy to answer any questions you may have to help you get this running.

Posted in Web.



Responses

  1. Zach says

    How would this work if you wanted to pull only the tweets using one or more hashtags, regardless of which user they came from?

    I’m looking into pulling a stream of 4 separate hashtags onto my website… Curious, thanks!

  2. Tyler Johnson says

    Hi Zach!
    That’s a great question. Tracking public responses is a much different animal than following a single user’s tweets. Given the sheer volume you’re looking at, you would want to use a streaming method rather than the get/close method I’ve written here.

    Take a look at http://dev.twitter.com/pages/streaming_api_concepts to understand what kind of groundwork you’ll need. Twitter also has a great example for what you’re trying to do at http://dev.twitter.com/pages/streaming_api_methods#track.


