Tag Archives: jquery

Why we (mobile website designers) need to learn CSS animation.

I got a comment on an earlier post about making animation look better on mobile, and there seemed to be some confusion about using javascript animation versus css animation, when targeting mobile browsers. I wrote back and explained my thoughts, but I thought it might make for a good post too. So here goes…

It is absolutely vital for mobile developers to learn CSS animation instead of javascript.

The reason is that Javascript is actually very inefficient for animation, and it’s only recently, with the proliferation of mobile browsers, that we are starting to notice this. If we had been trying to do javascript animation on the computers we had back in the late nineties (which were of similar power to our current mobile phones), we’d have seen it then.

You see, with a JS animation (e.g. created using jQuery’s animate function), the javascript library sets an interval timer (let’s say 33ms, which is about 30 frames per second), and on that interval it figures out where the animated element should be, using some pretty complex math calculated by the CPU, and then puts it there. Unfortunately, if the CPU is too slow to do that calculation every interval, it gets backed up and the animation stutters. jQuery and other JS animation tools do some work to combat this, sensing a drop in frame rate and widening the interval to give the CPU more time to complete the calculations, but it’s often too little, too late.
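To make that concrete, here’s a rough sketch (plain JS, no jQuery, and the element/property names in the comment are made up) of the kind of per-tick math an interval-driven animation library does. The easing formula shown is jQuery’s default “swing” easing:

```javascript
// Sketch of interval-driven JS animation: every tick, the CPU recomputes
// the element's position from elapsed time.
function easedPosition(start, end, elapsed, duration) {
	var t = Math.min(elapsed / duration, 1);       // progress, 0 to 1
	var eased = 0.5 - Math.cos(t * Math.PI) / 2;   // jQuery's default "swing" easing
	return start + (end - start) * eased;
}

// On each ~33ms tick, the library effectively does something like:
//   element.style.left = easedPosition(0, 300, Date.now() - startTime, 400) + "px";
// If a tick takes longer than 33ms to compute and paint, frames pile up
// and the animation stutters.
```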

On the other hand, when you do a CSS animation (e.g. using the transform or webkit-transform css properties), you leave it to the GPU to simply calculate where the element should be as often as it can. As soon as it calculates the element’s new position and places it there, the GPU starts the calculation again. This means the animation is always as smooth as the GPU can possibly make it, no matter what. If you animate too many elements for the GPU to handle it may still have trouble, but instead of ugly stuttering, it will be a more graceful drop in framerate.

That’s why CSS animation is a must for mobile-targeted websites. The good news is that all mobile browsers (the ones we care about, anyway) support CSS3’s animation techniques, so there’s no need for capability detection and fallbacks.
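As a minimal sketch (the class names here are made up for illustration), this is what the CSS approach looks like: animate transform, which the GPU can composite, instead of properties like left/top:

```css
/* Animate -webkit-transform (GPU-friendly), not left/top (CPU layout work) */
.panel {
	-webkit-transition: -webkit-transform 400ms ease-out;
	-webkit-transform: translateX(-100%);
}
/* Toggle this class from JS to run the slide-in */
.panel.open {
	-webkit-transform: translateX(0);
}
```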

Fix: CSS Animation slow or choppy in mobile browsers

EDIT: DON’T DO THIS. It sometimes breaks z-index rules on modern webkit-based browsers.

I’m developing an iPhone app using the PhoneGap library, which lets you write your app as if it were a website. That means any UI transitions or animations have to be written the way you would for any modern website targeting Webkit: CSS3 transitions (mobile javascript is too slow for frame by frame calculations).

Unfortunately, CSS transitions can be a little slow, a little choppy, even on iPhone 4 and the faster Android based phones. The problem is that, by default, webkit doesn’t involve the GPU unless you’re doing 3D transforms. With desktop horsepower, that’s fine. On a mobile device, that GPU could really help.

So how do you force webkit to share the processing load with the GPU?

Apply this style to the element you’re animating:

-webkit-transform: translateZ(0);

Simple, but effective.

TypeKit (@font-face) fonts look like crap in IE7 & IE8 when animated with jQuery

Internet Explorer does a pretty lousy job of displaying fonts anyway, especially fonts from all those new web font services such as TypeKit (which I strongly recommend). But when you use, say, jQuery’s fadeIn() effect like this

$(".fadeThisGuy").fadeIn(400);

… type looks like crap all the way through the animation, and stays that way after.
Crappy type after animating

The reason for this is a bad implementation of “filters” in IE. “What are filters?” you ask. Well, while the rest of the web was being standardized to make our lives easier, Microsoft was ignoring the new standards and making up its own worse-looking, harder-to-write versions. jQuery hides all that from us by turning your fadeIn() into the appropriate filter code in IE, or opacity code in everything else.
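For reference, here’s roughly what the two syntaxes look like for a half-faded element, first the standard version and then Microsoft’s:

```css
/* What everyone else uses: */
.halfFaded { opacity: 0.5; }

/* What IE8 and below need instead: */
.halfFaded { filter: alpha(opacity=50); }
```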

There are three fixes.
1) Avoid the problem. That means using a non-TypeKit font, or using an image of the type (which means it won’t be selectable, and Google can’t see it).

2) Partial fix: At the end of your fadeIn() you can tell IE to drop all filters from the element. This isn’t great because it means that the type will still look like crap all the way through the animation, and then snap to normal looking (which is still not great in IE) at the end. Code for that below:

$(".fadeThisGuy").fadeIn(400, function() {
	$(this).css('filter', "");
});

3) “Graceful degradation.” This buzzword refers to the act of checking the capabilities of your visitor’s browser and scaling your awesomeness back to fit what their browser can do. They should still get the same website, just without the bells and whistles their browser can’t handle.

In this case, we’re checking for IE, and whether its version is less than 9.

if ($.browser.msie && parseInt($.browser.version, 10) < 9) {
	// dumb it down for old IE 6, 7, and 8
} else {
	// regular awesomeness for people who know how to use Windows Update
}

As an alternative to a fadeIn() and fadeOut() I’d suggest either a simple show() and hide(), or any of the other jQuery Effects that don’t have to do with opacity/filters. You can find a list of jQuery effects here in the effects section.

Make your own live, auto-updating data feed with PHP (CodeIgniter) and Javascript (jQuery)

EDIT: I guess you don’t need to store the last entry time in a cookie, storing it in a simple js var works fine. I’ve updated the code to reflect this.
EDIT2: If your server isn’t doing a good job of responding to the queries for the last entry time, and it’s just sending 304 Not Modified every time, then you can add a random GET var to the request, so it always gives you a fresh copy. It takes more resources, though.

	url: "latest_report_time.txt?" + Math.floor(Math.random()*10000),

Original Post:
I’ve been working on a webapp recently which involves users in the field submitting reports, and users back at the office seeing those reports as they come in. Eventually I’ll probably set it up right, with true two-way communication between the field users and the office users via an XMPP server, but in the meantime I went with a simpler approach.

Here’s what I came up with.

1) When a field user submits a report, it goes into the database as you would expect, but a new function I wrote just for this purpose also does something else, something very simple: it replaces a text file on the server with the current timestamp. Note that it’s a simple .txt file and that the single timestamp of the latest report is all it contains. That way there’s no PHP overhead when someone looks at that file, and if they’ve already looked it up once and it hasn’t changed, the server will return a 304 Not Modified response, and the browser won’t even download the file. Very efficient.
In CodeIgniter, that code looks like this, but it could be done in any framework, or plain old PHP.

// now() comes from CodeIgniter's Date Helper, write_file() from the File Helper
$timestamp = now();
write_file('latest_report_time.txt', $timestamp);

2) When an office user opens a page in their browser that is meant to be auto-updating, a new javascript is run. All that script does is look up that file we made, latest_report_time.txt, every 5 seconds, and check whether the timestamp in it has changed since the last check (which was saved in a plain JS variable). Note that I’m using jQuery, but you could do this in any JS framework, or plain old JS.

setInterval(checkForReportUpdates, 5000);

var oldLastReportSeen = 0;
function checkForReportUpdates() {
	$.ajax({
		url: "latest_report_time.txt",
		success: function(data) {
			var newLastReportSeen = data;
			if (oldLastReportSeen != newLastReportSeen) {
				console.log("New report found, go get it");
				oldLastReportSeen = newLastReportSeen;
			} else {
				console.log("No new reports");
			}
		}
	});
}

3) Finally, if the timestamp on the server has been updated, do another ajax call to grab the latest report. I leave the rest of this up to you, because the structures of our sites are probably very different, but you’ll probably end up writing some of your own logic to check to make sure the newly fetched report is not one you are already showing, and then use jQuery’s prepend() or append() to insert it into your page.

function getNewLatestReport() {
	$.ajax({
		url: "reports/latest",
		success: function(data) {
			// check here that the report isn't one you're already showing,
			// then prepend() or append() it (swap #reports for your own container)
			$("#reports").prepend(data);
		}
	});
}

Load more content as the user reaches the bottom of your page, with jQuery

I recently built a project gallery for Primal Screen’s new website (not live yet). Instead of having the traditional < 1 2 3 4 5 > pagination setup, we decided to take a leap forward into the world of modern browsers, ajax, and users with very short attention spans. In this on-demand world, we load the content as needed: when the user hits the bottom of page one, page two is loaded and appended, and so on until you run out of content, or the user’s scroll wheel breaks. It’s a never-ending page, without a never-ending load. You may have seen this functionality in the Firefox plugin AutoPager, or the Chrome/Safari plugin AutoPagerize.

Anyway, basically it works like this:
1) User loads page with 15 thumbnails on it
2) User scrolls to bottom of page, triggering a jQuery event we set up
3) 15 more thumbnails appear like magic or voodoo or mindreading or something and the page gets longer! Without even clicking!
4) User scrolls some more, hits bottom of page again, jQuery event fires again and
5) 15 more thumbnails appear and the page gets longer, further blowing the user’s mind!
6) ?
7) Profit.

alreadyloading = false;
nextpage = 2;

$(window).scroll(function() {
	if ($('body').height() <= ($(window).height() + $(window).scrollTop())) {
		if (alreadyloading == false) {
			alreadyloading = true;
			$.post("page" + nextpage + ".html", function(data) {
				$("#galleryThumbsCol").children().last().after(data);
				nextpage++;
				alreadyloading = false;
			});
		}
	}
});

Line 1 is a state variable. We set it to true when we start loading a new page and false when it's loaded and shown, and before we load a new page we check that it's false, to ensure we don't try to load page three before page two is done loading.
Line 4 Standard jQuery scroll event. This fires several times per second when scrolling in a modern browser, but on old browsers it just fires once when you stop scrolling.
Line 5 Here we check to see if the user is at the bottom of the page.
Line 6 Here we check to make sure we aren't already loading a page.
Lines 7-8 This is where the magic happens. We flip our state variable, then $.post, a jQuery function, loads the url you give it in its first parameter and runs the callback function you give it in its second parameter (and that function gets the loaded data in its one and only parameter, did you follow all that?).
Line 9 Now that we have the loaded page, we have to put it in our existing page. Translated to English, this line reads "find the element with the ID galleryThumbsCol, find its last child, and put the loaded data in after that child."
Lines 10-11 Then we set our variables so we're ready to load the next page.

Find the Pixel perfect position of an element with jQuery (and keep elements always visible)

If you need to know the exact x and y positioning of any element on your page (relative to the viewport), use jQuery like this.

var x = $("#myElement").offset().left;
var y = $("#myElement").offset().top;

Then you can do stuff like this

$(window).scroll(function() {
	var y = $("#myElement").offset().top;
	var scrollY = $(window).scrollTop();
	if (scrollY > y) {
		var padY = scrollY - y;
		$("#myElement").css("paddingTop", padY);
	}
});

This script would compare your scrollbar’s position and the position of the element, #myElement, every time you scroll (several times per second on good browsers, when you let go of the scrollbar on bad browsers). Then, if the top of the element is above the top edge of the viewport, it adds some padding-top to it to keep it locked there at the top of the screen.

Use jQuery to attach an event to a key press

Keep in mind that some browsers don’t let jQuery know about certain key presses. For example, Chrome doesn’t send the event for keyCode 27 (escape), but Safari does.

$(document).keypress(function(e) {
	if (e.keyCode == 13) {
		alert("You pressed Enter!");
	}
});

And here’s a list of all the available keyCodes for you to use.

UPDATE: jQuery: Get all objects of selector, and then loop through them

In my original post I mentioned how one can use the get() function in jQuery to gather up all the selected DOM elements and put them in an Array, like so:

var stuffArray = $(".stuff").get();

Well here’s another way to do the same thing:

var stuffArray = $.makeArray($(".stuff"));

And then I showed how to loop through that new array and do something with each of the DOM elements like so:

for (var x in stuffArray) {
	// do something with stuffArray[x]
}

Well there’s another way to do that too:

$.each(stuffArray, function(key, value) {
	// do something with key and value
});

The two variables (key and value in the example) in the each callback function can be called whatever you want, but they represent 1) the array key, or if it’s a numerical array, the index of the entry in the array your each function is working on, and 2) the value at that key/index.
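To make that key/value behavior concrete, here’s a simplified plain-JS sketch of what $.each does. This is not jQuery’s actual implementation, just the idea:

```javascript
// Simplified sketch of $.each (not jQuery's real code, just the idea).
function each(obj, callback) {
	if (Object.prototype.toString.call(obj) === "[object Array]") {
		for (var i = 0; i < obj.length; i++) {
			// callback receives (index, value); returning false stops the loop
			if (callback.call(obj[i], i, obj[i]) === false) { return obj; }
		}
	} else {
		for (var key in obj) {
			// callback receives (key, value); returning false stops the loop
			if (callback.call(obj[key], key, obj[key]) === false) { return obj; }
		}
	}
	return obj;
}
```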

CodeIgniter Rejecting Ajax XMLHttpRequest from jQuery?

I have a javascript/jQuery ajax function that tries to post some data to a URL in my CodeIgniter installation, and it fails. Looking at the console, it gives me this message:

XMLHttpRequest cannot load http://www.DOMAIN.com/chris/calendar/async/update/assignment/325. Origin http://DOMAIN.com is not allowed by Access-Control-Allow-Origin.

The problem is in the headers that CodeIgniter is set up to send by default. The fix is to use CodeIgniter’s Output class to send the Access-Control-Allow-Origin header allowing everything, or *.
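In a controller (for example, in its constructor, before any output is sent), that looks something like this, using the Output class’s set_header():

```php
// Allow cross-origin requests by sending the CORS header with every response
$this->output->set_header('Access-Control-Allow-Origin: *');
```

Note that * allows any origin; if you only need requests from your own www and non-www domains, you can send a specific origin instead.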