CodeIgniter 2.0 … Whoa!

I used to be subscribed to the CI Forum RSS feed and kept up with the daily conversations and heartbeat of the community. However, I stopped when I couldn’t keep up with the daily threads amid all of the topics being started by people who were saying:

“Please help me, even though I haven’t done anything to even attempt to figure this out myself.”

Eventually I learned enough to be an expert with this framework, and I’ve designed a highly profitable CMS supporting a nice-sized company. We have billion-dollar companies using our software all over the world, and I’ll be damned if I’m going to upgrade the framework my team and I know inside and out to 2.0. We can’t afford the liability of upgrading to a framework that now has a development mentality of “features” over “stability.”

I was VERY happy with the infrequent release schedule because it meant the framework was solid. I was able to focus on developing features into my application, instead of worrying about new “features” within the framework compromising my application and breaking solid, optimized, perfectly working PHP code.

This holds true for security as well. Whenever the security schema for a very large web-facing application is being developed, assumptions are made along the way. These could be something as simple as a boolean versus a string response from a function on failure/success, or as complex as which functions of a library are private/public. To upgrade any sizable application to CodeIgniter 2.0, all of these assumptions have to be tested and reworked, especially in any code surrounding web forms (i.e. login pages, contact forms). Any person telling me that “a good programmer does it right the first time” is not an experienced developer. Nobody is perfect.

The reason we use and promote CodeIgniter is that it is a lightweight, stable framework. We don’t need functions that use PHP to write our JavaScript & HTML for us. If I needed that I would have stuck with ASP.NET. We also don’t need a new feature every day, or daily fixes of bugs caused by the unnecessary features from the previous day. Touting the Reactor branch as stable while mentioning that there were 11 commits in the last week is a contradiction.

For that reason, I will not download the 2.0 “core” from EL, because I don’t need it. There is nothing wrong with CI 1.7.3. It’s an amazing framework, and as far as I’m concerned 1.7.3 is still the latest release. Everything after that is unnecessary. We do a LOT with this framework, and have never thought:

Hmm… CI just isn’t fast enough, and it doesn’t have enough features.

Instead, we are able to sleep at night because we know our application is rock solid, built on a concrete framework.

People need to stop trying to squeeze more power out of this framework and focus on becoming more robust programmers. Using tools like Amazon Web Services, Memcache, jQuery, and mod_pagespeed, our CI-based applications load websites faster than any Internet connection can keep up with. With HTML5 on the rise and constant changes and developments in HTML5 and CSS3, we don’t need our framework changing every day as well.

A Novel Idea

I tend to read a lot of non-fiction, whether it’s a technical tome, some overworked physicist’s newest theory, or a book related to the crises of our time (food, water, and shelter).

That being said, whenever I catch myself avoiding a daily read it’s normally because I’ve been reading too much non-fiction. I then immediately switch to fiction to quench my desire for literary immersion. The most recent occurrence of this took me back to J.K. Rowling’s seven-book “Harry Potter” series.

My point… right.

A few nights ago I was working late on the Paradigm API code, which currently drives the website and will power what comes next. It had been a long day of coding, and when I tried to lie in bed I just kept seeing lines of code flash past my eyes. Trust me, it’s nothing like counting sheep!



I had just finished the third “Harry Potter” book and, while I was reluctant to start a new book at 2am… it’s Harry Potter! ;) I only had to read the first chapter to receive my inspiration. Most likely due to my extremely hyper mental activity, I could read between the lines. I was able to relate to the baffling number of characters and the plots on plots under plots. For the first time in my life I realized that web application programming isn’t much different from writing a novel: juggling thousands of variables, objects, functions, and conditions, along with the syntax of the various languages used to connect all the dots.

The things we are creating now at NetCrafters are cutting-edge. This is no doubt a product of the industry. However, the CMS (Content Management System) and the Paradigm API (Application Programming Interface), running on Amazon EC2 (Elastic Compute Cloud) and Google App Engine respectively, make up a highly advanced and interconnected project.

Creating these tools (plots) using all of the different programming languages and platforms (characters) involved, yet having them talk to each other like they’re in the same room (dialogue), feels like my defining moment! I’ve written a lot of programs and applications over the last 20 years (unpublished stories). A large majority of them are running on giant servers, tucked away behind expensive firewalls in hospitals and mental health facilities all around the country. Hundreds more have dissolved, and still more are archived… just in case. :)

I’ve always wanted to be a writer. It turns out I already am one; I just write in the language of machines instead of humans. I credit J.K. Rowling with a few other things in my life as well, but this time the first chapter of “Harry Potter and the Goblet of Fire” was the vessel. I can now move forward with my work and let the anxiety of “When will I just sit down and write!” dissolve.

Now is my moment, I love what I do and what I’m doing… it’s time to be published!

To Code or Re-Code, That is the FAQ

Shakespeare photo courtesy of Wikipedia


Boy do I love a blank canvas!

That’s what NetCrafters has been for me over the past year. I came from a budding 8-year career working with closed systems that were purposely isolated from the Internet for security reasons. I worked on servers across the country that housed clinical information for millions of people. I was always working within very strict guidelines and always with proprietary systems which yielded zero results for the prized “ask Google” solution we’ve all come to rely on.

Now I’m on the cutting edge of technology in every way. The tools we’re implementing at NetCrafters are easily available and not a single one comes packaged on a disk, or inside a box, wrapped with cellophane. Everything can be downloaded, learned without a classroom, and in most cases without a visit to the book store! Every day I read my carefully selected RSS feeds for the latest updates on many different fronts. Each and every day something amazes me. Think about that… Every Day!

This correlates to a high probability that each and every day my understanding of the tech soup I’m swimming in will change. This doesn’t come without some serious challenges, of course. One of them is the urge to go backwards. The thousands of lines of code for our CMS tool are constantly swimming in my mind. Each time I discover something new I immediately start a cranial search for applications of this newly minted puzzle piece… “I know this will fit somewhere.”

The challenge comes with deciding where the line is. At what point does going back and updating old methods become counter-productive? They will eventually need to be updated – undoubtedly – but they’re working now. Nobody is complaining about them. In fact, as far as everyone less intimately familiar with the code is concerned, everything is working great! But now I’m waking up in the morning and immediately thinking about how exciting it would be to spend the first few hours of the day crawling like a spider through thousands of lines of code and cleaning out the cobwebs. It’s a glorious and refreshing feeling until…

Someone inevitably asks the question, “So whatcha been up to today?” A very well-intentioned and perfectly acceptable question, mind you.

To which I must reply, “Well, I discovered this amazing, flashy-new-shiny way to do the same thing we were already doing, but now it’s 1.78 seconds faster and involves about three hundred fewer lines of code.”

And of course the response tends to be, “Ah, cool. Is it something you can show me?”

Which brings the, “Well, not really…”

But that’s the beauty of our team. We all trust each other to make good decisions while considering the whole. It may not have been one hundred percent necessary, but if it helps my mental sanity then we’re all better off!

I’m thankful and appreciative to be able to paint on the most interconnected, dynamic canvas in history – the highly malleable canvas of ones and zeros – each and every day.

Upgrade an EC2 Instance Kernel

I’ve been using Amazon’s Elastic Compute Cloud (EC2) to run our web servers at NetCrafters for almost a year now. It’s been an amazing experience, and I’ve kept a detailed account of many of the lessons learned.

The most recent challenge came when our servers started locking up for no reason. The sites would still be responding, so we knew the LAMP stack was limping along, but the servers were completely unresponsive via SSH. The only way to regain control was to issue a reboot through the Amazon API command line tools, and even that would sometimes take two or three tries before the server would cycle.

Once I was able to log in again and look through the logs, there was absolutely nothing to go on… until finally the server completely locked up. Apache was not responding and we could not open a secure shell. Once the server was rebooted and came back up, there was an error in /var/log/dmesg:

kernel BUG at arch/i386/mm/pgtable-xen.c:306!

Unfortunately, it seems very few people ever get this message, because most are never able to recover the server once it happens. I was lucky to have this tiny shred of evidence to go on. After hours of searching I finally found that the kernel our instances were using (2.6.16) had been declared unstable for large-CPU instance types. These instances were originally started as m1.small, and I had indeed converted them to c1.medium.

The solution: upgrade from the 2.6.16 kernel to 2.6.18 (to be exact, vmlinuz-2.6.18-xenU-ec2-v1.0). Unfortunately, this too is a hard-to-find procedure, hence the long-awaited point of this article:

How to Upgrade an EC2 Instance to 2.6.18 from 2.6.16

First, I recommend reading about Amazon EC2 User Selectable Kernels. I will outline the steps I took here, but if you want to know what you’re doing (versus just typing exactly what I tell you to!), please read at least the first paragraph or so then come back.

You’ll need the latest API and AMI tools installed on the instance; if you don’t know how to do that, search the AWS developer forums – there are plenty of tutorials already there. If you’re running Debian, there is a great post on the forums for installing the AMI tools as a Debian package (this worked flawlessly for me).

Start by launching your instance from your existing AMI (I’ll use an arbitrary AMI, ami-99999999, as an example; be sure to replace it with your registered AMI), using the new kernel (specified by the --kernel aki-9b00e5f2 directive) and the appropriate RAM disk (specified by --ramdisk ari-67b95e0e):

ec2-run-instances ami-99999999 -k gsg-keypair --kernel aki-9b00e5f2 --ramdisk ari-67b95e0e -t c1.medium

This will start a new instance using the vmlinuz-2.6.18-xenU-ec2-v1.0.i386 kernel (aki-9b00e5f2) and the initrd-2.6.18-xenU-ec2-v1.0.i386 RAM disk (ari-67b95e0e). These are both for a 32-bit instance; if you’re running 64-bit instances, you’ll want aki-9800e5f1 for the kernel and ari-64b95e0d for the RAM disk. You can see the list of available AMIs, AKIs, and ARIs by issuing the following command:

ec2-describe-images -o amazon
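That listing is long, so it helps to narrow it down to just the kernels (aki-…) and RAM disks (ari-…) with a grep over the output. A minimal sketch, run here against an illustrative sample of the listing rather than a live API call:

```shell
#!/bin/sh
# The sample below stands in for real `ec2-describe-images -o amazon` output;
# the IDs and fields are illustrative only.
cat > /tmp/images.txt <<'EOF'
IMAGE  ami-99999999  amazon/getting-started                       amazon  available  public  i386    machine
IMAGE  aki-9b00e5f2  amazon/vmlinuz-2.6.18-xenU-ec2-v1.0.i386     amazon  available  public  i386    kernel
IMAGE  aki-9800e5f1  amazon/vmlinuz-2.6.18-xenU-ec2-v1.0.x86_64   amazon  available  public  x86_64  kernel
IMAGE  ari-67b95e0e  amazon/initrd-2.6.18-xenU-ec2-v1.0.i386      amazon  available  public  i386    ramdisk
EOF

# On the instance you would pipe the live command instead of the sample file:
#   ec2-describe-images -o amazon | grep -E 'aki-|ari-'
grep -E 'aki-|ari-' /tmp/images.txt
```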

The -t c1.medium directive sets the instance type (number of processors, RAM, etc). There are others available – at the time, m1.small, m1.large, m1.xlarge, c1.medium, and c1.xlarge (m1.small and c1.medium are 32-bit; the rest are 64-bit).
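If you script your launches, the kernel/RAM-disk pair has to track the instance type’s architecture. A small sketch of that mapping, using the aki-/ari- IDs from this article (the helper function name is mine):

```shell
#!/bin/sh
# Map an EC2 instance type to the matching 2.6.18 kernel (AKI) and RAM disk
# (ARI). The IDs are the ones used in this article; the type-to-architecture
# mapping reflects the instance types available at the time of writing.
kernel_for_type() {
  case "$1" in
    m1.small|c1.medium)            # 32-bit types
      printf '%s\n' "--kernel aki-9b00e5f2 --ramdisk ari-67b95e0e" ;;
    m1.large|m1.xlarge|c1.xlarge)  # 64-bit types
      printf '%s\n' "--kernel aki-9800e5f1 --ramdisk ari-64b95e0d" ;;
    *)
      echo "unknown instance type: $1" >&2; return 1 ;;
  esac
}

kernel_for_type c1.medium
kernel_for_type m1.large
```

You could then splice the result into the launch command, e.g. `ec2-run-instances ami-99999999 -k gsg-keypair $(kernel_for_type c1.medium) -t c1.medium`.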

Once the instance is up and running, log in via SSH and install the new kernel modules.

First, verify you’re actually running the new kernel:

ec2# uname -a
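To make that check scriptable, you can compare the `uname -r` output against the kernel you expect and bail out early on a mismatch. A minimal sketch (the function name and expected version string are assumptions; substitute whatever kernel you launched with):

```shell
#!/bin/sh
# Fail fast if the instance did not come up on the expected kernel line.
EXPECTED="2.6.18"

running_kernel_ok() {
  # $1 = output of `uname -r`
  case "$1" in
    ${EXPECTED}*) echo "ok: running $1" ;;
    *)            echo "wrong kernel: $1 (wanted ${EXPECTED}.x)" >&2; return 1 ;;
  esac
}

# On the instance you would call: running_kernel_ok "$(uname -r)"
running_kernel_ok "2.6.18-xenU-ec2-v1.0"
```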

Then, install the new kernel modules:

cd /usr/local/src
# the modules archive must already be here; Amazon's "User Selectable
# Kernels" documentation points to the download location
tar xzf ec2-modules-2.6.18-xenU-ec2-v1.0-i686.tgz -C /
modprobe -l
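The reason extracting with -C / works is that the modules tarball is laid out with its paths rooted at lib/modules/&lt;kernel-version&gt;/, so it drops straight into place. A sketch of the same mechanic against a scratch directory instead of / (the tarball contents here are a stand-in, not the real modules):

```shell
#!/bin/sh
# Demonstrate the `tar xzf ... -C <root>` extraction used for the kernel
# modules, using a throwaway tarball and a fake root directory.
set -e
WORK=$(mktemp -d)

# Build a stand-in tarball with the same lib/modules/<version>/ layout
mkdir -p "$WORK/src/lib/modules/2.6.18-xenU-ec2-v1.0"
touch "$WORK/src/lib/modules/2.6.18-xenU-ec2-v1.0/modules.dep"
tar czf "$WORK/ec2-modules.tgz" -C "$WORK/src" lib

# Extract into a fake root; on the instance the target is / instead
mkdir -p "$WORK/fakeroot"
tar xzf "$WORK/ec2-modules.tgz" -C "$WORK/fakeroot"
ls "$WORK/fakeroot/lib/modules"
```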

Next, we need to install udev. You may not need to do this, but we are running Debian servers and udev is required to make the new image bootable. Otherwise, none of the devices will mount and the new image will just hang at boot. This is really easy though:

apt-get dselect-upgrade
apt-get install udev
# prevent udev from keeping its old ip-address
rm /etc/udev/rules.d/z25_persistent-net.rules
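Because udev regenerates this file on every boot, it’s worth making the removal idempotent so it can run unconditionally right before each bundle. A sketch against a scratch root (on the real instance ROOT would be /):

```shell
#!/bin/sh
# Idempotent cleanup of udev's persistent-net rules, demonstrated against a
# throwaway directory tree rather than the real /etc.
ROOT=$(mktemp -d)
mkdir -p "$ROOT/etc/udev/rules.d"
touch "$ROOT/etc/udev/rules.d/z25_persistent-net.rules"

# -f makes this safe to run whether or not the file exists
rm -f "$ROOT/etc/udev/rules.d/z25_persistent-net.rules"
# running it again is harmless
rm -f "$ROOT/etc/udev/rules.d/z25_persistent-net.rules"
```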

The next step is to rebundle and register your new image. The new image will use the 2.6.18 kernel and RAM disk by default. If you’re a bit rusty on this process, I recommend the EC2 Getting Started Guide. Once you’ve done this, launching the newly registered AMI will bring your instance up with the new 2.6.18 kernel and RAM disk.
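For reference, the rebundle-and-register sequence with the classic AMI/API tools looks roughly like this. The bucket name, key paths, and account ID are placeholders (assumptions on my part), and the commands are echoed rather than executed so the flow can be read end to end; consult the Getting Started Guide for the authoritative flags:

```shell
#!/bin/sh
# Dry-run sketch of rebundling and registering an AMI with the classic tools.
BUCKET="my-ami-bucket"        # placeholder: your S3 bucket
ACCOUNT="123456789012"        # placeholder: your AWS account ID
run() { echo "+ $*"; }        # swap `echo` for real execution on the instance

# 1. Bundle the running volume (keys live on the instance, e.g. in /mnt)
run ec2-bundle-vol -d /mnt -k /mnt/pk.pem -c /mnt/cert.pem -u "$ACCOUNT"
# 2. Upload the bundle parts and manifest to S3
run ec2-upload-bundle -b "$BUCKET" -m /mnt/image.manifest.xml \
    -a "${AWS_ACCESS_KEY:-YOUR_ACCESS_KEY}" -s "${AWS_SECRET_KEY:-YOUR_SECRET_KEY}"
# 3. Register the manifest as a new AMI
run ec2-register "$BUCKET/image.manifest.xml"
```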

NOTE: Each time you rebundle, you’ll need to re-issue the following command or udev will keep its IP address and the new image will not get a new IP via DHCP at first boot:

# prevent udev from keeping its old ip-address
rm /etc/udev/rules.d/z25_persistent-net.rules

I wrote this a week or so after actually doing it, so I’m going mainly from rough notes and memory. If you’re hitting a specific error while working through this, let me know; it’ll probably knock something loose and I’ll have an answer for you. It took nearly 10 hours to aggregate the steps to make this work, but once you have them in place it only takes about 10 minutes. I sure hope to save someone else all that trouble.


  1. Amazon EC2 User Selectable Kernels
  2. Programming Amazon EC2 (version 2008-09-01)
  3. Forum: AMI does not boot (post: j0nes2k, April 4, 2008 @ 5:25 am PDT)
  4. Forum: Installing ec2-ami-tools as a Debian package (post: Stephen Caudill, Dec 27, 2006 @ 8:26am PST)
Vince Stross