Showing posts with label thinking out loud. Show all posts

Thursday, June 16, 2011

All developers are not created equal - hence not interchangeable

Earlier yesterday I came across this article in the New York Times: Thieves Found Citigroup Site an Easy Entry. At first I thought, "Man, another big site had its customer data compromised," but as I continued reading it became clear this incident is a little different; especially given the nature of the attack described in the article. The marketing and PR departments for these brands - and in this case Citigroup - need to be a little more careful about the kind of technical information that gets released when shit hits the fan.
Think of it as a mansion with a high-tech security system — but the front door wasn’t locked tight.
After reading through the article, and given how embarrassingly simple the attack was, you can't think of it as a mansion with a high-tech security system; not even close. Some context on this attack:

In the Citi breach, the data thieves were able to penetrate the bank's defenses by first logging on to the site reserved for its credit card customers. Once inside, they leapfrogged between the accounts of different Citi customers by inserting various account numbers into a string of text located in the browser's address bar. The hackers' code systems automatically repeated this exercise tens of thousands of times — allowing them to capture the confidential private data. The method is seemingly simple, but the fact that the thieves knew to focus on this particular vulnerability marks the Citigroup attack as especially ingenious, security experts said.
So, all these thieves needed to do was basically log in with their own - or even someone else's - Citigroup account, and lo and behold, the account number was present in the address bar after login. Changing it gave them access to someone else's account. A little script repeated this for thousands of accounts and scraped the details.
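In today's terminology, this is an insecure direct object reference. Here is a minimal sketch of the pattern - with invented account data and function names, not Citi's actual code - showing the missing ownership check that would have stopped such a script cold:

```python
# Hypothetical sketch (invented names and data) of an "insecure direct
# object reference": an identifier taken straight from the URL is
# trusted without checking who it belongs to.

ACCOUNTS = {
    "1001": {"owner": "alice", "balance": 5200},
    "1002": {"owner": "bob", "balance": 310},
}

def view_account_vulnerable(logged_in_user, account_number):
    # Looks up whatever account number appears in the address bar;
    # never verifies it belongs to the logged-in user.
    return ACCOUNTS.get(account_number)

def view_account_fixed(logged_in_user, account_number):
    # Authorize first: the record must belong to the requesting user.
    record = ACCOUNTS.get(account_number)
    if record is None or record["owner"] != logged_in_user:
        return None  # a real app would return HTTP 403 here
    return record
```

The fix is one comparison. That is what makes the "especially ingenious" framing so hard to swallow.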

This process was described by "security experts" as "especially ingenious". Really?!? This is the oldest trick in the book; i.e. mess around with the URL until you get somewhere. These "security experts" should be fired if this kind of attack surprised them.

The "what can we do, we got hacked" wagon got extremely popular in recent years, especially this year, but this Citigroup incident is different. There is no excuse for being on that wagon here. When your "high-tech security system" amounts to changing account numbers in URLs, what else will someone find if they look harder?

How does one get to this position? I think at the root of the problem is the thinking that people working in technology are interchangeable cogs in a giant machine. When you are building the pyramids, yes, you can get 40,000 slaves and have them drag giant slabs of rock into place and stack them with virtually no way for an error to occur. And yes, you can get another 40,000 slaves to replace the first 40,000, and they will still drag and stack the rocks as well as the previous 40,000 did. That mentality works when the tasks at hand are fairly simple and mechanical, such as building the pyramids or the production line at Ford. It is absolutely not valid in technology, yet there are many executives, project managers, and software architects today who think it is.

The other part of the problem has to do with measuring expertise. The above assumption that developers, architects, designers, etc. are interchangeable also leads to the flawed assumption that a developer with 10 years of experience can replace any other developer with 10 years of experience. It is easy to reach that assumption when you think of these tasks as mechanical, such as building the pyramids or putting the wheels on a car. 10 years of development experience doesn't carry the same weight it did 30 years ago. Most developers today got into it as teenagers, and hence by the time they graduate university they already have 10 years of experience building stuff. Also, there are more technologies available today for the average developer to experiment with than there were 30 years ago. Hence why building technology systems - and development in general - is a combination of science and art. The Sistine Chapel would have looked different had Leonardo da Vinci painted it instead of Michelangelo, even with the same directions from the Pope. The Pyramids would have looked the same regardless of where the 40,000 slaves came from.

So when an online application that deals with people's credit card accounts fails at this level, reading "Citi has implemented enhanced procedures to prevent a recurrence of this type of event" doesn't give me the warm fuzzy feeling I should be getting - if I were a customer.

Where else did you not do the due diligence you owe your customers? What other skeletons are in the closet? The New York Times article should have started out like this:
Think of it as a tent with a zipper — but the zipper wasn’t closed.

Saturday, December 11, 2010

Mobile e-commerce & augmented reality?

When you think e-commerce, what comes to mind? For me, I think Amazon.com and eBay.com. I remember reading somewhere that Amazon wasn't the first online bookstore; that was actually books.com - which is gone now, by the way, and redirects to Barnes & Noble. Today, Amazon doesn't just sell books anymore; heck, I can even order 18-inch wheels from there. But what comes next? The e-commerce market is already at a point of saturation, and it comes down to a battle of the brands. Do I order this book from Amazon? Or Barnes & Noble? Or even Walmart? Should I buy a Dell from Dell.com? Or from Walmart.com?

I blogged about social e-commerce at the beginning of this year, and some of Collier's predictions are getting rolling, but we still have a way to go before all of her 2010 predictions are realized; sadly it won't happen in 2010. We're close, I think, but social e-commerce is still a few years ahead. It's hard for these data silos to get broken down enough to really enable me to receive recommendations from Amazon based on my Facebook friends or even my tweets. Maybe I'm wrong.

But while these silos are up, m-commerce and augmented reality can go hand in hand, and together they solve one of the problems faced by online shoppers: the "I'm not sure how this will look" problem. It's not a problem every online store has; e.g. I won't question how a PS3 will look in my living room, nor how a Mac mini will look on my desk. It is a problem when buying something bigger, like furniture, appliances, decorations, wall paint, etc. - things that either take a lot of room, might lead you to the doghouse if your significant other does not approve, or take a lot of time to undo if you don't like where you placed them.

For e-commerce, a picture is not worth 1000 words for most products out there. In the crazy world between my ears, I would go to the new place I'm moving into this month, pull out my iPhone 3GS, go to Ikea.com, and start up their augmented reality furniture browser. I could then load up all the furniture I'm moving from the old place and see how everything comes together. It would then recommend other products based on the data it's collecting via my phone's camera. Maybe different colours? Maybe furniture pads to prevent the table from scratching the hardwood? Perhaps it would recognize the TV and recommend a different spot to reduce the glare in the morning, since the windows face east?

Can it be done? I think so. The technology is already available. Where do you see mobile commerce heading in the next two years?

Friday, September 03, 2010

Don't tell me your TV supports Twitter (Part 2)

In Part 1 I ranted about the Samsung commercial I caught on TV raving about accessing Facebook and Twitter from their new TV. I questioned how TV manufacturers are repeating the marketing campaigns the telecoms ran a few years ago, when the iPhone came out on select carriers. The ones that weren't chosen, such as Bell and Telus, raved about their BlackBerry, Palm, and HTC lineups. They coined terms like "social phones" or "smarter smart phone" for being able to access Facebook on their phones. Anyway, the future of mobile is in the apps, not the device nor the platform. I believe the same is true for TV.

In Part 2 I question the role of TV service providers and broadcasters in this new "smart TV" era that seems to be around the corner. The one thing these guys have going for them is that they usually are Internet providers as well, or are at least partnered with one. Smart TVs obviously need an Internet connection, so these guys will still be there, though their role could change a bit. As more Internet-ready devices hit the market, it doesn't make sense for them to split their business into three lines: Internet, TV, Phone. What happens when we start getting fridges with WiFi / data chips? Or washing machines? Or even cars? I predict they'll all converge into one line - connectivity - and you'll pay for the data you use. Before we get there, we'll probably go through a stage where a standard package gives you X devices, a "gold" package gets you Y devices, and a "VIP" package gets you Z devices. Soon after that, that too won't cut it, as it becomes the norm for devices to have these connectivity chips built in.

Saturday, August 28, 2010

Fast Mover Advantage?

I just finished reading the book "The Accidental Billionaires" by Ben Mezrich. The book provides great stories about the lives of the "founding fathers" of probably the most successful website on the Internet today - Facebook - although it wasn't a first mover. What amazed me most about the book is how fast everything was happening.

Maybe once upon a time "First Mover Advantage" was important; that time, I believe, is long gone. Maybe it mattered when building and shipping any product or service was extremely costly and time consuming. It was probably an important theory for the Space Race in the 60s and 70s. I don't think it is all that important today.

Image from http://www.flickr.com/photos/blackbutterfly/3051019058/

I think people should think more about the "Fast Mover Advantage" (I'm surprised there is no Wikipedia page on this today). The fast mover is able to counter all the hypothetical advantages associated with the first mover. Here is why:

Tuesday, August 24, 2010

Don't tell me your TV supports Twitter

Last night I caught a Best Buy ad on AMC for Samsung's smart/social/internet TV. It reminded me of the Telus/Rogers/Bell BlackBerry ads from a couple of years ago marketing Twitter and Facebook as features of their smartphones. They're still doing it, with terms like "social phone" or "smarter smart phone" that I don't really understand. Still, I was really excited about this ad, especially because it got the gears in the crazy place between my ears turning again...

Samsung seems confused about what to do with Google entering a market in which Samsung is one of the largest players. Add to that Apple's arrival later this year with iTV - the same two companies that pretty much destroyed Samsung's chances in the phone industry. Samsung has its own OS for its phones, Bada, which also powers their smart TVs. This is a bad idea:
  • Popularity - Recent market results show iOS and Android capturing about 80% of mobile web consumption (not including the iPad). Since these numbers were gathered in June, the iPad has probably gained some more ground for iOS, but let's keep it at 80%. Samsung Bada's share is a fraction of the 10% for "Other", with probably an equal if not greater chunk of that "Other" going to SymbianOS.

Saturday, August 14, 2010

Yahoo's culture vs. Google's culture

Two interesting articles I read in the past week had to do with Yahoo's "hacker-centric" culture - or the lack thereof. Two different views: one from Paul Graham's "What Happened to Yahoo" post, and the second from Ryan Grove's "What's happening at Yahoo!". Both were excellent reads and provide an interesting look into Yahoo's culture.

These articles got me interested in comparing Yahoo! and Google to see which is more of a technology/software firm. I started with the leadership teams of both companies and counted how many engineers, scientists, and mathematicians are in each. I could be wrong, but I'm assuming a "hacker-centric" culture is one with a high number of engineers, scientists, and mathematicians. I'll refer to these as "techies".

The results were pretty amazing. The higher that number, the more "hacker-centric" the company. Feel free to comment if I'm being biased or have overlooked something. One thing I did, though not on purpose, was double-count some members who appeared on both the board of directors and the VP lists, so the ratios are a little skewed - but the difference is still huge.

The first insight was the very different leadership structure of the two companies. The boards of directors are rather similar: nine members on Google's, ten on Yahoo!'s, and roughly the same number of "techies" - four on Google's vs. three on Yahoo!'s. Nothing surprising here.

Saturday, August 07, 2010

Stay on your toes, think proactively

I was coming back home today, and in the elevator the lady who had just got on forgot to press her floor's button. She realized a little too late, as we passed the 15th floor - her floor. She then pressed 17, but that too was too late; then she pressed 18, and again too late. She gave up and got off on the 20th floor with someone else. At that point I was thinking: if she had pressed 17 and 18 at the same time, she probably would have gotten off at the 18th floor. Anyway, no harm done when you miss your floor; I just thought it made an interesting intro to this post.

When things don't go to plan, most of us start to think reactively: how do I get back on plan? We panic and try all sorts of different things to reduce the damage - with little consideration for the consequences of our remediation. E.g. you are driving on the highway, tailgating the person in front. Their brakes go on, and now you are forced to slam on your brakes to avoid a collision. Thinking proactively would mean you wouldn't have been tailgating in the first place; it would mean you were aware of what is happening around you. Isn't that what you get taught at driving school? Defensive driving is proactive thinking.

Even when things go to plan, you can begin to think reactively. Task 1 is done, on to task 2, then task 3, and so on. Following your plan blindly is also reactive thinking. I also think the longer your actions have been in line with your plan, the worse the consequences of your reactions will be when things eventually don't go to plan.

So, shit happens and you can't do anything about it, but good leaders shine when shit hits the fan. Good leaders are proactive thinkers. People who think reactively are good at following and bad at leading - and that is okay. Not everybody can lead, and not every leader can lead all the time. I've been in positions where I followed great, proactive leadership. I have also been under reactive leadership. They're very different, but both offer lessons to learn. Good leaders know when to step out of your way and let you run with it. Reactive leaders, being good at following, expect others to blindly follow them - and blindly following someone isn't proactive thinking.

Proactive thinkers are Linchpins.

Sunday, July 04, 2010

John Underkoffler points to the future of UI

I caught this TED video by John Underkoffler - the science advisor for the movie Minority Report - the other day, and it got me thinking about web interfaces, especially for online banking such as CIBC's and online billing such as Rogers'. I'm not saying we need interfaces like this for these kinds of online applications, but we are surely due for a major overhaul of them. I sure hope we don't need movies about these interfaces before they become mainstream.

Is technology capable of improving the interfaces for online banking/billing sites?



Hell yes! The customers just need to demand it. We have been stuck in this e-statement model for far too long. Technology isn't the holdup; we have had brilliant interfaces in the wild for a while now. Consider Mint. That 15-person team was so successful they were purchased by accounting software maker Intuit in 2009. So it is not technology that is limiting us - Mint did it, and clearly they don't have as many resources as the banks or telecoms do. So what is the holdup?

It's a monopoly on my data


Consider the analyst who goes through some sort of overview of customer spending habits to figure out new plans or services to offer. Do they get a raw dump of the data? Probably not - you can't make sense of it that way. For these analysts to see spending habits that justify new services, they probably get some sort of visual: a chart, a graph, something. Human minds are visual; we are pretty good at spotting things in a picture, and not so good at spotting the one line in a contract that will cause you trouble in the future - that is why there are lawyers. Instead of dazzling me with loading icons, please start focusing on delivering a little more value. As much right as these analysts have to study my data to justify future products and services, I too have a right to my data, to make sense of my spending habits. Here's a thought: I would happily pay $6 a month for that, instead of $6 a month for call display - which, by the way, probably requires work to DISABLE. Why do we let these people reach into our pockets without providing anything of real value to us? Would you buy a car from a dealer who asks for more money to enable 3rd gear? Or charges you per minute per kilometre on an out-of-province highway? Or adds a service charge for filling up with a different grade of gas?

Who would want this kind of stuff anyway?


Gen Y grew up in the Internet age; they expect this stuff to be the norm. It's only a matter of time before it becomes the norm, so why fight it? I had been using Wesabe to make sense of my banking statements, right up until they posted a shutdown notice. I didn't mind the hassle of exporting my statement and importing it every month to see those beautiful spending charts. The tagging feature even let me track my spending on smokes, by making me more disciplined about buying them from the same two places so that I could tag my spending at those two grocery stores as "smokes".


As much as I'd love to manipulate my phone bill in 3D space, I will be extremely happy if my interaction with it becomes more 2010 instead of 1997. All I'm asking for at this point is three fairly simple, yet significant, improvements:
  1. The ability to tag transactions, and to view my statement by tag, for both billing & banking. Rogers already allows me to tag numbers, so you are halfway there!
  2. View tags as a pie chart - or any other visual - for both billing and banking.
  3. For banking, graph my "income" against "expenses", month to month and year over year.
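None of these need exotic technology either. The first two, for instance, boil down to grouping transactions by tag and summing. Here's a hypothetical sketch with made-up transactions (the data model and function name are mine, not any bank's):

```python
# Hypothetical sketch of tag-based statements: tag transactions, then
# total spending per tag - exactly the numbers a pie chart would be fed.
from collections import defaultdict

transactions = [
    {"amount": 42.50, "tags": ["groceries"]},
    {"amount": 12.00, "tags": ["smokes"]},
    {"amount": 9.99, "tags": ["smokes"]},
    {"amount": 60.00, "tags": ["groceries", "party"]},
]

def totals_by_tag(txns):
    # A transaction can carry multiple tags, so it may count toward
    # more than one total.
    totals = defaultdict(float)
    for txn in txns:
        for tag in txn["tags"]:
            totals[tag] += txn["amount"]
    return dict(totals)
```

A weekend project for a bank's development team, if the will were there.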


Simple? Can you do it in 2010? And then, if you agree that these add value for your customers, how about replicating Wesabe for 2011?


Minority Report science adviser and inventor John Underkoffler demos g-speak -- the real-life version of the film's eye-popping, tai chi-meets-cyberspace computer interface. Is this how tomorrow's computers will be controlled?

Tuesday, June 29, 2010

the intersection of business and technology

Technology is useless if it does not deliver business value. Either it saves me time - and hence money - or it makes me money. It has to be one or the other. Where does front end architecture fit in this picture? As the title points out, it's right in the middle.

The front end is a misunderstood piece of any application; it is usually overlooked, underestimated, and belittled. It's fairly common to perceive it as "toying around", "no/low value", etc. It's also very easy to believe that all the stuff happening at the back end is what commands the big bucks. Unless you develop low-level software such as compilers, web servers, or drivers, here is why you are wrong, and why front end architecture matters.

You can't deliver "customer focused solutions" if you belittle the front end


Successful front end architecture means focusing on what is important for the end user - not you the developer, nor you the SQL ninja, and not even you the business SME.

I don't know how this upside-down tradition started, but I might have an idea. Application development usually starts with the back end framework - you know, the Springs and the Djangos, the Struts and the Zends. None of these deliver any value to the user; they add value to the delivery team and help them - in a perfect world - deliver better code, faster. So how did this tradition of starting in the area farthest from the end user begin? My reasoning is that it started with equating building end-user software with constructing a building. The first step there is to lay the foundation: the stuff that will carry all the weight of the rebar, steel, concrete, pillars, roof, and all the occupants and their equipment. All of this is as far away from the occupants of the building as possible, but it is by far more important than whether the doors open in or out. The occupants of a building are first concerned about their safety; a "customer focused solution" in the construction industry is one that is first safe for its occupants. Everything else comes later. However, when we use this analogy for building end-user software, we start out right off the bat focusing on the wrong things. A customer focused solution starts with the end user - what will they be interacting with? - and then works backwards to define the solution required to support that end user.

You can't slap an interface on it


Okay, you can, but you shouldn't. Can you slap a steering wheel in the backseat of a car? Sure you can. Should you? Probably not. Why is the steering wheel in the front? Because the end user needs to see the road. Start with the end user and work backwards to the solution. A more accurate statement is actually "slapping a back end on it", or "wiring the back end to the front end". That you can do. Why? Because at that point you know what the user wants, and you know how your front end will achieve it.

Another reason not to do this: if you care about your users' experience, you should spend more time thinking through the front end, iterating and making it better. Forget the focus groups. Forget the design committees. Empower qualified, creative, and responsible people to make usability decisions. Have real users - not developers - use your application, and keep your mouth shut. Don't show them how to use it, or what they're doing wrong. Observe, take notes, and make it better. Focus groups could just make you chase your own tail, as happened with New Coke.

Phasing in features


Good back end frameworks and architecture allow you to phase in functionality as the project progresses. A good front end architecture can, and should, do the same. This means that, just like a good back end, a good front end must utilize a common framework. Today, not much focus is given to the front end; in fact, it is assumed it can be completed 100% with a "big bang" approach. We don't use a "big bang" approach with the back end, so why do it on the front end? Because you tried to slap an interface on it...

Front end components need to be thoughtfully designed, with re-use and phase-in in mind. Don't attempt a one-size-fits-all approach to these components. It might mean less development, but if your focus is "customer focused solutions" then you need to account for different use cases and different user types/roles. Also, just like back end components get refactored when duplication occurs, so must front end components. Why the double standard? Because the front end gets belittled.

More data is better data


Yes, your gut can have a lot of say when it comes to the front end. Sometimes, however, a change doesn't register with your gut at all. Does it matter if your links are underlined? Or are you just doing it because [insert your favorite reason here, e.g. because my dog wags its tail when it sees underlined hyperlinks]? Design your front end to gather these usage patterns, because "customer-focused solutions" base their decisions on customers' actions. Don't even ask your customers whether they like A or B better; keep your mouth shut and observe. Do they use your application more? Better? Quicker? When A is there, or when B is there?
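As a hypothetical sketch of what "keep your mouth shut and observe" looks like in practice: log how users actually perform under each variant and let the measurements decide. The sample timings and names below are invented for illustration:

```python
# Hypothetical A/B sketch: compare variants by measured behaviour
# (seconds to complete a task), not by asking users which they prefer.
from statistics import mean

observed = {
    "A": [12.1, 9.8, 14.0, 11.2],  # say, underlined links
    "B": [8.9, 9.5, 7.7, 10.1],    # say, plain links
}

def faster_variant(timings):
    # Lower mean completion time wins; a real analysis would also
    # check statistical significance before committing to a change.
    return min(timings, key=lambda variant: mean(timings[variant]))
```

With these made-up numbers, variant B wins on the data - no focus group required.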

At the intersection of business and technology lies the role of the Front End Architect (FA). This person should be empowered and trusted to make front end architectural decisions, based on supporting data, that will deliver value to the end user. The FA is not a business SME, and not necessarily a designer - though they could be. They are a technical person, a developer with the scars to prove it. They work with the business to figure out how to deliver this end-user value. The FA also works with designers to iron out any usability issues that affect end-user value and can be fixed by enhancing the look and feel. And they work with the rest of the developers to keep front end components re-usable and phase-in friendly.

Do you have an FA on your project/in your organization?

Friday, May 28, 2010

Chefs, Curators, & Developers

Besides the fact that these people all eat, sleep, and line up for the latest iStuff, developers ought to have another thing in common with chefs and curators. I would say software development is part engineering/science and part creativity - maybe even as high as 50/50, or a little more.

Curators

The term "custodian of a collection" doesn't describe what curators really do. It almost makes it sound as if they just take care of a collection, but don't necessarily care for it. A museum without a curator simply turns into a warehouse. It is the curator's job to prevent that from happening - to care for the collection and keep an eye on the big picture.
Every piece, big or small, serves a purpose and delivers an essential part of the experience. The opposite of a curator is a hoarder.

Chefs

Similarly, the chef does not just cook the food; they keep an eye on how the items on their menu work together, how the dishes enhance the experience, and how every ingredient in a dish works with the others. Every detail counts, and great head chefs will look after it all. They are food curators.
Museums, food, and software are made better by what has been left out - purposely - not by what's included.

How do they do it?

Great museums don't happen overnight; they happen over time - time spent doing the same thing: improving the museum. The task is never done. It's iterative, and perpetual. Curators aren't afraid of tossing things out because they don't work, or because something better came along. Curators are careful not to hoard shit, even if it's all good shit. And they don't go about it alone; they get feedback from other curators, from their clients, and finally from their gut and others' guts. Your unconscious brain has a lot to do with this task; sometimes it is hard to explain, but often your gut is on target.

Lessons from Chef Ramsay

The number one thing Gordon does to fix a failing restaurant is "cut the crap" - after he is done ripping through the owners, of course. If you watch the show, you have probably seen him throw out half the menu - if not all of it in some cases. Take the "Curry Lounge" episode: tons of curry-based dishes, and none of the waiters could guess what any dish was during the blindfold taste test - except for the french fries; at an authentic Indian restaurant. Chef Ramsay tweaks and iterates over the food, and how it's made and served, over the course of the week. That's his formula. That is why he is a great head chef.

Don't be scared of "No"

There are many good reasons for saying "No". There are also wrong times to say it. Say you are at a restaurant and you ask the chef to skip the pistachios on your baked chicken because of allergies. That is a valid request you cannot say "No" to. On the other hand, don't be surprised if the chef says "No" when you're at an authentic Italian pizzeria and ask for extra cheese on your Pizza Terreno - he knows better than you. A good way to say "No" is to always provide reasons and alternatives. Most clients will consider alternatives; after all, they are paying you to provide the best possible service you can afford them. Sadly, you don't always win - sometimes you are just wrong, you lose the battle, and you have to provide exactly what you were asked for.

"But that is not how we do things around here"

There is always room for improvement, and there are always areas and times where you can play the role of the curator in what you do. Always start with the things you control. What if you worked at a hypothetical restaurant that followed the Waterfall model? Absurd, right? But at every point in the process a curator could help, and a curator can make the experience better - even a little. This Waterfall restaurant basically takes orders at 5pm; by the time they are validated, cooking starts at 8pm. Then the dishes get checked, fixed, and reheated. Food starts to come out at midnight, and into the wee hours of the morning the cooks have gone mad trying to figure out what was ordered eight hours ago. Every single role in this kitchen can play the curator in what they are doing. As bad as the experience is, it's much worse if everybody minds their own business and the blind blindly follow the blind.

Kitchens get dirty one dish at a time

"Too many chefs spoil the soup" - true, and it applies everywhere. Why else do the Marines and special commandos operate in small, tightly knit groups? Do you seriously believe that anything will need an army of developers, when the army's elites can perform heroic missions with a team of 15?
Don't let your kitchen get dirty. It's easy to say you will clean it later, but know that the effort to clean it does not have a linear relationship with how long you left it dirty.

Everybody can be a curator

Whatever business you are in, you are involved in curating and improving the product or service you provide. Whether you design buildings, write software, author books, make sales, run a country, or command an army - we all do it. We all know that the first draft of an essay is always the worst. When you walk into a boardroom to make a pitch, how many times do you revise and tweak that slide deck? How many times do you do it the night before? Or even 30 minutes prior? We all do it.

So here is the big question: when did we decide that code should be written once and be done with? What tends to happen with development teams is that this task is not performed as often as it should be - especially as the team grows. It gets worse when dates start to slip, as things like continuous improvement and cleaning up the kitchen are ditched. Here is the other problem: managers, executives, planners, etc. don't like to see tasks without end dates on the plan. They want to know things like effort spent, time elapsed, and percentages for each. An iterative task only has elapsed time; there is nothing to identify how much is left, because it's never complete. Yet the effects of this curating task are phenomenal. You will almost always be amazed by how little you changed - or, even better, removed - and how substantial the improvement was.

Making something better is never easy

...but the harder the decision, the better shape that thing is in. To be successful as a curator you really need to understand what your customer wants. Not just a little - you need to get in their head and figure shit out. You need to figure out how you can add value, or whether you just can't. Value-add is highly dependent on context. If I am in a rush, fast food is the best value for me at that time: I'm hungry and need a quick solution. On the other hand, you wouldn't take your wife there on your 10th anniversary. The best way to bring yourself to a point where you understand how you can add value is by aiming for the simplicity on the other side of complexity. If you are looking for a good, healthy relationship, you need to invest the time up front; you need to climb that complexity hill. Sometimes it's steep, sometimes it's long, but rest assured it will flatten out after the peak. That's where you want to be. Not many people make it to the other side; most will be hanging around at the base or on the way up. That region is highly competitive. The elite don't hang out there. Take Apple, for example. We can consider Apple one of the elites of the tech industry - almost like they have the Midas touch. People have tried to replicate the iPod with little success. This week, slowly but surely, Apple wrestled the second-highest market cap from Microsoft - without even dominating the market. They know their customers, and they don't even want everyone to be their customer.

Everything looks great on paper; it's not until it comes to life that you start spotting glitches and things to improve. Good restaurants will shut down for a day occasionally just to cook the whole menu so their staff can critique everything and make adjustments. Iteration is key. In school you are taught to review your essays, read them out loud, tweak and adjust the paragraphs, and shuffle things around, all to find the essence and flow of what you are trying to accomplish.

There are only two kinds of developers.

Those that just get shit done - literally - and those that will get it done very well. From a timeline point of view, you won't be able to tell the difference: both finish on time, and their stuff will work. However, the first kind don't care about what they are building or who they are building it with; they put in their hours, get paid, and that is it. Excellent developers don't have to be geniuses or come with Master's degrees and PhDs. They are the ones that want to be proud of what they are building. They are the ones that think several steps ahead and design for change. They will embrace change if it makes what they are building better, and they will resist change that adds no value. They are the ones that keep an eye on the big picture. They are the ones that will invest the extra time to refactor the redundant classes they spotted. They are the ones that experiment and try stuff out because they are always on the lookout for a better option. They are the ones that will tell you, "No, you shouldn't do that." That is how you separate the two. An excellent developer will wave their hands and yell when something is just not right. The first kind will shrug their shoulders and pile on the dishes.

Monday, May 24, 2010

The Mythical Man-Month

I think it was in first-year software engineering that we had to read this book, and nine years later I really, really understand its underlying purpose. It may have seemed like just another book back then, but the lessons hopefully learned from it will last a lifetime - not just for software projects, but for any project.

The Mythical Man-Month

Unfortunately, on software project plans, developers, designers, testers, business analysts, product managers, etc. are considered "just another resource" to be added to and removed from tasks. The assumption is that all are equally effective and skilled in all the required domains, and that all will produce the same volume and quality. In a perfect world it would make sense to scale the team to meet deadlines; we don't live in a perfect, linear world, yet this is still the method of choice - even though the biggest side effect of this method, a drastic, non-linear increase in communication overhead, is widely known and mostly ignored.

Sadly this is the state of the industry: project plans that are too often disconnected from reality. I think part of the problem is driven by dividing tasks into units of time; after all, that is how budgets are built, teams are put together, and progress is tracked. However, a unit of time does not measure the real size of the task. It's just an illusion. Take a building: we don't measure a building's size by the number of months it took to build, we measure it by the number of floors or in meters, i.e. something relevant and real. If a 40-meter building was estimated to take 12 months, and at 6 months we are at 10 meters, then we are 25% done, not 50%. If this building were a software project, though, it would be assumed we are 50% complete. Then at 10 months we realize we won't meet the deadline, forget about the Mythical Man-Month, and scale up. Why did this happen? Because software does not have a realistic metric; software is abstract.
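The building arithmetic above can be sketched in a few lines of Python - a toy illustration only, with the same made-up numbers as the example:

```python
def percent_complete(done, total):
    """Progress as a percentage of some measurable quantity."""
    return 100.0 * done / total

# The plan: a 40-meter building, estimated at 12 months.
planned_meters, planned_months = 40, 12

# The reality check at the 6-month mark: only 10 meters built.
by_time = percent_complete(6, planned_months)     # what the schedule implies
by_metric = percent_complete(10, planned_meters)  # what the building says

print(f"By elapsed time: {by_time:.0f}% complete")  # 50%
print(f"By real metric:  {by_metric:.0f}% complete")  # 25%
```

The two numbers only agree when progress happens to be perfectly linear in time, which is exactly the assumption software plans make and buildings refute.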

Some say you get better at estimating over time, but that too assumes we live in a linear world. For example: project X took us 3 months, so we estimate that project Y will take 9 months. We don't live in a linear world, and humans aren't good at estimating non-linear things. We may think the second project is 3x as big as the first, but the effort could grow as x^2. The hope is that over time you can make a non-linear project more linear by improving the processes around the non-linear components. That effort is also non-linear.
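To make the gap concrete, here is the same hypothetical in code: project X took 3 months, project Y looks "3x the size", and we compare a linear extrapolation against effort that grows with the square of size. The numbers are invented for illustration:

```python
base_months = 3   # project X actually took this long
size_factor = 3   # project Y is judged to be "3x the size"

# The linear rule of thumb scales time directly with size.
linear_estimate = base_months * size_factor        # 9 months

# If effort really grows as size squared, the estimate is off by 3x.
quadratic_actual = base_months * size_factor ** 2  # 27 months

print(linear_estimate, quadratic_actual)
```

The point is not the specific exponent; it is that a small error in the assumed growth curve dwarfs any error in the base measurement.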

You can try to measure by team size, effort, lines of code, etc., but all are just illusions of measurement; none are real. Whether you measure buildings by floors or by meters, you can translate between the two. On the other hand, you can't translate lines of code into time, effort or team size.

I don't know what a better alternative is, but surely it is not this. Perhaps the problem is just trying to estimate that far into the future with too many unknown variables. Feel free to comment.

I recently read the book "ReWork" by the guys at 37signals, and one paragraph I absolutely loved has to do with project estimation.


The book is a must read.


Tuesday, April 20, 2010

Crash 'n' Burn: The 11th hour for Flash

Adobe's rhetoric continues after the curve ball Apple threw. The whining continues with this post: On Adobe, Flash CS5 and iPhone Applications.

Sadly, the whining doesn't change anything, and Adobe's argument would have been more valid if they weren't trying to lock developers into Flash/Flex, and if Flash were really open. Also, I think Adobe's Flash/Flex tools favor developing with ColdFusion on the server side... you can use other server-side technologies, but I believe the tools "play" better with ColdFusion.

Apple's decision makes 100% business sense to me. They're advocating for their own platform, or open standards. Just like Adobe advocates for their own platforms, or open standards. What's wrong with that?

Flash filled a void in the 90s, but where is that void today? Is it even still needed? Yes, it's far superior technology, but it's a closed technology. And to think that Android will succeed because it has Flash is just absurd. Android could be the iPhone's real challenger ONLY because it is open. The above post also seems to confuse "open" with "cross-platform". They're very different. Flash is cross-platform precisely because it's not open.

Flash needs something different right now. We don't need Flash to deliver rich content online anymore. We don't need Flash to deliver sexy fonts. We don't need Flash to scroll and fade text. Soon we won't need Flash to play video - my YouTube embed below is still in Flash. We don't need navigation built in Flash. So much of what we needed Flash for (rightly or wrongly) is just not needed today.

On to Flex, Flash's younger cousin. We - the majority - don't need that either. Slowly but surely, applications will move to the web. They may have some Flash components, but those could now just as easily be done in HTML5, or even plain HTML and some nifty JavaScript. Where I can see Flex fitting is in extremely specialized software, such as CAD or medical imaging. Such software is expensive and time-consuming to write, and would be a pain to port to different operating systems. It also comes with heavy visualization, so it's a good fit for Flash. Maybe that's where Flash will head, who knows? But there is hardly any room left today for Flash on the web.

This song is dedicated to Adobe Flash. I don't know who your savior will be, but you really need one right now... bad.

Wednesday, March 03, 2010

Synchronized Cardioversion is print media's only chance

I just watched an interesting video posted on Mitch Joel's blog under the post "Print is Not Dead". The video is an attempt by some print magazine people to convince the rest of us that print magazines will still be around. Will they? 5 years from now? 10 years from now? I'm betting on no.

This video is exactly the problem with the current print industry. They are still in denial. And why wouldn't they be? It is not easy to think about one technology killing another - especially when you work in the technology on the receiving end. Besides, what else could they have said? "Yes, we are facing some challenges, but we are working on revolutionizing our model"? Yes, that would work well for the advertising bottom line.

I absolutely agree with Mitch about the false thinking that Twitter or Facebook are challengers to print magazines. Those are not the technologies that will replace print. I'm surprised they didn't even mention mobile devices. The iPad? Any other tablet? Hello? However, I disagree with the title.

On another blog, linked from Joel's, I read the following comment by "Jen":
If this story is best told in print, why are they telling it on a YouTube video?
Absolutely! Why? These print executive big wigs ought to reply.

So, why will print die? And by print I mean almost all variations of print media: national newspapers, books, national magazines, etc. I say national because I still see some room for the local, neighborhood type of magazine or newspaper. Those that serve the immediate community around them will still have a niche to fill.

  1. It's not portable. Yes, yes, I can carry it in my bag man-purse, but I have to remember to take it with me, and I have to pick up the right one - the one with the article I wanted to read on the train. So no, it is not portable.
  2. It's not searchable. You just can't do it with print. Gutenberg did not envision this when he built the printing press. Back then, the problem was copying and distributing books, not indexing them, since there was no need for that. Unfortunately, these magazine executives are still stuck in the Gutenberg era.
  3. I can't share it, I can't tweet it, I can't start a public discussion about it, and I can't refer to it later. Say I read an article in the latest Times. How do I share it? Well, I can lend the magazine out. My dad used to cut out the articles he was interested in and stick them in his own 1980s version of Instapaper, i.e. a filing cabinet on the living room floor. But then, back to number (2), he couldn't search it - though he did have an indexing system that only he knew how to use. My dad would then photocopy a snippet and share it with whomever. Times have changed, and today's technologies simply do what generations before us have done, but better. I'm sure 10 years from now my beloved iPhone and Instapaper will be prehistoric, so why is it hard to say that print is prehistoric? It's been around for hundreds of years.
  4. It's not interactive. This is probably one of the few things I have seen the print industry pay attention to. CBS apparently ran a video ad in a magazine in the fall of 2009, and Esquire published an augmented-reality edition of their magazine. I don't know how well these were received. Did they work? Will they do it again? Who knows? But we know they tried to bring interactive material to a magazine. If you can say it with an interactive video, why say it only in words or pictures?
  5. It doesn't know who I am. The last two points are close to my heart. I am an evangelist for personalized and location-aware anything. In fact, I would be very disappointed with technology if these two points don't become as mainstream as a coffee lineup at Tim Hortons within 20 years. Serving ads blindly is retarded, just like crossing the street blindfolded is retarded. But if you are being chased by Michael Myers through downtown Toronto and you happen to be blindfolded and can't take it off, then sure, I guess you have to cross the street blind. This is the state of print advertisement. I'm sure careful consideration is taken to match ads with content, but there is only so much you can do when you know very little about who is actually reading.
  6. It doesn't know where I am. This is an even bigger point for me. However, this one is not that far off from being realized - I think. The iPad, whether it flops or succeeds, will give this market the synchronized cardioversion it has been asking for. Ads will target me based on where I am. Advertisers will push their content to my device, and I will only see it if it matches certain criteria about me: my current location or whereabouts, the time of day, the weather, where I have been going, etc. In the crazy world between my ears, I can subscribe to content I willingly publish on the Twitters, Facebooks, Foursquares, Flickrs and blogs of the world, and be presented with ads that match what I publish. If Foursquare shows I am bouncing between Cooksville and Union Station every day, maybe I'll see a GO Train ad when I'm reading an NYT article on an iPad. Or maybe when my kid loses their new cell phone and I go Kung Fu Panda on Twitter, I'll see an ad for someone selling the same phone on Kijiji - with the same serial number - and then an ad for a new set of butcher knives on sale.

Thursday, February 18, 2010

why not Microsoft (Part 2 of many)

Round 2 of this series. I'm about to gut this video and turn it inside out. Kids, cover your eyes, this will get ugly.

  • Delegation of mail and calendar. Once upon a time, in a world where dinosaurs roamed freely on this Earth, there was a need for this. (Okay, it wasn't that long ago, but you get the point.) Today, however, I don't think it is needed anymore. You get an e-mail invite, and you can quickly accept or reject it, find out if it conflicts with other meetings, etc. The need to delegate your mail and calendar is now obsolete. There is an exception to everything, and some executives may still need this, but come on, how many of those executives are out there? This point is moot at best.
  • Folders or labels? That is the question. Yes, you can't create folders, because the whole concept of organizing mail in folders is from that 'dinosaur era' I mentioned above. The problem with folders is that they're one-to-one. How do I file an e-mail as both "High priority, I'll lose my job if I don't get this sorted out ASAP" and "this is for project X"? Then, after completing the task, I want to tag it with my "followup" label so I remember to check in a month later - without losing the other tags. With folders, I haven't found a way. Once upon a time, when you got physical mail you had to file it somewhere, and you could only file it once. E-mail is different, and if you haven't figured that out yet, then yes, you should get an assistant.
  • Full corporate directory and contact delegation. Okay, you lost me here. There is an address book, and I can search for people in it. That's all most of us need.
  • Folks, it's called Gmail, not GRemoteWipePhone. If that is available via Outlook, it shouldn't be. Remember point #3 from the first part? A costly excess of tools for people who don't use them.
  • Can't manage conference rooms. These guys must be looking at the regular, public Gmail and not the corporate Google Apps. You can manage conference rooms and schedules. AND double booking still happens on Outlook. Don't blame the tools.
  • Sure, let's say you do need to install all this stuff to hook up Outlook with Gmail. Sounds like FUD to me, but for the sake of argument I'll accept it. The problem here is really Outlook. The world is moving to web-based applications; keep up. You only need a browser if you use the web interface, which, by the way, does not contain a costly excess of tools for people who don't use them. On the other hand, if you are obsessed with Outlook or absolutely need it - for whatever reason - then perhaps Google Apps and Gmail are not for you.
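The folders-versus-labels point above is really a data-modeling one: a folder is a one-to-one mapping, a label is many-to-many. A minimal sketch (message IDs and label names are made up):

```python
folders = {}  # message -> a single folder (one-to-one)
labels = {}   # message -> a set of labels (many-to-many)

def file_in_folder(msg, folder):
    folders[msg] = folder  # refiling overwrites the previous folder

def add_label(msg, label):
    labels.setdefault(msg, set()).add(label)  # labels accumulate

# With folders, the second filing destroys the first.
file_in_folder("urgent-email", "Project X")
file_in_folder("urgent-email", "High priority")  # "Project X" is gone

# With labels, all three tags coexist on the same message.
add_label("urgent-email", "Project X")
add_label("urgent-email", "High priority")
add_label("urgent-email", "followup")

print(folders["urgent-email"])         # only the last folder survives
print(sorted(labels["urgent-email"]))  # all three labels survive
```

That loss of the earlier filing is exactly the "I don't want to lose the other tags" problem folders can't solve.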
Judge for yourself.




Tuesday, February 16, 2010

Why not Microsoft (Part 1 of many)

I just saw this Microsoft channel on YouTube comparing Microsoft's solutions with Google's. I couldn't resist writing this up.

  1. If I need to mix different features in a blender, then it's not easy to manage. In fact, it could very well turn out messy.
  2. "Documents zip across back and forth without a hitch". Okay, but what usually happens when you have documents zipping back and forth, and back and forth, and back and forth...? Think turning the blender on without closing the lid.
    I'm also not sure why the diagram above shows a "phone" between a "PC" and a "browser". Are you seriously telling me to e-mail my 40MB PowerPoint deck over dialup? Back and forth... back and forth... back and forth?
  3. "A costly excess of tools for people who don't use them". I just don't understand how the fool who approved this could have done so while keeping a straight face. Aren't MS Word and Outlook bloated beyond belief with tools that most people never use?

Here's the video. Judge for yourself.


Saturday, January 30, 2010

On the rise of social eCommerce

Deborah Collier published her five predictions for social e-commerce in 2010:
  1. Goodbye to the Middleman
  2. The Year of the Deliver Company
  3. Creative Sponsored Advertising
  4. Mobile Commerce Revolution
  5. Free Culture Frenzy
Her predictions couldn't be any more bang on. If these become a reality in 2010, then we are well on our way to reaching that stage of "Social eCommerce" in the first half of this decade.

As it stands today, our eCommerce networks are isolated, built-up silos. We have the Amazons, the eBays, the Facebooks, and the iTunes and App Stores, to name a few. Social eCommerce requires the imaginary walls surrounding these gardens to come down - and I do expect them to. The reason they will is that, as much revenue as these networks generate, there is still much more left on the table. We just need to reach out for it.

The elimination of the middleman is a big step. Online applications that have carved themselves a small niche of the market have risen. Not surprisingly, these bleeding-edge, creative and unique applications are not operated by large corporations, but by the John and Jane Does running them out of their home offices or even basements. In today's online world a business does not need to provide a whole lot of services and products to corner the market - in fact, focusing on your niche and a small set of products and services guarantees that you will provide better results. Amazon allows me to sell my books to other people, I can sell my old computer on eBay, or my music on iTunes. There is no middleman involved. I would say the middleman is mostly eliminated at this stage.

Delivery is an interesting service, as demand for it grows with the C2C markets, precisely because the middleman does not exist. We have yet to see a creative, bleeding-edge and unique process for delivery. It's a harder problem to tackle, but definitely still possible. I don't expect this advancement to come from the national postal services. It's tragic, but these creative solutions come out of the basements and dorm rooms of the world. The big corporations are too sluggish and paralyzed to move with the speed required for this sort of advancement.

Creative, seamless and relevant advertising is a personal interest of mine. The future of online ads in the social eCommerce phase will be heavily wired into the abundance of data on today's and tomorrow's social networks. We have some creative advertising solutions today, such as AdSense, which pushes ads based on the content of the page. But that was last decade's technology; the 2010s need something new that is even more seamless, more integrated, and finally more relevant to me. The only way I can see these advertising engines outdoing themselves is by personalizing these ads. The data to drive such personalization is present, just locked away in those individual silos.

Mobile. Mobile. Mobile. The iPhone has revolutionized this arena. When I went to high school, not all the kids had cell phones - a good chunk did, but not everybody. I didn't get my first cellphone until grade 10, I think. Similarly with university, at least in the earlier years. Slowly the mobile population grew, but at that time it was fairly uncommon to see a smartphone in the hands of a twenty-year-old. Today that is different. Now we spend more time with our iPhones, BlackBerrys and other smartphones than we do on our laptops or desktops. This is just another bundle of cash waiting for someone to reach out. Those who don't keep up with these trends will surely suffer. Generation Y is closing in on their 30s and 40s; these are the future customers, and they will naturally gravitate to those who provide such services.

The freemium model. Another prediction that is directly tied to Generation Y. Unlike the preceding generation, this generation expects to get basic features for free. This generation does not tolerate service charges and system access fees. I don't expect these old-fashioned models to last much longer. Take the service charge I pay to CIBC: what do I get in return for it? Absolutely nothing. They, on the other hand, get to invest my hard-earned money, make money off it, and then have the nerve to charge me a fee to take it out? They ought to be paying me a service charge! That model will change. I would, however, gladly pay a service charge for premium features in my online and mobile banking - and by premium I don't mean printing my paper statement on my screen. Rogers has slowly started introducing such free features to their customers: the "My Account" Rogers iPhone app to monitor my usage, free Rogers On Demand Online, and the ability to tag phone numbers with names on my online bill. Not rocket science, but an excellent step forward. Stop thinking of the web as e-paper.

In a perfect world, the data I publish on Facebook could generate sales on Amazon, which would provide recommendations from eBay along with tunes to match the occasion from iTunes, in a seamless experience wrapped with personalized ads from Google.

Unfortunately, we don't live in a perfect world, but these visions and ideas can't be that far-fetched. Then again, such ideas are a dime a dozen; what matters is how they are executed, not who executes them first.

Thursday, January 28, 2010

iPad: Same Content, Different Form

After I got past the unfortunate branding fiasco of the latest and greatest from Apple, I can't wait to first get my hands on one (like millions of others) and then maybe even own one (like millions of others will).

There are many pros and cons to this new device, just like there were many pros and cons to the first iPhone. When the first iPhone came out, all I thought was "Why?!" Soon, however, I realized it was "because it's Apple", and over the years - after the old colourful clamshell laptops were discontinued and the new generation of PowerBooks came out, followed by the MacBooks - we became used to the idea that the latest and greatest comes first from Apple.

Back to these iPaddies... first, I think they are a completely different breed of device, different from the Kindles. As far as I know, I can't have my pictures on a Kindle, nor can I browse the Internet, listen to music, play games, or send and receive e-mail - pretty much everything you can do on your iPhone except make phone calls. Sure, you may ask yourself, "So what? I can do that on my laptop," but there is a difference. Laptops are much bigger, heavier, and do far more than you may want to do lying on the couch.

The Kindle is an eBook/eNewspaper/blog reader; that's about it. If you have an iPhone, can you deny spending hours just fiddling around with it - browsing, reading, tweeting, listening to music? Probably not. Now you can do the same on a bigger screen, so what is the reason behind this public outcry?

Here's why I think that is. We - Apple's customers - have become spoiled. We always expect Apple to release the latest and greatest, and we expect it to blow our minds. For the most part, the majority seem undecided, equally amazed and frustrated by the limitations. People expected a full-fledged tablet/laptop, but it couldn't have been one. All the tablet PCs before it have failed. Why should Apple attempt to go down that path? And on top of that history, why should Apple create a product that would cannibalize their MacBook sales? Steve's diagram clearly set the expectation: it will be better than an iPhone without being an iPhone, but it also won't be a MacBook. It's a digital content reader.

The fact is, it's just a new medium for content delivery, and it may be the holy grail for the newspaper industry. People don't pay for content; you can't own the content, but you can own the medium you purchased it on. Take The Da Vinci Code, for example. I bought the book, I watched the movie, and I bought the DVD - others may have also bought the eBook. Each form has its benefits, and each form has its "expiry date". After reading the book once, I probably won't read it again, but what if I want to refer to a chapter later? How do I do that? How do I find it? Will I even have the book on me?

You have a dinner party at home, and you want to share some photos with your guests. Either you load them up on your computer, or you plug your camera into your TV and show them there. The content is exactly the same; the form is different. Showing them on your big screen is more convenient, but you need to find that silly cord first - which you can never find when you need it. You can huddle around your computer or laptop, but that's less convenient, and even less convenient is passing or rotating that laptop around. Oh wait, that iPad is on your coffee table. Same content. Different form.

Now we just wait and see what happens between now and the launch. The use cases are endless, and they are not limited to content consumption. I definitely see use cases in education, healthcare, collaboration, and obviously entertainment. The Kindle and other eBook readers just don't have this reach.

Sunday, January 24, 2010

The agile kitchen

Last year I posted about why successful restaurant kitchens are more agile than many software "kitchens". Tonight I'm posting more on this topic.

A good software developer is disciplined, just like a good chef is disciplined. Both will put in extra effort to make that application or dish hit the sweet spot - nom nom nom. Both will cook up things that add value, not just things they can cook. The average chef will say "yes" to every customer request; the good chef will use common sense and say "no" to some, i.e. "No, I will not put extra sauce in your lasagna, because blah blah blah" - after all, this chef knows more than me and I trust their judgement. Certain requests are accepted right away, no discussion, like "Sure, I'll hold the nuts."

Now imagine the restaurant owner. They're not a chef; they're a manager. At the end of the day, they are concerned with making money; after all, it is a business, and they didn't open this restaurant just to provide this excellent chef with a job. If the customer wants extra sauce, the manager's response will most likely be "Sure, that's X dollars extra" - the famous "change request", but that's another day's topic.

A Waterfall kitchen

Consider a "waterfall kitchen". The manager needs to know exactly how many people are seated, what each of them is ordering, how complex each order is, and how many chefs are in the back before scheduling any one order. In reality, most kitchens don't operate that way, because you can't keep everybody waiting until all the orders have been taken. When the orders finally come out, some are cold from waiting so long and get returned for reheating. While reheating, the chef notices the broccoli doesn't look fresh anymore and needs to boil a new batch. Other orders come back because they were incorrect. The chefs held the nuts on the wrong order, and now a diner is choking outside. All the chefs are waving their hands over their heads insisting it wasn't their screw-up, and the manager is looking at the wait staff to find someone to blame. The wait staff can't remember the details of an order they took 5 hours ago, or that the customer mentioned a nut allergy 15 minutes after ordering. The chefs say they cooked what was on the ticket...

So, clearly this "waterfall kitchen" can't work - why would it? The other problem with this kitchen is that by nature it resists change. Because all the dishes are cooked at the same time, the head chef inspects them at the same time, and the waiters serve them at the same time, a change will wreak havoc.

But when can it work? That model works when food is delivered. A catering service can't send each dish out on its own - obviously that won't work. There are only two trucks and two parties to cater, so all the food for one party needs to be delivered together to save delivery time, gas, etc. As for change, the model still works here because there is usually a variety of food, and if you are a vegetarian or allergic to nuts, the responsibility falls on your host to have something available for you - not on the caterer or their kitchen.

Anyway, I'll leave you with Chef Ramsay now and one of my favorite episodes of Kitchen Nightmares - the case of the authentic Indian restaurant that served french fries and do-it-yourself curry...

Kitchen Nightmares - The Curry Lounge (sorry, embedding is disabled for this video, and I can't find one that can be embedded...). My favorite part starts at 5:40.

Sunday, January 03, 2010

Two Thousand & Ten

I'm going to go a little off track with this post. I'm not going to talk about technology, Google Maps, web applications, software, Photoshop, SEO, how-tos, or any other "geeky" stuff. It's the start of a new year and a new decade, and so I'm going to post about something different.

All in all, 2009 was an interesting year and a good end to the decade. 2009 started out with a merger between the company I first joined after completing my masters - Pentura Solutions - and Thinknostic, another company based out of Ottawa. Both companies have an interesting past: both are "re-births" of a previous company, Montage. I heard that name many times over the past 3 years, and many of my colleagues today were part of Montage. This new company is known as ThinkWrap Solutions.

My start with Pentura was interesting. I was first interviewed in December 2005, and besides getting lost finding the office and arriving late, it went rather well, I think. Unfortunately, at the time they were looking for someone to start right away, and I still had 6 months to go to finish my degree. Also, being an international student, I couldn't just start working part-time without the necessary immigration papers. Fast forward to July 2007: I had completed a masters in computational engineering and thought of e-mailing Marc (the owner who interviewed me in 2005). Fortunately, this time they were hiring, and I started that month. And that is my beginning with this great company, ThinkWrap. It's a great place to work, filled with talented people to work with and learn from. I think I'm one of the lucky ones who got a great job and a great place to start a career straight out of university. The funniest part is that there is quite the McMaster alumni population at ThinkWrap - though our ratio got a little diluted after the merger. :)

2009 was also the year I finalized my immigration status in Canada and became a permanent resident. What does this mean for me? It means I have almost the same rights as a citizen, minus voting. I hear many Canadians don't vote anyway, so there you go. It means I don't need to apply for another study or work permit to stay in this great place called Canada. And in two years I get to apply for citizenship - then I'll vote for sure! I've been here since 2001 on numerous study and work permits. I really believe that over the years of dealing with immigration, renewing papers, etc., I have gathered enough experience to pass for an average immigration lawyer - not that I would ever want to do it again, but my success rate is 100% : )
For 9 years I did it all myself and haven't paid a penny to a lawyer to do it for me. Honestly, I never saw a reason to; the documentation is there for you to read, and the forms are online for you to get. Do it yourself.
However, I do have to thank my student advisor at McMaster, my bosses at my jobs at McMaster, and the guys and gals at Pentura and ThinkWrap for being there when I needed a signature or a letter, and my friends for maintaining my sanity. Without all these people, it wouldn't have been possible. Thank you.

This decade is also an important one because I graduated from high school in 2001 and from McMaster in 2006 and 2008. For that I have to thank my parents first, for sending me halfway across the world to study and supporting me throughout. I used to live in Abu Dhabi, an hour or so away from Dubai, U.A.E. - yes, it's a long way from here. And second, all my friends; without them I wouldn't have that many memories of this decade. Finally, as this decade ends, it means 2011 will be my 10-year high school reunion.

Finally, this decade I also met somebody special, 5 years ago. She bought me ice cream after a hiccup with my study permit. Now that I'm a permanent resident, she doesn't need to buy me ice cream anymore, but I'm still grateful to have her in my life. Thank you, Carmen.

I'm a big Italian soccer fan, and when Italy is playing I'll be there cheering. This decade included the year Italy became champions at the 2006 World Cup. The last time they did that was the year before I was born - '82.

Y2K. Hey, we survived what was hyped as doomsday. I think the only reason cell phones, landlines and even the Internet hiccupped that night was all the people checking whether we were still "online".

Here's to a great past decade; on to the next one. I just keep wondering: how will people refer to the past decade? "The twenty zeros" just does not sound cool enough.