Wednesday, June 27, 2007

Speaking to your Geek -- Bridging the communication gap between business and IT.

OK, it's a common cliché: the socially awkward yet arrogant and slightly insulting IT guy clashing with the brainless, CD-tray-as-cup-holder business user. In reality, most IT staff don't resemble The Simpsons' Comic Book Guy, and most end users are fairly knowledgeable when it comes to their desktops and office tools.

However, every day I witness exchanges that highlight the fact that, although we are all becoming more technologically savvy, there is still a divide in perception between those who work in IT and those who don't.

Consider this exchange:

Business: We need you to build an AJAX web application. It should allow customers to create a login and, depending on who they are, pull different stuff from a CRM system. It might also need to pull information from our vendors' sites. It will also need to let the users make some changes to the information that they see.

IT: Well, why do you want it?

Business: (slightly annoyed) It doesn't matter why; Marketing has requested it.

IT: What vendor systems do we need to interact with, and what databases are they running?

Business: We aren't sure yet. It will depend on which vendors join our cross-marketing program. Look, I just need to let the Marketing guys know if we can do it and get some ballpark costs.

IT: When do you need it?

Business: As soon as possible

IT: How big is the budget?

Business: It's not set yet; we are really looking to you to give us a sense of what this thing is going to cost. So what do you think, can you do it?

IT: Sure, I'll need a year and a million dollars.

Business: Why are you making this so difficult?

Clearly, this is a bit of theatre, but I want to use it as a means to illustrate some common challenges that business users have when communicating with IT, and why the person on the other side seems so difficult.

One of the first problems is what a friend of mine referred to as "solutioning." That is, asking for a specific solution before you outline the problem. You may be reading a lot about Web 2.0 technologies, but AJAX may not be the correct solution to your specific problem, or it may not work within your current environment.

A co-worker of mine often responds to solutioning by asking the question that gets people's backs up: "Why do you want it?" This question is often misinterpreted as a challenge to the asker. In reality, what he's trying to ascertain is the business requirement behind the request. I've told my buddy the question would be better restated as "What specific business needs are you trying to address?"

When trying to describe a requirement, don't worry so much about the how; focus on the what. For example:

What business need(s) should the solution address?
Who will need to access the solution?
What special requirements must be met to satisfy the business issues (for example, will different users need different levels of access)?
What existing systems will the solution need to interact with?

It is important to remember that IT systems are often interrelated or interdependent and your request may have implications that you are completely unaware of.

The next issue is specificity. All too often I am asked to "ball-park" a solution on less than complete information. When presented with this, I've seen many IT people default to one of two typical responses: either heavily buffer the scope/costs to offset the unknowns, or make assumptions about what the client requires.

In both cases, the result is often friction. In the first case, the resulting ball-park figure will greatly exceed expectations (and available budgets). In the second, the clients are often unhappy because the resulting solution is based on incorrect assumptions and needs considerable retooling (and expense) to meet the actual demand.

Remember, you would never go to a home builder and ask for "something largish with marble in the kitchen and red brick." You would be specific and go through the plans thoroughly before even breaking ground.

As a tactic in this situation, I often artificially define boundaries and scope within them. I then detail my assumptions and exclusions in the scope document and, while reviewing the scope with the client, confirm that I have not made any incorrect assumptions.

Three last thoughts to consider when working with IT on a project:

1) Simple is complex. The easier an application needs to be for your users, the more work a developer must do on the back end to make things seem simple. Driving a car is as simple as turning the key and putting the car in gear, but only because a lot of complex moving parts under the hood hide the complexity from you.

2) Minor changes are not always minor. You may ask for what you think is a simple change, but changing one element of a system can often have a cascading effect. You might just want to add one more field to a newsletter sign-up page. However, that new field must live somewhere, which means a new column in a table in the database. If that table is accessed by other reports or tools, it's quite likely that those must also be changed, and so on (see the sketch after this list).

3) You may want it complex, fast and on budget, but you will likely only get two of the three. Adding to scope adds time and cost. If you have a complex scope and a fixed timeline, more resources will need to be brought to bear to complete the job on time. If your budget doesn't account for the extra bodies, you will have a problem.
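
To make point 2 concrete, here is a minimal T-SQL sketch of what that "one more field" can involve behind the scenes. The table, view and column names are hypothetical, purely for illustration, and assume the sign-up page is backed by a SQL Server database:

-- The "simple" request: add one more field to the newsletter sign-up table
ALTER TABLE dbo.NewsletterSignup
    ADD PreferredLanguage nvarchar(10) NULL;
GO

-- But the reporting view built on that table now has to change as well...
ALTER VIEW dbo.vw_NewsletterSignupReport
AS
    SELECT Email, SignupDate, PreferredLanguage
    FROM dbo.NewsletterSignup;
GO

-- ...and so do any exports, reports or other tools that read from the view.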

When business groups and IT come together, the business needs to understand that there are complexities it might not realize, and the IT group needs to better communicate why it needs the information it is requesting.

Finally, remember the advice from the back of The Hitchhiker's Guide to the Galaxy: Don't Panic.



Monday, June 18, 2007

Practicing what I preach

There are lots of reasons to start a blog. I started this one in the hopes of sharing what I'm learning with a larger community of people trying to use the Internet to help them run a business. Additionally, I'm trying to raise my profile a bit in the already crowded field of technical experts.

As a tangential benefit, I'm also using it as a test bed for all the technologies that I talk about in my daily work world. This post is one such attempt, specifically at getting a listing on Technorati. I'll let you all know how it goes.

Technorati Profile

Tuesday, June 12, 2007

Geek fun with DDL Triggers

This posting first appeared in an earlier blog that I have now replaced with this current opus. It's a bit more technical than most of what I am putting in this blog, but I'm still a database geek at heart and I wanted to move it over before I wrapped up the earlier blog. I hope no one takes offence at this bit of self-plagiarism.

With all the big changes in SQL Server 2005, I've been focusing on some small but significant ones. The first of these is Data Definition Language (DDL) triggers.

In a nutshell, SQL Server 2005 will allow you to set a trigger at the database level for a myriad of DDL actions (consult your friendly neighborhood Books Online for the full list).

I've chosen a particular application of this functionality here, one that solves a problem I had with earlier SQL Server versions -- table management.

The problem goes like this: suppose you, as the DBA, notice that a table has been deleted. You go into the SQL Server logs but find no reference to the dropped table, nor can you find out who dropped it and when.

Now you can.

In a test database, I created a log table that looked like this:

CREATE TABLE [dbo].[DDL_Log] (
    [id] [int] IDENTITY(1,1) NOT NULL,
    [DDL_Event] [nvarchar](100) NOT NULL,
    [TSQL_code] [nvarchar](2000) NOT NULL,
    [ExecutedBy] [nvarchar](100) NOT NULL DEFAULT current_user,
    [Action_date] [datetime] NULL DEFAULT (getdate())
) ON [PRIMARY];

I then created a DDL trigger on the database using the following syntax:

CREATE TRIGGER [log_table_actions]
ON DATABASE
FOR DROP_TABLE, ALTER_TABLE, CREATE_TABLE
AS
BEGIN

    -- EVENTDATA() returns an XML document describing the DDL statement
    DECLARE @data XML;
    SET @data = EVENTDATA();

    INSERT dbo.DDL_Log (DDL_Event, TSQL_code, ExecutedBy, Action_date)
    VALUES (@data.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)'),
            @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'nvarchar(2000)'),
            CONVERT(nvarchar(100), CURRENT_USER),
            GETDATE());

END;

A few things will jump out immediately. The first is that the trigger is created "ON DATABASE" rather than on a table; the second is that it fires on all CREATE, ALTER and DROP TABLE statements. Finally, it uses the EVENTDATA() function to query information about the DDL statement that called the trigger.

Please note that this function returns XML, so you'll need to capture the output inside an XML-typed variable and query the elements inside the EVENT_INSTANCE element to get the data. In this case, I've returned the event type and the full T-SQL code for the DDL statement. I'm still looking for the full schema for the EVENTDATA() XML document, but that will be a post for another day.
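
If you want to see the trigger in action, a quick test, assuming the DDL_Log table and trigger above already exist in your test database, looks something like this:

-- Create and drop a throwaway table; both statements should fire the trigger
CREATE TABLE dbo.Throwaway (id int);
DROP TABLE dbo.Throwaway;

-- Review what was captured
SELECT DDL_Event, TSQL_code, ExecutedBy, Action_date
FROM dbo.DDL_Log
ORDER BY id;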

Monday, June 11, 2007

Web Applications: Death of the desktop app? (Hint: It depends on who you ask)

Lately, I have been noticing more and more commentators writing and podcasting about web-based applications making desktop applications obsolete. The two most obvious examples cited are Salesforce.com and many of the Google applications like Gmail and Google Docs & Spreadsheets. Certainly, anyone watching what Google is up to will get this sense.

The idea is a sound one: write applications using web-based technology, and make them available on the Internet, where, hypothetically, they will run in any web-browser, regardless of what operating system is running in the background. Anyone who tries to find desktop apps to run on both Windows and a Mac will appreciate the thought behind this level of abstraction.


Does this trend spell the end of desktop applications? For my money the answer is, as is so often the case, probably, but not now. It has a lot of promise, but there are currently some major roadblocks to adoption.

Let's look at three major ones:

1. Connectivity

As I type this blog entry, I am using a web-based application which is, by most accounts, fairly robust. It allows me to format text, add hyperlinks (as you see above) and even finds my more egregious typos. It also has an auto-save function, which has saved me a few times. To use it, all I had to do was connect to a web site. Out of curiosity, I started this post on a Mac at the office and am now finishing it on my home PC.

However, I would have preferred to finish this post on my train ride home, but the lack of connectivity meant I was left working on a spreadsheet in Excel that was installed and saved on my hard drive.

Lack of universal connectivity is the first roadblock to the complete adoption of online applications. There are times when we are just not able to connect. Unfortunately, these periods of non-connectivity, like long flights or commuter train rides, are often the times when we are most likely to want to work.

2. Technical incompatibility

The second roadblock is the technology itself. Microsoft, which admittedly has a lot to lose from the global adoption of web-based applications like Google Spreadsheets, has been leveraging the .NET platform to drive the creation of web applications. It has also built a back-end infrastructure to support web apps. We are currently looking for a document management solution that both PC and Mac users can use to collaborate on documents. I was doing some early testing on SharePoint Portal Server and thought I had a winner. I was almost right. The Macs, using Firefox, could access the portal and browse to the document library; however, they were unable to check documents in and out. This functionality, of course, worked fine on the PC running Internet Explorer, but that defeats the entire purpose for the Mac users.

Now, it's easy to adopt the "Microsoft is evil" stance; however, as a developer of web-based applications, I know how hard it is to make things, even something as simple as an HTML email, render properly across different platforms. The OS makers have a vested interest in keeping their platforms distinct, and I don't see this changing any time soon. They also tune their back-end platforms to work most efficiently with their own technology.

3. Data Security

One final barrier to full adoption of web-based applications is one that I also struggle with on both sides -- the storage of data. The apps live on a web server somewhere in the Cloud that is the Internet. In many cases, your data will live in a database or file store connected to that web server, or so you hope.

The truth is, in many cases, you don't actually know where your data is and who has access to it. Most legitimate vendors, including my company, go to great lengths to secure data and set appropriate privacy and security policies, but once it's out of your network, you are at the mercy of someone else's systems and system administrators. So trust in your vendor becomes key.

I think the trend of moving applications online will continue. The rise of Software as a Service applications like Salesforce.com will push this field. Also, as Wi-Fi technology becomes more pervasive, and workforces become more remote and less tied to a cubicle, the reach of online applications will increase. In the meantime, find a vendor that you trust and look at the business needs driving the decision, but don't be afraid to make the leap.

Saturday, June 2, 2007

Microsoft Surface: Not close to home, but cool for business

Last week Microsoft released a preview of Microsoft Surface. I've been watching some cool demos of Surface and gestural-interface computing systems and prototypes on sites like Engadget and YouTube for a while, but it looks like MS has something that is almost ready for prime time.

What MS is showing is a 30-inch, table-like screen that can interact with devices like the Zune (which, as a Canadian, still does not exist in my world) and mobile devices.

I don't see this as a home item yet (although I'd love one right now), however, as a tool for marketers this is huge.

Take a look at the three flash videos on the Microsoft site (http://www.microsoft.com/surface).

My head is already filling with ideas for how retailers could apply this technology to their online initiatives. Now I need to see whom I can plead with at Microsoft to get early access to the technology. I'll keep you posted.

Friday, June 1, 2007

Some learning on successfully offshoring web development

My colleague, Gina Lijoi, recently commented in her blog on the challenges of outsourcing from a project management perspective. We have been using outsourcing for a number of years and, most recently, have been offshoring work to India. While this has caused a paradigm shift for the project management team, it has also created a new set of challenges on the technical development side. Our first forays into offshoring were less than smooth, but over time I've gained some learnings that make for successful offshore projects.

  1. Find a good partner – Just as with finding a local contractor, you need to know the people you work with offshore. The distance makes communication even more difficult, so you need to know that the people you are working with can understand what you need and do the job right. Miscommunication leads to incorrect work product, which can wreak havoc with your timelines (if code needs to be re-written) and/or jeopardize your client deliverables. Good communication skills and a strong leadership team are essential to making sure your projects run smoothly. If possible, look for a company that will give you access to the same staff each time. This will allow you to get used to each other and will give your offshore team a sense of ownership of your projects.
  2. Planning and documenting are key – I have a great team in-house. We've worked closely together, and I know that I could give any one of my guys a concept sketched on the back of a bar napkin and know with some certainty that they will deliver a solid product. You can't do this with an outsource partner, particularly one offshore. Your in-house staff knows your customers, the way you work and the kinds of projects you do. Don't assume this will easily translate to your offshore partner. You need to provide full documentation. We include a full system architecture package: a brief synopsis of the project, a requirements document, flow charts, ERDs and all relevant UML models (class diagrams, activity diagrams, etc.). We also include a test plan. The test plan is useful because your offshore partner can use it as a vision of the finished product. The more detail you can provide up front, the better the outcome will be.
  3. Give yourself more planning, management and QA time – With proximity, your developers can ask questions as they work; your offshore team can't. Make sure you budget the time to create your documentation and extra time for QA on the back end. Also make sure that you budget time for regular status meetings, which leads to the next point:
  4. Be prepared to shift your hours – The Indian partner we work with is 10.5 hours ahead of us. This means that I often stay up until 11 or 12 at night to have conference calls, or get into the office at 7 or 8. The late-night calls allow me to meet with the team before they start their workday; the morning calls let me get briefed on what they accomplished during the day and deal with any outstanding issues.
  5. Use communication technology to your advantage – Calls to India are expensive, so I try to use lower-cost alternatives like Skype, email and MSN. MSN is particularly useful since I am a night owl, and so is one of the team members I work with in India. In the brief windows when we are both up, we often have running discussions on MSN that I review as part of my project tracking. For many this is the hard part, but I've found regular communication really does make for a better experience and better results.