Monthly Archives: July, 2011

T-SQL Tuesday and Best Practices – SQL is a Set-Based Language

This month’s T-SQL Tuesday #20 – T-SQL Best Practices is a tall order to write about.  Best practices can cover coding style, documentation, performance, and so many other areas that whole books can be, and have been, written about them.  When I consider the poor practices I’ve seen in T-SQL code, I think about the basic flaws commonly introduced by developers: the misuse of subqueries when joins are appropriate, the overuse of DISTINCT when it’s not necessary, or the use of SELECT * almost by default, among others.

SQL is Set-Based

The worst case of poor T-SQL practice I have come across in the last few years was a client that had written a stored procedure in a non-set-based manner.  The 3000 lines of code in a key SP iterated through a number of SELECT statements that handled one entity at a time in a procedural manner.  Data was queried in small chunks by incrementing row numbers in loops and placed in temp tables while other data was acquired; functions were then called and other logic performed, until finally the next set of data was processed.  A normal execution of this SP included anywhere from 1k – 10k database calls.  The developer’s rationale for coding it this way was that it would be faster to get smaller pieces of data each time the database was queried and not put as much of a burden on the database engine.

SQL Server likes Joins

The SP in question was completely rewritten to perform 3 large queries each with 15-20 joins and performance improved by a factor of 50.  The SQL Server database engine is tuned to perform joins very well, especially those with good indexing and appropriate foreign keys.  Trying to work around the engine got this company and developer in trouble, and made the code very slow and much harder to work with.
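The set-based rewrite, again sketched with hypothetical table names rather than the client’s schema, collapses the looping into a single query and lets the joins do the work:

```sql
-- Hypothetical set-based equivalent: one query over the whole set.
-- Table and column names are illustrative only.
SELECT o.OrderID,
       c.CustomerName,
       p.ProductName,
       od.Quantity * od.UnitPrice AS LineTotal
FROM dbo.Orders        AS o
JOIN dbo.Customers     AS c  ON c.CustomerID = o.CustomerID
JOIN dbo.OrderDetails  AS od ON od.OrderID   = o.OrderID
JOIN dbo.Products      AS p  ON p.ProductID  = od.ProductID;
-- With indexes on the join keys and appropriate foreign keys, the engine
-- can pick an efficient join strategy (nested loops, merge, or hash)
-- for the entire set in a single call.
```

One call like this replaces thousands of per-row round trips, which is where the order-of-magnitude improvement came from.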

The Moral

Don’t turn T-SQL into a procedural language.  Do make use of its really strong capacity to process data in sets.


Measuring your BI Maturity

How would you measure your organization’s business intelligence maturity?

For most organizations, their business intelligence is not leading edge; for others it’s getting there.  When I talk to BI analysts and developers, it’s apparent that BI is very different from place to place.  Some organizations don’t have an organized BI effort yet, or are barely getting it off the ground.  A good framework can contain several components that make up the full set of essential BI functions and features.  In general, these components aren’t dependent on each other, so they don’t have to be put in place in any particular order, but typically they are supported by an underlying data warehouse and a certain amount of data integration.

How many of the following are in place at your company?

  • Accurate and Consistent Reports
  • Ad-hoc reports by non-Reports Developers
  • Defined Metrics
  • Scorecards/Dashboards
  • Ad-hoc Analysis
  • Data Mining and Predictive Analytics

I still work with clients who have not achieved any of these on an enterprise level, although they may exist on a departmental or team basis.  The most common successful BI implementations include Accurate and Consistent Reports, Defined Metrics, and Scorecards/Dashboards.  Ad-hoc reports and analysis are still not widely available at the enterprise level (being more difficult to implement), and few organizations employ predictive analytics beyond individual efforts.

A Reasonable BI Project

If you’re still trying to make BI work at your company, focus on delivering reports, metrics, scorecards and dashboards.  Not only are they attainable in an initial deployment, but they allow you to build a framework you can grow with.