Friday, July 23, 2010

self-exemplify - Dr. Edward Tufte's seminar in Chicago July 23

In his book, The Visual Display of Quantitative Information, 2nd edition, Edward Tufte describes how he wanted to make the book self-exemplifying. In other words, the book should use the techniques of effective information display to describe techniques of effective information display; it should be an example of itself. Dr. Tufte's seminar today in Chicago did the same thing.

I often find myself experiencing a great sense of relief and identification when encountering a thought leader who has wrestled with the same issues as I have in my work, and aspired towards some effective execution, whether this is with technology issues or softer issues like managing complexity, personal productivity, etc. I felt the same sense of relief in Dr. Tufte's extremely concise and coherent exposition on effective information display.

This was a fairly dense 5 hours. I found myself taking some notes, but for the most part trying to pay attention and absorb what was presented. I was noticing that Dr. Tufte was not shy about opinions and frequently would illustrate points with succinct and boldly-worded statements about best practices (or lack thereof).

I am reproducing some of these statements and impressions here, as hastily noted. If you do anything with the presentation of information (whether that be PTA newsletters or CPM dashboards), please do yourself a favor and attend this wonderful seminar if you have the chance.

Here are Dr. Tufte's ideas and statements in roughly chronological order. It's possible I have paraphrased him inaccurately; I'll claim any mistakes as the heat of the moment. Dr. Tufte is definitely more lucid than my note-taking ability. This is intentionally left in "brain-dump" format. I think many of these statements are provocative on their own, but I hope they might intrigue anyone reading to seek out Dr. Tufte's books (Beautiful Evidence; Visual Explanations: Images and Quantities, Evidence and Narrative; Envisioning Information; The Visual Display of Quantitative Information, 2nd edition) and/or his seminar.
  • Presenting (to others) is a moral act.
  • Examination of the Music Animation Machine - especially how there is no legend or guidelines, yet it's a very dense and intuitive presentation of information
  • Dr. Tufte used this as an example that refutes the idea of "information overload" and he cited several examples of dense presentation throughout the seminar (e.g. 800,000 data points on 2 sides of an 11x17 page)
    • "There's no such thing as information overload, just lousy design"
  • A worthwhile diagram deserves the same amount of attention as the text that would impart the equivalent amount of information
  • Graphics are frequently used to depict causality
    • Policy and prevention missions both need to analyze causality
  • Every linking line should be annotated
  • The map is the gold standard for effective presentation
  • "Chart junk" should always be replaced by information
  • In a graphic presentation, are you using the results of evidence, or "evidence selection" i.e. are you cherry-picking favorable data?
  • One should assume that presenters have similar motivations as the audience (and vice-versa)
  • You want an open mind, but not an empty head.
  • Maximize content reasoning time; minimize content interpreting time.
  • Paper has 10 times the resolution of a computer display, and 100 times the resolution of projected slides
  • Authoritarian presenters tend to distrust their audience. This creates the tendency to stint on information. (3 points per slide - sound familiar?)
  • Rather than "know your audience", "know your content". Respect your audience instead.
  • Do whatever it takes to impart the content. e.g. Sock Puppets, real objects, physical models. Don't be constrained by convention
  • Every time you can get a real object in a presentation, do so.
  • If possible, see how data is originally collected.
    • Example of water being collected from the cleanest part of the river in a pollution impact study
  • 1 + 1 = 3 - describes the phenomenon that 2 graphic elements create 3 effects - the effect of each, and the effect of the juxtaposition of the 2 objects
  • Local Optimization = Global Pessimization
  • The goal is to zero out the interface
  • Omit grids. Good typography supplies enough guidelines.
  • Tufte then went through the fundamental principles which are discussed in his book, Beautiful Evidence
    1. show comparisons - "Compared to what?"
    2. Illustrate causality
    3. show multivariate data. Translation: enrich your data with dimensional attributes
    4. integrate all content. There shouldn't be different modes to view the comprehensive presentation
    5. Document all sources, scales, and any missing data. It enables the credibility of your presentation.
    6. Content counts most of all - over presentation, style, and formatting
    7. Locate important comparisons in a common space. Use small multiples.
  • The point of information display is to assist thinking.
    • Most design can be placed in its decade because it is based on fashion. This is not necessarily a complete evil
  • Design is based on human factors.
  • A 2D drawing is just that: 2D. Perspective drawing is something like 2.33 dimensions
  • Navigation instruction is a 4D presentation:
    • 3 physical dimensions
    • and time (the 4th dimension)
  • Information resolution = the ability to communicate more bits per unit area and/or per unit time
  • Galileo's telescope was the first increase of information resolution beyond the capabilities of the human eye
  • Since 1610, information resolution has increased 10 million to 100 million-fold
  • Make displays worthy of the human eye/brain system
    • The human eye/brain system was measured to a capacity of 10 megapixels per second per optic nerve
    • Tufte asked, "Why are we looking at these moronic displays?" (PowerPoint)
  • Label directly; don't use legends
  • How do you solve the flat-land problem? (i.e. displaying 3-dimensional artifacts on a 2-dimensional surface [screen, paper, iPhone])
    • Use a model
  • The principles of analytic design come from the principles of analytic thinking
  • Quoting Steve Jobs (?), "Real artists ship"
  • Interface design
    • (Quoting someone, not sure who - Alan Cooper) "No matter how beautiful your interface is, it would be better if there were less of it."
  • Forming your summary:
    • State what the problem is
    • State who cares
    • State the solution
  • Other key pieces of advice
    • Show up early
    • Finish early (Which Tufte did)

Friday, May 28, 2010

Inexpensive iPad holder - less than $5

I just needed a temporary solution to hold my iPad at my desk. It's worked out pretty well. Thought I'd post the tip:


It's an Office Depot Plate Holder, Clear Item # 544474

Thursday, May 27, 2010

Resetting "Windows XP Mode" to initial settings

Under Windows 7 64-bit ultimate edition, I've had the need to reset my "Windows XP Mode" to initial settings several times over the past few days. Initially I did this because as I used the XP virtual environment, the VHD file grew to around 10 Gig. Not huge, but I wasn't using it for anything except to run a client's 32-bit only VPN software, so it was wasting a fair amount of space. My main machine has a 256 Gig SSD for the main drive, so I am conservative with space consumption.


The second reason was that I was helping my fiancée (a high-school teacher) obtain some (public domain) video for use in her classroom. Unfortunately this video seemed only to be available in a heavily watermarked version on YouTube (ugh) or in a Real Media stream (double-ugh). So I wound up needing to use a mixture of open source and very old software to convert this video to a format that was playable in the classroom. As a rule I don't install software like that on my production machine - a virtual environment is a great "sandbox" for momentary needs like this one. After I converted this video, I no longer needed all these (rather buggy) utilities, so I wanted to start clean.


I didn't see much on the web on how to reset the "Windows XP Mode" to initial settings. I tried this experiment, which worked pretty well.


To reset Windows XP to factory new condition:
  1. Right click the vmcx file. Select Settings. (On my machine the file is called Windows XP Mode.vmcx and is located in C:\Users\Tom\Virtual Machines)
  2. Note the location of Hard Disk 1 (On my machine this location is called C:\Users\Tom\AppData\Local\Microsoft\Windows Virtual PC\Virtual Machines\Windows XP Mode.vhd)
  3. Exit the Settings Dialog
  4. Delete the files associated with your "Windows XP Mode" environment. On my machine, these are located in C:\Users\Tom\AppData\Local\Microsoft\Windows Virtual PC\Virtual Machines and are called
    1. Windows XP Mode.vhd
    2. Windows XP Mode.vmc
    3. Windows XP Mode.vmc.vpcbackup
    4. Windows XP Mode.vsv
  5. Now, find the original Start Menu shortcut for Windows XP Mode. As long as you've left the Parent Disk in place, the system will prompt you to recreate a new environment:

voilà! - New XP environment (after a few other dialogs and a few minutes building)

Standard cautionary notes: 
    • I haven't researched whether this is the "proper" way to do this. But I've done this multiple times on 2 different machines and it takes less than 10 minutes to reset, install anti-virus, and reinstall VPN software
    • Obviously when I suggest you delete files on your system, I am assuming you know the implications of this and have everything backed up or have determined that you no longer need anything from the files you are deleting
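For repeat use, step 4 can be scripted. Here is a hypothetical Python sketch of just that deletion step; the directory and file names are the ones from my machine above (adjust for yours), and the same backup caveat applies:

```python
from pathlib import Path

# Per-user XP Mode files from step 4 above (machine-specific path).
VM_DIR = Path(r"C:\Users\Tom\AppData\Local\Microsoft\Windows Virtual PC\Virtual Machines")
XP_MODE_FILES = [
    "Windows XP Mode.vhd",
    "Windows XP Mode.vmc",
    "Windows XP Mode.vmc.vpcbackup",
    "Windows XP Mode.vsv",
]

def reset_xp_mode(vm_dir: Path = VM_DIR) -> list[str]:
    """Delete the per-user XP Mode files; returns the names actually removed."""
    removed = []
    for name in XP_MODE_FILES:
        target = vm_dir / name
        if target.exists():
            target.unlink()
            removed.append(name)
    return removed
```

After running it, launching the Start Menu shortcut (step 5) should prompt to rebuild the environment, exactly as in the manual walkthrough.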

    Tuesday, May 25, 2010

    Burn the Ships

    Although it's probably an apocryphal story, legend has it that Cortez instructed his men to "Burn the Ships" upon encountering the New World. This has become (in my parlance, anyway) a catchphrase for being irrevocably committed to a particular course of action.

    In that spirit, yesterday I blew away v1 of my website and started up this version. This will remain a work in progress for a while, since other client commitments will have me quite busy through June.

    Anyone using WordPress who has themes or plugins that they love, please leave a comment.

    Wednesday, April 14, 2010

    Time (Not Date) Dimension Table SQL code

    I am seeing more call for Time Dimensions. By that I mean Time-of-Day Dimensions. So now I have to retrain myself to call Date Dimensions "Date" and Time Dimensions "Time".


    As a follow-up to this post, here's some DDL and a quick routine to generate a time-of-day table. The resolution is to the second, which so far has proved sufficient for my clients' purposes.


    Enjoy


    SET nocount ON
    /*
    CREATE TABLE [dbo].[dim_Time](
      [TimeId] [int] NOT NULL,
      [Time] [time](7) NULL,
      [Hours] [tinyint] NULL,
      [Minutes] [tinyint] NULL,
      [Seconds] [tinyint] NULL,
     CONSTRAINT [PK_dim_Time] PRIMARY KEY CLUSTERED
    (
      [TimeId] ASC
    )WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
    ) ON [PRIMARY]

    TRUNCATE TABLE [dbo].[dim_Time]
    SELECT * FROM dbo.dim_time
    */


    DECLARE @second1 INT = 0
    DECLARE @currtime TIME
    DECLARE @msg VARCHAR(30) = ''


    WHILE @second1 < 86400
      BEGIN
          SELECT @second1 = @second1 + 1
          SELECT @currtime = Dateadd(ss, @second1, '00:00.00')
          --  SELECT   
          --DatePart(hh, @currtime) * 10000 + 
          --DatePart(mi, @currtime) * 100 +
          --DatePart(ss, @currtime) AS TimeID
          --, @currtime AS [Time]
          --, DatePart(hh, @currtime) AS [Hours]
          --, DatePart(mi, @currtime) AS [Minutes]
          --, DatePart(ss, @currtime) AS [Seconds]
          INSERT dbo.dim_time
                 ([TimeID],
                  [Time],
                  [Hours],
                  [Minutes],
                  [Seconds])
          VALUES (
                  Datepart(hh, @currtime) * 10000 + 

                  Datepart(mi, @currtime) * 100 + 
                  Datepart(ss, @currtime),
                   @currtime,
                   Datepart(hh, @currtime),
                   Datepart(mi, @currtime),
                   Datepart(ss, @currtime) )
          IF @second1 % 1000 = 0
            BEGIN
                SELECT @msg = 'Now Processing ' + CAST(@second1 AS VARCHAR(12))
                PRINT @msg
            END
      END 
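The TimeID smart key above is just the time of day packed into an HHMMSS integer. As a quick sanity check of that encoding (a Python sketch, not part of the original script):

```python
def time_id(seconds_past_midnight: int) -> int:
    """Pack a time of day into an HHMMSS integer, mirroring the T-SQL above."""
    hours, rem = divmod(seconds_past_midnight % 86400, 3600)
    minutes, seconds = divmod(rem, 60)
    return hours * 10000 + minutes * 100 + seconds

# Every second of the day maps to a distinct key, so TimeId can be the primary key.
assert len({time_id(s) for s in range(86400)}) == 86400
assert time_id(3661) == 10101  # 01:01:01
```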

    Sunday, April 4, 2010

    Adventures with Virtualization

    Those of us needing to run BI software in virtual environments are well familiar with the challenges of getting things running smoothly, especially if you are using a laptop as your lab or demo environment. We have a new set of challenges with Windows 7, and with SharePoint 2010.



    Challenge #1 - SharePoint 2010 requires 64 bit OS, and the only MS virtualization software that allows for 64-bit is Hyper-V.

    Only problem with running Hyper-V is that you cannot sleep a machine with Hyper-V, so if you’re using your laptop as a lab machine, you need to retain an OS that will run productivity apps and sleep. I set up my old laptop using the technique described here and it works well. I haven’t repeated the process for my new laptop yet, but I plan to.

    Challenge #2 - Windows 7 Virtual PC requires Hardware Assisted Virtualization

    Apparently this is no longer true, but I've wrestled with this one for months; in fact, in early March I returned a machine that didn’t support hardware-assisted virtualization because I needed to run Windows 7 VPC. MS has since released a version of Virtual PC that does not require hardware-assisted virtualization.

    Other tips that have worked well for me:
    • Use a fast disk - I was using a 7200 RPM eSATA external drive - faster than the 5400 disk commonly found on laptops. Now I am using an SSD drive- even better.
    • Store the VHD in a compressed folder. This one struck me as counter-intuitive but anecdotally seems to work.

    Monday, March 22, 2010

    Brilliantly stupid MDX debugging technique

    Whenever I do something really stupid, rather than keep it to myself, I prefer to blog about it for the entire world to see. Even better - I came up with a debugging technique to save myself from my own, er, stupidity.


    The scenario is this: I am developing some SQL Server Analysis Services Date Calculations in MDX for a client. They have three different Date Hierarchies, and wanted the standard, Previous Period, Period to Date, etc calculations. (For a very good explanation of the techniques involved, read this PDF first, then study this refinement from Mosha Pasumansky. )


    The MDX looks like this: 


    Scope([Calculation].[Calculation].[Previous Period]);      
        ([Calendar].[Calendar].[Date]) = ([Calculation].[Current Period], ParallelPeriod([Calendar].[Calendar].[Calendar Year],1));      
        ([Calendar].[Week].[Calendar Week].MEMBERS) = ([Calculation].[Current Period], [Calendar].[Week].Lag(52));      
        ([Calendar].[Calendar].[Calendar Month].MEMBERS) = ([Calculation].[Current Period], [Calendar].[Calendar Month].Lag(12));      
        ([Calendar].[Calendar].[Calendar Quarter].MEMBERS) = ([Calculation].[Current Period], [Calendar].[Calendar Quarter].Lag(4));      
        ([Calendar].[Calendar].[Calendar Year].MEMBERS) = ([Calculation].[Current Period], [Calendar].[Calendar Year].Lag(1));      
        ([Calendar].[NRF].[NRF Week].MEMBERS) = ([Calculation].[Current Period], ParallelPeriod([Calendar].[NRF].[NRF Week],1));
        ([Calendar].[NRF].[NRF Period].MEMBERS) = ([Calculation].[Current Period], [Calendar].[NRF Period].Lag(13));
        ([Calendar].[NRF].[NRF Quarter].MEMBERS) = ([Calculation].[Current Period], [Calendar].[NRF Quarter].Lag(5));
        ([Calendar].[NRF].[NRF Year].MEMBERS) = ([Calculation].[Current Period], ParallelPeriod([Calendar].[NRF].[NRF Year],1));
    End Scope;      

    Pretty standard stuff (sorry about the line wraps): 

    The symptom was that my Calendar Data Hierarchies were working fine, but the other two (Week and NRF [National Retail Federation]) were not. I worked my way through the data, and it all looked correct. Even a dynamic MDX query in SSMS worked fine. But it was wrong in the cube.

    When debugging MDX, I find that it pays to go in small increments and to do patently obvious things like setting Scoped statements to constant values, as in: 


    Scope([Calculation].[Calculation].[Previous Period]);      
        ([Calendar].[Calendar].[Date]) = 7777;
    End Scope;  


    I had done this with this MDX code in an attempt to see what was going on, and it showed me that the scoped overwrites were working, but the calculations they were over-writing with were not. Then I got an idea to change how I was debugging: 


    Scope([Calculation].[Calculation].[Previous Period]);      
        ([Calendar].[Calendar].[Date]) = 1;
        ([Calendar].[Week].[Calendar Week].MEMBERS) = 10;
        ([Calendar].[Calendar].[Calendar Month].MEMBERS) = 100;
        ([Calendar].[Calendar].[Calendar Quarter].MEMBERS) = 1000;
        ([Calendar].[Calendar].[Calendar Year].MEMBERS) = 10000;
        ([Calendar].[NRF].[NRF Week].MEMBERS) = .0001;
        ([Calendar].[NRF].[NRF Period].MEMBERS) = .001;
        ([Calendar].[NRF].[NRF Quarter].MEMBERS) = .01;
        ([Calendar].[NRF].[NRF Year].MEMBERS) = .1;
    End Scope;      


    Essentially I am using a bit-mask, one position for each level. I split the two different hierarchies at the decimal. This allowed me to see which scoped overwrite statement was in effect as I browsed the cube, and it led me to the solution.
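To illustrate the idea outside of MDX, here is a small Python sketch (hypothetical, purely to show how the positional constants decode): each scoped assignment writes a distinct power of ten, so the value you see while browsing the cube tells you exactly which assignment fired last for that cell.

```python
# Debug constants from the Scope block above, mapped to the assignment each marks.
DEBUG_MASK = {
    1: "Calendar Date",
    10: "Calendar Week",
    100: "Calendar Month",
    1000: "Calendar Quarter",
    10000: "Calendar Year",
    0.0001: "NRF Week",
    0.001: "NRF Period",
    0.01: "NRF Quarter",
    0.1: "NRF Year",
}

def winning_assignment(cell_value):
    """Return which scoped overwrite produced the value shown in the cube."""
    return DEBUG_MASK.get(cell_value, "no scoped overwrite (or a combination)")
```

Because each assignment overwrites the cell outright, a Month-level cell showing 100 proves the Month assignment was evaluated last for that cell; anything else points at evaluation-order trouble.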

    It turns out I had defined my scoped overwrite assignment statements in reverse order (i.e. from highest level (Year) to lowest (Day)). Once I switched them so the highest-level statement was evaluated last, everything worked as desired. This was a very stupid mistake, and once it was embedded in the cube MDX script, it was a difficult one to spot. Luckily the debugging technique described above made it pretty obvious what was happening.

    Wednesday, March 10, 2010

    Devils and Angels of BI Architecture


    Devil: You don't need to set up a separate star schema database


    Angel: Yes you do. We don't want to interfere with the operational systems


    Devil: It's sooo much work


    Angel: We want to create a simplified schema to benefit our end users


    Devil: Phooey. ETL is BORING


    Devil: Just query the source system directly


    Devil: What could be better than fresh data?


    Devil: C'mon, the report's not that big


    Devil: Everybody's doing it


    Intrepid BI Architect: sigh...

    Case in point:
     This is from an actual email exchange. Names have been changed to protect the innocent.






    From: Brave and True BI Developer
    Sent: Friday, March 05, 2010 9:33 AM
    To: BI Technical Staff
    Subject: SSRS Not Responding

    Well, this is a new one.  I’m on-site at [Dunder-Mifflin] today to do a quick fix on a report and the SSRS instance seems to be refusing to actually render a report.  I click on a report that’s been untouched for months with an ultra simple SELECT behind it and the browser just spins with the “Reporting being generated” message in the middle of the browser.


    From: Brave and True BI Colleague
    Sent: Friday, March 05, 2010 9:34 AM
    To: BI Technical Staff
    Subject: RE: SSRS Not Responding

    Check SQL Profiler as you run the report, verify that it is actually reaching the database (with the correct user information, etc.)


    From: Brave and True BI Developer
    Sent: Friday, March 05, 2010 10:00 AM
    To: BI Technical Staff
    Subject: RE: SSRS Not Responding

    Got it.  There was a transaction open on a table that’s part of a filter on most of the reports. That’s all it was.

    ______________________________




    [My emphasis]




    This is why we go to the trouble of setting up an entirely separate system. There should never be a "transaction open on a table that’s part of a filter on most of the reports."


    There are times even now when the voices of temptation ring really loud. Usually this occurs in the form of a client or colleague looking at me like I've got two heads when I tell them that we need to spend time building a separate redundant database and ETL to replace a currently running (but poorly performing or problematic) report.


    So to all the intrepid BI Architects out there: fight the good fight!




    Friday, February 5, 2010

    The "Million Dollar" BI Architecture Diagram

    Anyone who has worked with me or taken training from me has seen some variant of this diagram. It's not an original concept, although this version has been refined to include my personal experience with many Business Intelligence implementations, as well as my bias towards both the Kimball Business Dimensional methodology and using Microsoft SQL Server (especially Analysis Services) as part of successful deployments.




    This diagram is conceptual, although I have designed systems that did have physical servers for each of the servers represented, and these worked very, very well.


    I have lots to say about this, and will continue to do so on this blog. Perhaps the most significant concept to mention at the outset is that this is a recipe for success with Business Intelligence. Put another way, the level of success I've had with BI environments corresponds pretty closely with how well the architecture of these environments matched this diagram. (Also note that I said "a" recipe for success - not "the" recipe for success. I acknowledge that there are other approaches, but this is the one I've used successfully for over a decade.)


    Two important concepts inherent in this diagram are:


    a) each layer in this diagram has a primary purpose. From an architecture standpoint, designing these functional layers to do only a single thing very well creates a robust, yet simple, environment.


    b) There are well-defined interfaces between each functional layer. Having clearly defined boundaries allows the processes to be moved from server to server (which enables important operational benefits like high-availability, disaster recovery, and performance tuning)


    What are these primary purposes mentioned above, and how do they benefit the overall architecture?
    1. Source Systems - OLTP, Line of Business, ERP systems, and other sources. All BI systems are 100% dependent upon being able to source data from somewhere. The systems that run the business are our sources; all the rich data that we analyze and/or aggregate comes from there.
    2. Data Quality - Extraction, Transformation, and Loading. Because the systems in #1 are generally concerned with running the business and not with pristine and perfect data, there must be intentional processes that clean and consolidate data from these sources. That happens in layer 2.
    3. Single Vision - Enterprise Data Warehouse. This is what an organization "knows" about itself, and enables a single vision of the truth. This is the consolidated, consistent historical data store, ideally becoming the "system of record" after the operational processes are no longer active. Note that under this model, this layer does not have to be tuned for interactive query performance - the primary purpose here is to act as a sound repository of enterprise information.
    4. Business Intelligence - Unified Dimensional Model. Nearly all successful implementations include a layer like this, either informally or formally. In my view, formally is the superior approach. In this layer, performance is improved by creating locally (or proximally) cached copies of information. This is where "Business Intelligence" is exposed - i.e. the metrics and KPIs that provide useful information. All presentation tools should source from this database.
    5. Presentation - By keeping the Business Intelligence in layer 4, the presentation tools really just become about delivering information to the various audiences in the format and vehicle most convenient for each. It allows different tools to be used as needed.
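As a toy illustration of the well-defined interfaces between layers, here is a hypothetical Python sketch (the abbreviated layer names are mine): each layer feeds only the next one, which is what lets a layer be moved to its own server without disturbing the rest.

```python
# Ordered layers of the architecture; names abbreviated from the list above.
LAYERS = [
    "Source Systems",
    "Data Quality (ETL)",
    "Enterprise Data Warehouse",
    "Unified Dimensional Model",
    "Presentation",
]

def feeds(upstream: str, downstream: str) -> bool:
    """Data flows forward only, one layer at a time, through a defined interface."""
    return LAYERS.index(downstream) - LAYERS.index(upstream) == 1
```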
    BTW - I call this the "Million Dollar" BI Architecture Diagram because:

    • If you're an organization using BI this effectively, it should represent (at least) a million dollars in increased revenue or reduced cost.
    • If you're a BI practitioner, it should be worth that in revenue or, perhaps even more significantly, a million dollars in reduced stress - because it represents a plan that is proven and works.

    Monday, January 25, 2010

    Example Code to build and populate a Date Dimension


    OK, I've found myself searching for this code a half dozen times over as many weeks. That tells me that I:
    a) Need to keep better track of my source code
    b) Need to share it on my blog


    Caveats & Comments: 

    • This is "warts and all" utility code. It could be optimized, but for the few rows that a Date Dimension represents, why?
    • One of the few cases where Dr. Kimball recommends meaningful surrogate keys is the date dimension. Our keys here use a YYYYMMDD or YYYYQQ format.
    • This builds a Date Snowflake, which provides correctly ordered keys at any level in the dimension. 
    • I usually rejoin these tables in either the DSV or a SQL View to create the structure SSAS actually consumes.
    • There is a stub for a Week Hierarchy, which sometimes is called for, and sometimes not.
    • This was built in SQL 2005 and runs in SQL 2008 as well. In 2008, you could change the date type from DateTime to Date and save yourself some formatting overhead down the road.



    /* DDL Begin */
    /****** Object:  Table [dbo].[dim_CalendarYear]    Script Date: 11/02/2009 12:34:56 ******/
    CREATE TABLE [dbo].[dim_CalendarYear](
          [CalendarYearID] [int] NOT NULL,
          [CalendarYearName] [varchar](10) NULL,
     CONSTRAINT [PK_dim_CalendarYear] PRIMARY KEY CLUSTERED
    (
          [CalendarYearID] ASC
    )WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO

    /****** Object:  Table [dbo].[dim_CalendarQuarter]    Script Date: 11/02/2009 12:34:56 ******/
    CREATE TABLE [dbo].[dim_CalendarQuarter](
          [CalendarQuarterID] [int] NOT NULL,
          [CalendarQuarterNumber] [int] NULL,
          [CalendarQuarterName] [varchar](10) NULL,
          [CalendarYearID] [int] NULL,
     CONSTRAINT [PK_dim_CalendarQuarter] PRIMARY KEY CLUSTERED
    (
          [CalendarQuarterID] ASC
    )WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO

    /****** Object:  Table [dbo].[dim_CalendarMonth]    Script Date: 11/02/2009 12:34:56 ******/
    CREATE TABLE [dbo].[dim_CalendarMonth](
          [CalendarMonthID] [int] NOT NULL,
          [CalendarMonthNumber] [int] NULL,
          [CalendarMonthName] [varchar](20) NULL,
          [CalendarQuarterID] [int] NULL,
     CONSTRAINT [PK_dim_CalendarMonth] PRIMARY KEY CLUSTERED
    (
          [CalendarMonthID] ASC
    )WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO

    /****** Object:  Table [dbo].[dim_CalendarDay]    Script Date: 11/02/2009 12:34:56 ******/
    CREATE TABLE [dbo].[dim_CalendarDay](
          [TimeID] [int] NOT NULL,
          [SQLDate] [datetime] NULL,
          [CalendarMonthID] [int] NULL,
          [WeekID] [int] NULL,
     CONSTRAINT [PK_dim_Date] PRIMARY KEY CLUSTERED
    (
          [TimeID] ASC
    )WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO
    /* DDL End */


    /* Populate Begin */

    DECLARE  @StartSeedDate DATETIME
    DECLARE  @EndSeedDate DATETIME

    SET @StartSeedDate = '01 Jan 2007'
    SET @EndSeedDate   = '31 Dec 2009'
                      
    -- Set defaults if none are supplied
    IF @StartSeedDate IS NULL
      SET @StartSeedDate = DATEADD(YYYY,-1,CONVERT(CHAR,GETDATE(),112))

    IF @EndSeedDate IS NULL
      SET @EndSeedDate = DATEADD(YYYY,1,CONVERT(CHAR,GETDATE(),112))
                        
    CREATE TABLE #TMP_DIMDATE (
      SQLDATE DATETIME NULL)

    WHILE @StartSeedDate <= @EndSeedDate
      BEGIN
        INSERT #TMP_DIMDATE
              (SQLDATE)
        VALUES(@StartSeedDate)
        SELECT @StartSeedDate = DATEADD(DD,1,@StartSeedDate)
      END
     
    -- Year
    INSERT dbo.dim_CalendarYear
    SELECT DISTINCT
          DATEPART(YEAR,SQLDATE) AS [CalendarYearID]
          , DATENAME(YEAR,SQLDATE) AS [CalendarYearName]
    FROM   #TMP_DIMDATE
    -- SELECT * FROM dbo.dim_CalendarYear

    -- Quarter
    INSERT dbo.dim_CalendarQuarter
    SELECT DISTINCT
          DATEPART(YYYY,SQLDATE) * 100 + DATEPART(QUARTER,SQLDATE)     AS [CalendarQuarterID]
    ,     DATEPART(YYYY,SQLDATE) * 100 + DATEPART(QUARTER,SQLDATE)     AS [CalendarQuarterNumber]
    ,     CAST(DATEPART(YYYY,SQLDATE) AS char(4)) + ' Q' + CAST(DATEPART(QUARTER,SQLDATE) AS char(1))     AS [CalendarQuarterName]
    ,     DATEPART(YEAR,SQLDATE) AS [CalendarYearID]
    FROM   #TMP_DIMDATE
    -- SELECT * FROM dbo.dim_CalendarQuarter

    -- Month
    INSERT dbo.dim_CalendarMonth
    SELECT DISTINCT
          DATEPART(YYYY,SQLDATE) * 100 + DATEPART(MONTH,SQLDATE)     AS [CalendarMonthID]
    ,     DATEPART(YYYY,SQLDATE) * 100 + DATEPART(MONTH,SQLDATE)     AS [CalendarMonthNumber]
    ,     DATENAME(YYYY,SQLDATE) + ' ' + DATENAME(MM,SQLDATE)     AS [CalendarMonthName]
    ,     DATEPART(YYYY,SQLDATE) * 100 + DATEPART(QUARTER,SQLDATE)     AS [CalendarQuarterID]
    FROM   #TMP_DIMDATE
    -- SELECT * FROM dbo.dim_CalendarMonth

    -- Day
    INSERT dbo.dim_CalendarDay
    SELECT DISTINCT
          DATEPART(YYYY,SQLDATE) * 10000 + DATEPART(MONTH,SQLDATE) * 100 +  DATEPART(DAY,SQLDATE)     AS [TimeID]
    ,     SQLDATE
    ,     DATEPART(YYYY,SQLDATE) * 100 + DATEPART(MONTH,SQLDATE)     AS [CalendarMonthID]
    ,   NULL AS [WeekID]  -- stub for the Week hierarchy
    FROM   #TMP_DIMDATE
    -- SELECT * FROM dbo.dim_CalendarDay


    /* Populate End */
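The smart keys the script builds can be sanity-checked independently of SQL. This is a hypothetical Python mirror of the same YYYYMMDD / YYYYMM / YYYYQQ arithmetic:

```python
from datetime import date

def date_keys(d: date) -> dict:
    """Mirror the surrogate keys produced by the INSERT statements above."""
    quarter = (d.month - 1) // 3 + 1
    return {
        "TimeID": d.year * 10000 + d.month * 100 + d.day,   # YYYYMMDD
        "CalendarMonthID": d.year * 100 + d.month,          # YYYYMM
        "CalendarQuarterID": d.year * 100 + quarter,        # YYYYQQ
        "CalendarYearID": d.year,
    }
```

Because the keys are numeric and positional, sorting by key sorts chronologically at every level - which is the point of the meaningful-surrogate-key exception noted in the caveats above.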

