Posts Tagged ‘Business Intelligence’

Reflections on Microsoft TechEd 2010 & Business Intelligence Conference

June 11, 2010

This past week I attended the co-located Microsoft TechEd 2010 and Business Intelligence Conference 2010 in New Orleans, Louisiana.  Co-locating the two events was a great initiative, as it allowed the 10,000 attendees to mix sessions from both conferences.

TechEd 2010 Keynote

The day 1 keynote (TechEd) was presented by Bob Muglia (Microsoft President of the Server and Tools Business).  Bob discussed the continued evolution of Windows Azure, with support for .NET 4.0, IntelliTrace debugging, and new tools within Visual Studio 2010.  SQL Azure has continued to evolve with increased storage limits, geospatial data, and data synchronization.  The Bing Maps SDK was released to enable the visualization of data on maps.  Service Pack 1 for Windows 7 and Windows Server 2008 R2 was announced, with release planned for July 2010.

The day 2 keynote (Business Intelligence) was presented by Ted Kummert (Microsoft Senior Vice President of the Business Platform Division).  Ted talked about managed self-service analytics and how PowerPivot gives users an Excel-like look and feel on the desktop, making it easy to relate a variety of data sources with high performance across large data volumes.  A demonstration of PowerPivot showed instant sorting and filtering of 2 billion rows of data.  Most of the keynote focused on the Microsoft BI technologies and their integration: Excel, SQL Server, SharePoint, and PowerPivot.

The most valuable sessions of TechEd 2010 that I attended were ‘Agile Planning’ presented by Peter Provost (Microsoft Senior Program Manager) and ‘Tough Lessons Learned as a Software Project Manager’ presented by Gregg Boer (Microsoft Principal Program Manager).  Both speakers shared their experiences managing projects and what they had learnt.  Gregg presented seven key project management lessons:

  • prioritize ruthlessly, cut judiciously
  • it’s not enough to understand what, you must understand why
  • aggressive schedules do not motivate
  • politics are dumb, ignoring politics is dumber
  • your project is at risk – are you handling it?
  • it’s not a popularity contest
  • you work with people – not resources

Other extremely interesting sessions of TechEd 2010 that I attended were ‘Business Intelligence Overview: Decisions, Decisions, Decisions’ presented by Donald Farmer (Microsoft SQL Server BI Management Program Manager) and ‘So Many BI Tools, So Little Time’ presented by Dan Bulos (Symmetry Corporation President).  Both Donald and Dan presented some interesting conceptual models about business intelligence, two of which I have reproduced below:

Business Intelligence and Analysis (Donald Farmer)

Reporting Spectrum (Dan Bulos)

The hidden gem of TechEd 2010 was ‘Build Your Own Cool Visualizations Using DGML’ presented by Suhail Dutta (Microsoft Program Manager).  Suhail showed how to use Directed Graph Markup Language (DGML) to visualize architectural dependencies in Visual Studio, as well as to visualize your own data (a minimal example is sketched below).  The limitation of this feature is that it can only be used within Visual Studio; it would make a great control for user applications that need to visualize and explore data.
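To give a flavour of the format Suhail demonstrated: a DGML file is plain XML describing nodes and the links between them, and Visual Studio 2010 renders it as an interactive graph when the .dgml file is opened.  The node names below are made up purely for illustration.

    <?xml version="1.0" encoding="utf-8"?>
    <DirectedGraph xmlns="http://schemas.microsoft.com/vs/2009/dgml">
      <!-- Nodes are the boxes shown in the graph viewer -->
      <Nodes>
        <Node Id="WebApp" Label="Web Application" />
        <Node Id="OrderService" Label="Order Service" />
        <Node Id="Database" Label="Database" />
      </Nodes>
      <!-- Links are the directed edges (dependencies) between nodes -->
      <Links>
        <Link Source="WebApp" Target="OrderService" />
        <Link Source="OrderService" Target="Database" />
      </Links>
    </DirectedGraph>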

These are my thoughts after attending TechEd 2010 over the past four days, and I look forward to sharing and implementing what I learnt.

Convergence of Analytics

February 9, 2010

Analytics is defined as the “science of analysis”: it is used to obtain an improved understanding of a complex topic or problem from collected data.  Through better understanding, better decisions can be made and future outcomes can be predicted more accurately.

Presented below is a model I developed for categorizing today’s Analytics software along two continuums – data type (structured versus unstructured) and analysis approach (user driven versus system driven).  These two continuums create four quadrants that I have labeled Statistical Analysis, Qualitative Analysis, Business Intelligence, and Text Analytics.  In practice these quadrants overlap to varying degrees, but the model helps describe the different data collection methods and analysis techniques each category employs.

Analysis Software by Data Type and Analysis Approach

Statistical Analysis can be broadly described as the collection, analysis, and interpretation of numerical data undertaken by a statistician.  It is generally considered a mathematical science and is related to Predictive Analytics.  Microsoft Excel is the most commonly used software application for statistical analysis; however, applications such as SAS/STAT and SPSS Statistics are also popular and offer more advanced features.

Qualitative Analysis (commonly referred to as Qualitative Data Analysis) can be broadly described as the collection, analysis, and interpretation of non-numerical data undertaken by a qualitative researcher.  It is generally considered a social science and is related to Content Analysis.  QSR NVivo is the most commonly used software application for qualitative analysis; however, applications such as SSD ATLAS.ti and VERBI MAXQDA are also popular.

Business Intelligence can be broadly described as the collection, analysis, and dissemination of structured data performed predominantly by a system.  Business Intelligence is used here in a narrower context and is related to Data Mining, Decision Support Systems, and Online Analytical Processing.  IBM Cognos, Microsoft SQL Server Analysis Services, and SAP BusinessObjects are examples of commonly used systems for business intelligence.

Text Analytics can be broadly described as the collection, analysis, and dissemination of textual data performed predominantly by a system.  Text Analytics is similar to Text Mining and related to Video Analytics and Sentiment Analysis.  IBM LanguageWare and Inxight LinguistX are examples of commonly used systems for text analytics, while the Microsoft Semantic Engine is currently planned for integration into a future version of Microsoft SQL Server.

Statistical Analysis and Qualitative Analysis software (the top half of the model) are characterized by flexible, in-depth analysis capabilities with a relatively low cost of implementation, whereas Business Intelligence and Text Analytics software (the bottom half of the model) are characterized by fast, scalable analysis capabilities with a high cost of implementation.  Analysts, however, want the best of both: flexible in-depth analysis coupled with fast scalable analysis, at a low cost of implementation.  With respect to differences in data types, Mixed Methods Research is a popular means of combining the collection and analysis of structured data (the left half of the model) and unstructured data (the right half of the model) to gain a more comprehensive understanding of a topic or problem.

Interestingly, the traditionally divergent data types and analysis approaches of Analytics software have started to converge.  For instance, Business Objects acquired Inxight in 2007 and was itself acquired by SAP later that year.  IBM acquired Cognos in 2008 and SPSS in 2009.  These companies and others are looking at ways of handling the different data types and analysis approaches, either through the integration of technologies or the integration of an application portfolio.  In line with this trend, QSR International announced last month that the next version of NVivo will deliver new automated analysis capabilities and will provide support for structured data.