Does your integration strategy match the speed of your business? Many people assume it must, but I can assure you that in most companies IT struggles to deliver data and applications at the speed the business needs. For years this was the accepted model: ask for a report and sit back for a month or so until you received a prototype from a technical person you had never met, with a note that said, "Is this what you want?" There is nothing agile about that approach, and agility is necessary for companies looking to innovate and survive in today's competitive landscape.

Philip Russom, Research Director of Data Management at TDWI, and Grant Parsamyan, Director of BI & Data Warehousing, got together last week on Informatica's Architect-to-Architect & Business Value Roundtable Series to discuss how data virtualization delivers agility while complementing the investment you have made in your data warehouse technology. I'm participating in this series as well and will be hosting a session in June.

Data virtualization is an excellent way to leverage and add value to the investment you have made in your data warehouse infrastructure. Data landscapes are expanding quickly, and as they become more distributed, DV technology can act as the trusted layer of data access across these systems. In the 90s, some people saw DV technology as a possible path to circumventing data warehouses; that's not the case today. I'm not aware of any vendor in the space that promotes this as a best practice. In the end, DV helps secure the DW as a critical element in our data world and provides an agile way to leverage the data within it.

Philip made a great point early in the segment when he compared the importance of agile development with that of responsible data access and preparation. His position is that you can't have one without the other, and I agree. Speedy application development is great, but skipping the proper steps to prepare the source data can eliminate the value you get from being agile.

As I mentioned above, adding agility to your data environment is critical, and Philip shared some research data that illustrates the problem. 41% of respondents indicated that a standard data adjustment, such as adding a new hierarchy, can take one to three months in a non-agile environment. Grant stated without hesitation that that time frame would never work for his business.

Data virtualization plays a key role in addressing these challenges. Delivering a managed and trusted data layer speeds development, especially when the data virtualization and data integration teams can work directly with the business, building prototypes and managing processes. Companies that integrate data preparation tools into the process will see even greater gains in productivity, accuracy, and use.

As always, I'm just scratching the surface of the ideas and practices Philip and Grant discussed, so follow the link to listen in for yourself. For more of my thoughts on this series, follow the link below.
