Agility and Data Virtualization
As we move into 2012, analytics and business intelligence continue to be top of mind for C-level executives. The drive to discover new ways to leverage existing systems, coupled with the need to access new and dynamic data sources, has exposed a lack of flexibility in traditional data management infrastructures. This lack of agility curtails innovation and reduces the effectiveness of mission-critical business intelligence (BI) platforms.
BI initiatives are growing in nearly all companies, while the supporting data landscapes grow and extend across distributed, heterogeneous sources that are not easily managed or accessed. During a recent web seminar titled “Why Agile BI, Why Now and How Data Virtualization can Help,” Informatica VP of Product Strategy David Lyle called this challenge the “integration hairball,” pointing out that serving the “speed of the business” demands agility and the removal of IT dependency. David makes a great point: IT has long been the gatekeeper of data and is often the reason for slow information access. As self-service BI solutions gain traction, sophisticated users are demanding an agile environment that delivers information at the speed they need it.
There are multiple reasons why traditional environments lack agility and struggle to deliver fast time to value or, more importantly, fast time to data. The panel on this webinar did an excellent job of highlighting several challenges that contribute to this issue and that can be quickly addressed by a more agile and innovative environment powered by data virtualization.
1. There are too many components involved in the BI stack
2. Business and IT don’t see things the same way
3. BI is treated as any other enterprise application
4. Data is everywhere, not just in the EDW
5. It takes too long to deliver data / reports to business
Data virtualization offers a layer of managed data access that can overcome these issues of speed. It delivers a common, logical, virtualized data abstraction and data access layer that analysts can query directly. Adding data profiling and data quality functionality to the work process allows analysts to set data rules that IT can enforce, better serving the needs of self-service end users. Applying these features to federated data in a virtualized environment provides even greater agility to a BI system, especially since BI tools simply assume the availability of fresh, accurate and consistent data, which is seldom the case.
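To make the idea concrete, here is a minimal, hypothetical sketch of what a virtualized access layer does: it federates two separate sources behind a single logical view and enforces an analyst-defined data quality rule inside that layer, so BI tools query one consistent interface instead of each source directly. The table names, the quality rule, and the use of SQLite as a stand-in for real heterogeneous sources are illustrative assumptions, not any vendor's actual implementation.

```python
import sqlite3

# Two "heterogeneous" sources, simulated here as separate SQLite databases.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS crm")

# Source 1: order facts from the enterprise data warehouse (hypothetical schema).
conn.execute("CREATE TABLE orders (customer_id INT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 250.0), (2, 99.5), (3, -10.0)])  # -10.0 violates the quality rule

# Source 2: customer records from an operational CRM system.
conn.execute("CREATE TABLE crm.customers (id INT, name TEXT)")
conn.executemany("INSERT INTO crm.customers VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex"), (3, "Initech")])

# The virtual layer: one logical view that federates both sources and
# enforces an analyst-defined data quality rule (amounts must be non-negative).
conn.execute("""
CREATE TEMP VIEW customer_orders AS
SELECT c.name, o.amount
FROM orders o
JOIN crm.customers c ON c.id = o.customer_id
WHERE o.amount >= 0
""")

# BI tools query the single virtual view, never the underlying sources.
rows = conn.execute(
    "SELECT name, amount FROM customer_orders ORDER BY name").fetchall()
print(rows)  # [('Acme', 250.0), ('Globex', 99.5)]
```

The point of the sketch is the separation of concerns: the sources stay where they are, the quality rule lives in the shared layer, and every consumer sees the same cleansed, joined result.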
Data virtualization is a strategic tool for companies looking to add value and agility to their traditional data management environments. I'm participating in the series of web seminars hosted by Informatica that addresses this technology, so watch my blog for continued coverage as the series moves forward.
Watch last week's Data Virtualization Architect-to-Architect Roundtable and follow the series here.